Multi-DB configuration with Spring R2DBC always uses only one database - java

I have a simple multi-database setup to try out multi-database configuration with R2DBC.
However, it is not working as expected: it always uses the first database.
@Configuration
@EnableR2dbcRepositories(databaseClientRef = "postgreDbClient", basePackages = {"com.x.y.repo.postgresql"})
public class PostgreSqlConfiguration extends AbstractR2dbcConfiguration {

    @Bean(name = "postgresqlConnectionFactory")
    ConnectionFactory connectionFactory() {
        return ConnectionFactories.get("r2dbc:postgresql://<host>:5432/<database>");
    }

    @Bean(name = "postgreDbClient")
    DatabaseClient databaseClient() {
        return DatabaseClient.create(this.connectionFactory());
    }
}
@Configuration
@EnableR2dbcRepositories(databaseClientRef = "mssqlDbClient", basePackages = {"com.x.y.repo.mssql"})
public class MsSqlConfiguration extends AbstractR2dbcConfiguration {

    @Bean(name = "mssqlConnectionFactory")
    ConnectionFactory connectionFactory() {
        return ConnectionFactories.get("r2dbc:mssql://<host>:1433/<database>");
    }

    @Bean(name = "mssqlDbClient")
    DatabaseClient databaseClient() {
        return DatabaseClient.create(this.connectionFactory());
    }
}
com.x.y.repo.postgresql
- EmployeeRepository.java
- DepartmentRepository.java
com.x.y.repo.mssql
- PurchaseRepository.java
- SalesRepository.java
public interface EmployeeRepository extends R2dbcRepository<Employee, Integer> {
}
public interface PurchaseRepository extends R2dbcRepository<Purchase, Integer> {
}
The above is a simplified representation of my code.
My requests always go to PostgreSQL, even though basePackages is configured for the MSSQL package com.x.y.repo.mssql.

Not sure which version you are using; I encountered the same issue with the latest Spring Boot 2.4.0-M2 / Spring Data R2DBC 1.2.0-M2.
Using AbstractR2dbcConfiguration is problematic here; check this question. I was using MySQL and Postgres in a single application.
I finally resolved it by creating a custom configuration and giving up AbstractR2dbcConfiguration; check the sample code.
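For reference, a minimal sketch of what such a custom configuration could look like (bean and package names are taken from the question; the connection strings are placeholders). The key change is that each DatabaseClient receives its own qualified ConnectionFactory, instead of both resolving the single connectionFactory() contract inherited from AbstractR2dbcConfiguration:

```java
// Sketch only: plain @Configuration classes, no AbstractR2dbcConfiguration.
@Configuration
@EnableR2dbcRepositories(databaseClientRef = "postgreDbClient",
        basePackages = {"com.x.y.repo.postgresql"})
class PostgresConfig {

    @Bean("postgresqlConnectionFactory")
    public ConnectionFactory postgresqlConnectionFactory() {
        return ConnectionFactories.get("r2dbc:postgresql://<host>:5432/<database>");
    }

    @Bean("postgreDbClient")
    public DatabaseClient postgreDbClient(
            @Qualifier("postgresqlConnectionFactory") ConnectionFactory cf) {
        return DatabaseClient.create(cf); // bound to the Postgres factory only
    }
}

@Configuration
@EnableR2dbcRepositories(databaseClientRef = "mssqlDbClient",
        basePackages = {"com.x.y.repo.mssql"})
class MssqlConfig {

    @Bean("mssqlConnectionFactory")
    public ConnectionFactory mssqlConnectionFactory() {
        return ConnectionFactories.get("r2dbc:mssql://<host>:1433/<database>");
    }

    @Bean("mssqlDbClient")
    public DatabaseClient mssqlDbClient(
            @Qualifier("mssqlConnectionFactory") ConnectionFactory cf) {
        return DatabaseClient.create(cf); // bound to the MSSQL factory only
    }
}
```

The likely root cause: both subclasses of AbstractR2dbcConfiguration contribute a bean under the same connectionFactory() contract, so one factory ends up backing both clients; wiring each factory explicitly by qualifier avoids that collision.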

Related

NoSuchBeanDefinitionException with reactive mongo repository: required a bean of type that could not be found

I have an issue: a repository bean cannot be found when it is placed in an outer package. This causes a nested UnsatisfiedDependencyException, which is due to a NoSuchBeanDefinitionException (expected at least 1 bean which qualifies as autowire candidate).
After I copied the class into my project, it works perfectly. But I would like to use it as a dependency from an external module.
This is repository class:
@Repository
public interface PersonRepository extends ReactiveMongoRepository<Person, String> {
}
And classes from project that should use the repository:
@Configuration
@ComponentScan("outer.package.repository")
@EnableMongoRepositories(basePackages = {
    "outer.package.repository"
    //"local.package.repository" // temporary solution, should be external
})
public class MyConfig {
}
@Service
@RequiredArgsConstructor
public class PersonService {
    private final PersonRepository personRepository;
    // do some stuff
}
As you can see, I have all the needed annotations on the beans (@Repository, @Service, @Configuration), I registered the Mongo repositories (@EnableMongoRepositories), and I even provided the package to scan (@ComponentScan). Do you have any ideas what I've missed?
UPD: I'm using Maven and the project structure is like this:
src
  main
    java
      com
        example
          configuration
            MyConfig.java
          controller
            PersonController.java
          repository
            PersonRepository.java
          service
            PersonService.java
          MainApplication.java
    resources
  test
pom.xml
I've tried to reproduce the issue and it seems that changing the annotation
@EnableMongoRepositories(basePackages = {
    "outer.package.repository"
    //"local.package.repository" // temporary solution, should be external
})
public class MyConfig {}
to its reactive equivalent:
@EnableReactiveMongoRepositories(basePackages = {
    "outer.package.repository"
    //"local.package.repository" // temporary solution, should be external
})
public class MyConfig {}
solved the issue.
More on that in the documentation:
MongoDB uses two different drivers for imperative (synchronous/blocking) and reactive (non-blocking) data access. You must create a connection by using the Reactive Streams driver to provide the required infrastructure for Spring Data’s Reactive MongoDB support. Consequently, you must provide a separate configuration for MongoDB’s Reactive Streams driver. Note that your application operates on two different connections if you use reactive and blocking Spring Data MongoDB templates and repositories.
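To illustrate the quoted paragraph, the separate reactive configuration might look roughly like this (a sketch only; the connection string and database name are placeholders):

```java
// Sketch: configuration for the Reactive Streams driver, which reactive
// repositories require. MongoClient here is the reactive one:
// com.mongodb.reactivestreams.client.MongoClient
@Configuration
@EnableReactiveMongoRepositories(basePackages = "outer.package.repository")
public class ReactiveMongoConfig extends AbstractReactiveMongoConfiguration {

    @Override
    public MongoClient reactiveMongoClient() {
        return MongoClients.create("mongodb://localhost:27017"); // placeholder URI
    }

    @Override
    protected String getDatabaseName() {
        return "example"; // placeholder database name
    }
}
```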

How to load parameters from the database when the application starts and keep them in memory

I need to load some data from a database table and keep it in memory so it can be used when needed in the web application. I'm using Spring Boot and JPA/Hibernate. I guess the idea is to run the query on boot and then keep the results in the session or some kind of cache.
I'd like to know the appropriate way to do that, and some examples if possible.
I did something similar before, but without Spring and JPA, and I'm not sure how to apply it here.
BTW, I'm pretty new to all this (Spring Boot and JPA/Hibernate).
Thanks in advance
There are many ways to achieve this: you can use @PostConstruct to pull the data, use an application bootstrap event, or use a bean that implements SmartLifecycle/Lifecycle.
Using @PostConstruct
@Component
public class SomeService {
    @PostConstruct
    public void init() {
        // pull data from a JPA repository and store it
    }
}
Using the Lifecycle interface
@Component
public class SomeService implements Lifecycle {
    private volatile boolean running;

    @Override
    public void start() {
        // pull data using JPA and store it
        running = true;
    }

    @Override
    public void stop() { running = false; }

    @Override
    public boolean isRunning() { return running; }
}
Using an ApplicationReadyEvent listener
@Component
public class SomeService implements ApplicationListener<ApplicationReadyEvent> {
    @Override
    public void onApplicationEvent(ApplicationReadyEvent event) {
        // pull data using JPA and store it
    }
}
You can use other combinations as well.
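Whichever hook you pick, the cache itself can stay framework-agnostic. Below is a minimal sketch where the Supplier stands in for a JPA repository call (e.g. mapping the result of findAll() into key/value pairs); all names here are illustrative:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Supplier;

// Minimal in-memory parameter cache; the loader stands in for a JPA
// repository call such as parameterRepository.findAll().
public class ParameterCache {

    private final Map<String, String> cache = new ConcurrentHashMap<>();
    private final Supplier<Map<String, String>> loader;

    public ParameterCache(Supplier<Map<String, String>> loader) {
        this.loader = loader;
    }

    // Call this from @PostConstruct, Lifecycle.start(), or an
    // ApplicationReadyEvent listener; can also be re-invoked to refresh.
    public void refresh() {
        cache.clear();
        cache.putAll(loader.get());
    }

    public String get(String key) {
        return cache.get(key);
    }

    public int size() {
        return cache.size();
    }
}
```

Using ConcurrentHashMap keeps reads safe while a refresh is in progress, which matters once the cache is shared across request threads.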

Using ElasticsearchOperations vs ElasticsearchTemplate: what's the difference?

I am trying to figure out why I have to set my bean name to elasticsearchTemplate. Without it, my application crashes. I have the code below to configure my REST client. The issue is that if I don't use elasticsearchTemplate as the bean name, startup fails and says it cannot find elasticsearchTemplate. Any idea why it does this? Also, what is the difference between ElasticsearchOperations and ElasticsearchTemplate?
Using Spring Data Elasticsearch version 3.2
Using the Java High-Level REST Client version 6.8.0
Works
@Bean("elasticsearchTemplate")
public ElasticsearchOperations elasticsearchTemplate() throws Exception {
    return new ElasticsearchTemplate(client());
}
Doesn't work
public ElasticsearchOperations elasticsearchTemplate() throws Exception {
    return new ElasticsearchTemplate(client());
}
Maybe the startup configuration (application.properties) is missing the Elasticsearch-related settings.
You need to define some Elasticsearch properties in your application.properties file, such as cluster-nodes and cluster-name, which are used by ElasticsearchTemplate and ElasticsearchRepository to connect to the Elasticsearch engine.
Alternatively, you can manually configure the REST client by extending AbstractElasticsearchConfiguration:
@Configuration
public class RestClientConfig extends AbstractElasticsearchConfiguration {
    @Override
    public RestHighLevelClient elasticsearchClient() {
        return RestClients.create(ClientConfiguration.localhost()).rest();
    }
}
What is the difference between ElasticsearchOperations and ElasticsearchTemplate?
ElasticsearchTemplate is an implementation of the ElasticsearchOperations interface that uses the Transport Client.
https://docs.spring.io/spring-data/elasticsearch/docs/3.2.0.RELEASE/reference/html/#elasticsearch.operations.resttemplate
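As a side note on the naming question: because ElasticsearchTemplate implements ElasticsearchOperations, consumers can usually inject the interface by type rather than depending on a specific bean name. A sketch (the service and its usage are made up for illustration):

```java
// Sketch: depend on the ElasticsearchOperations interface, not the
// concrete template or a particular bean name.
@Service
public class ProductSearchService {

    private final ElasticsearchOperations operations; // resolved by type

    public ProductSearchService(ElasticsearchOperations operations) {
        this.operations = operations;
    }
}
```

The specific name elasticsearchTemplate only matters when some other component autowires the bean by that name, which is what the failing startup message suggests here.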

Create new SOLR collection at runtime

I am using a Solr 7.1.0 server with a Java Spring Boot application.
To communicate with the Solr server I am using "springframework.data.solr".
I have a "template" schema from which I want to create new cores at runtime.
The goal I want to achieve is to create a new core for each customer while keeping the schema the same.
This is how my SolrConfig looks:
@Configuration
@EnableSolrRepositories(basePackages = "com.my.repository", multicoreSupport = true)
@ComponentScan
public class SolrConfig {

    @Bean
    public SolrClient solrClient() {
        return new HttpSolrClient("http://localhost:8983/solr");
    }

    @Bean
    @Scope("prototype")
    public SolrTemplate solrTemplate(SolrClient client) throws Exception {
        return new SolrTemplate(client);
    }
}
My repository interface:
public interface OpenItemsDebtorsRepository extends CustomOpenItemsDebtorsRepository, SolrCrudRepository<OpenItemDebtor, String> {
    void setCore(String core);

    @Query("orderNumber:*?0*~")
    List<OpenItemDebtor> findByOrderNumber(String orderNumber);
}
I am looking for something like this:
solrTemplate.createNewCore(String coreName)
Do you have any suggestions?
I would strongly suggest using the native Solr client (SolrJ) in your Spring Boot project. Create a service component that provides you an instance of the Solr server client (CloudSolrClient).
SolrJ has all the components you need to create and manage cores and collections.
I know this is not a straight answer, but I hope this helps.
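A sketch of what that could look like with plain SolrJ (the URL, core name, and configSet name are placeholders, and the "template" configSet must already exist on the Solr server):

```java
// Sketch: create a per-customer core from an existing configSet via SolrJ.
public SolrClient solrClient() {
    return new HttpSolrClient.Builder("http://localhost:8983/solr").build();
}

public void createCoreForCustomer(SolrClient client, String customer) throws Exception {
    CoreAdminRequest.Create create = new CoreAdminRequest.Create();
    create.setCoreName("customer_" + customer); // hypothetical naming scheme
    create.setConfigSet("template");            // the shared "template" schema
    create.process(client);
}
```

On SolrCloud, the equivalent would go through CollectionAdminRequest to create a collection instead of a core.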

Run Flyway Java-based callbacks with Spring Boot

Is there a way to run Flyway Java-based callbacks with Spring Boot?
I'm converting an existing project that, after each migration, updates some view definitions; this is done in Java because it needs some extra logic. I know it could be done in PL/pgSQL (we are using Postgres), but it is already done and tested in Java.
The Spring Boot docs say it is possible, but they state that the callback scripts should live in the same directory as the migrations; maybe that works only for SQL-based callbacks.
This code works without Spring Boot:
Flyway flyway = new Flyway();
flyway.setDataSource(this.getDataSource());
flyway.setLocations("/db/migration");
flyway.setCallbacks(new LogMaintenanceFlywayCallback());
flyway.migrate();
I have several migrations in /db/migration and after each one I need to execute my callback. It works in my current project and I need to do the same (or another way to get the same behavior) in Spring Boot.
You can have a configuration like this and it will work:
@Configuration
public class FlywayFactory {

    @Bean
    public FlywayMigrationInitializer flywayInitializer(Flyway flyway, FlywayCallback flywayCallback) {
        flyway.setCallbacks(flywayCallback);
        return new FlywayMigrationInitializer(flyway);
    }

    @Bean
    public FlywayCallback flywayCallback() {
        return new LogMaintenanceFlywayCallback();
    }
}
Since the setCallbacks(Callback... callbacks) method of Flyway has been deprecated and will be removed in Flyway 6.0, you can use the new API and a FlywayConfigurationCustomizer to set up custom Java-based callbacks. The configuration is then as follows:
@Configuration
public class FlywayFactory {

    @Bean
    public FlywayConfigurationCustomizer flywayConfigurationCustomizer() {
        return configuration -> configuration.callbacks(new LogMaintenanceFlywayCallback());
    }
}
There seems to be no way to set the callbacks through the Spring Boot autoconfiguration (see FlywayAutoConfiguration.java).
There are two things you can do:
1. Create your own Flyway instance in one of your @Configuration classes. Spring Boot will not create its own instance in that case.
2. Autowire the Flyway instance in one of your @Configuration classes and call the setCallbacks method in a @PostConstruct method (but it might be tricky to make sure you call the setter before the migration starts).
You can override the Flyway migration strategy:
@Component
public class CallbackFlywayMigrationStrategy implements FlywayMigrationStrategy {

    @Override
    public void migrate(Flyway flyway) {
        flyway.setCallbacks(new LogMaintenanceFlywayCallback());
        flyway.migrate();
    }
}
You can define a bean of type org.flywaydb.core.api.callback.Callback as follows:
@Bean
public Callback logMaintenanceFlywayCallback() {
    return new LogMaintenanceFlywayCallback();
}
