Spring: Define an AbstractRoutingDataSource whose targetDataSources are filled from a separate (H2) DataSource - java

I am trying to set up a Spring/Spring Boot project that has an AbstractRoutingDataSource whose targetDataSources should be filled from a separate DataSource (actually an embedded H2 DataSource).
I have tried many different things (configuring multiple EntityManagerFactories and TransactionManagers), but somewhere I always either get a circular reference, or the repository bean that should provide the details for my targetDataSources is null when autowiring it into the RoutingDataSource.
The problem seems to be that I can't use any DataSource dependency during initialization of my RoutingDataSource, because it is a DataSource itself and is therefore created before any repository beans.
Can you give me a hint of how an approach would look that configures:
An H2 DataSource which is initialized first
A repository for this (and only this) H2 DataSource
A RoutingDataSource which depends on the repository and loads its targetDataSources from it
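A minimal sketch of one way the wiring could look. Everything specific in it is hypothetical (the config-schema.sql script, the datasources table, the CURRENT_KEY holder), and it deliberately loads the target definitions with a plain JdbcTemplate instead of a JPA repository: a repository needs an EntityManagerFactory, which itself needs a DataSource, which is exactly the cycle described above. If a repository is really required, a second EntityManagerFactory bound only to the H2 DataSource would be needed instead.

import java.util.HashMap;
import java.util.List;
import java.util.Map;
import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.jdbc.DataSourceBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseType;
import org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource;

@Configuration
public class RoutingDataSourceConfig {

    // Hypothetical ThreadLocal holding the current routing key.
    public static final ThreadLocal<String> CURRENT_KEY = new ThreadLocal<>();

    // The embedded H2 database that holds the target-datasource definitions.
    @Bean
    public DataSource configDataSource() {
        return new EmbeddedDatabaseBuilder()
                .setType(EmbeddedDatabaseType.H2)
                .addScript("classpath:config-schema.sql") // assumed schema/data script
                .build();
    }

    // The routing DataSource is @Primary, so JPA and the main EntityManagerFactory use it.
    // Its target map is read with a plain JdbcTemplate instead of a JPA repository,
    // which breaks the cycle (a repository would need an EntityManagerFactory,
    // which itself needs a DataSource).
    @Bean
    @Primary
    public DataSource routingDataSource(@Qualifier("configDataSource") DataSource configDataSource) {
        AbstractRoutingDataSource routing = new AbstractRoutingDataSource() {
            @Override
            protected Object determineCurrentLookupKey() {
                return CURRENT_KEY.get();
            }
        };
        Map<Object, Object> targets = new HashMap<>();
        List<Map<String, Object>> rows = new JdbcTemplate(configDataSource)
                .queryForList("SELECT id, url, username, password FROM datasources"); // assumed table
        for (Map<String, Object> row : rows) {
            targets.put(row.get("id"), DataSourceBuilder.create()
                    .url((String) row.get("url"))
                    .username((String) row.get("username"))
                    .password((String) row.get("password"))
                    .build());
        }
        routing.setTargetDataSources(targets);
        routing.setDefaultTargetDataSource(configDataSource);
        return routing;
    }
}

With the routing DataSource marked @Primary, the regular JPA auto-configuration builds its EntityManagerFactory against it, while the H2 configuration database stays isolated.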

Related

Spring Initialisation Order with Configuration

I have a Spring project, and I need to configure Flyway.
Using the default FlywayAutoConfiguration, the Flyway migration is executed immediately, before everything else (caching, @PostConstruct methods, services). This is the behavior I expected (in terms of startup workflow).
Unfortunately, I need to override the default FlywayAutoConfiguration because I use a custom Flyway implementation, but that's not my main problem here; my issue is really about Spring's configuration priority and initialization sequence.
So, to use my own Flyway, I first copied FlywayAutoConfiguration into my Maven module, named it CustomFlywayAutoConfiguration and just adapted the imports. I also changed the property "spring.flyway.enabled" to false and created another one, "spring.flywaycustom.enabled", to be able to activate mine and not the default one.
Doing that completely changes the startup sequence. Now Flyway is executed at the end of the startup sequence (after caching and other @PostConstruct methods in my project).
The following bean, defined in CustomFlywayAutoConfiguration, is now created only at the end of the startup sequence. With the default FlywayAutoConfiguration, it was created right at the beginning.
@Bean
@ConditionalOnMissingBean
public FlywayMigrationInitializer flywayInitializer(Flyway flyway,
        ObjectProvider<FlywayMigrationStrategy> migrationStrategy) {
    return new FlywayMigrationInitializer(flyway, migrationStrategy.getIfAvailable());
}
I tried playing a lot with ordering (HIGHEST_PRECEDENCE and LOWEST_PRECEDENCE):
@AutoConfigureOrder on the configuration class
@Order on components
Autowiring FlywayMigrationInitializer to force the bean to be initialized earlier
None of it changes anything; it looks like Spring ignores @Order and @AutoConfigureOrder.
Any idea why, when the configuration is inside the spring-boot-autoconfigure dependency, it is started with first priority, but when the same configuration code is inside my project, I don't get the same ordering?
Thank you so much for your help.
Thanks to your answer, focusing my attention on FlywayMigrationInitializerEntityManagerFactoryDependsOnPostProcessor helped me solve the issue.
@Configuration(proxyBeanMethods = false)
@AutoConfigureOrder(Ordered.HIGHEST_PRECEDENCE)
@EnableConfigurationProperties({ DataSourceProperties.class, FlywayProperties.class })
@Import({ FlywayMigrationInitializerEntityManagerFactoryDependsOnPostProcessor.class }) // Very important to force Flyway before
public class CustomFlywayConfiguration {

    @Bean
    public Flyway flyway(FlywayProperties properties, DataSourceProperties dataSourceProperties,
            ResourceLoader resourceLoader, ObjectProvider<DataSource> dataSource,
            @FlywayDataSource ObjectProvider<DataSource> flywayDataSource,
            ObjectProvider<FlywayConfigurationCustomizer> fluentConfigurationCustomizers,
            ObjectProvider<JavaMigration> javaMigrations, ObjectProvider<Callback> callbacks) {
        FluentConfiguration configuration = new FluentConfiguration(resourceLoader.getClassLoader());
        DataSource dataSourceToMigrate = configureDataSource(configuration, properties, dataSourceProperties,
                flywayDataSource.getIfAvailable(), dataSource.getIfUnique());
        checkLocationExists(dataSourceToMigrate, properties, resourceLoader);
        configureProperties(configuration, properties);
        List<Callback> orderedCallbacks = callbacks.orderedStream().collect(Collectors.toList());
        configureCallbacks(configuration, orderedCallbacks);
        fluentConfigurationCustomizers.orderedStream().forEach((customizer) -> customizer.customize(configuration));
        configureFlywayCallbacks(configuration, orderedCallbacks);
        List<JavaMigration> migrations = javaMigrations.stream().collect(Collectors.toList());
        configureJavaMigrations(configuration, migrations);
        return new CustomFlyway(configuration);
    }
    /**
     * FlywayAutoConfiguration.FlywayConfiguration is conditional on the Flyway bean being missing,
     * and its @Import annotations are not processed in that case.
     *
     * So if we declare our own Flyway bean, we also need to create this bean to trigger the Flyway migration.
     *
     * The main issue is that this bean is now created after the @PostConstruct init method in MyInitComponent.
     */
    @Bean
    public FlywayMigrationInitializer flywayInitializer(Flyway flyway,
            ObjectProvider<FlywayMigrationStrategy> migrationStrategy) {
        return new FlywayMigrationInitializer(flyway, migrationStrategy.getIfAvailable());
    }
    /**
     * Thanks to this it now works, because it makes sure the initializer is required before the EntityManager is created.
     */
    // @ConditionalOnClass(LocalContainerEntityManagerFactoryBean.class)
    // @ConditionalOnBean(AbstractEntityManagerFactoryBean.class)
    static class FlywayMigrationInitializerEntityManagerFactoryDependsOnPostProcessor
            extends EntityManagerFactoryDependsOnPostProcessor {

        FlywayMigrationInitializerEntityManagerFactoryDependsOnPostProcessor() {
            super(FlywayMigrationInitializer.class);
        }
    }
...
So that confirms we can't easily override the Flyway bean without copying some logic from FlywayAutoConfiguration.
I created a small project to reproduce the bug, with the solution I found.
https://github.com/w3blogfr/flyway-issue-demo
I don't know whether this needs to be fixed in the Spring Boot auto-configuration project.
I checked the history to find the relevant commit:
https://github.com/spring-projects/spring-boot/commit/795303d6676c163af899e87364846d9763055cf8
And the ticket was this one:
https://github.com/spring-projects/spring-boot/issues/18382
The change looks purely technical, and the author probably missed that there was an @ConditionalOnClass involved.
None of the annotations that you've described have an impact on runtime semantics:
@AutoConfigureOrder works only on auto-configurations (so if you've copied the auto-configuration as a user config, it won't even be considered). It is used to order how auto-configurations are parsed (typical use case: make sure one auto-configuration contributes a bean definition of type X before another one checks whether a bean of type X is available).
@Order orders components of the same type. It doesn't have any effect on "when" something occurs.
Forcing initialisation using an injection point is a good idea, but it'll only work if the component you're injecting it into is itself initialised at the right time.
The auto-configuration has a bunch of post-processors that create links between the various components that use the DataSource and the FlywayMigrationInitializer. For instance, FlywayMigrationInitializerEntityManagerFactoryDependsOnPostProcessor makes sure that FlywayMigrationInitializer is a dependency of the entity manager, so that the database is migrated before the EntityManager is made available to other components. That link is what makes sure Flyway is executed at the right time. I don't know why that's not working with your copy; a sample we can run ourselves, shared on GitHub, could help us figure it out.
With all that said, please don't copy an auto-configuration into your own project. I'd advise describing your use case in more detail and opening an issue so that we can consider improving the auto-configuration.

Spring bean creation lifecycle: why are there multiple interaction points?

I'm learning Spring and I'm stuck on the bean lifecycle!
When creating a bean, Spring gives us several ways to interact with it. We can use one or all of:
BeanFactoryPostProcessor
BeanPostProcessor#postProcessBeforeInitialization
@PostConstruct
InitializingBean#afterPropertiesSet
@Bean#initMethod
BeanPostProcessor#postProcessAfterInitialization
Spring calls them in the order above
My question is: why all these points of interaction, and when should each of them be used (what are the use cases)?
A BeanFactoryPostProcessor and a BeanPostProcessor are quite different beasts, and they also apply to different things.
A BeanFactoryPostProcessor operates on the metadata for a bean (I like to call it the recipe) and post-processes that. A well-known example is the PropertySourcesPlaceholderConfigurer, which will inject/replace all @Value placeholders in the configuration with their values. A BeanFactoryPostProcessor operates on the metadata and thus runs before any bean has been created.
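As a rough sketch of that idea (the bean name "expensiveService" is made up for the example), a custom BeanFactoryPostProcessor only ever sees bean definitions, never bean instances:

import org.springframework.beans.BeansException;
import org.springframework.beans.factory.config.BeanFactoryPostProcessor;
import org.springframework.beans.factory.config.ConfigurableListableBeanFactory;
import org.springframework.stereotype.Component;

@Component
public class LazyMarkingBeanFactoryPostProcessor implements BeanFactoryPostProcessor {

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory beanFactory) throws BeansException {
        // Runs before any bean is instantiated and only touches the "recipe":
        // here we flip a hypothetical bean definition named "expensiveService" to lazy init.
        if (beanFactory.containsBeanDefinition("expensiveService")) {
            beanFactory.getBeanDefinition("expensiveService").setLazyInit(true);
        }
    }
}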
A BeanPostProcessor can be applied to a bean and can even replace the bean. Spring AOP uses this quite a lot. An example is the PersistenceExceptionTranslationPostProcessor: when a bean has been created, it passes through this BeanPostProcessor, and when it is annotated with @Repository the bean is replaced by a proxy. This proxy adds exception translation (i.e. it converts a JpaException or SQLException into Spring's DataAccessException). This can be done either before the init callbacks are called (postProcessBeforeInitialization) or after they have been called (postProcessAfterInitialization). The PersistenceExceptionTranslationPostProcessor runs in postProcessAfterInitialization, as it needs to be certain the bean has been initialized.
The ServletContextAwareProcessor runs right after the object has been created, to inject the ServletContext as early as possible, since the initialization of a bean might depend on it.
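For illustration only (this is not the real PersistenceExceptionTranslationPostProcessor), a BeanPostProcessor that replaces matching beans with a JDK proxy after initialization could look like this; the "name ends with Repository" rule is invented for the example:

import java.lang.reflect.Proxy;
import org.springframework.beans.BeansException;
import org.springframework.beans.factory.config.BeanPostProcessor;
import org.springframework.stereotype.Component;

@Component
public class LoggingProxyBeanPostProcessor implements BeanPostProcessor {

    @Override
    public Object postProcessAfterInitialization(Object bean, String beanName) throws BeansException {
        // Only wrap interface-based beans whose name ends with "Repository" (made-up rule).
        if (!beanName.endsWith("Repository") || bean.getClass().getInterfaces().length == 0) {
            return bean;
        }
        // The returned proxy replaces the original bean in the application context.
        return Proxy.newProxyInstance(
                bean.getClass().getClassLoader(),
                bean.getClass().getInterfaces(),
                (proxy, method, args) -> {
                    System.out.println("Calling " + beanName + "." + method.getName());
                    return method.invoke(bean, args);
                });
    }
}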
The three callbacks for initializing a bean are more or less the same, but they are called in sequence because they were introduced in successive versions. It started with only the InitializingBean interface and the init-method in XML (later also added to @Bean), and the annotation support came when annotations became a thing.
You need init methods to initialize a bean: you might want to check that all properties have been set (like a required DataSource), or you might want to start something. A DataSource (especially a connection pool) is a good example of something to initialize: after all dependencies have been injected, you want to start the pool so it opens its connections. Now, if you cannot modify the code, you will want to use the init-method; if you control the code, you will probably want to add @PostConstruct. If you are writing a framework that depends on Spring, I would use InitializingBean.
Next to those three init methods you also have the destruction counterparts: the DisposableBean interface (and destroy-method) and the @PreDestroy annotation. Again, when you stop your application you also want to close the connections in your connection pool, so you will probably want to call the close method on the DataSource. A perfect example of a destruction callback.
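A small, self-contained sketch of the callback order on a made-up pool-like bean; the println calls only show when each hook fires:

import javax.annotation.PostConstruct;
import javax.annotation.PreDestroy;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Hypothetical "pool" used only to demonstrate the lifecycle hooks.
class SimplePool implements InitializingBean {

    @PostConstruct                        // 1. annotation-based init runs first
    void annotationInit() {
        System.out.println("@PostConstruct");
    }

    @Override                             // 2. then InitializingBean
    public void afterPropertiesSet() {
        System.out.println("afterPropertiesSet: open the connections here");
    }

    void open() {                         // 3. then the declared init-method
        System.out.println("initMethod");
    }

    @PreDestroy                           // destruction: runs on context shutdown
    void preDestroy() {
        System.out.println("@PreDestroy");
    }

    void close() {                        // declared destroy-method, runs after @PreDestroy
        System.out.println("destroyMethod: close the connections here");
    }
}

@Configuration
class PoolConfig {

    @Bean(initMethod = "open", destroyMethod = "close")
    SimplePool simplePool() {
        return new SimplePool();
    }
}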

How to set up custom behavior of Spring Data in a CDI environment

So, I have an application running on WildFly 10 which uses JSF, Spring (DI), JPA and Spring Data.
Right now we're trying to move it to CDI and remove Spring (DI). For now we'll keep Spring Data.
So, I set up CDI and made an EntityManager producer:
@Produces
@Dependent
@PersistenceContext
public EntityManager entityManager;
So I'm able to inject repositories with CDI and all.
However, in my original environment we had a custom repository factory that was defined in my Spring configuration like this:
@EnableJpaRepositories(basePackages = { "com.foo.repository" }, repositoryFactoryBeanClass = CustomJpaRepositoryFactoryBean.class)
So the question is: how can I define this repositoryFactoryBeanClass = CustomJpaRepositoryFactoryBean.class in a CDI environment?
The current implementation doesn't allow the configuration of the JpaRepositoryFactoryBean as one can see from the code where it gets instantiated.
So I guess you have the following options:
reimplement the instantiation process.
open a feature request.
do 2. and 1. in a reusable way and submit the result as a PR for the issue.
When trying to solve this problem I found that the custom implementation was not being picked up. However, the solution suggested in this question helped me: https://stackoverflow.com/a/38541669/188624
Basically, it uses Java 8 default interface methods to provide the additional functionality. I had to use the "CDI.current().select" method to get hold of the EntityManager, though, as property injection of course won't work.
I tested this with Spring Data JPA 2.0.0.
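For reference, here is a rough sketch of that approach (Customer and CustomerRepository are made-up names): the extra behaviour lives in a Java 8 default method on the repository interface, and the EntityManager is looked up through CDI.current() because fields cannot be injected into an interface:

import javax.enterprise.inject.spi.CDI;
import javax.persistence.Entity;
import javax.persistence.EntityManager;
import javax.persistence.Id;
import org.springframework.data.jpa.repository.JpaRepository;

// Minimal entity, only here so the sketch compiles.
@Entity
class Customer {
    @Id
    Long id;
}

public interface CustomerRepository extends JpaRepository<Customer, Long> {

    // The Java 8 default method replaces what the custom repository factory used to add.
    default void refresh(Customer customer) {
        EntityManager em = CDI.current().select(EntityManager.class).get();
        em.refresh(customer);
    }
}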

How can I limit the context of a @Primary bean?

I have a bean named MyBean defined in an appContext in MyProject1.
I have another app (MyProject2) that imports all the bean definitions of MyProject1 (including MyBean).
Now I need to override that bean, and since there is no easy way, in MyProject2 I declare:
<bean primary="true" class="MyBean" />
It works great. My question is:
What will happen to all the others that were using MyBean? Will they now use the new bean with primary="true", or how can I specify which should use this new bean and which should keep using the old one?
According to the Spring documentation, the primary bean is only used when there are multiple candidates to be injected. All created beans remain in the context.
If you need only your primary bean to be injected, you can use:
@Autowired
private MyBean myBean;
so all injection points of MyBean will now receive the primary one.
If you need to handle all MyBean beans (for example, for a Chain of Responsibility) you can inject:
@Autowired
private List<MyBean> myBeans;
and the injected list will contain all your bean instances (primary and non-primary). The primary bean can usually be accessed at index 0: myBeans.get(0). Any additional markers you need to pick out a particular bean (for example, filtering by vendor type) have to be defined and handled in your own code by filtering the collection; normally, though, if the project architecture is sound, you shouldn't need such markers to filter the bean objects in the collection.
Do not forget about singleton scope if you need only one bean instance in your context.
If you have given the beans different unique names, you can inject them with @Qualifier (or sometimes with @Resource from the Java API, javax.annotation.Resource), specifying the corresponding name as a parameter to select a bean by name.
Spring provides the ability to inject properties using SpEL. Maybe they'll also add functionality to inject beans using SpEL (that could solve your issue in the best way).
Use profiles or @Conditional. @Qualifier is meant for resolving autowiring conflicts, not for deactivating all beans but one. When using @Qualifier, all the candidate beans are still active.
Using profiles, when you want to use the bean from MyProject2, simply start the app with the profile project2; the link above describes multiple ways to do that.
See my answer here for a complete example of using @Conditional.
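A minimal sketch of the profile route (the profile names are made up, and CustomizedMyBean stands in for whatever overriding implementation MyProject2 provides): only one of the two definitions is registered per run, so nothing else needs @Primary or @Qualifier:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;

@Configuration
public class MyBeanConfig {

    @Bean
    @Profile("project1")   // the original definition from MyProject1
    public MyBean defaultMyBean() {
        return new MyBean();
    }

    @Bean
    @Profile("project2")   // active only when the app starts with spring.profiles.active=project2
    public MyBean project2MyBean() {
        return new CustomizedMyBean(); // hypothetical subclass of MyBean
    }
}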

How to dynamically manage multiple datasources

Similar topics have been covered in other threads, but I couldn't find a definitive solution to my problem.
What we're trying to achieve is to design a web app which is able to:
read a datasource configuration at startup (an XML file containing multiple datasource definitions, which is placed outside the WAR file and is not the application-context or Hibernate configuration file)
create a session factory for each one of them (considering that each datasource is a database with a different schema)
switch at runtime to different datasources based on user input (users can select which datasource they want to use)
provide the correct DAO object to handle user requests.
At the moment we have a DAO manager object which is able to read the datasource configuration file and instantiate multiple session factories, saving them in a map. Each session factory is created with a configuration containing the proper Hibernate mapping classes (different for each database schema). Moreover, we have multiple DAO interfaces with their implementations, used to access "their" database.
At this point we need a way to get a specific DAO object from the DAO manager, with the right session factory attached, all based on the user request (basically a call from the service above containing the datasource id or a custom datasource object).
Ideally the service layer should use the DAO manager to get a DAO object based on the datasource id (for instance), without worrying about its actual implementation: the DAO manager would take care of it by creating the correct DAO object and injecting the right session factory into it, based on the datasource id.
My questions are:
Is this a good approach to follow?
How can I use Spring to dynamically inject in the DAO Manager multiple DAO implementations for each DAO interface?
Once the session factories are created, is there a way to let Spring handle them, as I would normally do with dependency injection inside the application-context.xml?
Would the 2nd Level Cache still work for each Session Factory?
Is this a good approach to follow?
It's probably the only possible approach. So, yes.
How can I use Spring to dynamically inject in the DAO Manager multiple DAO implementations for each DAO interface?
Dynamically? I thought you wanted to do it at startup time. If so, just provide an accessor with a list or array:
public void setMyDaos(List<MyDao> daos) {
    this.daos = daos;
}
Once the session factories are created, is there a way to let Spring handle them, as I would normally do with dependency injection inside the application-context.xml?
This one's tough. I'd say you will probably have to store your sessionFactory beans in session scope (scope="session").
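For what it's worth, here is a rough sketch of the lookup the question describes (CustomerDao and CustomerDaoImpl are placeholders): the manager keeps the session factories built from the external XML in a map keyed by datasource id and hands out DAOs bound to the selected factory:

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import org.hibernate.SessionFactory;

interface CustomerDao {
    // finder methods would go here
}

class CustomerDaoImpl implements CustomerDao {
    private final SessionFactory sessionFactory;

    CustomerDaoImpl(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }
}

public class DaoManager {

    // Filled by the existing startup code that parses the external XML file.
    private final Map<String, SessionFactory> sessionFactories = new ConcurrentHashMap<>();

    public DaoManager(Map<String, SessionFactory> sessionFactories) {
        this.sessionFactories.putAll(sessionFactories);
    }

    // Returns a DAO wired to the session factory of the datasource the user selected.
    public CustomerDao customerDao(String datasourceId) {
        SessionFactory factory = sessionFactories.get(datasourceId);
        if (factory == null) {
            throw new IllegalArgumentException("Unknown datasource: " + datasourceId);
        }
        return new CustomerDaoImpl(factory);
    }
}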
