I have 2 datasources: a main one (the default) and a secondary one. I have an AbstractRoutingDataSource implementation that chooses a datasource based on a ThreadLocal variable and falls back to the default datasource if the ThreadLocal is empty. A very vanilla, tutorial-like setup. In addition, I have a bit of custom JdbcTemplate configuration, needed to make String<->jsonb conversion work. The problem is that this configuration only takes effect when I fall back to the main datasource. As soon as the AbstractRoutingDataSource picks the non-main datasource, I get errors that the custom converters should prevent.
Why is this, and what can I do to make the custom configuration apply to all datasources, no matter which one the AbstractRoutingDataSource ends up picking?
application.yaml:
ds1:
  username: smth1
  password: smth1
  host: smth1
ds2:
  username: smth2
  password: smth2
  host: smth2
RoutingDataSource.kt:
class RoutingDataSource(
    mainDs: DataSource,
    customDatasources: Map<String, DataSource>
) : AbstractRoutingDataSource() {

    companion object {
        private const val MAIN_DS_NAME = "mainDs"
        val holder = ThreadLocal<String>()
    }

    init {
        setDefaultTargetDataSource(mainDs)
        setTargetDataSources(customDatasources + (MAIN_DS_NAME to mainDs))
    }

    override fun determineCurrentLookupKey() = holder.get() ?: MAIN_DS_NAME
}
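For context, the holder-plus-fallback lookup can be exercised outside Spring. Here is a self-contained sketch of the same routing decision (class and map contents are illustrative, not from the real project), including the set-and-clear discipline the ThreadLocal needs on pooled threads:

```java
import java.util.Map;

public class RoutingDemo {
    static final String MAIN_DS_NAME = "mainDs";
    static final ThreadLocal<String> holder = new ThreadLocal<>();

    // Mirrors determineCurrentLookupKey(): use the holder if set, else fall back.
    static String determineCurrentLookupKey() {
        String key = holder.get();
        return key != null ? key : MAIN_DS_NAME;
    }

    public static void main(String[] args) {
        // Stand-ins for the target datasources, keyed like setTargetDataSources().
        Map<String, String> targets = Map.of("mainDs", "jdbc:main", "secondary", "jdbc:secondary");

        // No key set: falls back to the default datasource.
        System.out.println(targets.get(determineCurrentLookupKey())); // jdbc:main

        holder.set("secondary");
        try {
            System.out.println(targets.get(determineCurrentLookupKey())); // jdbc:secondary
        } finally {
            holder.remove(); // always clear, or pooled threads keep the stale key
        }
    }
}
```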
CustomJdbcConfiguration.kt:
@Configuration
@EnableJdbcRepositories(
    basePackages = ["my.package"],
    jdbcOperationsRef = "mainJdbcTemplate",
    dataAccessStrategyRef = "mainDataAccessStrategy"
)
class CustomJdbcConfiguration(
    private val readingConverters: List<Converter<PGobject, *>>,
    private val writingConverters: List<Converter<*, JdbcValue>>
) {

    @Bean("ds1")
    fun ds1(): DataSource {
        // build ds1
    }

    @Bean("ds2")
    fun ds2(): DataSource {
        // build ds2
    }

    @Bean
    @Primary
    fun routingDataSource(
        @Qualifier("ds1") ds1: DataSource,
        @Qualifier("ds2") ds2: DataSource
    ): DataSource {
        return RoutingDataSource(ds1, mapOf("secondary" to ds2))
    }

    @Bean("mainJdbcConversions")
    @Primary
    fun jdbcCustomConversions(): JdbcCustomConversions {
        return JdbcCustomConversions(
            listOf(
                JsonbReadingConverter(),
                JsonbWritingConverter(),
                JsonReadingConverter()
            ) + readingConverters + writingConverters
        )
    }

    @Bean
    @Primary
    fun jdbcTemplate(dataSource: DataSource, properties: JdbcProperties): JdbcTemplate {
        val jdbcTemplate = JdbcTemplate(dataSource)
        val template = properties.template
        jdbcTemplate.fetchSize = template.fetchSize
        jdbcTemplate.maxRows = template.maxRows
        if (template.queryTimeout != null) {
            jdbcTemplate.queryTimeout = template.queryTimeout.seconds.toInt()
        }
        return jdbcTemplate
    }

    @Bean("mainJdbcTemplate")
    @Primary
    fun namedParameterJdbcTemplate(jdbcTemplate: JdbcTemplate): NamedParameterJdbcTemplate {
        return NamedParameterJdbcTemplate(jdbcTemplate)
    }

    @Bean("mainJdbcConverter")
    @Primary
    fun jdbcConverter(
        mappingContext: JdbcMappingContext,
        @Qualifier("mainJdbcTemplate") operations: NamedParameterJdbcOperations,
        @Lazy relationResolver: RelationResolver,
        @Qualifier("mainJdbcConversions") conversions: JdbcCustomConversions,
        dialect: Dialect
    ): JdbcConverter {
        val jdbcTypeFactory = DefaultJdbcTypeFactory(operations.jdbcOperations)
        return BasicJdbcConverter(
            mappingContext, relationResolver, conversions, jdbcTypeFactory, dialect.identifierProcessing
        )
    }

    @Primary
    @Bean("mainDataAccessStrategy")
    fun dataAccessStrategyBean(
        @Qualifier("mainJdbcTemplate") operations: NamedParameterJdbcOperations,
        @Qualifier("mainJdbcConverter") jdbcConverter: JdbcConverter,
        context: JdbcMappingContext,
        dialect: Dialect
    ): DataAccessStrategy {
        return DefaultDataAccessStrategy(
            SqlGeneratorSource(context, jdbcConverter, dialect),
            context,
            jdbcConverter,
            operations,
            SqlParametersFactory(context, jdbcConverter, dialect),
            InsertStrategyFactory(operations, BatchJdbcOperations(operations.jdbcOperations), dialect)
        )
    }
}
Now, if the holder is not set and I save an object that has a String field in Kotlin but a jsonb column in the DB, everything works. If I set the holder to point to the non-main database, the converters no longer apply and I get an error like this:
ERROR: column "your_column" is of type jsonb but expression is of type character varying
Hint: You will need to rewrite or cast the expression.
Position: 174
Even if I flip the datasources, the effect is the same: it always works with the fallback one and never works when it actually routes to a secondary one.
P.S. In my testcase the actual database and the connection properties are exactly the same. So the problem is not on the database side.
This had nothing to do with multiple datasources or custom mappings directly.
The issue was that my end datasources were Hikari datasources and I had forgotten to specify the property data-source-properties: stringtype=unspecified for them. As soon as I did, everything started working correctly.
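For reference, here is what that could look like in the properties file. This is a sketch only: the `ds1` block and its keys mirror the question's example config, and assume the properties are bound into a HikariConfig (where `data-source-properties` entries are passed through to the JDBC driver):

```yaml
ds1:
  username: smth1
  password: smth1
  host: smth1
  data-source-properties:
    stringtype: unspecified   # lets PostgreSQL infer jsonb from a String parameter
```

The `stringtype=unspecified` flag is a PostgreSQL JDBC driver property, so it has to be set on every datasource that talks to a jsonb column, which is why only the datasource that happened to have it behaved correctly.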
Related
I set up a small Spring Boot project that I use for certain monitorings. It has various endpoints to receive data via webservice calls and is supposed to store them into two different databases.
While everything works fine for the primary database, I cannot save data to the secondary database. Reading the existing data works fine, but I cannot save anything. I'm always getting an exception:
javax.persistence.TransactionRequiredException: no transaction is in progress
This is the coding:
Everything related to Light, Motion and Temperature (LMT) is in one database. The information for Power (P) is stored in a second database (where I have the issues with).
My package set up is as follows:
- [Configuration]
- [Controller]
--- [Light]
--- [Motion]
--- [Temperature]
--- [Power]
- [Model]
--- [LMT-Models]
--- [P-Models]
- [Repository]
--- [lmt]
--- [PowerMonitoring]
In my Configuration package I have two persistence classes, one for LMT and one for Power, both handling the database connections for the respective repositories. The LMT one is the primary one, the Power one the secondary (or rather the non-primary):
Primary data source:
@Configuration
@EnableJpaRepositories(
    basePackages = "com.delta.Monitoring.Repository.lmt",
    entityManagerFactoryRef = "entityManagerFactory",
    transactionManagerRef = "lmtTransactionManager"
)
public class LmtPersistenceConfiguration {

    @Bean(name = "lmtDataSourceProperties")
    @ConfigurationProperties("spring.datasource")
    @Primary
    public DataSourceProperties lmtDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean(name = "lmtDataSource")
    @ConfigurationProperties("spring.datasource.configuration")
    @Primary
    public DataSource lmtDataSource() {
        return lmtDataSourceProperties().initializeDataSourceBuilder()
                .type(BasicDataSource.class).build();
    }

    @Bean(name = "entityManagerFactory")
    @Primary
    public LocalContainerEntityManagerFactoryBean entityManagerFactory(
            EntityManagerFactoryBuilder builder) {
        return builder
                .dataSource(lmtDataSource())
                .packages(new String[]{"com.delta.Monitoring.Model.lmt"})
                .build();
    }

    @Bean(name = "lmtTransactionManager")
    @Primary
    public PlatformTransactionManager lmtTransactionManager(
            final @Qualifier("entityManagerFactory") LocalContainerEntityManagerFactoryBean lmtEntityManagerFactory) {
        return new JpaTransactionManager(lmtEntityManagerFactory.getObject());
    }
}
Secondary data source:
@Configuration
@EnableJpaRepositories(
    basePackages = "com.delta.Monitoring.Repository.PowerMonitoring",
    entityManagerFactoryRef = "entityManagerFactorySecondary",
    transactionManagerRef = "powerTransactionManager"
)
public class PowerPersistenceConfiguration {

    @Bean(name = "powerDataSourceProperties")
    @ConfigurationProperties("spring.powerdatasource")
    public DataSourceProperties powerDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean(name = "powerDataSource")
    @ConfigurationProperties("spring.powerdatasource.configuration")
    public DataSource powerDataSource() {
        return powerDataSourceProperties().initializeDataSourceBuilder()
                .type(BasicDataSource.class).build();
    }

    @Bean(name = "entityManagerFactorySecondary")
    public LocalContainerEntityManagerFactoryBean entityManagerFactory(
            EntityManagerFactoryBuilder builder) {
        return builder
                .dataSource(powerDataSource())
                .packages(new String[]{"com.delta.Monitoring.Model.PowerMonitoring"})
                .build();
    }

    @Bean(name = "powerTransactionManager")
    public PlatformTransactionManager powerTransactionManager(
            final @Qualifier("entityManagerFactory") LocalContainerEntityManagerFactoryBean powerEntityManagerFactory) {
        return new JpaTransactionManager(powerEntityManagerFactory.getObject());
    }
}
This is what one of the Power repositories looks like. Since I'm not doing anything fancy with it (yet), I just extended JpaRepository:
@Transactional("powerTransactionManager")
public interface PowerDaysRepository extends JpaRepository<PowerDay, PowerDayId> {
}
And finally, the controller, where I always get the exception:
@RestController
@RequestMapping("power")
public class PowerController {
    private static final Logger myLog = LogManager.getLogger(PowerController.class);

    @Autowired
    PowerDayDetailsRepository powerDayDetailsRepository;

    @PostMapping("/powerdaydetails")
    public boolean insertPowerDayDetails(@RequestBody List<PowerDayDetails> powerDayDetails) {
        myLog.info("POST /power/powerdaydetails");
        powerDayDetailsRepository.deleteAll();
        //List<PowerDayDetails> lines = powerDayDetailsRepository.saveAllAndFlush(powerDayDetails);
        List<PowerDayDetails> lines = powerDayDetailsRepository.saveAll(powerDayDetails);
        myLog.info("Update size: " + lines.size());
        myLog.info("Received data: " + powerDayDetails.size());
        return lines.size() == powerDayDetails.size();
    }
}
When I call the /powerdaydetails-endpoint I wanted to save the respective data in the database. First delete all and then save the newly received data.
When I use the saveAll() method, I am not getting the exception; however, nothing is stored in the database either. This is the log:
2022-10-20 10:00:12.067 INFO 22842 --- [http-nio-8321-exec-88] c.m.H.Controller.Power.PowerController : POST /power/powerdaydetails
2022-10-20 10:00:12.639 INFO 22842 --- [http-nio-8321-exec-88] c.m.H.Controller.Power.PowerController : Update size: 582
2022-10-20 10:00:12.639 INFO 22842 --- [http-nio-8321-exec-88] c.m.H.Controller.Power.PowerController : Received data: 582
When I use the saveAllAndFlush() method, the above-mentioned exception occurs and - needless to say - also nothing gets stored in the database.
I've read a lot about the #Transactional and also the #Modifying topic, but that also never solved my problem. The primary data source and the respective repositories and controllers work perfectly and save all the data directly. It just seems that I have made a mistake with the second data source or haven't told Spring yet where/how to find a transaction for this.
It's quite simple and was in front of my eyes the whole time:
In the secondary data source configuration I named the entity manager factory bean "entityManagerFactorySecondary", but in the transaction manager method I referred to "entityManagerFactory".
This is how they must look:
    return builder
            .dataSource(powerDataSource())
            .packages(new String[]{"com.delta.Monitoring.Model.PowerMonitoring"})
            .build();
}

@Bean(name = "powerTransactionManager")
public PlatformTransactionManager powerTransactionManager(
        final @Qualifier("entityManagerFactorySecondary") LocalContainerEntityManagerFactoryBean powerEntityManagerFactory) {
    return new JpaTransactionManager(powerEntityManagerFactory.getObject());
}
I have the following question regarding the structure of my application architecture.
Assume my application consists of the following components
First of all a #ConfigurationProperties, which initializes my needed properties (here only a type to select a provider).
In addition, a bean of the type "Provider" is to be registered at this point depending on the specified type. This type is implemented as an interface and has two concrete implementations in this example (ProviderImplA & ProviderImplB).
@Configuration
@Profile("provider")
@ConfigurationProperties(prefix = "provider")
@ConditionalOnProperty(prefix = "provider", name = ["type"])
class ProviderConfiguration {
    lateinit var type: String

    @Bean(name = ["provider"])
    fun provider(): Provider {
        return when (type) {
            "providerA" -> ProviderImplA()
            "providerB" -> ProviderImplB()
            // when used as an expression must be exhaustive
            else -> error("Unknown provider type: $type")
        }
    }
}
Next, only the two concrete implementation of the interface.
class ProviderImplA : Provider {
    @Autowired
    lateinit var serviceA: ServiceA
}

class ProviderImplB : Provider {
    @Autowired
    lateinit var serviceA: ServiceA

    @Autowired
    lateinit var serviceB: ServiceB

    @Autowired
    lateinit var serviceC: ServiceC
}
And last but not least the interface itself.
interface Provider {
    fun doSomething()
}
Now to the actual problem or better my question:
Because my concrete implementations (ProviderImplA and ProviderImplB) are not valid Spring beans (missing annotations such as @Component), but they have to use their own @Service components, it is not possible to use @Autowired at this point. I would like to avoid different profiles/configurations if possible, hence the initialization by property. How can I still use the individual @Services within my implementations and still create the provider beans manually, depending on the configuration (only one provider exists at runtime)? Maybe you have other suggestions or improvements?
When you instantiate objects directly, Spring cannot control their life cycle, so you must create a @Bean for each Provider:
@Bean(name = ["providerA"])
@Lazy
fun providerA(): Provider {
    return ProviderImplA()
}

@Bean(name = ["providerB"])
@Lazy
fun providerB(): Provider {
    return ProviderImplB()
}
IdentityProviderConfiguration class:
class IdentityProviderConfiguration {
    var context: ApplicationContext
    ...
    fun provider(): Provider {
        return when (type) {
            "providerA" -> context.getBean("providerA", Provider::class.java)
            "providerB" -> context.getBean("providerB", Provider::class.java)
            else -> error("Unknown provider type: $type")
        }
    }
}
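Framework aside, what the configuration above implements is a keyed lookup with a fail-fast default. A minimal self-contained sketch of the same pattern (all names here are illustrative, not from the original code; in Spring, the registry entries would be the lazily-created beans):

```java
import java.util.Map;
import java.util.function.Supplier;

public class ProviderSelection {
    interface Provider { String doSomething(); }

    static class ProviderImplA implements Provider {
        public String doSomething() { return "A"; }
    }

    static class ProviderImplB implements Provider {
        public String doSomething() { return "B"; }
    }

    // Registry of lazily-constructed providers, keyed by the configured type
    // (plays the role of the @Lazy @Bean definitions plus getBean()).
    static final Map<String, Supplier<Provider>> REGISTRY = Map.of(
            "providerA", ProviderImplA::new,
            "providerB", ProviderImplB::new
    );

    static Provider select(String type) {
        Supplier<Provider> factory = REGISTRY.get(type);
        if (factory == null) {
            throw new IllegalArgumentException("Unknown provider type: " + type);
        }
        return factory.get();
    }

    public static void main(String[] args) {
        System.out.println(select("providerA").doSomething()); // prints "A"
    }
}
```

Because only the selected supplier is invoked, the non-selected implementation is never constructed, which matches the "only one provider exists at runtime" requirement.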
I have a Spring Boot project that is going to be using Quartz to manage the running of some scripts. The project layout is as follows:
scheduler
|
|__scheduler-api
| |
| |__quartz-bean
|
|__scheduler-composition
|
|__service-to-schedule-quartz-jobs-using-quartz-bean
The api module is a Spring Boot application where the quartz bean lives. The composition module is where my services live that will be used to add jobs and triggers to Quartz. The problem I am running into is that my Quartz bean is not accessible from the composition module, therefore I am not able to schedule jobs in my service like I'd want to. My Quartz bean is defined as follows:
@Configuration
class QuartzScheduler {
    @Autowired
    private val applicationContext: ApplicationContext? = null

    @Autowired
    private val databaseConfiguration: DatabaseConfiguration? = null

    @Bean
    fun springBeanJobFactory(): SpringBeanJobFactory {
        val jobFactory = AutoWiringSpringBeanJobFactory()
        jobFactory.setApplicationContext(applicationContext!!)
        return jobFactory
    }

    @Bean
    @Throws(SchedulerException::class)
    fun scheduler(@Qualifier("schedulerFactoryBean") factory: SchedulerFactoryBean): Scheduler {
        val scheduler = factory.scheduler
        scheduler.start()
        return scheduler
    }

    @Bean
    @Throws(IOException::class)
    fun schedulerFactoryBean(): SchedulerFactoryBean {
        val factory = SchedulerFactoryBean()
        factory.setDataSource(databaseConfiguration!!.dataSource())
        factory.setJobFactory(springBeanJobFactory())
        factory.setQuartzProperties(quartzProperties())
        return factory
    }

    @Throws(IOException::class)
    fun quartzProperties(): Properties {
        val propertiesFactoryBean = PropertiesFactoryBean()
        propertiesFactoryBean.setLocation(ClassPathResource("/quartz.properties"))
        propertiesFactoryBean.afterPropertiesSet()
        return propertiesFactoryBean.getObject()!!
    }
}
A couple of things I've tried include moving the Quartz bean to the composition module, but then it doesn't have access to the database configuration it needs. I also tried importing the api module into the composition module, but that created a circular dependency. Can someone help me access the Quartz bean from my composition module? I'm new to Spring Boot, so I am not really sure where I am going wrong or what my options even are. Thanks!
Edit
My service looks like this:
class QuartzService {
    @Autowired
    private var quartzScheduler: QuartzScheduler? = null

    fun upsertJob(job: JobEntity) {
        var jobExists = quartzScheduler!!.scheduler().checkExists(JobKey.jobKey(job.id.toString()))
        if (!jobExists) {
            quartzScheduler!!.scheduler().addJob(
                newJob().ofType(EnqueueJob::class.java).storeDurably().withIdentity(JobKey.jobKey(job.id.toString())).build(),
                true
            )
        }
    }
}
The error that appears is that the type QuartzScheduler cannot be found (my QuartzScheduler class from scheduler-api)
I had a couple of problems going on. First, my Quartz service was autowiring the scheduler improperly. I changed it to this:
class QuartzService {
    @Autowired
    private lateinit var scheduler: Scheduler

    fun upsertJob(job: JobEntity) {
        val jobExists = scheduler.checkExists(JobKey.jobKey(job.id.toString()))
        if (!jobExists) {
            scheduler.addJob(
                newJob().ofType(EnqueueJob::class.java).storeDurably().withIdentity(JobKey.jobKey(job.id.toString())).build(),
                true
            )
        }
    }
}
Next, I had to change the class that was using the Quartz service to auto wire the service, I accidentally just instantiated it as a normal object:
@Autowired
private lateinit var quartzService: QuartzService
Thanks everyone for the help!
I am having issues setting up ACL through Java config in a Spring Boot application. I have created one small project to reproduce the issues.
I have tried a few different approaches. First issue I had was with EhCache, and after I fixed that (I assume I did) I couldn't login any more, and it looks like all the data is gone.
There are 4 classes with different configurations:
ACLConfig1.class
ACLConfig2.class
ACLConfig3.class
ACLConfig4.class
All #PreAuthorize and #PostAuthorize annotations are working as expected, except hasPermission.
The controller holds 4 endpoints: one for User, one for Admin, one public, and the last one, which gives me a headache: @PostAuthorize("hasPermission(returnObject, 'administration')")
I am pretty sure that inserts in DB are correct. This class is one of four, the last one that I have tried:
@Configuration
@EnableGlobalMethodSecurity(prePostEnabled = true, securedEnabled = true)
public class ACLConfig4 {
    @Autowired
    DataSource dataSource;

    @Bean
    public EhCacheBasedAclCache aclCache() {
        return new EhCacheBasedAclCache(aclEhCacheFactoryBean().getObject(), permissionGrantingStrategy(), aclAuthorizationStrategy());
    }

    @Bean
    public EhCacheFactoryBean aclEhCacheFactoryBean() {
        EhCacheFactoryBean ehCacheFactoryBean = new EhCacheFactoryBean();
        ehCacheFactoryBean.setCacheManager(aclCacheManager().getObject());
        ehCacheFactoryBean.setCacheName("aclCache");
        return ehCacheFactoryBean;
    }

    @Bean
    public EhCacheManagerFactoryBean aclCacheManager() {
        return new EhCacheManagerFactoryBean();
    }

    @Bean
    public DefaultPermissionGrantingStrategy permissionGrantingStrategy() {
        ConsoleAuditLogger consoleAuditLogger = new ConsoleAuditLogger();
        return new DefaultPermissionGrantingStrategy(consoleAuditLogger);
    }

    @Bean
    public AclAuthorizationStrategy aclAuthorizationStrategy() {
        return new AclAuthorizationStrategyImpl(new SimpleGrantedAuthority("ROLE_ADMINISTRATOR"));
    }

    @Bean
    public LookupStrategy lookupStrategy() {
        return new BasicLookupStrategy(dataSource, aclCache(), aclAuthorizationStrategy(), new ConsoleAuditLogger());
    }

    @Bean
    public JdbcMutableAclService aclService() {
        JdbcMutableAclService service = new JdbcMutableAclService(dataSource, lookupStrategy(), aclCache());
        return service;
    }

    @Bean
    public DefaultMethodSecurityExpressionHandler defaultMethodSecurityExpressionHandler() {
        return new DefaultMethodSecurityExpressionHandler();
    }

    @Bean
    public MethodSecurityExpressionHandler createExpressionHandler() {
        DefaultMethodSecurityExpressionHandler expressionHandler = defaultMethodSecurityExpressionHandler();
        expressionHandler.setPermissionEvaluator(new AclPermissionEvaluator(aclService()));
        expressionHandler.setPermissionCacheOptimizer(new AclPermissionCacheOptimizer(aclService()));
        return expressionHandler;
    }
}
What am I missing here? Why do I have no data if I use ACLConfig3.class or ACLConfig4.class? Is there any example of how this should be configured programmatically in Spring Boot?
The reason why you have no data was a bit tricky to find out. As soon as you define a MethodSecurityExpressionHandler bean in your config, there is no data in the database tables. This is because your data.sql file isn't executed.
Before explaining why data.sql isn't executed I'd first like to point out that you're not using the file as intended.
data.sql is executed by spring-boot after Hibernate has been initialized and normally only contains DML statements. Your data.sql contains both DDL (schema) statements and DML (data) statements. This isn't ideal, as some of your DDL statements clash with Hibernate's hibernate.hbm2ddl.auto behaviour (note that spring-boot uses 'create-drop' when an embedded DataSource is being used). You should put your DDL statements in schema.sql and your DML statements in data.sql. As you're manually defining all tables, you should disable hibernate.hbm2ddl.auto (by adding spring.jpa.hibernate.ddl-auto=none to application.properties).
That being said, let's take a look at why data.sql isn't executed.
The execution of data.sql is triggered via an ApplicationEvent that's fired via a BeanPostProcessor. This BeanPostProcessor (DataSourceInitializedPublisher) is created as a part of spring-boot's Hibernate/JPA auto configuration (see org.springframework.boot.autoconfigure.orm.jpa.HibernateJpaAutoConfiguration, org.springframework.boot.autoconfigure.orm.jpa.DataSourceInitializedPublisher and org.springframework.boot.autoconfigure.jdbc.DataSourceInitializer).
Normally the DataSourceInitializedPublisher is created before the (embedded) DataSource is created and everything works as expected but by defining a custom MethodSecurityExpressionHandler the normal bean creation order alters.
As you've configured @EnableGlobalMethodSecurity, you're automatically importing GlobalMethodSecurityConfiguration.
spring-security related beans are created early on. As your MethodSecurityExpressionHandler requires a DataSource for the ACL stuff and the spring-security related beans require your custom MethodSecurityExpressionHandler, the DataSource is created earlier than usual; in fact it's created so early on that spring-boot's DataSourceInitializedPublisher isn't created yet.
The DataSourceInitializedPublisher is created later on but as it didn't notice the creation of a DataSource bean, it also doesn't trigger the execution of data.sql.
So long story short: the security configuration alters the normal bean creation order which results in data.sql not being loaded.
I guess that fixing the bean creation order would do the trick, but as I don't know how (without further experimentation), I propose the following solution: manually define your DataSource and take care of data initialization yourself.
@Configuration
public class DataSourceConfig {
    @Bean
    public EmbeddedDatabase dataSource() {
        return new EmbeddedDatabaseBuilder().setType(EmbeddedDatabaseType.H2)
                // as your data.sql file contains both DDL & DML you might want to rename it (e.g. init.sql)
                .addScript("classpath:/data.sql")
                .build();
    }
}
As your data.sql file contains all the DDL required by your application, you can disable hibernate.hbm2ddl.auto. Add spring.jpa.hibernate.ddl-auto=none to application.properties.
When you define your own DataSource, spring-boot's DataSourceAutoConfiguration normally backs out, but if you want to be sure you can also exclude it (optional).
@SpringBootConfiguration
@EnableAutoConfiguration(exclude = DataSourceAutoConfiguration.class)
@ComponentScan
@EnableCaching
public class Application {
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}
This should fix your 'no data' problem. But in order to get everything working as expected you need to make 2 more modifications.
First of all, you should only define one MethodSecurityExpressionHandler bean. Currently you're defining two MethodSecurityExpressionHandler beans. Spring-security won't know which one to use and will (silently) use its own internal MethodSecurityExpressionHandler instead. See org.springframework.security.config.annotation.method.configuration.GlobalMethodSecurityConfiguration#setMethodSecurityExpressionHandler.
@Configuration
@EnableGlobalMethodSecurity(prePostEnabled = true, securedEnabled = true)
public class MyACLConfig {
    //...
    @Bean
    public MethodSecurityExpressionHandler createExpressionHandler() {
        DefaultMethodSecurityExpressionHandler securityExpressionHandler = new DefaultMethodSecurityExpressionHandler();
        securityExpressionHandler.setPermissionEvaluator(new AclPermissionEvaluator(aclService()));
        securityExpressionHandler.setPermissionCacheOptimizer(new AclPermissionCacheOptimizer(aclService()));
        return securityExpressionHandler;
    }
}
The last thing you need to do is make the getId() method in Car public.
@Entity
public class Car {
    //...
    public long getId() {
        return id;
    }
    //...
}
The standard ObjectIdentityRetrievalStrategy will look for a public method 'getId()' when trying to determine an object's identity during ACL permission evaluation.
(Note that I've based my answer upon ACLConfig4.)
I am currently trying to get a Hibernate Session Factory created when using an AbstractRoutingDataSource, but when it goes through the bean initialization process, it tries to determine a lookup key. Because I didn't set a default, it'll be null. I don't want to have to set a default - I'd rather delay this until I need to create a session and make an actual query.
I did find other people having the same problem. Here is an old archived post from 2005 that describes the exact same problem I am having. Unfortunately, there wasn't really an answer to it:
http://forum.spring.io/forum/spring-projects/data/108464-abstractroutingdatasource-not-routing-when-used-with-hibernate-sample-attached
If I set a default value, everything will load "fine" - but then changing the thread local value that the routing datasource depends on has zero effect on what database is used - it seems 'locked' in that point.
Any ideas?
AbstractRoutingDataSource implements the DataSource interface, so it can stand in for a data source that has to be configured at startup.
You can load the data source list to a DataSourceProperties component:
@Component
@ConfigurationProperties(prefix = "tenants")
public class DataSourceProperties {
    private Map<Object, Object> datasources = new LinkedHashMap<>();

    public Map<Object, Object> getDatasources() {
        return datasources;
    }

    public void setDatasources(Map<String, Map<String, String>> datasources) {
        datasources
            .forEach((key, value) -> this.datasources.put(key, convert(value)));
    }

    private DataSource convert(Map<String, String> source) {
        return DataSourceBuilder.create()
                .url(source.get("jdbcUrl"))
                .driverClassName(source.get("driverClassName"))
                .username(source.get("username"))
                .password(source.get("password"))
                .build();
    }
}
Create AbstractRoutingDataSource:
public class TenantAwareRoutingDataSource extends AbstractRoutingDataSource {
    @Override
    protected Object determineCurrentLookupKey() {
        return ThreadLocalStorage.getTenantName();
    }
}
And configure your datasource to be AbstractRoutingDataSource:
@Configuration
public class DataSourceConfig {
    private final DataSourceProperties dataSourceProperties;

    public DataSourceConfig(DataSourceProperties dataSourceProperties) {
        this.dataSourceProperties = dataSourceProperties;
    }

    @Bean
    public DataSource getDataSource() {
        TenantAwareRoutingDataSource tenantAwareRoutingDataSource = new TenantAwareRoutingDataSource();
        tenantAwareRoutingDataSource.setTargetDataSources(dataSourceProperties.getDatasources());
        tenantAwareRoutingDataSource.afterPropertiesSet();
        return tenantAwareRoutingDataSource;
    }
}
You should also implement the ThreadLocalStorage to store the tenant identifier and let the AbstractRoutingDataSource retrieve it to determine which data source to use.
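A minimal ThreadLocalStorage along those lines could look like this (a sketch only: the class name and getTenantName() match the snippet above, the rest is an assumed implementation):

```java
public final class ThreadLocalStorage {

    private static final ThreadLocal<String> TENANT = new ThreadLocal<>();

    private ThreadLocalStorage() {
        // static utility, no instances
    }

    public static void setTenantName(String tenantName) {
        TENANT.set(tenantName);
    }

    // Called by TenantAwareRoutingDataSource.determineCurrentLookupKey();
    // returns null when no tenant is bound to the current thread.
    public static String getTenantName() {
        return TENANT.get();
    }

    // Call in a finally block (e.g. from a servlet filter) so pooled
    // request threads don't leak the tenant into later requests.
    public static void clear() {
        TENANT.remove();
    }
}
```

Typically a servlet filter or interceptor sets the tenant at the start of each request (from a header or the authenticated user) and clears it in a finally block.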
To prevent Spring from auto-configuring the datasource at startup:
spring:
  jpa:
    open-in-view: false # get rid of the OIV warning
    show-sql: true
    database: postgresql # do not auto-detect the database
    hibernate:
      ddl-auto: none # prevent Hibernate from making automatic changes to the DDL schema
    properties:
      hibernate:
        dialect: org.hibernate.dialect.PostgreSQLDialect
  datasource:
    initialization-mode: never # prevent JPA from trying to initialize