jBPM and JPA Transaction Manager: no local transaction to join - java

I'm having a really hard time setting up a jBPM7 project. I'm trying to use the jBPM engine with an H2 in-memory database to make use of the Human Task Service. I've set up my data source, entity manager factory and transaction manager as below:
application.properties
...
jbpm.datasource.jdbc-url=jdbc:h2:mem:testdb
jbpm.datasource.username=sa
jbpm.datasource.password=
...
JbpmDataConfiguration.java
@Configuration
@EnableTransactionManagement
public class JbpmDataConfiguration {

    @Bean(name = "jbpmDataSource")
    @ConfigurationProperties(prefix = "jbpm.datasource")
    public DataSource dataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "jbpmEntityManagerFactory")
    public LocalContainerEntityManagerFactoryBean jbpmEntityManagerFactory(EntityManagerFactoryBuilder builder, @Qualifier("jbpmDataSource") DataSource dataSource) {
        Map<String, String> properties = new HashMap<>();
        properties.put("hibernate.hbm2ddl.auto", "create");
        return builder.dataSource(dataSource)
                .mappingResources("META-INF/Taskorm.xml")
                .packages("org.jbpm.services.task.impl.model")
                .persistenceUnit("jbpm-persistence-unit")
                .properties(properties)
                .build();
    }

    @Bean(name = "jbpmTransactionManager")
    public JpaTransactionManager jbpmTransactionManager(@Qualifier("jbpmEntityManagerFactory") EntityManagerFactory entityManagerFactory) {
        JpaTransactionManager jpaTransactionManager = new JpaTransactionManager();
        jpaTransactionManager.setEntityManagerFactory(entityManagerFactory);
        return jpaTransactionManager;
    }
}
In my resources folder I have a kmodule.xml in resources/META-INF/. All my .bpmn files are in resources/com/mydomain/flow/
<kmodule xmlns="http://jboss.org/kie/6.0.0/kmodule">
    <kbase name="kbase" packages="com.mydomain.flow" />
</kmodule>
I am then creating a runtime engine using the following:
@Configuration
public class MyRuntimeEngine {

    @Autowired
    @Qualifier("jbpmEntityManagerFactory")
    private EntityManagerFactory entityManagerFactory;

    @Autowired
    @Qualifier("jbpmTransactionManager")
    private JpaTransactionManager transactionManager;

    @Bean
    public RuntimeManager runtimeManager() {
        KieServices kieServices = KieServices.Factory.get();
        KieContainer kieContainer = kieServices.getKieClasspathContainer();
        KieBase kieBase = kieContainer.getKieBase("kbase");
        RuntimeEnvironmentBuilder runtimeEnvironmentBuilder = RuntimeEnvironmentBuilder.Factory.get()
                .newDefaultInMemoryBuilder()
                .entityManagerFactory(entityManagerFactory)
                .addEnvironmentEntry(EnvironmentName.TRANSACTION_MANAGER, transactionManager);
        RuntimeEnvironment runtimeEnvironment = runtimeEnvironmentBuilder.knowledgeBase(kieBase).get();
        return RuntimeManagerFactory.Factory.get().newSingletonRuntimeManager(runtimeEnvironment);
    }
}
I am autowiring the RuntimeManager bean into another configuration class so I can configure global variables and work item handlers. This then exposes a KieSession bean once it's done. To then start a process, I am autowiring the KieSession into one of my controllers and calling startProcess:
kieSession.startProcess(processName, processVariables);
which results in the following error:
javax.persistence.TransactionRequiredException: No local transaction to join
at org.springframework.orm.jpa.ExtendedEntityManagerCreator$ExtendedEntityManagerInvocationHandler.doJoinTransaction(ExtendedEntityManagerCreator.java:391) ~[spring-orm-5.1.6.RELEASE.jar:5.1.6.RELEASE]
at org.springframework.orm.jpa.ExtendedEntityManagerCreator$ExtendedEntityManagerInvocationHandler.invoke(ExtendedEntityManagerCreator.java:333) ~[spring-orm-5.1.6.RELEASE.jar:5.1.6.RELEASE]
at com.sun.proxy.$Proxy160.joinTransaction(Unknown Source) ~[na:na]
at org.jbpm.process.audit.JPAWorkingMemoryDbLogger.joinTransaction(JPAWorkingMemoryDbLogger.java:318) ~[jbpm-audit-7.22.0.Final.jar:7.22.0.Final]
at org.jbpm.process.audit.JPAWorkingMemoryDbLogger.persist(JPAWorkingMemoryDbLogger.java:246) ~[jbpm-audit-7.22.0.Final.jar:7.22.0.Final]
at org.jbpm.process.audit.JPAWorkingMemoryDbLogger.afterVariableChanged(JPAWorkingMemoryDbLogger.java:133) ~[jbpm-audit-7.22.0.Final.jar:7.22.0.Final]
at org.drools.core.event.ProcessEventSupport.fireAfterVariableChanged(ProcessEventSupport.java:155) ~[drools-core-7.22.0.Final.jar:7.22.0.Final]
at org.jbpm.process.instance.context.variable.VariableScopeInstance.setVariable(VariableScopeInstance.java:114) ~[jbpm-flow-7.22.0.Final.jar:7.22.0.Final]
at org.jbpm.process.instance.AbstractProcessInstanceFactory.createProcessInstance(AbstractProcessInstanceFactory.java:59) ~[jbpm-flow-7.22.0.Final.jar:7.22.0.Final]
at org.jbpm.process.instance.ProcessRuntimeImpl.startProcess(ProcessRuntimeImpl.java:260) ~[jbpm-flow-7.22.0.Final.jar:7.22.0.Final]
at org.jbpm.process.instance.ProcessRuntimeImpl.createProcessInstance(ProcessRuntimeImpl.java:242) ~[jbpm-flow-7.22.0.Final.jar:7.22.0.Final]
at org.jbpm.process.instance.ProcessRuntimeImpl.createProcessInstance(ProcessRuntimeImpl.java:200) ~[jbpm-flow-7.22.0.Final.jar:7.22.0.Final]
at org.jbpm.process.instance.ProcessRuntimeImpl.startProcess(ProcessRuntimeImpl.java:190) ~[jbpm-flow-7.22.0.Final.jar:7.22.0.Final]
at org.jbpm.process.instance.ProcessRuntimeImpl.startProcess(ProcessRuntimeImpl.java:185) ~[jbpm-flow-7.22.0.Final.jar:7.22.0.Final]
...
How do I properly configure the transaction manager to work with jBPM?

I am also having the same trouble setting up jBPM 7.x. Were you able to get jBPM running with the JPA transaction manager? From the official documentation it seems that we need an external transaction manager implementation such as Narayana or Bitronix if jBPM is to be used in embedded mode.
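If the external JTA route is what you end up needing, a minimal sketch of exposing Narayana's standalone JTA implementation through Spring's JtaTransactionManager could look like the following (the Narayana artifacts on the classpath, the bean name, and switching the persistence unit to JTA are all assumptions here, not a verified jBPM recipe):

import javax.transaction.TransactionManager;
import javax.transaction.UserTransaction;
import org.springframework.context.annotation.Bean;
import org.springframework.transaction.jta.JtaTransactionManager;

// Sketch only: wraps Narayana's standalone JTA transaction manager so that
// both jBPM and the JPA EntityManager can enlist in the same JTA transaction.
@Bean(name = "jbpmTransactionManager")
public JtaTransactionManager jbpmTransactionManager() {
    UserTransaction userTransaction = com.arjuna.ats.jta.UserTransaction.userTransaction();
    TransactionManager transactionManager = com.arjuna.ats.jta.TransactionManager.transactionManager();
    return new JtaTransactionManager(userTransaction, transactionManager);
}

The persistence unit would then also have to run in JTA mode (a JTA data source and transaction-type="JTA") rather than RESOURCE_LOCAL, which is presumably why the documentation points at Narayana or Bitronix for embedded use.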

Related

Prevent changing schema in Spring Boot with two DBs

I have two DBs in my Spring Boot app, one configured with application.properties:
spring.secondDatasource.url=jdbc:mysql://10.10.10:3306/db1
spring.secondDatasource.username=user
spring.secondDatasource.password=pass
spring.jpa.database-platform = org.hibernate.dialect.MySQL5Dialect
spring.datasource.driverClassName=com.mysql.jdbc.Driver
spring.jpa.hibernate.ddl-auto = update
and another via DataSource:
@Configuration
public class SecondDbConnectionConfig {

    @Bean
    public DriverManagerDataSource dataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setUrl("jdbc:mysql://20.20.20:3306/db2");
        dataSource.setDriverClassName("com.mysql.jdbc.Driver");
        dataSource.setUsername("user");
        dataSource.setPassword("pass");
        return dataSource;
    }
}
When I run the app, both schemas are updated with my domain model.
I want the schema update to happen only for the DB configured in application.properties.
For the DB configured via the DataSource bean, I don't want to make any changes, just read.
How can I fix this configuration?
Don't use DriverManagerDataSource in production; it is only suitable for a testing environment (it has no connection pool and just creates new connections on the fly).
Your datasource uses the JpaProperties config from your spring.jpa.* properties, so you need to override it:
#Bean(name = "your-entity-manager-factory")
public LocalContainerEntityManagerFactoryBean getEntityManagerFactory(EntityManagerFactoryBuilder builder,
#Qualifier("testdatasource") DataSource dataSource, // Here you must annotate your DriverManagerDataSource bean with #Bean("testdatasource")
JpaProperties jpaProperties) {
// Here you clone and modify JpaProperties
Map<String, String> hibernateConfig = jpaProperties.getHibernateProperties(dataSource);
hibernateConfig.remove("hibernate.hbm2ddl.auto");
return builder
.dataSource(dataSource)
.persistenceUnit("testPu")
.properties(hibernateConfig)
.build();
}
You may also need to configure an additional transaction manager.
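For example, a rough sketch of that additional transaction manager, bound to the entity manager factory above (the bean names are only illustrative):

@Bean(name = "your-transaction-manager")
public PlatformTransactionManager yourTransactionManager(
        @Qualifier("your-entity-manager-factory") EntityManagerFactory entityManagerFactory) {
    // A dedicated JpaTransactionManager for the second persistence unit, so it
    // does not interfere with the auto-configured default transaction manager.
    return new JpaTransactionManager(entityManagerFactory);
}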

Spring Boot Batch With Spring Data Write meta data in different Schema (in memory: HSQL or H2)

I am writing a batch using the following technologies:
Spring Boot to run the application: V1.5.3.RELEASE
Spring Batch with Spring Batch config: spring-batch-infrastructure V3.0.7.RELEASE
Spring Data for my generic DAO to the business database: spring-data-jpa V1.11.3.RELEASE
My datasource to the Oracle database is a HikariDataSource:
#Qualifier("dataSource")
#Bean(destroyMethod = "close")
#Primary
public HikariDataSource dataSource() throws SQLException {
return buildDataSource();
}
public HikariDataSource buildDataSource() throws SQLException {
HikariDataSource ds = new HikariDataSource();
ds.setMaximumPoolSize(poolSize);
ds.setDriverClassName(driverClassName);
ds.setJdbcUrl(jdbcUrl);
ds.setUsername(userName);
ds.setPassword(password);
ds.setConnectionTestQuery("SELECT 1 from DUAL");
ds.addDataSourceProperty("hibernate.show_sql", showSQL);
ds.addDataSourceProperty("hibernate.use_sql_comments", useSQLComment);
ds.addDataSourceProperty("hibernate.format_sql", formatSQL);
ds.addDataSourceProperty("hibernate.ddl-auto", "none");
return ds;
}
I want to write my metadata to another database (in-memory HSQL or H2, for example) but I can't find a way, because the context writes the metadata to the same database.
The only way I found is to define a TransactionManager and an EntityManager and enable them for my DAO:
@Bean
PlatformTransactionManager businessTransactionManager() throws SQLException {
    return new JpaTransactionManager(businessEntityManagerFactory().getObject());
}

@Bean
LocalContainerEntityManagerFactoryBean businessEntityManagerFactory() throws SQLException {
    HibernateJpaVendorAdapter jpaVendorAdapter = new HibernateJpaVendorAdapter();
    jpaVendorAdapter.setGenerateDdl(true);
    LocalContainerEntityManagerFactoryBean factoryBean = new LocalContainerEntityManagerFactoryBean();
    factoryBean.setDataSource(dataSource());
    factoryBean.setJpaVendorAdapter(jpaVendorAdapter);
    factoryBean.setPackagesToScan("package.of.business.model", "package.of.business.data");
    return factoryBean;
}
and in my batch configuration I add:
@EnableJpaRepositories(entityManagerFactoryRef = "businessEntityManagerFactory",
        transactionManagerRef = "businessTransactionManager",
        basePackages = {"package.of.business.model", "package.of.business.data"})
This way it works after I define the Spring default datasource in my application.properties:
spring.datasource.url=jdbc:h2:mem:test
spring.datasource.driverClassName=org.h2.Driver
spring.datasource.username=sa
spring.datasource.password=
spring.jpa.hibernate.ddl-auto=update
What I really want to do is the exact opposite: I want the default database to be the business one and I want to override the datasource that writes the metadata, but I can't find a way. I even tried to make a custom BatchConfigurer:
CustomBatchConfigurer extends DefaultBatchConfigurer
It works only for my metadata after I disable the Spring Data initialization for the default datasource, but then it doesn't write anything to my Oracle business database:
batch.data.source.init=false
spring.batch.initializer.enabled=false
spring.batch.initialize.enabled=false
spring.datasource.initialize=false
spring.datasource.continue-on-error=true
Does anyone have any idea how this could be done?
You'll need to create a custom implementation of the BatchConfigurer (typically by extending DefaultBatchConfigurer). That will allow you to configure the batch DataSource explicitly.
You can read more about the BatchConfigurer in the documentation here: http://docs.spring.io/spring-batch/apidocs/org/springframework/batch/core/configuration/annotation/BatchConfigurer.html
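As a rough sketch (the bean qualifier is illustrative and assumes you expose a second, in-memory DataSource for the meta-data), the custom configurer can simply hand that DataSource to the parent class:

@Component
public class CustomBatchConfigurer extends DefaultBatchConfigurer {

    // Spring Batch builds its JobRepository (the meta-data tables) against the
    // DataSource passed to the parent, leaving the business Oracle DataSource alone.
    public CustomBatchConfigurer(@Qualifier("batchMetaDataSource") DataSource batchDataSource) {
        super(batchDataSource);
    }
}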

Certificates does not conform to algorithm constraints when connecting to JDBC MS SQL Server

I'm trying to connect my Spring application to a Microsoft SQL Server database but I am getting the following error:
Request processing failed; nested exception is org.springframework.transaction.CannotCreateTransactionException: Could not open JPA EntityManager for transaction; nested exception is org.hibernate.exception.JDBCConnectionException: Unable to acquire JDBC Connection] with root cause
java.security.cert.CertificateException: Certificates does not conform to algorithm constraints
I have already tried blanking out the JDK's certpath restriction, i.e. jdk.certpath.disabledAlgorithms=
Here is my configuration:
@Configuration
@EnableJpaRepositories("com.abc.cet.eai.repository.sql")
@PropertySource("classpath:eai.application.properties")
public class JpaConfig {

    @Bean
    public DataSource dataSource() {
        SQLServerDataSource dataSource = new SQLServerDataSource();
        dataSource.setServerName("SERVERNAME");
        dataSource.setUser("USERNAME");
        dataSource.setPassword("PASSWORD");
        dataSource.setDatabaseName("DATABASE_NAME");
        return dataSource;
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
        LocalContainerEntityManagerFactoryBean factory = new LocalContainerEntityManagerFactoryBean();
        factory.setPackagesToScan("com.abc.cet.eai.domain");
        factory.setDataSource(dataSource());
        JpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
        factory.setJpaVendorAdapter(vendorAdapter);
        factory.setJpaProperties(additionalProperties());
        return factory;
    }

    @Bean
    public PlatformTransactionManager transactionManager() {
        JpaTransactionManager txManager = new JpaTransactionManager();
        txManager.setEntityManagerFactory(entityManagerFactory().getObject());
        return txManager;
    }

    public Properties additionalProperties() {
        Properties properties = new Properties();
        properties.setProperty("hibernate.dialect", "org.hibernate.dialect.SQLServer2008Dialect");
        return properties;
    }
}
Has anyone solved a similar problem?
You are basically experiencing the same issue as I did in WAGON-470. The certificate was created with MD5, which is rejected by modern Java. You should inspect the certificate itself and the ciphers the server is offering, and enable JSSE debug options. It is likely that you need to update/exchange your certificate for one signed with a more secure algorithm such as SHA-256.
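In case it helps, a small sketch for gathering that information from the JVM side (these are standard JDK security/system properties, not project-specific settings):

import java.security.Security;

public class TlsDebugInfo {
    public static void main(String[] args) {
        // Equivalent to -Djavax.net.debug=ssl,handshake: traces the TLS handshake
        // for connections opened later in this JVM.
        System.setProperty("javax.net.debug", "ssl,handshake");

        // Algorithms the JDK rejects during certificate path validation;
        // MD5-signed certificates are typically blocked by the first list.
        System.out.println(Security.getProperty("jdk.certpath.disabledAlgorithms"));
        System.out.println(Security.getProperty("jdk.tls.disabledAlgorithms"));
    }
}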

Unable to lookup JDBC datasource by JNDI

I am using Spring Data, Hibernate and JBoss 6.x.x.
I would like to look up the JDBC datasource already configured in JBoss 6.x.x via JNDI.
Looking at the file standalone.xml, I have found the following entries:
<profile>
    ...
    <datasources>
        ...
        <datasource enabled="true" jndi-name="name_of_the_ds" pool-name="name_of_the_pool" use-java-context="true">
            <connection-url>connection_uri</connection-url>
            <driver>driver_name</driver>
            <security>
                <user-name>fake_login</user-name>
                <password>fake_password</password>
            </security>
        </datasource>
        ...
    </datasources>
    ...
</profile>
Based on this, I have written the following Spring annotation configuration class for the database (I am using Spring Data and Hibernate as the JPA provider):
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(basePackages = "dao.repository")
@PropertySource("classpath:application.properties")
public class DataBaseContextConfiguration {

    private static final Logger LOGGER = LoggerFactory.getLogger(DataBaseContextConfiguration.class);

    private Database dataBase = Database.SOME_DATA_BASE;

    @Value("${jpa.showSql}")
    private Boolean showSql;

    @Value("${name.data.source}")
    private String dataSourceJndiName;

    @Bean
    public LocalContainerEntityManagerFactoryBean localContainerEntityManagerFactoryBean() throws NamingException {
        final LocalContainerEntityManagerFactoryBean localContainerEntityManagerFactoryBean = new LocalContainerEntityManagerFactoryBean();
        localContainerEntityManagerFactoryBean.setDataSource(dataSource());
        localContainerEntityManagerFactoryBean.setPackagesToScan(PACKAGE_WITH_DB_ENTITIES);
        JpaVendorAdapter jpaVendorAdapter = new HibernateJpaVendorAdapter();
        localContainerEntityManagerFactoryBean.setJpaVendorAdapter(jpaVendorAdapter);
        return localContainerEntityManagerFactoryBean;
    }

    @Bean
    public PlatformTransactionManager platformTransactionManager(EntityManagerFactory entityManagerFactory) {
        JpaTransactionManager jpaTransactionManager = new JpaTransactionManager();
        jpaTransactionManager.setEntityManagerFactory(entityManagerFactory);
        return jpaTransactionManager;
    }

    public DataSource dataSource() throws NamingException {
        DataSource dataSource = null;
        JndiTemplate jndi = new JndiTemplate();
        dataSource = (DataSource) jndi.lookup(dataSourceJndiName);
        return dataSource;
    }
}
The String dataSourceJndiName has the value name_of_the_ds.
When I try to run a simple integration test, the following exception occurs:
Caused by: javax.naming.NoInitialContextException: Need to specify class name in environment or system property, or as an applet parameter, or in an application resource file: java.naming.factory.initial

Jhipster Multi-tenancy with Hibernate Second Level Caching

I've been attempting to turn my JHipster-generated application into a multi-tenancy app, using this blog post as a base: http://jannatconsulting.com/blog/?p=41
I've run into a problem with second-level caching. Spring Boot appears to correctly detect and set up:
DatabaseConfiguration.java
@Configuration
@EnableConfigurationProperties(JpaProperties.class)
@EnableJpaRepositories(
        entityManagerFactoryRef = "masterEntityManager",
        transactionManagerRef = "masterTransactionManager",
        basePackages = {"com.quadrimular.nts.helium.repository.master"})
@EnableJpaAuditing(auditorAwareRef = "springSecurityAuditorAware")
@EnableTransactionManagement
@EnableElasticsearchRepositories("com.quadrimular.nts.helium.repository.search")
public class DatabaseConfiguration {

    @Inject
    private Environment env;

    @Autowired(required = false)
    private MetricRegistry metricRegistry;

    @Inject
    private DataSourceProperties datasourceProperties;

    @Inject
    private JHipsterProperties jhipsterProperties;

    @Inject
    private JpaProperties jpaProperties;

    @Inject
    private DataSource dataSource;

    @Bean(destroyMethod = "close")
    @ConditionalOnExpression("#{!environment.acceptsProfiles('cloud') && !environment.acceptsProfiles('heroku')}")
    public DataSource dataSource(DataSourceProperties dataSourceProperties, JHipsterProperties jHipsterProperties) {
        log.debug("Configuring Master Datasource");
        if (dataSourceProperties.getUrl() == null) {
            log.error("Your database connection pool configuration is incorrect! The application" +
                    " cannot start. Please check your Spring profile, current profiles are: {}",
                    Arrays.toString(env.getActiveProfiles()));
            throw new ApplicationContextException("Database connection pool is not configured correctly");
        }
        HikariConfig config = new HikariConfig();
        config.setDataSourceClassName(dataSourceProperties.getDriverClassName());
        config.addDataSourceProperty("url", dataSourceProperties.getUrl());
        if (dataSourceProperties.getUsername() != null) {
            config.addDataSourceProperty("user", dataSourceProperties.getUsername());
        } else {
            config.addDataSourceProperty("user", ""); // HikariCP doesn't allow null user
        }
        if (dataSourceProperties.getPassword() != null) {
            config.addDataSourceProperty("password", dataSourceProperties.getPassword());
        } else {
            config.addDataSourceProperty("password", ""); // HikariCP doesn't allow null password
        }
        // MySQL optimizations, see https://github.com/brettwooldridge/HikariCP/wiki/MySQL-Configuration
        if ("com.mysql.jdbc.jdbc2.optional.MysqlDataSource".equals(dataSourceProperties.getDriverClassName())) {
            config.addDataSourceProperty("cachePrepStmts", jHipsterProperties.getDatasource().isCachePrepStmts());
            config.addDataSourceProperty("prepStmtCacheSize", jHipsterProperties.getDatasource().getPrepStmtCacheSize());
            config.addDataSourceProperty("prepStmtCacheSqlLimit", jHipsterProperties.getDatasource().getPrepStmtCacheSqlLimit());
        }
        if (metricRegistry != null) {
            config.setMetricRegistry(metricRegistry);
        }
        return new HikariDataSource(config);
    }

    @Bean(name = "masterEntityManager")
    public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
        JpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
        LocalContainerEntityManagerFactoryBean em = new LocalContainerEntityManagerFactoryBean();
        em.setDataSource(dataSource(datasourceProperties, jhipsterProperties));
        em.setPackagesToScan(new String[]{"com.quadrimular.nts.helium.domain.master"});
        em.setJpaVendorAdapter(vendorAdapter);
        em.setJpaProperties(additionalJpaProperties());
        em.setPersistenceUnitName("master");
        return em;
    }

    private Properties additionalJpaProperties() {
        Properties properties = new Properties();
        for (Map.Entry<String, String> entry : jpaProperties.getHibernateProperties(dataSource).entrySet()) {
            properties.setProperty(entry.getKey(), entry.getValue());
        }
        return properties;
    }

    @Bean(name = "masterTransactionManager")
    public JpaTransactionManager transactionManager(EntityManagerFactory masterEntityManager) {
        JpaTransactionManager transactionManager = new JpaTransactionManager();
        transactionManager.setEntityManagerFactory(masterEntityManager);
        return transactionManager;
    }
}
When Spring tries to configure:
MultiTenancyJPAConfiguration.java
@Configuration
@EnableConfigurationProperties(JpaProperties.class)
@EnableJpaRepositories(
        entityManagerFactoryRef = "tenantEntityManager",
        transactionManagerRef = "tenantTransactionManager",
        basePackages = {"com.quadrimular.nts.helium.repository.tenant"})
@EnableTransactionManagement
public class MultiTenancyJpaConfiguration {

    @Bean(name = "tenantEntityManager")
    public LocalContainerEntityManagerFactoryBean entityManagerFactory(DataSource dataSource,
            MultiTenantConnectionProvider connectionProvider,
            CurrentTenantIdentifierResolver tenantResolver) {
        LocalContainerEntityManagerFactoryBean emfBean = new LocalContainerEntityManagerFactoryBean();
        emfBean.setDataSource(dataSource);
        emfBean.setPackagesToScan("com.quadrimular.nts.helium.domain.tenant");
        emfBean.setJpaVendorAdapter(jpaVendorAdapter());
        Map<String, Object> properties = new HashMap<>();
        properties.put(org.hibernate.cfg.Environment.MULTI_TENANT, MultiTenancyStrategy.DATABASE);
        properties.put(org.hibernate.cfg.Environment.MULTI_TENANT_CONNECTION_PROVIDER, connectionProvider);
        properties.put(org.hibernate.cfg.Environment.MULTI_TENANT_IDENTIFIER_RESOLVER, tenantResolver);
        properties.put("hibernate.ejb.naming_strategy", "org.hibernate.cfg.ImprovedNamingStrategy");
        emfBean.setJpaPropertyMap(properties);
        return emfBean;
    }

    @Bean(name = "tenantTransactionManager")
    public JpaTransactionManager transactionManager(EntityManagerFactory tenantEntityManager) {
        JpaTransactionManager transactionManager = new JpaTransactionManager();
        transactionManager.setEntityManagerFactory(tenantEntityManager);
        return transactionManager;
    }
}
I'm getting this in my trace:
Caused by: org.hibernate.cache.NoCacheRegionFactoryAvailableException: Second-level cache is used in the application, but property hibernate.cache.region.factory_class is not given; please either disable second level cache or set correct region factory using the hibernate.cache.region.factory_class setting and make sure the second level cache provider (hibernate-infinispan, e.g.) is available on the classpath.
I have all the required properties defined in my application-dev.yml
hibernate.cache.use_second_level_cache: true
hibernate.cache.use_query_cache: false
hibernate.generate_statistics: true
hibernate.cache.region.factory_class: org.hibernate.cache.ehcache.SingletonEhCacheRegionFactory
It appears that the properties are being read and used correctly by Spring Boot for my DatabaseConfiguration.java. I can't work out why the property file isn't being picked up for the multi-tenancy configuration. If I try to disable the cache by setting:
hibernate.cache.use_second_level_cache: false
DatabaseConfiguration.java detects this and behaves accordingly; however, MultiTenancyJPAConfiguration.java still throws the same exception.
Am I missing something obvious?
The answer is to actually set the JPA property values on the entity manager. I'm not sure how I overlooked this; I assumed they were somehow already set.
First, I injected the main datasource and the JpaProperties object (provided by Spring Boot, if I'm not mistaken).
MultiTenancyJPAConfiguration.java
@Inject
private JpaProperties jpaProperties;

@Inject
private DataSource dataSource;
I then set the values using the same method used in DatabaseConfiguration.java
MultiTenancyJPAConfiguration.java
#Bean(name = "tenantEntityManager")
public LocalContainerEntityManagerFactoryBean entityManagerFactory(DataSource dataSource,
MultiTenantConnectionProvider connectionProvider,
CurrentTenantIdentifierResolver tenantResolver) {
LocalContainerEntityManagerFactoryBean emfBean = new LocalContainerEntityManagerFactoryBean();
emfBean.setDataSource(dataSource);
emfBean.setPackagesToScan("com.quadrimular.nts.helium.domain.tenant");
emfBean.setJpaVendorAdapter(jpaVendorAdapter());
Map<String, Object> properties = new HashMap<>();
properties.put(org.hibernate.cfg.Environment.MULTI_TENANT, MultiTenancyStrategy.DATABASE);
properties.put(org.hibernate.cfg.Environment.MULTI_TENANT_CONNECTION_PROVIDER, connectionProvider);
properties.put(org.hibernate.cfg.Environment.MULTI_TENANT_IDENTIFIER_RESOLVER, tenantResolver);
properties.put("hibernate.ejb.naming_strategy", "org.hibernate.cfg.ImprovedNamingStrategy");
emfBean.setJpaPropertyMap(properties);
emfBean.setJpaProperties(additionalJpaProperties());
return emfBean;
}
private Properties additionalJpaProperties() {
Properties properties = new Properties();
for (Map.Entry<String, String> entry : jpaProperties.getHibernateProperties(dataSource).entrySet()) {
properties.setProperty(entry.getKey(), entry.getValue());
}
return properties;
}
The additionalJpaProperties() method gathers all the Hibernate JPA properties for my main datasource; I then set them on the entity manager after the hard-coded ones. This is clearly not the cleanest solution, and I plan to set all JPA values from the .yml file instead.
