I'm trying to set up the Java config for MyBatis, and @MapperScan does not appear to be accomplishing anything. Note that I can get the application to work with XML config.
What am I missing? The com.test.mapper package definitely exists and has an interface called TestMapper. The corresponding XML is in the correct location in the resources folder.
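For reference, the mapper in question is a plain MyBatis interface along these lines (the method and domain type shown here are hypothetical illustrations, not taken from the actual project):

package com.test.mapper;

import com.test.domain.Template;

public interface TestMapper {
    // hypothetical query method; the actual SQL lives in TestMapper.xml
    Template selectTemplateById(int id);
}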
***************************
APPLICATION FAILED TO START
***************************
Description:
Field templateMapper in
com.test.TestController required a
bean of type 'com.test.mapper.TestMapper' that
could not be found.
Action:
Consider defining a bean of type
'com.test.mapper.TestMapper' in your
configuration.
The @Autowired field that is failing:

@Autowired
TestMapper _testMapper;

The config:
@Configuration
@MapperScan("com.test.mapper")
public class AppConfig {

    @Bean
    public DataSource dataSource() {
        SimpleDriverDataSource dataSource = new SimpleDriverDataSource();
        try {
            dataSource.setDriverClass(com.microsoft.sqlserver.jdbc.SQLServerDriver.class);
            //dataSource.setDriverClassName("com.microsoft.sqlserver.jdbc.SQLServerDriver");
            dataSource.setUrl("jdbc:sqlserver://server;databaseName=db1;integratedSecurity=true;");
        } catch (Exception e) {
        }
        return dataSource;
    }

    @Bean
    public DataSourceTransactionManager transactionManager() {
        return new DataSourceTransactionManager(dataSource());
    }

    @Bean
    public SqlSessionFactoryBean sqlSessionFactory() throws Exception {
        SqlSessionFactoryBean sqlSessionFactoryBean = new SqlSessionFactoryBean();
        sqlSessionFactoryBean.setTypeAliasesPackage("com.test.domain");
        sqlSessionFactoryBean.setDataSource(dataSource());
        return sqlSessionFactoryBean;
    }
}
I solved this. My issue wasn't with MyBatis, it was with Spring. This link to the Spring docs says to "...locate your main application class in a root package above other classes".
I had not done that. Once I moved the application class (annotated with @SpringBootApplication) to a root package above the mapper package, the @MapperScan annotation worked.
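For illustration, a layout that satisfies that rule (the class name Application is an assumption; the package names follow the question):

// com/test/Application.java -- root package, one level above com.test.mapper
package com.test;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class Application {
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}

// com/test/mapper/TestMapper.java -- now picked up by @MapperScan("com.test.mapper")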
I'm trying to configure two datasources in my Spring Batch application: one for the batch metadata tables, and another for the business tables.
Snippet from my application.properties file:
spring.datasource.url=
spring.datasource.username=
spring.datasource.password=
spring.datasource.driver-class-name=
spring.batchdatasource.url=
spring.batchdatasource.username=
spring.batchdatasource.password=
spring.batchdatasource.driver-class-name=
My batch config file:
@Configuration
public class SpringBatchConfig extends DefaultBatchConfigurer {

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory steps;

    // @Autowired
    // private DataSource dataSource;

    @Autowired
    private PlatformTransactionManager transactionManager;

    @Bean(name = "batchDatasource")
    @ConfigurationProperties(prefix = "spring.batchdatasource")
    public DataSource batchDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "primaryDatasource")
    @ConfigurationProperties(prefix = "spring.datasource")
    @Primary
    public DataSource primaryDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Override
    public JobRepository createJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(batchDataSource());
        // factory.setDataSource(dataSource);
        factory.setTransactionManager(transactionManager);
        factory.setTablePrefix("schema1" + ".BATCH_");
        factory.afterPropertiesSet();
        return factory.getObject();
    }

    /* Job and step bean definitions here */
}
My main class is the one annotated with @EnableBatchProcessing:
@SpringBootApplication
@EnableBatchProcessing
public class SpringBatchExample1Application {

    public static void main(String[] args) {
        SpringApplication.run(SpringBatchExample1Application.class, args);
    }
}
I'm getting Requested bean is currently in creation: Is there an unresolvable circular reference? when trying to configure the two datasources. It works fine when using a single datasource by autowiring it (refer to the commented-out lines of code) instead of creating multiple beans.
Following is the exception snippet:
Error creating bean with name 'springBatchConfig': Unsatisfied dependency expressed through method 'setDataSource' parameter 0; nested exception is org.springframework.beans.factory.BeanCurrentlyInCreationException: Error creating bean with name 'batchDatasource': Requested bean is currently in creation: Is there an unresolvable circular reference?
I looked this up and found that it occurs when there's a dependency on a bean which has not yet been created or is still being created. I only see that in the createJobRepository method, where the datasource is plugged in, yet I still face the error even if I remove the createJobRepository method.
It seems like the requirement is for the datasource beans to be created before the others. I tried using the @Order annotation, but no luck.
EDIT:
I tried the solution from @Mykhailo Skliar's accepted answer below and separated the DataSource beans into a new @Configuration class. Though it resolved the initial unresolvable circular reference issue, it led me to the following error:
Error creating bean with name 'springBatchConfig': Invocation of init method failed; nested exception is org.springframework.batch.core.configuration.BatchConfigurationException: java.lang.IllegalArgumentException: jdbcUrl is required with driverClassName.
Based on this answer, I changed my URL property names as follows:
spring.datasource.jdbc-url=
spring.batchdatasource.jdbc-url=
Though it solved the jdbcUrl error, it posed another issue:
Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Reference to database and/or server name in 'sample-sql-server.schema1.MY_TABLE_NAME' is not supported in this version of SQL Server.
Both my data sources are Azure SQL Server instances.
I looked this up and found that using multiple Azure SQL databases was not possible years ago, but based on this answer that should no longer be the case.
The issue is most probably because of
factory.setDataSource(batchDataSource());
You should use the autowired bean here instead of calling batchDataSource().
I would split SpringBatchConfig into two classes:
@Configuration
public class DataSourceConfig {

    @Bean(name = "batchDatasource")
    @ConfigurationProperties(prefix = "spring.batchdatasource")
    public DataSource batchDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "primaryDatasource")
    @ConfigurationProperties(prefix = "spring.datasource")
    @Primary
    public DataSource primaryDataSource() {
        return DataSourceBuilder.create().build();
    }
}
@Configuration
public class SpringBatchConfig extends DefaultBatchConfigurer {

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory steps;

    @Qualifier("batchDatasource") // must match the bean name declared in DataSourceConfig
    @Autowired
    private DataSource batchDataSource;

    @Autowired
    private PlatformTransactionManager transactionManager;

    @Override
    public JobRepository createJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(batchDataSource);
        factory.setTransactionManager(transactionManager);
        factory.setTablePrefix("schema1" + ".BATCH_");
        factory.afterPropertiesSet();
        return factory.getObject();
    }
}
I am using Spring Boot 2.0.3 and MyBatis with PostgreSQL.
I am trying to set up multiple data source connections as follows, by following https://programmer.help/blogs/spring-boot-integrates-mybatis-multiple-data-sources.html.
Datasource1
@Configuration
@MapperScan(basePackages = "com.repositories.StaRepository", sqlSessionFactoryRef = "sqlPromptSessionFactory", annotationClass = Mapper.class)
//SqlSessionFactory is created from DB1 and then a SqlSessionTemplate is created from the created SqlSessionFactory.
public class MyBatisConfigPrompt {

    @Bean(name = "DB1")
    @ConfigurationProperties(prefix = "spring.datasource.pro")
    public DruidDataSource DB1() {
        return DruidDataSourceBuilder.create().build();
    }

    @Bean(name = "sqlProSessionFactory")
    SqlSessionFactory sqlProSessionFactory() {
        SqlSessionFactory sessionFactory = null;
        try {
            SqlSessionFactoryBean bean = new SqlSessionFactoryBean();
            bean.setDataSource(DB1());
            sessionFactory = bean.getObject();
        } catch (Exception e) {
            e.printStackTrace();
        }
        return sessionFactory;
    }

    @Bean
    public MapperScannerConfigurer proMapperScannerConfigurer() {
        MapperScannerConfigurer configurer = new MapperScannerConfigurer();
        configurer.setBasePackage("com.repositories.StaRepository");
        configurer.setSqlSessionFactoryBeanName("sqlProSessionFactory");
        return configurer;
    }
}
Datasource2
@Configuration
@MapperScan(basePackages = "com.repositories.ContDBRepository", sqlSessionFactoryRef = "sqlContSessionFactory", annotationClass = Mapper.class)
//SqlSessionFactory is created from contDB and then a SqlSessionTemplate is created from the created SqlSessionFactory.
public class MyBatisConfigCont {

    @Bean(name = "contDB")
    @ConfigurationProperties(prefix = "spring.datasource.cont")
    public DruidDataSource contDB() {
        return DruidDataSourceBuilder.create().build();
    }

    @Bean(name = "sqlContSessionFactory")
    SqlSessionFactory sqlContSessionFactory() {
        SqlSessionFactory sessionFactory = null;
        try {
            SqlSessionFactoryBean bean = new SqlSessionFactoryBean();
            bean.setDataSource(contDB());
            sessionFactory = bean.getObject();
        } catch (Exception e) {
            e.printStackTrace();
        }
        return sessionFactory;
    }

    @Bean
    public MapperScannerConfigurer contMapperScannerConfigurer() {
        MapperScannerConfigurer configurer = new MapperScannerConfigurer();
        configurer.setBasePackage("com.repositories.ContDBRepository");
        configurer.setSqlSessionFactoryBeanName("sqlContSessionFactory");
        return configurer;
    }
}
I also have a ContDBRepository.class with the @Mapper annotation and a ContDBRepository.xml, and likewise a StaRepository.class with the @Mapper annotation and a StaRepository.xml, in the same package.
With the above configuration I am getting the error
No qualifying bean of type 'org.apache.ibatis.session.SqlSessionFactory' available: expected single matching bean but found 2: sqlContSessionFactory,sqlProSessionFactory
As a fix for the above error I set @Primary on one of the SqlSessionFactory beans, but then the other SqlSessionFactory is never used when I want to query the second datasource.
Can anyone help with what I am missing?
UPDATE 20210301
This example helped me find a solution for making sure I could use a specific datasource.
The basic idea is to create an abstract routing data source and hand it to the MyBatis config, then use an enum and an @interface as the selector, adding the annotation to any interface that needs a specific data source. Finally, AOP is used to define how and when the data source is switched.
Some key points:
AbstractRoutingDataSource will be the key to storing the whole set of datasources.
@interface will be the key to creating the router for the different ServiceImpl classes; they no longer need @Repository, and you add the annotation to any interface you want bound to a specific data source type.
@Aspect and @Pointcut will be the key to guaranteeing the router works properly.
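A minimal sketch of that routing setup, assuming the two DruidDataSource beans defined earlier are placed in the routing datasource's target map; the enum values, class names, and annotation name below are illustrative, not taken from the linked example:

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource;
import org.springframework.stereotype.Component;

// Lookup key for the routing datasource
enum DbType { PRO, CONT }

// Thread-local holder for the currently selected key
class DbContextHolder {
    private static final ThreadLocal<DbType> CONTEXT = new ThreadLocal<>();
    static void set(DbType type) { CONTEXT.set(type); }
    static DbType get() { return CONTEXT.get(); }
    static void clear() { CONTEXT.remove(); }
}

// Routing datasource: its target map holds the two real datasources,
// and this single routing bean is what the SqlSessionFactoryBean receives
class RoutingDataSource extends AbstractRoutingDataSource {
    @Override
    protected Object determineCurrentLookupKey() {
        return DbContextHolder.get();
    }
}

// Selector annotation placed on any interface or method that needs a specific datasource
@Target({ElementType.TYPE, ElementType.METHOD})
@Retention(RetentionPolicy.RUNTIME)
@interface UseDb {
    DbType value() default DbType.PRO;
}

// Aspect that switches the datasource around every annotated method
@Aspect
@Component
class DbRoutingAspect {
    @Around("@annotation(useDb)")
    public Object route(ProceedingJoinPoint pjp, UseDb useDb) throws Throwable {
        DbContextHolder.set(useDb.value());
        try {
            return pjp.proceed();
        } finally {
            DbContextHolder.clear();
        }
    }
}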
ORIGINAL
I found the same problem with most of the online examples about multiple data sources.
The example you followed is not really verified, because it only uses two localhost connections with the same database and table setup.
The config example it uses:
spring.datasource.one.url=jdbc:mysql://localhost:3306/test01?useUnicode=true&characterEncoding=utf-8
spring.datasource.one.username=root
spring.datasource.one.password=123456
spring.datasource.one.type=com.alibaba.druid.pool.DruidDataSource
spring.datasource.two.url=jdbc:mysql://localhost:3306/test02?useUnicode=true&characterEncoding=utf-8
spring.datasource.two.username=root
spring.datasource.two.password=123456
spring.datasource.two.type=com.alibaba.druid.pool.DruidDataSource
All the examples I tried failed by reading the table from the wrong database, which effectively means the @Repository could only see one DataSource or config.
My Spring Batch repository (deployed on an Oracle database) lies in a different schema such that I need to prepend the schema name.
When using XML configuration, this would be easy to do:
<job-repository id="jobRepository" table-prefix="GFA.BATCH_" />
However, as I use Java Config, this turns out to be more tricky.
The best solution I found is to have my Java Config class extend DefaultBatchConfigurer and override the createJobRepository() method:
@Configuration
@EnableBatchProcessing
public class BatchConfiguration extends DefaultBatchConfigurer {

    @Autowired
    private DataSource dataSource;

    @Autowired
    private PlatformTransactionManager transactionManager;

    @Override
    protected JobRepository createJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(dataSource);
        factory.setTransactionManager(transactionManager);
        factory.setTablePrefix("GFA.BATCH_");
        factory.afterPropertiesSet();
        return factory.getObject();
    }

    ...
}
Compared to the XML solution, that's quite a lot of code! And it's not very logical either; my first guess was to provide a @Bean method as follows:
@Bean
public JobRepository jobRepository() throws Exception {
    JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
    factory.setDataSource(dataSource);
    factory.setTransactionManager(transactionManager);
    factory.setTablePrefix("GFA.BATCH_");
    factory.afterPropertiesSet();
    return factory.getObject();
}
but this wouldn't work.
My question is:
Is my solution optimal or is there a better one? I would prefer to define a Bean instead of having to override some method of some class which is not very intuitive...
And obviously it would be even better if we could shorten the code to be somewhat close to the one-line code in the XML configuration.
Simply add this line to any of the properties files registered in your batch configuration:
spring.batch.table-prefix= GFA.BATCH_
FYI, the spring.batch prefix is mapped to org.springframework.boot.autoconfigure.batch.BatchProperties, provided by Spring Boot. See the source code on GitHub.
You should define the BatchConfigurer in Java, and in this bean set the Spring Batch table prefix:
@Bean
public BatchConfigurer batchConfigurer() {
    return new DefaultBatchConfigurer() {

        @Autowired
        PlatformTransactionManager platformTransactionManager;

        @Override
        protected JobRepository createJobRepository() throws Exception {
            JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
            factory.setDataSource(secondaryDataSource(secondaryDataSourceProperties()));
            factory.setTransactionManager(platformTransactionManager);
            factory.setIsolationLevelForCreate("ISOLATION_READ_COMMITTED");
            factory.setTablePrefix(batchTablePrefix);
            factory.setDatabaseType("POSTGRES");
            factory.setMaxVarCharLength(maxVarcharSize);
            return factory.getObject();
        }
    };
}
The accepted answer is no longer valid; the Spring Boot property has changed to
spring.batch.jdbc.table-prefix=something.prefix_
You can always refer to the Spring Boot docs for the latest update.
I am currently working on a Spring 4 application that uses MyBatis and is completely annotation-driven (that cannot change, per architecture requirements). I am trying to add a second data source definition with a completely separate set of mapping configurations.
The problem I am having is that I cannot get the two data sources to play nicely together.
I created a new, virtually identical configuration class and added @Qualifier data to the new file.
The configuration for the classes looks like this:
Data Source 1
@Configuration
@MapperScan(basePackages = "com.myproject.package1", annotationClass = Mapper.class)
public class DataSource1 {

    @Bean
    @Qualifier("DS1")
    public DataSource getDataSource() {
        /* configuration loaded */
    }

    @Bean
    @Qualifier("DS1")
    public SqlSessionFactory getSqlSessionFactory() throws Exception {
        SqlSessionFactoryBean bean = new SqlSessionFactoryBean();
        bean.setDataSource(getDataSource());
        /* mapper resources added */
        return bean.getObject();
    }
}
Data Source 2
@Configuration
@MapperScan(basePackages = "com.myproject.package2", annotationClass = Mapper.class)
public class DataSource2 {

    @Bean
    @Qualifier("DS2")
    public DataSource getDataSource() {
        /* configuration loaded */
    }

    @Bean
    @Qualifier("DS2")
    public SqlSessionFactory getSqlSessionFactory() throws Exception {
        SqlSessionFactoryBean bean = new SqlSessionFactoryBean();
        bean.setDataSource(getDataSource());
        /* mapper resources added */
        return bean.getObject();
    }
}
When this runs I get exception messages like:
org.apache.ibatis.binding.BindingException: Invalid bound statement (not found)
If I comment out the configuration in DS2, DS1 works just fine again. I tried adding the mapper scanning configuration in another bean and setting the name of the SqlSessionFactoryBean to pass into it, but that did not work.
Suggestions?
UPDATE
I looked at this post and updated to use the following.
@Bean(name = "the_factory_1")
public SqlSessionFactory getSqlSessionFactory() { /* same code */ }

@Bean
public MapperScannerConfigurer getMapperScannerConfigurer() {
    MapperScannerConfigurer configurer = new MapperScannerConfigurer();
    configurer.setBasePackage("com.myproject.package1");
    configurer.setAnnotationClass(Mapper.class);
    configurer.setSqlSessionFactoryBeanName("the_factory_1");
    return configurer;
}
However, that leads me to this error:
No qualifying bean of type [com.myproject.package1.mapper.MyMapper] found for dependency: expected at least 1 bean which qualifies as autowire candidate for this dependency. Dependency annotations: {}
When I debug, only one @Bean factory method gets invoked.
UPDATE 2
If I move everything to a single file all is fine. However, that is not ideal as I want the DataSource definitions to be separated. That's my only hurdle right now.
You can use ace-mybatis; it simplifies the configuration.
Add one bean:
@Bean
public static AceMapperScannerConfigurer mapperScannerConfigurer() {
    return AceMapperScannerConfigurer.builder()
            .basePackage("com.myproject.package1")
            .build();
}
Then mark your mapper interfaces with @AceMapper and specify the sqlSessionFactory:
@AceMapper(sqlSessionFactoryBeanName = "firstSqlSessionFactory")
public interface UserMapper {
    Stream<User> selectUsers();
}

@AceMapper(sqlSessionFactoryBeanName = "secondSqlSessionFactory")
public interface ClientMapper {
    Stream<Client> selectClients();
}
Please use the DAO Factory pattern to get connections for multiple datasources like DS1 and DS2, and use a DAOUtil class to provide the required configuration via annotations.
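The answer is terse; one possible reading is a small factory that hands out a session bound to the requested datasource. The class below is a hypothetical illustration (the DS1/DS2 qualifiers come from the question, everything else is assumed):

import java.util.HashMap;
import java.util.Map;

import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.stereotype.Component;

@Component
class DaoFactory {

    private final Map<String, SqlSessionFactory> factories = new HashMap<>();

    DaoFactory(@Qualifier("DS1") SqlSessionFactory ds1Factory,
               @Qualifier("DS2") SqlSessionFactory ds2Factory) {
        factories.put("DS1", ds1Factory);
        factories.put("DS2", ds2Factory);
    }

    // Open a session against the requested datasource ("DS1" or "DS2")
    SqlSession openSession(String name) {
        return factories.get(name).openSession();
    }
}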
Spring provides the FactoryBean interface to allow non-trivial initialisation of beans. The framework provides many implementations of factory beans and -- when using Spring's XML config -- factory beans are easy to use.
However, in Spring 3.0, I can't find a satisfactory way of using factory beans with the annotation-based configuration (née JavaConfig).
Obviously, I could manually instantiate the factory bean and set any required properties myself, like so:
@Configuration
public class AppConfig {
    ...
    @Bean
    public SqlSessionFactory sqlSessionFactory() throws Exception {
        SqlSessionFactoryBean factory = new SqlSessionFactoryBean();
        factory.setDataSource(dataSource());
        factory.setAnotherProperty(anotherProperty());
        return factory.getObject();
    }
}
However, this would fail if the FactoryBean implemented any Spring-specific callback interfaces, like InitializingBean, ApplicationContextAware, BeanClassLoaderAware, or @PostConstruct for example. I would also need to inspect the FactoryBean, find out which callback interfaces it implements, and then reproduce that functionality myself by calling setApplicationContext(), afterPropertiesSet(), etc.
This feels awkward and back-to-front to me: application-developers should not have to implement the callbacks of the IOC container.
Does anyone know of a better solution to using FactoryBeans from Spring Annotation configs?
I think this is best solved by relying on auto-wiring. If you are using Java configuration for the beans, it would look like this:
@Bean
MyFactoryBean myFactory()
{
    // this is a spring FactoryBean for MyBean
    // i.e. something that implements FactoryBean<MyBean>
    return new MyFactoryBean();
}

@Bean
MyOtherBean myOther(final MyBean myBean)
{
    return new MyOtherBean(myBean);
}
So Spring will inject the MyBean instance returned by myFactory().getObject(), just as it does with XML configuration.
This should also work if you are using @Inject/@Autowired in your @Component/@Service etc. classes.
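For example, a consumer like the following (MyService is a hypothetical name, not from the answer) receives the object the factory produces rather than the FactoryBean itself:

@Service
class MyService {

    private final MyBean myBean;

    @Autowired
    MyService(MyBean myBean) { // resolved via myFactory().getObject()
        this.myBean = myBean;
    }
}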
As far as I understand, your problem is that you want the result of sqlSessionFactory() to be a SqlSessionFactory (for use in other methods), but you have to return the SqlSessionFactoryBean from a @Bean-annotated method in order to trigger the Spring callbacks.
It can be solved with the following workaround:
@Configuration
public class AppConfig {

    @Bean(name = "sqlSessionFactory")
    public SqlSessionFactoryBean sqlSessionFactoryBean() { ... }

    // FactoryBean is hidden behind this method
    public SqlSessionFactory sqlSessionFactory() {
        try {
            return sqlSessionFactoryBean().getObject();
        } catch (Exception ex) {
            throw new RuntimeException(ex);
        }
    }

    @Bean
    public AnotherBean anotherBean() {
        return new AnotherBean(sqlSessionFactory());
    }
}
The point is that calls to @Bean-annotated methods are intercepted by an aspect which performs initialization of the beans being returned (the FactoryBean in your case), so the call to sqlSessionFactoryBean() inside sqlSessionFactory() returns a fully initialized FactoryBean.
Spring JavaConfig had a ConfigurationSupport class with a getObject() method for use with FactoryBeans.
You would use it by extending it:
@Configuration
public class MyConfig extends ConfigurationSupport {

    @Bean
    public MyBean getMyBean() {
        MyFactoryBean factory = new MyFactoryBean();
        return (MyBean) getObject(factory);
    }
}
There is some background in this jira issue
With Spring 3.0, JavaConfig was moved into Spring core, and it was decided to get rid of the ConfigurationSupport class. The suggested approach is now to use the builder pattern instead of factories.
An example taken from the new SessionFactoryBuilder
@Configuration
public class DataConfig {

    @Bean
    public SessionFactory sessionFactory() {
        return new SessionFactoryBean()
                .setDataSource(dataSource())
                .setMappingLocations("classpath:com/myco/*.hbm.xml")
                .buildSessionFactory();
    }
}
Some background here
This is what I'm doing, and it works:
@Bean
@ConfigurationProperties("dataSource")
public DataSource dataSource() { // Automatically configured from a properties file
    return new BasicDataSource();
}

@Bean
public SqlSessionFactoryBean sqlSessionFactory(DataSource dataSource) throws Exception {
    SqlSessionFactoryBean factory = new SqlSessionFactoryBean();
    factory.setDataSource(dataSource); // Invoking dataSource() would get a new instance which won't be initialized
    factory.setAnotherProperty(anotherProperty());
    return factory;
}

@Bean
public AnotherBean anotherBean(SqlSessionFactory sqlSessionFactory) { // This method receives the SqlSessionFactory created by the factory above
    return new AnotherBean(sqlSessionFactory);
}
Any bean you have declared can be passed as an argument to any other @Bean method (invoking the same method again would create a new instance which is not processed by Spring).
If you declare a FactoryBean, you can use the bean type it creates as an argument for another @Bean method, and it will receive the right instance.
You could also use
@Autowired
private SqlSessionFactory sqlSessionFactory;
anywhere, and it will work too.
Why don't you inject the factory into your AppConfig?
@Configuration
public class AppConfig {

    @Resource
    private SqlSessionFactoryBean factory;

    @Bean
    public SqlSessionFactory sqlSessionFactory() throws Exception {
        return factory.getObject();
    }
}
But maybe I did not understand your question correctly, because it looks to me like you are trying something strange; take a step back and rethink what you really need.
Here is how I am doing it:
@Bean
def sessionFactoryBean: AnnotationSessionFactoryBean = {
    val sfb = new AnnotationSessionFactoryBean
    sfb.setDataSource(dataSource)
    sfb.setPackagesToScan(Array("com.foo.domain"))
    // Other configuration of session factory bean
    // ...
    return sfb
}

@Bean
def sessionFactory: SessionFactory = {
    return sessionFactoryBean.getObject
}
The sessionFactoryBean gets created and the proper post-create lifecycle stuff happens to it (afterPropertiesSet, etc).
Note that I do not reference the sessionFactoryBean as a bean directly. I autowire the sessionFactory into my other beans.
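In Java terms, that consuming side looks roughly like the following (the DAO class below is a hypothetical example, not from the answer):

@Repository
class FooDao {

    private final SessionFactory sessionFactory;

    @Autowired
    FooDao(SessionFactory sessionFactory) { // the product of sessionFactoryBean, not the factory bean itself
        this.sessionFactory = sessionFactory;
    }
}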