Spring Boot datasource lazy initialization - Java

The @Lazy annotation is not working on my datasource configuration. The datasource is autowired into a prototype-scoped bean, but it is initialized eagerly on startup. In the stack trace I see a call from TomcatServletWebServerFactory.
@Configuration
@Lazy
public class MsSqlMppvConfig {

    @Bean
    @ConfigurationProperties("spring.mppvdatasource")
    public DataSourceProperties mppvDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean
    @Lazy
    @Qualifier("mppvdatasource")
    @ConfigurationProperties("spring.mppvdatasource.hikari")
    public DataSource mppvDataSource() {
        return mppvDataSourceProperties().initializeDataSourceBuilder().build();
    }

    @Bean(name = "tm_mppvdatasource")
    @Autowired
    DataSourceTransactionManager tm(@Qualifier("mppvdatasource") DataSource datasource) {
        return new DataSourceTransactionManager(datasource);
    }
}
@Autowired
@Qualifier("mppvdatasource")
@Lazy
DataSource mppvDs;
Maybe the problem is in the @Qualifier annotation?

Had the same issue when importing spring-boot-starter-jdbc. Had to change it to just use spring-jdbc.
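For reference, a minimal sketch of that dependency swap, assuming a Maven build with versions managed by the Spring Boot parent:

<!-- Remove the Boot starter... -->
<!--
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-jdbc</artifactId>
</dependency>
-->
<!-- ...and use plain Spring JDBC instead -->
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-jdbc</artifactId>
</dependency>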

Related

Cyclic dependency when there are multiple DataSources defined

I wanted to declare two DataSource beans and use one of them dynamically using AbstractRoutingDataSource, which is declared as the @Primary bean. Surprisingly, I was not able to run my application because of a cyclic dependency:
org.springframework.boot.autoconfigure.orm.jpa.HibernateJpaConfiguration
┌─────┐
| dataSource defined in <myclass>
↑ ↓
| readOnlyDataSource defined in <myclass>
↑ ↓
| org.springframework.boot.autoconfigure.jdbc.DataSourceInitializerInvoker
└─────┘
It is caused by this implementation:
@SpringBootApplication
public class DemoApplication {

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }

    @Bean
    @Primary
    DataSource dataSource(@Qualifier("firstDS") DataSource firstDS,
                          @Qualifier("secondDS") DataSource secondDS) {
        MyRoutingDataSource ds = new MyRoutingDataSource();
        ds.setCurrentDS(firstDS);
        return ds;
    }

    @Bean("firstDS")
    public DataSource firstDS(DataSourceProperties properties) {
        return properties.initializeDataSourceBuilder().build();
    }

    @Bean("secondDS")
    public DataSource secondDs(DataSourceProperties properties) {
        return properties.initializeDataSourceBuilder().build();
    }

    class MyRoutingDataSource extends AbstractRoutingDataSource {
        private DataSource currentDS;

        public void setCurrentDS(DataSource currentDS) {
            this.currentDS = currentDS;
        }

        @Override
        protected Object determineCurrentLookupKey() {
            return currentDS;
        }
    }
}
Please note that I don't want to exclude DataSourceAutoConfiguration - it provides some additional functionality that I want to use in my project (e.g. DataSourceInitializer).
Could you please explain to me why it does not work? I feel that this error message is misleading. There is no cyclic dependency between HibernateJpaConfiguration and DataSourceInitializerInvoker. Both of them use the DataSource whose primary definition I provide.
The full project with that issue is here: https://github.com/kozub/spring-dependency-management-bug
I ran into the same problem you have, with the difference that I am not using DataSourceAutoConfiguration.
I'm not a Spring expert, so I can't tell you the root cause. But I was able to get my code to work by going from something like this, which is similar to what you posted:
@Bean
@Primary
DataSource dataSource(@Qualifier("firstDS") DataSource firstDS,
                      @Qualifier("secondDS") DataSource secondDS) {
    MyRoutingDataSource ds = new MyRoutingDataSource();
    ds.setFirstDS(firstDS);
    ds.setSecondDs(secondDS);
    return ds;
}

@Bean("firstDS")
public DataSource firstDS() {
    return /*create first DS*/
}

@Bean("secondDS")
public DataSource secondDs(DataSourceProperties properties) {
    return /*create second DS*/
}
To this:
@Bean
DataSource dataSource() {
    DataSource first = /*create first DS*/
    DataSource second = /*create second DS*/
    MyRoutingDataSource ds = new MyRoutingDataSource();
    ds.setFirstDS(first);
    ds.setSecondDs(second);
    return ds;
}
As you can see, I was able to solve the problem by only having one Spring bean of type DataSource. I created the two "first" and "second" DataSources inside the method which creates the routing datasource so that they don't have to be Spring beans. Having only one bean of type DataSource got rid of my circular dependency error.
This solved my problem, but you said you also want to use DataSourceAutoConfiguration.
I think you may be able to achieve that with something like this:
@Bean
DataSource dataSource(@Qualifier("firstDSproperties") DataSourceProperties firstDSprops,
                      @Qualifier("secondDSproperties") DataSourceProperties secondDSprops) {
    DataSource first = firstDSprops.initializeDataSourceBuilder().build();
    DataSource second = secondDSprops.initializeDataSourceBuilder().build();
    MyRoutingDataSource ds = new MyRoutingDataSource();
    ds.setCurrentDS(first);
    return ds;
}

@Bean("firstDSproperties")
@ConfigurationProperties("datasource.first")
public DataSourceProperties firstDataSourceProperties() {
    return new DataSourceProperties();
}

@Bean("secondDSproperties")
@ConfigurationProperties("datasource.second")
public DataSourceProperties secondDataSourceProperties() {
    return new DataSourceProperties();
}
What this code does is make two beans of type DataSourceProperties rather than of type DataSource. With DataSourceProperties beans you can still let Spring autowire your config without (hopefully) running into the cyclic dependency problem caused by having multiple beans of type DataSource depending on each other.
I haven't tested doing this with DataSourceProperties since I am not using DataSourceAutoConfiguration in my code. But based on your code I think it might work.
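For completeness, here is a sketch of the property keys those @ConfigurationProperties prefixes would bind to; the key names follow the datasource.first / datasource.second prefixes used above, and the values are placeholders:

# application.properties (placeholder values)
datasource.first.url=jdbc:h2:mem:first
datasource.first.username=sa
datasource.first.password=
datasource.first.driver-class-name=org.h2.Driver

datasource.second.url=jdbc:h2:mem:second
datasource.second.username=sa
datasource.second.password=
datasource.second.driver-class-name=org.h2.Driver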
There is a slight mistake here. Let me explain with a foo and bar DB example, along with a Git repo reference.
Here is what your application.properties should look like:
# Oracle DB - "foo"
spring.datasource.url=jdbc:oracle:thin:@//db-server-foo:1521/FOO
spring.datasource.username=fooadmin
spring.datasource.password=foo123
spring.datasource.driver-class-name=oracle.jdbc.OracleDriver
# PostgreSQL DB - "bar"
bar.datasource.url=jdbc:postgresql://db-server-bar:5432/bar
bar.datasource.username=baradmin
bar.datasource.password=bar123
bar.datasource.driver-class-name=org.postgresql.Driver
Set the SQL dialect to “default” in your application.properties to let Spring auto-detect the different SQL dialects of each datasource:
spring.jpa.database=default
The package structure should look something like:
src/main/java
- com.foobar
- foo
- domain
- repo
- bar
- domain
- repo
Here is the main part: the configuration classes.
Foo configuration class (Oracle):
package com.foobar;

@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
        entityManagerFactoryRef = "entityManagerFactory",
        basePackages = { "com.foobar.foo.repo" }
)
public class FooDbConfig {

    @Primary
    @Bean(name = "dataSource")
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource dataSource() {
        return DataSourceBuilder.create().build();
    }

    @Primary
    @Bean(name = "entityManagerFactory")
    public LocalContainerEntityManagerFactoryBean entityManagerFactory(
            EntityManagerFactoryBuilder builder,
            @Qualifier("dataSource") DataSource dataSource) {
        return builder
                .dataSource(dataSource)
                .packages("com.foobar.foo.domain")
                .persistenceUnit("foo")
                .build();
    }

    @Primary
    @Bean(name = "transactionManager")
    public PlatformTransactionManager transactionManager(
            @Qualifier("entityManagerFactory") EntityManagerFactory entityManagerFactory) {
        return new JpaTransactionManager(entityManagerFactory);
    }
}
Bar configuration class (PostgreSQL):
package com.foobar;

@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
        entityManagerFactoryRef = "barEntityManagerFactory",
        transactionManagerRef = "barTransactionManager",
        basePackages = { "com.foobar.bar.repo" }
)
public class BarDbConfig {

    @Bean(name = "barDataSource")
    @ConfigurationProperties(prefix = "bar.datasource")
    public DataSource dataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "barEntityManagerFactory")
    public LocalContainerEntityManagerFactoryBean barEntityManagerFactory(
            EntityManagerFactoryBuilder builder,
            @Qualifier("barDataSource") DataSource dataSource) {
        return builder
                .dataSource(dataSource)
                .packages("com.foobar.bar.domain")
                .persistenceUnit("bar")
                .build();
    }

    @Bean(name = "barTransactionManager")
    public PlatformTransactionManager barTransactionManager(
            @Qualifier("barEntityManagerFactory") EntityManagerFactory barEntityManagerFactory) {
        return new JpaTransactionManager(barEntityManagerFactory);
    }
}
Your repositories would look something like:
package com.foobar.bar.repo;

@Repository
public interface BarRepository extends JpaRepository<Bar, Long> {
    Bar findById(Long id);
}

package com.foobar.foo.repo;

@Repository
public interface FooRepository extends JpaRepository<Foo, Long> {
    Foo findById(Long id);
}
And you are done here.
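If you need to run something transactionally against the bar datasource, reference the matching transaction manager by name; the foo side falls back to the @Primary transactionManager. A minimal sketch (BarService and loadBar are made-up names, not part of the repo above):

package com.foobar.bar;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

import com.foobar.bar.domain.Bar;
import com.foobar.bar.repo.BarRepository;

@Service
public class BarService {

    @Autowired
    private BarRepository barRepository;

    // Bind this transaction to the "bar" (PostgreSQL) transaction manager;
    // repositories under com.foobar.foo.repo use the @Primary "transactionManager" by default.
    @Transactional("barTransactionManager")
    public Bar loadBar(Long id) {
        return barRepository.findById(id);
    }
}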
You can refer to the code on GitHub here.

IllegalTransactionStateException: Pre-bound JDBC Connection found when configuring multiple datasources

I have a Spring Batch project wherein I read data from one datasource, process the data, and write it into another, primary data source. I am extending CrudRepository for DAO operations.
I am trying to configure multiple datasources for my Spring Batch + Spring Boot application. Below is the package structure:
myproject
---com
---batch
---config
---firstDsConfig.java
---secondDsConfig.java
---firstrepository
---firstCrudRepository.java
---secondRepository
---SecondCrudRepository.java
---firstEntity
---firstDBEntity.java
---secondEntity
---secondDBEntity.java
----main
---MyMainClass.java
Code for firstDsConfig.java:
@Configuration
@EnableJpaRepositories(
        entityManagerFactoryRef = "firstEntityManagerFactory",
        transactionManagerRef = "firstTransactionManager",
        basePackages = "com.batch.firstrepository"
)
@EnableTransactionManagement
public class FirstDbConfig {

    @Primary
    @Bean(name = "firstEntityManagerFactory")
    public LocalContainerEntityManagerFactoryBean firstEntityManagerFactory(final EntityManagerFactoryBuilder builder,
            final @Qualifier("firstDs") DataSource dataSource) {
        return builder
                .dataSource(dataSource)
                .packages("com.batch.firstEntity")
                .persistenceUnit("abc")
                .build();
    }

    @Primary
    @Bean(name = "firstTransactionManager")
    public PlatformTransactionManager firstTransactionManager(
            @Qualifier("firstEntityManagerFactory") EntityManagerFactory firstEntityManagerFactory) {
        return new JpaTransactionManager(firstEntityManagerFactory);
    }

    @Primary
    @Bean(name = "firstDs")
    @ConfigurationProperties(prefix = "spring.datasource.first")
    public DataSource firstDataSource() {
        return DataSourceBuilder.create().build();
    }
}
Code for secondDsConfig.java:
@Configuration
@EnableJpaRepositories(
        entityManagerFactoryRef = "secondEntityManagerFactory",
        transactionManagerRef = "secondTransactionManager",
        basePackages = "com.batch.secondrepository"
)
@EnableTransactionManagement
public class SecondDbConfig {

    @Primary
    @Bean(name = "secondEntityManagerFactory")
    public LocalContainerEntityManagerFactoryBean secondEntityManagerFactory(final EntityManagerFactoryBuilder builder,
            final @Qualifier("secondDs") DataSource dataSource) {
        return builder
                .dataSource(dataSource)
                .packages("com.batch.secondEntity")
                .persistenceUnit("xyz")
                .build();
    }

    @Primary
    @Bean(name = "secondTransactionManager")
    public PlatformTransactionManager secondTransactionManager(
            @Qualifier("secondEntityManagerFactory") EntityManagerFactory secondEntityManagerFactory) {
        return new JpaTransactionManager(secondEntityManagerFactory);
    }

    @Primary
    @Bean(name = "secondDs")
    @ConfigurationProperties(prefix = "spring.datasource.second")
    public DataSource secondDataSource() {
        return DataSourceBuilder.create().build();
    }
}
Here is my main class:
@EnableScheduling
@EnableBatchProcessing
@SpringBootApplication(scanBasePackages = { "com.batch" })
public class MyMainClass {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private JobRepository jobRepository;

    @Autowired
    @Qualifier("firstDs")
    private DataSource dataSource;

    @Autowired
    @Qualifier("myJob")
    private Job job;

    public static void main(String[] args) throws Exception {
        SpringApplication.run(MyMainClass.class, args);
    }

    @EventListener(ApplicationReadyEvent.class)
    private void start() throws Exception {
        jobLauncher.run(job, new JobParameters());
    }

    @Bean(name = "jobService")
    public JobService jobService() throws Exception {
        SimpleJobServiceFactoryBean factoryBean = new SimpleJobServiceFactoryBean();
        factoryBean.setDataSource(dataSource);
        factoryBean.setJobRepository(jobRepository);
        factoryBean.setJobLocator(new MapJobRegistry());
        factoryBean.setJobLauncher(jobLauncher);
        factoryBean.afterPropertiesSet();
        return factoryBean.getObject();
    }
}
Here are the CRUD repos:
First CRUD repo:
public interface FirstCrudRepository extends CrudRepository<FirstDbEntity, Integer> {
    List<FirstDbEntity> findByOId(String oId);
}
Second CRUD repo:
public interface SecondCrudRepository extends CrudRepository<SecondDbEntity, Integer> {
    List<SecondDbEntity> findByPid(String pid);
}
When I run my application I see the following error while saving a record using FirstCrudRepository:
org.springframework.transaction.IllegalTransactionStateException: Pre-bound JDBC Connection found! JpaTransactionManager does not support running within DataSourceTransactionManager if told to manage the DataSource itself. It is recommended to use a single JpaTransactionManager for all transactions on a single DataSource, no matter whether JPA or JDBC access.
Note: I am able to fetch details successfully from SecondCrudRepository.
By default, if you provide a DataSource, Spring Batch will use a DataSourceTransactionManager, which knows nothing about your JPA configuration. You need to tell Spring Batch to use your JpaTransactionManager. This is explained in:
the reference documentation: https://docs.spring.io/spring-batch/4.1.x/reference/html/index-single.html#javaConfig
the Javadoc of @EnableBatchProcessing: https://docs.spring.io/spring-batch/4.1.x/api/org/springframework/batch/core/configuration/annotation/EnableBatchProcessing.html
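One common way to do that is to provide your own BatchConfigurer that returns the JPA transaction manager. A sketch, assuming Spring Batch 4.x where DefaultBatchConfigurer is available and that your JPA transaction manager is the firstTransactionManager bean defined above (the class name below is made up):

import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Configuration;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration
public class BatchTransactionManagerConfig extends DefaultBatchConfigurer {

    private final PlatformTransactionManager jpaTransactionManager;

    public BatchTransactionManagerConfig(
            @Qualifier("firstTransactionManager") PlatformTransactionManager jpaTransactionManager) {
        this.jpaTransactionManager = jpaTransactionManager;
    }

    @Override
    public PlatformTransactionManager getTransactionManager() {
        // Spring Batch uses the transaction manager returned here for its job
        // infrastructure instead of creating its own DataSourceTransactionManager.
        return jpaTransactionManager;
    }
}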
I suspect there are two different transaction managers present, which would be causing the issue. Annotating with @Transactional and specifying the transaction manager would help; see the sketch after the source link below.
Source:
Spring - Is it possible to use multiple transaction managers in the same application?
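A sketch of what that could look like on the writing side (FirstWriterService and saveEntity are hypothetical names, not from the question):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class FirstWriterService {

    @Autowired
    private FirstCrudRepository firstCrudRepository;

    // Explicitly bind this method to the JPA transaction manager of the first datasource
    @Transactional(transactionManager = "firstTransactionManager")
    public void saveEntity(FirstDbEntity entity) {
        firstCrudRepository.save(entity);
    }
}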

Create metadata tables for Spring Batch in an embedded database

I'm using the Spring Boot auto-configured Spring Batch setup with @EnableBatchProcessing. The problem is that it creates the metadata tables in the main database, and I don't want this behavior. I would like to save all the Spring Batch information to an embedded database.
I've tried using the spring.batch.initialize-schema=embedded property and adding H2 to the classpath, overriding the DefaultBatchConfigurer bean with the H2 data source, and replacing the JobRepository and JobLauncher beans, but it still creates the metadata tables in the main Oracle database. I'm using Spring Batch 3.0.8 and Spring Boot 1.5.9.
Any help is appreciated, thank you!
UPDATE: Adding some configuration:
H2 config:
@Bean
public DataSource springBatchDataSource() {
    DriverManagerDataSource dataSource = new DriverManagerDataSource();
    dataSource.setDriverClassName("org.h2.Driver");
    dataSource.setUrl("jdbc:h2:mem:db;DB_CLOSE_DELAY=-1");
    dataSource.setUsername("sa");
    dataSource.setPassword("sa");
    return dataSource;
}
Oracle:
@Bean
@Primary
public DataSource dataSource() throws SQLException {
    PoolDataSourceImpl dataSource = new PoolDataSourceImpl();
    dataSource.setConnectionFactoryClassName(environment.getRequiredProperty("db.driverClassName"));
    dataSource.setURL(environment.getRequiredProperty("db.url"));
    dataSource.setUser(environment.getRequiredProperty("db.username"));
    dataSource.setPassword(environment.getRequiredProperty("db.password"));
    dataSource.setFastConnectionFailoverEnabled(Boolean.valueOf(environment
            .getRequiredProperty("db.fast.connect.failover.enabled")));
    dataSource.setValidateConnectionOnBorrow(true);
    dataSource.setSQLForValidateConnection("SELECT SYSDATE FROM DUAL");
    dataSource.setONSConfiguration(environment.getRequiredProperty("db.ons.config"));
    dataSource.setInitialPoolSize(Integer.valueOf(environment.getRequiredProperty("db.initial.pool.size")));
    dataSource.setMinPoolSize(Integer.valueOf(environment.getRequiredProperty("db.min.pool.size")));
    dataSource.setMaxPoolSize(Integer.valueOf(environment.getRequiredProperty("db.max.pool.size")));
    dataSource.setAbandonedConnectionTimeout(120);
    dataSource.setInactiveConnectionTimeout(360);
    dataSource.setTimeToLiveConnectionTimeout(0);
    return dataSource;
}
Batch configuration:
@Configuration
@EnableBatchProcessing
public class BatchConfigurer extends DefaultBatchConfigurer {

    @Override
    @Autowired
    public void setDataSource(@Qualifier("springBatchDataSource") DataSource dataSource) {
        super.setDataSource(dataSource);
    }
}
and some related properties:
spring:
  batch:
    job.enabled: false
1. Redefine the BasicBatchConfigurer.
2. In spring-boot-batch-starter, you must create your own BatchDataSourceInitializer so that it will execute the init SQL scripts.
@Configuration
public class MyBatchConfigurer {

    @Bean
    public BasicBatchConfigurer batchConfigurer(BatchProperties properties,
            @Qualifier("springBatchDataSource") DataSource dataSource,
            ObjectProvider<TransactionManagerCustomizers> transactionManagerCustomizers) {
        return new BasicBatchConfigurer(properties, dataSource,
                transactionManagerCustomizers.getIfAvailable());
    }

    @Bean
    public BatchDataSourceInitializer batchDataSourceInitializer(@Qualifier("springBatchDataSource") DataSource dataSource,
            ResourceLoader resourceLoader, BatchProperties properties) {
        return new BatchDataSourceInitializer(dataSource, resourceLoader, properties);
    }
}
You need to qualify your H2 datasource bean:
#Bean(name = "springBatchDataSource")
public DataSource springBatchDataSource() {
DriverManagerDataSource dataSource = new DriverManagerDataSource();
dataSource.setDriverClassName("org.h2.Driver");
dataSource.setUrl("jdbc:h2:mem:db;DB_CLOSE_DELAY=-1");
dataSource.setUsername("sa");
dataSource.setPassword("sa");
return dataSource;
}
I hope this helps.

Does PersistenceAnnotationBeanPostProcessor somehow affect the Environment or @PropertySource?

Hi, I am a beginner in Spring and JPA integration. While trying to configure my database connection, details handler, etc., I came across a strange behavior of Spring.
First of all, I have 3 config files:
1) RootConfig - contains everything but controllers
2) WebConfig - contains every bean annotated as a controller
3) JdbcConfig - contains the beans related to the dataSource; this config is imported by RootConfig using the @Import(JdbcConfig.class) annotation.
RootConfig looks like this:
@Configuration
@Import(JdbcConfig.class)
@ComponentScan(basePackages = "app", excludeFilters = {@ComponentScan.Filter(type = FilterType.ANNOTATION, value = {EnableWebMvc.class, Controller.class})})
public class RootConfig
{
}
JdbcConfig:
@Configuration
@PropertySource("classpath:db.properties")
@EnableTransactionManagement
public class JdbcConfig
{
    @Resource
    public Environment env;

    @Bean
    public DataSource dataSource()
    {
        System.out.println(env);
        DriverManagerDataSource ds = new DriverManagerDataSource();
        ds.setDriverClassName(env.getProperty("dataSource.driverClassName"));
        ds.setUrl(env.getProperty("dataSource.Url"));
        ds.setUsername(env.getProperty("dataSource.username"));
        ds.setPassword(env.getProperty("dataSource.password"));
        return ds;
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactoryBean(DataSource ds, JpaVendorAdapter jpaVendorAdapter)
    {
        LocalContainerEntityManagerFactoryBean emfb = new LocalContainerEntityManagerFactoryBean();
        emfb.setDataSource(ds);
        emfb.setJpaVendorAdapter(jpaVendorAdapter);
        emfb.setPackagesToScan("app.model");
        return emfb;
    }

    @Bean
    public JpaVendorAdapter jpaVendorAdapter()
    {
        HibernateJpaVendorAdapter adapter = new HibernateJpaVendorAdapter();
        adapter.setDatabase(Database.POSTGRESQL);
        adapter.setShowSql(true);
        adapter.setGenerateDdl(true);
        adapter.setDatabasePlatform(env.getProperty("dataSource.dialect"));
        return adapter;
    }

    @Bean
    public BeanPostProcessor beanPostProcessor()
    {
        return new PersistenceExceptionTranslationPostProcessor();
    }

    @Bean
    public JpaTransactionManager jpaTransactionManager(EntityManagerFactory em)
    {
        return new JpaTransactionManager(em);
    }
}
At this moment everything works fine: the Environment field is not null and contains all defined properties. The problem appears when I try to add a PersistenceAnnotationBeanPostProcessor bean. When I add this bean to JdbcConfig.class, the Environment field becomes null, but when I add it to RootConfig, the Environment again contains all needed values. So is there any known problem with @PropertySource and this bean? Does PersistenceAnnotationBeanPostProcessor somehow affect the @PropertySource or the @Autowired (@Inject/@Resource) Environment field? Is there any reason why the Environment must be configured in the main config and cannot be imported from another config via @Import?
I think your problem is related to this Spring issue: SPR-8269.
Can you try setting up the PersistenceAnnotationBeanPostProcessor bean definition as static?
I also had the same problem and solved it this way.
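A sketch of that change inside JdbcConfig; declaring BeanPostProcessor @Bean methods as static lets the container register them without instantiating the enclosing configuration class early, which is the kind of lifecycle problem the linked issue describes:

import org.springframework.context.annotation.Bean;
import org.springframework.orm.jpa.support.PersistenceAnnotationBeanPostProcessor;

// Inside JdbcConfig
@Bean
public static PersistenceAnnotationBeanPostProcessor persistenceAnnotationBeanPostProcessor() {
    return new PersistenceAnnotationBeanPostProcessor();
}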

#Autowired bean not being found

I'm getting errors trying to inject resource dependencies into my unit tests.
My approach has been to write a TestConfig.java to replace the production applicationContext.xml, which manages the wiring of the beans, so that I can run the tests with an in-memory database and just test components.
TestConfig.java
@Configuration
@EnableTransactionManagement
public class TestConfig {

    @Bean
    public DataSource dataSource() {
        DriverManagerDataSource ds = new DriverManagerDataSource();
        ds.setDriverClassName("org.hsqldb.jdbcDriver");
        ds.setUrl("jdbc:hsqldb:mem:testdb");
        ds.setUsername("sa");
        ds.setPassword("");
        return ds;
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactoryBean() {
        LocalContainerEntityManagerFactoryBean lcemfb
                = new LocalContainerEntityManagerFactoryBean();
        lcemfb.setDataSource(this.dataSource());
        lcemfb.setPackagesToScan(new String[] {"com.dao", "com.data"});
        lcemfb.setPersistenceUnitName("MyTestPU");
        HibernateJpaVendorAdapter va = new HibernateJpaVendorAdapter();
        lcemfb.setJpaVendorAdapter(va);
        Properties ps = new Properties();
        ps.put("hibernate.dialect", "org.hibernate.dialect.HSQLDialect");
        ps.put("hibernate.hbm2ddl.auto", "create");
        lcemfb.setJpaProperties(ps);
        lcemfb.afterPropertiesSet();
        return lcemfb;
    }

    @Bean
    public PlatformTransactionManager transactionManager() {
        JpaTransactionManager tm = new JpaTransactionManager();
        tm.setEntityManagerFactory(this.entityManagerFactoryBean().getObject());
        return tm;
    }

    @Bean
    public PersistenceExceptionTranslationPostProcessor exceptionTranslation() {
        return new PersistenceExceptionTranslationPostProcessor();
    }

    @Bean
    public AutowiredAnnotationBeanPostProcessor autowiredAnnotationBeanPostProcessor() {
        return new AutowiredAnnotationBeanPostProcessor();
    }
}
ProductsDaoTest.java
@ContextConfiguration(classes = { TestConfig.class })
@RunWith(SpringJUnit4ClassRunner.class)
public class ProductsDaoTest {

    @Resource(name = "com.dao.ProductsDao")
    private ProductsDao testDao;

    @Test
    public void testSaveProduct() {
        Product productA = new Product();
        testDao.save(productA);
        Set<Product> products = testDao.getAllProducts();
        assertNotNull(products);
    }
}
The error is: Error creating bean with name 'com.dao.ProductsDaoTest': Injection of resource dependencies failed.
So it can't find the ProductsDao bean, which is a @Repository with an @Autowired sessionFactory.
So my guess is that because I'm not naming the beans using XML, it can't find it, though I thought it should automatically pick it up from setPackagesToScan(). So is there a way to manually insert the bean mapping so that it can be found?
Also, more generally, is this a reasonable way to go about testing Spring DAO configurations?
Regards,
Iain
I think you are trying to use the wrong name for your DAO bean in the @Resource annotation. Have you explicitly specified the name of the ProductsDao bean using @Qualifier? If not, then as I remember, by default the name of the bean will be productsDao. So you should inject your DAO like:
@Resource(name = "productsDao")
private ProductsDao testDao;
If you have only one ProductsDao implementation, then simply write:
@Autowired
private ProductsDao testDao;
or
@Inject
private ProductsDao testDao;
If you want to give a specific name to the DAO, then use the following construction:
@Repository
@Qualifier("specificName")
public class ProductDAO...
EDIT:
As Boris noted, you should also specify which packages to scan for defined beans (classes annotated with @Component, @Service, @Repository...). For this you should add the @ComponentScan annotation to your configuration class definition.
@Configuration
@EnableTransactionManagement
@ComponentScan(basePackages = {"package_to_scan"})
public class TestConfig {...}
