I created a Spring Batch application with Java configuration. I have a main method and a class that represents a job.
@ComponentScan
@EnableAutoConfiguration
public class App {
    public static void main(String[] args) {
        System.out.println("Starting Spring Batch Execution -------");
        SpringApplication.run(App.class, args);
    }
}
@Configuration
@EnableBatchProcessing
public class FlatFileJob {

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory steps;

    /**
     * Create and configure job
     * @return
     */
    @Bean(name = "Read RabbitMQ")
    public Job addFlatFileJob() {
        return jobs.get("carJob")
                .start(this.flatFileStep())
                .build();
    }

    /**
     * Create and configure the only step
     * @return
     */
    @Bean
    public Step flatFileStep() {
        return steps.get("step")
                .<Car, Car>chunk(3)
                .reader(new CarItemReader())
                .processor(new CarItemProcessor())
                .writer(new CarItemWriter())
                .build();
    }

    @Bean
    public PlatformTransactionManager transactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean
    public JobRepository jobRepository() throws Exception {
        JobRepository jobRepository = (JobRepository) new JobRepositoryFactoryBean();
        return jobRepository;
    }

    @Bean
    public JdbcTemplate jdbcTemplate(DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }

    @Bean
    public DataSource getDataSource() {
        BasicDataSource dataSource = new BasicDataSource();
        dataSource.setDriverClassName("org.postgresql.Driver");
        dataSource.setUrl("jdbc:postgresql://127.0.0.1:5432/spring_batch");
        dataSource.setUsername("xxx");
        dataSource.setPassword("xxx");
        return dataSource;
    }
}
When I execute this, I get an exception.
Caused by: java.lang.IllegalArgumentException: DataSource must not be null.
at org.springframework.util.Assert.notNull(Assert.java:112)
at org.springframework.batch.core.repository.support.JobRepositoryFactoryBean.afterPropertiesSet(JobRepositoryFactoryBean.java:171)
at org.springframework.batch.core.repository.support.AbstractJobRepositoryFactoryBean.getObject(AbstractJobRepositoryFactoryBean.java:202)
at neoway.com.job.FlatFileJob.jobRepository(FlatFileJob.java:88)
at neoway.com.job.FlatFileJob$$EnhancerBySpringCGLIB$$7ec3c4f6.CGLIB$jobRepository$0(<generated>)
at neoway.com.job.FlatFileJob$$EnhancerBySpringCGLIB$$7ec3c4f6$$FastClassBySpringCGLIB$$990caa45.invoke(<generated>)
at org.springframework.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:228)
at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:312)
at neoway.com.job.FlatFileJob$$EnhancerBySpringCGLIB$$7ec3c4f6.jobRepository(<generated>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:166)
I configured everything in Java, including the DataSource. I don't know why Spring didn't recognize the DataSource configuration. What's the problem?
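For comparison, my understanding from the reference documentation is that a JobRepositoryFactoryBean is normally given the DataSource and transaction manager explicitly before the repository is retrieved; this is a sketch of that wiring, not my current code:

```java
@Bean
public JobRepository jobRepository(DataSource dataSource,
                                   PlatformTransactionManager transactionManager) throws Exception {
    JobRepositoryFactoryBean factoryBean = new JobRepositoryFactoryBean();
    factoryBean.setDataSource(dataSource);                 // must be set, or afterPropertiesSet() fails
    factoryBean.setTransactionManager(transactionManager);
    factoryBean.afterPropertiesSet();
    return factoryBean.getObject();                        // the actual JobRepository, not the factory bean
}
```

Is my version above equivalent to this, or is the difference the cause of the error?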
Related
I am using Spring Batch with Spring Cloud Task to automate data read/write (read from a file and store in MongoDB). My use case has 2 steps (I will add 1 more step after these 2 work). I am trying to use remote partitioning with Spring Cloud Task's DeployerPartitionHandler class to launch workers from the master node, instead of using ActiveMQ/RabbitMQ with Spring Integration, as far as I understand it. I have created 2 partitioners and 2 partition handler beans for my 2 steps. Below is sample code. I am getting the exception below.
2020-03-11 12:03:59 - o.s.batch.core.step.AbstractStep - Encountered an error executing step step1 in job Job669228617
java.lang.NullPointerException: null
at org.springframework.cloud.task.batch.partition.DeployerPartitionHandler.launchWorker(DeployerPartitionHandler.java:347)
at org.springframework.cloud.task.batch.partition.DeployerPartitionHandler.launchWorkers(DeployerPartitionHandler.java:313)
at org.springframework.cloud.task.batch.partition.DeployerPartitionHandler.handle(DeployerPartitionHandler.java:302)
at org.springframework.batch.core.partition.support.PartitionStep.doExecute(PartitionStep.java:106)
at org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:208)
at org.springframework.batch.core.job.SimpleStepHandler.handleStep(SimpleStepHandler.java:148)
at org.springframework.batch.core.job.AbstractJob.handleStep(AbstractJob.java:410)
at org.springframework.batch.core.job.SimpleJob.doExecute(SimpleJob.java:136)
at org.springframework.batch.core.job.AbstractJob.execute(AbstractJob.java:319)
at org.springframework.batch.core.launch.support.SimpleJobLauncher$1.run(SimpleJobLauncher.java:147)
at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:50)
at org.springframework.batch.core.launch.support.SimpleJobLauncher.run(SimpleJobLauncher.java:140)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:344)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:198)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163)
at org.springframework.batch.core.configuration.annotation.SimpleBatchConfiguration$PassthruAdvice.invoke(SimpleBatchConfiguration.java:127)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:212)
at com.sun.proxy.$Proxy77.run(Unknown Source)
at org.springframework.boot.autoconfigure.batch.JobLauncherCommandLineRunner.execute(JobLauncherCommandLineRunner.java:192)
@Configuration
@ComponentScan(basePackageClasses = DbConfig.class)
public class JobConfig {

    @Autowired
    private TaskLauncher taskLauncher;
    @Autowired
    private JobExplorer jobExplorer;
    @Autowired
    private TaskRepository taskRepository;
    @Autowired
    private Reader2 reader2;
    @Autowired
    private Writer2 writer2;
    @Autowired
    private JobBuilderFactory jobBuilderFactory;
    @Autowired
    private StepBuilderFactory stepBuilderFactory;
    @Autowired
    private DelegatingResourceLoader resourceLoader;
    @Autowired
    private ConfigurableApplicationContext context;
    @Autowired
    public JobRepository jobRepository;
    @Autowired
    private Environment environment;

    private static final int GRID_SIZE = 2;

    @Autowired
    private Reader1 reader1;
    @Autowired
    private Writer1 writer1;

    @Autowired
    @Qualifier("partitionHandler1")
    private PartitionHandler partitionHandler1;
    @Autowired
    @Qualifier("partitionHandler2")
    private PartitionHandler partitionHandler2;
    @Bean
    @Profile("master")
    public Job masterJob() {
        Random random = new Random();
        return this.jobBuilderFactory.get("masterJob" + random.nextInt())
                .start(step1())
                .next(step2())
                .build();
    }

    @Bean
    @Profile("master")
    public Step step1() {
        return this.stepBuilderFactory.get("step1")
                .partitioner("slaveStep1", partitioner1())
                .partitionHandler(partitionHandler1)
                .taskExecutor(taskExecutor())
                .build();
    }

    @Bean
    @Profile("master")
    public Step step2() {
        return this.stepBuilderFactory.get("step2")
                .partitioner("slaveStep2", partitioner2())
                .partitionHandler(partitionHandler2)
                .taskExecutor(taskExecutor())
                .build();
    }

    @Bean
    @Profile("worker")
    public DeployerStepExecutionHandler stepExecutionHandler(JobExplorer jobExplorer) {
        return new DeployerStepExecutionHandler(this.context, jobExplorer, this.jobRepository);
    }

    @Bean
    public Step slaveStep1() {
        return this.stepBuilderFactory.get("slaveStep1")
                .<Domain1, Domain1>chunk(50)
                .reader(reader1)
                .writer(writer1)
                .listener(stepExecutionListner())
                .build();
    }

    @Bean
    public Step slaveStep2() {
        return this.stepBuilderFactory.get("slaveStep2")
                .<Domain2, Domain2>chunk(50)
                .reader(reader2)
                .writer(writer2)
                .listener(stepExecutionListner())
                .build();
    }

    @Bean
    public Partitioner partitioner1() {
        FilePartitioner filePartitioner = new FilePartitioner("classpath:input/test1*.csv");
        return filePartitioner.getFilesPartitioner();
    }

    @Bean
    public Partitioner partitioner2() {
        FilePartitioner filePartitioner = new FilePartitioner("classpath:input/test2*.csv");
        return filePartitioner.getFilesPartitioner();
    }

    @Bean(name = "partitionHandler1")
    public PartitionHandler partitionHandler1(TaskLauncher taskLauncher,
            JobExplorer jobExplorer, TaskRepository taskRepository) {
        Resource resource = this.resourceLoader.getResource("maven://com.abc:test:1.0-SNAPSHOT");
        DeployerPartitionHandler partitionHandler =
                new DeployerPartitionHandler(taskLauncher, jobExplorer, resource, "ormBusUnitLoaderStep", taskRepository);
        List<String> commandLineArgs = new ArrayList<>(3);
        commandLineArgs.add("--spring.profiles.active=worker");
        commandLineArgs.add("--spring.cloud.task.initialize.enable=false");
        commandLineArgs.add("--spring.batch.initializer.enabled=false");
        partitionHandler.setCommandLineArgsProvider(new PassThroughCommandLineArgsProvider(commandLineArgs));
        SimpleEnvironmentVariablesProvider environmentVariablesProvider = new SimpleEnvironmentVariablesProvider(this.environment);
        partitionHandler.setEnvironmentVariablesProvider(environmentVariablesProvider);
        partitionHandler.setMaxWorkers(3);
        partitionHandler.setApplicationName("Job");
        return partitionHandler;
    }

    @Bean(name = "partitionHandler2")
    //@Scope(value = "prototype")
    public PartitionHandler partitionHandler2(TaskLauncher taskLauncher,
            JobExplorer jobExplorer, TaskRepository taskRepository) {
        Resource resource = this.resourceLoader.getResource("maven://com.abc:test:1.0-SNAPSHOT");
        DeployerPartitionHandler partitionHandler =
                new DeployerPartitionHandler(taskLauncher, jobExplorer, resource, "cvaRmaStep", taskRepository);
        List<String> commandLineArgs = new ArrayList<>(3);
        commandLineArgs.add("--spring.profiles.active=worker");
        commandLineArgs.add("--spring.cloud.task.initialize.enable=false");
        commandLineArgs.add("--spring.batch.initializer.enabled=false");
        partitionHandler.setCommandLineArgsProvider(new PassThroughCommandLineArgsProvider(commandLineArgs));
        SimpleEnvironmentVariablesProvider environmentVariablesProvider = new SimpleEnvironmentVariablesProvider(this.environment);
        partitionHandler.setEnvironmentVariablesProvider(environmentVariablesProvider);
        partitionHandler.setMaxWorkers(3);
        partitionHandler.setApplicationName("CVAJob");
        return partitionHandler;
    }

    @Bean
    @StepScope
    public StepExecutionListner stepExecutionListner() {
        return new StepExecutionListner();
    }
}
Below is the DB config:
@Configuration
public class DbConfig implements BatchConfigurer {

    @ConfigurationProperties(prefix = "spring.datasource")
    @Bean(name = "batchDataSource")
    @Primary
    public DataSource dataSource() {
        return DataSourceBuilder.create().build();
    }

    @Override
    public JobRepository getJobRepository() throws Exception {
        JobRepositoryFactoryBean factoryBean = new JobRepositoryFactoryBean();
        factoryBean.setDatabaseType("ORACLE");
        factoryBean.setDataSource(dataSource());
        factoryBean.setTransactionManager(getTransactionManager());
        factoryBean.setIsolationLevelForCreate("ISOLATION_READ_COMMITTED");
        factoryBean.setTablePrefix("SCHEMA.BATCH_");
        return factoryBean.getObject();
    }

    @Override
    public PlatformTransactionManager getTransactionManager() throws Exception {
        return new DataSourceTransactionManager(dataSource());
    }

    @Override
    public JobLauncher getJobLauncher() throws Exception {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(getJobRepository());
        jobLauncher.afterPropertiesSet();
        return jobLauncher;
    }

    @Override
    public JobExplorer getJobExplorer() throws Exception {
        JobExplorerFactoryBean factory = new JobExplorerFactoryBean();
        factory.setDataSource(dataSource());
        factory.afterPropertiesSet();
        return factory.getObject();
    }

    @Bean
    public TaskConfigurer taskConfigurer(
            @Qualifier("batchDataSource") DataSource batchDataSource) {
        return new DefaultTaskConfigurer(batchDataSource);
    }
}
How to accomplish my use case using remote partitioning?
I think you can have just a single partitionHandler bean. What you can do instead is use a partitioned subflow with multiple steps, rather than multiple steps each with its own partition handler.
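A sketch of what that could look like, reusing the beans from the question (the names masterStep, workerFlowStep, and workerFlow here are hypothetical, and this is untested): a single partitioned master step whose worker executes both slave steps as one flow.

```java
@Bean
@Profile("master")
public Step masterStep() {
    return stepBuilderFactory.get("masterStep")
            .partitioner("workerFlowStep", partitioner1())
            .partitionHandler(partitionHandler1) // the single handler, targeting "workerFlowStep"
            .build();
}

// Worker side: one step that wraps both slave steps in a flow
@Bean
public Step workerFlowStep() {
    Flow flow = new FlowBuilder<SimpleFlow>("workerFlow")
            .start(slaveStep1())
            .next(slaveStep2())
            .build();
    return stepBuilderFactory.get("workerFlowStep")
            .flow(flow)
            .build();
}
```

Each partition then runs slaveStep1 followed by slaveStep2 on the same worker, so only one DeployerPartitionHandler (and one partitioned master step) is needed.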
I am trying to load data from SQL Server, apply some transformations, and write it to a CSV file using the Spring Batch scheduler. It all works fine when everything is in the same class.
This is my code:
package com.abc.tools.bootbatch;
@Configuration
@EnableBatchProcessing
public class BatchConfiguration {

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Autowired
    public DataSource dataSource;

    private static final String qry = "select top 20 colA, colB, colC from ABC";
    private Resource outputResource = new FileSystemResource("output/outputData.csv");

    @Bean
    public DataSource dataSource() {
        final DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(driver_class);
        dataSource.setUrl("db_url");
        dataSource.setUsername(usr);
        dataSource.setPassword(pwd);
        return dataSource;
    }

    @Bean
    ItemReader<Trade> reader() {
        JdbcCursorItemReader<Trade> databaseReader = new JdbcCursorItemReader<>();
        databaseReader.setDataSource(dataSource);
        databaseReader.setSql(qry);
        databaseReader.setRowMapper(new BeanPropertyRowMapper<>(Trade.class));
        return databaseReader;
    }

    @Bean
    public TradeProcessor processor() {
        return new TradeProcessor();
    }

    @Bean
    public FlatFileItemWriter<Trade> writer() {
        // Create writer instance
        FlatFileItemWriter<Trade> writer = new FlatFileItemWriter<>();
        // Set output file location
        writer.setResource(outputResource);
        // All job repetitions should "append" to the same output file
        writer.setAppendAllowed(true);
        // Name field values sequence based on object properties
        writer.setLineAggregator(new DelimitedLineAggregator<Trade>() {
            {
                setDelimiter(",");
                setFieldExtractor(new BeanWrapperFieldExtractor<Trade>() {
                    {
                        setNames(new String[] { "colA", "colB", "colC" });
                    }
                });
            }
        });
        return writer;
    }

    @Bean
    public Step step1() {
        return stepBuilderFactory.get("step1").<Trade, Trade>chunk(10)
                .reader(reader())
                .processor(processor())
                .writer(writer())
                .build();
    }

    @Bean
    public Job exportUserJob() {
        return jobBuilderFactory.get("exportUserJob")
                .incrementer(new RunIdIncrementer())
                .flow(step1())
                .end()
                .build();
    }
}
When I separate the processing, loading, and data reading into different classes, it works fine with autowiring, unless I use the batch job. When using the batch job, it gives an error instantiating the database.
So I removed the autowire and tried to do something like this:
@Configuration
@EnableBatchProcessing
public class BatchConfiguration {

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Autowired
    public DBConfig dbConfig;

    public DataConnection dataconnection = new DataConnection();
    DataReader reader = new DataReader();
    TradeProcessor processor = new TradeProcessor();
    FlatFileWriter flatFileWriter = new FlatFileWriter();
    DataSource ds = dataconnection.getDataSource(dbConfig);

    @Bean
    public Step step1() {
        return stepBuilderFactory.get("step1").<Trade, Trade>chunk(10)
                .reader(reader.reader(ds))
                .processor(processor.processor())
                .writer(flatFileWriter.writer())
                .build();
    }

    @Bean
    public Job exportUserJob() {
        return jobBuilderFactory.get("exportUserJob")
                .incrementer(new RunIdIncrementer())
                .flow(step1())
                .end()
                .build();
    }
}
This gives "Failed to initialize BatchConfiguration":
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'batchConfiguration'
I think I am missing something to tie it all together. I am new to Spring; any help is appreciated.
In your first example, you are autowiring a datasource and declaring a datasource bean in the same class, which is incorrect. In the second example, instead of autowiring DBConfig, you can import it with @Import(DBConfig.class) and autowire the datasource in your job configuration as needed. Here is a typical configuration:
@Configuration
public class DBConfig {

    @Bean
    public DataSource dataSource() {
        final DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(driver_class);
        dataSource.setUrl("db_url");
        dataSource.setUsername(usr);
        dataSource.setPassword(pwd);
        return dataSource;
    }
}

@Configuration
@EnableBatchProcessing
@Import(DBConfig.class)
public class BatchConfiguration {

    @Bean
    ItemReader<Trade> reader(DataSource datasource) {
        // use datasource to configure the reader
    }
}
Since you use Spring Boot, you can remove the DBConfig class, configure the datasource as needed in your application.properties file and the datasource will be automatically injected in your BatchConfiguration.
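For instance, Spring Boot's auto-configuration picks up the standard spring.datasource.* keys; the values below are placeholders for a SQL Server setup, adjust them for your environment:

```properties
# application.properties — placeholder values
spring.datasource.driver-class-name=com.microsoft.sqlserver.jdbc.SQLServerDriver
spring.datasource.url=jdbc:sqlserver://localhost:1433;databaseName=mydb
spring.datasource.username=user
spring.datasource.password=secret
```

With these set, Boot creates the DataSource bean for you and injects it wherever it is autowired.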
I am getting the below error when I try to run a Spring Batch job.
java.lang.IllegalStateException: Failed to execute CommandLineRunner
at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:779) ~[spring-boot-1.5.3.RELEASE.jar:1.5.3.RELEASE]
at org.springframework.boot.SpringApplication.callRunners(SpringApplication.java:760) ~[spring-boot-1.5.3.RELEASE.jar:1.5.3.RELEASE]
at org.springframework.boot.SpringApplication.afterRefresh(SpringApplication.java:747) ~[spring-boot-1.5.3.RELEASE.jar:1.5.3.RELEASE]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:315) ~[spring-boot-1.5.3.RELEASE.jar:1.5.3.RELEASE]
at com.amhi.care.claims.query.batch.QueryBatchApplication.main(QueryBatchApplication.java:15) [classes/:na]
Caused by: java.lang.IndexOutOfBoundsException: Index: 0, Size: 0
at java.util.ArrayList.rangeCheck(ArrayList.java:657) ~[na:1.8.0_192]
at java.util.ArrayList.get(ArrayList.java:433) ~[na:1.8.0_192]
at org.springframework.boot.autoconfigure.batch.JobLauncherCommandLineRunner.getNextJobParameters(JobLauncherCommandLineRunner.java:143) ~[spring-boot-autoconfigure-1.5.3.RELEASE.jar:1.5.3.RELEASE]
at org.springframework.boot.autoconfigure.batch.JobLauncherCommandLineRunner.execute(JobLauncherCommandLineRunner.java:212) ~[spring-boot-autoconfigure-1.5.3.RELEASE.jar:1.5.3.RELEASE]
at org.springframework.boot.autoconfigure.batch.JobLauncherCommandLineRunner.executeLocalJobs(JobLauncherCommandLineRunner.java:231) ~[spring-boot-autoconfigure-1.5.3.RELEASE.jar:1.5.3.RELEASE]
at org.springframework.boot.autoconfigure.batch.JobLauncherCommandLineRunner.launchJobFromProperties(JobLauncherCommandLineRunner.java:123) ~[spring-boot-autoconfigure-1.5.3.RELEASE.jar:1.5.3.RELEASE]
at org.springframework.boot.autoconfigure.batch.JobLauncherCommandLineRunner.run(JobLauncherCommandLineRunner.java:117) ~[spring-boot-autoconfigure-1.5.3.RELEASE.jar:1.5.3.RELEASE]
at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:776) ~[spring-boot-1.5.3.RELEASE.jar:1.5.3.RELEASE]
... 4 common frames omitted
Code:
@Configuration
@EnableBatchProcessing
@EnableScheduling
@EnableTransactionManagement
@ComponentScan(QueryBatchConstants.COMPONENT_SCAN_PACKAGE)
@Import(DataSourcecConfiguration.class)
public class QueryBatchConfiguration {

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Bean
    public Job reminderJob() {
        return jobBuilderFactory.get("reminderJob").flow(step()).next(closureStep()).end().build();
    }

    @Bean
    public Step step() {
        return stepBuilderFactory.get("step").tasklet(tasklet()).build();
    }

    @Bean
    public Step closureStep() {
        return stepBuilderFactory.get("closureStep").tasklet(closureTasklet()).build();
    }

    @Bean
    public Tasklet tasklet() {
        return new QueryBatchTasklet();
    }

    @Bean
    public Tasklet closureTasklet() {
        return new ClosureBatchTasklet();
    }

    @Bean
    @JobScope
    public JobParameters jobParamater() {
        return new JobParametersBuilder()
                .addDate("date", new Date())
                .toJobParameters();
    }

    /**
     * This method is used to configure the Dozer Mapper
     *
     * @return Mapper
     * @throws IOException
     */
    @Bean(name = "mapper")
    public Mapper configDozerMapper() throws IOException {
        DozerBeanMapper mapper = new DozerBeanMapper();
        return mapper;
    }

    /**
     * This method is used to create RIDC client manager
     *
     * @return IdcClientManager
     */
    @Bean(name = "idcClientmanager")
    public IdcClientManager idcClientmanager() {
        return new IdcClientManager();
    }
}
@Configuration
@EnableTransactionManagement
@ComponentScan(QueryBatchConstants.COMPONENT_SCAN_PACKAGE)
@PropertySource(QueryBatchConstants.CLASSPATH_APPLICATION_PROPERTIES)
public class DataSourcecConfiguration {

    @Resource
    private Environment env;

    @Autowired
    public DataSource dataSource;

    /**
     * This method is used to configure a data source
     *
     * @return DataSource
     * @throws SQLException
     */
    @Bean
    public DataSource dataSource() throws SQLException {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(env.getRequiredProperty(QueryBatchConstants.DATABASE_DRIVER));
        dataSource.setUrl(env.getRequiredProperty(QueryBatchConstants.DATABASE_URL));
        dataSource.setUsername(env.getRequiredProperty(QueryBatchConstants.DATABASE_USERNAME));
        dataSource.setPassword(env.getRequiredProperty(QueryBatchConstants.DATABASE_PSWRD));
        return dataSource;
    }

    /**
     * This method is used to configure an entity manager factory
     *
     * @return LocalContainerEntityManagerFactoryBean
     * @throws SQLException
     */
    @Autowired
    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory() throws SQLException {
        LocalContainerEntityManagerFactoryBean entityManager = new LocalContainerEntityManagerFactoryBean();
        entityManager.setDataSource(dataSource());
        entityManager.setPersistenceProviderClass(HibernatePersistence.class);
        entityManager.setPackagesToScan(env.getRequiredProperty(QueryBatchConstants.PACKAGES_TO_SCAN));
        entityManager.setJpaProperties(jpaProperties());
        return entityManager;
    }

    /**
     * This method is used to configure the JPA properties
     *
     * @return JPA properties
     */
    private Properties jpaProperties() {
        Properties properties = new Properties();
        properties.put(QueryBatchConstants.HIBERNATE_DIALECT, env.getRequiredProperty(QueryBatchConstants.HIBERNATE_DIALECT));
        properties.put(QueryBatchConstants.HIBERNATE_SHOW_SQL, env.getRequiredProperty(QueryBatchConstants.HIBERNATE_SHOW_SQL));
        properties.put(QueryBatchConstants.HIBERNATE_JDBC_META_DATA, env.getRequiredProperty(QueryBatchConstants.HIBERNATE_FALSE));
        return properties;
    }

    /**
     * This method is used to configure the transaction manager
     *
     * @param emf
     * @return JpaTransactionManager
     */
    @Bean
    public JpaTransactionManager transactionManager(EntityManagerFactory emf) {
        JpaTransactionManager transactionManager = new JpaTransactionManager();
        transactionManager.setEntityManagerFactory(emf);
        return transactionManager;
    }

    @Bean
    public JdbcTemplate jdbcTemplate(DataSource dataSource) {
        JdbcTemplate jdbcTemplate = new JdbcTemplate(dataSource);
        jdbcTemplate.setResultsMapCaseInsensitive(true);
        return jdbcTemplate;
    }
}
@Component
public class QueryBatchTasklet implements Tasklet {

    @Autowired
    private QueryBatchService autoClosureService;

    @Override
    public RepeatStatus execute(StepContribution stepContribution, ChunkContext chunkContext) throws Exception {
        autoClosureService.updateStatusAndTriggerComm();
        return null;
    }
}

@Component
public class ClosureBatchTasklet implements Tasklet {

    @Autowired
    private ClosureBatchService closureBatchService;

    @Override
    public RepeatStatus execute(StepContribution stepContribution, ChunkContext chunkContext) throws Exception {
        closureBatchService.updateStatusAndTriggerComm();
        return null;
    }
}
I had this error all of a sudden. Check if there are records in the database you connect to in the BATCH_JOB_INSTANCE table which don't have matching records in the BATCH_JOB_EXECUTION table. Or perhaps there is some other data inconsistency in the Spring batch metadata tables. If possible, drop and recreate them.
This error came about for us as the result of a database purge job which didn't delete everything it needed to.
Please see my answer on this similar question, it may help.
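To check for the inconsistency described above, a query along these lines against the standard Spring Batch metadata schema (add your table prefix if you use one) should list job instances that have no executions; this is a sketch, not tailored to any particular database:

```sql
SELECT ji.JOB_INSTANCE_ID, ji.JOB_NAME
FROM BATCH_JOB_INSTANCE ji
LEFT JOIN BATCH_JOB_EXECUTION je
       ON je.JOB_INSTANCE_ID = ji.JOB_INSTANCE_ID
WHERE je.JOB_EXECUTION_ID IS NULL;
```

Any rows returned are the orphaned instances that trip up JobLauncherCommandLineRunner when it looks up the last execution.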
This question follows on from NoUniqueBeanDefinitionException when multiple datasources in Spring Boot and Mybatis project. I am trying to configure two dataSources in a Spring Boot and MyBatis project, but an exception occurs:
org.apache.ibatis.binding.BindingException: Invalid bound statement (not found): com.bookSystem.mapper.UserDao.findByDomain
    at org.apache.ibatis.binding.MapperMethod$SqlCommand.<init>(MapperMethod.java:227) ~[mybatis-3.4.6.jar:3.4.6]
    at org.apache.ibatis.binding.MapperMethod.<init>(MapperMethod.java:49) ~[mybatis-3.4.6.jar:3.4.6]
    at org.apache.ibatis.binding.MapperProxy.cachedMapperMethod(MapperProxy.java:65) ~[mybatis-3.4.6.jar:3.4.6]
    at org.apache.ibatis.binding.MapperProxy.invoke(MapperProxy.java:58) ~[mybatis-3.4.6.jar:3.4.6]
    at com.sun.proxy.$Proxy83.findByDomain(Unknown Source) ~[na:na]
    at com.bookSystem.serviceImp.UserServiceImp.findByDomain(UserServiceImp.java:72) ~[classes/:na]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_171]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_171]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_171]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_171]
application.properties
# multiple datasources
spring.datasource.primary.driverClassName = com.mysql.jdbc.Driver
spring.datasource.primary.url = jdbc:mysql://127.0.0.1/book?useSSL=false&useUnicode=true&characterEncoding=UTF-8
spring.datasource.primary.username = root
spring.datasource.primary.password = 123456
spring.datasource.primary.initialization-mode=always
spring.datasource.second.driverClassName = com.mysql.jdbc.Driver
spring.datasource.second.url = jdbc:mysql://127.0.0.1/newBook?useSSL=false&useUnicode=true&characterEncoding=UTF-8
spring.datasource.second.username = root
spring.datasource.second.password = 123456
spring.datasource.second.initialization-mode=always
mybatis.mapper-locations=classpath:mybatis/mapper/*.xml
Mapper directories:
Start of the project:
@SpringBootApplication(exclude = {
        DataSourceAutoConfiguration.class,
        DataSourceTransactionManagerAutoConfiguration.class
})
@EnableTransactionManagement
public class BookSystemApplication {
}
datasource configuration
@Configuration
public class DataSourceConfig {

    @Bean(name = "primaryDataSource")
    @Primary
    @ConfigurationProperties(prefix = "spring.datasource.primary")
    public DataSource primaryDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "secondDataSource")
    @ConfigurationProperties(prefix = "spring.datasource.second")
    public DataSource secondDataSource() {
        return DataSourceBuilder.create().build();
    }
}
primary dataSource
@Configuration
@MapperScan(basePackages = {"com.bookSystem.mapper"}, sqlSessionFactoryRef = "primarySqlSessionFactory")
public class MyBatisPrimaryDbConfig {

    @Autowired
    @Qualifier("primaryDataSource")
    private DataSource primaryDataSource;

    @Bean
    public SqlSessionFactory primarySqlSessionFactory() throws Exception {
        SqlSessionFactoryBean factoryBean = new SqlSessionFactoryBean();
        factoryBean.setDataSource(primaryDataSource);
        return factoryBean.getObject();
    }

    @Bean
    public SqlSessionTemplate primarySqlSessionTempmlate() throws Exception {
        SqlSessionTemplate template = new SqlSessionTemplate(primarySqlSessionFactory());
        return template;
    }
}
Transaction
@Configuration
public class TransactionConfig {

    @Autowired
    @Qualifier("primaryDataSource")
    private DataSource primary;

    @Autowired
    @Qualifier("secondDataSource")
    private DataSource second;

    @Bean(name = "primaryTx")
    @Primary
    public PlatformTransactionManager primaryTransaction() {
        return new DataSourceTransactionManager(primary);
    }

    @Bean(name = "secondTx")
    public PlatformTransactionManager secondTransaction() {
        return new DataSourceTransactionManager(second);
    }
}
UserDao
package com.bookSystem.mapper;

@Mapper
public interface UserDao {
    public User findByDomain(String domainName);
}
userMapper.xml, which uses the primary datasource:
<mapper namespace="com.bookSystem.mapper.UserDao">
    <select id="findByDomain" resultMap="userResultMap">
        SELECT
            u.userId as user_user_id,
            u.userName as user_user_name,
            u.domainName as user_domain_name,
            u.department as user_department,
            u.title as user_title,
            u.date as user_date,
            u.password as user_password,
            u.accountRole as user_account_role
        FROM USER u
        WHERE u.domainName = #{domainName}
    </select>
</mapper>
UserServiceImp, which uses the primary datasource. This is the service layer that triggers the exception:
@Service("UserService")
public class UserServiceImp implements UserService {

    @Autowired
    private UserDao userDao;

    @Override
    public User findByDomain(String domainName) {
        User user = userDao.findByDomain(domainName);
        return user;
    }
}
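One thing I am unsure about: since I build the SqlSessionFactory by hand, I suspect the mybatis.mapper-locations property is never applied to it, so the mapper XML may need to be registered explicitly on the factory bean. An untested sketch of that change:

```java
@Bean
public SqlSessionFactory primarySqlSessionFactory() throws Exception {
    SqlSessionFactoryBean factoryBean = new SqlSessionFactoryBean();
    factoryBean.setDataSource(primaryDataSource);
    // The Boot property mybatis.mapper-locations only configures the
    // auto-configured factory; a hand-built factory needs the locations itself.
    factoryBean.setMapperLocations(
            new PathMatchingResourcePatternResolver()
                    .getResources("classpath:mybatis/mapper/*.xml"));
    return factoryBean.getObject();
}
```

Is this the cause of the "Invalid bound statement (not found)" error, or is something else missing?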
I'm working with Spring Data. I created a config class with @Bean, an @Entity class, and a Main.java, but when I run the project I get an exception:
Exception in thread "main" java.lang.NullPointerException
The @Autowired annotation doesn't work!
Main.java
public class Main {

    @Autowired
    private static TodoRepository todoRepository;

    public static void main(String[] args) {
        Todo todo = new Todo();
        todo.setId(1L);
        todo.setTitle("title");
        System.out.println(todoRepository); // null
        todoRepository.save(todo); // Exception in thread "main" java.lang.NullPointerException
    }
}
Context class
@Configuration
@EnableJpaRepositories(basePackages = {"repository"},
        entityManagerFactoryRef = "entityManagerFactory",
        transactionManagerRef = "transactionManager")
@EnableTransactionManagement
@PropertySource("classpath:app.properties")
public class PersistenceContext {

    public PersistenceContext() {
    }

    @Resource
    private Environment env;

    /**
     * The method that configures the datasource bean
     */
    @Bean(destroyMethod = "close")
    DataSource dataSource() {
        HikariConfig dataSourceConfig = new HikariConfig();
        dataSourceConfig.setJdbcUrl(env.getRequiredProperty("db.url"));
        dataSourceConfig.setDriverClassName(env.getRequiredProperty("db.driver"));
        dataSourceConfig.setUsername(env.getRequiredProperty("db.username"));
        dataSourceConfig.setPassword(env.getRequiredProperty("db.password"));
        return new HikariDataSource(dataSourceConfig);
    }

    /**
     * The method that configures the entity manager factory
     */
    @Bean
    LocalContainerEntityManagerFactoryBean entityManagerFactory(DataSource dataSource, Environment env) {
        LocalContainerEntityManagerFactoryBean entityManagerFactoryBean = new LocalContainerEntityManagerFactoryBean();
        entityManagerFactoryBean.setDataSource(dataSource);
        entityManagerFactoryBean.setJpaVendorAdapter(new HibernateJpaVendorAdapter());
        entityManagerFactoryBean.setPackagesToScan("entity");
        Properties jpaProperties = new Properties();
        jpaProperties.put("hibernate.dialect", env.getRequiredProperty("hibernate.dialect"));
        jpaProperties.put("hibernate.hbm2ddl.auto", env.getRequiredProperty("hibernate.hbm2ddl.auto"));
        jpaProperties.put("hibernate.show_sql", env.getRequiredProperty("hibernate.show_sql"));
        jpaProperties.put("hibernate.format_sql", env.getRequiredProperty("hibernate.format_sql"));
        entityManagerFactoryBean.setJpaProperties(jpaProperties);
        return entityManagerFactoryBean;
    }

    /**
     * The method that configures the transaction manager
     */
    @Bean
    JpaTransactionManager transactionManager(EntityManagerFactory entityManagerFactory) {
        JpaTransactionManager transactionManager = new JpaTransactionManager();
        transactionManager.setEntityManagerFactory(entityManagerFactory);
        return transactionManager;
    }
}
Repositories
public interface TodoRepository extends CrudRepository<Todo, Long> {
}
Stacktrace
null
Exception in thread "main" java.lang.NullPointerException
at Main.main(Main.java:28)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
Process finished with exit code 1
Your main class is not a managed Spring bean, so @Autowired is never processed on it. You need to create an ApplicationContext yourself; see below:
public class Main {
    public static void main(String[] args) {
        ApplicationContext ctx = new AnnotationConfigApplicationContext(PersistenceContext.class);
        TodoRepository todoRepository = ctx.getBean(TodoRepository.class);
        Todo todo = new Todo();
        todo.setId(1L);
        todo.setTitle("title");
        System.out.println(todoRepository); // not null
        todoRepository.save(todo);
    }
}
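Since AnnotationConfigApplicationContext is a ConfigurableApplicationContext, which (in recent Spring versions, as I recall) implements Closeable, the same bootstrap can be written with try-with-resources so the context, and the Hikari pool behind the destroyMethod = "close" bean, is shut down cleanly:

```java
public class Main {
    public static void main(String[] args) {
        try (AnnotationConfigApplicationContext ctx =
                     new AnnotationConfigApplicationContext(PersistenceContext.class)) {
            TodoRepository todoRepository = ctx.getBean(TodoRepository.class);
            Todo todo = new Todo();
            todo.setId(1L);
            todo.setTitle("title");
            todoRepository.save(todo);
        } // context is closed here, destroying singleton beans
    }
}
```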