How to launch Spring Batch Job Asynchronously - java

I have followed the Spring Batch docs and couldn't get my job to run asynchronously. I am running the job from a web container, and it is triggered via a REST endpoint.
I want to get the JobInstance ID and return it in the response before the whole job completes, so callers can check the status of the job later with that ID instead of waiting for it to finish. But I couldn't get it to work. Below is the sample code I tried. Please let me know what I am missing or doing wrong.
BatchConfig to make Async JobLauncher
@Configuration
public class BatchConfig {

    @Autowired
    JobRepository jobRepository;

    @Bean
    public JobLauncher simpleJobLauncher() throws Exception {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(jobRepository);
        jobLauncher.setTaskExecutor(new SimpleAsyncTaskExecutor());
        jobLauncher.afterPropertiesSet();
        return jobLauncher;
    }
}
Controller
@Autowired
JobLauncher jobLauncher;

@RequestMapping(value = "/trigger-job", method = RequestMethod.GET)
public Long workHard() throws Exception {
    JobParameters jobParameters = new JobParametersBuilder()
            .addLong("time", System.currentTimeMillis())
            .toJobParameters();
    JobExecution jobExecution = jobLauncher.run(batchComponent.customJob("paramhere"), jobParameters);
    System.out.println(jobExecution.getJobInstance().getInstanceId());
    System.out.println("OK RESPONSE");
    return jobExecution.getJobInstance().getInstanceId();
}
And the job builder as a component
@Component
public class BatchComponent {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    public Job customJob(String someParam) throws Exception {
        return jobBuilderFactory.get("personProcessor")
                .incrementer(new RunIdIncrementer()).listener(listener())
                .flow(personProcessStep(someParam)).end().build();
    }

    private Step personProcessStep(String someParam) throws Exception {
        return stepBuilderFactory.get("personProcessStep").<PersonInput, PersonOutput>chunk(1)
                .reader(new PersonReader(someParam)).faultTolerant()
                .skipPolicy(new DataDuplicateSkipper()).processor(new PersonProcessor())
                .writer(new PersonWriter()).build();
    }

    private JobExecutionListener listener() {
        return new PersonJobCompletionListener();
    }
    private class PersonInput {
        String firstName;

        public PersonInput(String firstName) {
            this.firstName = firstName;
        }

        public String getFirstName() {
            return firstName;
        }

        public void setFirstName(String firstName) {
            this.firstName = firstName;
        }
    }

    private class PersonOutput {
        String firstName;

        public String getFirstName() {
            return firstName;
        }

        public void setFirstName(String firstName) {
            this.firstName = firstName;
        }
    }
    public class PersonReader implements ItemReader<PersonInput> {

        private List<PersonInput> items;
        private int count = 0;

        public PersonReader(String someParam) throws InterruptedException {
            Thread.sleep(10000L); // to simulate processing
            // manipulate and provide data in the read method
            // just for testing I have given some dummy example
            items = new ArrayList<PersonInput>();
            PersonInput pi = new PersonInput("john");
            items.add(pi);
        }

        @Override
        public PersonInput read() {
            if (count < items.size()) {
                return items.get(count++);
            }
            return null;
        }
    }
    public class DataDuplicateSkipper implements SkipPolicy {

        @Override
        public boolean shouldSkip(Throwable exception, int skipCount) throws SkipLimitExceededException {
            if (exception instanceof DataIntegrityViolationException) {
                return true;
            }
            // only duplicates should be skipped; any other exception fails the step
            return false;
        }
    }
    private class PersonProcessor implements ItemProcessor<PersonInput, PersonOutput> {

        @Override
        public PersonOutput process(PersonInput item) throws Exception {
            // note: returning null filters the item out, so nothing reaches the writer
            return null;
        }
    }
    private class PersonWriter implements org.springframework.batch.item.ItemWriter<PersonOutput> {

        @Override
        public void write(List<? extends PersonOutput> results) throws Exception {
            return;
        }
    }
    private class PersonJobCompletionListener implements JobExecutionListener {

        public PersonJobCompletionListener() {
        }

        @Override
        public void beforeJob(JobExecution jobExecution) {
        }

        @Override
        public void afterJob(JobExecution jobExecution) {
            System.out.println("JOB COMPLETED");
        }
    }
}
Main Function
@SpringBootApplication
@EnableBatchProcessing
@EnableScheduling
@EnableAsync
public class SpringBatchTestApplication {

    public static void main(String[] args) {
        SpringApplication.run(SpringBatchTestApplication.class, args);
    }
}
I am using annotation-based configuration and Gradle, with the batch starter below.
compile('org.springframework.boot:spring-boot-starter-batch')
Please let me know if more info is needed. I couldn't find any example of this common use case.
Thanks for your time.

Try this. In your configuration you need to create a custom JobLauncher with a SimpleAsyncTaskExecutor using @Bean(name = "myJobLauncher"), and the same name is then used with @Qualifier in your controller.
@Bean(name = "myJobLauncher")
public JobLauncher simpleJobLauncher() throws Exception {
    SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
    jobLauncher.setJobRepository(jobRepository);
    jobLauncher.setTaskExecutor(new SimpleAsyncTaskExecutor());
    jobLauncher.afterPropertiesSet();
    return jobLauncher;
}
In your Controller
@Autowired
@Qualifier("myJobLauncher")
private JobLauncher jobLauncher;
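With the qualified async launcher in place, jobLauncher.run(...) returns as soon as the JobExecution is created, so the endpoint from the question can return the instance id immediately. A minimal sketch, reusing the names from the question:

@RequestMapping(value = "/trigger-job", method = RequestMethod.GET)
public Long workHard() throws Exception {
    JobParameters jobParameters = new JobParametersBuilder()
            .addLong("time", System.currentTimeMillis())
            .toJobParameters();
    // run() hands the job to the SimpleAsyncTaskExecutor and returns
    // right away; the job keeps running on a separate thread
    JobExecution jobExecution = jobLauncher.run(batchComponent.customJob("paramhere"), jobParameters);
    return jobExecution.getJobInstance().getInstanceId();
}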

If I look at your code I see a couple of mistakes.
First of all, your custom config is not loaded; if it were, the injection would fail with a duplicate bean definition for the same interface.
There's a lot of magic in Spring Boot, but if you don't tell it to do a component scan, nothing will be loaded as expected.
The second problem I can see is your BatchConfig class: it does not extend DefaultBatchConfigurer, nor override getJobLauncher(), so even if the Boot magic loads everything, you'll get the default one.
Here is a configuration that works and is compliant with the @EnableBatchProcessing API documentation.
BatchConfig
@Configuration
@EnableBatchProcessing(modular = true)
@Slf4j
public class BatchConfig extends DefaultBatchConfigurer {

    @Override
    @Bean
    public JobLauncher getJobLauncher() {
        try {
            SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
            jobLauncher.setJobRepository(getJobRepository());
            jobLauncher.setTaskExecutor(new SimpleAsyncTaskExecutor());
            jobLauncher.afterPropertiesSet();
            return jobLauncher;
        } catch (Exception e) {
            log.error("Can't load SimpleJobLauncher with SimpleAsyncTaskExecutor, falling back on default", e);
            return super.getJobLauncher();
        }
    }
}
Main Function
@SpringBootApplication
@EnableScheduling
@EnableAsync
@ComponentScan(basePackageClasses = {BatchConfig.class})
public class SpringBatchTestApplication {

    public static void main(String[] args) {
        SpringApplication.run(SpringBatchTestApplication.class, args);
    }
}

Although you have your custom jobLauncher, you're running the job using the default jobLauncher provided by Spring. Could you please autowire simpleJobLauncher in your controller and give it a try?
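That is, something like the following, where the qualifier matches the @Bean method name from your BatchConfig:

@Autowired
@Qualifier("simpleJobLauncher")
private JobLauncher jobLauncher;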

I know this is an old question, but I'm posting this answer anyway for future users.
After reviewing your code I can't tell why you have this problem, but I can suggest using a @Qualifier annotation plus a ThreadPoolTaskExecutor, like below, and seeing if that solves your problem.
You may also check this tutorial, Asynchronous Spring Batch Job Processing, for more details; it will help you configure a Spring Batch job asynchronously. (This tutorial was written by me.)
@Configuration
public class BatchConfig {

    @Autowired
    private JobRepository jobRepository;

    @Bean
    public TaskExecutor threadPoolTaskExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setMaxPoolSize(12);
        executor.setCorePoolSize(8);
        executor.setQueueCapacity(15);
        return executor;
    }

    @Bean
    public JobLauncher asyncJobLauncher() throws Exception {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(jobRepository);
        jobLauncher.setTaskExecutor(threadPoolTaskExecutor());
        return jobLauncher;
    }
}

JobExecution jobExecution = jobLauncher.run(batchComponent.customJob("paramhere"), jobParameters);
The default JobLauncher will wait until the job has completed before returning anything, which is why your service is probably taking long to respond, if that is your problem.
If you want asynchronous capabilities, you might want to look at Spring's @EnableAsync and @Async.
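For illustration, a minimal sketch of that approach (the service class and method name are hypothetical, not from the question):

@Service
public class JobTriggerService {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job customJob; // your job bean

    // With @EnableAsync on the application, this method runs on a
    // separate thread and the HTTP caller returns immediately.
    @Async
    public void triggerAsync(JobParameters params) throws Exception {
        jobLauncher.run(customJob, params);
    }
}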

According to the Spring Batch documentation, to return a response to the HTTP request asynchronously you can give the launcher an org.springframework.core.task.SimpleAsyncTaskExecutor. Any implementation of the Spring TaskExecutor interface can be used to control how jobs are executed asynchronously.
From the Spring Batch documentation:
<bean id="jobLauncher"
      class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
    <property name="jobRepository" ref="jobRepository" />
    <property name="taskExecutor">
        <bean class="org.springframework.core.task.SimpleAsyncTaskExecutor" />
    </property>
</bean>

If you're using Lombok, this might help you:
TL;DR: Lombok's @AllArgsConstructor doesn't seem to work well with the @Qualifier annotation.
EDIT: you can enable @Qualifier annotations in the lombok.config file, to be able to use @Qualifier with @AllArgsConstructor, like this:
lombok.copyableAnnotations += org.springframework.beans.factory.annotation.Qualifier
I know it's an old question, but I had the exact same problem and none of the answers solved it.
I configured the async job launcher like this, and added the qualifier to make sure this jobLauncher is injected:
@Bean(name = "asyncJobLauncher")
public JobLauncher simpleJobLauncher(JobRepository jobRepository) throws Exception {
    SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
    jobLauncher.setJobRepository(jobRepository);
    jobLauncher.setTaskExecutor(new SimpleAsyncTaskExecutor());
    jobLauncher.afterPropertiesSet();
    return jobLauncher;
}
And injected it like this:
@Qualifier("asyncJobLauncher")
private final JobLauncher jobLauncher;
I was using Lombok's @AllArgsConstructor; after changing it to autowiring, the correct job launcher got injected and the job is now executed asynchronously:
@Autowired
@Qualifier("asyncJobLauncher")
private JobLauncher jobLauncher;
Also, I didn't have to extend my configuration from DefaultBatchConfigurer.
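With that lombok.config entry in place, the @AllArgsConstructor variant should work too, since Lombok then copies @Qualifier onto the generated constructor parameter. A sketch, assuming the asyncJobLauncher bean from above:

@RestController
@AllArgsConstructor
public class JobController {

    // lombok.copyableAnnotations makes Lombok copy this @Qualifier
    // onto the generated constructor parameter
    @Qualifier("asyncJobLauncher")
    private final JobLauncher jobLauncher;
}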


Spring Batch processes the records, but is not inserting them into the database

Issue
When I started to use separate threads to run the same job several times at the same time, the records that should be inserted by the writer, once they have been processed, aren't being inserted into the database. The batch itself runs without errors when I process two sets of data at the same time:
Records processed dataSet1: 3606 (expected 3606).
Records processed dataSet2: 1776 (expected 1776).
The number of records read and written by Spring Batch is as expected (screenshot omitted).
Context
In this project I'm using MySQL as the database, together with Hibernate.
Some code
Batch config, job and steps
@Configuration
@EnableBatchProcessing
public class BatchConfig extends DefaultBatchConfigurer
{
    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private StepSkipListener stepSkipListener;

    @Autowired
    private MainJobExecutionListener mainJobExecutionListener;

    @Bean
    public TaskExecutor taskExecutor()
    {
        ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
        taskExecutor.setMaxPoolSize(10);
        taskExecutor.setThreadNamePrefix("batch-thread-");
        return taskExecutor;
    }

    @Bean
    public JobLauncher jobLauncher() throws Exception
    {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(getJobRepository());
        jobLauncher.setTaskExecutor(taskExecutor());
        jobLauncher.afterPropertiesSet();
        return jobLauncher;
    }

    @Bean
    public Step mainStep(ReaderImpl reader, ProcessorImpl processor, WriterImpl writer)
    {
        return stepBuilderFactory.get("step")
                .<List<ExcelLoad>, Invoice>chunk(10)
                .reader(reader)
                .processor(processor)
                .writer(writer)
                .faultTolerant().skipPolicy(new ExceptionSkipPolicy())
                .listener(stepSkipListener)
                .build();
    }

    @Bean
    public Job mainJob(Step mainStep)
    {
        return jobBuilderFactory.get("mainJob")
                .listener(mainJobExecutionListener)
                .incrementer(new RunIdIncrementer())
                .start(mainStep)
                .build();
    }
}
Writer
@Override
public void write(List<? extends Invoice> list)
{
    invoiceRepository.saveAll(list);
}
Repository
@Repository
public interface InvoiceRepository extends JpaRepository<Invoice, Integer>
{}
Properties
spring.main.allow-bean-definition-overriding=true
spring.batch.initialize-schema=always
spring.batch.job.enabled=false
spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver
spring.datasource.url=jdbc:mysql://localhost:3306/bd_dev?autoReconnect=true&useTimezone=true&useLegacyDatetimeCode=false&serverTimezone=Europe/Paris&zeroDateTimeBehavior=convertToNull
spring.datasource.username=root
spring.datasource.password=password
spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.MySQL5InnoDBDialect
Before using the separate threads, the processed records were inserted into the database correctly. What could be happening?
If you decide to use a multi-threaded step, you need to make sure your batch artefacts (reader, writer, etc.) are thread-safe. From what you shared, the write method is not synchronized between threads and hence is not thread-safe. This is explained in the Multi-threaded Step section of the documentation.
You need to either synchronize it (by using the synchronized keyword, using a Lock, etc.) or wrap your writer in a SynchronizedItemStreamWriter.
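For example, the first option is as simple as adding the synchronized keyword to the writer from the question:

@Component
public class WriterImpl implements ItemWriter<Invoice>
{
    @Autowired
    private InvoiceRepository invoiceRepository;

    // synchronized ensures only one step thread writes at a time
    @Override
    public synchronized void write(List<? extends Invoice> list)
    {
        invoiceRepository.saveAll(list);
    }
}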
To help with the implementation, in case someone comes to this question, I share the code that, in my case, solved the problem:
Batch config, job and steps
@Configuration
@EnableBatchProcessing
public class BatchConfig extends DefaultBatchConfigurer
{
    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private StepSkipListener stepSkipListener;

    @Autowired
    private MainJobExecutionListener mainJobExecutionListener;

    @Bean
    public PlatformTransactionManager getTransactionManager()
    {
        return new JpaTransactionManager();
    }

    @Bean
    public TaskExecutor taskExecutor()
    {
        ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
        taskExecutor.setMaxPoolSize(10);
        taskExecutor.setThreadNamePrefix("batch-thread-");
        return taskExecutor;
    }

    @Bean
    public JobLauncher jobLauncher() throws Exception
    {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(getJobRepository());
        jobLauncher.setTaskExecutor(taskExecutor());
        jobLauncher.afterPropertiesSet();
        return jobLauncher;
    }

    @Bean
    public Step mainStep(ReaderImpl reader, ProcessorImpl processor, WriterImpl writer)
    {
        return stepBuilderFactory.get("step")
                .transactionManager(getTransactionManager()) // the JpaTransactionManager bean defined above
                .<List<ExcelLoad>, Invoice>chunk(10)
                .reader(reader)
                .processor(processor)
                .writer(writer)
                .faultTolerant().skipPolicy(new ExceptionSkipPolicy())
                .listener(stepSkipListener)
                .build();
    }

    @Bean
    public Job mainJob(Step mainStep)
    {
        return jobBuilderFactory.get("mainJob")
                .listener(mainJobExecutionListener)
                .incrementer(new RunIdIncrementer())
                .start(mainStep)
                .build();
    }
}
Writer
@Component
public class WriterImpl implements ItemWriter<Invoice>
{
    @Autowired
    private InvoiceRepository invoiceRepository;

    @Override
    public void write(List<? extends Invoice> list)
    {
        invoiceRepository.saveAll(list);
    }
}

Java Spring Batch - Resource file is not injected into the tasklet

I'm working through the Java examples from the book Spring Batch in Action, chapter 1.
In this example, a tasklet unzips a zip file. The tasklet receives the zip file path as a job parameter.
I implemented a test method that runs the job and passes the parameters.
@StepScope
@Component
public class DecompressTasklet implements Tasklet {

    private static final Logger LOGGER = LogManager.getLogger(DecompressTasklet.class);

    @Value("#{jobParameters['inputResource']}")
    private Resource inputResource;

    @Value("#{jobParameters['targetDirectory']}")
    private String targetDirectory;

    @Value("#{jobParameters['targetFile']}")
    private String targetFile;

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
        // code here
    }
}
@Configuration
public class DescompressStep {

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private DecompressTasklet decompressTasklet;

    @Bean
    public Step stepDescompress() {
        return stepBuilderFactory
                .get(DescompressStep.class.getSimpleName())
                .tasklet(decompressTasklet)
                .build();
    }
}
@EnableBatchProcessing
@Configuration
public class ImportProductsJob {

    @Autowired
    private DescompressStep descompressStep;

    @Autowired
    private ReadWriteProductStep readWriteProductStep;

    @Bean
    public Job job(JobBuilderFactory jobBuilderFactory) {
        return jobBuilderFactory
                .get("importProductsJob")
                .start(descompressStep.stepDescompress())
                .next(readWriteProductStep.stepReaderWriter())
                .incrementer(new RunIdIncrementer())
                .build();
    }
}
Below is the test code that runs the job
@RunWith(SpringRunner.class)
@SpringBootTest
@SpringBatchTest
@AutoConfigureTestDatabase
public class ImportProductsIntegrationTest {

    @Autowired
    private JobRepositoryTestUtils jobRepositoryTestUtils;

    @Autowired
    private JobLauncherTestUtils jobLauncherTestUtils;

    @After
    public void cleanUp() {
        jobRepositoryTestUtils.removeJobExecutions();
    }

    @Test
    public void importProducts() throws Exception {
        jobLauncherTestUtils.launchJob(defaultJobParameters());
    }

    private JobParameters defaultJobParameters() {
        JobParametersBuilder paramsBuilder = new JobParametersBuilder();
        paramsBuilder.addString("inputResource", "classpath:input/products.zip");
        paramsBuilder.addString("targetDirectory", "./target/importproductsbatch/");
        paramsBuilder.addString("targetFile", "products.txt");
        paramsBuilder.addLong("timestamp", System.currentTimeMillis());
        return paramsBuilder.toJobParameters();
    }
}
The products.zip file is in src/main/resources/input
The problem is that, when running the test, this error occurs:
java.lang.NullPointerException: null
at com.springbatch.inaction.ch01.DecompressTasklet.execute(DecompressTasklet.java:62) ~[classes/:na]
I verified that the inputResource property is null. Why does this error occur?
In your job definition, you have:
@Bean
public Job job(JobBuilderFactory jobBuilderFactory) {
    return jobBuilderFactory
            .get("importProductsJob")
            .start(descompressStep.stepDescompress())
            .next(readWriteProductStep.stepReaderWriter())
            .incrementer(new RunIdIncrementer())
            .build();
}
The way you are passing steps to the start and next methods is incorrect (I don't even see how this would compile). What you can do instead is import the step configuration classes and inject both steps into your job definition. Something like:
@EnableBatchProcessing
@Configuration
@Import({DescompressStep.class, ReadWriteProductStep.class})
public class ImportProductsJob {

    @Bean
    public Job job(JobBuilderFactory jobBuilderFactory,
                   Step stepDescompress, Step stepReaderWriter) {
        return jobBuilderFactory
                .get("importProductsJob")
                .start(stepDescompress)
                .next(stepReaderWriter)
                .incrementer(new RunIdIncrementer())
                .build();
    }
}

Testing of spring batch job causes unexpected results

I'm trying to create a simple backup job, to verify that my approach is correct, and I'm using Spring Batch's testing support to verify the results.
P.S. My batch processing uses a non-default configuration rather than the one provided by the framework, because our job repository has to use a non-default schema name.
The read step of my job is configured for lazy initialization with the @StepScope annotation; this is needed because the job takes parameters that are used to query the database in the reading step.
This is the sample configuration we're using. It is in the root package; the rest of the batch configs live in child packages.
@Configuration
@Import({ApplicationHibernateConfiguration.class})
@ComponentScan
public class ApplicationBatchConfiguration extends DefaultBatchConfigurer {

    private static final String BATCH_PROCESSING_PREFIX = "BATCH_PROCESSING.BATCH_";

    private final DataSource dataSource;
    private final PlatformTransactionManager transactionManager;
    private final JobLauncher jobLauncher;
    private final JobRepository jobRepository;
    private final JobExplorer jobExplorer;

    @Autowired
    public ApplicationBatchConfiguration(
            DataSource dataSource, PlatformTransactionManager transactionManager) throws Exception {
        this.dataSource = dataSource;
        this.transactionManager = transactionManager;
        this.jobRepository = createJobRepository();
        this.jobLauncher = createJobLauncher();
        this.jobExplorer = createJobExplorer();
    }

    @Override
    protected JobLauncher createJobLauncher() throws Exception {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(jobRepository);
        jobLauncher.setTaskExecutor(new SimpleAsyncTaskExecutor());
        jobLauncher.afterPropertiesSet();
        return jobLauncher;
    }

    @Override
    protected JobRepository createJobRepository() throws Exception {
        JobRepositoryFactoryBean factoryBean = new JobRepositoryFactoryBean();
        factoryBean.setDatabaseType("DB2");
        factoryBean.setTablePrefix(BATCH_PROCESSING_PREFIX);
        factoryBean.setIsolationLevelForCreate("ISOLATION_REPEATABLE_READ");
        factoryBean.setDataSource(this.dataSource);
        factoryBean.setTransactionManager(this.transactionManager);
        factoryBean.afterPropertiesSet();
        return factoryBean.getObject();
    }

    @Override
    protected JobExplorer createJobExplorer() throws Exception {
        JobExplorerFactoryBean factoryBean = new JobExplorerFactoryBean();
        factoryBean.setDataSource(this.dataSource);
        factoryBean.setTablePrefix(BATCH_PROCESSING_PREFIX);
        factoryBean.afterPropertiesSet();
        return factoryBean.getObject();
    }

    @Override
    @Bean
    public JobRepository getJobRepository() {
        return jobRepository;
    }

    @Override
    public PlatformTransactionManager getTransactionManager() {
        return transactionManager;
    }

    @Override
    @Bean
    public JobLauncher getJobLauncher() {
        return jobLauncher;
    }

    @Override
    @Bean
    public JobExplorer getJobExplorer() {
        return jobExplorer;
    }

    @Bean
    public JobBuilderFactory jobBuilderFactory(JobRepository jobRepository) {
        return new JobBuilderFactory(jobRepository);
    }

    @Bean
    public StepBuilderFactory stepBuilderFactory(
            JobRepository jobRepository, PlatformTransactionManager transactionManager) {
        return new StepBuilderFactory(jobRepository, transactionManager);
    }
}
The step that I'm trying to use looks like this:
@Bean
@StepScope
public JdbcPagingItemReader<DomainObject> itemReader(
        @Value("#{jobParameters['id']}") String id) {
    JdbcPagingItemReader<DomainObject> reader = new JdbcPagingItemReader<>();
    reader.setDataSource(this.dataSource);
    reader.setFetchSize(10);
    Db2PagingQueryProvider nativeQueryProvider = new Db2PagingQueryProvider();
    nativeQueryProvider.setSelectClause("*");
    nativeQueryProvider.setFromClause("from SCHEMA.DOMAIN");
    nativeQueryProvider.setWhereClause("id = :id");
    Map<String, Object> params = new HashMap<>(1);
    params.put("id", id);
    reader.setRowMapper((rs, rowNum) -> {
        DomainObject element = new DomainObject();
        element.setId(rs.getString("ID"));
        return element;
    });
    reader.setParameterValues(params);
    reader.setQueryProvider(nativeQueryProvider);
    return reader;
}

@Bean
public Step fetchDomain() throws Exception {
    return stepBuilderFactory.get("fetchDomain")
            .<DomainObject, DomainObject>chunk(10) // chunk types match the reader's item type
            .faultTolerant()
            .reader(itemReader(null))
            .writer(items -> items.forEach(System.out::println))
            .build();
}
The actual job bean is currently configured to launch only a single step:
@Bean
public Job backupJob() throws Exception {
    return jobBuilderFactory.get("backupJob")
            .start(fetchDomain())
            .build();
}
My test code looks like the following:
@RunWith(SpringRunner.class)
@SpringBatchTest
@ContextConfiguration(classes = {ApplicationBatchConfiguration.class})
public class BackupJobConfigurationTest {

    @Autowired
    @Qualifier(value = "backupJob")
    public Job job;

    @Autowired
    private JobLauncherTestUtils jobLauncherTestUtils;

    @Test
    public void flowTest() throws Exception {
        JobParameters parameters = new JobParametersBuilder()
                .addString("id", "124")
                .toJobParameters();
        JobExecution execution = jobLauncherTestUtils.launchJob(parameters);
        assertEquals(BatchStatuses.COMPLETED, execution.getExitStatus().getExitCode()); // fails
    }
}
I expect the exit code to be "COMPLETED" but get "UNKNOWN". Also, I'm not sure the code is even being invoked, because I don't see any output from the writer lambda. The only output I see in the test is:
Aug 30, 2019 2:52:17 PM org.springframework.batch.core.launch.support.SimpleJobLauncher run
INFO: Job: [FlowJob: [name=backupJob]] launched with the following parameters: [{id=124}]
org.junit.ComparisonFailure:
Expected :COMPLETED
Actual :UNKNOWN
I figured it out. First of all, I needed to remove the SimpleAsyncTaskExecutor from my configuration so the test actually runs the job on the same thread. Then, after a more careful read of the reference documentation, I reconfigured my batch config: instead of extending DefaultBatchConfigurer directly, I configured a BatchConfigurer as a bean inside my configuration and added the @EnableBatchProcessing annotation.
@Bean
public BatchConfigurer batchConfigurer() {
    return new DefaultBatchConfigurer() {

        @Override
        protected JobRepository createJobRepository() throws Exception {
            JobRepositoryFactoryBean factoryBean = new JobRepositoryFactoryBean();
            factoryBean.setDatabaseType("db2");
            factoryBean.setTablePrefix(BATCH_PROCESSING_PREFIX);
            factoryBean.setIsolationLevelForCreate("ISOLATION_REPEATABLE_READ");
            factoryBean.setDataSource(dataSource);
            factoryBean.setTransactionManager(transactionManager);
            factoryBean.afterPropertiesSet();
            return factoryBean.getObject();
        }

        @Override
        protected JobExplorer createJobExplorer() throws Exception {
            JobExplorerFactoryBean factoryBean = new JobExplorerFactoryBean();
            factoryBean.setDataSource(dataSource);
            factoryBean.setTablePrefix(BATCH_PROCESSING_PREFIX);
            factoryBean.afterPropertiesSet();
            return factoryBean.getObject();
        }

        @Override
        protected JobLauncher createJobLauncher() throws Exception {
            SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
            jobLauncher.setJobRepository(getJobRepository());
            // jobLauncher.setTaskExecutor(new SimpleAsyncTaskExecutor()); // removed so tests run synchronously
            jobLauncher.afterPropertiesSet();
            return jobLauncher;
        }
    };
}

Spring Batch doesn't call both ItemProcessor and ItemWriter in chunk-flow

I have a Spring Batch application that reads a file from a samba server and generates a new file in a different folder on the same server. However, only the ItemReader is called in the flow. What is the problem? Thanks.
BatchConfiguration:
@Configuration
@EnableBatchProcessing
public class BatchConfiguration extends BaseConfiguration {

    @Bean
    public ValeTrocaItemReader reader() {
        return new ValeTrocaItemReader();
    }

    @Bean
    public ValeTrocaItemProcessor processor() {
        return new ValeTrocaItemProcessor();
    }

    @Bean
    public ValeTrocaItemWriter writer() {
        return new ValeTrocaItemWriter();
    }

    @Bean
    public Job importUserJob(JobCompletionNotificationListener listener) throws Exception {
        return jobBuilderFactory()
                .get("importUserJob")
                .incrementer(new RunIdIncrementer())
                .repository(getJobRepository())
                .listener(listener)
                .start(this.step1())
                .build();
    }

    @Bean
    public Step step1() throws Exception {
        return stepBuilderFactory()
                .get("step1")
                .<ValeTroca, ValeTroca>chunk(10)
                .reader(this.reader())
                .processor(this.processor())
                .writer(this.writer())
                .build();
    }
}
BaseConfiguration:
public class BaseConfiguration implements BatchConfigurer {

    @Bean
    @Override
    public PlatformTransactionManager getTransactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean
    @Override
    public SimpleJobLauncher getJobLauncher() throws Exception {
        final SimpleJobLauncher simpleJobLauncher = new SimpleJobLauncher();
        simpleJobLauncher.setJobRepository(this.getJobRepository());
        return simpleJobLauncher;
    }

    @Bean
    @Override
    public JobRepository getJobRepository() throws Exception {
        return new MapJobRepositoryFactoryBean(this.getTransactionManager()).getObject();
    }

    @Bean
    @Override
    public JobExplorer getJobExplorer() {
        MapJobRepositoryFactoryBean repositoryFactory = this.getMapJobRepositoryFactoryBean();
        return new SimpleJobExplorer(repositoryFactory.getJobInstanceDao(), repositoryFactory.getJobExecutionDao(),
                repositoryFactory.getStepExecutionDao(), repositoryFactory.getExecutionContextDao());
    }

    @Bean
    public MapJobRepositoryFactoryBean getMapJobRepositoryFactoryBean() {
        return new MapJobRepositoryFactoryBean(this.getTransactionManager());
    }

    @Bean
    public JobBuilderFactory jobBuilderFactory() throws Exception {
        return new JobBuilderFactory(this.getJobRepository());
    }

    @Bean
    public StepBuilderFactory stepBuilderFactory() throws Exception {
        return new StepBuilderFactory(this.getJobRepository(), this.getTransactionManager());
    }
}
ValeTrocaItemReader:
@Configuration
public class ValeTrocaItemReader implements ItemReader<ValeTroca> {

    @Value(value = "${url}")
    private String url;

    @Value(value = "${user}")
    private String user;

    @Value(value = "${password}")
    private String password;

    @Value(value = "${domain}")
    private String domain;

    @Value(value = "${inputDirectory}")
    private String inputDirectory;

    @Bean
    @Override
    public ValeTroca read() throws MalformedURLException, SmbException, IOException, Exception {
        File tempOutputFile = getInputFile();
        DefaultLineMapper<ValeTroca> lineMapper = new DefaultLineMapper<>();
        lineMapper.setLineTokenizer(new DelimitedLineTokenizer() {
            {
                setDelimiter(";");
                setNames(new String[]{"id_participante", "cpf", "valor"});
            }
        });
        lineMapper.setFieldSetMapper(
                new BeanWrapperFieldSetMapper<ValeTroca>() {
                    {
                        setTargetType(ValeTroca.class);
                    }
                });
        FlatFileItemReader<ValeTroca> itemReader = new FlatFileItemReader<>();
        itemReader.setLinesToSkip(1);
        itemReader.setResource(new FileUrlResource(tempOutputFile.getCanonicalPath()));
        itemReader.setLineMapper(lineMapper);
        itemReader.open(new ExecutionContext());
        tempOutputFile.deleteOnExit();
        return itemReader.read();
    }
}
Sample of the ItemProcessor:
public class ValeTrocaItemProcessor implements ItemProcessor<ValeTroca, ValeTroca> {

    @Override
    public ValeTroca process(ValeTroca item) {
        // do anything
        ValeTroca item2 = item;
        System.out.println(item2.getCpf());
        return item2;
    }
}
EDIT: Spring Boot 2.1.2.RELEASE, Spring Batch 4.1.1.RELEASE.
Looking at your configuration, here are a couple of notes:
BatchConfiguration looks good. That's a typical job with a single chunk-oriented step.
BaseConfiguration is actually the default configuration you get when using @EnableBatchProcessing without providing a datasource, so this class can be removed.
Adding @Configuration on ValeTrocaItemReader and marking the method read() with @Bean is not correct. It means you are declaring a bean named read of type ValeTroca in your application context. Moreover, your custom reader uses a FlatFileItemReader but adds no value over a plain FlatFileItemReader. You can declare your reader as a FlatFileItemReader bean and configure it as needed (resource, line mapper, etc.). This also avoids the mistake of opening the execution context in the read method, which should be done when initializing the reader, or in ItemStream#open if the reader implements ItemStream.
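For instance, a minimal sketch of such a reader bean, using the delimiter and field names from the question (the resource path is hypothetical):

@Bean
public FlatFileItemReader<ValeTroca> reader() {
    DefaultLineMapper<ValeTroca> lineMapper = new DefaultLineMapper<>();
    DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer();
    tokenizer.setDelimiter(";");
    tokenizer.setNames("id_participante", "cpf", "valor");
    lineMapper.setLineTokenizer(tokenizer);
    BeanWrapperFieldSetMapper<ValeTroca> fieldSetMapper = new BeanWrapperFieldSetMapper<>();
    fieldSetMapper.setTargetType(ValeTroca.class);
    lineMapper.setFieldSetMapper(fieldSetMapper);

    FlatFileItemReader<ValeTroca> reader = new FlatFileItemReader<>();
    reader.setResource(new FileSystemResource("/path/to/input-file.csv")); // hypothetical location
    reader.setLinesToSkip(1);
    reader.setLineMapper(lineMapper);
    return reader; // the step opens/closes it; no manual open() needed
}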
Other than that, I don't see from what you shared why the processor and writer are not called.
SOLVED: The problem was that even though I'm not using any database of my own, Spring Batch, although configured to keep the JobRepository in memory, still needs a database (usually H2) to save its metadata tables (job configurations, executions, etc.).
In this case, the JDBC and H2 dependencies were missing from pom.xml. Adding them to the project solved the problem!
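For reference, the missing dependencies would look something like this in pom.xml (standard coordinates; versions are managed by Spring Boot):

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-jdbc</artifactId>
</dependency>
<dependency>
    <groupId>com.h2database</groupId>
    <artifactId>h2</artifactId>
    <scope>runtime</scope>
</dependency>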

Convert Message to Job to make it Spring Integration with Batch Processing

I am trying to process a series of files using Spring Integration in a batch fashion. I have this very old XML which tries to convert the messages into jobs:
<int:transformer ref="messageToJobTransformer"/>
<batch-int:job-launching-gateway job-launcher="jobLauncher"/>
The messageToJobTransformer is a class which can convert a Message into a Job. The problem is I don't know where this file is now, nor do I want an XML config. I want it to be pure Java DSL. Here is my simple config:
return IntegrationFlows.from(Files.inboundAdapter(directory)
        .preventDuplicates()
        .patternFilter("*.txt"))
        .handle(jobLaunchingGw())
        .get();
And here is my bean for the gateway.
@Autowired
private JobLauncher jobLauncher;

@Bean
public MessageHandler jobLaunchingGw() {
    return new JobLaunchingGateway(jobLauncher);
}
EDIT: updating the batch config class.
@Configuration
@EnableBatchProcessing
public class BatchConfig
{
    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory steps;

    @Bean
    public ItemReader<String> reader(@Value("#{jobParameters['input.file.name']}") String filename) throws MalformedURLException
    {
        FlatFileItemReader<String> reader = new FlatFileItemReader<String>();
        return reader;
    }

    @Bean
    public Job job() throws MalformedURLException
    {
        return jobs.get("job").start(step()).build();
    }

    @Bean
    public Step step() throws MalformedURLException
    {
        return steps.get("step").<String, String>chunk(5).reader(reader(null))
                .writer(writer(null)).build();
    }

    @Bean
    public ItemWriter<String> writer(@Value("#{jobParameters['input.file.name']}") String filename)
    {
        FlatFileItemWriter writer = new FlatFileItemWriter();
        return writer;
    }
}
Your question isn't clear. The JobLaunchingGateway expects a JobLaunchRequest as its payload.
Since your integration flow begins with Files.inboundAdapter(directory), I can assume you have some Job definitions there. So, what you need is a class which can parse the file and return a JobLaunchRequest.
Something like this, from the Spring Batch Reference Manual:
public class FileMessageToJobRequest {

    private Job job;
    private String fileParameterName;

    public void setFileParameterName(String fileParameterName) {
        this.fileParameterName = fileParameterName;
    }

    public void setJob(Job job) {
        this.job = job;
    }

    @Transformer
    public JobLaunchRequest toRequest(Message<File> message) {
        JobParametersBuilder jobParametersBuilder =
                new JobParametersBuilder();
        jobParametersBuilder.addString(fileParameterName,
                message.getPayload().getAbsolutePath());
        return new JobLaunchRequest(job, jobParametersBuilder.toJobParameters());
    }
}
After defining that class as a @Bean, you can use it from the .transform() EIP method just before your .handle(jobLaunchingGw()).
UPDATE
@Bean
public FileMessageToJobRequest fileMessageToJobRequest(Job job) {
    FileMessageToJobRequest fileMessageToJobRequest = new FileMessageToJobRequest();
    fileMessageToJobRequest.setJob(job);
    fileMessageToJobRequest.setFileParameterName("file");
    return fileMessageToJobRequest;
}

...

@Bean
public IntegrationFlow flowToBatch(FileMessageToJobRequest fileMessageToJobRequest) {
    return IntegrationFlows
            .from(Files.inboundAdapter(directory)
                    .preventDuplicates()
                    .patternFilter("*.txt"))
            .transform(fileMessageToJobRequest)
            .handle(jobLaunchingGw())
            .get();
}
