I'm working through the Java examples from chapter 1 of the book Spring Batch in Action.
In this example, a tasklet unzips a zip file. The tasklet receives the zip file path as a job parameter.
I implemented a test method that runs the job and passes the parameters.
@StepScope
@Component
public class DecompressTasklet implements Tasklet {

    private static final Logger LOGGER = LogManager.getLogger(DecompressTasklet.class);

    @Value("#{jobParameters['inputResource']}")
    private Resource inputResource;

    @Value("#{jobParameters['targetDirectory']}")
    private String targetDirectory;

    @Value("#{jobParameters['targetFile']}")
    private String targetFile;

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
        // code here
    }
}
@Configuration
public class DescompressStep {

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private DecompressTasklet decompressTasklet;

    @Bean
    public Step stepDescompress() {
        return stepBuilderFactory
                .get(DescompressStep.class.getSimpleName())
                .tasklet(decompressTasklet)
                .build();
    }
}
@EnableBatchProcessing
@Configuration
public class ImportProductsJob {

    @Autowired
    private DescompressStep descompressStep;

    @Autowired
    private ReadWriteProductStep readWriteProductStep;

    @Bean
    public Job job(JobBuilderFactory jobBuilderFactory) {
        return jobBuilderFactory
                .get("importProductsJob")
                .start(descompressStep.stepDescompress())
                .next(readWriteProductStep.stepReaderWriter())
                .incrementer(new RunIdIncrementer())
                .build();
    }
}
Below is the test code that runs the job:
@RunWith(SpringRunner.class)
@SpringBootTest
@SpringBatchTest
@AutoConfigureTestDatabase
public class ImportProductsIntegrationTest {

    @Autowired
    private JobRepositoryTestUtils jobRepositoryTestUtils;

    @Autowired
    private JobLauncherTestUtils jobLauncherTestUtils;

    @After
    public void cleanUp() {
        jobRepositoryTestUtils.removeJobExecutions();
    }

    @Test
    public void importProducts() throws Exception {
        jobLauncherTestUtils.launchJob(defaultJobParameters());
    }

    private JobParameters defaultJobParameters() {
        JobParametersBuilder paramsBuilder = new JobParametersBuilder();
        paramsBuilder.addString("inputResource", "classpath:input/products.zip");
        paramsBuilder.addString("targetDirectory", "./target/importproductsbatch/");
        paramsBuilder.addString("targetFile", "products.txt");
        paramsBuilder.addLong("timestamp", System.currentTimeMillis());
        return paramsBuilder.toJobParameters();
    }
}
The products.zip file is in src/main/resources/input.
The problem is that when I run the test, the following error occurs:
java.lang.NullPointerException: null
at com.springbatch.inaction.ch01.DecompressTasklet.execute(DecompressTasklet.java:62) ~[classes/:na]
I verified that the inputResource property is null. Why does this error occur?
In your job definition, you have:
@Bean
public Job job(JobBuilderFactory jobBuilderFactory) {
    return jobBuilderFactory
            .get("importProductsJob")
            .start(descompressStep.stepDescompress())
            .next(readWriteProductStep.stepReaderWriter())
            .incrementer(new RunIdIncrementer())
            .build();
}
The way you are passing steps to the start and next methods is incorrect (I don't even see how this would compile). What you can do is import the step configuration classes and inject both steps into your job definition. Something like:
@EnableBatchProcessing
@Configuration
@Import({DescompressStep.class, ReadWriteProductStep.class})
public class ImportProductsJob {

    @Bean
    public Job job(JobBuilderFactory jobBuilderFactory,
                   Step stepDescompress, Step stepReaderWriter) {
        return jobBuilderFactory
                .get("importProductsJob")
                .start(stepDescompress)
                .next(stepReaderWriter)
                .incrementer(new RunIdIncrementer())
                .build();
    }
}
Issue
Since I started using separate threads to run the same job several times concurrently, records that have been processed and should be written by the Writer are not being inserted into the database. The batch appears to run correctly when I run two sets of data at the same time:
Records processed dataSet1: 3606 (expected 3606).
Records processed dataSet2: 1776 (expected 1776).
The number of records read and written reported by Spring Batch is as expected.
Context
In this project I'm using MySQL as the database, with Hibernate.
Some code
Batch config, job and steps
@Configuration
@EnableBatchProcessing
public class BatchConfig extends DefaultBatchConfigurer
{
    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private StepSkipListener stepSkipListener;

    @Autowired
    private MainJobExecutionListener mainJobExecutionListener;

    @Bean
    public TaskExecutor taskExecutor()
    {
        ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
        taskExecutor.setMaxPoolSize(10);
        taskExecutor.setThreadNamePrefix("batch-thread-");
        return taskExecutor;
    }

    @Bean
    public JobLauncher jobLauncher() throws Exception
    {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(getJobRepository());
        jobLauncher.setTaskExecutor(taskExecutor());
        jobLauncher.afterPropertiesSet();
        return jobLauncher;
    }

    @Bean
    public Step mainStep(ReaderImpl reader, ProcessorImpl processor, WriterImpl writer)
    {
        return stepBuilderFactory.get("step")
                .<List<ExcelLoad>, Invoice>chunk(10)
                .reader(reader)
                .processor(processor)
                .writer(writer)
                .faultTolerant().skipPolicy(new ExceptionSkipPolicy())
                .listener(stepSkipListener)
                .build();
    }

    @Bean
    public Job mainJob(Step mainStep)
    {
        return jobBuilderFactory.get("mainJob")
                .listener(mainJobExecutionListener)
                .incrementer(new RunIdIncrementer())
                .start(mainStep)
                .build();
    }
}
Writer
@Override
public void write(List<? extends Invoice> list)
{
    invoiceRepository.saveAll(list);
}
Repository
@Repository
public interface InvoiceRepository extends JpaRepository<Invoice, Integer>
{}
Properties
spring.main.allow-bean-definition-overriding=true
spring.batch.initialize-schema=always
spring.batch.job.enabled=false
spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver
spring.datasource.url=jdbc:mysql://localhost:3306/bd_dev?autoReconnect=true&useTimezone=true&useLegacyDatetimeCode=false&serverTimezone=Europe/Paris&zeroDateTimeBehavior=convertToNull
spring.datasource.username=root
spring.datasource.password=password
spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.MySQL5InnoDBDialect
Before using the separate threads, the processed records were inserted into the database correctly. What could be happening?
If you decide to use a multi-threaded step, you need to make sure your batch artefacts (reader, writer, etc.) are thread-safe. From what you shared, the write method is not synchronized between threads and hence is not thread-safe. This is explained in the Multi-threaded Step section of the documentation.
You need to either synchronize it (by using the synchronized keyword, a Lock, etc.) or wrap your writer in a SynchronizedItemStreamWriter.
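For illustration, a minimal sketch of the first option applied to the WriterImpl shown in the question (synchronizing the write method is the smallest change; SynchronizedItemStreamWriter would additionally require the delegate to be an ItemStreamWriter):

import java.util.List;

import org.springframework.batch.item.ItemWriter;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

@Component
public class WriterImpl implements ItemWriter<Invoice>
{
    @Autowired
    private InvoiceRepository invoiceRepository;

    // Only one thread at a time can hand its chunk to the repository,
    // so concurrent chunks from the multi-threaded step no longer interleave here.
    @Override
    public synchronized void write(List<? extends Invoice> list)
    {
        invoiceRepository.saveAll(list);
    }
}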
In case someone comes across this question, I share the code that, in my case, solved the problem:
Batch config, job and steps
@Configuration
@EnableBatchProcessing
public class BatchConfig extends DefaultBatchConfigurer
{
    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private StepSkipListener stepSkipListener;

    @Autowired
    private MainJobExecutionListener mainJobExecutionListener;

    @Bean
    public PlatformTransactionManager getTransactionManager()
    {
        return new JpaTransactionManager();
    }

    @Bean
    public TaskExecutor taskExecutor()
    {
        ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
        taskExecutor.setMaxPoolSize(10);
        taskExecutor.setThreadNamePrefix("batch-thread-");
        return taskExecutor;
    }

    @Bean
    public JobLauncher jobLauncher() throws Exception
    {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(getJobRepository());
        jobLauncher.setTaskExecutor(taskExecutor());
        jobLauncher.afterPropertiesSet();
        return jobLauncher;
    }

    @Bean
    public Step mainStep(ReaderImpl reader, ProcessorImpl processor, WriterImpl writer)
    {
        return stepBuilderFactory.get("step")
                .transactionManager(getTransactionManager())
                .<List<ExcelLoad>, Invoice>chunk(10)
                .reader(reader)
                .processor(processor)
                .writer(writer)
                .faultTolerant().skipPolicy(new ExceptionSkipPolicy())
                .listener(stepSkipListener)
                .build();
    }

    @Bean
    public Job mainJob(Step mainStep)
    {
        return jobBuilderFactory.get("mainJob")
                .listener(mainJobExecutionListener)
                .incrementer(new RunIdIncrementer())
                .start(mainStep)
                .build();
    }
}
Writer
@Component
public class WriterImpl implements ItemWriter<Invoice>
{
    @Autowired
    private InvoiceRepository invoiceRepository;

    @Override
    public void write(List<? extends Invoice> list)
    {
        invoiceRepository.saveAll(list);
    }
}
Does the placement of beans make a difference when loading them into a scoped context? Is this a bug or a timing-of-instantiation issue?
If I include the @StepScope and @Bean directly in the BatchConfiguration class, everything works seamlessly with StepScope. However, if I define another class, say "BatchProcessProcessor" as included below, and mark a method within that other class as a @Bean with @StepScope, it does not resolve properly. The actual symptom in Spring Batch is StepScope not triggering and the beans being loaded as singletons.
Something about providing the @Bean and @StepScope from another class that is loaded via constructor injection in the BatchConfiguration does not resolve properly.
The format described above is included below:
Main batch configuration class
@Slf4j
@Configuration
@EnableAutoConfiguration
@EnableBatchProcessing
public class BatchConfiguration extends DefaultBatchConfigurer {

    private BatchProcessProcessor processor;

    @Override
    public void setDataSource(DataSource dataSource) {
        // override to not set a datasource even if one exists;
        // initialization will then use a Map-based JobRepository (instead of a database)
    }

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Autowired
    public BatchConfiguration(BatchProcessProcessor processor) {
        this.processor = processor;
    }

    @Bean
    @StepScope
    public ListItemReader<String> reader() {
        List<String> stringList = new ArrayList<>();
        stringList.add("test");
        stringList.add("another test");
        log.info("LOGGING A BUNCH OF STUFF THIS IS UNIQUE" + String.valueOf(System.currentTimeMillis()));
        return new ListItemReader<>(stringList);
    }

    @Bean
    @StepScope
    public CustomWriter writer() {
        return new CustomWriter();
    }

    @Bean
    public Job importUserJob(JobCompletionNotificationListener listener, Step step1) {
        return jobBuilderFactory.get("importUserJob")
                .incrementer(new RunIdIncrementer())
                .listener(listener)
                .flow(step1)
                .end()
                .build();
    }

    @Bean
    public Step step1() {
        return stepBuilderFactory.get("step1")
                .<String, String> chunk(10)
                .reader(reader())
                .processor(processor.processor())
                .writer(writer()).build();
    }
}
Processor container class
@Component
public class BatchProcessProcessor {

    private MyService service;

    @Autowired
    BatchProcessProcessor(MyService service) {
        this.service = service;
    }

    /**
     * Generate processor utilized for processing
     * @return StringProcessor for testing
     */
    @Bean
    @StepScope
    public DeploymentProcessor processor() {
        return new DeploymentProcessor(service);
    }
}
Actual Processor
@Slf4j
@Component
public class DeploymentProcessor implements ItemProcessor<Deployment, Model> {

    private MyService service;

    @Autowired
    public DeploymentProcessor(MyService service) {
        this.service = service;
    }

    @Override
    public Model process(final Deployment deployment) {
        log.info(String.format("Processing %s details", deployment.getId()));
        Model model = new Model();
        model.setId(deployment.getId());
        return model;
    }
}
As far as I understand, when the BatchConfiguration loads it should inject the BatchProcessProcessor and load the bean with step scope, but that doesn't seem to work.
As I said before, just copy-pasting the @Bean/@StepScope method directly into the BatchConfiguration and returning the same DeploymentProcessor works perfectly and StepScope resolves.
Is this a lifecycle issue?
It does not make sense to declare a bean in a class annotated with @Component:
@Component
public class BatchProcessProcessor {

    private MyService service;

    @Autowired // This is correct, you can autowire collaborators
    public BatchProcessProcessor(MyService service) {
        this.service = service;
    }

    @Bean // THIS IS NOT CORRECT
    @StepScope
    public DeploymentProcessor processor() {
        return new DeploymentProcessor(service);
    }
}
You should rather do it in a configuration class annotated with @Configuration. That's why it works when you do it in BatchConfiguration.
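A minimal sketch of what that could look like, assuming the bean method is moved to a dedicated configuration class (the class name BatchProcessorConfiguration is made up for illustration):

import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class BatchProcessorConfiguration {

    // Declared in a @Configuration class, the @StepScope proxy is created as expected
    // and a fresh DeploymentProcessor is produced for each step execution.
    @Bean
    @StepScope
    public DeploymentProcessor processor(MyService service) {
        return new DeploymentProcessor(service);
    }
}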
I have followed the Spring Batch docs and couldn't get my job to run asynchronously.
I am running the job from a web container, and the job is triggered via a REST endpoint.
I want to get the JobInstance ID and pass it in the response before the whole job completes, so callers can check the status of the job later with the JobInstance ID instead of waiting. But I couldn't get it to work. Below is the sample code I tried. Please let me know what I am missing or doing wrong.
BatchConfig to create an async JobLauncher
@Configuration
public class BatchConfig {

    @Autowired
    JobRepository jobRepository;

    @Bean
    public JobLauncher simpleJobLauncher() throws Exception {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(jobRepository);
        jobLauncher.setTaskExecutor(new SimpleAsyncTaskExecutor());
        jobLauncher.afterPropertiesSet();
        return jobLauncher;
    }
}
Controller
@Autowired
JobLauncher jobLauncher;

@RequestMapping(value = "/trigger-job", method = RequestMethod.GET)
public Long workHard() throws Exception {
    JobParameters jobParameters = new JobParametersBuilder()
            .addLong("time", System.currentTimeMillis())
            .toJobParameters();
    JobExecution jobExecution = jobLauncher.run(batchComponent.customJob("paramhere"), jobParameters);
    System.out.println(jobExecution.getJobInstance().getInstanceId());
    System.out.println("OK RESPONSE");
    return jobExecution.getJobInstance().getInstanceId();
}
And the JobBuilder as a component:
@Component
public class BatchComponent {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    public Job customJob(String someParam) throws Exception {
        return jobBuilderFactory.get("personProcessor")
                .incrementer(new RunIdIncrementer()).listener(listener())
                .flow(personPorcessStep(someParam)).end().build();
    }

    private Step personPorcessStep(String someParam) throws Exception {
        return stepBuilderFactory.get("personProcessStep").<PersonInput, PersonOutput>chunk(1)
                .reader(new PersonReader(someParam)).faultTolerant()
                .skipPolicy(new DataDuplicateSkipper()).processor(new PersonProcessor())
                .writer(new PersonWriter()).build();
    }

    private JobExecutionListener listener() {
        return new PersonJobCompletionListener();
    }

    private class PersonInput {
        String firstName;

        public PersonInput(String firstName) {
            this.firstName = firstName;
        }

        public String getFirstName() {
            return firstName;
        }

        public void setFirstName(String firstName) {
            this.firstName = firstName;
        }
    }

    private class PersonOutput {
        String firstName;

        public String getFirstName() {
            return firstName;
        }

        public void setFirstName(String firstName) {
            this.firstName = firstName;
        }
    }

    public class PersonReader implements ItemReader<PersonInput> {

        private List<PersonInput> items;
        private int count = 0;

        public PersonReader(String someParam) throws InterruptedException {
            Thread.sleep(10000L); // to simulate processing
            // manipulate and provide data in the read method
            // just for testing I have given a dummy example
            items = new ArrayList<PersonInput>();
            PersonInput pi = new PersonInput("john");
            items.add(pi);
        }

        @Override
        public PersonInput read() {
            if (count < items.size()) {
                return items.get(count++);
            }
            return null;
        }
    }

    public class DataDuplicateSkipper implements SkipPolicy {

        @Override
        public boolean shouldSkip(Throwable exception, int skipCount) throws SkipLimitExceededException {
            if (exception instanceof DataIntegrityViolationException) {
                return true;
            }
            return true;
        }
    }

    private class PersonProcessor implements ItemProcessor<PersonInput, PersonOutput> {

        @Override
        public PersonOutput process(PersonInput item) throws Exception {
            return null;
        }
    }

    private class PersonWriter implements org.springframework.batch.item.ItemWriter<PersonOutput> {

        @Override
        public void write(List<? extends PersonOutput> results) throws Exception {
            return;
        }
    }

    private class PersonJobCompletionListener implements JobExecutionListener {

        public PersonJobCompletionListener() {
        }

        @Override
        public void beforeJob(JobExecution jobExecution) {
        }

        @Override
        public void afterJob(JobExecution jobExecution) {
            System.out.println("JOB COMPLETED");
        }
    }
}
Main Function
@SpringBootApplication
@EnableBatchProcessing
@EnableScheduling
@EnableAsync
public class SpringBatchTestApplication {

    public static void main(String[] args) {
        SpringApplication.run(SpringBatchTestApplication.class, args);
    }
}
I am using annotation-based configuration and Gradle with the batch dependency below.
compile('org.springframework.boot:spring-boot-starter-batch')
Please let me know if more info is needed. I couldn't find any example covering this common use case.
Thanks for your time.
Try this: in your configuration you need to create a custom JobLauncher with a SimpleAsyncTaskExecutor using @Bean(name = "myJobLauncher"), and the same name is then used with @Qualifier in your controller.
@Bean(name = "myJobLauncher")
public JobLauncher simpleJobLauncher() throws Exception {
    SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
    jobLauncher.setJobRepository(jobRepository);
    jobLauncher.setTaskExecutor(new SimpleAsyncTaskExecutor());
    jobLauncher.afterPropertiesSet();
    return jobLauncher;
}
In your Controller
@Autowired
@Qualifier("myJobLauncher")
private JobLauncher jobLauncher;
If I look at your code, I see a couple of mistakes.
First of all, your custom config is not loaded, because if it were, the injection would fail with a duplicate bean instance for the same interface.
There's a lot of magic in Spring Boot, but if you don't tell it to do a component scan, nothing will be loaded as expected.
The second problem that I can see is your BatchConfig class: it does not extend DefaultBatchConfigurer, nor override getJobLauncher(), so even if the Boot magic loads everything, you'll get the default one.
Here is a configuration that will work and is compliant with the documentation of the @EnableBatchProcessing API.
BatchConfig
@Configuration
@EnableBatchProcessing(modular = true)
@Slf4j
public class BatchConfig extends DefaultBatchConfigurer {

    @Override
    @Bean
    public JobLauncher getJobLauncher() {
        try {
            SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
            jobLauncher.setJobRepository(getJobRepository());
            jobLauncher.setTaskExecutor(new SimpleAsyncTaskExecutor());
            jobLauncher.afterPropertiesSet();
            return jobLauncher;
        } catch (Exception e) {
            log.error("Can't load SimpleJobLauncher with SimpleAsyncTaskExecutor: {} fallback on default", e);
            return super.getJobLauncher();
        }
    }
}
Main Function
@SpringBootApplication
@EnableScheduling
@EnableAsync
@ComponentScan(basePackageClasses = {BatchConfig.class})
public class SpringBatchTestApplication {

    public static void main(String[] args) {
        SpringApplication.run(SpringBatchTestApplication.class, args);
    }
}
Although you’ve your custom jobLauncher, you’re running the job using default jobLauncher provided by Spring. Could you please autowire simpleJobLauncher in your controller and give it a try?
I know that this is an old question but I post this answer anyway for future users.
After reviewing your code I can't tell why you have this problem, but I can suggest using a @Qualifier annotation plus a ThreadPoolTaskExecutor, like so, and see if it solves your problem.
You may also check this tutorial: Asynchronous Spring Batch Job Processing, for more details. It will help you configure a Spring Batch job asynchronously. This tutorial was written by me.
@Configuration
public class BatchConfig {

    @Autowired
    private JobRepository jobRepository;

    @Bean
    public TaskExecutor threadPoolTaskExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setMaxPoolSize(12);
        executor.setCorePoolSize(8);
        executor.setQueueCapacity(15);
        return executor;
    }

    @Bean
    public JobLauncher asyncJobLauncher() throws Exception {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(jobRepository);
        jobLauncher.setTaskExecutor(threadPoolTaskExecutor());
        return jobLauncher;
    }
}
With JobExecution jobExecution = jobLauncher.run(batchComponent.customJob("paramhere"), jobParameters);, the JobLauncher waits until the job has completed before returning anything, which is why your service is probably taking so long to respond, if that is your problem.
If you want asynchronous capabilities, you might want to look at Spring's @EnableAsync & @Async.
@EnableAsync
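For illustration only, a minimal sketch of that idea (the JobStartService class and its names are assumptions, not code from the question): the controller would call this method and return immediately while the job runs on another thread.

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Service;

@Service
public class JobStartService {

    @Autowired
    private JobLauncher jobLauncher;

    // Runs on Spring's async executor; requires @EnableAsync on a configuration class.
    @Async
    public void startJob(Job job, JobParameters jobParameters) throws Exception {
        // The caller (e.g. the REST controller) is not blocked while the job runs.
        jobLauncher.run(job, jobParameters);
    }
}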
According to the Spring documentation, to return a response to the HTTP request asynchronously it is required to use org.springframework.core.task.SimpleAsyncTaskExecutor.
Any implementation of the Spring TaskExecutor interface can be used to control how jobs are asynchronously executed.
Spring Batch documentation
<bean id="jobLauncher"
      class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
    <property name="jobRepository" ref="jobRepository" />
    <property name="taskExecutor">
        <bean class="org.springframework.core.task.SimpleAsyncTaskExecutor" />
    </property>
</bean>
If you're using Lombok, this might help you:
TL;DR: Lombok's @AllArgsConstructor doesn't seem to work well with the @Qualifier annotation.
EDIT: you have to enable copying of the @Qualifier annotation in the lombok.config file to be able to use @Qualifier with @AllArgsConstructor, like this:
lombok.copyableAnnotations += org.springframework.beans.factory.annotation.Qualifier
I know this is an old question, but I had the exact same problem and none of the answers solved it.
I configured the async job launcher like this and added the qualifier to make sure this jobLauncher is injected:
@Bean(name = "asyncJobLauncher")
public JobLauncher simpleJobLauncher(JobRepository jobRepository) throws Exception {
    SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
    jobLauncher.setJobRepository(jobRepository);
    jobLauncher.setTaskExecutor(new SimpleAsyncTaskExecutor());
    jobLauncher.afterPropertiesSet();
    return jobLauncher;
}
And injected it like this:
@Qualifier("asyncJobLauncher")
private final JobLauncher jobLauncher;
I was using Lombok's @AllArgsConstructor; after changing it to field autowiring, the correct job launcher got injected and the job is now executed asynchronously:
@Autowired
@Qualifier("asyncJobLauncher")
private JobLauncher jobLauncher;
Also, I didn't have to extend my configuration from DefaultBatchConfigurer.
I have a job with several steps and two tasklets, as follows:
@Bean
public Step stepItem() {
    return stepBuilderFactory.get("stepItem")
            .<Item, Item> chunk(10)
            .reader(itemReader())
            .processor(itemProcessor())
            .faultTolerant()
            .writer(itemWriter())
            .build();
}

@Bean
public Step deleteAllItemStep() {
    return stepBuilderFactory.get("deleteAllItemStep")
            .tasklet(itemStepDeleteAllTasklet())
            .build();
}
Here is my custom tasklet:
public class ItemStepDeleteAllTasklet implements Tasklet {

    private ItemRepository itemRepository;

    public ItemStepDeleteAllTasklet(ItemRepository itemRepository) {
        this.itemRepository = itemRepository;
    }

    @Override
    public RepeatStatus execute(StepContribution stepContribution, ChunkContext chunkContext) throws Exception {
        itemRepository.deleteAllInBatch();
        return RepeatStatus.FINISHED;
    }
}
This is my configuration with the main job.
@Configuration
@Import({BatchItemConfiguration.class, BatchCriticalComponentConfiguration.class})
public class BatchConfiguration {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private Step stepCriticalComponent;

    @Autowired
    private Step stepItem;

    @Autowired
    private Step deleteAllItemStep;

    @Autowired
    private Step deleteAllCriticalComponentStep;

    @Bean
    public Job importJob(JobCompletionNotificationListener listener) {
        return jobBuilderFactory.get("importJob")
                .incrementer(new RunIdIncrementer())
                .listener(listener)
                .start(deleteAllCriticalComponentStep)
                .next(deleteAllItemStep)
                .next(stepItem).next(stepCriticalComponent)
                .build();
    }
}
If a step (stepItem or stepCriticalComponent) fails, the data deleted by the tasklet is gone and I cannot recover it.
Is there a way to do a rollback on the entire Job or a rollback before a specific step/tasklet?
There is no concept of a rollback of an entire job or step within Spring Batch. However, you can use a listener (JobExecutionListener or StepExecutionListener) to execute compensating logic to do a rollback by hand.
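For illustration, a minimal sketch of that compensating-listener idea; the ItemBackupService and its methods are hypothetical helpers, not part of the question's code:

import org.springframework.batch.core.BatchStatus;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobExecutionListener;

public class CompensatingJobListener implements JobExecutionListener {

    private final ItemBackupService backupService; // hypothetical helper

    public CompensatingJobListener(ItemBackupService backupService) {
        this.backupService = backupService;
    }

    @Override
    public void beforeJob(JobExecution jobExecution) {
        // Snapshot the data that the delete tasklets are about to remove.
        backupService.backupItems();
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
        if (jobExecution.getStatus() == BatchStatus.FAILED) {
            // Compensating action: restore the snapshot taken before the job ran.
            backupService.restoreItems();
        }
    }
}

Such a listener would be registered on the job with .listener(...), as in the importJob definition above.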
I am trying to process a series of files using Spring Integration in a batch fashion. I have this very old XML which tries to convert the messages into jobs:
<int:transformer ref="messageToJobTransformer"/>
<batch-int:job-launching-gateway job-launcher="jobLauncher"/>
The messageToJobTransformer is a class which can convert a Message into a Job. The problem is that I no longer know where this file is, and I don't want an XML config anyway. I want it to be pure Java DSL. Here is my simple config:
return IntegrationFlows.from(Files.inboundAdapter(directory)
        .preventDuplicates()
        .patternFilter("*.txt"))
        .handle(jobLaunchingGw())
        .get();
And here is my bean for the gateway.
@Autowired
private JobLauncher jobLauncher;

@Bean
public MessageHandler jobLaunchingGw() {
    return new JobLaunchingGateway(jobLauncher);
}
EDIT: updating the BatchConfig class.
@Configuration
@EnableBatchProcessing
public class BatchConfig
{
    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory steps;

    @Bean
    public ItemReader<String> reader(@Value("#{jobParameters['input.file.name']}") String filename) throws MalformedURLException
    {
        FlatFileItemReader<String> reader = new FlatFileItemReader<String>();
        return reader;
    }

    @Bean
    public Job job() throws MalformedURLException
    {
        return jobs.get("job").start(step()).build();
    }

    @Bean
    public Step step() throws MalformedURLException
    {
        return steps.get("step").<String, String> chunk(5).reader(reader())
                .writer(writer()).build();
    }

    @Bean
    public ItemWriter<String> writer(@Value("#{jobParameters['input.file.name']}") String filename)
    {
        FlatFileItemWriter writer = new FlatFileItemWriter();
        return writer;
    }
}
Your question isn't clear. The JobLaunchingGateway expects a JobLaunchRequest as its payload.
Since your integration flow begins with Files.inboundAdapter(directory), I can assume that you have some Job definitions there. So, what you need here is a class which can parse the file and return a JobLaunchRequest.
Something like this from the Spring Batch Reference Manual:
public class FileMessageToJobRequest {

    private Job job;

    private String fileParameterName;

    public void setFileParameterName(String fileParameterName) {
        this.fileParameterName = fileParameterName;
    }

    public void setJob(Job job) {
        this.job = job;
    }

    @Transformer
    public JobLaunchRequest toRequest(Message<File> message) {
        JobParametersBuilder jobParametersBuilder =
                new JobParametersBuilder();
        jobParametersBuilder.addString(fileParameterName,
                message.getPayload().getAbsolutePath());
        return new JobLaunchRequest(job, jobParametersBuilder.toJobParameters());
    }
}
After defining that class as a @Bean, you can use it from the .transform() EIP method just before your .handle(jobLaunchingGw()).
UPDATE
@Bean
public FileMessageToJobRequest fileMessageToJobRequest(Job job) {
    FileMessageToJobRequest fileMessageToJobRequest = new FileMessageToJobRequest();
    fileMessageToJobRequest.setJob(job);
    fileMessageToJobRequest.setFileParameterName("file");
    return fileMessageToJobRequest;
}

...

@Bean
public IntegrationFlow flowToBatch(FileMessageToJobRequest fileMessageToJobRequest) {
    return IntegrationFlows
            .from(Files.inboundAdapter(directory)
                    .preventDuplicates()
                    .patternFilter("*.txt"))
            .transform(fileMessageToJobRequest)
            .handle(jobLaunchingGw())
            .get();
}