Spring Batch - skip on MyException, stop on all other exceptions - java

Goal: if there is an AdmisSkipException (a custom exception), I want the job to skip the record and keep processing the next lines.
If there is any other exception, I want the job to stop.
Here is what I have so far:
Conf:
.<Admis, PreCandidat>chunk(100)
.reader(readerDBAdmis())
.processor(new AdmisItemProcessor(preCandidatRepository, scolFormationSpecialisationRepository, preCandidatureRepository))
.faultTolerant()
.skipPolicy(new AdmisVerificationSkipper(admisRejetRepository))
.writer(writerPGICocktail()).build();
AdmisSkipException:
public class AdmisSkipException extends Exception {
private TypeRejet typeRejet;
private Admis admis;
public AdmisSkipException(TypeRejet typeRejet, Admis admis) {
super();
this.typeRejet = typeRejet;
this.admis = admis;
}
public TypeRejet getTypeRejet() {
return typeRejet;
}
public Admis getAdmis() {
return admis;
}
}
AdmisVerificationSkipper:
public class AdmisVerificationSkipper implements SkipPolicy {
private AdmisRejetRepository admisRejetRepository;
public AdmisVerificationSkipper(AdmisRejetRepository admisRejetRepository) {
this.admisRejetRepository = admisRejetRepository;
}
@Override
public boolean shouldSkip(Throwable exception, int skipCount) throws SkipLimitExceededException {
if (exception instanceof AdmisSkipException) {
AdmisSkipException admisSkipException = (AdmisSkipException) exception;
AdmisRejet rejet = new AdmisRejet();
rejet.setAdmis(admisSkipException.getAdmis());
rejet.setTypeRejet(admisSkipException.getTypeRejet());
admisRejetRepository.save(rejet);
return true;
}
return false;
}
}
With this configuration, if a NullPointerException (for example) is thrown in AdmisItemProcessor, the job will continue instead of failing.
What should I change to stop the job?

if there is an AdmisSkipException (a custom exception), I want the job to skip the record and keep processing the next lines. If there is any other exception, I want the job to stop.
You can achieve this with:
.<Admis, PreCandidat>chunk(100)
.reader(readerDBAdmis())
.processor(new AdmisItemProcessor(preCandidatRepository, scolFormationSpecialisationRepository, preCandidatureRepository))
.writer(writerPGICocktail())
.faultTolerant()
.skip(AdmisSkipException.class)
.skipLimit(SKIP_LIMIT)
.build();
Looking at your code, you probably had to create a custom skip policy because you want to save skipped items somewhere. I would recommend using a SkipListener instead, which is designed specifically for this type of requirement. Having the shouldSkip method save items to a repository is a side effect, so this is better done with a listener. With that in place, you won't need a custom policy, and .skip(AdmisSkipException.class).skipLimit(SKIP_LIMIT) should be enough.
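For illustration, here is a minimal sketch of such a SkipListener; it reuses the AdmisRejet mapping from your skip policy (the class name is arbitrary):

public class AdmisSkipListener implements SkipListener<Admis, PreCandidat> {

    private final AdmisRejetRepository admisRejetRepository;

    public AdmisSkipListener(AdmisRejetRepository admisRejetRepository) {
        this.admisRejetRepository = admisRejetRepository;
    }

    @Override
    public void onSkipInRead(Throwable t) {
        // nothing to persist: the item could not even be read
    }

    @Override
    public void onSkipInProcess(Admis item, Throwable t) {
        // same logic you currently have in shouldSkip, but as a dedicated callback
        if (t instanceof AdmisSkipException) {
            AdmisSkipException e = (AdmisSkipException) t;
            AdmisRejet rejet = new AdmisRejet();
            rejet.setAdmis(e.getAdmis());
            rejet.setTypeRejet(e.getTypeRejet());
            admisRejetRepository.save(rejet);
        }
    }

    @Override
    public void onSkipInWrite(PreCandidat item, Throwable t) {
        // handle items skipped while writing, if needed
    }
}

You would register it on the fault-tolerant step with .listener(new AdmisSkipListener(admisRejetRepository)), next to .skip(AdmisSkipException.class).skipLimit(SKIP_LIMIT).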
With this configuration, if a NullPointerException (for example) is thrown in AdmisItemProcessor, the job will continue instead of failing. What should I change to stop the job?
Here is an example you can run to see how it works:
import java.util.Arrays;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.support.ListItemReader;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.lang.Nullable;
@Configuration
@EnableBatchProcessing
public class MyJob {
@Autowired
private JobBuilderFactory jobs;
@Autowired
private StepBuilderFactory steps;
@Bean
public ItemReader<Integer> itemReader() {
return new ListItemReader<>(Arrays.asList(1, 2, 3, 4, 5, 6, 7, 8, 9, 10));
}
@Bean
public ItemProcessor<Integer, Integer> itemProcessor() {
return new ItemProcessor<Integer, Integer>() {
@Nullable
@Override
public Integer process(Integer item) throws Exception {
if (item.equals(3)) {
throw new IllegalArgumentException("No 3!");
}
if (item.equals(9)) {
throw new NullPointerException("Boom at 9!");
}
return item;
}
};
}
@Bean
public ItemWriter<Integer> itemWriter() {
return items -> {
for (Integer item : items) {
System.out.println("item = " + item);
}
};
}
@Bean
public Step step() {
return steps.get("step")
.<Integer, Integer>chunk(1)
.reader(itemReader())
.processor(itemProcessor())
.writer(itemWriter())
.faultTolerant()
.skip(IllegalArgumentException.class)
.skipLimit(3)
.build();
}
@Bean
public Job job() {
return jobs.get("job")
.start(step())
.build();
}
public static void main(String[] args) throws Exception {
ApplicationContext context = new AnnotationConfigApplicationContext(MyJob.class);
JobLauncher jobLauncher = context.getBean(JobLauncher.class);
Job job = context.getBean(Job.class);
JobExecution jobExecution = jobLauncher.run(job, new JobParameters());
System.out.println(jobExecution);
}
}
This example skips items when an IllegalArgumentException is thrown and fails the job when a NullPointerException happens.
Hope this helps.

Related

Spring Batch stop job execution from external class

I have an existing Spring Batch project that has multiple steps. I want to modify a step so I can stop the job: jobExecution.getStatus() == STOPPED.
My step :
@Autowired
public StepBuilderFactory stepBuilderFactory;
@Autowired
private StepReader reader;
@Autowired
private StepProcessor processor;
@Autowired
private StepWriter writer;
@Autowired
public GenericListener listener;
@Bean
@JobScope
@Qualifier("mystep")
public Step MyStep() throws ReaderException {
return stepBuilderFactory.get("mystep")
.reader(reader.read())
.listener(listener)
.processor(processor)
.writer(writer)
.build();
}
GenericListener implements ItemReadListener, ItemProcessListener, and ItemWriteListener, and overrides the before and after methods, which basically write logs.
The focus here is on the StepReader class and its read() method that returns a FlatFileItemReader :
@Component
public class StepReader {
public static final String DELIMITER = "|";
@Autowired
private ClassToAccessProperties classToAccessProperties;
private Logger log = Logger.create(StepReader.class);
@Autowired
private FlatFileItemReaderFactory<MyObject> flatFileItemReaderFactory;
public ItemReader<MyObject> read() throws ReaderException {
try {
String csv = classToAccessProperties.getInputCsv();
FlatFileItemReader<MyObject> reader = flatFileItemReaderFactory.create(csv, getLineMapper());
return reader;
} catch (ReaderException | EmptyInputfileException | IOException e) {
throw new ReaderException(e);
} catch (NoInputFileException e) {
log.info("Oh no !! No input file");
// Here I want to stop the job
return null;
}
}
private LineMapper<MyObject> getLineMapper () {
DefaultLineMapper<MyObject> mapper = new DefaultLineMapper<>();
DelimitedLineTokenizer delimitedLineTokenizer = new DelimitedLineTokenizer();
delimitedLineTokenizer.setDelimiter(DELIMITER);
mapper.setLineTokenizer(delimitedLineTokenizer);
mapper.setFieldSetMapper(new MyObjectFieldSetMapper());
return mapper;
}
}
I tried to implement StepExecutionListener in StepReader, but with no luck; I think it's because the reader() method of the step builder only expects the ItemReader returned by reader.read() and doesn't care about the rest of the class.
I'm looking for ideas or a solution to stop the entire job (not fail it) when NoInputFileException is caught.
I'm looking for ideas or a solution to stop the entire job (not fail it) when NoInputFileException is caught.
This is a common pattern and is described in detail in the Handling Step Completion When No Input is Found section of the reference documentation. The example in that section shows how to fail a job when no input file is found, but since you want to stop the job instead of failing it, you can call StepExecution#setTerminateOnly() in the listener, and your job will end with status STOPPED. In your example, you would add that listener to the MyStep step.
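For illustration, a minimal sketch of such a listener (assuming you want to stop when the step has read nothing, which is how the "no input file" case shows up here; the class name is arbitrary):

public class StopOnMissingInputListener extends StepExecutionListenerSupport {

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        if (stepExecution.getReadCount() == 0) {
            // request a graceful stop: the job ends with status STOPPED instead of FAILED
            stepExecution.setTerminateOnly();
        }
        return null; // keep the step's own exit status
    }
}

It would be registered on MyStep with .listener(new StopOnMissingInputListener()).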
However, I would suggest adding a pre-validation step and stopping the job if there is no file. Here is a quick example:
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@Configuration
@EnableBatchProcessing
public class MyJob {
@Autowired
private JobBuilderFactory jobs;
@Autowired
private StepBuilderFactory steps;
@Bean
public Step fileValidationStep() {
return steps.get("fileValidationStep")
.tasklet((contribution, chunkContext) -> {
// TODO add code to check if the file exists
System.out.println("file not found");
chunkContext.getStepContext().getStepExecution().setTerminateOnly();
return RepeatStatus.FINISHED;
})
.build();
}
@Bean
public Step fileProcessingStep() {
return steps.get("fileProcessingStep")
.tasklet((contribution, chunkContext) -> {
System.out.println("processing file");
return RepeatStatus.FINISHED;
})
.build();
}
@Bean
public Job job() {
return jobs.get("job")
.start(fileValidationStep())
.next(fileProcessingStep())
.build();
}
public static void main(String[] args) throws Exception {
ApplicationContext context = new AnnotationConfigApplicationContext(MyJob.class);
JobLauncher jobLauncher = context.getBean(JobLauncher.class);
Job job = context.getBean(Job.class);
JobExecution jobExecution = jobLauncher.run(job, new JobParameters());
System.out.println("Job status: " + jobExecution.getExitStatus().getExitCode());
}
}
The example prints:
file not found
Job status: STOPPED
Hope this helps.

Spring Batch : How to read footer of CSV file and validation using FlatFileItemReader

I am using Spring Batch and FlatFileItemReader to read a .CSV file. The file has a header (first line), detail lines, and a footer (last line). I want to validate the total number of detail lines against the footer line.
This is my example .csv file.
movie.csv
Name|Type|Year
Notting Hill|romantic comedy|1999
Toy Story 3|Animation|2010
Captain America: The First Avenger|Action|2011
3
From the example file:
The first line is a header (and I ignore it).
Lines 2-4 are detail lines, and the last line is a footer.
I want to read the footer and get its value (last line = 3),
then get the total number of detail records (in this case, 3 lines),
and finally validate that the total from the footer (3) equals the total number of detail records (3).
And this is my code:
@Bean
@StepScope
public FlatFileItemReader<Movie> movieItemReader(String filePath) {
FlatFileItemReader<Movie> reader = new FlatFileItemReader<>();
reader.setLinesToSkip(1); //skip header line
reader.setResource(new PathResource(filePath));
DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer("|");
DefaultLineMapper<Movie> movieLineMapper = new DefaultLineMapper<>();
FieldSetMapper<Movie> movieMapper = movieFieldSetMapper();
movieLineMapper.setLineTokenizer(tokenizer);
movieLineMapper.setFieldSetMapper(movieMapper);
movieLineMapper.afterPropertiesSet();
reader.setLineMapper(movieLineMapper);
return reader;
}
public FieldSetMapper<Movie> movieFieldSetMapper() {
BeanWrapperFieldSetMapper<Movie> movieMapper = new BeanWrapperFieldSetMapper<>();
movieMapper.setTargetType(Movie.class);
return movieMapper;
}
You can use a chunk-oriented step as a validation step before your job's business logic. This step would use an ItemReadListener to save the last item and a StepExecutionListener for the validation. Here is a quick example:
import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.ItemReadListener;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.StepExecutionListener;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.listener.StepExecutionListenerSupport;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.mapping.PassThroughLineMapper;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ByteArrayResource;
@Configuration
@EnableBatchProcessing
public class MyJob {
@Autowired
private JobBuilderFactory jobs;
@Autowired
private StepBuilderFactory steps;
@Bean
@StepScope
public FlatFileItemReader<String> itemReader() {
FlatFileItemReader<String> reader = new FlatFileItemReader<>();
reader.setLinesToSkip(1); //skip header line
reader.setResource(new ByteArrayResource("header\nitem1\nitem2\n2".getBytes()));
reader.setLineMapper(new PassThroughLineMapper());
return reader;
}
@Bean
public ItemWriter<String> itemWriter() {
return items -> {
for (String item : items) {
System.out.println("item = " + item);
}
};
}
@Bean
public Step step1() {
MyListener myListener = new MyListener();
return steps.get("step1")
.<String, String>chunk(5)
.reader(itemReader())
.writer(itemWriter())
.listener((ItemReadListener<String>) myListener)
.listener((StepExecutionListener) myListener)
.build();
}
@Bean
public Step step2() {
return steps.get("step2")
.tasklet((contribution, chunkContext) -> {
System.out.println("Total count is ok as validated by step1");
return RepeatStatus.FINISHED;
})
.build();
}
@Bean
public Job job() {
return jobs.get("job")
.start(step1())
.next(step2())
.build();
}
static class MyListener extends StepExecutionListenerSupport implements ItemReadListener<String> {
private String lastItem;
@Override
public void beforeRead() {
}
@Override
public void afterRead(String item) {
this.lastItem = item;
}
@Override
public void onReadError(Exception ex) {
}
@Override
public ExitStatus afterStep(StepExecution stepExecution) {
int readCount = stepExecution.getReadCount();
int totalCountInFooter = Integer.valueOf(this.lastItem); // TODO sanity checks (number format, etc)
System.out.println("readCount = " + (readCount - 1)); // substract footer from the read count
System.out.println("totalCountInFooter = " + totalCountInFooter);
// TODO do validation on readCount vs totalCountInFooter
return ExitStatus.COMPLETED; // return appropriate exit status according to validation result
}
}
public static void main(String[] args) throws Exception {
ApplicationContext context = new AnnotationConfigApplicationContext(MyJob.class);
JobLauncher jobLauncher = context.getBean(JobLauncher.class);
Job job = context.getBean(Job.class);
jobLauncher.run(job, new JobParameters());
}
}
This example prints:
item = item1
item = item2
item = 2
readCount = 2
totalCountInFooter = 2
Total count is ok as validated by step1
Hope this helps.

Spring Batch restart: job starts from the initial stage instead of where it left off?

I want to implement restart functionality so the job resumes instead of starting from the initial stage. I am facing two issues.
First problem: the very first time I restart the job, it creates a new job instance id and behaves like a fresh job. The second time, it restarts and runs with the same job instance id. (I send the execution id from a REST controller.)
Second problem: it starts from the initial stage when I restart it.
Custom Reader:
package com.orange.alc.dabekdataload.reader;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.annotation.AfterStep;
import org.springframework.batch.core.annotation.BeforeStep;
import org.springframework.batch.item.ExecutionContext;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemStream;
import org.springframework.batch.item.ItemStreamException;
import org.springframework.batch.item.NonTransientResourceException;
import org.springframework.batch.item.ParseException;
import org.springframework.batch.item.UnexpectedInputException;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.mapping.DefaultLineMapper;
import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Scope;
import org.springframework.context.annotation.ScopedProxyMode;
import org.springframework.core.io.FileSystemResource;
import org.springframework.stereotype.Component;
import com.orange.alc.dabekdataload.constants.PostalHeader;
import com.orange.alc.dabekdataload.dto.PostalDto;
#Component("itemReader")
#Scope(value = "step", proxyMode = ScopedProxyMode.TARGET_CLASS)
public class PostalReader implements ItemReader<PostalDto>, ItemStream{
private static final Logger LOGGER = LoggerFactory.getLogger(PostalReader.class);
#Value("#{jobParameters[fullPathFileName]}")
public String fileName;
private int currentIndex = 0;
private static final String CURRENT_INDEX = "current.index";
private FlatFileItemReader<PostalDto> reader;
@BeforeStep
public void beforeStep(StepExecution stepExecution) {
LOGGER.info("Executing batch reader...");
reader = new FlatFileItemReader<>();
reader.setResource(new FileSystemResource(fileName));
reader.setLinesToSkip(1);
reader.setLineMapper(new DefaultLineMapper<PostalDto>() {{
setLineTokenizer(new DelimitedLineTokenizer() {{
setNames(PostalHeader.getPostalColumnNames());
}});
setFieldSetMapper(new PostalFieldSetMapper());
}});
reader.setSaveState(true);
reader.open(stepExecution.getExecutionContext());
}
@Override
public PostalDto read() throws Exception, UnexpectedInputException, ParseException, NonTransientResourceException {
reader.setCurrentItemCount(currentIndex++);
return reader.read();
}
@AfterStep
public void afterStep(StepExecution stepExecution) {
LOGGER.info("Closing the reader...");
reader.close();
}
@Override
public void open(ExecutionContext executionContext) throws ItemStreamException {
if(executionContext.containsKey(CURRENT_INDEX)){
currentIndex = new Long(executionContext.getLong(CURRENT_INDEX)).intValue();
} else{
currentIndex = 0;
}
}
@Override
public void update(ExecutionContext executionContext) throws ItemStreamException {
executionContext.putLong(CURRENT_INDEX, new Long(currentIndex).longValue());
}
@Override
public void close() throws ItemStreamException {
}
}
Job Restart Code:
@Override
public void restartJob(Long jobId) throws JobInstanceAlreadyCompleteException, NoSuchJobExecutionException, NoSuchJobException, JobRestartException, JobParametersInvalidException {
LOGGER.info("Restarting job with JobId: {}", jobId);
jobOperator.restart(jobId);
}
Please let me know in case you need any code from my side.
The delegate reader (FlatFileItemReader) used in your custom reader (PostalReader) is not honouring the ItemStream contract. You need to call open/update/close on the delegate reader in the corresponding open/update/close methods of your item reader. Something like:
public class PostalReader implements ItemReader<PostalDto>, ItemStream{
private FlatFileItemReader<PostalDto> reader;
@Override
public void open(ExecutionContext executionContext) throws ItemStreamException {
reader.open(executionContext);
}
@Override
public void update(ExecutionContext executionContext) throws ItemStreamException {
reader.update(executionContext);
}
@Override
public void close() throws ItemStreamException {
reader.close();
}
}
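As a side note, once the state handling is delegated like this, the read() method can simply delegate as well; the manual currentIndex bookkeeping and the setCurrentItemCount call should no longer be needed, because FlatFileItemReader saves and restores its own position when saveState is true. A sketch:

@Override
public PostalDto read() throws Exception {
    // the delegate tracks and restores its own position via the execution context
    return reader.read();
}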
The update method from ItemStream puts the state into the current step's execution context. But when you restart the job, the step is re-run as a new step and it cannot get the execution context of the previous step. In order to start processing the data from where it left off on a restart, you need to save the state in the job execution context.
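If you do want to copy keys from the step execution context to the job execution context, Spring Batch provides ExecutionContextPromotionListener for that. A minimal sketch, assuming the "current.index" key used by the reader above:

@Bean
public ExecutionContextPromotionListener promotionListener() {
    ExecutionContextPromotionListener listener = new ExecutionContextPromotionListener();
    // keys listed here are copied from the step context to the job context when the step completes
    listener.setKeys(new String[] { "current.index" });
    return listener;
}

It would then be registered on the step with .listener(promotionListener()).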

Using Spring Batch to convert rows from an input CSV file to an output CSV file with a 1-to-many relation

I've already published this question:
How to use Spring Batch read CSV,process it and write it as a CSV with one row can produce more than one row?
And also reviewed these relevant answers:
Spring Batch - Using an ItemWriter with List of Lists
But I still can't figure out how to use Spring Batch in order to:
Read a row from the input CSV file.
Process it and produce one or more output rows.
Write the output rows into the output file.
I know that the solution should be implementing a writer that accepts a list of items and uses a "delegate" in some way in order to process the items one by one.
I would appreciate it if someone could shed some light on this.
My code:
public class CsvRowsProcessor implements ItemProcessor<RowInput, List<RowOutput>>{
@Override
public List<RowOutput> process(final RowInput rowInput) {
final String id = rowInput.getId();
final String title = rowInput.getTitle();
final String description = rowInput.getDescription();
final RowOutput transformedRowInput = new RowOutput(id, title, description);
List<RowOutput> rows=new LinkedList<>();
rows.add(transformedRowInput);
return rows;
}
}
@Bean
ItemWriter<RowOutput> csvRowsWriter() {
FlatFileItemWriter<RowOutput> csvFileWriter = new FlatFileItemWriter<>();
csvFileWriter.setResource(new FileSystemResource("C:\\Users\\orenl\\IdeaProjects\\Spring-Batch-CSV-Example\\src\\main\\resources\\outputFile.csv"));
LineAggregator<RowOutput> lineAggregator = createLineAggregator();
csvFileWriter.setLineAggregator(lineAggregator);
csvFileWriter.setHeaderCallback(new FlatFileHeaderCallback() {
public void writeHeader(Writer writer) throws IOException {
writer.write("Id,Title,Description");
}
});
return csvFileWriter;
}
private LineAggregator<RowOutput> createLineAggregator() {
DelimitedLineAggregator<RowOutput> lineAggregator = new DelimitedLineAggregator<>();
lineAggregator.setDelimiter(",");
FieldExtractor<RowOutput> fieldExtractor = createFieldExtractor();
lineAggregator.setFieldExtractor(fieldExtractor);
return lineAggregator;
}
private FieldExtractor<RowOutput> createFieldExtractor() {
BeanWrapperFieldExtractor<RowOutput> extractor = new BeanWrapperFieldExtractor<>();
extractor.setNames(new String[] { "Id", "Title", "Description" });
return extractor;
}
@Bean
public Step csvFileToFileStep() {
return stepBuilderFactory.get("csvFileToFileStep")
.<RowInput ,RowOutput>chunk(1)
.reader(csvRowsReader())
.processor(csvRowsProcessor())
.writer(csvRowsWriter())
.build();
}
@Bean
Job csvFileToCsvJob(JobCompletionNotificationListener listener) {
return jobBuilderFactory.get("csvFileToCsvJob")
.incrementer(new RunIdIncrementer())
.listener(listener)
.flow(csvFileToFileStep())
.end()
.build();
}
Here is an example:
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.support.ListItemReader;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@Configuration
@EnableBatchProcessing
public class MyJob {
@Autowired
private JobBuilderFactory jobs;
@Autowired
private StepBuilderFactory steps;
@Bean
public ItemReader<Integer> itemReader() {
return new ListItemReader<>(Arrays.asList(1, 3, 5, 7, 9));
}
@Bean
public ItemProcessor<Integer, List<Integer>> itemProcessor() {
return item -> {
List<Integer> result = new ArrayList<>();
result.add(item);
result.add(item + 1);
return result;
};
}
@Bean
public ItemWriter<List<Integer>> itemWriter() {
return items -> {
for (List<Integer> item : items) {
for (Integer integer : item) {
System.out.println("integer = " + integer);
}
}
};
}
@Bean
public Step step() {
return steps.get("step")
.<Integer, List<Integer>>chunk(2)
.reader(itemReader())
.processor(itemProcessor())
.writer(itemWriter())
.build();
}
@Bean
public Job job() {
return jobs.get("job")
.start(step())
.build();
}
public static void main(String[] args) throws Exception {
ApplicationContext context = new AnnotationConfigApplicationContext(MyJob.class);
JobLauncher jobLauncher = context.getBean(JobLauncher.class);
Job job = context.getBean(Job.class);
jobLauncher.run(job, new JobParameters());
}
}
This sample reads some numbers and, for each number, returns the number and its successor, then prints the numbers to standard output. The example shows how the processing of one item can return multiple items.
It prints:
integer = 1
integer = 2
integer = 3
integer = 4
integer = 5
integer = 6
integer = 7
integer = 8
integer = 9
integer = 10
You can adapt the sample to read/write from/to files.
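For instance, to keep the FlatFileItemWriter from your question, one option (a sketch, assuming the csvRowsWriter() bean is declared with return type FlatFileItemWriter<RowOutput>) is a wrapper writer that flattens the lists and delegates to it:

@Bean
ItemWriter<List<RowOutput>> flatteningWriter(FlatFileItemWriter<RowOutput> csvRowsWriter) {
    return lists -> {
        List<RowOutput> flattened = new ArrayList<>();
        for (List<RowOutput> rows : lists) {
            flattened.addAll(rows);
        }
        // delegate the flattened rows to the regular FlatFileItemWriter
        csvRowsWriter.write(flattened);
    };
}

The step would then be declared as .<RowInput, List<RowOutput>>chunk(...) with this writer, and since the delegate is hidden behind a lambda, it should also be registered on the step with .stream(csvRowsWriter) so it is opened and closed properly.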
Hope this helps.

Spring Batch: How to catch the exception message with the skip method in Spring Batch?

I am a novice at Spring Batch.
My question is how to catch exceptions with the skip method in Spring Batch.
As I know, we can use the skip method to skip items when certain exceptions happen in Spring Batch.
But how can I get the exception message with the skip method?
Somebody suggested that I use SkipListener; this class has three callback methods like onSkipInProcess(), but it was of no use to me.
And ItemProcessListener did not work either.
The code is like below (I use the skip method to ignore the exceptions and two listeners to receive the exception info):
Step mainStep = stepBuilder.get("run")
.<ItemProcessing, ItemProcessing>chunk(5)
.faultTolerant()
.skip(IOException.class).skip(SocketTimeoutException.class)//skip IOException here
.skipLimit(2000)
.reader(reader)
.processor(processor)
.writer(writer)
.listener(stepExecListener)
.listener(new ItemProcessorListener()) //add process listener
.listener(new SkipExceptionListener()) //add skip exception listner
.build();
ItemProcessorListener like below:
//(this class implements ItemProcessListener )
{
@Override
public void beforeProcess(Object item) {
// TODO Auto-generated method stub
}
@Override
public void afterProcess(Object item, Object result) {
logger.info("invoke remote finished, item={},result={}",item,result);
}
@Override
public void onProcessError(Object item, Exception e) {
logger.error("invoke remote error, item={},exception={},{}",item,e.getMessage(),e);
}
}
SkipExceptionListener like below:
//(implements SkipListener<Object, Object>)
{
@Override
public void onSkipInRead(Throwable t) {
// TODO Auto-generated method stub
}
@Override
public void onSkipInWrite(Object item, Throwable t) {
// TODO Auto-generated method stub
}
@Override
public void onSkipInProcess(Object item, Throwable t) {
logger.info("invoke remote finished,item={},itemJsonStr={},errMsg={},e={}",
item,
JSONObject.toJSONString(item),
t.getMessage(),
t);
}
}
The issue is that none of the loggers worked. The skip method itself actually works well; I can see the skip count in the batch_step_execution table. I am not sure whether these two listeners are ever called back. Can anyone tell me what I should do, or whether there is something else I should use? Thanks a lot.
How to catch exception message with skip method in spring batch?
You can do that by implementing the SkipListener interface or extending the SkipListenerSupport class. All methods in the SkipListener interface have a Throwable parameter, which is the exception that was thrown and caused the item to be skipped. This is where you can get the exception message. Here is an example:
import java.util.Arrays;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.SkipListener;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.support.ListItemReader;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@Configuration
@EnableBatchProcessing
public class MyJob {
@Autowired
private JobBuilderFactory jobs;
@Autowired
private StepBuilderFactory steps;
@Bean
public ItemReader<Integer> itemReader() {
return new ListItemReader<>(Arrays.asList(1, 2, 3, 4, 5, 6, 7, 8, 9, 10));
}
@Bean
public ItemWriter<Integer> itemWriter() {
return items -> {
for (Integer item : items) {
System.out.println("item = " + item);
}
};
}
@Bean
public ItemProcessor<Integer, Integer> itemProcessor() {
return item -> {
if (item.equals(7)) {
throw new IllegalArgumentException("Sevens are not accepted!!");
}
return item;
};
}
@Bean
public Step step() {
return steps.get("step")
.<Integer, Integer>chunk(5)
.reader(itemReader())
.processor(itemProcessor())
.writer(itemWriter())
.faultTolerant()
.skip(IllegalArgumentException.class)
.skipLimit(3)
.listener(new MySkipListener())
.build();
}
@Bean
public Job job() {
return jobs.get("job")
.start(step())
.build();
}
public static class MySkipListener implements SkipListener<Integer, Integer> {
@Override
public void onSkipInRead(Throwable t) {
}
@Override
public void onSkipInWrite(Integer item, Throwable t) {
}
@Override
public void onSkipInProcess(Integer item, Throwable t) {
System.out.println("Item " + item + " was skipped due to: " + t.getMessage());
}
}
public static void main(String[] args) throws Exception {
ApplicationContext context = new AnnotationConfigApplicationContext(MyJob.class);
JobLauncher jobLauncher = context.getBean(JobLauncher.class);
Job job = context.getBean(Job.class);
jobLauncher.run(job, new JobParameters());
}
}
In this example, MySkipListener implements SkipListener and gets the message from the exception as you are trying to do. The example reads numbers from 1 to 10 and skips number 7. You can run the main method and should see the following output:
item = 1
item = 2
item = 3
item = 4
item = 5
item = 6
item = 8
item = 9
item = 10
Item 7 was skipped due to: Sevens are not accepted!!
I hope this helps.
I couldn't get it to work either by implementing SkipListener (it would be nice to know why), but in the end I went with the annotation approach, which is only briefly mentioned in the docs. Somebody had a similar issue using the implementation approach (question), and the answer there uses this annotation method instead of implementing the interface.
Example bean:
@Component
public class CustomSkippedListener {
@OnSkipInRead
public void onSkipInRead(Throwable throwable) {
}
@OnSkipInWrite
public void onSkipInWrite(FooWritingDTO fooWritingDTO, Throwable throwable) {
LOGGER.info("balabla" + throwable.getMessage());
}
@OnSkipInProcess
public void onSkipInProcess(FooLoaderDTO fooLoaderDTO, Throwable throwable) {
LOGGER.info("blabla" + throwable.getMessage());
}
private static final Logger LOGGER = LoggerFactory.getLogger(CustomSkippedListener.class);
}
Then autowire it and include it in the step chain as you did.
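Registering it could look like this (a sketch based on the step definition from the question, with the CustomSkippedListener bean autowired as customSkippedListener):

Step mainStep = stepBuilder.get("run")
        .<ItemProcessing, ItemProcessing>chunk(5)
        .faultTolerant()
        .skip(IOException.class).skip(SocketTimeoutException.class)
        .skipLimit(2000)
        .reader(reader)
        .processor(processor)
        .writer(writer)
        .listener(customSkippedListener) // the @OnSkipIn* annotated methods are detected on a fault-tolerant step
        .build();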
