I'm trying to create a Spring Batch job using a ListItemReader<String>, ItemProcessor<String, String> and ItemWriter<String>.
The XML looks like the following,
<job id="sourceJob" xmlns="http://www.springframework.org/schema/batch">
    <step id="step1" next="step2">
        <tasklet>
            <chunk reader="svnSourceItemReader"
                   processor="metadataItemProcessor"
                   writer="metadataItemWriter"
                   commit-interval="1" />
        </tasklet>
    </step>
    <step id="step2">
        <tasklet ref="lastRevisionLoggerTasklet"></tasklet>
    </step>
</job>
<bean id="svnSourceItemReader"
      class="com.example.repository.batch.SvnSourceItemReader"
      scope="prototype">
    <constructor-arg index="0">
        <list>
            <value>doc1.xkbml</value>
            <value>doc2.xkbml</value>
            <value>doc3.xkbml</value>
        </list>
    </constructor-arg>
</bean>
<bean id="metadataItemProcessor"
      class="com.example.repository.batch.MetadataItemProcessor"
      scope="prototype" />
<bean id="metadataItemWriter"
      class="com.example.repository.batch.MetadataItemWriter"
      scope="prototype" />
The reader, processor and writer are vanilla:
public class SvnSourceItemReader extends ListItemReader<String> {
    public SvnSourceItemReader(List<String> list) {
        super(list);
        System.out.println("Reading data list " + list);
    }

    @Override
    public String read() {
        String out = super.read();
        System.out.println("Reading data " + out);
        return out;
    }
}
public class MetadataItemProcessor implements ItemProcessor<String, String> {
    @Override
    public String process(String i) throws Exception {
        System.out.println("Processing " + i + " : documentId " + documentId);
        return i;
    }
}
public class MetadataItemWriter implements ItemWriter<String> {
    @Override
    public void write(List<? extends String> list) throws Exception {
        System.out.println("Writing " + list);
    }
}
The job is started like this, on a schedule of every 10 seconds:
long nanoBits = System.nanoTime() % 1000000L;
if (nanoBits < 0) {
    nanoBits *= -1;
}
String dateParam = new Date().toString() + System.currentTimeMillis()
        + "." + nanoBits;
param = new JobParametersBuilder().addString("date", dateParam)
        .toJobParameters();
JobExecution execution = jobLauncher.run(job, param);
When the application starts, I see it read, process and write each of the three items in the list passed to the reader.
Reading data doc1.xkbml
Processing doc1.xkbml : documentId doc1
Writing [doc1.xkbml]
Reading data doc2.xkbml
Processing doc2.xkbml : documentId doc2
Writing [doc2.xkbml]
Reading data doc3.xkbml
Processing doc3.xkbml : documentId doc3
Writing [doc3.xkbml]
Because this sourceJob is on a scheduled timer, I expected to see that list processed every 10 seconds, but instead on all subsequent runs I see only:
Reading data null
Does anyone know why this is happening? I'm new to Spring Batch and just can't get my head around the issue.
Thanks /w
The problem is that you marked your reader as scope="prototype". It should be scope="step".
In a Spring Batch context, the two scopes that matter are singleton (the default) and step; prototype does not help here, because the reader is still instantiated only once when it is wired into the step, not once per step execution.
From the javadoc:
StepScope: Scope for step context. Objects in this scope use the
Spring container as an object factory, so there is only one instance
of such a bean per executing step. All objects in this scope are
&lt;aop:scoped-proxy/&gt; (no need to decorate the bean definitions).
and
Using a scope of Step is required in order to use late binding since
the bean cannot actually be instantiated until the Step starts, which
allows the attributes to be found.
During Spring context startup, look at your log and you will see these lines:
INFO: Done executing SQL script from class path resource
[org/springframework/batch/core/schema-hsqldb.sql] in 9 ms.
Reading data list [doc1.xkbml, doc2.xkbml, doc3.xkbml]
As you can see, your reader has already been created and is managed as a singleton; dynamic beans in a Spring Batch context should be given the special step scope, so that Spring creates a fresh copy of the bean every time a step is executed.
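Concretely, the fix is just the original reader bean re-declared with scope="step"; this is a sketch of the corrected definition (the step scope is registered automatically when you use the batch XML namespace, so no extra configuration is needed):

```xml
<bean id="svnSourceItemReader"
      class="com.example.repository.batch.SvnSourceItemReader"
      scope="step">
    <constructor-arg index="0">
        <list>
            <value>doc1.xkbml</value>
            <value>doc2.xkbml</value>
            <value>doc3.xkbml</value>
        </list>
    </constructor-arg>
</bean>
```

With this in place, a fresh reader (and a fresh backing list) is created for every step execution.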
In your reader, ListItemReader.read() is written as:
public T read() {
    if (!list.isEmpty()) {
        return list.remove(0);
    }
    return null;
}
On each read, an item is removed from the original list! The reader is constructed once, so on the second job execution the list is already empty.
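The consequence is easy to reproduce without Spring at all. The sketch below (hypothetical class names) mimics ListItemReader's destructive read() and shows why one shared instance returns null on every run after the first:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Minimal stand-in for ListItemReader: read() removes the head of the list.
class ConsumingReader {
    private final List<String> list;

    ConsumingReader(List<String> items) {
        this.list = new ArrayList<>(items);
    }

    String read() {
        return list.isEmpty() ? null : list.remove(0);
    }
}

public class SingletonReaderDemo {
    public static void main(String[] args) {
        // Singleton scope: one reader instance shared across job executions.
        ConsumingReader reader = new ConsumingReader(Arrays.asList("doc1", "doc2", "doc3"));

        // First job execution drains the backing list.
        String item;
        while ((item = reader.read()) != null) {
            System.out.println("Reading data " + item);
        }

        // Second job execution reuses the same instance: the list is already empty.
        System.out.println("Second run reads: " + reader.read()); // prints "Second run reads: null"
    }
}
```

With scope="step", each run would get its own ConsumingReader and the loop would start from a full list again.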
Just an additional piece of information: you can also use JavaConfig instead of the XML config file and annotate the reader bean declaration with @StepScope.
For example:
@Configuration
@EnableBatchProcessing
public class MyConfig {
    ...
    @Bean
    @StepScope
    public ItemReader<Person> readerHeadingBreakevenAssociationList() {
        return new ListItemReader<Person>(myList);
    }
}
Related
I am implementing a batch application using Spring Boot 2.4.3 + JSR-352. There is a simple batchlet class (SleepyBatchlet) defined. I am trying to reference it in the JSL, but it fails with a ClassNotFoundException when the job is started using the JobOperator.
sleepy-batchlet.xml:
<job xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://xmlns.jcp.org/xml/ns/javaee" xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/javaee http://xmlns.jcp.org/xml/ns/javaee/jobXML_1_0.xsd" restartable="true" version="1.0" id="sleepy-batchlet">
    <step id="step1">
        <batchlet ref="sleepyBatchlet">
            <properties>
                <property name="sleep.time.seconds" value="#{jobParameters['sleep.time.seconds']}" />
            </properties>
        </batchlet>
    </step>
</job>
Below is my batchlet class, which is annotated with @Named:
@Named
public class SleepyBatchlet extends AbstractBatchlet {

    private final static Logger logger = Logger.getLogger(SleepyBatchlet.class.getName());

    private Map<ReportMetaData, byte[]> pdfMetadataMap;

    /**
     * Logging helper.
     */
    protected static void log(String method, Object msg) {
        System.out.println("SleepyBatchlet: " + method + ": " + msg);
        // logger.info("SleepyBatchlet: " + method + ": " + String.valueOf(msg));
    }

    /**
     * This flag gets set if the batchlet is stopped. This will break the batchlet
     * out of its sleepy loop.
     */
    private volatile boolean stopRequested = false;

    /**
     * The total sleep time, in seconds.
     */
    @Inject
    @BatchProperty(name = "sleep.time.seconds")
    String sleepTimeSecondsProperty;

    private int sleepTime_s = 3;

    @Inject
    private JschFileUtil jschFileUtil;

    @Override
    public String process() throws Exception {
        log("process", "entry");
        System.out.println("Test");
        return "exitStatus";
    }

    /**
     * Called if the batchlet is stopped by the container.
     */
    @Override
    public void stop() throws Exception {
        log("stop:", "");
        stopRequested = true;
    }
}
I defined the bean in a Java configuration class as well:
@Autowired
private SleepyBatchlet sleepyBatchlet;

@Bean
public Batchlet fooBatchlet() {
    return sleepyBatchlet;
}
But for some reason it's not getting referenced in the JSL. Can someone please suggest what needs to be done to use the bean that is already created?
I am trying to reference it in the JSL, but it fails with a ClassNotFoundException when the job is started using the JobOperator.
This is because you are referring to the class by its name and not its fully qualified name.
I added a sample Spring Boot + JSR-352 application here: github.com/MekalaJ/demo
In your example, you need to update your step definition as follows:
<batchlet ref="com.example.demo.batch.SleepyBatchlet">
    <properties>
        <property name="sleep.time.seconds" value="#{jobParameters['sleep.time.seconds']}" />
    </properties>
</batchlet>
Is it possible to get a list of defined jobs in Spring Batch at runtime without using db? Maybe it's possible to get this metadata from jobRepository bean or some similar object?
It is possible to retrieve the list of all job names using JobExplorer.getJobNames().
You first have to define the jobExplorer bean using JobExplorerFactoryBean:
<bean id="jobExplorer" class="org.springframework.batch.core.explore.support.JobExplorerFactoryBean">
    <property name="dataSource" ref="dataSource"/>
</bean>
and then you can inject this bean when you need it.
To list jobs defined as beans, you can simply let the Spring context inject all beans of type Job into a list, as below:
@Autowired
private List<? extends Job> jobs;
..
// You can then launch a job given its name.
As an alternative strategy to get the list of job names that are configured as beans, one can use ListableJobLocator:
@Autowired
ListableJobLocator jobLocator;
....
jobLocator.getJobNames();
This does not require a job repository.
I use this code to list and execute jobs:
private String jobName = "";
private JobLauncher jobLauncher = null;
private String selectedJob;
private String statusJob = "Exit Status : ";
private Job job;
ApplicationContext context;
private String[] lstJobs;

/**
 * Constructor: collect the names of all Job beans and look up the launcher.
 */
public ExecuteJobBean() {
    this.context = ApplicationContextProvider.getApplicationContext();
    this.lstJobs = context.getBeanNamesForType(Job.class);
    if (jobLauncher == null)
        jobLauncher = (JobLauncher) context.getBean("jobLauncher");
}

/**
 * Execute the selected job.
 */
public void executeJob() {
    setJob((Job) context.getBean(this.selectedJob));
    try {
        statusJob = "Exit Status : ";
        JobParameters jobParameters = new JobParametersBuilder().addLong("time", System.currentTimeMillis()).toJobParameters();
        JobExecution execution = jobLauncher.run(getJob(), jobParameters);
        this.statusJob = execution.getStatus() + ", ";
    } catch (Exception e) {
        e.printStackTrace();
        this.statusJob = "Error, " + e.getMessage();
    }
    this.statusJob += " Done!!";
}
I want to handle multithreading in a Spring MVC application. I have written this code:
@RequestMapping("/indialCall")
@ResponseBody
public String indialCall(HttpServletRequest request) {
    String result = "FAIL";
    try {
        Map<String, String> paramList = commonUtilities.getParamList(request);
        logger.info("indialCall paramList :::" + paramList);
        result = inDialHandler.processIndialWork(paramList);
        logger.info(result);
    } catch (Exception e) {
        logger.info("Error :" + e);
    }
    return result;
}
public String processIndialWork(final Map<String, String> paramList) {
    final Boolean sendSms = Boolean.parseBoolean(paramList.get(constantService.getSendSms()));
    // assume it is always true
    if (true) {
        Thread thread = new Thread(new Runnable() {
            @Override
            public void run() {
                String sessionId = paramList.get(constantService.getInDialSession());
                String msisdn = paramList.get(constantService.getInDialMsisdn());
                // This method will save the entry into the database
                saveMissedCall(callStartDate, sessionId, msisdn, vmnNo, advId, enterpriseId, sendSms, advertiser);
            }
        });
        thread.start();
    }
    return "1";
}
In this code I am creating a new thread on every HTTP request, which is not good in my case because the system gets 50 requests/sec, and when I look at the CPU usage it is too high. I am starting this thread for async processing so that the calling party gets a response instantly and the application does the further processing later. I want to use an ExecutorService but do not know how. Can someone guide me or write a few lines of code to implement the correct thread pool executor?
First define a simple taskExecutor in your config file:
<bean id="taskExecutor"
      class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
    <property name="corePoolSize" value="5" />
    <property name="maxPoolSize" value="10" />
    <property name="waitForTasksToCompleteOnShutdown" value="true" />
</bean>
Create a Spring bean with prototype scope (prototype is important, as you want to give each thread different data); the tasks will run simultaneously.
This bean implements Runnable, with a run method and a class-level paramList field for passing values:
public class MyRunnableBean implements Runnable {
    private Map<String, String> paramList;
    // add setter

    public void run() {
        // your logic
    }
}
Inject the task executor (a singleton) into your existing bean, get an instance of this runnable bean, set its paramList and hand it to the executor:
MyRunnableBean myRunnableBean = (MyRunnableBean) applicationContext.getBean("myRunnable");
myRunnableBean.setParamList(/* your paramList */);
taskExecutor.execute(myRunnableBean);
I have a service bean which is accessible by the identifier someSpecificService and which I need to modify.
Beans are defined in different XML files and are collected together at runtime, so one big XML file is created into which all these XMLs are imported:
context.xml
....
<import resource="spring1.xml" />
<import resource="spring2.xml" />
...
So there is following configuration:
<!-- definitions from spring1.xml -->
<alias name="defaultSomeSpecificService" alias="someSpecificService" />
<bean id="defaultSomeSpecificService" class="..."/>
....

<!-- definitions from spring2.xml -->
<alias name="myOwnSomeSpecificService" alias="someSpecificService" />
<bean id="myOwnSomeSpecificService" class="..." /> <!-- how to inject previously defined someSpecificService into this new bean? -->
I would like to override someSpecificService from spring1.xml in spring2.xml; however, I need to inject the previously defined bean defaultSomeSpecificService, and all I know is its alias name someSpecificService, which I need to redefine to point at the new bean myOwnSomeSpecificService.
Is it possible to implement?
One solution would be to avoid trying to override the definition, by creating a proxy for the service implementation to intercept all calls towards it.
1) For the sake of the example, suppose the service would be something like:
public interface Service {
    public String run();
}

public class ExistingServiceImpl implements Service {
    @Override
    public String run() {
        throw new IllegalStateException("Muahahahaha!");
    }
}
2) Implement an interceptor instead of myOwnSomeSpecificService:
import org.aopalliance.intercept.MethodInterceptor;
import org.aopalliance.intercept.MethodInvocation;
public class SomeSpecificServiceInterceptor implements MethodInterceptor {
    @Override
    public Object invoke(MethodInvocation invocation) throws Throwable {
        String status;
        try {
            // allow the original invocation to actually execute
            status = String.valueOf(invocation.proceed());
        } catch (IllegalStateException e) {
            System.out.println("Existing service threw the following exception [" + e.getMessage() + "]");
            status = "FAIL";
        }
        return status;
    }
}
3) In spring2.xml define the proxy creator and the interceptor:
<bean id="serviceInterceptor" class="com.nsn.SomeSpecificServiceInterceptor" />

<bean id="proxyCreator" class="org.springframework.aop.framework.autoproxy.BeanNameAutoProxyCreator">
    <property name="beanNames" value="someSpecificService"/>
    <property name="interceptorNames">
        <list>
            <value>serviceInterceptor</value>
        </list>
    </property>
</bean>
4) Running a small example such as:
public class Main {
    public static void main(String[] args) {
        Service service = new ClassPathXmlApplicationContext("context.xml").getBean("someSpecificService", Service.class);
        System.out.println("Service execution status [" + service.run() + "]");
    }
}
... instead of the IllegalStateException stacktrace you'd normally expect, it will print:
Existing service threw the following exception [Muahahahaha!]
Service execution status [FAIL]
Please note that in this example the service instance is not injected into the interceptor as you asked, because I had no use for it. However, should you really need it, you can easily inject it via constructor/property/etc., because the interceptor is a Spring bean itself.
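If you want to see the interception mechanism itself without a Spring context, the same behavior can be sketched with a plain JDK dynamic proxy (Service and ExistingServiceImpl are repeated from step 1 to keep the sketch self-contained; this mirrors, but is not literally, what BeanNameAutoProxyCreator builds):

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Proxy;

interface Service {
    String run();
}

class ExistingServiceImpl implements Service {
    @Override
    public String run() {
        throw new IllegalStateException("Muahahahaha!");
    }
}

public class ProxyDemo {
    // Wrap a target Service so exceptions are translated into a status string,
    // the same idea as the MethodInterceptor in the answer above.
    static Service intercepted(Service target) {
        InvocationHandler handler = (proxy, method, args) -> {
            try {
                return method.invoke(target, args);
            } catch (InvocationTargetException e) {
                System.out.println("Existing service threw the following exception ["
                        + e.getCause().getMessage() + "]");
                return "FAIL";
            }
        };
        return (Service) Proxy.newProxyInstance(
                Service.class.getClassLoader(), new Class<?>[]{Service.class}, handler);
    }

    public static void main(String[] args) {
        Service service = intercepted(new ExistingServiceImpl());
        System.out.println("Service execution status [" + service.run() + "]");
        // prints: Service execution status [FAIL]
    }
}
```

The Spring version does the same thing declaratively, matching beans by name and chaining the named interceptors around them.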
I use a properties file in the Spring Framework.
root-context.xml
<context:property-placeholder location="classpath:config.properties" />
<util:properties id="config" location="classpath:config.properties" />
Java code:
@Value("#{config[ebookUseYN]}")
String EBOOKUSEYN;
When using a URL call (@RequestMapping(value="/recommendbooks", method=RequestMethod.GET, produces="application/json;charset=UTF-8")), this works!
But when I use a direct method call:
public void executeInternal(JobExecutionContext arg0) throws JobExecutionException {
    IndexManageController indexManage = new IndexManageController();
    CommonSearchDTO commonSearchDTO = new CommonSearchDTO();
    try {
        if ("Y".equals(EBOOKUSEYN)) {
            indexManage.deleteLuceneDocEbook();
            indexManage.initialBatchEbook(null, commonSearchDTO);
        }
        indexManage.deleteLuceneDoc(); // <= this point
        indexManage.deleteLuceneDocFacet();
        indexManage.initialBatch(null, commonSearchDTO);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
When the method at 'this point' is called, control passes to the controller, but the properties-file field is not read:
@Value("#{config[IndexBasePath]}")
String IndexBasePath;

@RequestMapping(value="/deleteLuceneDoc", method=RequestMethod.GET, produces="application/json;charset=UTF-8")
public @ResponseBody ResultCodeMessageDTO deleteLuceneDoc() throws Exception {
    long startTime = System.currentTimeMillis();
    ResultCodeMessageDTO result = new ResultCodeMessageDTO();
    System.out.println(IndexBasePath);
    return result;
}
It doesn't read IndexBasePath
In your code you are creating a new instance of IndexManageController; Spring doesn't know about this instance, and as such it will never be processed:
public void executeInternal(JobExecutionContext arg0) throws JobExecutionException {
    IndexManageController indexManage = new IndexManageController();
Instead of creating a new instance, inject the dependency on IndexManageController so that the pre-configured instance constructed and managed by Spring is used (and remove the line which constructs a new instance of that class):
public class MyJob {
    @Autowired
    private IndexManageController indexManage;
}
Your configuration is also loading the properties twice:
<context:property-placeholder location="classpath:config.properties" />
<util:properties id="config" location="classpath:config.properties" />
Both load the config.properties file. Simply wire the config bean into the property-placeholder element:
<context:property-placeholder properties-ref="config"/>
<util:properties id="config" location="classpath:config.properties" />
This saves you loading the file twice and saves you another bean.