How do I call retrieveStudents() from my cron job? - java

I'm new to Java development and Spring/Spring Boot. I was tasked with creating a cron job that executes a query against a database. My problem, though, is how to call the function, since it already exists in the project. I'm still learning about dependency injection, design patterns and Spring, and I'm quite confused. I tried to resolve this myself, but it's taking a while, so I figured I'd ask while I keep working on it, to save time in case they ask me for a deadline. Thank you so much in advance!
This is how the programs are generally structured:
QueryConfig.java
This is the only new file I created. I was able to get the cron job working: I put a logger inside runQuery() and it ran every 5 minutes as per the configuration file.
@RefreshScope
@Component
public class QueryConfig {

    @Value("${cron.job.query}")
    private String sql;

    String name = "Bob";

    StudentMgtApiDelegate delegate = new StudentMgtApiDelegateImpl();

    @Scheduled(cron = "${cron.job.schedule}", zone = "${cron.job.timezone}")
    public void runQuery() {
        delegate.retrieveStudents(name);
    }
}
StudentMgtApiDelegateImpl.java
Please also note that this is just a representation of the code, since I cannot share the actual implementation; I'll try my best to make it as close to the real thing as possible. There are 3 methods for the API, but I just want to call retrieveStudents().
@Component
public class StudentMgtApiDelegateImpl implements StudentMgtApiDelegate {

    @Autowired
    private StudentFacade studentFacade;

    @Override
    public ResponseEntity<List<Student>> retrieveStudents(String name) {
        return ResponseEntity.ok(studentFacade.retrieveStudents(
            ...
        ));
    }

    @Override
    public ResponseEntity<StudentDetails> retrieveStudentDetails(String name...) {
        return ResponseEntity.ok(studentFacade.retrieveStudentDetails(
            ...
        ));
    }

    @Override
    public ResponseEntity<List<CountBreakdown>> retrieveStudentCounts(String name) {
        return ResponseEntity.ok(studentFacade.studentCountsRetrieve(
            ...
        ));
    }
}
StudentFacade.java
public class StudentFacade {

    private Function<DataWrapper<StudentParams<String>>, List<Student>> studentListRetrieveFn;
    private Function<DataWrapper<StudentParams<String>>, StudentDetails> studentDetailsRetrieveFn;
    private Function<DataWrapper<StudentRetrieveCriteria>, List<CountBreakdown>> studentCountsRetrieveFn;

    public StudentFacade(
            Function<DataWrapper<StudentParams<String>>, List<Student>> studentListRetrieveFn,
            Function<DataWrapper<StudentParams<String>>, StudentDetails> studentDetailsRetrieveFn,
            Function<DataWrapper<StudentRetrieveCriteria>, List<CountBreakdown>> studentCountsRetrieveFn) {
        this.studentListRetrieveFn = studentListRetrieveFn;
        this.studentDetailsRetrieveFn = studentDetailsRetrieveFn;
        this.studentCountsRetrieveFn = studentCountsRetrieveFn;
    }

    public List<Student> retrieveStudents(DataWrapper<StudentParams<String>> wrapper) {
        return Optional.ofNullable(wrapper).map(studentListRetrieveFn).orElse(null);
    }

    public StudentDetails retrieveStudentDetails(DataWrapper<StudentParams<String>> wrapper) {
        return Optional.ofNullable(wrapper).map(studentDetailsRetrieveFn).orElse(null);
    }

    public List<CountBreakdown> studentCountsRetrieve(DataWrapper<StudentRetrieveCriteria> wrapper) {
        return Optional.ofNullable(wrapper).map(studentCountsRetrieveFn).orElse(null);
    }
}
I apologize in advance for the many code omissions, and I know some parameters won't match or make sense. But with the current implementation in my QueryConfig.java, I am encountering this error:
[scheduling-1] ERROR o.s.s.s.TaskUtils$LoggingErrorHandler - Unexpected error occurred in scheduled task
java.lang.NullPointerException: null
I tried to debug and inspect the value of delegate inside QueryConfig.java, and its studentFacade field is null.

Autowire the StudentMgtApiDelegate; do not construct it manually with new. An instance created with new is not managed by Spring, so its @Autowired studentFacade is never injected and stays null.
@Autowired
StudentMgtApiDelegate delegate;
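For illustration, the scheduled class could look roughly like this with constructor injection. This is only a sketch based on the code shown above; the imports assume Spring Cloud for @RefreshScope.

import org.springframework.beans.factory.annotation.Value;
import org.springframework.cloud.context.config.annotation.RefreshScope;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@RefreshScope
@Component
public class QueryConfig {

    @Value("${cron.job.query}")
    private String sql;

    private final String name = "Bob";

    // Spring injects the StudentMgtApiDelegateImpl bean here,
    // with its own StudentFacade dependency already wired.
    private final StudentMgtApiDelegate delegate;

    public QueryConfig(StudentMgtApiDelegate delegate) {
        this.delegate = delegate;
    }

    @Scheduled(cron = "${cron.job.schedule}", zone = "${cron.job.timezone}")
    public void runQuery() {
        delegate.retrieveStudents(name);
    }
}

Remember that @Scheduled also needs @EnableScheduling on a configuration class somewhere in the application.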

Related

JOOQ listeners: context data is not cleaned between two queries

In my current project I use Java 11 / jOOQ 3.15 / Micronaut / Micrometer. In order to have relevant SQL metrics, I would like to put a name on my jOOQ queries.
To do that, I have tried to use the ctx.data() field combined with a custom ExecuteListener.
Let's take a really simplified listener:
@Singleton
public class JooqListener extends DefaultExecuteListener {

    transient StopWatch watch;

    private final MeterRegistry meterRegistry;

    public JooqListener(MeterRegistry meterRegistry) {
        this.meterRegistry = meterRegistry;
    }

    @Override
    public void executeStart(ExecuteContext ctx) {
        watch = new StopWatch();
    }

    @Override
    public void fetchEnd(ExecuteContext ctx) {
        Tags prometheusTag = Tags.of("queryName", ctx.configuration().data("queryName").toString());
        meterRegistry.timer("sql.query.timer", prometheusTag)
                .record(watch.split(), TimeUnit.NANOSECONDS);
    }

    // I have tried to remove the data manually, but it does not work
    @Override
    public void end(ExecuteContext ctx) {
        ctx.configuration().data().remove("queryName");
    }
}
If I send 2 different queries from two different repositories, like for example:
DSLContext context = DSL.using(jooqConfiguration);
context.data("queryName", "query1");
return context.select(1).from("dual").fetch();
And just after, let say I'm not attentive and I forgot to name my query:
DSLContext context = DSL.using(jooqConfiguration);
return context.select(2).from("dual").fetch();
ctx.configuration().data("queryName") in my listener will always contain "query1", which I didn't expect, because ExecuteListeners listen query by query, and furthermore I have created two different DSLContexts. It looks like ctx.data() cannot be cleaned, only overwritten.
Is this expected behaviour? Is there another object/method I should use that is limited to the query scope? (I searched a lot on Google, but the "data" keyword is a little bit annoying...)
Thank you
A DSLContext just wraps a Configuration. It doesn't have its own lifecycle. So, if you're modifying the Configuration.data() map through DSLContext, you're modifying a globally shared object. In other words, you must not modify Configuration.data() except for when you initialise your configuration for the first time. See this section of the manual for more details.
A better way to do what you intend to do is:
// Create a "derived" configuration, which is a new,
// independent Configuration instance
DSLContext context = DSL.using(jooqConfiguration.derive());
context.data("queryName", "query1");
return context.select(1).from("dual").fetch();
And then, in your ExecuteListener:
@Override
public void fetchEnd(ExecuteContext ctx) {
    // Reading the Configuration.data() is still fine:
    Tags prometheusTag = Tags.of("queryName",
        ctx.configuration().data("queryName").toString());
    meterRegistry.timer("sql.query.timer", prometheusTag)
        .record(watch.split(), TimeUnit.NANOSECONDS);
}

@Override
public void end(ExecuteContext ctx) {
    // But you shouldn't modify it
    ctx.configuration().data().remove("queryName");
}
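As a side note (not part of the original answer), the listener's shared watch field is also racy when several queries run concurrently. One possible sketch is to keep the start time in the ExecuteContext, whose data() map, as I understand it, lives only for that single execution. The @Singleton import depends on the Micronaut version (javax.inject vs jakarta.inject); the rest uses the jOOQ and Micrometer types already shown in the question.

import java.util.concurrent.TimeUnit;

import org.jooq.ExecuteContext;
import org.jooq.impl.DefaultExecuteListener;

import io.micrometer.core.instrument.MeterRegistry;
import io.micrometer.core.instrument.Tags;
import jakarta.inject.Singleton;

@Singleton
public class JooqTimingListener extends DefaultExecuteListener {

    private final MeterRegistry meterRegistry;

    public JooqTimingListener(MeterRegistry meterRegistry) {
        this.meterRegistry = meterRegistry;
    }

    @Override
    public void executeStart(ExecuteContext ctx) {
        // Keep the start time in the per-execution context instead of a shared field
        ctx.data("startNanos", System.nanoTime());
    }

    @Override
    public void fetchEnd(ExecuteContext ctx) {
        Object queryName = ctx.configuration().data("queryName");
        Object startNanos = ctx.data("startNanos");
        if (queryName != null && startNanos != null) {
            meterRegistry.timer("sql.query.timer", Tags.of("queryName", queryName.toString()))
                    .record(System.nanoTime() - (long) startNanos, TimeUnit.NANOSECONDS);
        }
    }
}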

Create bean instance at runtime for interface

I am kind of stuck on a problem with creating beans, or probably I have the wrong idea. Maybe you can help me solve it:
I have an application which takes in requests for batch processing. For every batch I need to create its own context, depending on the parameters issued by the request.
I will try to simplify it with the following example:
I receive a request to process, in a batch, FunctionA, which is an implementation of my Function_I interface and has the sub-implementations FunctionA_DE and FunctionA_AT.
Something like this:
public interface Function_I {
    String doFunctionStuff();
}

public abstract class FunctionA implements Function_I {

    FunctionConfig funcConfig;

    public FunctionA(FunctionConfig funcConfig) {
        this.funcConfig = funcConfig;
    }

    public String doFunctionStuff() {
        // some code
        String result = callSpecificFunctionStuff();
        // more code
        return result;
    }

    protected abstract String callSpecificFunctionStuff();
}

public class FunctionA_DE extends FunctionA {

    public FunctionA_DE(FunctionConfig funcConf) {
        super(funcConf);
    }

    @Override
    protected String callSpecificFunctionStuff() {
        // do some specific stuff
        return result;
    }
}

public class FunctionA_AT extends FunctionA {

    public FunctionA_AT(FunctionConfig funcConf) {
        super(funcConf);
    }

    @Override
    protected String callSpecificFunctionStuff() {
        // do some specific stuff
        return result;
    }
}
What would be the Spring Boot way of creating an instance of FunctionA_DE and getting it as a Function_I for the calling part of the application? And what should it look like when I add FunctionB with FunctionB_DE / FunctionB_AT to my classes?
I thought it could be something like this (pseudo code):
@Configuration
public class FunctionFactory {

    @Bean
    @Scope(SCOPE_PROTOTYPE) // I need a new instance every time I call it
    public Function_I createFunctionA(FunctionConfiguration funcConfig) {
        // create the Function depending on funcConfig, so either FunctionA_DE or FunctionA_AT
    }
}
and I would call it by autowiring the FunctionFactory into my calling class and using it with
someSpringFactory.createFunction(functionConfiguration);
But I can't figure out how to create a prototype bean for the function while passing a parameter, and I can't really find a solution to my question by browsing through SO; maybe I just have the wrong search terms. Or maybe my approach to this issue is totally wrong (maybe stupid), and nobody would solve it the Spring Boot way but would stick to factories instead.
Appreciate your help!
You could use Spring's application context. Create a bean for each of the implementations, but annotate each with a specific profile, e.g. "Function-A-AT". Now when you have to invoke it, you can simply set Spring's active profile accordingly and the right bean should be used by Spring.
Hello everyone and thanks for reading my question.
After a discussion with a friend who is well versed in the Spring framework, I came to the conclusion that my approach, or my favoured solution, was not what I was searching for and is not how Spring should be used. Because the Function_I instance depends on the configuration loaded for the specific batch, it is not recommended to manage all these instances as @Beans.
In the end I decided not to manage the instances of my Function_I with Spring. Instead, I built a controller/factory, which is a @Controller class, and let this class build the instance I need, with the passed parameters used for decision-making at runtime.
This is how it looks (pseudo-code):
@Controller
public class FunctionController {

    private final SomeSpringManagedClass ssmc;

    public FunctionController(@Autowired SomeSpringManagedClass ssmc) {
        this.ssmc = ssmc;
    }

    public Function_I createFunction(FunctionConfiguration funcConf) {
        boolean funcA, cntryDE;
        // code to decide the function
        if (funcA && cntryDE) {
            return new FunctionA_DE(funcConf);
        } else if (funcB && cntryDE) {
            return new FunctionB_DE(funcConf);
        } // maybe more else if...
    }
}
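For reference, the prototype-bean-with-a-parameter idea from the question is also possible in plain Spring: a @Bean method's arguments can be supplied at lookup time through ObjectProvider (or BeanFactory.getBean(type, args)). Below is a rough sketch under those assumptions, reusing the class names from the question; FunctionConfig is assumed to expose something like getCountry(), so adapt the decision logic to the real configuration.

import org.springframework.beans.factory.ObjectProvider;
import org.springframework.beans.factory.config.ConfigurableBeanFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Scope;
import org.springframework.stereotype.Service;

@Configuration
public class FunctionFactoryConfig {

    @Bean
    @Scope(ConfigurableBeanFactory.SCOPE_PROTOTYPE)
    public Function_I functionA(FunctionConfig funcConfig) {
        // decide on the concrete implementation from the config
        // (getCountry() is an assumed accessor, purely for illustration)
        return "DE".equals(funcConfig.getCountry())
                ? new FunctionA_DE(funcConfig)
                : new FunctionA_AT(funcConfig);
    }
}

@Service
class BatchService {

    private final ObjectProvider<Function_I> functionProvider;

    BatchService(ObjectProvider<Function_I> functionProvider) {
        this.functionProvider = functionProvider;
    }

    String runBatch(FunctionConfig funcConfig) {
        // Each call creates a fresh prototype instance; the argument is passed
        // through to the @Bean factory method's parameter.
        Function_I function = functionProvider.getObject(funcConfig);
        return function.doFunctionStuff();
    }
}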

After a while, production spring boot app throws NoSuchBeanDefinitionException

We have a moderately complicated Spring Boot 1.5.14 app with a REST API + MyBatis backend and an Angular 4 frontend using Material/PrimeNG. It works fine from the developers' boxes all the way up to the UAT environment, but in production it works fine for the first couple of days and then throws NoSuchBeanDefinitionException. The production environment is OpenShift + OpenJDK 1.8.0_171.
To trim the app down to the relevant parts, here are some code snippets:
public interface ITaxCalculator {
    BigDecimal calc(BigDecimal amount);
}

public class FedProvTaxCalculator implements ITaxCalculator {
    ... ...
}

@Configuration
public class TaxCalculatorConfiguration {
    ...
    @Bean("onTaxCalculator")
    public ITaxCalculator ontairioTaxCalculator() {
        FedProvTaxCalculator ret = ..
        ...
        return ret;
    }

    @Bean("bcTaxCalculator")
    public ITaxCalculator britishColumbiaTaxCalculator() {
        FedProvTaxCalculator ret = ..
        ...
        return ret;
    }
}

public class CAOrderProcessor implements IOrderProcessor {

    @Autowired @Qualifier("onTaxCalculator")
    private FedProvTaxCalculator onTaxCalculator;

    @Autowired @Qualifier("bcTaxCalculator")
    private FedProvTaxCalculator bcTaxCalculator;
    ....
}
// --------------- the code below is at the framework level -----
public interface IOrderProcessor {
    void process(Order order);
}

public interface IOrderProcessorFactory {
    IOrderProcessor createOrderProcessor(String countryCode, MembershipType membership);
}

@Service
public class OrderProcessorFactoryPropImpl implements IOrderProcessorFactory {

    @Autowired
    private AutowireCapableBeanFactory beanFactory;

    @Override
    @Cacheable("orderProcessor")
    public IOrderProcessor createOrderProcessor(String countryCode, MembershipType membership) {
        String clzName = resolveOrderProcessClzName(countryCode, membership); // resolves to the CAOrderProcessor class name
        try {
            Object ret = Class.forName(clzName).newInstance();
            beanFactory.autowireBean(ret);
            // the above line throws the error after a while
            return (IOrderProcessor) ret;
        } catch (Exception ex) {
            ...
            throw new RuntimeException(...);
        }
    }

    private String resolveOrderProcessClzName(String countryCode, MembershipType membership) {
        String clzName = lookupFromPropFile(countryCode + "." + membership.name());
        if (StringUtils.isBlank(clzName)) {
            clzName = lookupFromPropFile(countryCode);
        }
        return clzName;
    }
}
After restarting the Spring Boot app, it works fine for the first couple of days, even with CA=CAOrderProcessor. But then one day, with countryCode=CA, it throws NoSuchBeanDefinitionException: No qualifying bean of type 'FedProvTaxCalculator' available: expected at least 1 bean which qualifies as autowire candidate. After restarting the Java app, it works again for CA=CAOrderProcessor.
Why does the Spring framework behave this way? Thanks in advance!
The issue can be solved by typing the injection points to the interface rather than the concrete class:
@Configuration
public class TaxCalculatorConfiguration {

    @Bean("onTaxCalculator")
    public ITaxCalculator ontairioTaxCalculator() { ... }
}

public class CAOrderProcessor implements IOrderProcessor {

    @Autowired @Qualifier("onTaxCalculator")
    private ITaxCalculator onTaxCalculator;
}
Using AutowireCapableBeanFactory is fine. But why does it work initially, then fail, and only fail in one environment (OpenShift with a minimum of 2 pods), while the other environments always work? It looks like Spring relaxes the autowire bean-type check initially and then, under certain conditions, enforces it. A logical guess is that the bean definition returns the interface type, which may be proxied; the bean wiring refers to the concrete type; the proxied interface type does not equal the concrete type, raising this error. But in that case it should always give the error. If it didn't, then by not using the cache, or by evicting the cache, I should be able to easily reproduce it in any environment, yet it works fine on my local macOS + Oracle JDK 1.8. I even created a Docker container based on the production OpenShift image to run the app without the cache, evicted the cache, and forced YGC and FGC, and it still works fine.
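A small standalone illustration of that guess (not from the original post; it only demonstrates the type mismatch a JDK dynamic proxy introduces, and assumes FedProvTaxCalculator has a no-arg constructor):

import java.lang.reflect.Proxy;

public class ProxyTypeDemo {

    public static void main(String[] args) {
        ITaxCalculator target = new FedProvTaxCalculator();

        // A JDK dynamic proxy only implements the interface, not the concrete class
        ITaxCalculator proxy = (ITaxCalculator) Proxy.newProxyInstance(
                target.getClass().getClassLoader(),
                new Class<?>[] { ITaxCalculator.class },
                (p, method, methodArgs) -> method.invoke(target, methodArgs));

        System.out.println(proxy instanceof ITaxCalculator);       // true
        System.out.println(proxy instanceof FedProvTaxCalculator); // false
        // So an injection point typed FedProvTaxCalculator cannot be satisfied
        // once the bean is wrapped in such a proxy.
    }
}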
I don't know why it behaves like that, probably because you use AutowireCapableBeanFactory directly, and even worse, in combination with @Cacheable.
You should reconsider your framework-level code. I believe you should never use AutowireCapableBeanFactory directly, and especially not in your case. It's simpler, and you can achieve the same result with less effort, using a plain map of country_code + membership_type -> processor, for example:
@Configuration
public class ProcessorConfiguration {
    . . .
    @Bean("cAOrderProcessor")
    public IOrderProcessor cAOrderProcessor() {
        return new CAOrderProcessor();
    }
    . . .
    @Bean
    public IOrderProcessorFactory processorFactory() {
        // create the country_code + membership_type -> processor map
        Map<ProcessorKey, IOrderProcessor> processorMap = new HashMap<>();
        // not sure about the values in MembershipType, so I put SOME just for the example;
        // this map can also be a bean if you're going to need it in other parts of the app
        processorMap.put(new ProcessorKey("CA", MembershipType.SOME), cAOrderProcessor());
        // pass it to the factory
        return new OrderProcessorFactoryPropImpl(processorMap);
    }
    . . .
}
public class OrderProcessorFactoryPropImpl implements IOrderProcessorFactory {

    private final Map<ProcessorKey, IOrderProcessor> processorMap;

    public OrderProcessorFactoryPropImpl(Map<ProcessorKey, IOrderProcessor> processorMap) {
        this.processorMap = processorMap;
    }

    @Override
    // @Cacheable("orderProcessor") is not needed, because getting it from the map costs nothing
    // (also renamed the method to "get" instead of "create")
    public IOrderProcessor getOrderProcessor(String countryCode, MembershipType membership) {
        // just get the processor by key
        return processorMap.get(constructKey(countryCode, membership));
    }

    private ProcessorKey constructKey(String countryCode, MembershipType membership) {
        return new ProcessorKey(countryCode, membership);
    }
}
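A caller would then simply inject the factory. The following sketch is for completeness and is not part of the original answer; the getCountryCode()/getMembershipType() accessors on Order are assumed.

import org.springframework.stereotype.Service;

@Service
public class OrderService {

    private final IOrderProcessorFactory processorFactory;

    public OrderService(IOrderProcessorFactory processorFactory) {
        this.processorFactory = processorFactory;
    }

    public void placeOrder(Order order) {
        // look up the processor for the order's country/membership and delegate
        IOrderProcessor processor = processorFactory.getOrderProcessor(
                order.getCountryCode(), order.getMembershipType());
        processor.process(order);
    }
}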
Also, I noticed that you mix Java-based and annotation-based bean configuration, which is considered bad practice. Hope this is going to help.
Update 1 - answering the comment
Well, to figure out what's wrong, a person would need a full copy of your app, plus debugging and logs, and would have to reproduce its usual use cases. It's probably not possible to say what's wrong just by looking at the examples you've provided (at least not for me).
I just pointed out that the way you are using AutowireCapableBeanFactory does not follow best practices, and that's why you have problems at runtime.
So you probably have 2 options:
1. Get rid of it and use a somewhat different approach (maybe similar to the one I suggested previously). I believe that's the only good option, but it's up to you to decide.
2. Enable Spring's debug logs and hope that you catch the problem there. You probably need to enable them like this in your log4j.xml (I suppose it's log4j, but it might be something else):
<category name="org.springframework.beans">
    <priority value="debug" />
</category>

(Spring) Can I use dependency injection inside a loop?

I am currently working on a Spring Boot application that allows users to save categories into a database. I can get my code to "work"; however, I think it limits the amount of testing I can do, hence my question.
The controller receives a list of categories, iterates over them, validates them and, depending on whether they are valid, saves them to a database. The controller finally returns a list of messages, so that the recipient can identify which categories have been accepted or rejected.
I have a list of model messages (List<ModelMessage>); on each iteration the controller instantiates a new model message (new ModelMessage()) and eventually adds it to the list. Is there a way to inject a new ModelMessage on each iteration, or do I need to use the new keyword? If I do use the new keyword, I feel like this limits my testability and tightly couples my controller to the model message.
The controller:
@PostMapping("/category")
public String saveCategoryModelToDatabase(@RequestBody CategoryModelWrapper categoryModelWrapper) {
    List<CategoryModel> categoryModelList = categoryModelWrapper.getCategoryModelList();
    modelMessageList.clear();
    for (CategoryModel categoryModel : categoryModelList) {
        // Resetting the model
        modelMessage = new ModelMessage(); // This tightly couples my method to the ModelMessage class, which is bad for testing?
        //@Autowired modelMessage; <-- something like this? Inject a new ModelMessage with each iteration.
        modelMessage.setName(categoryModel.getName());
        // Resetting categoryModelErrors
        Errors categoryModelErrors = new BeanPropertyBindingResult(categoryModel, "categoryModel");
        categoryModelValidator.validate(categoryModel, categoryModelErrors);
        if (categoryModelErrors.hasErrors()) {
            modelMessage.setStatus(ModelMessageStatusEnum.REJECTED);
            modelMessage.setReason(MODEL_MESSAGE_0004);
        }
        if (categoryModelService.save(categoryModel)) {
            modelMessage.setStatus(ModelMessageStatusEnum.ACCEPTED);
        } else {
            modelMessage.setStatus(ModelMessageStatusEnum.REJECTED);
            modelMessage.setReason(MODEL_MESSAGE_0005);
        }
        modelMessageList.add(modelMessage);
    }
    return gson.toJson(modelMessageList);
}
An example of the response to the recipient:
[{"name":"Arts","status":"ACCEPTED"},{"name":"Business","status":"ACCEPTED"},{"name":"Gaming","status":"ACCEPTED"},{"name":"Deals","status":"REJECTED","reason":"Category rejected because of an unexpected exception, i.e. possibly due to duplicate keys."}]
Thanks for any help :)
You could use the ApplicationContext, assuming you have access to it, as a factory for ModelMessage. But is that really necessary?
I think you can just create new ModelMessages in your controller; it's only a data object, not a service bean.
A JUnit test can check the result of the method.
But if you really want to use Spring, I would look at FactoryBean. Example:
Example:
public class ModelMessage {

    String name;

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }
}

@Component
public class ModelMessageFactory implements FactoryBean<ModelMessage> {

    @Override
    public ModelMessage getObject() throws Exception {
        return new ModelMessage();
    }

    @Override
    public Class<?> getObjectType() {
        return ModelMessage.class;
    }
}

@RunWith(SpringRunner.class)
@SpringBootTest
@SpringJUnitConfig
public class ModelMessageFactoryTest {

    @Autowired
    private ModelMessageFactory messageFactory;

    @Test
    public void testGetObject() throws Exception {
        assertNotNull("Factory is null", messageFactory);
        ModelMessage modelMessage1 = messageFactory.getObject();
        ModelMessage modelMessage2 = messageFactory.getObject();
        assertNotEquals("error: objects are equal", System.identityHashCode(modelMessage1),
                System.identityHashCode(modelMessage2));
    }

    @Test
    public void testGetObjectType() throws Exception {
        assertEquals(ModelMessage.class, messageFactory.getObjectType());
    }
}
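Inside the controller, the factory could then replace the new call in the loop. A rough sketch of that part (the newMessageFor helper is hypothetical, just to show the call; note that FactoryBean.getObject() declares throws Exception):

// Injected alongside the other controller dependencies
@Autowired
private ModelMessageFactory messageFactory;

private ModelMessage newMessageFor(CategoryModel categoryModel) throws Exception {
    // a fresh, Spring-produced instance per category
    ModelMessage modelMessage = messageFactory.getObject();
    modelMessage.setName(categoryModel.getName());
    return modelMessage;
}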

How to record the status of steps in a process?

I have a web application that takes input from a user and uses it to generate a report based on the results of calling various external web services.
I want to track the progress of the report generation, being able to see the status, start time and stop time of each step.
I've added the domain objects Job and JobStep:
@Entity
@Table(name="jobs")
@Data
@EqualsAndHashCode(callSuper=false, of={ "id" })
@ToString()
public class Job extends DomainObject {

    @NotNull
    @OneToMany(cascade=CascadeType.ALL)
    @JoinColumn(name="job_id")
    private Set<JobStep> steps = new TreeSet<JobStep>();

    public Job() {
        // The no-arg constructor also satisfies the Hibernate requirement.
        // Create all the steps in the beginning with the default settings:
        // status=waiting, both date_times null.
        for (JobStep.Type stepType : JobStep.Type.values()) {
            JobStep step = new JobStep(stepType);
            steps.add(step);
        }
    }

    public Set<JobStep> getSteps() {
        return steps;
    }

    public void startStep(JobStep.Type stepType) {
        for (JobStep step : steps) {
            if (step.getType() == stepType) {
                step.start();
                return;
            }
        }
    }

    public void stopStep(JobStep.Type stepType, JobStep.Status status) {
        for (JobStep step : steps) {
            if (step.getType() == stepType) {
                step.stop(status);
                return;
            }
        }
    }
}
@Entity
@Table(name="job_steps")
@Data
@EqualsAndHashCode(callSuper=false, of={ "type", "job" })
@ToString
public class JobStep extends DomainObject implements Comparable<JobStep> {

    private static final Logger LOG = LoggerFactory.getLogger(JobStep.class);

    public enum Type {
        TEST_STEP1,
        TEST_STEP2,
        TEST_STEP3
    }

    public enum Status {
        WAITING,
        RUNNING,
        FINISHED,
        ERROR
    }

    @NotNull
    @Getter
    @Enumerated(EnumType.STRING)
    private Type type;

    @NotNull
    @Setter(AccessLevel.NONE)
    @Enumerated(EnumType.STRING)
    private Status status = Status.WAITING;

    @Setter(AccessLevel.NONE)
    private DateTime start = null;

    @Setter(AccessLevel.NONE)
    private DateTime stop = null;

    @ManyToOne
    private Job job;

    protected JobStep() {/* Hibernate requirement */}

    public JobStep(Type type) {
        this.type = type;
    }

    public void start() {
        assert(status == Status.WAITING);
        status = Status.RUNNING;
        start = new DateTime();
    }

    public void stop(Status newStatus) {
        assert(newStatus == Status.FINISHED ||
               newStatus == Status.ERROR);
        assert(status == Status.RUNNING);
        status = newStatus;
        stop = new DateTime();
    }

    @Override
    public int compareTo(final JobStep o) {
        return getType().compareTo(o.getType());
    }
}
These are manipulated using the JobService class:
@Service
public class JobService {

    private static final Logger LOG = LoggerFactory.getLogger(JobService.class);

    @Autowired
    private JobDAO jobDao;

    @Transactional
    public void createJob() {
        Job job = new Job();
        Long id = jobDao.create(job);
        LOG.info("Created job: {}", id);
    }

    @Transactional
    public Job getJob(Long id) {
        return jobDao.get(id);
    }

    @Transactional
    public void startJobStep(Job job, JobStep.Type stepType) {
        LOG.debug("Starting JobStep '{}' for Job {}", stepType, job.getId());
        job.startStep(stepType);
    }

    @Transactional
    public void stopJobStep(Job job, JobStep.Type stepType, JobStep.Status status) {
        LOG.debug("Stopping JobStep '{}' for Job {} with status {}", stepType, job.getId(), status);
        job.stopStep(stepType, status);
    }
}
So in a method that starts a step, I can write:
class Foo {

    @Autowired
    JobService jobService;

    public void methodThatStartsAStep(Job job) {
        jobService.startJobStep(job, JobStep.Type.TEST_STEP1);
        // Implementation here
    }
}
The problem I'm having is finding a way to give the Job instance to the method that needs it in order to record that the step has started.
The obvious solution is to pass the Job as a parameter (as above), but it doesn't always make sense to pass a Job around; it's only done to record the step (extreme example below):
public int multiplySomeNumbers(Job job, int num1, int num2) {
    jobService.startJobStep(job, JobStep.Type.TEST_STEP1);
    // Implementation here.
}
I have two thoughts on an ideal solution:
1. Use an aspect and annotate the methods that can cause a change in the job step state. This makes things less coupled, but the aspect would still need to get the Job from somewhere.
2. Store the Job object or its id in a global-like scope (e.g. a session or context). I tried using @Scope("session") on my JobService with the intention of storing the Job instance there, but I kept getting java.lang.IllegalStateException: No thread-bound request found. I'm not even sure if this is the right use case for such a solution.
My questions are:
1. Is it possible to store the Job or its id somewhere so that I don't have to add the Job as a parameter to every method?
2. Is there another way of doing this that I'm not aware of?
Re: question 2, I'm going to go out on a limb and take the widest possible definition of that question.
You seem to be reimplementing Spring Batch. Batch has extensive support for defining and executing jobs, persisting job progress, and supporting resumption. It also has contexts for remembering state and moving state from one step to another, chunk-oriented processing, and a generally well-thought-out and extensive infrastructure, including a bunch of readers and writers for common workflows.
Feel free to ignore this answer; I just wanted to throw the suggestion out there in case it spares you a ton of work.
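To make the suggestion concrete, a minimal job definition might look roughly like this (a sketch assuming Spring Batch 4.x, where JobBuilderFactory/StepBuilderFactory are the usual entry points; the step and bean names are made up):

import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableBatchProcessing
public class ReportJobConfig {

    @Bean
    public Job reportJob(JobBuilderFactory jobs, StepBuilderFactory steps) {
        Step callExternalServices = steps.get("TEST_STEP1")
                .tasklet((contribution, chunkContext) -> {
                    // call the external web services and collect results here
                    return RepeatStatus.FINISHED;
                })
                .build();

        // Start/stop times and statuses of each step are persisted by Spring Batch
        // in its job repository, instead of a hand-rolled Job/JobStep entity.
        return jobs.get("reportJob")
                .start(callExternalServices)
                .build();
    }
}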
You can keep it in a ThreadLocal and access the Job object directly from there, or you can create a custom Spring scope; for more info about custom scopes see http://springindepth.com/book/in-depth-ioc-scope.html. You can define the Job in the custom scope and inject it into your beans.
EDIT: This will work only if your entire process runs in a single thread and your job steps are static. If your jobs are not static (meaning the external service calls, or their order, may change based on the input), I would implement the Chain of Responsibility and Command patterns (commands as the actual processing, the chain as your job steps); then you can track, stop or change the steps based on configuration.
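A minimal sketch of the ThreadLocal idea (the holder class name is made up; it only works if each report is generated entirely on one thread):

public final class CurrentJobHolder {

    private static final ThreadLocal<Job> CURRENT_JOB = new ThreadLocal<>();

    private CurrentJobHolder() {
    }

    public static void set(Job job) {
        CURRENT_JOB.set(job);
    }

    public static Job get() {
        return CURRENT_JOB.get();
    }

    public static void clear() {
        // Always clear to avoid leaking the Job across pooled threads
        CURRENT_JOB.remove();
    }
}

The code that starts the report generation would call CurrentJobHolder.set(job) at the beginning and CurrentJobHolder.clear() in a finally block; a method such as multiplySomeNumbers() could then call jobService.startJobStep(CurrentJobHolder.get(), JobStep.Type.TEST_STEP1) without taking the Job as a parameter.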
