How to pass instance variables into Quartz job? - java

I'm wondering how to pass an instance variable into a Quartz job from outside.
Below is pseudocode of what I would like to write. How can I pass externalInstance into this Job?
public class SimpleJob implements Job {
    @Override
    public void execute(JobExecutionContext context)
            throws JobExecutionException {
        float avg = externalInstance.calculateAvg();
    }
}

You can put your instance in the SchedulerContext. Just before you schedule the job, do the following:
getScheduler().getContext().put("externalInstance", externalInstance);
Your job class would then look like this:
public class SimpleJob implements Job {
    @Override
    public void execute(JobExecutionContext context)
            throws JobExecutionException {
        SchedulerContext schedulerContext;
        try {
            schedulerContext = context.getScheduler().getContext();
        } catch (SchedulerException e) {
            // rethrow instead of swallowing, so the job fails visibly
            throw new JobExecutionException(e);
        }
        ExternalInstance externalInstance =
                (ExternalInstance) schedulerContext.get("externalInstance");
        float avg = externalInstance.calculateAvg();
    }
}
If you are using Spring, you can use Spring's support to inject the whole ApplicationContext, as answered in the linked question.
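If you don't want to chase the link, a minimal sketch of that Spring wiring might look like the following, assuming spring-context-support's SchedulerFactoryBean; the key name "applicationContext" is an arbitrary choice:
@Configuration
public class QuartzConfig {

    // Expose the Spring ApplicationContext in the Quartz SchedulerContext
    // under the key "applicationContext" (the key name is our choice).
    @Bean
    public SchedulerFactoryBean schedulerFactoryBean() {
        SchedulerFactoryBean factory = new SchedulerFactoryBean();
        factory.setApplicationContextSchedulerContextKey("applicationContext");
        return factory;
    }
}
Inside a job you could then pull it back out with (ApplicationContext) context.getScheduler().getContext().get("applicationContext") and look up any bean you need.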

While scheduling the job with a trigger, you define a JobDataMap that is added to the JobDetail, and that JobDetail object is available from the JobExecutionContext passed to your job's execute() method. So you should pass your externalInstance through the JobDataMap. HTH.
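For illustration, a hedged sketch of that JobDataMap route, reusing the asker's hypothetical ExternalInstance and SimpleJob names:
// Scheduling side: stash the instance in the JobDataMap.
JobDetail job = JobBuilder.newJob(SimpleJob.class)
        .withIdentity("simpleJob", "group1")
        .build();
job.getJobDataMap().put("externalInstance", externalInstance);

// Inside SimpleJob.execute(): fetch it back out and use it.
ExternalInstance externalInstance =
        (ExternalInstance) context.getJobDetail().getJobDataMap().get("externalInstance");
float avg = externalInstance.calculateAvg();
Note that with a persistent (JDBC) JobStore everything in the JobDataMap must be serializable; the in-memory RAMJobStore can hold arbitrary object references.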

Add the object to the JobDataMap:
JobDetail job = JobBuilder.newJob(MyJobClass.class)
        .withIdentity("MyIdentity", "MyGroup")
        .build();
job.getJobDataMap().put("MyObject", myObject);
Access the data from the JobDataMap:
var myObject = (MyObjectClass) context.getJobDetail()
        .getJobDataMap()
        .get("MyObject"); // must match the key used when scheduling

You can solve this by creating an interface with a HashMap and putting the required information there. Implement this interface in your Quartz job class and the information will be accessible there.
In the interface:
Map<JobKey, Object> map = new HashMap<>();
In the job:
map.get(context.getJobDetail().getKey()) // will give you the Object
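Spelled out, that idea might look like the sketch below; the interface and field names are illustrative, and a ConcurrentHashMap is swapped in for the HashMap since the scheduler thread and the scheduling code may touch the map concurrently:
// Shared lookup table, keyed by the job's JobKey.
public interface JobDataHolder {
    Map<JobKey, Object> SHARED = new ConcurrentHashMap<>();
}

public class SimpleJob implements Job, JobDataHolder {
    @Override
    public void execute(JobExecutionContext context) throws JobExecutionException {
        // Look up whatever was registered for this job before it was scheduled.
        ExternalInstance externalInstance =
                (ExternalInstance) SHARED.get(context.getJobDetail().getKey());
        float avg = externalInstance.calculateAvg();
    }
}
Before scheduling, register the object: JobDataHolder.SHARED.put(jobDetail.getKey(), externalInstance);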

Quartz has a simple way to grab parameters from the JobDataMap, using setters.
I am using Quartz 2.3 and I simply used a setter to fetch the passed instance object.
For example, I created this class:
public class Data implements Serializable {

    @JsonProperty("text")
    private String text;

    @JsonCreator
    public Data(@JsonProperty("text") String text) { this.text = text; }

    public String getText() { return text; }
}
Then I created an instance of this class and put it inside the JobDataMap
JobDataMap jobDataMap = new JobDataMap();
jobDataMap.put("data", new Data("One!"));
JobDetail job = newJob(HelloJob.class)
        .withIdentity("myJob", "group")
        .withDescription("bla bla bla")
        .usingJobData(jobDataMap) // <!!!
        .build();
And my job class looks like this
public class HelloJob implements Job {

    Data data;

    public HelloJob() {}

    @Override
    public void execute(JobExecutionContext context)
            throws JobExecutionException {
        String text = data.getText();
        System.out.println(text);
    }

    public void setData(Data data) { this.data = data; }
}
Note: it is mandatory that the field name and the setter match the key in the JobDataMap.
This code will print One! when you schedule the job.
That's it, clean and efficient

This is the responsibility of the JobFactory. The default PropertySettingJobFactory implementation will invoke any bean-setter methods, based on properties found in the scheduler context, the trigger, and the job detail. So as long as you have implemented an appropriate setContext() setter method, you should be able to do any of the following:
scheduler.getContext().put("context", context);
Or
Trigger trigger = TriggerBuilder.newTrigger()
        ...
        .usingJobData("context", context)
        .build();
Or
JobDetail job = JobBuilder.newJob(SimpleJob.class)
        ...
        .usingJobData("context", context)
        .build();
Or if that isn't enough, you can provide your own JobFactory class, which instantiates the Job objects however you please.
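As a sketch of that last option, a hand-rolled JobFactory could simply hand back a pre-built instance; the class name here is made up:
// A JobFactory that always returns the same pre-configured job instance.
public class SingleInstanceJobFactory implements JobFactory {

    private final Job instance;

    public SingleInstanceJobFactory(Job instance) {
        this.instance = instance;
    }

    @Override
    public Job newJob(TriggerFiredBundle bundle, Scheduler scheduler) {
        return instance; // same object for every trigger firing
    }
}

// Registration with the scheduler:
// scheduler.setJobFactory(new SingleInstanceJobFactory(myConfiguredJob));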

Related

how to run scheduler just after inserting data into database in spring boot?

I am new to Spring Boot and I have a requirement where I have to run a scheduler only if new data is inserted into a table. Thanks for any help.
Hibernate has an interceptor mechanism that allows you to get notified, at specific times, when database events occur.
Such events are creation/deletion/flush of the session. As you get access to the objects subject to the given event, you have a means to fire a process when an object of a given class (which you can easily map to a table in your schema) is modified.
The documentation can be found here:
https://docs.jboss.org/hibernate/orm/4.0/manual/en-US/html/events.html
You can use the interceptor mechanism along with Java's ScheduledExecutorService to schedule a task when Hibernate intercepts the save operation. You can put your business logic in that interceptor.
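A minimal sketch of that combination, assuming a hypothetical MyEntity class and business-logic method; the API shown is Hibernate 4/5's EmptyInterceptor:
public class SchedulingInterceptor extends EmptyInterceptor {

    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

    @Override
    public boolean onSave(Object entity, Serializable id, Object[] state,
                          String[] propertyNames, Type[] types) {
        if (entity instanceof MyEntity) {
            // New row of interest: schedule the business logic to run shortly after.
            scheduler.schedule(this::runBusinessLogic, 1, TimeUnit.SECONDS);
        }
        return false; // we did not modify the entity state
    }

    private void runBusinessLogic() {
        // ... whatever should happen after the insert
    }
}
The interceptor is registered when building the SessionFactory (for example via the hibernate.session_factory.interceptor setting).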
Scheduling is not enabled by default; before adding any scheduled jobs we need to enable scheduling explicitly by adding the @EnableScheduling annotation.
Now, with the help of the ScheduledTaskRegistrar#addTriggerTask method, we can add a Runnable task and a Trigger implementation to recalculate the next execution time after the end of each execution:
@Configuration // assumption: needed so Spring picks this class up; it was omitted above
@EnableScheduling
public class DynamicSchedulingConfig implements SchedulingConfigurer {

    @Autowired
    private TickService tickService;

    @Bean
    public Executor taskExecutor() {
        return Executors.newSingleThreadScheduledExecutor();
    }

    @Override
    public void configureTasks(ScheduledTaskRegistrar taskRegistrar) {
        taskRegistrar.setScheduler(taskExecutor());
        taskRegistrar.addTriggerTask(
                new Runnable() {
                    @Override
                    public void run() {
                        tickService.tick();
                    }
                },
                new Trigger() {
                    @Override
                    public Date nextExecutionTime(TriggerContext context) {
                        Optional<Date> lastCompletionTime =
                                Optional.ofNullable(context.lastCompletionTime());
                        Instant nextExecutionTime =
                                lastCompletionTime.orElseGet(Date::new).toInstant()
                                        .plusMillis(tickService.getDelay());
                        return Date.from(nextExecutionTime);
                    }
                });
    }
}

Custom ListItemReader Spring Batch

I coded a custom implementation of ListItemReader, trying to follow the example in Spring Batch's GitHub repository. In my case, I need to read a variable from the job context; this variable is a path containing the files to read. I can't use the constructor, because the constructor executes before the beforeStep event and I don't have the variable at that moment.
This works on the first execution, but since the list never goes back to null, the initialize method is never executed again.
If I try adding an else to the !list.isEmpty() condition that sets my list to null, I enter an infinite loop.
Is there another way to solve this? Maybe I am overcomplicating it.
public class ListItemReader implements ItemReader<Path>, StepExecutionListener {

    private List<Path> list;
    private org.springframework.batch.core.JobExecution jobExecution;

    public ListItemReader() {
        this.list = new ArrayList<>();
    }

    public void initialize() {
        // List the directories of a path and add all contained files to the list
        String pathString = jobExecution.getExecutionContext().getString(Constants.CONTEXT_PATH);
        Path path = Paths.get(pathString);
        ...
        list.add(Paths.get(..path..));
    }

    @Nullable
    @Override
    public Path read() {
        if (list == null) initialize();
        if (!list.isEmpty()) {
            return list.remove(0);
        }
        return null;
    }

    @Override
    public ExitStatus afterStep(StepExecution se) {
        return ExitStatus.COMPLETED;
    }

    @Override
    public void beforeStep(StepExecution se) {
        jobExecution = se.getJobExecution();
    }
}
I can't use the constructor because the constructor executes before the beforeStep event and I don't have the variable at that moment.
Actually, you can delay your bean construction by using @JobScope and @StepScope. Additionally, you can use @Value and Spring SpEL to inject your jobParameters.
In your case, you might want to rewrite your code, e.g.:
@Configuration
@RequiredArgsConstructor
public class YourJobConfig {

    private final StepBuilderFactory stepBuilder;

    @Bean("yourStep")
    public Step doWhatever() {
        return stepBuilder
                .get("yourStepName")
                .<Path, Path>chunk(50) // <-- replace your chunk size here
                .reader(yourItemReader(null, 0)) // <-- place your default values here
                .processor(yourItemProcessor())
                .writer(yourItemWriter())
                .build();
    }

    @Bean
    @StepScope // needed so the jobParameters SpEL can be resolved at runtime
    public ItemReader<Path> yourItemReader(
            @Value("#{jobParameters['attribute1']}") String attribute1,
            @Value("#{jobParameters['attribute2']}") long attribute2
    ) {
        return new ListItemReader(attribute1, attribute2); // <-- this is your ListItemReader
    }

    @Bean
    public ItemProcessor<Path, Path> yourItemProcessor() {
        return new YourItemProcessor<>();
    }

    @Bean
    public ItemWriter<Path> yourItemWriter() {
        return new YourItemWriter<>();
    }
}
Before starting your job, you can add some jobParameters:
public JobParameters getJobParams() {
    JobParametersBuilder params = new JobParametersBuilder();
    params.addString("attribute1", "something_cool");
    params.addLong("attribute2", 123456L);
    return params.toJobParameters();
}
and pass them to your job:
Job yourJob = getYourJob();
JobParameters jobParams = getJobParams();
JobExecution execution = jobLauncher.run(yourJob, jobParams);
References
The Delegate Pattern and Registering with the Step
How to get access to job parameters from ItemReader

HK2 factory for Quartz jobs, not destroying service after execution

I want to use Quartz Scheduler in my server application that uses HK2 for dependency injection. In order for Quartz jobs to have access to DI, they need to be DI-managed themselves. As a result, I wrote a super simple HK2-aware job factory and registered it with the scheduler.
It works fine with instantiation of services, observing the requested @Singleton or @PerLookup scope. However, it's failing to destroy() non-singleton services (= jobs) after they are finished.
Question: how do I get HK2 to manage jobs properly, including tearing them down again?
Do I need to go down the path of creating the service via serviceLocator.getServiceHandle() and later manually destroying the service, maybe from a JobListener (but how would it get the ServiceHandle)?
Hk2JobFactory.java
@Service
public class Hk2JobFactory implements JobFactory {

    private final Logger log = LoggerFactory.getLogger(getClass());

    @Inject
    ServiceLocator serviceLocator;

    @Override
    public Job newJob(TriggerFiredBundle bundle, Scheduler scheduler) throws SchedulerException {
        JobDetail jobDetail = bundle.getJobDetail();
        Class<? extends Job> jobClass = jobDetail.getJobClass();
        try {
            log.debug("Producing instance of Job '" + jobDetail.getKey() + "', class=" + jobClass.getName());
            Job job = serviceLocator.getService(jobClass);
            if (job == null) {
                log.debug("Unable to instantiate job via ServiceLocator, returning unmanaged instance.");
                return jobClass.newInstance();
            }
            return job;
        } catch (Exception e) {
            throw new SchedulerException(
                    "Problem instantiating class '"
                            + jobDetail.getJobClass().getName() + "'", e);
        }
    }
}
HelloWorldJob.java
@Service
@PerLookup
public class HelloWorldJob implements Job {

    private final Logger log = LoggerFactory.getLogger(this.getClass());

    @PostConstruct
    public void setup() {
        log.info("I'm born!");
    }

    @PreDestroy
    public void shutdown() {
        // it's never called... :-(
        log.info("And I'm dead again");
    }

    @Override
    public void execute(JobExecutionContext context) throws JobExecutionException {
        log.info("Hello, world!");
    }
}
Similar to @jwells131313's suggestion, I have implemented a JobListener that destroy()s instances of jobs where appropriate. To facilitate that, I pass along the ServiceHandle in the job's DataMap.
The only difference is that I'm quite happy with the @PerLookup scope.
Hk2JobFactory.java:
@Service
public class Hk2JobFactory implements JobFactory {

    private final Logger log = LoggerFactory.getLogger(getClass());

    @Inject
    ServiceLocator serviceLocator;

    @Override
    public Job newJob(TriggerFiredBundle bundle, Scheduler scheduler) throws SchedulerException {
        JobDetail jobDetail = bundle.getJobDetail();
        Class<? extends Job> jobClass = jobDetail.getJobClass();
        try {
            log.debug("Producing instance of job {} (class {})", jobDetail.getKey(), jobClass.getName());
            ServiceHandle sh = serviceLocator.getServiceHandle(jobClass);
            if (sh != null) {
                Class scopeAnnotation = sh.getActiveDescriptor().getScopeAnnotation();
                if (log.isTraceEnabled()) log.trace("Service scope is {}", scopeAnnotation.getName());
                if (scopeAnnotation == PerLookup.class) {
                    // @PerLookup scope means: needs to be destroyed after execution
                    jobDetail.getJobDataMap().put(SERVICE_HANDLE_KEY, sh);
                }
                return jobClass.cast(sh.getService());
            }
            log.debug("Unable to instantiate job via ServiceLocator, returning unmanaged instance");
            return jobClass.newInstance();
        } catch (Exception e) {
            throw new SchedulerException(
                    "Problem instantiating class '"
                            + jobDetail.getJobClass().getName() + "'", e);
        }
    }
}
Hk2CleanupJobListener.java:
public class Hk2CleanupJobListener extends JobListenerSupport {

    public static final String SERVICE_HANDLE_KEY = "hk2_serviceHandle";

    private final Map<String, String> mdcCopy = MDC.getCopyOfContextMap();

    @Override
    public String getName() {
        return getClass().getSimpleName();
    }

    @Override
    public void jobWasExecuted(JobExecutionContext context, JobExecutionException jobException) {
        JobDetail jobDetail = context.getJobDetail();
        ServiceHandle sh = (ServiceHandle) jobDetail.getJobDataMap().get(SERVICE_HANDLE_KEY);
        if (sh == null) {
            if (getLog().isTraceEnabled()) getLog().trace("No serviceHandle found");
            return;
        }
        Class scopeAnnotation = sh.getActiveDescriptor().getScopeAnnotation();
        if (scopeAnnotation == PerLookup.class) {
            if (getLog().isTraceEnabled()) getLog().trace("Destroying job {} after it was executed (class {})",
                    jobDetail.getKey(),
                    jobDetail.getJobClass().getName());
            sh.destroy();
        }
    }
}
Both are registered with the Scheduler.
For Singletons:
It seems like a Singleton service would NOT be destroyed when the job is finished, because it is a Singleton, right? If you are expecting the Singleton to be destroyed at the end of the job, then the service is really more of a "JobScope" than a Singleton scope.
JobScope:
If "Jobs" follow certain rules then it might be an good candidate for an "Operation" scope (please see Operation Example). In particular jobs can be in an "Operation" scope if:
There can be many parallel jobs going at once
There can only be one job active on a thread at a time
Note that the above rules also mean that Jobs can exist on multiple threads at the same or different times. The most important rule is that on a single thread only one Job can be active at a time.
If those two rules apply then I highly recommend writing an Operation scope that's something like "JobScope".
This is how you could define a JobScope if Jobs follow the rules above:
@Scope
@Proxiable(proxyForSameScope = false)
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
public @interface JobScope {
}
And this would be the entire implementation of the corresponding Context:
@Singleton
public class JobScopeContext extends OperationContext<JobScope> {

    public Class<? extends Annotation> getScope() {
        return JobScope.class;
    }
}
You would then use the OperationManager service to start and stop Jobs when, you know, Jobs start and stop.
Even if Jobs do not follow the rules for an "Operation" you still might want to use a "JobScope" scope that would know to destroy its services when a "Job" comes to its end.
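Driving such a scope might look roughly like this sketch, assuming HK2's Operations API (org.glassfish.hk2.extras.operation) and a hypothetical JobScopeImpl class that extends AnnotationLiteral<JobScope>:
OperationManager operationManager = serviceLocator.getService(OperationManager.class);

// Start a JobScope operation on this thread before the job runs...
OperationHandle<JobScope> handle =
        operationManager.createAndStartOperation(new JobScopeImpl());
try {
    job.execute(context); // JobScope services created here belong to this operation
} finally {
    // ...and close it afterwards, destroying every JobScope service it created.
    handle.closeOperation();
}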
PerLookup:
So if your question is about PerLookup scope objects, you could run into some trouble, because you probably need the original ServiceHandle, which it sounds like you wouldn't have. In that case, if you can at least find out that the original service WAS in fact in PerLookup scope, you can use ServiceLocator.preDestroy to destroy the object.
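In code, that last suggestion could look something like this sketch, checking the scope through a fresh ServiceHandle and then tearing the instance down manually:
ServiceHandle<?> handle = serviceLocator.getServiceHandle(job.getClass());
if (handle != null
        && handle.getActiveDescriptor().getScopeAnnotation() == PerLookup.class) {
    // Runs the @PreDestroy methods on this particular instance.
    serviceLocator.preDestroy(job);
}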

How do I invoke a method on an existing object with Quartz?

I have an object, nicely configured with everything it needs to do its job. If I could just call run() on it once a day, my life would be complete.
To be clear, I know how to create a schedule and a trigger. But the methods to schedule all take a JobDetail, which wants to create a new instance of my class. How do I use the one that I have?
In short, is there a nice way, without Spring, to call a method on my object using Quartz?
If you are using Quartz with Spring, you can do the following:
Sample code
MethodInvokingJobDetailFactoryBean jobDetailfactory = new MethodInvokingJobDetailFactoryBean();
jobDetailfactory.setTargetObject(configuredObject);
jobDetailfactory.setTargetMethod("methodName");
Here configuredObject is your nicely configured object and methodName is the name of the method to be invoked. You can autowire the configuredObject into this class.
You can use Quartz's JobBuilder to build a Quartz JobDetail object from your own job-details class, if I understand you correctly. Let me know if this is not what you need.
Suppose JobInfo is your own class holding the job details; then you can use it like this:
JobDataMap jobDataMap = new JobDataMap();
Map<String, Object> jobParams = jobInfo.getJobParams();
for (String paramKey : jobParams.keySet()) {
    jobDataMap.put(paramKey, jobParams.get(paramKey));
}
jobBuilder.ofType((Class<? extends Job>) Class.forName(jobInfo.getJobClass()))
        .withIdentity(jobInfo.getName(), jobInfo.getGroup())
        .withDescription(jobInfo.getDescription())
        .storeDurably(jobInfo.isStoreDurably())
        .usingJobData(jobDataMap);
JobDetail jobDetail = jobBuilder.build();
Here's some code (Kotlin)
fun createJobDetail(jobName: String, function: () -> Unit) = JobBuilder.newJob(MyJob::class.java)
        .withIdentity(jobName)
        .usingJobData(JobDataMap(mapOf(jobDataKey to function)))
        .build()

@DisallowConcurrentExecution
class MyJob : Job {
    @Suppress("UNCHECKED_CAST")
    override fun execute(context: JobExecutionContext) {
        try {
            (context.jobDetail.jobDataMap[jobDataKey] as () -> Unit)()
        } catch (e: Exception) {
            throw JobExecutionException(e)
        }
    }
}
Instead of using Quartz, you might be better off using the built-in java.util.concurrent.ScheduledExecutorService and its scheduleAtFixedRate() method.
For example:
ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1,
        new ThreadFactory() {
            @Override
            public Thread newThread(Runnable runnable) {
                Thread t = Executors.defaultThreadFactory().newThread(runnable);
                t.setDaemon(true);
                return t;
            }
        });

scheduler.scheduleAtFixedRate(new Runnable() {
    @Override
    public void run() {
        myLovelyObject.run();
    }
}, 0, 24, TimeUnit.HOURS);
If you need to use Quartz, you could always store a reference to your object in a static field in the Job class. Not elegant, but not exactly the end of the world either.
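A sketch of that static-field workaround, with made-up names; volatile guards the handoff between the thread that sets the field and the scheduler's worker thread:
public class RunMyObjectJob implements Job {

    // Set once at startup, before the first trigger fires.
    public static volatile MyLovelyObject target;

    @Override
    public void execute(JobExecutionContext context) throws JobExecutionException {
        target.run();
    }
}

// At startup, before scheduling:
// RunMyObjectJob.target = myLovelyObject;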

Java EJB Modify Schedule Property Values At Runtime

I have a Singleton class in Java with a timer using the @Schedule annotation. I wish to change the properties of the Schedule at runtime. Below is the code:
@Startup
@Singleton
public class Listener {

    public void setProperty() {
        Method[] methods = this.getClass().getDeclaredMethods();
        Method method = methods[0];
        Annotation[] annotations = method.getDeclaredAnnotations();
        Annotation annotation = annotations[0];
        if (annotation instanceof Schedule) {
            Schedule schedule = (Schedule) annotation;
            System.out.println(schedule.second());
        }
    }

    @PostConstruct
    public void runAtStartUp() {
        setProperty();
    }

    @Schedule(second = "3")
    public void run() {
        // do something
    }
}
I wish to change the value of Schedule's second attribute at runtime, based on information from a properties file. Is this actually possible? I tried @Schedule(second = SOME_VARIABLE) where private static String SOME_VARIABLE = readFromConfigFile(); this does not work, because the annotation expects a compile-time constant (a final field), and I don't want to make it final.
I also saw this post: Modifying annotation attribute value at runtime in java
It shows this is not possible to do.
Any ideas?
EDIT:
@Startup
@Singleton
public class Listener {

    @javax.annotation.Resource // the issue is this
    private javax.ejb.TimerService timerService;

    private static String SOME_VARIABLE = null;

    @PostConstruct
    public void runAtStartUp() {
        SOME_VARIABLE = readFromFile();
        timerService.createTimer(new Date(),
                TimeUnit.SECONDS.toMillis(Long.parseLong(SOME_VARIABLE)), null);
    }

    @Timeout
    public void check(Timer timer) {
        // some code runs every SOME_VARIABLE seconds
    }
}
The issue is the injection using @Resource. How can this be fixed?
The exception is shown below:
No EJBContainer provider available The following providers: org.glassfish.ejb.embedded.EJBContainerProviderImpl Returned null from createEJBContainer call
javax.ejb.EJBException
org.glassfish.ejb.embedded.EJBContainerProviderImpl
at javax.ejb.embeddable.EJBContainer.reportError(EJBContainer.java:186)
at javax.ejb.embeddable.EJBContainer.createEJBContainer(EJBContainer.java:121)
at javax.ejb.embeddable.EJBContainer.createEJBContainer(EJBContainer.java:78)
@BeforeClass
public void setUpClass() throws Exception {
    EJBContainer container = EJBContainer.createEJBContainer();
}
This occurs during unit testing using the embeddable EJB container. Some of the Apache Maven code is located in this post: Java EJB JNDI Beans Lookup Failed
I think the solution you are looking for was discussed here.
TomasZ is right; you should use programmatic timers with TimerService for situations where you want to change the schedule dynamically at runtime.
Maybe you could use the TimerService. I have written some code, but on my WildFly 8 it seems to run multiple times even though it's a Singleton.
Documentation: http://docs.oracle.com/javaee/6/tutorial/doc/bnboy.html
Hope this helps:
@javax.ejb.Singleton
@javax.ejb.Startup
public class VariableEjbTimer {

    @javax.annotation.Resource
    javax.ejb.TimerService timerService;

    @javax.annotation.PostConstruct
    public void runAtStartUp() {
        createTimer(2000L);
    }

    private void createTimer(long millis) {
        //timerService.createSingleActionTimer(millis, new javax.ejb.TimerConfig());
        timerService.createTimer(millis, millis, null);
    }

    @javax.ejb.Timeout
    public void run(javax.ejb.Timer timer) {
        long timeout = readFromConfigFile();
        System.out.println("Timeout in " + timeout);
        createTimer(timeout);
    }

    private long readFromConfigFile() {
        return new java.util.Random().nextInt(5) * 1000L;
    }
}
