How to execute DB/JPA operations inside a Quartz job context? - java

I have this service scheduling a task:
@ApplicationScoped
public class PaymentService {

    @Transactional
    public Payment scheduleNewPayment(Payment payment) throws ParseException, SchedulerException {
        Payment.persist(payment);
        JobDetail job = JobBuilder.newJob(PaymentJob.class)
                .withIdentity(String.format("job%d", payment.id), "payment-job-group")
                .build();
        Date parsed = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse(payment.dueDate);
        SimpleTrigger trigger = (SimpleTrigger) TriggerBuilder.newTrigger()
                .withIdentity(String.format("trigger%d", payment.id), "trigger-group")
                .startAt(parsed)
                .forJob(job)
                .build();
        SchedulerFactory schedulerFactory = new StdSchedulerFactory();
        Scheduler scheduler = schedulerFactory.getScheduler();
        scheduler.scheduleJob(job, trigger);
        scheduler.start();
        return payment;
    }
}
And this job:
@ApplicationScoped
public class PaymentJob implements Job {

    @Override
    public void execute(JobExecutionContext jobExecutionContext) throws JobExecutionException {
        System.out.println(Payment.count());
    }
}
But I cannot perform a DB operation inside the job context (jobExecutionContext.getScheduler().getContext() is null, by the way).
I'm running my app with Quarkus, the Hibernate operations come from Hibernate Panache, and the scheduler is Quartz.

First of all, you should use the underlying managed Quartz Scheduler instance: @Inject org.quartz.Scheduler (I suppose you're using the quarkus-quartz extension).
The other "problem" is that the default Quartz job factory simply calls new PaymentJob(), so no injection/initialization is performed. Quarkus only uses a custom factory for the jobs generated for methods annotated with @Scheduled. If you don't need injection, then simply remove the superfluous @ApplicationScoped from the PaymentJob class.
Finally, you need to activate all the necessary CDI contexts manually. It's very likely that the request context is needed. You can copy the following snippet into your execute() method: https://github.com/quarkusio/quarkus/blob/master/extensions/arc/runtime/src/main/java/io/quarkus/arc/runtime/BeanInvoker.java#L14-L24
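Put together, a rough sketch of what that could look like (untested; it assumes the quarkus-quartz extension is on the classpath and reuses the class names from the question):

import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

import javax.enterprise.context.ApplicationScoped;
import javax.inject.Inject;
import javax.transaction.Transactional;

import org.quartz.*;

import io.quarkus.arc.Arc;
import io.quarkus.arc.ManagedContext;

@ApplicationScoped
public class PaymentService {

    // managed scheduler created and started by the quarkus-quartz extension
    @Inject
    Scheduler quartzScheduler;

    @Transactional
    public Payment scheduleNewPayment(Payment payment) throws ParseException, SchedulerException {
        Payment.persist(payment);
        JobDetail job = JobBuilder.newJob(PaymentJob.class)
                .withIdentity(String.format("job%d", payment.id), "payment-job-group")
                .build();
        Date parsed = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse(payment.dueDate);
        Trigger trigger = TriggerBuilder.newTrigger()
                .withIdentity(String.format("trigger%d", payment.id), "trigger-group")
                .startAt(parsed)
                .forJob(job)
                .build();
        // no StdSchedulerFactory and no start(): the injected scheduler is already running
        quartzScheduler.scheduleJob(job, trigger);
        return payment;
    }
}

// plain class: the default job factory calls new PaymentJob(), so no CDI annotation is needed
public class PaymentJob implements Job {

    @Override
    public void execute(JobExecutionContext jobExecutionContext) throws JobExecutionException {
        // activate the request context manually, as in the BeanInvoker snippet linked above
        ManagedContext requestContext = Arc.container().requestContext();
        if (requestContext.isActive()) {
            System.out.println(Payment.count());
        } else {
            try {
                requestContext.activate();
                System.out.println(Payment.count());
            } finally {
                requestContext.terminate();
            }
        }
    }
}

The key differences: the injected Scheduler is already configured and started by the extension, so there is no factory and no start() call, and the job activates (and afterwards terminates) the request context around the Panache call.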
"jobExecutionContext.getScheduler().getContext() is null by the way"
This is really odd. What exception/error do you actually get?

Related

Quartz job scheduler using Java, stuck on standby mode without the job being executed

I have a class where I perform some activities, and I want to create a job that will handle this operation automatically, scheduled every x minutes for example.
I am using Quartz; this class implements Job, and in my driver class I'm creating my JobDetail, Scheduler and Trigger and then starting it. However, the job isn't being executed. Log info:
NOT STARTED.
Currently in standby mode.
Number of jobs executed: 0
The code for the scheduler in my driver class:
try {
    JobDetail job = JobBuilder.newJob(TestMkFPMJob.class).withIdentity("TestMkFPMJob").build();
    Trigger trigger = TriggerBuilder.newTrigger().withSchedule(SimpleScheduleBuilder.simpleSchedule().withIntervalInSeconds(Integer.parseInt(strTimeSched)).repeatForever()).build();
    SchedulerFactory schFactory = new StdSchedulerFactory();
    Scheduler sch = schFactory.getScheduler();
    sch.start();
    sch.scheduleJob(job, trigger);
} catch (SchedulerException e) {
    e.printStackTrace();
    System.out.println("Scheduler Error");
}
With "TestMkFPMJob" being the job class where my operations are handled, and strTimeSched is already fetched and set as 120 fetched from
I've been looking for a similar issue but can't seem to find any tip to move forward; I'd appreciate any help.
Please note that this is my first time using Quartz/job scheduling.
The log entry with NOT STARTED is misleading, as it is shown whenever a QuartzScheduler instance is created. It does not mean that the jobs are not running. It is written when the line Scheduler sch = schFactory.getScheduler(); is executed, and the scheduler is started in the next line.
If I take your example and run it on my PC, it works as designed:
public class Quartz {
    public static void main(String[] args) {
        try {
            JobDetail job = JobBuilder.newJob(MyJob.class).withIdentity("myJob").build();
            Trigger trigger = TriggerBuilder.newTrigger().withSchedule(SimpleScheduleBuilder.simpleSchedule().withIntervalInSeconds(Integer.parseInt("10")).repeatForever()).build();
            SchedulerFactory schFactory = new StdSchedulerFactory();
            Scheduler sch = schFactory.getScheduler();
            sch.start();
            sch.scheduleJob(job, trigger);
        } catch (SchedulerException e) {
            e.printStackTrace();
            System.out.println("Scheduler Error");
        }
    }

    public static class MyJob implements Job {
        @Override
        public void execute(JobExecutionContext jobExecutionContext) throws JobExecutionException {
            System.out.println("running job");
        }
    }
}

Execute a Quartz Job only once in a multi-instance environment

I'm trying to create a job in Quartz 1.6 that must execute only once, because I have two test instances running the same version of a .war file.
This is my TestPlugin class; the job is executed every 60 seconds:
public class TestPlugin implements PlugIn {

    public TestPlugin() {
        super();
    }

    public void destroy() {
    }

    public void init(ActionServlet arg0, ModuleConfig arg1) throws ServletException {
        try {
            JobDetail job = JobBuilder.newJob(TestDemonio.class)
                    .withIdentity("anyJobName", "group1").build();
            Trigger trigger = TriggerBuilder
                    .newTrigger()
                    .withIdentity("anyTriggerName", "group1")
                    .withSchedule(CronScheduleBuilder.cronSchedule("0/60 * * ? * * *"))
                    .build();
            Scheduler scheduler = new StdSchedulerFactory().getScheduler();
            scheduler.scheduleJob(job, trigger);
            scheduler.start();
        } catch (SchedulerException e) {
            e.printStackTrace();
        }
    }
}
Then I have my TestDemonio class to print a simple output:
@DisallowConcurrentExecution
public class TestDemonio implements Job {

    public void execute(JobExecutionContext arg0) throws JobExecutionException {
        System.out.println("QUARTZ JOB MESSAGE");
    }
}
I have researched how to achieve this and added the @DisallowConcurrentExecution annotation so the job executes only once, but I still get the message printed on each instance.
This is my quartz.properties file:
# Default Properties file for use by StdSchedulerFactory
# to create a Quartz Scheduler Instance, if a different
# properties file is not explicitly specified.
#
org.quartz.scheduler.instanceName: DefaultQuartzScheduler
org.quartz.scheduler.rmi.export: false
org.quartz.scheduler.rmi.proxy: false
org.quartz.scheduler.wrapJobExecutionInUserTransaction: false
org.quartz.threadPool.class: org.quartz.simpl.SimpleThreadPool
org.quartz.threadPool.threadCount: 10
org.quartz.threadPool.threadPriority: 5
org.quartz.threadPool.threadsInheritContextClassLoaderOfInitializingThread: true
org.quartz.jobStore.misfireThreshold: 60000
org.quartz.jobStore.class: org.quartz.simpl.RAMJobStore
You need to add the following property to your quartz.properties file:
org.quartz.jobStore.isClustered: true
See the Quartz configuration reference for more information about the isClustered property.
Please note:
@DisallowConcurrentExecution prevents two executions of the same job key from running concurrently on the same node.
The isClustered property is what makes sure that a single instance of a job is executed when the app is running on multiple nodes, which communicate via database tables for atomicity.
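One thing the properties file in the question does not reflect: clustering only works with a JDBC job store shared by both .war instances; the isClustered flag has no effect with RAMJobStore. A rough sketch of what a clustered configuration could look like when built programmatically (the data-source driver, URL and credentials are placeholders to adapt to your environment):

import java.util.Properties;

import org.quartz.Scheduler;
import org.quartz.impl.StdSchedulerFactory;

public class ClusteredSchedulerFactory {

    public static Scheduler buildScheduler() throws Exception {
        Properties props = new Properties();
        props.setProperty("org.quartz.scheduler.instanceName", "DefaultQuartzScheduler");
        // every node needs a unique instance id; AUTO generates one
        props.setProperty("org.quartz.scheduler.instanceId", "AUTO");
        props.setProperty("org.quartz.threadPool.class", "org.quartz.simpl.SimpleThreadPool");
        props.setProperty("org.quartz.threadPool.threadCount", "10");

        // clustering requires a JDBC job store shared by all nodes
        props.setProperty("org.quartz.jobStore.class", "org.quartz.impl.jdbcjobstore.JobStoreTX");
        props.setProperty("org.quartz.jobStore.driverDelegateClass", "org.quartz.impl.jdbcjobstore.StdJDBCDelegate");
        props.setProperty("org.quartz.jobStore.dataSource", "quartzDS");
        props.setProperty("org.quartz.jobStore.isClustered", "true");
        props.setProperty("org.quartz.jobStore.clusterCheckinInterval", "20000");

        // placeholder data source pointing at the shared Quartz tables
        props.setProperty("org.quartz.dataSource.quartzDS.driver", "org.postgresql.Driver");
        props.setProperty("org.quartz.dataSource.quartzDS.URL", "jdbc:postgresql://localhost:5432/quartz");
        props.setProperty("org.quartz.dataSource.quartzDS.user", "quartz");
        props.setProperty("org.quartz.dataSource.quartzDS.password", "quartz");

        Scheduler scheduler = new StdSchedulerFactory(props).getScheduler();
        scheduler.start();
        return scheduler;
    }
}

With both instances pointing at the same Quartz tables, one node acquires the trigger when it fires and the other skips it, so the job message is printed only once per firing.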

Schedule job using cron expression from class field

I tried to find this, but without results. I'd like to have an object holding a path to a bash script and a cron expression specifying when to run it. It's a Spring Boot project. I see it like this:
public class TestScript {

    private String cronExpression;
    private String pathToFile;

    public void execute() {
        // either it's @Scheduled or executed another way
    }
}
Is this possible? Please guide me, even a little, if you can.
OK, I managed to make a custom service that dynamically creates jobs:
@Service
public class DynamicJob {

    public void schedule(TestScript testScript) {
        try {
            JobDetail job = JobBuilder.newJob(TestScript.class)
                    .withIdentity(testScript.getName(), "default group")
                    .build();
            Trigger trigger = TriggerBuilder.newTrigger()
                    .withIdentity(testScript.getName().concat(" trigger"), "groupAll")
                    .withSchedule(CronScheduleBuilder.cronSchedule(testScript.getCronExpression()))
                    .build();
            Scheduler scheduler = new StdSchedulerFactory().getScheduler();
            scheduler.start();
            scheduler.scheduleJob(job, trigger);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
The TestScript class implements org.quartz.Job, and I use Quartz library version 2.2.1.
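One detail worth checking: Quartz instantiates TestScript itself when the trigger fires, so the cronExpression/pathToFile fields set on the object passed to schedule() will not be populated on that fresh instance. A common way around this, sketched below under that assumption (the "pathToFile" key name is made up), is to pass the script path through the JobDataMap, e.g. by adding .usingJobData("pathToFile", testScript.getPathToFile()) to the JobBuilder, and to read it back inside execute():

import java.io.IOException;

import org.quartz.Job;
import org.quartz.JobDataMap;
import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;

public class TestScript implements Job {

    @Override
    public void execute(JobExecutionContext context) throws JobExecutionException {
        // read the path stored by DynamicJob via usingJobData(...)
        JobDataMap data = context.getMergedJobDataMap();
        String path = data.getString("pathToFile");
        try {
            // run the bash script and wait for it to finish
            Process process = new ProcessBuilder("bash", path).inheritIO().start();
            process.waitFor();
        } catch (IOException | InterruptedException e) {
            throw new JobExecutionException(e);
        }
    }
}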

How to create Master Job to process multiple spring batch job?

We have multiple Spring Batch jobs, but each of them needs to be started individually.
Is there any way to create a master job, or some controller in Spring, which is responsible for executing all the other batch jobs, so that we only have to execute the master job and all the others start automatically?
I just explained in this question how you can start a Spring application with all jobs loaded in separate contexts. We have a restart job which is scheduled every 10 minutes; it checks for the latest failed execution and attempts to restart it a couple more times.
Your use case is pretty much the same: you can define all jobs in separate contexts with their own configuration files, tell Spring Batch in the config not to run them on startup by setting spring.batch.job.enabled: false, and write your own launcher which uses JobExplorer to start the jobs for you; you can then schedule it or trigger it however you like.
Something like:
@Component
@EnableScheduling
public class AllJobLauncher {

    @Autowired
    JobExplorer jobExplorer;

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private JobRegistry jobRegistry;

    @Scheduled(cron = "${some.cron:0 0/10 * * * ?}")
    public void launchAllJobs() throws JobExecutionException {
        final List<String> jobNames = jobExplorer.getJobNames();
        for (final String jobName : jobNames) {
            final Job job = jobRegistry.getJob(jobName);
            final JobParameters jobParameters = new JobParametersBuilder() // build parameters here
                    .toJobParameters();
            jobLauncher.run(job, jobParameters);
        }
    }
}
Just pay attention that the JobLauncher in Spring Batch is synchronous by default, so the launcher will wait until each job finishes. If you want to start jobs asynchronously, you must place this configuration somewhere:
@Bean
public JobLauncher jobLauncher(JobRepository jobRepository) {
    final SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
    jobLauncher.setJobRepository(jobRepository);
    final SimpleAsyncTaskExecutor simpleAsyncTaskExecutor = new SimpleAsyncTaskExecutor();
    jobLauncher.setTaskExecutor(simpleAsyncTaskExecutor);
    return jobLauncher;
}

how to stop/interrupt quartz scheduler job manually

I am running a simple Quartz job in a main class; it runs every 30 seconds.
public class Start {
    public static void main(String[] args) throws Exception {
        SchedulerFactory sf = new StdSchedulerFactory();
        Scheduler sched = sf.getScheduler();
        JobDetail job = newJob(MyJob.class).withIdentity("myJob", "XXX").build();
        Trigger trigger = TriggerBuilder.newTrigger()
                .withSchedule(
                        SimpleScheduleBuilder.simpleSchedule()
                                .withIntervalInSeconds(30)
                                .repeatForever())
                .build();
        sched.scheduleJob(job, trigger);
        sched.start();
    }
}
Here I am implementing InterruptableJob like this:
public class MyJob implements InterruptableJob {

    private volatile boolean isJobInterrupted = false;
    private JobKey jobKey = null;
    private volatile Thread thisThread;

    public MyJob() {
    }

    @Override
    public void interrupt() throws UnableToInterruptJobException {
        System.err.println("calling interrupt:" + thisThread + "==>" + jobKey);
        isJobInterrupted = true;
        if (thisThread != null) {
            // this call causes the ClosedByInterruptException to happen
            thisThread.interrupt();
        }
    }

    @Override
    public void execute(JobExecutionContext context) throws JobExecutionException {
        thisThread = Thread.currentThread();
        jobKey = context.getJobDetail().getKey();
        System.err.println("calling execute:" + thisThread + "==>" + jobKey);
    }
}
Now I tried to stop the job from another main class in every possible way, with no luck:
public class Stop {
    public static void main(String[] args) throws Exception {
        SchedulerFactory sf = new StdSchedulerFactory();
        Scheduler sched = sf.getScheduler();
        // get a "nice round" time a few seconds in the future...
        Date startTime = nextGivenSecondDate(null, 1);
        JobDetail job = newJob(MyJob.class).withIdentity("myJob", "XXX").build();
        Trigger trigger = TriggerBuilder.newTrigger()
                .withSchedule(
                        SimpleScheduleBuilder.simpleSchedule()
                                .withIntervalInSeconds(30)
                                .repeatForever())
                .build();
        sched.scheduleJob(job, trigger);
        sched.start();
        try {
            // if you want to see the job finish successfully, sleep for about 40 seconds
            Thread.sleep(60000);
            // tell the scheduler to interrupt our job
            sched.interrupt(job.getKey());
            Thread.sleep(3 * 1000L);
        } catch (Exception e) {
            e.printStackTrace();
        }
        System.err.println("------- Shutting Down --------");
        TriggerKey tk = TriggerKey.triggerKey("myJob", "group1");
        System.err.println("tk" + tk + ":" + job.getKey());
        sched.unscheduleJob(tk);
        sched.interrupt(job.getKey());
        sched.interrupt("myJob");
        sched.deleteJob(job.getKey());
        sched.shutdown();
        System.err.println("------- Shutting Down ");
        sched.shutdown(false);
        System.err.println("------- Shutdown Complete ");
        System.err.println("------- Shutdown Complete ");
    }
}
Can anyone please tell me the correct way to stop the job? Thanks a lot.
This question seems to answer the exact problem you're describing:
You need to write your job as an implementation of InterruptableJob. To interrupt this job, you need a handle to the Scheduler and call interrupt(jobKey) (with the job name and job group).
As per the InterruptableJob documentation:
The interface to be implemented by Jobs that provide a mechanism for having their execution interrupted. It is NOT a requirement for jobs to implement this interface - in fact, for most people, none of their jobs will.
Interrupting a Job is very analogous in concept and challenge to normal interruption of a Thread in Java.
The means of actually interrupting the Job must be implemented within the Job itself (the interrupt() method of this interface is simply a means for the scheduler to inform the Job that a request has been made for it to be interrupted). The mechanism that your jobs use to interrupt themselves might vary between implementations. However the principle idea in any implementation should be to have the body of the job's execute(..) periodically check some flag to see if an interruption has been requested, and if the flag is set, somehow abort the performance of the rest of the job's work.
Emphasis mine. It is analogous but not the same. You're not expected to use Threads (but indeed you could if that's what your Job does...).
An example of interrupting a job can be found in the java source for the class org.quartz.examples.DumbInterruptableJob. It is legal to use some combination of wait() and notify() synchronization within interrupt() and execute(..) in order to have the interrupt() method block until the execute(..) signals that it has noticed the set flag.
So I recommend reading the documentation and inspecting examples in the full download.
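To make that concrete for the code in the question: the posted execute() returns almost immediately, so by the time Stop calls interrupt() there is nothing left to interrupt. A minimal sketch (mine, not from the linked answer) of an execute() that does its work in chunks and honours the flag set by interrupt():

import org.quartz.InterruptableJob;
import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;
import org.quartz.UnableToInterruptJobException;

public class MyJob implements InterruptableJob {

    private volatile boolean isJobInterrupted = false;

    @Override
    public void interrupt() throws UnableToInterruptJobException {
        // called via scheduler.interrupt(jobKey); just raise the flag
        isJobInterrupted = true;
    }

    @Override
    public void execute(JobExecutionContext context) throws JobExecutionException {
        System.err.println("starting: " + context.getJobDetail().getKey());
        // simulate a long-running task in small steps, checking the flag between steps
        for (int i = 0; i < 100 && !isJobInterrupted; i++) {
            try {
                Thread.sleep(500); // one unit of "work"
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return;
            }
        }
        System.err.println(isJobInterrupted ? "aborted after interrupt" : "finished normally");
    }
}

Also keep in mind that Start and Stop in the question are two separate JVMs, each creating its own in-memory (RAMJobStore) scheduler, so an interrupt() issued in Stop never reaches the job running inside Start; the interrupt has to go through the same Scheduler instance that is executing the job (or a remote proxy to it).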
