Execute a Quartz Job only once in a multi-instance environment - java

I'm trying to create a job in Quartz 1.6, but it needs to execute only once, because I have two test instances running the same version of a .war file.
This is my TestPlugin class; the job is scheduled to run every 60 seconds:
public class TestPlugin implements PlugIn {

    public TestPlugin() {
        super();
    }

    public void destroy() {
    }

    public void init(ActionServlet arg0, ModuleConfig arg1) throws ServletException {
        try {
            JobDetail job = JobBuilder.newJob(TestDemonio.class)
                    .withIdentity("anyJobName", "group1").build();

            Trigger trigger = TriggerBuilder
                    .newTrigger()
                    .withIdentity("anyTriggerName", "group1")
                    .withSchedule(CronScheduleBuilder.cronSchedule("0/60 * * ? * * *"))
                    .build();

            Scheduler scheduler = new StdSchedulerFactory().getScheduler();
            scheduler.scheduleJob(job, trigger);
            scheduler.start();
        } catch (SchedulerException e) {
            e.printStackTrace();
        }
    }
}
Then I have my job class TestDemonio, which prints a simple message:
@DisallowConcurrentExecution
public class TestDemonio implements Job {

    public void execute(JobExecutionContext arg0) throws JobExecutionException {
        System.out.println("QUARTZ JOB MESSAGE");
    }
}
I have researched how to achieve this and added the @DisallowConcurrentExecution annotation so the job executes only once, but I still get the message printed on each instance.
This is my quartz.properties file:
# Default Properties file for use by StdSchedulerFactory
# to create a Quartz Scheduler Instance, if a different
# properties file is not explicitly specified.
#
org.quartz.scheduler.instanceName: DefaultQuartzScheduler
org.quartz.scheduler.rmi.export: false
org.quartz.scheduler.rmi.proxy: false
org.quartz.scheduler.wrapJobExecutionInUserTransaction: false
org.quartz.threadPool.class: org.quartz.simpl.SimpleThreadPool
org.quartz.threadPool.threadCount: 10
org.quartz.threadPool.threadPriority: 5
org.quartz.threadPool.threadsInheritContextClassLoaderOfInitializingThread: true
org.quartz.jobStore.misfireThreshold: 60000
org.quartz.jobStore.class: org.quartz.simpl.RAMJobStore

You need to add the following property to your quartz.properties file:
org.quartz.jobStore.isClustered : true
For more information about the isClustered property, refer to the Quartz clustering documentation.
Please note:
@DisallowConcurrentExecution prevents two instances of a job with the same job key from running concurrently on the same node.
The isClustered property, on the other hand, ensures that only a single instance of a job fires when the app runs on multiple nodes; the nodes coordinate through shared database tables for atomicity, so clustering requires a JDBC-backed job store rather than the RAMJobStore from your configuration.
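A minimal sketch of what such a clustered configuration might look like, assuming both .war instances point at the same shared database (the datasource name, driver, URL and credentials below are placeholders, not values taken from your setup):
org.quartz.scheduler.instanceName: ClusteredScheduler
# each node needs a unique instance id; AUTO lets Quartz generate one
org.quartz.scheduler.instanceId: AUTO

# clustering needs a JDBC job store instead of RAMJobStore
org.quartz.jobStore.class: org.quartz.impl.jdbcjobstore.JobStoreTX
org.quartz.jobStore.driverDelegateClass: org.quartz.impl.jdbcjobstore.StdJDBCDelegate
org.quartz.jobStore.tablePrefix: QRTZ_
org.quartz.jobStore.dataSource: myDS
org.quartz.jobStore.isClustered: true
org.quartz.jobStore.clusterCheckinInterval: 20000

# placeholder datasource; point this at the database shared by both nodes
org.quartz.dataSource.myDS.driver: com.mysql.jdbc.Driver
org.quartz.dataSource.myDS.URL: jdbc:mysql://localhost:3306/quartz
org.quartz.dataSource.myDS.user: quartz
org.quartz.dataSource.myDS.password: secret
org.quartz.dataSource.myDS.maxConnections: 5
With this in place, both instances join the same cluster and only one of them fires the trigger at any given time.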

Related

Quartz job scheduler using Java, stuck on standby mode without the job being executed

I have a class where I perform some activities, and I want to create a job that handles this operation automatically, scheduled every x minutes for example.
I am using Quartz; this class implements Job, and in my driver class I create my JobDetail, Scheduler and Trigger and then start it. However, the job isn't being executed. The log shows:
NOT STARTED.
Currently in standby mode.
Number of jobs executed: 0
The code for the scheduler in my driver class:
try {
    JobDetail job = JobBuilder.newJob(TestMkFPMJob.class).withIdentity("TestMkFPMJob").build();
    Trigger trigger = TriggerBuilder.newTrigger()
            .withSchedule(SimpleScheduleBuilder.simpleSchedule()
                    .withIntervalInSeconds(Integer.parseInt(strTimeSched))
                    .repeatForever())
            .build();
    SchedulerFactory schFactory = new StdSchedulerFactory();
    Scheduler sch = schFactory.getScheduler();
    sch.start();
    sch.scheduleJob(job, trigger);
} catch (SchedulerException e) {
    e.printStackTrace();
    System.out.println("Scheduler Error");
}
With "TestMkFPMJob" being the job class where my operations are handled, and strTimeSched is already fetched and set as 120 fetched from
I've been looking for a similar issue but can't seem to find any tip to move forward; I'd appreciate any help.
Please note that this is my first time using Quartz/Job scheduling.
The log entry with NOT STARTED is misleading, as it is shown whenever a QuartzScheduler instance is created. It does not mean that the jobs are not running. It is written when the line Scheduler sch = schFactory.getScheduler(); is executed; the scheduler is only started on the next line.
If I take your example and run it on my PC, it works as designed:
import org.quartz.Job;
import org.quartz.JobBuilder;
import org.quartz.JobDetail;
import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;
import org.quartz.Scheduler;
import org.quartz.SchedulerException;
import org.quartz.SchedulerFactory;
import org.quartz.SimpleScheduleBuilder;
import org.quartz.Trigger;
import org.quartz.TriggerBuilder;
import org.quartz.impl.StdSchedulerFactory;

public class Quartz {

    public static void main(String[] args) {
        try {
            JobDetail job = JobBuilder.newJob(MyJob.class).withIdentity("myJob").build();
            Trigger trigger = TriggerBuilder.newTrigger()
                    .withSchedule(SimpleScheduleBuilder.simpleSchedule()
                            .withIntervalInSeconds(Integer.parseInt("10"))
                            .repeatForever())
                    .build();
            SchedulerFactory schFactory = new StdSchedulerFactory();
            Scheduler sch = schFactory.getScheduler();
            sch.start();
            sch.scheduleJob(job, trigger);
        } catch (SchedulerException e) {
            e.printStackTrace();
            System.out.println("Scheduler Error");
        }
    }

    public static class MyJob implements Job {

        @Override
        public void execute(JobExecutionContext jobExecutionContext) throws JobExecutionException {
            System.out.println("running job");
        }
    }
}

How to execute db/jpa operations inside quartz context?

I have this service scheduling a task:
@ApplicationScoped
public class PaymentService {

    @Transactional
    public Payment scheduleNewPayment(Payment payment) throws ParseException, SchedulerException {
        Payment.persist(payment);

        JobDetail job = JobBuilder.newJob(PaymentJob.class)
                .withIdentity(String.format("job%d", payment.id), "payment-job-group")
                .build();

        Date parsed = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse(payment.dueDate);

        SimpleTrigger trigger = (SimpleTrigger) TriggerBuilder.newTrigger()
                .withIdentity(String.format("trigger%d", payment.id), "trigger-group")
                .startAt(parsed)
                .forJob(job)
                .build();

        SchedulerFactory schedulerFactory = new StdSchedulerFactory();
        Scheduler scheduler = schedulerFactory.getScheduler();
        scheduler.scheduleJob(job, trigger);
        scheduler.start();
        return payment;
    }
}
And this job:
@ApplicationScoped
public class PaymentJob implements Job {

    @Override
    public void execute(JobExecutionContext jobExecutionContext) throws JobExecutionException {
        System.out.println(Payment.count());
    }
}
But I cannot perform a DB operation inside the job context (jobExecutionContext.getScheduler().getContext() is null, by the way).
I'm running my app with Quarkus, the Hibernate operation comes from Hibernate Panache, and the scheduler is Quartz.
First of all, you should use the underlying managed Quartz Scheduler instance: @Inject org.quartz.Scheduler (I suppose you're using the quarkus-quartz extension).
The other "problem" is that the default Quartz job factory simply calls new PaymentJob(), so no injection/initialization is performed. Quarkus only uses a custom factory for the jobs generated for methods annotated with @Scheduled. If you don't need injection, simply remove the superfluous @ApplicationScoped from the PaymentJob class.
Finally, you need to activate all the necessary CDI contexts manually. It's very likely that the request context is needed. You can copy the snippet at https://github.com/quarkusio/quarkus/blob/master/extensions/arc/runtime/src/main/java/io/quarkus/arc/runtime/BeanInvoker.java#L14-L24 into your execute() method.
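A rough sketch of what that could look like in the job, assuming the quarkus-quartz extension is on the classpath and Payment is the Panache entity from the question; the activation pattern simply mirrors the BeanInvoker snippet linked above and is not something Quarkus does for you automatically:
import io.quarkus.arc.Arc;
import io.quarkus.arc.ManagedContext;
import org.quartz.Job;
import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;

public class PaymentJob implements Job {

    @Override
    public void execute(JobExecutionContext jobExecutionContext) throws JobExecutionException {
        // Quartz worker threads have no active CDI request context by default,
        // so activate one around the Panache call and terminate it afterwards.
        ManagedContext requestContext = Arc.container().requestContext();
        if (requestContext.isActive()) {
            System.out.println(Payment.count());
        } else {
            try {
                requestContext.activate();
                System.out.println(Payment.count());
            } finally {
                requestContext.terminate();
            }
        }
    }
}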
jobExecutionContext.getScheduler().getContext() is null by the way
This is really odd. What exception/error do you actually get?

shutdown all the quartz jobs as soon as app server is shut down?

I have a web application in which I run some jobs periodically, and for that I am using the Quartz framework. Below is how I start all my jobs:
As soon as the server starts up, it calls the postInit method automatically, which then starts all my jobs, and it works fine:
@PostConstruct
public void postInit() {
    logger.logInfo("Starting all jobs");
    StdSchedulerFactory factory = new StdSchedulerFactory();
    try {
        factory.initialize(App.class.getClassLoader().getResourceAsStream("quartz.properties"));
        Scheduler scheduler = factory.getScheduler();
        // starts all our jobs using quartz_config.xml file
        scheduler.start();
    } catch (SchedulerException ex) {
        logger.logError("error while starting scheduler= ", ExceptionUtils.getStackTrace(ex));
    }
}

@PreDestroy
public void shutdown() {
    logger.logInfo("Shutting down all jobs");
}
Now I want to stop all the running jobs as soon as we shut down the app server. Whenever we shut down the app server, it calls the shutdown method automatically, so I need some way to shut down all the jobs when that method is called. What is the best way to do this?
Below is my "quartz.properties" file. Do I really need "quartz.properties" file since I guess I am using default values anyways I think?
#------------------------- Threads ---------------------------------#
# how many jobs we should run at the same time?
org.quartz.threadPool.threadCount=15
# ----------------------------- Plugins --------------------------- #
# class from where we should load the configuration data for each job and trigger.
org.quartz.plugin.jobInitializer.class=org.quartz.plugins.xml.XMLSchedulingDataProcessorPlugin
org.quartz.plugin.jobInitializer.fileNames = quartz_config.xml
org.quartz.plugin.jobInitializer.failOnFileNotFound = true
org.quartz.jobStore.class = org.quartz.simpl.RAMJobStore
You can use the Scheduler.shutdown() method as shown below. It is also a good idea to externalize the Quartz configuration even if you use the default parameters, as this keeps your code flexible.
private Scheduler scheduler;

@PostConstruct
public void postInit() {
    logger.logInfo("Starting all jobs");
    StdSchedulerFactory factory = new StdSchedulerFactory();
    try {
        factory.initialize(App.class.getClassLoader().getResourceAsStream("quartz.properties"));
        scheduler = factory.getScheduler();
        // starts all our jobs using quartz_config.xml file
        scheduler.start();
    } catch (SchedulerException ex) {
        logger.logError("error while starting scheduler= ", ExceptionUtils.getStackTrace(ex));
    }
}

@PreDestroy
public void shutdown() throws SchedulerException {
    logger.logInfo("Shutting down all jobs");
    scheduler.shutdown();
}
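If you also want currently executing jobs to finish before the application exits, Quartz offers a shutdown(boolean waitForJobsToComplete) overload; a small variation of the @PreDestroy method could look like this:
@PreDestroy
public void shutdown() throws SchedulerException {
    logger.logInfo("Shutting down all jobs");
    // passing true blocks until currently executing jobs have completed
    scheduler.shutdown(true);
}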

Schedule job using cron expression from class field

I tried to find this, but without results. I'd like to have an object holding the path to a bash script and a cron expression specifying when to run it. It's a Spring Boot project. I see it like this:
public class TestScript {

    private String cronExpression;
    private String pathToFile;

    public void execute() {
        // either it's @Scheduled or executed another way
    }
}
Is it possible to do this? Please guide me even a little if you can.
Ok, I managed to make my custom service that dynamically creates jobs:
@Service
public class DynamicJob {

    public void schedule(TestScript testScript) {
        try {
            JobDetail job = JobBuilder.newJob(TestScript.class)
                    .withIdentity(testScript.getName(), "default group")
                    .build();

            Trigger trigger = TriggerBuilder.newTrigger()
                    .withIdentity(testScript.getName().concat(" trigger"), "groupAll")
                    .withSchedule(CronScheduleBuilder.cronSchedule(testScript.getCronExpression()))
                    .build();

            Scheduler scheduler = new StdSchedulerFactory().getScheduler();
            scheduler.start();
            scheduler.scheduleJob(job, trigger);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
The TestScript class implements org.quartz.Job, and I use Quartz library version 2.2.1.
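For completeness, a minimal sketch of what a Job-based TestScript could look like. Note that Quartz instantiates the job class itself for every execution, so instance fields set on the testScript object passed to schedule() are not visible inside execute(); the script path would have to travel through the JobDataMap. The "pathToFile" key and the bash invocation below are assumptions for illustration, not part of the original code:
import org.quartz.Job;
import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;

public class TestScript implements Job {

    private String name;
    private String cronExpression;
    private String pathToFile;

    public String getName() { return name; }
    public String getCronExpression() { return cronExpression; }
    public String getPathToFile() { return pathToFile; }

    @Override
    public void execute(JobExecutionContext context) throws JobExecutionException {
        // Read the script path from the JobDataMap, e.g. populated in DynamicJob by adding
        // .usingJobData("pathToFile", testScript.getPathToFile()) to the JobBuilder chain.
        String path = context.getMergedJobDataMap().getString("pathToFile");
        try {
            // Run the bash script; assumes a Unix-like host with bash on the PATH.
            new ProcessBuilder("bash", path).inheritIO().start().waitFor();
        } catch (Exception e) {
            throw new JobExecutionException(e);
        }
    }
}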

Java quartz suddenly stop firing events

I am using Quartz to schedule a daily batch process. It runs for the first few days, but it has happened that it fires the event for two days or so and then stops firing the job.
The Java version I'm using is:
java version "1.7.0_25"
Quartz version (in POM):
<dependency>
    <groupId>org.quartz-scheduler</groupId>
    <artifactId>quartz</artifactId>
    <version>2.2.1</version>
</dependency>
Here is my code:
Main function for the batch:
public static void main(String[] args) {
    try {
        SimpleDateFormat sd = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        JobDetail job = JobBuilder.newJob(MyJobClass.class).withIdentity("MyJobClass", "group1").build();
        Trigger trigger = TriggerBuilder
                .newTrigger()
                .withIdentity("MyTrigger", "group1")
                .withSchedule(
                        SimpleScheduleBuilder.simpleSchedule()
                                .withIntervalInHours(24).repeatForever())
                .startAt(sd.parse("2015-01-12 07:30:00"))
                .build();
        Scheduler scheduler = new StdSchedulerFactory().getScheduler();
        scheduler.start();
        scheduler.scheduleJob(job, trigger);
        String strLog = "Batch initiated on " + new Date();
        System.out.println(strLog);
        log.info(strLog);
    } catch (Exception e) {
        // log error
    }
}
And here is my execute method in the job:
public void execute(JobExecutionContext arg0) throws JobExecutionException {
    generateBatchProcess();
}

public void generateBatchProcess() {
    try {
        // do lots of interesting stuff, calling MyBatis DAOs, generating Excel files and sending an email
    } catch (Exception e) {
        // log error
    }
}
Does someone have an idea of why this happens? Does the garbage collector have something to do with this?
I'm ashamed to admit it was only a "BadProgrammerException": I had a call to a database connection outside the try...catch block that I hadn't seen before. The problem was that I was not reaching the database (the connection was intermittent), and I couldn't figure it out from the log information. After I found this out, I corrected the database issue and Quartz worked fine.
