I have created a simple scheduled task using Spring Framework's @Scheduled annotation.
@Scheduled(fixedRate = 2000)
public void doSomething() {}
Now I want to stop this task when it is no longer needed.
I know one alternative is to check a conditional flag at the start of this method, but that does not stop the scheduler from invoking the method.
Is there anything Spring provides to stop a @Scheduled task?
Option 1: Using a post processor
Autowire the ScheduledAnnotationBeanPostProcessor and explicitly invoke postProcessBeforeDestruction(Object bean, String beanName) for the bean whose scheduling should be stopped.
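A minimal sketch of that call (MyScheduledBean is hypothetical and stands for whatever bean holds the @Scheduled methods):
@Autowired
private ScheduledAnnotationBeanPostProcessor postProcessor;

@Autowired
private MyScheduledBean myScheduledBean; // hypothetical bean containing @Scheduled methods

public void stopScheduling() {
    // cancels all scheduled tasks registered for this bean instance
    postProcessor.postProcessBeforeDestruction(myScheduledBean, "myScheduledBean");
}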
Option 2: Maintaining a map of target beans to their Futures
private final Map<Object, ScheduledFuture<?>> scheduledTasks =
new IdentityHashMap<>();
@Scheduled(fixedRate = 2000)
public void fixedRateJob() {
System.out.println("Something to be done every 2 secs");
}
@Bean
public TaskScheduler poolScheduler() {
return new CustomTaskScheduler();
}
class CustomTaskScheduler extends ThreadPoolTaskScheduler {
@Override
public ScheduledFuture<?> scheduleAtFixedRate(Runnable task, long period) {
ScheduledFuture<?> future = super.scheduleAtFixedRate(task, period);
ScheduledMethodRunnable runnable = (ScheduledMethodRunnable) task;
scheduledTasks.put(runnable.getTarget(), future);
return future;
}
@Override
public ScheduledFuture<?> scheduleAtFixedRate(Runnable task, Date startTime, long period) {
ScheduledFuture<?> future = super.scheduleAtFixedRate(task, startTime, period);
ScheduledMethodRunnable runnable = (ScheduledMethodRunnable) task;
scheduledTasks.put(runnable.getTarget(), future);
return future;
}
}
When the scheduling for a bean has to be stopped, you can look up the corresponding Future in the map and explicitly cancel it, for example:
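A sketch of such a cancellation (assuming the scheduledTasks map above is reachable from wherever you trigger the stop):
public void cancelScheduledTasksFor(Object targetBean) {
    ScheduledFuture<?> future = scheduledTasks.get(targetBean);
    if (future != null) {
        // false lets a currently running execution finish; true would interrupt it
        future.cancel(false);
    }
}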
Here is an example where we can stop, start, and also list all the scheduled running tasks:
@RestController
@RequestMapping("/test")
public class TestController {
private static final String SCHEDULED_TASKS = "scheduledTasks";
@Autowired
private ScheduledAnnotationBeanPostProcessor postProcessor;
@Autowired
private ScheduledTasks scheduledTasks;
@Autowired
private ObjectMapper objectMapper;
#GetMapping(value = "/stopScheduler")
public String stopSchedule(){
postProcessor.postProcessBeforeDestruction(scheduledTasks, SCHEDULED_TASKS);
return "OK";
}
#GetMapping(value = "/startScheduler")
public String startSchedule(){
postProcessor.postProcessAfterInitialization(scheduledTasks, SCHEDULED_TASKS);
return "OK";
}
#GetMapping(value = "/listScheduler")
public String listSchedules() throws JsonProcessingException{
Set<ScheduledTask> setTasks = postProcessor.getScheduledTasks();
if(!setTasks.isEmpty()){
return objectMapper.writeValueAsString(setTasks);
}else{
return "No running tasks !";
}
}
}
Some time ago I had a requirement in my project that any component should be able to create a new scheduled task or stop the scheduler (all tasks), so I did something like this:
@Configuration
@EnableScheduling
@ComponentScan
@Component
public class CentralScheduler {
private static AnnotationConfigApplicationContext CONTEXT = null;
@Autowired
private ThreadPoolTaskScheduler scheduler;
public static CentralScheduler getInstance() {
if (!isValidBean()) {
CONTEXT = new AnnotationConfigApplicationContext(CentralScheduler.class);
}
return CONTEXT.getBean(CentralScheduler.class);
}
@Bean
public ThreadPoolTaskScheduler taskScheduler() {
return new ThreadPoolTaskScheduler();
}
public void start(Runnable task, String scheduleExpression) throws Exception {
scheduler.schedule(task, new CronTrigger(scheduleExpression));
}
public void start(Runnable task, Long delay) throws Exception {
scheduler.scheduleWithFixedDelay(task, delay);
}
public void stopAll() {
scheduler.shutdown();
CONTEXT.close();
}
private static boolean isValidBean() {
if (CONTEXT == null || !CONTEXT.isActive()) {
return false;
}
try {
CONTEXT.getBean(CentralScheduler.class);
} catch (NoSuchBeanDefinitionException ex) {
return false;
}
return true;
}
}
So I can do things like
Runnable task = new MyTask();
CentralScheduler.getInstance().start(task, 30_000L);
CentralScheduler.getInstance().stopAll();
Bear in mind that, for reasons specific to my case, I did not have to worry about concurrency here. Otherwise some synchronization would be needed.
A working example implementation of @Mahesh's Option 1, using ScheduledAnnotationBeanPostProcessor.postProcessBeforeDestruction(bean, beanName).
import org.springframework.beans.factory.BeanNameAware;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.scheduling.annotation.ScheduledAnnotationBeanPostProcessor;
public class ScheduledTaskExample implements ApplicationContextAware, BeanNameAware
{
private ApplicationContext applicationContext;
private String beanName;
private volatile boolean stopScheduledTaskCondition = false; // flag added so the snippet compiles; set it when the task should stop
@Scheduled(fixedDelay = 1000)
public void someTask()
{
/* Do stuff */
if (stopScheduledTaskCondition)
{
stopScheduledTask();
}
}
private void stopScheduledTask()
{
ScheduledAnnotationBeanPostProcessor bean = applicationContext.getBean(ScheduledAnnotationBeanPostProcessor.class);
bean.postProcessBeforeDestruction(this, beanName);
}
@Override
public void setApplicationContext(ApplicationContext applicationContext)
{
this.applicationContext = applicationContext;
}
@Override
public void setBeanName(String beanName)
{
this.beanName = beanName;
}
}
There is a bit of ambiguity in this question:
When you say "stop this task", did you mean to stop it in such a way that it is later recoverable? If yes, programmatically, using a condition that arises within the same app, or an external condition?
Are you running any other tasks in the same context? (If not, you could shut down the entire app rather than a single task; the actuator shutdown endpoint can be used in this scenario, see the note after these questions.)
My best guess is that you are looking to shut down a task using a condition that may arise within the same app, in a recoverable fashion. I will try to answer based on this assumption.
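(As a side note on the actuator option: in Spring Boot 2.x the shutdown endpoint is disabled by default; a minimal sketch of the properties to enable and expose it, so it can be invoked with POST /actuator/shutdown, would be:)
management.endpoint.shutdown.enabled=true
management.endpoints.web.exposure.include=health,shutdown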
This is the simplest possible solution I can think of, although I have made one improvement: an early return rather than nested ifs.
@Component
public class SomeScheduledJob implements Job {
private static final Logger LOGGER = LoggerFactory.getLogger(SomeScheduledJob.class);
#Value("${jobs.mediafiles.imagesPurgeJob.enable}")
private boolean imagesPurgeJobEnable;
@Override
@Scheduled(cron = "${jobs.mediafiles.imagesPurgeJob.schedule}")
public void execute() {
if(!imagesPurgeJobEnable){
return;
}
// Do your conditional job here...
}
}
Properties for the above code
jobs.mediafiles.imagesPurgeJob.enable=true or false
jobs.mediafiles.imagesPurgeJob.schedule=0 0 0/12 * * ?
Another approach that I have not seen here yet: simple, clear, and thread safe.
In your configuration class, add the annotation:
@EnableScheduling
Then, in the class where you need to start/stop the scheduled task (this and the following steps), inject:
@Autowired TaskScheduler taskScheduler;
Set fields:
private ScheduledFuture<?> yourTaskState;
private long fixedRate = 1000L;
Create an inner class that executes the scheduled task, e.g.:
class ScheduledTaskExecutor implements Runnable{
@Override
public void run() {
// task to be executed
}
}
Add start() method:
public void start(){
yourTaskState = taskScheduler.scheduleAtFixedRate(new ScheduledTaskExecutor(), fixedRate);
}
Add stop() method:
public void stop(){
yourTaskState.cancel(false);
}
TaskScheduler provides other common ways of scheduling, such as cron or delay. ScheduledFuture also provides isCancelled().
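Note that a TaskScheduler bean must be available for the @Autowired injection above; if your context does not already provide one (Spring Boot may auto-configure one), a minimal sketch of such a bean would be:
@Bean
public TaskScheduler taskScheduler() {
    ThreadPoolTaskScheduler scheduler = new ThreadPoolTaskScheduler();
    scheduler.setPoolSize(2); // pool size chosen arbitrarily for illustration
    return scheduler;
}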
Minimalist answer:
@Mahesh's option 1, expanded here in minimal form for convenience, will irreversibly cancel all scheduled tasks on this bean:
@Autowired
private ScheduledAnnotationBeanPostProcessor postProcessor;
@Scheduled(fixedRate = 2000)
public void doSomething() {}
public void stopThis() {
postProcessor.postProcessBeforeDestruction(this, "");
}
Alternatively, this will irreversibly cancel all tasks on all beans:
@Autowired
private ThreadPoolTaskScheduler scheduler;
@Scheduled(fixedRate = 2000)
public void doSomething() {}
public void stopAll() {
scheduler.shutdown();
}
Thanks, all the previous responders, for solving this one for me.
@Scheduled
When Spring processes @Scheduled, it iterates over every method carrying this annotation and organizes the tasks by bean, as the following source shows:
private final Map<Object, Set<ScheduledTask>> scheduledTasks =
new IdentityHashMap<Object, Set<ScheduledTask>>(16);
Cancel
If you just want to cancel a repeated scheduled task, you can do the following (there is a runnable demo in my repo):
@Autowired
private ScheduledAnnotationBeanPostProcessor postProcessor;
@Autowired
private TestSchedule testSchedule;
public void later() {
postProcessor.postProcessBeforeDestruction(testSchedule, "testSchedule");
}
Notice
It will find this bean's ScheduledTasks and cancel them one by one. Note that it will also stop the currently running method (as the postProcessBeforeDestruction source shows).
synchronized (this.scheduledTasks) {
tasks = this.scheduledTasks.remove(bean); // remove from future running
}
if (tasks != null) {
for (ScheduledTask task : tasks) {
task.cancel(); // cancel current running method
}
}
Define a custom annotation like below.
@Documented
@Retention(RUNTIME)
@Target(ElementType.TYPE)
public @interface ScheduledSwitch {
// do nothing
}
Define a class that extends org.springframework.scheduling.annotation.ScheduledAnnotationBeanPostProcessor.
public class ScheduledAnnotationBeanPostProcessorCustom
extends ScheduledAnnotationBeanPostProcessor {
@Value(value = "${prevent.scheduled.tasks:false}")
private boolean preventScheduledTasks;
private Map<Object, String> beans = new HashMap<>();
private final ReentrantLock lock = new ReentrantLock(true);
@Override
public Object postProcessAfterInitialization(Object bean, String beanName) {
ScheduledSwitch scheduledSwitch = AopProxyUtils.ultimateTargetClass(bean)
.getAnnotation(ScheduledSwitch.class);
if (null != scheduledSwitch) {
beans.put(bean, beanName);
if (preventScheduledTasks) {
return bean;
}
}
return super.postProcessAfterInitialization(bean, beanName);
}
public void stop() {
lock.lock();
try {
for (Map.Entry<Object, String> entry : beans.entrySet()) {
postProcessBeforeDestruction(entry.getKey(), entry.getValue());
}
} finally {
lock.unlock();
}
}
public void start() {
lock.lock();
try {
for (Map.Entry<Object, String> entry : beans.entrySet()) {
if (!requiresDestruction(entry.getKey())) {
super.postProcessAfterInitialization(
entry.getKey(), entry.getValue());
}
}
} finally {
lock.unlock();
}
}
}
Replace the ScheduledAnnotationBeanPostProcessor bean with the custom bean in your configuration.
@Configuration
public class ScheduledConfig {
@Bean(name = TaskManagementConfigUtils.SCHEDULED_ANNOTATION_PROCESSOR_BEAN_NAME)
@Role(BeanDefinition.ROLE_INFRASTRUCTURE)
public ScheduledAnnotationBeanPostProcessor scheduledAnnotationBeanPostProcessor() {
return new ScheduledAnnotationBeanPostProcessorCustom();
}
}
Add the @ScheduledSwitch annotation to the beans whose @Scheduled tasks you want to prevent or stop, for example:
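A hypothetical bean using the switch (for illustration only):
@ScheduledSwitch
@Component
public class SwitchableJob {

    @Scheduled(fixedRate = 2000)
    public void doSomething() {
        // runs only while scheduling for this bean has not been stopped
    }
}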
Using @Conditional lets you check a value from a condition method: if it is true, the scheduler runs; otherwise it does not.
First, create your condition class, implementing the Condition interface and its matches method:
public class MyCondition implements Condition {
@Override
public boolean matches(ConditionContext context, AnnotatedTypeMetadata metadata) {
// here implement your condition using if-else or checking another object
// or call another method that can return boolean value
//return boolean value : true or false
return true;
}
}
Then, go back to your configuration or service class where you have the @Scheduled method:
@Service
@Conditional(value = MyCondition.class)
// this Service will only run if the condition is true
public class scheduledTask {
// the @Scheduled method should be void
@Scheduled(fixedRate = 5000)
public void task(){
System.out.println(" This is scheduled task started....");
}
}
This definitely worked for me.
import com.google.common.collect.Maps;
import org.redisson.liveobject.misc.ClassUtils;
import org.springframework.core.annotation.AnnotationUtils;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.scheduling.annotation.ScheduledAnnotationBeanPostProcessor;
import org.springframework.scheduling.config.CronTask;
import org.springframework.scheduling.config.ScheduledTask;
import org.springframework.stereotype.Component;
import java.lang.reflect.Method;
import java.util.Map;
import java.util.Set;
import static java.util.Collections.emptySet;
/**
* @author uhfun
*/
@Component
public class ConfigurableScheduler {
private final InnerScheduledAnnotationBeanPostProcessor postProcessor;
public ConfigurableScheduler(InnerScheduledAnnotationBeanPostProcessor postProcessor) {
this.postProcessor = postProcessor;
}
public void registerScheduleTask(String cron, Method method, Object target) {
Map<String, Object> attributes = Maps.newHashMap();
attributes.put("cron", cron);
Scheduled scheduled = AnnotationUtils.synthesizeAnnotation(attributes, Scheduled.class, null);
postProcessor.registerScheduleTask(scheduled, method, target);
}
public void unregister(String cron, Object target) {
postProcessor.unregister(target, cron);
}
@Component
public static class InnerScheduledAnnotationBeanPostProcessor extends ScheduledAnnotationBeanPostProcessor {
private final Map<Object, Set<ScheduledTask>> scheduledTasksMap;
public InnerScheduledAnnotationBeanPostProcessor() {
scheduledTasksMap = ClassUtils.getField(this, "scheduledTasks");
}
public void registerScheduleTask(Scheduled scheduled, Method method, Object bean) {
super.processScheduled(scheduled, method, bean);
}
public void unregister(Object bean, String cron) {
synchronized (scheduledTasksMap) {
Set<ScheduledTask> tasks = getScheduledTasks();
for (ScheduledTask task : tasks) {
if (task.getTask() instanceof CronTask
&& ((CronTask) task.getTask()).getExpression().equals(cron)) {
task.cancel();
scheduledTasksMap.getOrDefault(bean, emptySet()).remove(task);
}
}
}
}
}
}
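A hypothetical usage sketch (MyJob and its runJob() method are made up for illustration, and the cron expression is arbitrary):
@Autowired
private ConfigurableScheduler configurableScheduler;

public void scheduleDynamically(MyJob myJob) throws NoSuchMethodException {
    Method method = MyJob.class.getMethod("runJob");
    configurableScheduler.registerScheduleTask("0 0/5 * * * ?", method, myJob);
}

public void cancelDynamically(MyJob myJob) {
    configurableScheduler.unregister("0 0/5 * * * ?", myJob);
}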
How about using System.exit(1)?
It is simple and works.
Related
I want to clear the thread context after execution of functions annotated with @Scheduled in Spring Boot.
Usage
@Scheduled(fixedDelayString = "10000")
public void doSomething() {
}
Config for scheduled thread pool
@Bean(destroyMethod = "shutdownNow")
public ScheduledExecutorService scheduledExecutorService() {
return Executors.newScheduledThreadPool(5);
}
I have created a simple decorator to solve this:
package com.demo.decorator;
import com.demo.utils.GeneralUtils;
import org.springframework.core.task.TaskDecorator;
public class ThreadContextDecorator implements TaskDecorator {
@Override
public Runnable decorate(Runnable runnable) {
return () -> {
try {
runnable.run();
} finally {
GeneralUtils.clearContext();
}
};
}
}
I am not sure how to plug it into the ScheduledExecutorService bean, though.
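One workaround (just a sketch; a plain ScheduledExecutorService has no decorator hook) is to apply the decorator manually to each Runnable before scheduling it:
@Autowired
private ScheduledExecutorService scheduledExecutorService;

public void scheduleWithContextCleanup(Runnable task) {
    Runnable decorated = new ThreadContextDecorator().decorate(task);
    // 10-second fixed delay, mirroring the fixedDelayString above
    scheduledExecutorService.scheduleWithFixedDelay(decorated, 0, 10, TimeUnit.SECONDS);
}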
Requirements
I need to be able to trigger a (long running) job via a POST call and return immediately.
Only one thread can run the job at one time.
The job being an expensive one, I want all future triggers of this job to not do anything if one job is already in progress.
Code
@RestController
public class SomeTask {
private SomeService someService;
@Autowired
public SomeTask(SomeService someService) {
this.someService = someService;
}
@Async // requirement 1
@RequestMapping(method = RequestMethod.POST, path = "/triggerJob")
public void triggerJob() {
expensiveLongRunningJob();
}
/**
* Synchronized in order to restrict multiple invocations. // requirement 2
*
*/
private synchronized void expensiveLongRunningJob() {
someService.executedJob();
}
}
Question
With the above code requirements 1 and 2 are satisfied. What is the best way to satisfy requirement 3 as well (have the new thread, created as a result of a POST call, skip the synchronised method and return immediately on failure to acquire a lock)?
Synchronization isn't the right tool for the job. You can do it like this:
@RestController
public class SomeTask {
private SomeService someService;
private final AtomicBoolean isTriggered = new AtomicBoolean();
@Autowired
public SomeTask(SomeService someService) {
this.someService = someService;
}
@Async // requirement 1
@RequestMapping(method = RequestMethod.POST, path = "/triggerJob")
public void triggerJob() {
if (!isTriggered.getAndSet(true)) {
try {
expensiveLongRunningJob();
} finally {
isTriggered.set(false);
}
}
}
/**
* only runs once at a time, in the thread that sets isTriggered to true
*/
private void expensiveLongRunningJob() {
someService.executedJob();
}
}
For requirement 1, if you want to use just @Async, you should have it on the service method rather than the controller method. But be aware that by making it async you lose control over the job, and failure handling will not be possible unless you use @Async with a Future and handle failures by implementing the AsyncUncaughtExceptionHandler interface.
For requirement 3, you can have a volatile boolean field in the service which is set just before the job process begins and unset after it completes. In your controller method, you can check the service's volatile boolean field to decide whether the job is already being executed and simply return with an appropriate message if it is in progress. Also, make sure to unset the boolean field while handling the failure in the implementation of the AsyncUncaughtExceptionHandler interface.
Service:
@Service
public class SomeService {
public volatile boolean isJobInProgress = false;
@Async
public Future<String> executeJob() {
isJobInProgress = true;
//Job processing logic
isJobInProgress = false;
return new AsyncResult<>("done"); // return added so the Future<String> signature compiles
}
}
Controller:
@RestController
public class SomeTask {
@Autowired
private SomeService someService;
@RequestMapping(method = RequestMethod.POST, path = "/triggerJob")
public void triggerJob() {
if (!someService.isJobInProgress){
someService.executeJob(); //can have this in a sync block to be on the safer side.
} else {
return;
}
}
}
Implementation of AsyncUncaughtExceptionHandler:
public class CustomAsyncExceptionHandler implements AsyncUncaughtExceptionHandler {
@Autowired
private SomeService someService;
@Override
public void handleUncaughtException(
Throwable throwable, Method method, Object... obj) {
//Handle failure
if (someService.isJobInProgress){
someService.isJobInProgress = false;
}
}
}
@Async configuration:
@Configuration
@EnableAsync
public class SpringAsyncConfig implements AsyncConfigurer {
@Override
public Executor getAsyncExecutor() {
return new ThreadPoolTaskExecutor();
}
@Override
public AsyncUncaughtExceptionHandler getAsyncUncaughtExceptionHandler() {
return new CustomAsyncExceptionHandler();
}
}
I have created a simple service like ... where I have to take some data from the database at a later point.
package com.spring.scheduler.example.springscheduler;
import org.springframework.stereotype.Service;
@Service
public class ExampleService {
private String serviceName;
private String repository;
public String getServiceName() {
return serviceName;
}
public void setServiceName(String serviceName) {
this.serviceName = serviceName;
}
public String getRepository() {
return repository;
}
public void setRepository(String repository) {
this.repository = repository;
}
}
Here is my task scheduler created to run different threads.
@Component
public class TaskSchedulerService {
@Autowired
ThreadPoolTaskScheduler threadPoolTaskScheduler;
public ScheduledFuture<?> job1;
public ScheduledFuture<?> job2;
@Autowired
ApplicationContext applicationContext;
@PostConstruct
public void job1() {
//NewDataCollectionThread thread1 = new NewDataCollectionThread();
NewDataCollectionThread thread1 = applicationContext.getBean(NewDataCollectionThread.class);
AutowireCapableBeanFactory factory = applicationContext.getAutowireCapableBeanFactory();
factory.autowireBean(thread1);
factory.initializeBean(thread1, null);
job1 = threadPoolTaskScheduler.scheduleAtFixedRate(thread1, 1000);
}
}
This is the thread the scheduler tries to call. I tried to create the service instance forcibly by using the application context, but it is not created.
@Configurable
@Scope("prototype")
public class NewDataCollectionThread implements Runnable {
private static final Logger LOGGER =
LoggerFactory.getLogger(NewDataCollectionThread.class);
@Autowired
private ExampleService exampleService;
@Override
public void run() {
LOGGER.info("Called from thread : NewDataCollectionThread");
System.out.println(Thread.currentThread().getName() + " The Task1 executed at " + new Date());
try {
exampleService.setRepository("sdasdasd");
System.out.println("Service Name :: " +
exampleService.getServiceName());
Thread.sleep(10000);
} catch (InterruptedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
Kindly suggest what are the possible ways to achieve it.
Try something like this:
@Autowired
private AutowireCapableBeanFactory beanFactory;
@PostConstruct
public void job1() {
NewDataCollectionThread thread1 = new NewDataCollectionThread();
beanFactory.autowireBean(thread1);
job1 = threadPoolTaskScheduler.scheduleAtFixedRate(thread1, 1000);
}
In NewDataCollectionThread I injected the Example service successfully.
I'm trying to figure out why my scheduled jobs are not executed in parallel. Maybe there is something wrong with my transaction management? The method JobScheduledExecutionService.execute() is annotated with @Scheduled(fixedRate = 250), so it should fire every 250 ms regardless of whether the previous job has finished. According to the logs, it is not working as expected.
Logs: https://pastebin.com/M6FaXpeE
My code is below.
@Service
@Slf4j
public class JobExecutionService {
private final TransactionalJobExecutionService transactionalJobExecutionService;
@Autowired
public JobExecutionService(TransactionalJobExecutionService transactionalJobExecutionService) {
this.transactionalJobExecutionService = transactionalJobExecutionService;
}
public void execute() {
TestJob job = transactionalJobExecutionService.getJob();
executeJob(job);
transactionalJobExecutionService.finishJob(job);
}
private void executeJob(TestJob testJob) {
log.debug("Execution-0: {}", testJob.toString());
Random random = new Random();
try {
Thread.sleep(random.nextInt(3000) + 200);
} catch (InterruptedException e) {
log.error("Error", e);
}
log.debug("Execution-1: {}", testJob.toString());
}
}
@Service
@Slf4j
public class JobScheduledExecutionService {
private final JobExecutionService jobExecutionService;
@Autowired
public JobScheduledExecutionService(JobExecutionService jobExecutionService) {
this.jobExecutionService = jobExecutionService;
}
@Scheduled(fixedRate = 250)
public void execute() {
log.trace("Job fired");
jobExecutionService.execute();
}
}
@Service
@Slf4j
@Transactional
public class TransactionalJobExecutionService {
private final Environment environment;
private final TestJobRepository testJobRepository;
private final TestJobResultRepository testJobResultRepository;
@Autowired
public TransactionalJobExecutionService(Environment environment, TestJobRepository testJobRepository, TestJobResultRepository testJobResultRepository) {
this.environment = environment;
this.testJobRepository = testJobRepository;
this.testJobResultRepository = testJobResultRepository;
}
public TestJob getJob() {
TestJob testJob = testJobRepository.findFirstByStatusOrderByIdAsc(
0
);
testJob.setStatus(1);
testJobRepository.save(testJob);
return testJob;
}
public void finishJob(TestJob testJob) {
testJobResultRepository.save(
new TestJobResult(
null,
testJob.getId(),
environment.getProperty("local.server.port")
)
);
}
}
@Configuration
public class SchedulingConfigurerConfiguration implements SchedulingConfigurer {
@Override
public void configureTasks(ScheduledTaskRegistrar taskRegistrar) {
ThreadPoolTaskScheduler taskScheduler = new ThreadPoolTaskScheduler();
taskScheduler.setPoolSize(32);
taskScheduler.initialize();
taskRegistrar.setTaskScheduler(taskScheduler);
}
}
The reason is that the scheduler fires only one event at a time, which is executed by a single thread, and I don't see you spawning multiple threads in your logic for parallel execution. The call to jobExecutionService.execute() in execute() of JobScheduledExecutionService runs in that one thread, so overall it ends up being sequential execution.
It seems you need to put multi-threaded (Callable/Future based) logic in JobExecutionService.execute() to pick the job (transactionalJobExecutionService.getJob()) and call executeJob() inside it, as sketched below. Hope this helps.
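A rough sketch of that idea, reusing the names from the code above (the pool size is an arbitrary example, and this is only one way to hand the work off):
// inside JobExecutionService
private final ExecutorService workerPool = Executors.newFixedThreadPool(8);

public void execute() {
    // hand the actual work off to a worker thread so the scheduler thread returns immediately
    workerPool.submit(() -> {
        TestJob job = transactionalJobExecutionService.getJob();
        executeJob(job);
        transactionalJobExecutionService.finishJob(job);
    });
}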
I'm writing a Spring Boot application to monitor a directory and process files that are added to it. I start a thread by creating an ApplicationRunner in my Application class that calls a method annotated with @Async:
@SpringBootApplication
@EnableAsync
public class Application {
@Autowired
private DirectoryMonitorService directoryMonitorService;
public static void main(String[] args) {
SpringApplication.run(Application.class, args);
}
@Bean
public ApplicationRunner startDirectoryMonitorService() {
return args -> directoryMonitorService.monitorSourceDirectoty();
}
}
Here is the code for DirectoryMonitorService that has a method annotated with @Async:
@Service
public class DirectoryMonitorService {
private static final Logger logger = LogManager.getLogger(DirectoryMonitorService.class);
#Value("${timeout}")
private long timeout;
@Autowired
private WatchService watchService;
@Async
public void monitorSourceDirectoty() {
while (true) {
WatchKey watchKey;
try {
watchKey = watchService.poll(timeout, TimeUnit.SECONDS);
} catch (ClosedWatchServiceException | InterruptedException e) {
logger.error("Exception occured while polling from source file", e);
return;
}
// process the WatchEvents
if (!watchKey.reset()) {
break;
}
}
}
}
Finally here is where I create the ThreadPoolTaskExecutor:
public class AsyncConfig extends AsyncConfigurerSupport {
private static final Logger logger = LogManager.getLogger(AsyncConfig.class);
private static final String THREAD_NAME_PREFIX = "Parser-";
#Value("${corePoolSize}")
public int corePoolSize;
#Value("${maxPoolSize}")
public int maxPoolSize;
#Value("${queueCapacity}")
public int queueCapacity;
@Override
public Executor getAsyncExecutor() {
ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
executor.setCorePoolSize(corePoolSize);
executor.setMaxPoolSize(maxPoolSize);
executor.setQueueCapacity(queueCapacity);
executor.setThreadNamePrefix(THREAD_NAME_PREFIX);
executor.initialize();
return executor;
}
@Override
public AsyncUncaughtExceptionHandler getAsyncUncaughtExceptionHandler() {
return (Throwable ex, Method method, Object... params) -> {
logger.error("Exception message - " + ex.getMessage());
logger.error("Method name - " + method.getName());
for (Object param : params) {
logger.error("Parameter value - " + param);
}
};
}
}
Somehow I feel this is not the most elegant way of starting the main thread. Does anybody have a better solution?
Also, I would rather replace while (true) with a boolean variable that I can set to false when Spring Boot shuts down. Does anybody know which interface I need to implement for this?
This is correct if you want a very simple implementation and nothing more reliable.
Use @Async for shorter tasks; it has very limited capability in terms of restarts etc.
Also, @Async will keep creating separate threads at every watch sequence activation, which can overwhelm the thread pool and start throwing exceptions. This is quite noticeable if you have a long-running task at
// process the WatchEvents
Other than that, your implementation is correct (in my opinion).
Some suggestions (if you want to make things more interesting/complex):
You could keep track of the files using some sort of persistence mechanism and trigger a decoupled batch (Spring Batch can be used) to handle the execution. You could then surface those batches in a separate UI, where each batch process can be stopped, started, or resumed.