Multiple scheduler threads submitting tasks causes RejectedExecutionException - Java

In my application, I have multiple scheduler threads that create tasks. For example, each scheduler thread can create a bunch of tasks:
TaskCreator taskCreator;
for (Report report : reports) {
    taskCreator.createTask(report);
}
The scheduler threads can run concurrently as you can see from the logs:
15:57:20.107 INFO [ scheduler-4] c.task.ReportExportSchedulerTask : Task created
15:57:20.107 INFO [ scheduler-2] c.task.ReportExportSchedulerTask : Task created
I have a TaskCreator component, shown below, that passes the task to executeJob():
@Component
public class TaskCreator {

    @Autowired
    private SftpTaskExecutor sftpTaskExecutor;

    @Autowired
    SftpConfig sftpConfig;

    @Autowired
    private SFTPConnectionManager connectionManager;

    public void createTask(Report report) {
        sftpTaskExecutor.executeJob(new JobProcessorTask(...));
    }

    public void validateTasksExecution() {
        sftpTaskExecutor.getExecutorService().shutdown();
        while (!sftpTaskExecutor.getExecutorService().isTerminated()) ;
        connectionManager.disconnect();
    }
}
The SftpTaskExecutor component below constructs the ExecutorService to which the tasks above are submitted:
@Component
public class SftpTaskExecutor {

    private ExecutorService executorService = Executors.newSingleThreadExecutor();

    public void executeJob(JobProcessorTask jobProcessorTask) {
        executorService.execute(jobProcessorTask);
    }

    public ExecutorService getExecutorService() {
        return executorService;
    }
}
My question is: if two or more scheduler threads create tasks and submit them to the executor service concurrently, the above throws a RejectedExecutionException and one scheduler's task never finishes (i.e. the file is not sent).
For each scheduler thread, I need to be able to call validateTasksExecution() without interfering with the other scheduler threads; in other words, it must not disconnect while another scheduler is still processing.
Am I using the ExecutorService correctly in this regard? How can I change the above to be thread safe?

My question is: if two or more scheduler threads create tasks and submit them to the executor service concurrently, the above throws a RejectedExecutionException and one scheduler's task never finishes (i.e. the file is not sent)
Let's take a look at the javadocs for ExecutorService.execute(...)
RejectedExecutionException - if this task cannot be accepted for execution.
In looking at the ThreadPoolExecutor (and associated) code, the jobs get rejected for 2 reasons:
The queue for the jobs is full (this doesn't apply to you because the queues are by default unbounded)
The executor service is no longer running (ding ding ding)
I believe that your executor service has been shut down, most likely because the first of your threads called validateTasksExecution() before the second thread called executeJob(...). Your code is incorrect if you are trying to reuse that thread pool. The fact that you are also closing the connectionManager makes me wonder whether you want to reuse the SftpTaskExecutor at all.
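To make that failure mode concrete, here is a minimal standalone sketch (not the asker's code) showing that submitting to a shut-down executor throws the exception:
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class RejectionDemo {
    public static void main(String[] args) {
        ExecutorService executor = Executors.newSingleThreadExecutor();
        executor.execute(() -> System.out.println("first task runs"));

        // Simulates thread A calling validateTasksExecution() ...
        executor.shutdown();

        // ... while thread B still tries to submit: the pool is no longer
        // running, so this call throws RejectedExecutionException.
        executor.execute(() -> System.out.println("never runs"));
    }
}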
If you want each thread to see whether its own operation is done while the thread pool stays running, then you need to save the Future(s) returned by the ExecutorService.submit(...) method and call get() on them. That will tell you when the jobs are done.
Something like:
public Future<Void> createTask(Report report) {
    return sftpTaskExecutor.executeJob(new JobProcessorTask(...));
}

public void validateTasksExecution(Future<Void> future) {
    // there are some exceptions here you need to handle
    future.get();
}

public void shutdown() {
    sftpTaskExecutor.shutdown();
    connectionManager.disconnect();
}
...
public Future<Void> executeJob(JobProcessorTask jobProcessorTask) {
    return executorService.submit(jobProcessorTask);
}
If you need to monitor multiple jobs then you should store them in a collection and call get() on them serially although the jobs will be running in parallel.
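As a rough sketch of that idea (assuming the Future-returning createTask(...) from above and the Report/TaskCreator types from the question):
void runAndWaitForMyReports(List<Report> reports, TaskCreator taskCreator) throws InterruptedException {
    List<Future<Void>> futures = new ArrayList<>();
    for (Report report : reports) {
        futures.add(taskCreator.createTask(report));
    }
    // get() blocks until each of *this* scheduler's jobs is done; the pool keeps
    // running, so other scheduler threads are not affected.
    for (Future<Void> future : futures) {
        try {
            future.get();
        } catch (ExecutionException e) {
            // the task itself failed, e.g. the file was not sent
            e.getCause().printStackTrace();
        }
    }
}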
The alternative would be for you to have a separate ExecutorService for each transaction, which is wasteful but maybe not so bad considering that it is managing SFTP calls.
while (!sftpTaskExecutor.getExecutorService().isTerminated()) ;
Yeah you don't want to spin like that. See awaitTermination(...) javadocs.
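A sketch of an orderly shutdown without the busy-wait (call this only once you know no scheduler will submit any more jobs; the timeout is an arbitrary example):
ExecutorService executorService = sftpTaskExecutor.getExecutorService();
executorService.shutdown();
try {
    if (!executorService.awaitTermination(5, TimeUnit.MINUTES)) {
        executorService.shutdownNow();   // give up and cancel whatever is still queued
    }
} catch (InterruptedException e) {
    executorService.shutdownNow();
    Thread.currentThread().interrupt();
}
connectionManager.disconnect();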

Related

ThreadPoolTaskExecutor with just one thread on pool not processing messages from AWS queue

I've created an on-demand ChannelAdapter, AsyncTaskExecutor and a Channel for every queue registered in the application. I noticed that when the maxPoolSize of the AsyncTaskExecutor is equal to one, the messages are not being processed. This is how the AsyncTaskExecutor bean is created:
static void registerAsyncTaskExecutor(final Consumer consumer, final GenericApplicationContext registry) {
    final TaskExecutor executor = consumer.getExecutor();
    final BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(ThreadPoolTaskExecutor.class);
    builder.addPropertyValue("corePoolSize", executor.getCorePoolSize());
    builder.addPropertyValue("maxPoolSize", executor.getMaxPoolSize());
    builder.addPropertyValue("threadNamePrefix", consumer.getName() + "-");
    final String beanName = executor.getName();
    final BeanDefinition beanDefinition = builder.getBeanDefinition();
    registry.registerBeanDefinition(beanName, beanDefinition);
}
Another thing I noticed is that when java.util.concurrent.ThreadPoolExecutor#execute is called, the condition workerCountOf(c) < corePoolSize is always false.
The full project is here: https://github.com/LeoFuso/spring-integration-aws-demo
It is always bad practice to provide a thread pool with just one thread to some manageable component. You don't know what that component is going to do with your thread pool: it may well be that your single thread is taken by some long-living task internally, and all new tasks just stall in the queue waiting for that single thread to become free, which might never happen.
In fact, that is exactly what happens with the AsynchronousMessageListener from Spring Cloud AWS, which is used by the mentioned SqsMessageDrivenChannelAdapter:
public void run() {
    while (isQueueRunning()) {
So, either rely on the default executor or provide enough threads in your own.
Looks like the logic over there is like this for the number of threads:
int spinningThreads = this.getRegisteredQueues().size();
if (spinningThreads > 0) {
    threadPoolTaskExecutor
            .setCorePoolSize(spinningThreads * DEFAULT_WORKER_THREADS);
So we get exactly as many threads as we have SQS queues, times a worker multiplier of 2. It looks like we need one thread per queue for polling plus an extra thread to process the messages from it.
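If you do keep a hand-rolled executor, a minimal sketch of sizing it along the same lines (the numbers here are illustrative, not taken from the linked project):
int registeredQueues = 3;        // example value; in the real app this comes from the consumer config
int workerMultiplier = 2;        // the DEFAULT_WORKER_THREADS multiplier discussed above
ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
executor.setCorePoolSize(registeredQueues * workerMultiplier);
executor.setMaxPoolSize(registeredQueues * workerMultiplier);
executor.setThreadNamePrefix("sqs-consumer-");
executor.initialize();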
(Not Spring Integration question though - more like Spring Cloud AWS).

Spring - add a low-priority multi-threaded service (no impact on production performance)

We have a Spring application, and I want to add a service that will handle tens of thousands of IDs with multiple threads, but run as a background process without impacting real-time production.
The service will update the database and send requests to external providers.
I don't want the service to impact production performance/timing; I want to execute the operation on each ID at a low priority.
I read a previous post about setting priority in an Executor, but I want a priority lower than all the other threads that may exist outside this specific Executor's scope.
Is the answer using ThreadPoolExecutor more relevant to my case?
ThreadPoolExecutor threadPool = new ThreadPoolExecutor(1, numOfWorkerThreads, 0L,
        TimeUnit.MILLISECONDS, new LinkedBlockingQueue<Runnable>());
threadPool.setThreadFactory(new OpJobThreadFactory(Thread.NORM_PRIORITY - 2));

public final static class OpJobThreadFactory implements ThreadFactory {

    private final String namePrefix = "op-job-";                     // implied by the original snippet
    private final AtomicInteger threadNumber = new AtomicInteger(1); // implied by the original snippet
    private final int priority;
    private final boolean daemon;

    public OpJobThreadFactory(int priority) {
        this(priority, true);
    }

    public OpJobThreadFactory(int priority, boolean daemon) {
        this.priority = priority;
        this.daemon = daemon;
    }

    @Override
    public Thread newThread(Runnable r) {
        Thread t = new Thread(r, namePrefix + threadNumber.getAndIncrement());
        t.setDaemon(daemon);
        t.setPriority(priority);
        return t;   // the original snippet was missing this return
    }
}
maybe even use Thread.MIN_PRIORITY
Or I'm fine with using Executors.newCachedThreadPool()
Creates a thread pool that creates new threads as needed, but will reuse previously constructed threads when they are available. These pools will typically improve the performance of programs that execute many short-lived asynchronous tasks.
Also, should I use a Spring bean? Because I need to create the pool on demand/per request, that seems unnecessary/wrong.
EDIT
Should I use Spring Actuator for this task, or some other monitoring tool?
Spring Boot Actuator module helps you monitor and manage your Spring Boot application by providing production-ready features like health check-up, auditing, metrics gathering, HTTP tracing etc. All of these features can be accessed over JMX or HTTP endpoints.
I would like to throw some light on the question.
What is thread priority? According to the Java SE docs:
Every thread has a priority. Threads with higher priority are executed in preference to threads with lower priority.
Even if you create threads with priorities, it is not guaranteed that higher-priority threads execute before lower-priority ones; you may have to block the lower-priority thread until the other threads have executed.
For small Java programs you can handle thread execution yourself, but for larger programs it's recommended to use either the Executor Framework from vanilla Java (the java.util.concurrent package) or the Spring TaskExecutor. With either framework you can execute tasks asynchronously, that is, run them in the background alongside the main task.
Impact on Production:
The main task, for example, will be a call to your REST endpoint, i.e. /account, and on calling the account endpoint you want to send welcome emails to customers, which is a third-party API call. That call can be executed asynchronously, using either the Executor Framework or the Spring TaskExecutor. Executed asynchronously, i.e. as a background process, it will not impact the current API call, but it will still have an impact on the production server, since the threads run within the same JVM and share common memory; if a large number of threads are created and never destroyed, your server will surely go down.
So using the Executor Framework or the Spring TaskExecutor does not guarantee that production is unaffected; it will, however, improve the performance of the REST API that is called, since the work is executed asynchronously. As for your other questions:
Can I use Executors.newCachedThreadPool()?
Yes, if you have a number of short-lived tasks, such as updating a single column in a database or triggering a REST endpoint once. It is not suitable for bulk loading or for a backend job that updates 10,000 records, because it will create a large number of threads for the tasks and you will surely face memory problems.
Also, should I use a Spring bean? Because I need to create the pool on demand/per request, it seems unnecessary/wrong.
Yes, you can use ThreadPoolTaskExecutor: https://docs.spring.io/spring-framework/docs/current/javadoc-api/org/springframework/scheduling/concurrent/ThreadPoolTaskExecutor.html
As per the docs, setting "queueCapacity" to 0 mimics Executors.newCachedThreadPool(), so you can use either the Spring TaskExecutor or the Executor Framework from vanilla Java. I personally recommend the Spring TaskExecutor, which has more features. For a good start on using it with Spring you can refer to the tutorial https://egkatzioura.com/2017/10/25/spring-and-async/
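Putting those pieces together, here is a minimal sketch of what a low-priority background executor along these lines might look like (the bean name, pool sizes, queue capacity and thread-name prefix are illustrative assumptions, not values from the question):
@Bean("lowPriorityTaskExecutor")
public ThreadPoolTaskExecutor lowPriorityTaskExecutor() {
    ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
    executor.setCorePoolSize(2);                       // keep the pool small so it stays in the background
    executor.setMaxPoolSize(8);
    executor.setQueueCapacity(10_000);                 // room for the bulk of IDs to wait their turn
    executor.setThreadNamePrefix("low-prio-");
    executor.setThreadPriority(Thread.MIN_PRIORITY);   // as suggested in the question
    executor.setDaemon(true);                          // don't keep the JVM alive for background work
    return executor;
}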
Final verdict: if you are only looking to execute the task asynchronously, or as a background process, you can use either the Executor Framework from Java or the Spring TaskExecutor, but both will have some impact on production since they use the same JVM. If you do not want to impact production at all, then I recommend creating a separate Spring Boot app on a different server, making the database or service calls from the new app, exposing them as REST endpoints, and invoking those endpoints asynchronously from your main app using the Spring TaskExecutor.
Spring Task Executor: https://egkatzioura.com/2017/10/25/spring-and-async/
Java Executor Framework : https://stackabuse.com/concurrency-in-java-the-executor-framework/
For using threads with low priority:
https://medium.com/@daniyaryeralin/priority-based-thread-pooling-in-spring-framework-d74b91b51dcb
Sorry if the answer is too long :)
Here is a nice tutorial about priority-based task execution in Spring. Maybe it will help you in some way.
The idea is to create a configurable "heap" of tasks and always keep your main task at the top of the heap.
To sum up the process, you should create a custom task executor. First you need to create a ThreadPoolTaskExecutor bean with one method overridden. The properties that should be modified are: corePoolSize (the initial number of threads), queueCapacity (the number of tasks waiting in the queue), and maxPoolSize (the maximum number of threads). With these parameters you can configure your application's limits so that this service does not impact production performance.
#Bean("CustomTaskExecutor")
public TaskExecutor threadPoolTaskExecutor(
#Value("${spring.async.core-pool-size}") int corePoolSize,
#Value("${spring.async.max-pool-size}") int maxPoolSize,
#Value("${spring.async.queue-capacity}") int queueCapacity) {
ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor() {
#Override
protected BlockingQueue<Runnable> createQueue(int queueCapacity) {
return new PriorityBlockingQueue<Runnable>(queueCapacity);
}
};
executor.setCorePoolSize(corePoolSize);
executor.setMaxPoolSize(maxPoolSize);
executor.setQueueCapacity(queueCapacity);
return executor;
}
After that, you need to make tasks with priorities that the task executor can understand. For that we need to create two classes:
1) A custom class that implements the Runnable interface and will be running the task
2) A wrapper class that extends FutureTask and implements the Comparable interface, so that the task executor can understand the priority-picking logic of the tasks
public class Task implements Runnable {

    private Consumer<Job> jobConsumer;
    private Job job;

    public Job getJob() {
        return this.job;
    }

    public Task(Consumer<Job> jobConsumer, Job job) {
        this.jobConsumer = jobConsumer;
        this.job = job;
    }

    @Override
    public void run() {
        this.jobConsumer.accept(job);
    }
}
Then you have the FutureCustomTask class:
public class FutureCustomTask extends FutureTask<FutureCustomTask> implements Comparable<FutureCustomTask> {

    private Task task;

    public FutureCustomTask(Task task) {
        super(task, null);
        this.task = task;
    }

    @Override
    public int compareTo(FutureCustomTask o) {
        return task.getJob().getPriority().compareTo(o.task.getJob().getPriority());
    }
}
For the execution, the TaskExecutor needs to be autowired.
Then you can create your Task object, wrap it inside a FutureCustomTask, and pass it to the TaskExecutor. The code should look like this:
@Autowired
private TaskExecutor taskExecutor;

@Autowired
private JobBusiness jobBusiness;
...
Task task = new Task(jobBusiness::performSomethingOn, job);
taskExecutor.execute(new FutureCustomTask(task));

ScheduledExecutorService: how to run two tasks at some time interval in 2 separate services

I am trying to implement continuously running background tasks for my app.
For that I used the ScheduledExecutorService class.
I have 2 services, Service A and Service B; both have a task that runs all the time at some time interval. For that I used the following in Service A and Service B.
This is the code which is common in both service classes.
Runnable postNotificationRunnable = new Runnable() {
    @Override
    public void run() {
        // statements here
    }
};

ScheduledExecutorService scheduledExecutorService = Executors.newScheduledThreadPool(1);
scheduledExecutorService.scheduleAtFixedRate(postNotificationRunnable, 0, 1000, TimeUnit.SECONDS);
Now the problem is that when I run the app, both services start, but only the scheduledExecutorService of Service A runs; the other one doesn't. What am I doing wrong?
P.S. I am using ScheduledExecutorService for the first time.
The ScheduledExecutorService will wait for your task to finish if one execution takes longer than the period you specify (in your case 1000 seconds). Have a look at the respective Javadoc:
[...] If any execution of this task takes longer than its period, then subsequent executions may start late, but will not concurrently execute.
Since your services seem to behave like daemons (and therefore keep running until the application is shut down), you could do it like this:
ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
ExecutorService pool = Executors.newCachedThreadPool();

scheduler.scheduleAtFixedRate(new Runnable() {
    @Override
    public void run() {
        pool.execute(postNotificationRunnable);
    }
}, 0L, 1000L, TimeUnit.SECONDS);
Simply delegate the actual execution of your services to a separate pool that doesn't affect the scheduler.

Should java.util.concurrent.ExecutorService be closed like java.sql.Connection?

I am not sure whether a java.util.concurrent.ExecutorService should be shut down after all tasks have been completed or cancelled.
I have a method like this:
public void testProxies() {
    // 5 threads
    ExecutorService exec = Executors.newFixedThreadPool(5);
    try {
        while (condition) {
            exec.execute(new Runnable() {
                @Override
                public void run() {
                    // some task
                }
            });
        }
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        exec.shutdown(); // should it be shut down here?
    }
}
Is that the correct way of using ExecutorService?
How can I reuse the ExecutorService?
Should the ExecutorService be shut down, or just left alone?
If you shut it down, you can't reuse it.
If you don't shut it down, your program won't be able to exit because there will be live non daemon threads.
So you need to call shutdown at some stage to let your program exit, but only when you know that you don't need to submit additional tasks to your executor.
What I generally do:
I make the ExecutorService a field of my class
I provide a stop or shutdown method, which the user of my class needs to call, that calls the executor's shutdown method. Note that the executor won't actually shut down until all the submitted tasks have completed (or have been successfully cancelled).
An alternative is to add a shutdown hook which will shut down your executor when the JVM exits.
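A minimal sketch of that pattern applied to the asker's example (class and method names are illustrative):
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ProxyTester {

    // the pool is a field, so it can be reused across calls to testProxies()
    private final ExecutorService exec = Executors.newFixedThreadPool(5);

    public void testProxies() {
        exec.execute(() -> {
            // some task
        });
    }

    // called by the owner of this class once no more tasks will be submitted
    public void stop() throws InterruptedException {
        exec.shutdown();                              // queued tasks still run to completion
        exec.awaitTermination(1, TimeUnit.MINUTES);   // bounded wait for them to finish
    }

    // alternative: register a JVM shutdown hook instead of an explicit stop()
    // Runtime.getRuntime().addShutdownHook(new Thread(exec::shutdown));
}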
Yes. The ExecutorService should be shut down if you don't want to execute any more tasks.

Mail sending with a dedicated thread

I have a JSF application running on Tomcat 6.0, and somewhere in the app I send emails to some users. But sending mail is slower than I thought, and it causes lags between the related pages.
So my question is: is it a good (or doable) way to hand this process over to another thread that I create, a thread that takes mail-sending requests, puts them in a queue, and processes them apart from the main application? That way the mail-sending process would be out of the main flow and wouldn't affect the app's speed.
Yes, that's definitely a good idea. You should just do it with extreme care. Here's some food for thought:
Is it safe to start a new thread in a JSF managed bean?
Spawning threads in a JSF managed bean for scheduled tasks using a timer
As you're using Tomcat, which does not support EJB out of the box (and thus @Asynchronous @Singleton is out of the question), I'd create an application-scoped bean which holds an ExecutorService to process the mail tasks. Here's a kickoff example:
@ManagedBean(eager=true)
@ApplicationScoped
public class TaskManager {

    private ExecutorService executor;

    @PostConstruct
    public void init() {
        executor = Executors.newSingleThreadExecutor();
    }

    public <T> Future<T> submit(Callable<T> task) {
        return executor.submit(task);
    }

    // Or just void submit(Runnable task) if you want fire-and-forget.

    @PreDestroy
    public void destroy() {
        executor.shutdown();
    }
}
This creates a single thread and puts the tasks in a queue. You can use it in normal beans as follows:
@ManagedBean
@RequestScoped
public class Register {

    @ManagedProperty("#{taskManager}")
    private TaskManager taskManager;

    public void submit() {
        // ...
        taskManager.submit(new MailTask(mail));
        // You might want to hold the return value in some Future<Result>, but
        // you should store it in view or session scope in order to get result
        // later. Note that the thread will block whenever you call get() on it.
        // You can just ignore it altogether (as the current example is doing).
    }
}
To learn more about java.util.concurrent API, refer the official tutorial.
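The MailTask used above isn't shown in the answer; here is a hypothetical sketch of what such a Callable might look like (the Mail type and MailSender helper stand in for whatever mail API the app actually uses):
import java.util.concurrent.Callable;

public class MailTask implements Callable<Void> {

    private final Mail mail;   // 'Mail' is the app's value object holding the message data

    public MailTask(Mail mail) {
        this.mail = mail;
    }

    @Override
    public Void call() throws Exception {
        // Runs on the TaskManager's single background thread, off the JSF request thread.
        // Replace with the real sending code (javax.mail, commons-email, ...).
        MailSender.send(mail);   // hypothetical helper
        return null;
    }
}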
