Executor Service and scheduleWithFixedDelay() - java

Here is my task. I have a static queue of jobs in a class and a static method that adds jobs to the queue. I need n threads that poll from the queue and perform the job they pull. The n threads should poll simultaneously at an interval, i.e. all 3 should poll every 5 seconds and look for jobs.
I have this:
public class Handler {

    private static final Queue<Job> queue = new LinkedList<>();

    public static void initialize(int maxThreads) { // maxThreads == 3
        ScheduledExecutorService executorService =
                Executors.newScheduledThreadPool(maxThreads);
        executorService.scheduleWithFixedDelay(new Runnable() {
            @Override
            public void run() {
                Job job = null;
                synchronized (queue) {
                    if (queue.size() > 0) {
                        job = queue.poll();
                    }
                }
                if (job != null) {
                    Log.log("start job");
                    doJob(job);
                    Log.log("end job");
                }
            }
        }, 15, 5, TimeUnit.SECONDS);
    }
}
I get this output when I add 4 tasks:
startjob
endjob
startjob
endjob
startjob
endjob
startjob
endjob
It is obvious that these threads perform the jobs serially, whereas I need them to be done 3 at a time. What am I doing wrong? Thanks!

From the documentation:
If any execution of this task takes longer than its period, then subsequent executions may start late, but will not concurrently execute.
So you must schedule three independent tasks to have them run concurrently. Also note that the scheduled executor service is a fixed thread pool, which is not flexible enough for many use cases. A good idiom is to use the scheduled service just to submit tasks to a regular executor service, which may be configured as a resizable thread pool.
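A minimal sketch of that idiom, reusing the Job, doJob() and Log from the question (the names workerPool and scheduler are illustrative; imports from java.util and java.util.concurrent are omitted, as in the question):
public class Handler {

    private static final Queue<Job> queue = new LinkedList<>();

    public static void initialize(int maxThreads) {
        // Regular pool that actually runs the jobs, up to maxThreads at a time.
        ExecutorService workerPool = Executors.newFixedThreadPool(maxThreads);
        // Single-threaded scheduler whose only job is to feed the worker pool.
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

        scheduler.scheduleWithFixedDelay(new Runnable() {
            @Override
            public void run() {
                // Hand up to maxThreads jobs to the worker pool every tick; they run concurrently there.
                for (int i = 0; i < maxThreads; i++) {
                    final Job job;
                    synchronized (queue) {
                        job = queue.poll();
                    }
                    if (job == null) {
                        break; // queue drained for this tick
                    }
                    workerPool.submit(new Runnable() {
                        @Override
                        public void run() {
                            Log.log("start job");
                            doJob(job);
                            Log.log("end job");
                        }
                    });
                }
            }
        }, 15, 5, TimeUnit.SECONDS);
    }
}
Because the scheduler only submits work, a slow job never delays the next poll; the worker pool absorbs the concurrency.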

You are running the ScheduledExecutorService with a fixed delay, which means your jobs will run one after another. Use a fixed thread pool and submit 3 tasks at a time. Here is an explanation with examples.

If you declare that Job implements Runnable, then your code simplifies dramatically:
First declare the Executor somewhere globally accessible:
public static final ExecutorService executor = Executors.newFixedThreadPool(MAX_THREADS);
Then add a job like this:
executor.submit(new Job());
You are done.
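A sketch of such a Job, assuming the work previously done in doJob() moves into run() (the logging mirrors the original Handler):
public class Job implements Runnable {
    @Override
    public void run() {
        Log.log("start job");
        // ... the work previously done in doJob(this) goes here ...
        Log.log("end job");
    }
}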

Related

Will a task that schedules itself inside its run() method cause too many threads to be created?

According to this post:
public class ConditionCheckingTask implements Runnable
{
    private final ScheduledExecutorService ses;
    private final Instant whenInstantiated = Instant.now();

    // Constructor
    public ConditionCheckingTask( final ScheduledExecutorService ses ) {
        this.ses = ses;
    }

    @Override
    public void run() {
        if ( someConditionIsTrue ) {
            doSomething();
        } else if ( ChronoUnit.MINUTES.between( this.whenInstantiated , Instant.now() ) > 100000 ) {
            // We have exceeded our time limit, so let this task die.
            return;
        } else { // Else wait a minute to check condition again.
            this.ses.schedule( this , 1 , TimeUnit.MINUTES );
        }
    }
}
The task schedules itself inside the run() method. Suppose the time limit is 100000; then 100000 threads will be created? Since each schedule call will need a separate thread to run, right?
each schedule call will need a separate thread to run, right?
Probably not.
All we know from the code that you showed is that ses refers to a ScheduledExecutorService, and since that's only an interface, we don't actually know what it will do; but any practical implementation of that interface most likely will be some kind of thread pool.
A simple thread pool has a blocking queue of tasks, and a small number of worker threads that each sit in a loop awaiting tasks and performing them:
while (true) {
    Runnable task = queue.take();
    task.run();
}
In this way, the number of threads used can be much smaller than the number of tasks that are submitted to the queue.
A ScheduledExecutorService such as ScheduledThreadPoolExecutor works in pretty much the same way, except that the queue is some kind of priority queue. A priority queue returns the tasks in the order in which they are scheduled to run instead of returning them in the same order as they were submitted. Each worker loops, waiting until a task at the head of the queue is due, and then it take()s the task, runs it, and goes back to await the next task, same as in the simple thread pool.
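A toy demo (not from the post) makes this visible: a task that reschedules itself on a single-threaded scheduled pool keeps printing the same worker thread name, so no new thread is created per schedule() call.
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class RescheduleDemo implements Runnable {

    private final ScheduledExecutorService ses;
    private int remaining = 5;

    RescheduleDemo(ScheduledExecutorService ses) {
        this.ses = ses;
    }

    @Override
    public void run() {
        // The same pool thread runs every execution.
        System.out.println("run " + remaining + " on " + Thread.currentThread().getName());
        if (--remaining > 0) {
            ses.schedule(this, 1, TimeUnit.SECONDS); // re-queues this task; no new thread is spawned
        } else {
            ses.shutdown();
        }
    }

    public static void main(String[] args) {
        ScheduledExecutorService ses = Executors.newScheduledThreadPool(1);
        ses.schedule(new RescheduleDemo(ses), 1, TimeUnit.SECONDS);
    }
}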

ThreadPoolTaskExecutor should Finish Task Queue before Main thread ends

Problem statement:
I have 1,000 tasks and need to process them via ThreadPoolTaskExecutor. ThreadPoolTaskExecutor has corePoolSize = 5, maxPoolSize = 10 and queueCapacity = 1000.
Now from the main method, I am executing the following code
CountDownLatch latch = new CountDownLatch(5);
Collection<Future<?>> futures = new LinkedList<Future<?>>();
for (Map.Entry<String, Boolean> entry : map.entrySet()) {
    FutureTask task = new FutureTask(new CustomTask(entry));
    executor.execute(task);
}
log.info("ACTIVE COUNT : " + executor.getActiveCount());
log.info("SIZE of the QUEUE : " + executor.getThreadPoolExecutor().getQueue().size());
log.info("LATCH WAIT : " + latch.getCount());
latch.await();
.....
@Override
public Object call() throws Exception {
    latch.countDown();
    // some logic
    return entry;
}
Now, the map has 1,000 entries in it, and I want to process all 1,000 queued tasks and then print these log lines. What's happening here is that the corePoolSize (which is equal to the CountDownLatch count) determines how many threads are created, and those tasks execute right away. Once that number is hit, the executor starts filling up the queue (which is totally fine and desired). However, these queued tasks are processed ONLY AFTER the main thread reaches its end; only then do they start executing. This is something I don't want. I want the executor to start picking up items from the queue as soon as threads become free after processing batch 1.
But in my case, once batch 1 is processed, the next task is picked up only when the main thread ends (which I do not want).
Does anyone have a solution for how this can be achieved (processing the queue as soon as a thread is available)?
P.S.: I do understand that latch.await() waits for the threads to complete their execution, but I am looking for behavior where it waits for all the threads to finish (which is happening) and for the queue to be empty (my expectation).
Thank You
If you are going to do it this way, you need to initialize the latch with the number of tasks that you are going to submit; i.e. 1,000. Also you should decrement the latch at the end of each task, not at its start (as your code currently seems to be doing.)
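A sketch of those two fixes applied to the question's code (keeping the rest of the setup as shown above):
CountDownLatch latch = new CountDownLatch(map.size()); // one count per task, i.e. 1,000
for (Map.Entry<String, Boolean> entry : map.entrySet()) {
    executor.execute(new FutureTask<Object>(new CustomTask(entry)));
}
latch.await(); // blocks until every task has counted down

// and inside CustomTask:
@Override
public Object call() throws Exception {
    try {
        // some logic
        return entry;
    } finally {
        latch.countDown(); // decrement when the task finishes, even if it throws
    }
}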
But you don't need a latch or a counter or anything to implement this. Instead, if you are using a Java SE ExecutorService directly, just do this:
public static void main(String[] args) {
    // Submit lots of tasks
    executorService.shutdown();
    try {
        // Waits until all tasks in the queue have completed
        executorService.awaitTermination(1_000_000, TimeUnit.SECONDS);
    } catch (InterruptedException ex) {
        // OK ... will end now
    }
}
And if you are using the Spring Framework's ThreadPoolTaskExecutor class:
public static void main(String[] args) {
    // Submit lots of tasks
    executor.setAwaitTerminationSeconds(1_000_000);
    executor.setWaitForTasksToCompleteOnShutdown(true);
    executor.shutdown();
}

Thread Executor: running multiple processes, with the pool waiting 30 minutes before starting the next processes

Currently I am working on implementing multiprocessing using the ThreadPoolExecutor API. The requirement is as follows.
Run a defined number of Java processes at a time, using a pool created with ExecutorService executor = Executors.newFixedThreadPool(3);
I am submitting all of the actual runnable-jar processes to the pool:
for (int i = 1; i < 50; i++) {
    RunnableTask r = new RunnableTask();
    executor.submit(r);
}
and the actual RunnableTask is as follows:
public class RunnableTask implements Runnable {
    public void run() {
        try {
            Process p = Runtime.getRuntime().exec("java -jar D:\\ProcessIntiate.jar");
            p.waitFor(); // block until the external process finishes
        } catch (IOException | InterruptedException e) { e.printStackTrace(); }
    }
}
Now when the pool is started, only 3 processes run in parallel.
I want to run only 3 processes at a time; after those 3 processes complete, the pool has to wait for 30 minutes and then start the next 3 processes. The pool also needs to be notified when all 3 processes have completed.
Is there any way to do this with the Executor framework?
A simple solution: simply add a 30-minute wait to your Runnable code.
Or better: give the Runnable a parameter that controls how long it waits.
For the first 47 processes you give 30 minutes, and then 0.
Alternatively, simply don't submit 50 tasks.
Instead, use an outer thread that pushes 3 tasks and then waits/sleeps until those 3 are done, then waits 30 minutes and pushes another 3 (see the sketch below).
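A rough sketch of that outer loop (not from the answer), collecting the tasks into a list first and using invokeAll() to block until each batch of 3 has finished:
// Run 3 tasks, wait for them to finish, sleep 30 minutes, then start the next 3.
static void runInBatches(List<RunnableTask> allTasks) throws InterruptedException {
    ExecutorService executor = Executors.newFixedThreadPool(3);
    final int BATCH = 3;
    for (int start = 0; start < allTasks.size(); start += BATCH) {
        List<Callable<Object>> batch = new ArrayList<>();
        for (RunnableTask t : allTasks.subList(start, Math.min(start + BATCH, allTasks.size()))) {
            batch.add(Executors.callable(t)); // wrap each Runnable as a Callable
        }
        executor.invokeAll(batch);            // blocks until every task in this batch has completed
        if (start + BATCH < allTasks.size()) {
            TimeUnit.MINUTES.sleep(30);       // pause before starting the next batch
        }
    }
    executor.shutdown();
}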
Instead of directly submitting the requests to the thread pool, store them in a collection.
Use a ScheduledExecutorService of size 1 to schedule a task that runs at the specified interval (30 minutes in your case).
It takes tasks from the collection in batches and submits them to another pool, where they are actually processed.
class CustomScheduledExecutor {

    ScheduledExecutorService scheduleExecutor = Executors.newScheduledThreadPool(1);
    ExecutorService executorService = Executors.newFixedThreadPool(3);
    Queue<Runnable> queue = new ConcurrentLinkedQueue<Runnable>();

    public CustomScheduledExecutor() {
        // Note: both the initial delay (1000) and the period (30) are in minutes here.
        scheduleExecutor.scheduleWithFixedDelay(new Runnable() {
            @Override
            public void run() {
                for (int i = 0; i < 3; i++) {
                    Runnable poll = queue.poll(); // may return null when the queue is empty
                    if (poll != null) {
                        executorService.submit(poll);
                    }
                }
            }
        }, 1000, 30, TimeUnit.MINUTES);
    }

    public void submitTask(Runnable runnable) {
        queue.offer(runnable);
    }
}
submit() also returns a Future, so you have a handle on the submitted task and can manipulate it as needed.
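For example (a sketch, not part of the class above), the scheduler's run() loop could keep the returned Futures to inspect or cancel a batch later:
// Keep the Futures of one batch so they can be checked or cancelled later.
List<Future<?>> batchFutures = new ArrayList<>();
for (int i = 0; i < 3; i++) {
    Runnable poll = queue.poll();
    if (poll == null) {
        break; // nothing left to submit
    }
    batchFutures.add(executorService.submit(poll)); // Future offers isDone(), cancel(), get()
}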

Java: running threads in a loop, waiting for the completion of the threads

I need to run four parallel threads in a loop; each of them processes something, and the loop should wait for them to finish before starting the next four parallel threads. I have tried to implement this, but the loop starts more than 4 threads at once. I am missing a check for whether the threads have completed or not. Can you advise me, please? Thank you.
Create a List with your jobs.
Wrap your jobs in a class that implements the Runnable interface.
Create a thread that will execute your jobs from the list.
Something like:
// This will run your jobs in threads
ExecutorService threadPoolExecutor = Executors.newFixedThreadPool(NUMBER_OF_THREADS);

// Adding new jobs to the list
List<Job> processingList = Collections.synchronizedList(new ArrayList<Job>());
processingList.add(someJob1);
processingList.add(someJob2);
processingList.add(someJob3);
processingList.add(someJob4);
...

Runnable processor = new Runnable() {
    public void run() {
        // Run all jobs from the list in the pool's threads
        for (Job job : processingList) {
            threadPoolExecutor.execute(job);
        }
        // Busy-wait until every job reports completion
        boolean areJobsCompleted = false;
        while (!areJobsCompleted) {
            areJobsCompleted = true;
            for (Job job : processingList) {
                areJobsCompleted = areJobsCompleted && job.isComplete();
            }
        }
    }
};
Executors.newSingleThreadExecutor().execute(processor);
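The isComplete() loop above is a busy-wait; an alternative sketch (not in the original answer) wraps the jobs as Callables and lets invokeAll() block until the whole batch has finished, assuming Job implements Runnable:
// invokeAll() returns only when every job in the batch is done, so no polling loop is needed.
ExecutorService threadPoolExecutor = Executors.newFixedThreadPool(NUMBER_OF_THREADS);
List<Callable<Object>> batch = new ArrayList<>();
for (Job job : processingList) {
    batch.add(Executors.callable(job)); // wrap each Runnable job as a Callable
}
threadPoolExecutor.invokeAll(batch); // throws InterruptedException; handle or declare it
// safe to start the next batch of jobs here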

How to manage M threads (1 per task), ensuring only N threads run at the same time, with N < M, in Java

I have a queue of tasks in Java. This queue is in a table in the DB.
I need:
1 thread per task only
No more than N threads running at the same time. This is because the threads interact with the DB and I don't want to have a bunch of DB connections opened.
I think I could do something like:
final Semaphore semaphore = new Semaphore(N);
while (isOnJob) {
    List<JobTask> tasks = getJobTasks();
    if (!tasks.isEmpty()) {
        final CountDownLatch cdl = new CountDownLatch(tasks.size());
        for (final JobTask task : tasks) {
            Thread tr = new Thread(new Runnable() {
                @Override
                public void run() {
                    try {
                        semaphore.acquire();
                        task.doWork();
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    } finally {
                        semaphore.release();
                        cdl.countDown();
                    }
                }
            });
            tr.start();
        }
        cdl.await();
    }
}
I know that an ExecutorService class exists, but I'm not sure whether I can use it for this.
So, do you think this is the best way to do it? Or could you clarify how ExecutorService works so I can use it to solve this?
final solution:
I think the best solution is something like:
while (isOnJob) {
    ExecutorService executor = Executors.newFixedThreadPool(N);
    List<JobTask> tasks = getJobTasks();
    if (!tasks.isEmpty()) {
        for (final JobTask task : tasks) {
            executor.submit(new Runnable() {
                @Override
                public void run() {
                    task.doWork();
                }
            });
        }
    }
    executor.shutdown();
    executor.awaitTermination(Long.MAX_VALUE, TimeUnit.HOURS);
}
Thanks a lot for the answers. BTW, I am using a connection pool, but the queries to the DB are very heavy and I don't want an uncontrolled number of tasks running at the same time.
You can indeed use an ExecutorService. For instance, create a new fixed thread pool using the newFixedThreadPool method. This way, besides reusing threads, you also guarantee that no more than N threads are running at the same time.
Something along these lines:
private static final ExecutorService executor = Executors.newFixedThreadPool(N);
// ...
while (isOnJob) {
    List<JobTask> tasks = getJobTasks();
    if (!tasks.isEmpty()) {
        List<Future<?>> futures = new ArrayList<Future<?>>();
        for (final JobTask task : tasks) {
            Future<?> future = executor.submit(new Runnable() {
                @Override
                public void run() {
                    task.doWork();
                }
            });
            futures.add(future);
        }
        // you no longer need to use await
        for (Future<?> fut : futures) {
            fut.get(); // throws InterruptedException/ExecutionException; handle or declare them
        }
    }
}
Note that you no longer need to use the latch, as get will wait for the computation to complete, if necessary.
I agree with JG that ExecutorService is the way to go... but I think you're both making it more complicated than it needs to be.
Rather than creating a large number of threads (1 per task) why not just create a fixed sized thread pool (with Executors.newFixedThreadPool(N)) and submit all the tasks to it? No need for a semaphore or anything like that - just submit the jobs to the thread pool as you get them, and the thread pool will handle them with up to N threads at a time.
If you aren't going to use more than N threads at a time, why would you want to create them?
Use a ThreadPoolExecutor instance with an unbounded queue and a fixed maximum number of threads, e.g. Executors.newFixedThreadPool(N). This will accept a large number of tasks but will only execute N of them concurrently.
If you choose a bounded queue instead (with a capacity of N), the Executor will reject the execution of the task (how exactly depends on the policy you can configure when working with ThreadPoolExecutor directly, instead of using the Executors factory - see RejectedExecutionHandler).
If you need "real" congestion control you should setup a bound BlockingQueue with a capacity of N. Fetch the tasks you want done from the database and put them into the queue - if it's full the calling thread will block. In another thread (perhaps also started using the Executor API) you take tasks from the BlockingQueue and submit them to the Executor. If the BlockingQueue is empty the calling thread will also block. To signal that you're done use a "special" object (e.g. a singleton which marks the last/final item in the queue).
Achieving good performance also depends on the kind of work that needs to be done in the threads. If your DB is the bottleneck in processing, I would start paying attention to how your threads access the DB. Using a connection pool is probably in order. This might help you to achieve more throughput, since worker threads can re-use DB connections from the pool.
