How can I make sure that my rejectedExecution method works?
new RejectedExecutionHandler() {
@Override
public void rejectedExecution(Runnable r, ThreadPoolExecutor executor) {
logger.log(Level.INFO, "Name_[" + executorServiceName + "]: All threads busy, processing inline.");
r.run();
}
});
I would personally create a situation where my ExecutorService is guaranteed to reject a task, and then use a counter to check that the rejected task was still executed.
So, for example, my code could look something like this:
// A single threaded executor service that cannot have more than 1 task in its task queue
// such that I know that if I provide at least 3 tasks, at least 1 task will be rejected.
// Why 3? 1 task in the queue + 1 task executed by the thread of the pool
// = max of tasks that the pool can manage at a given time, so if I add 1 it will be
// rejected.
ExecutorService executor = new ThreadPoolExecutor(
1, 1, 0L, TimeUnit.MILLISECONDS, new LinkedBlockingQueue<>(1),
Executors.defaultThreadFactory(), myHandler
);
// My Counter
AtomicInteger counter = new AtomicInteger();
// Some arbitrary task that lasts long enough to make sure that at least 3
// tasks will be submitted that will increment my counter once completed
Runnable task = () -> {
try {
Thread.sleep(1_000L);
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
} finally {
counter.incrementAndGet();
}
};
try {
// Submit 3 tasks
executor.submit(task);
executor.submit(task);
executor.submit(task);
} finally {
// Shutdown the pool and wait until all the submitted tasks have been executed
executor.shutdown();
executor.awaitTermination(1L, TimeUnit.MINUTES);
}
// Ensure that we have 3 tasks that have been executed
assertEquals(3, counter.get());
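If you also want to assert that the handler itself was invoked, and not just that every task eventually ran, you can count rejections inside it. This is only a sketch under the same setup as above; the rejections counter and the JUnit assertTrue call are additions, and the handler keeps the inline r.run() from the question (logging omitted):
// Counts how many submissions were handed to the rejection handler.
AtomicInteger rejections = new AtomicInteger();
RejectedExecutionHandler myHandler = (r, pool) -> {
    rejections.incrementAndGet(); // record that a rejection happened
    r.run();                      // then process the task inline, as in the question
};
// ... create the same ThreadPoolExecutor and submit the same 3 tasks as above ...
// At least one of the 3 submissions must have gone through the handler.
assertTrue(rejections.get() >= 1);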
Related
I am submitting a List of LinkedBlockingQueue<Long> to a ThreadPoolExecutor, and the requirement is that each thread picks one LinkedBlockingQueue<Long> and processes it in parallel.
This is my method logic:
public void doParallelProcess() {
List<LinkedBlockingQueue<Long>> linkedBlockingQueueList = splitListtoBlockingQueues();
ThreadPoolExecutor executor = new ThreadPoolExecutor(1, linkedBlockingQueueList.size(), 0L,
TimeUnit.MILLISECONDS, new LinkedBlockingQueue<Runnable>(), Executors.defaultThreadFactory());
Long initial = System.currentTimeMillis();
try {
System.out.println("linkedBlockingQueueList begin size is " + linkedBlockingQueueList.size() + "is empty"
+ linkedBlockingQueueList.isEmpty());
while (true) {
linkedBlockingQueueList.parallelStream().parallel().filter(q -> !q.isEmpty()).forEach(queue -> {
Long id = queue.poll();
MyTestRunnable runnab = new MyTestRunnable(id);
executor.execute(runnab);
System.out.println("Task Count: " + executor.getTaskCount() + ", Completed Task Count: "
+ executor.getCompletedTaskCount() + ", Active Task Count: " + executor.getActiveCount());
});
System.out.println("linkedBlockingQueueList end size is " + linkedBlockingQueueList.size() + "is empty"
+ linkedBlockingQueueList.isEmpty());
System.out.println("executor service " + executor);
if (executor.getCompletedTaskCount() == (long) mainList.size()) {
break;
}
while (executor.getActiveCount() != 0) {
System.out.println("Task Count: " + executor.getTaskCount() + ", Completed Task Count: "
+ executor.getCompletedTaskCount() + ", Active Task Count: " + executor.getActiveCount());
Thread.sleep(1000L);
}
}
} catch (Exception e) {
} finally {
executor.shutdown();
while (!executor.isTerminated()) {
}
}
}
How can I submit a list of LinkedBlockingQueue so that each individual thread gets one queue?
Example: the size of the List<LinkedBlockingQueue<Long>> is 50, each LinkedBlockingQueue contains 50 entries, and each thread should pick one LinkedBlockingQueue<Long> and execute its 50 queued tasks.
The input to an ExecutorService is either Runnable or Callable. Any task you submit needs to implement one of those two interfaces. If you want to submit a bunch of tasks to a thread pool and wait until they are all complete, then you can use the invokeAll method and loop over the resulting Futures, calling get on each: see this informative answer to a similar question.
You do not need to batch your input tasks into groups, though. You never want an executor service to have idle threads while there is still work left to do! You want it to be able to grab the next task as soon as resources free up, and batching in this fashion runs contrary to that. Your code is doing this:
while non-empty input lists exist {
for each non-empty input list L {
t = new Runnable(L.pop())
executor.submit(t)
}
while (executor.hasTasks()) {
wait
}
}
Once one of those tasks completes, that thread should be free to move on to other work. But it won't because you wait until all N tasks complete before you submit any more. Submit them all at once with invokeAll and let the executor service do what it was built to do.
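As a rough sketch of that, assuming one Callable per queue (process(id) stands in for whatever MyTestRunnable does and is a hypothetical helper; the names are illustrative, not from the original code):
// One Callable per LinkedBlockingQueue; each callable drains its own queue.
List<Callable<Void>> tasks = new ArrayList<>();
for (LinkedBlockingQueue<Long> queue : linkedBlockingQueueList) {
    tasks.add(() -> {
        Long id;
        while ((id = queue.poll()) != null) {
            process(id); // hypothetical per-id work, i.e. what MyTestRunnable would do
        }
        return null;
    });
}
ExecutorService executor = Executors.newFixedThreadPool(linkedBlockingQueueList.size());
try {
    // invokeAll blocks until every callable has finished (or the caller is interrupted).
    for (Future<Void> f : executor.invokeAll(tasks)) {
        f.get(); // surfaces any exception a task failed with
    }
} catch (InterruptedException e) {
    Thread.currentThread().interrupt();
} catch (ExecutionException e) {
    e.printStackTrace();
} finally {
    executor.shutdown();
}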
The Executors class is your main entry to thread pools:
ExecutorService executor = Executors.newCachedThreadPool();
linkedBlockingQueueList.forEach(queue -> executor.submit(() -> { /* process queue */ }));
If you do want to create a ThreadPoolExecutor yourself — it does give you more control over the configuration — there are at least two ways you may specify a default thread factory:
Leave out the thread factory argument:
ThreadPoolExecutor executor = new ThreadPoolExecutor(1, linkedBlockingQueueList.size(),
0L, TimeUnit.MILLISECONDS, new LinkedBlockingQueue<Runnable>());
Use the Executors class again for getting the default thread factory:
ThreadPoolExecutor executor = new ThreadPoolExecutor(1, linkedBlockingQueueList.size(),
0L, TimeUnit.MILLISECONDS, new LinkedBlockingQueue<Runnable>(),
Executors.defaultThreadFactory());
I want to know whether, while a program is waiting on the Future of one thread, the other threads continue their execution.
I have tried the sample below, and it seems that while my program is waiting for one thread, the other threads do not continue their execution. Please tell me whether this is the expected behaviour, or whether there is an issue with how my code handles threads.
ExecutorService executor = Executors.newFixedThreadPool(3);
for(int i=0; i<5 ;i++)
{
Worker w = new Worker();
Future<String> future = executor.submit(w);
while(!future.isDone())
{
//Wait
}
String s = future.get();
System.out.println(LocalDateTime.now()+" "+s);
}
executor.shutdown();
executor.awaitTermination(Long.MAX_VALUE, TimeUnit.MILLISECONDS);
Below is my worker class:
public class Worker implements Callable<String> {
@Override
public String call() throws Exception {
// TODO Auto-generated method stub
Thread.sleep(3000);
return Thread.currentThread().getName();
}
}
I am getting the results below (I added the date-time to show that the tasks do not run in parallel):
2019-01-04T16:34:22.647 pool-1-thread-1
2019-01-04T16:34:25.661 pool-1-thread-2
2019-01-04T16:34:28.673 pool-1-thread-3
2019-01-04T16:34:31.685 pool-1-thread-1
2019-01-04T16:34:34.699 pool-1-thread-2
The problem
You presented code that, from the main thread's perspective, waits (2) for each execution to finish before submitting a new task (1). In other words: in the main thread you submit a task, wait for its execution to complete, and only then submit the next one.
ExecutorService executor = Executors.newFixedThreadPool(3);
for(int i=0; i<5 ;i++)
{
Worker w = new Worker();
Future<String> future = executor.submit(w); // (1)
while(!future.isDone()) // (2)
{
//Wait
}
String s = future.get();
System.out.println(LocalDateTime.now()+" "+s);
}
executor.shutdown();
executor.awaitTermination(Long.MAX_VALUE, TimeUnit.MILLISECONDS);
Solution
To solve the issue you should (from the main thread's perspective) submit all the tasks without waiting, and only then wait for the results from the executor service.
Example: https://stackoverflow.com/a/49746114/1815881
You can construct all the tasks and then call invokeAll() on the ExecutorService, as sketched below.
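A minimal sketch of that fix, reusing the Worker class from the question (exception handling is omitted, as in the original snippet, so the enclosing method would need to declare throws Exception):
ExecutorService executor = Executors.newFixedThreadPool(3);
List<Future<String>> futures = new ArrayList<>();
// Submit all workers first; nothing blocks here.
for (int i = 0; i < 5; i++) {
    futures.add(executor.submit(new Worker()));
}
// Only now collect the results; the three pool threads have been running in parallel.
for (Future<String> future : futures) {
    String s = future.get(); // blocks only until this particular worker is done
    System.out.println(LocalDateTime.now() + " " + s);
}
executor.shutdown();
executor.awaitTermination(Long.MAX_VALUE, TimeUnit.MILLISECONDS);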
I have the following test code
public static void main(String[] args){
ForkJoinPool pool = new ForkJoinPool(2);
ForkJoinTask task3 = ForkJoinTask.adapt(() -> {
System.out.println("task 3 executing");
for(int i = 0; i < 10; ++i){
System.out.println("task 3 doing work " + i);
try {
Thread.sleep(1000);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
});
ForkJoinTask task2 = ForkJoinTask.adapt(() -> {
try {
System.out.println("task 2 executing");
Thread.sleep(5000);
System.out.println("task 2 finishing");
} catch (InterruptedException e) {
e.printStackTrace();
}
return null;
});
pool.submit(task2);
ForkJoinTask task1 = pool.submit(() -> {
System.out.println("task 1 executing");
pool.submit(task3); // EDIT: Original code was task3.fork();
System.out.println("task 1 joining task 2");
task2.join();
System.out.println("task 1 finished");
});
task1.join();
}
It basically submits 3 tasks to a ForkJoinPool of parallelism 2, task 2 and 3 are long running and task 1 waits for task 2.
Labeling the 2 threads t1 and t2, where t1 executes task1 and t2 executes task2.
In my understanding, the work-stealing magic happens within the join() call, where the calling thread executes a task from either its own work queue or another worker thread's work queue. Thus I'm expecting t1 to execute task1, hit the join() call, then steal task3 and execute it to completion.
However, in practice, t1 does not do anything special at the join() call. Task3 is only executed after both task1 and task2 have finished. Why is this the case?
After spending hours looking into the source code of ForkJoinPool and ForkJoinTask, here is what I've found:
join() will cause a thread to look for and steal tasks, provided one of the following two conditions is met:
The task being joined is located at the top of the current worker thread's work queue, in which case the worker thread will continue to execute that task (see below)
There is a task in the work queue of another worker thread, but only if that worker thread has itself stolen a task from the current worker thread; in that case the current worker thread will steal a task back and execute it (see below)
For the first case, I deduced it primarily from the doJoin() method in ForkJoinTask.java and below is a working test that illustrates the case:
public static void main(String[] args){
ForkJoinPool pool = new ForkJoinPool(2);
ForkJoinTask task3 = ForkJoinTask.adapt(() -> {
System.out.println("task 3 executing on thread " + Thread.currentThread());
for(int i = 0; i < 10; ++i){
System.out.println("task 3 doing work " + i);
try {
Thread.sleep(1000);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
});
ForkJoinTask task2 = ForkJoinTask.adapt(() -> {
try {
System.out.println("task 2 executing on thread " + Thread.currentThread());
Thread.sleep(5000);
System.out.println("task 2 finished");
} catch (InterruptedException e) {
e.printStackTrace();
}
return null;
});
ForkJoinTask task1 = ForkJoinTask.adapt(() -> {
System.out.println("task 1 executing on thread " + Thread.currentThread());
pool.submit(task3);
try {
Thread.sleep(1000);
} catch (InterruptedException e) {
e.printStackTrace();
}
System.out.println("task 1 joining task 3");
task3.join();
System.out.println("task 1 finished");
});
pool.submit(task2);
pool.submit(task1);
task1.join();
}
The output is
task 1 executing on thread Thread[ForkJoinPool-1-worker-2,5,main]
task 2 executing on thread Thread[ForkJoinPool-1-worker-1,5,main]
task 1 joining task 3
task 3 executing on thread Thread[ForkJoinPool-1-worker-2,5,main]
task 3 doing work 0
task 3 doing work 1
task 3 doing work 2
task 3 doing work 3
task 2 finished
task 3 doing work 4
task 3 doing work 5
task 3 doing work 6
task 3 doing work 7
task 3 doing work 8
task 3 doing work 9
task 1 finished
Task3 and task1 are executed on the same worker thread, which is expected since task3 is directly submitted to thread2's work queue and therefore according to case 1 it should be executed when task1 calls join() on it.
I deduced the second case from the awaitJoin() method in ForkJoinPool.java; below is a working test that illustrates it:
public static void main(String[] args){
ForkJoinPool pool = new ForkJoinPool(2);
ForkJoinTask task3 = ForkJoinTask.adapt(() -> {
System.out.println("task 3 executing on thread " + Thread.currentThread());
for(int i = 0; i < 10; ++i){
System.out.println("task 3 doing work " + i);
try {
Thread.sleep(1000);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
});
ForkJoinTask task2 = ForkJoinTask.adapt(() -> {
try {
System.out.println("task 2 executing on thread " + Thread.currentThread());
pool.submit(task3);
Thread.sleep(5000);
System.out.println("task 2 finished");
} catch (InterruptedException e) {
e.printStackTrace();
}
return null;
});
ForkJoinTask task1 = ForkJoinTask.adapt(() -> {
System.out.println("task 1 executing on thread " + Thread.currentThread());
pool.submit(task2);
try {
Thread.sleep(1000);
} catch (InterruptedException e) {
e.printStackTrace();
}
System.out.println("task 1 joining task 2");
task2.join();
System.out.println("task 1 finished");
});
pool.submit(task1);
task1.join();
task2.join();
task3.join();
}
and the output
task 1 executing on thread Thread[ForkJoinPool-1-worker-1,5,main]
task 2 executing on thread Thread[ForkJoinPool-1-worker-2,5,main]
task 1 joining task 2
task 3 executing on thread Thread[ForkJoinPool-1-worker-1,5,main]
task 3 doing work 0
task 3 doing work 1
task 3 doing work 2
task 3 doing work 3
task 2 finished
task 3 doing work 4
task 3 doing work 5
task 3 doing work 6
task 3 doing work 7
task 3 doing work 8
task 3 doing work 9
task 1 finished
Task3 executed on thread1 while task1 was waiting for task2. This is possible because task2 was submitted to thread1's work queue, but since thread2 was free it stole that task and thereby became a stealer for thread1. When thread1 saw the join() call from task1, it looked at the stealer's (thread2's) work queue, found task3, took it, and executed it.
Also note that task1 finished executing only after task3, which means that once a thread has stolen a task, it must execute it to completion.
Now for the original question: I submitted both task1 and task2 from a non-ForkJoinWorkerThread (the main thread), so neither worker thread had stolen from the other and the second case doesn't apply. Furthermore, since I called join() on the second task, which sat in thread2's work queue rather than at the top of thread1's own queue, the first case doesn't apply either, and thus no stealing happens.
Edit:
This is by no means a complete answer to how fork/join works in Java; if there are any problems, please point them out. In fact, digging all these details out only created more questions: namely, why won't a worker thread just take an arbitrary task and run it? Why must it come from a stealer or from its own work queue? If you have the answer, please do comment or post.
I have the following piece of code:
protected ExecutorService parallelExecutor = Executors.newCachedThreadPool();
protected ExecutorService serialExecutor = Executors.newSingleThreadExecutor();
List<Callable<Boolean>> parallelCommands = new ArrayList<>();
List<Callable<Boolean>> serialCommands = new ArrayList<>();
List<Future<Boolean>> results = null;
LocalDateTime timed = LocalDateTime.now().plusSeconds(60);
results = parallelExecutor.invokeAll(parallelCommands);
results.addAll(serialExecutor.invokeAll(serialCommands));
Now I would like to check if both executors finish their job within a timeout or not:
while (LocalDateTime.now().isBefore(timed)) {
    // here I need to check if meanwhile my threads finished
    // if yes, break
}
How can I verify if the executors finished their job?
JDK documentation:
void shutdownAndAwaitTermination(ExecutorService pool) {
pool.shutdown(); // Disable new tasks from being submitted
try {
// Wait a while for existing tasks to terminate
if (!pool.awaitTermination(60, TimeUnit.SECONDS)) {
pool.shutdownNow(); // Cancel currently executing tasks
// Wait a while for tasks to respond to being cancelled
if (!pool.awaitTermination(60, TimeUnit.SECONDS))
System.err.println("Pool did not terminate");
}
} catch (InterruptedException ie) {
// (Re-)Cancel if current thread also interrupted
pool.shutdownNow();
// Preserve interrupt status
Thread.currentThread().interrupt();
}
}
https://docs.oracle.com/javase/8/docs/api/java/util/concurrent/ExecutorService.html
https://docs.oracle.com/javase/8/docs/api/java/util/concurrent/ExecutorService.html#awaitTermination-long-java.util.concurrent.TimeUnit-
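Applied to the question's two executors, a minimal sketch could look like this (awaitTermination throws InterruptedException, which the enclosing method has to handle; the 60-second figure matches the timeout in the question):
// Share one 60-second budget across both pools instead of waiting up to 60s for each.
long deadline = System.nanoTime() + TimeUnit.SECONDS.toNanos(60);
parallelExecutor.shutdown();
serialExecutor.shutdown();
boolean parallelDone = parallelExecutor.awaitTermination(
        deadline - System.nanoTime(), TimeUnit.NANOSECONDS);
boolean serialDone = serialExecutor.awaitTermination(
        deadline - System.nanoTime(), TimeUnit.NANOSECONDS);
if (!(parallelDone && serialDone)) {
    // Timed out: cancel whatever is still running.
    parallelExecutor.shutdownNow();
    serialExecutor.shutdownNow();
}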
Use a counter to keep track of each task that finishes. You can decrement and check by modifying tasks added to your task list or by using a CompletableFuture.
List<Callable<?>> tasks = ...
ExecutorService executor = ...
// Might want to add the size of your other task list as well
AtomicInteger counter = new AtomicInteger(tasks.size());
for (Callable<?> callable : tasks) {
    results.add(executor.submit(() -> {
        Object result = callable.call();
        if (counter.decrementAndGet() == 0) {
            // Last task done: wake the waiting main thread (lock the same object it waits on)
            synchronized (OuterClass.this) {
                OuterClass.this.notify();
            }
        }
        return result;
    }));
}
long timed = System.currentTimeMillis();
synchronized (this) {
    long timeLeft;
    // Or however many millis your timeout is; also stop waiting once the counter hits zero
    while (counter.get() > 0
            && (timeLeft = 60_000 - (System.currentTimeMillis() - timed)) > 0) {
        this.wait(timeLeft);
    }
}
What you want to do is wait on your main thread until you run out of time, while your tasks are executed by the executor. If a task finishes and sees that no unfinished tasks remain, it tells the waiting thread to continue. I use notify() instead of notifyAll() because no thread other than the main thread should be waiting on this object; if you do have other waiting threads, use the latter.
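If you would rather not hand-roll wait/notify, a CountDownLatch gives the same "wait for all tasks, but at most 60 seconds" behaviour. This is a different technique from the one above, shown only as a sketch (await throws InterruptedException, which is left to the caller):
List<Callable<?>> tasks = ...
ExecutorService executor = ...
CountDownLatch latch = new CountDownLatch(tasks.size());
for (Callable<?> callable : tasks) {
    executor.submit(() -> {
        try {
            return callable.call();
        } finally {
            latch.countDown(); // count down even if the task threw
        }
    });
}
// true if every task counted down within the timeout, false if we ran out of time
boolean finishedInTime = latch.await(60, TimeUnit.SECONDS);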
I just found CompletionService in this blog post. However, it doesn't really showcase the advantages of CompletionService over a standard ExecutorService. The same code can be written with either. So, when is a CompletionService useful?
Can you give a short code sample to make it crystal clear? For example, this code sample just shows where a CompletionService is not needed (=equivalent to ExecutorService)
ExecutorService taskExecutor = Executors.newCachedThreadPool();
// CompletionService<Long> taskCompletionService =
// new ExecutorCompletionService<Long>(taskExecutor);
Callable<Long> callable = new Callable<Long>() {
@Override
public Long call() throws Exception {
return 1L;
}
};
Future<Long> future = // taskCompletionService.submit(callable);
taskExecutor.submit(callable);
while (!future.isDone()) {
// Do some work...
System.out.println("Working on something...");
}
try {
System.out.println(future.get());
} catch (InterruptedException e) {
e.printStackTrace();
} catch (ExecutionException e) {
e.printStackTrace();
}
Omitting many details:
ExecutorService = incoming queue + worker threads
CompletionService = incoming queue + worker threads + output queue
With an ExecutorService, once you have submitted the tasks to run, you have to write your own code to efficiently collect the results of the completed tasks.
With CompletionService, this is pretty much automated. The difference is not very evident in the code you have presented because you are submitting just one task. However, imagine you have a list of tasks to be submitted. In the example below, multiple tasks are submitted to the CompletionService. Then, instead of trying to find out which task has completed (to get the results), it just asks the CompletionService instance to return the results as they become available.
public class CompletionServiceTest {
class CalcResult {
long result ;
CalcResult(long l) {
result = l;
}
}
class CallableTask implements Callable<CalcResult> {
String taskName ;
long input1 ;
int input2 ;
CallableTask(String name , long v1 , int v2 ) {
taskName = name;
input1 = v1;
input2 = v2 ;
}
public CalcResult call() throws Exception {
System.out.println(" Task " + taskName + " Started -----");
for(int i=0;i<input2 ;i++) {
try {
Thread.sleep(200);
} catch (InterruptedException e) {
System.out.println(" Task " + taskName + " Interrupted !! ");
e.printStackTrace();
}
input1 += i;
}
System.out.println(" Task " + taskName + " Completed ######");
return new CalcResult(input1) ;
}
}
public void test(){
ExecutorService taskExecutor = Executors.newFixedThreadPool(3);
CompletionService<CalcResult> taskCompletionService = new ExecutorCompletionService<CalcResult>(taskExecutor);
int submittedTasks = 5;
for (int i=0;i< submittedTasks;i++) {
taskCompletionService.submit(new CallableTask (
String.valueOf(i),
(i * 10),
((i * 10) + 10 )
));
System.out.println("Task " + String.valueOf(i) + "subitted");
}
for (int tasksHandled=0;tasksHandled<submittedTasks;tasksHandled++) {
try {
System.out.println("trying to take from Completion service");
Future<CalcResult> result = taskCompletionService.take();
System.out.println("result for a task availble in queue.Trying to get()");
// above call blocks till atleast one task is completed and results availble for it
// but we dont have to worry which one
// process the result here by doing result.get()
CalcResult l = result.get();
System.out.println("Task " + String.valueOf(tasksHandled) + "Completed - results obtained : " + String.valueOf(l.result));
} catch (InterruptedException e) {
// Something went wrong with a task submitted
System.out.println("Error Interrupted exception");
e.printStackTrace();
} catch (ExecutionException e) {
// Something went wrong with the result
e.printStackTrace();
System.out.println("Error get() threw exception");
}
}
}
}
Basically you use a CompletionService if you want to execute multiple tasks in parallel and then work with them in their completion order. So, if I execute 5 jobs, the CompletionService will give me the first one that finishes. The example with only a single task confers no extra value over an Executor apart from the ability to submit a Callable.
I think the javadoc best answers the question of when the CompletionService is useful in a way an ExecutorService isn't.
A service that decouples the production of new asynchronous tasks from the consumption of the results of completed tasks.
Basically, this interface allows a program to have producers which create and submit tasks (and even examine the results of those submissions) without knowing about any other consumers of the results of those tasks. Meanwhile, consumers which are aware of the CompletionService could poll for or take results without being aware of the producers submitting the tasks.
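A tiny sketch of that decoupling (the class and method names are illustrative only): the producer side just submits, the consumer side just takes, and neither needs to know about the other:
class ResultPipeline {
    private final ExecutorService pool = Executors.newFixedThreadPool(4);
    private final CompletionService<String> completionService =
            new ExecutorCompletionService<>(pool);

    // Producer side: submits work and never looks at results.
    void produce(Callable<String> task) {
        completionService.submit(task);
    }

    // Consumer side: drains results in completion order, never sees the producers.
    String consumeOne() throws InterruptedException, ExecutionException {
        return completionService.take().get(); // blocks until some submitted task finishes
    }
}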
For the record, and I could be wrong because it is rather late, but I am fairly certain that the sample code in that blog post causes a memory leak. Without an active consumer taking results out of the ExecutorCompletionService's internal queue, I'm not sure how the blogger expected that queue to drain.
First of all, if we do not want to waste processor time, we will not use
while (!future.isDone()) {
// Do some work...
}
We must use
service.shutdown();
service.awaitTermination(14, TimeUnit.DAYS);
The bad thing about this code is that it shuts down the ExecutorService. If we want to keep working with it (i.e. we have some recursive task creation), we have two alternatives: invokeAll or a CompletionService.
invokeAll will wait until all tasks are complete. A CompletionService grants us the ability to take or poll results one by one.
And, finally, the recursive example:
ExecutorService executorService = Executors.newFixedThreadPool(THREAD_NUMBER);
ExecutorCompletionService<Result> completionService = new ExecutorCompletionService<>(executorService);
while (Tasks.size() > 0) {
for (final Task task : Tasks) {
completionService.submit(new Callable<Result>() {
@Override
public Result call() throws Exception {
return DoTask(task);
}
});
}
try {
int taskNum = Tasks.size();
Tasks.clear();
for (int i = 0; i < taskNum; ++i) {
Result result = completionService.take().get();
if (result != null)
Tasks.add(result.toTask());
}
} catch (InterruptedException e) {
// error :(
} catch (ExecutionException e) {
// error :(
}
}
See it for yourself at run time: try to implement both solutions (ExecutorService and CompletionService) and you'll see how differently they behave; it will then be clearer when to use one or the other.
There is an example here if you want http://rdafbn.blogspot.co.uk/2013/01/executorservice-vs-completionservice-vs.html
Let's say you have 5 long-running tasks (Callable tasks) and you have submitted those tasks to an executor service. Now imagine you don't want to wait for all 5 tasks to complete; instead you want to do some processing on a task as soon as any one of them completes. This can be done either by writing polling logic over the Future objects or by using this API.
package com.barcap.test.test00;
import java.util.concurrent.*;
/**
* Created by Sony on 25-04-2019.
*/
public class ExecutorCompletest00 {
public static void main(String[] args) {
ExecutorService exc= Executors.newFixedThreadPool( 10 );
ExecutorCompletionService executorCompletionService= new ExecutorCompletionService( exc );
for (int i=1;i<10;i++){
Task00 task00= new Task00( i );
executorCompletionService.submit( task00 );
}
for (int i=1;i<20;i++){
try {
Future<Integer> future= (Future <Integer>) executorCompletionService.take();
Integer inttest=future.get();
System.out.println(" the result of completion service is "+inttest);
break;
} catch (InterruptedException e) {
e.printStackTrace();
} catch (ExecutionException e) {
e.printStackTrace();
}
}
}
}
=======================================================
package com.barcap.test.test00;
import java.util.*;
import java.util.concurrent.*;
/**
* Created by Sony on 25-04-2019.
*/
public class ExecutorServ00 {
public static void main(String[] args) {
ExecutorService executorService=Executors.newFixedThreadPool( 9 );
List<Future> futList= new ArrayList <>( );
for (int i=1;i<10;i++) {
Future result= executorService.submit( new Task00( i ) );
futList.add( result );
}
for (Future<Integer> futureEach :futList ){
try {
Integer inm= futureEach.get();
System.out.println("the result of future executorservice is "+inm);
break;
} catch (InterruptedException e) {
e.printStackTrace();
} catch (ExecutionException e) {
e.printStackTrace();
}
}
}
}
===========================================================
package com.barcap.test.test00;
import java.util.concurrent.*;
/**
* Created by Sony on 25-04-2019.
*/
public class Task00 implements Callable<Integer> {
int i;
public Task00(int i) {
this.i = i;
}
@Override
public Integer call() throws Exception {
System.out.println(" the current thread is "+Thread.currentThread().getName() +" the result should be "+i);
int sleepforsec=100000/i;
Thread.sleep( sleepforsec );
System.out.println(" the task complted for "+Thread.currentThread().getName() +" the result should be "+i);
return i;
}
}
======================================================================
Difference between the logs. For the ExecutorCompletionService:
the current thread is pool-1-thread-1 the result should be 1
the current thread is pool-1-thread-2 the result should be 2
the current thread is pool-1-thread-3 the result should be 3
the current thread is pool-1-thread-4 the result should be 4
the current thread is pool-1-thread-6 the result should be 6
the current thread is pool-1-thread-5 the result should be 5
the current thread is pool-1-thread-7 the result should be 7
the current thread is pool-1-thread-9 the result should be 9
the current thread is pool-1-thread-8 the result should be 8
the task complted for pool-1-thread-9 the result should be 9
teh result is 9
the task complted for pool-1-thread-8 the result should be 8
the task complted for pool-1-thread-7 the result should be 7
the task complted for pool-1-thread-6 the result should be 6
the task complted for pool-1-thread-5 the result should be 5
the task complted for pool-1-thread-4 the result should be 4
the task complted for pool-1-thread-3 the result should be 3
the task complted for pool-1-thread-2 the result should be 2
=======================================================
For the plain ExecutorService:
the current thread is pool-1-thread-1 the result should be 1
the current thread is pool-1-thread-3 the result should be 3
the current thread is pool-1-thread-2 the result should be 2
the current thread is pool-1-thread-5 the result should be 5
the current thread is pool-1-thread-4 the result should be 4
the current thread is pool-1-thread-6 the result should be 6
the current thread is pool-1-thread-7 the result should be 7
the current thread is pool-1-thread-8 the result should be 8
the current thread is pool-1-thread-9 the result should be 9
the task complted for pool-1-thread-9 the result should be 9
the task complted for pool-1-thread-8 the result should be 8
the task complted for pool-1-thread-7 the result should be 7
the task complted for pool-1-thread-6 the result should be 6
the task complted for pool-1-thread-5 the result should be 5
the task complted for pool-1-thread-4 the result should be 4
the task complted for pool-1-thread-3 the result should be 3
the task complted for pool-1-thread-2 the result should be 2
the task complted for pool-1-thread-1 the result should be 1
the result of future is 1
=======================================================
With the plain ExecutorService, the first result only becomes available after the slowest task has completed, because the futures are read in submission order.
With the ExecutorCompletionService, whichever result becomes available first is returned first.
If the task producer is not interested in the results, and it is another component's responsibility to process the results of the asynchronous tasks executed by the executor service, then you should use a CompletionService. It helps you separate the task-result processor from the task producer. See the example at http://www.zoftino.com/java-concurrency-executors-framework-tutorial
There is another advantage of using a CompletionService: performance.
When you call future.get(), you are spin-waiting. From java.util.concurrent.CompletableFuture:
private Object waitingGet(boolean interruptible) {
Signaller q = null;
boolean queued = false;
int spins = -1;
Object r;
while ((r = result) == null) {
if (spins < 0)
spins = (Runtime.getRuntime().availableProcessors() > 1) ?
1 << 8 : 0; // Use brief spin-wait on multiprocessors
else if (spins > 0) {
if (ThreadLocalRandom.nextSecondarySeed() >= 0)
--spins;
}
When you have a long-running task, this will be a disaster for performance.
With a CompletionService, once the task is done its result is enqueued, and you can poll the queue with lower performance overhead.
The CompletionService achieves this by wrapping the task with a done() hook:
java.util.concurrent.ExecutorCompletionService
private class QueueingFuture extends FutureTask<Void> {
QueueingFuture(RunnableFuture<V> task) {
super(task, null);
this.task = task;
}
protected void done() { completionQueue.add(task); }
private final Future<V> task;
}
Assuming you execute tasks in parallel and save the Future results in a list:
The practical main difference between ExecutorService and CompletionService is:
ExecutorService get() will try to retrieve the results in the submitted order waiting for completion.
CompletionService take() + get() will try to retrieve the results in the completion order disregarding the submission order.
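In code, the two retrieval styles look roughly like this (a sketch only: tasks is some List<Callable<Long>>, executor an ExecutorService, handle() a hypothetical consumer, and exception handling is omitted):
// ExecutorService: results are read back in submission order.
List<Future<Long>> futures = new ArrayList<>();
for (Callable<Long> task : tasks) {
    futures.add(executor.submit(task));
}
for (Future<Long> f : futures) {
    handle(f.get()); // may block on a slow early task while later ones are already done
}

// CompletionService: results are read back in completion order.
CompletionService<Long> cs = new ExecutorCompletionService<>(executor);
for (Callable<Long> task : tasks) {
    cs.submit(task);
}
for (int i = 0; i < tasks.size(); i++) {
    handle(cs.take().get()); // always the next task that actually finished
}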
The ExecutorCompletionService class implements CompletionService.
ExecutorCompletionService returns Future objects in completion order, so whichever task finishes first is returned first. You just need to call executorCompletionService.take() to get a completed Future object.
I found a blog that cleared this up for me:
java2blog link with example