I am trying to figure out a way to handle exceptions in a multi-threaded setting. I would like to execute certain tasks in parallel, each of which might throw an exception that I need to react to (basically, by putting the failed task back into an execution queue). However, it seems the only way to actually get the exception out of a task is to create a Future and call its get() method, which essentially turns the calls into synchronous calls.
Maybe some code will illustrate the point:
ExecutorService executor = Executors.newFixedThreadPool(nThreads);
Task task = taskQueue.poll(); // let's assume that task implements Runnable
try {
executor.execute(task);
}
catch(Exception ex) {
// record the failed task, so that it can be re-added to the queue
}
However, in this case all the tasks are launched, but exceptions thrown inside them are never caught in this catch block.
An alternative would be to use a Future instead of a thread and retrieve its result:
try {
Future<?> future = executor.submit(task);
future.get();
}
...
In this case the exceptions are indeed caught in the catch block, but at the price of having to wait until the operation is finished. So the tasks are executed sequentially and not in parallel, as desired.
What am I missing? How can I catch each task's exceptions and react to them?
You could trigger all your tasks within one loop and check/await/retry in another:
Map<Future<?>, Task> futures = new HashMap<Future<?>, Task>();

while (!taskQueue.isEmpty()) {
    Task task = taskQueue.poll();
    Future<?> future = executor.submit(task);
    futures.put(future, task);
}

for (Map.Entry<Future<?>, Task> entry : futures.entrySet()) {
    try {
        entry.getKey().get();
    }
    catch (ExecutionException ex) {
        // record the failed task, so that it can be re-added to the queue
        // you should add a retry counter because you want to prevent endless loops
        taskQueue.add(entry.getValue());
    }
    catch (InterruptedException ex) {
        // thread interrupted, restore the interrupt flag and exit
        Thread.currentThread().interrupt();
        return;
    }
}
Look at this code:
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.*;

public class InvokeAny {

    public static void main(String[] args) {
        Callable<String> callableTask = () -> {
            TimeUnit.MILLISECONDS.sleep(300);
            System.out.println("Callable task's execution");
            return "Task's execution";
        };

        List<Callable<String>> callableTasks = new ArrayList<>();
        callableTasks.add(callableTask);
        callableTasks.add(callableTask);
        callableTasks.add(callableTask);

        ExecutorService executorService = Executors.newFixedThreadPool(2);

        try {
            executorService.invokeAny(callableTasks);
        } catch (InterruptedException | ExecutionException e) {
            e.printStackTrace();
        }

        shutdownAndAwaitTermination(executorService);
    }

    private static void shutdownAndAwaitTermination(ExecutorService pool) {
        pool.shutdown(); // Disable new tasks from being submitted
        try {
            // Wait a while for existing tasks to terminate
            if (!pool.awaitTermination(1000, TimeUnit.MILLISECONDS)) {
                pool.shutdownNow(); // Cancel currently executing tasks
                // Wait a while for tasks to respond to being cancelled
                if (!pool.awaitTermination(1000, TimeUnit.MILLISECONDS))
                    System.err.println("Pool did not terminate");
            }
        } catch (InterruptedException ie) {
            // (Re-)Cancel if current thread also interrupted
            pool.shutdownNow();
            // Preserve interrupt status
            Thread.currentThread().interrupt();
        }
    }
}
Every time I run my program, I get different results in the console.
1st run:
Callable task's execution
2nd run:
Callable task's execution
Callable task's execution
3rd run:
Callable task's execution
Could anybody explain to me why this happens?
In Oracle's documentation there is only one phrase about the method invokeAny(Collection<? extends Callable<T>> tasks):
Executes the given tasks, returning the result of one that has
completed successfully (i.e., without throwing an exception), if any
do.
I want to understand how it works. Does it cancel the remaining tasks after one has completed? If it does, why do I sometimes see 2 tasks being executed?
Does it cancel remaining tasks after one was completed?
Yes, that's right, but that does not mean it starts the next task only after the current one has finished; that is the whole point of concurrency: it does not wait for the previous task to complete. invokeAny submits the tasks one by one without waiting for them to complete, and in the meantime checks whether any task has finished. As soon as one completes successfully, it cancels the tasks that are still running, does not submit the remaining ones, and returns the completed task's result.
Now, before the running tasks are actually cancelled, they may or may not already have done their work (in your case, reached the print statement), depending on the time slice each thread gets; that is driven by various JVM and system factors, as pointed out in the comment.
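You can make the cancellation visible by catching the interrupt inside the task. This is only a hedged sketch based on your example (the class name and timings are made up), and the exact output will still vary from run to run:

import java.util.Arrays;
import java.util.concurrent.*;

public class InvokeAnyCancellation {

    public static void main(String[] args) throws Exception {
        // Each task sleeps briefly and then reports; a task that is cancelled
        // while sleeping sees an InterruptedException instead.
        Callable<String> task = () -> {
            try {
                TimeUnit.MILLISECONDS.sleep(300);
                System.out.println(Thread.currentThread().getName() + " completed");
                return "done";
            } catch (InterruptedException e) {
                System.out.println(Thread.currentThread().getName() + " was cancelled");
                throw e;
            }
        };

        ExecutorService pool = Executors.newFixedThreadPool(2);
        try {
            // Returns as soon as one task has succeeded; the rest are cancelled
            // (and with only 2 worker threads, the third may never start at all).
            String result = pool.invokeAny(Arrays.asList(task, task, task));
            System.out.println("invokeAny returned: " + result);
        } finally {
            pool.shutdown();
        }
    }
}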
I am new to concurrency, and I was trying to use an ExecutorService inside a do-while loop, but I always run into a RejectedExecutionException.
Here is my sample code:
do {
    Future<Void> future = executor.submit(new Callable<Void>() {
        @Override
        public Void call() throws Exception {
            // action
            return null;
        }
    });
    futures.add(future);

    executor.shutdown();

    for (Future<Void> f : futures) {
        try {
            f.get();
        }
        catch (InterruptedException | ExecutionException e) {
            throw new IOException(e);
        }
    }
} while (true);
But this seems incorrect. I think I am calling shutdown at the wrong place. Can anyone please help me implement the ExecutorService in a do-while loop correctly? Thanks.
ExecutorService.shutdown() stops the ExecutorService from accepting any more jobs. It should be called when you're done submitting jobs.
Also, Future.get() is a blocking method: it blocks the current thread, and the next iteration of the loop will not start until the future it was called on returns. Since this happens on every iteration, the code effectively runs the tasks one at a time rather than in parallel.
You can use a CountDownLatch to wait for all the jobs to return.
The following is the corrected code:
final List<Object> results = Collections.synchronizedList(new ArrayList<Object>());
final CountDownLatch latch = new CountDownLatch(10); // suppose you'll have 10 futures

do {
    Future<Void> future = executor.submit(new Callable<Void>() {
        @Override
        public Void call() throws Exception {
            // action
            results.add(result); // some result
            latch.countDown();   // decrease the latch count only after the result is recorded
            return null;
        }
    });
    futures.add(future);
} while (true); // NB: this still needs a real termination condition, otherwise the lines below are never reached

executor.shutdown();
latch.await(); // This will block till latch.countDown() has been called 10 times.
// Now results has all the outputs, do what you want with them.
Also if you're working with Java 8 then you can take a look at this answer https://stackoverflow.com/a/36261808/5343269
You're right, the shutdown method is not being called at the correct time. The ExecutorService will not accept tasks after shutdown is called (unless you implement your own version that does).
You should call shutdown after you've already submitted all tasks to the executor, so in this case, somewhere after the do-while loop.
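Here is a minimal sketch of that ordering, reusing the executor and futures idea from your snippet; the bound of 10 iterations and the empty Callable body are placeholders, and it assumes the enclosing method declares IOException like yours does:

ExecutorService executor = Executors.newFixedThreadPool(4);
List<Future<Void>> futures = new ArrayList<>();

int i = 0;
do {
    futures.add(executor.submit(new Callable<Void>() {
        @Override
        public Void call() throws Exception {
            // placeholder for the real action
            return null;
        }
    }));
    i++;
} while (i < 10);              // 1. submit everything first

executor.shutdown();           // 2. only now stop accepting new tasks

for (Future<Void> future : futures) {
    try {
        future.get();          // 3. wait for each result
    } catch (InterruptedException | ExecutionException e) {
        throw new IOException(e); // assumes the enclosing method throws IOException, as in the question
    }
}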
From ThreadPoolExecutor documentation:
Rejected tasks
New tasks submitted in method execute(Runnable) will be rejected when the Executor has been shut down, and also when the Executor uses finite bounds for both maximum threads and work queue capacity, and is saturated.
In either case, the execute method invokes the RejectedExecutionHandler.rejectedExecution(Runnable, ThreadPoolExecutor) method of its RejectedExecutionHandler
From your code it is evident that you are calling shutdown() first and submitting tasks afterwards.
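A tiny, hedged demonstration of that rule (the class name is made up): the second submit happens after shutdown(), so the default AbortPolicy handler rejects it by throwing:

import java.util.concurrent.*;

public class RejectAfterShutdown {

    public static void main(String[] args) {
        ExecutorService executor = Executors.newFixedThreadPool(1);

        executor.submit(() -> System.out.println("accepted before shutdown"));
        executor.shutdown();

        try {
            executor.submit(() -> System.out.println("never runs"));
        } catch (RejectedExecutionException e) {
            // the default handler (AbortPolicy) rejects by throwing
            System.out.println("rejected after shutdown: " + e);
        }
    }
}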
On a different note, refer to this related SE question for the right way to shut down an ExecutorService:
ExecutorService's shutdown() doesn't wait until all threads will be finished
I'm using a global ExecutorService with a fixed thread pool size. We have a bunch of related tasks that we submit for execution and then wait on the list of futures.
Recently we faced a high CPU utilization issue, and while debugging I found that an exception occurred while calling get() on one of the items in the list of futures. Currently we iterate over the list, and there is a try-catch surrounding the whole loop.
List<Result> results = new ArrayList<>();
try {
    for (Future<Result> futureResult : futureResults) {
        Result result = futureResult.get();
        results.add(result);
    }
} catch (Exception e) {
    throw new InternalServiceException(e);
}
// Do something with results
I wanted to know how the other threads behave if get() is never called on some of the futures. I tried searching but was not able to find anything.
Also, can this behaviour trigger high CPU utilization?
http://www.journaldev.com/1650/java-futuretask-example-program
I would still check whether each future isDone(), as in the example linked above.
If you need to run other operations or want to utilize the CPU better, then I would put the collector in a separate thread and perhaps just poll for results every minute or so.
It could be scheduled, or handled with Thread.sleep.
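A hedged sketch of that collector idea, assuming futureResults is the same List<Future<Result>> as in your code and that a one-second polling interval is acceptable:

// Runs on a separate "collector" thread so the submitting thread is never blocked.
Runnable collector = () -> {
    List<Result> results = new ArrayList<>();
    List<Future<Result>> pending = new ArrayList<>(futureResults);

    while (!pending.isEmpty()) {
        Iterator<Future<Result>> it = pending.iterator();
        while (it.hasNext()) {
            Future<Result> f = it.next();
            if (f.isDone()) {                 // non-blocking status check
                try {
                    results.add(f.get());     // returns immediately, the task is already done
                } catch (InterruptedException | ExecutionException e) {
                    // record this one failure instead of aborting the whole collection
                }
                it.remove();
            }
        }
        try {
            Thread.sleep(1000);               // poll once per second
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return;
        }
    }
    // hand `results` over to whatever consumes them
};
new Thread(collector, "result-collector").start();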
The Executors class provides various methods to execute Callable tasks in a thread pool. Since callable tasks run in parallel, we have to wait for the returned object.
Callable tasks return a java.util.concurrent.Future object. Using Future we can find out the status of the Callable task and get the returned object.
Future provides a get() method that waits for the Callable to finish and then returns the result.
There is an overloaded version of get() where we can specify how long to wait for the result; this is useful to avoid the current thread being blocked for too long.
Future also provides a cancel() method to cancel the associated Callable task, and isDone() and isCancelled() methods to find out the current status of the associated Callable task.
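For instance, a minimal hedged sketch of those status methods (kept separate from the larger example below; the sleep is just simulated work):

ExecutorService pool = Executors.newSingleThreadExecutor();
Future<String> future = pool.submit(() -> {
    Thread.sleep(5000);                        // simulate slow work
    return "finished";
});

System.out.println(future.isDone());           // false: not finished (possibly not even started)
future.cancel(true);                           // true = also interrupt the task if it is running
System.out.println(future.isCancelled());      // true
System.out.println(future.isDone());           // true: a cancelled task counts as done
pool.shutdown();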
Here is a simple example of a Callable task that returns the name of the thread executing the task after some random delay.
We use the Executor framework to execute 10 tasks in parallel and use Future to get the results of the submitted tasks.
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.*;

public class FutureObjectTest implements Callable<String> {

    @Override
    public String call() throws Exception {
        long waitTime = (long) (Math.random() * 10000);
        System.out.println(Thread.currentThread().getName() + " waiting time in MILLISECONDS " + waitTime);
        Thread.sleep(waitTime);
        return Thread.currentThread().getName() + " exiting call method.";
    }

    public static void main(String[] args) {
        List<Future<String>> futureObjectList = new ArrayList<Future<String>>();
        ExecutorService executorService = Executors.newFixedThreadPool(5);
        Callable<String> futureObjectTest = new FutureObjectTest();

        for (int i = 0; i < 10; i++) {
            Future<String> futureResult = executorService.submit(futureObjectTest);
            futureObjectList.add(futureResult);
        }

        for (Future<String> futureObj : futureObjectList) {
            try {
                System.out.println(futureObj.get());
            } catch (InterruptedException | ExecutionException e) {
                e.printStackTrace();
            }
        }

        System.out.println("Starting get method of wait");

        //////// get(timeout) method ////////
        futureObjectList.clear();
        for (int i = 0; i < 10; i++) {
            Future<String> futureResult = executorService.submit(futureObjectTest);
            futureObjectList.add(futureResult);
        }
        executorService.shutdown();

        for (Future<String> futureObj : futureObjectList) {
            try {
                System.out.println(futureObj.get(2000, TimeUnit.MILLISECONDS));
            } catch (InterruptedException | ExecutionException | TimeoutException e) {
                e.printStackTrace();
            }
        }
    }
}
I am looking for a way to execute batches of tasks in Java. The idea is to have an ExecutorService based on a thread pool that will allow me to spread a set of Callable tasks among different threads from a main thread. This class should provide a waitForCompletion method that will put the main thread to sleep until all tasks are executed. Then the main thread should be awakened, perform some operations, and resubmit a set of tasks.
This process will be repeated numerous times, so I would like to avoid using ExecutorService.shutdown, as that would require creating multiple instances of ExecutorService.
Currently I have implemented it in the following way using an AtomicInteger and a Lock/Condition:
public class BatchThreadPoolExecutor extends ThreadPoolExecutor {

    private final AtomicInteger mActiveCount;
    private final Lock mLock;
    private final Condition mCondition;

    public <C extends Callable<V>, V> Map<C, Future<V>> submitBatch(Collection<C> batch) {
        ...
        for (C task : batch) {
            submit(task);
            mActiveCount.incrementAndGet();
        }
    }

    @Override
    protected void afterExecute(Runnable r, Throwable t) {
        super.afterExecute(r, t);
        mLock.lock();
        if (mActiveCount.decrementAndGet() == 0) {
            mCondition.signalAll();
        }
        mLock.unlock();
    }

    public void awaitBatchCompletion() throws InterruptedException {
        ...
        // Lock and wait until there is no active task
        mLock.lock();
        while (mActiveCount.get() > 0) {
            try {
                mCondition.await();
            } catch (InterruptedException e) {
                mLock.unlock();
                throw e;
            }
        }
        mLock.unlock();
    }
}
Please note that I will not necessarily submit all of the tasks in a batch at once, therefore a CountDownLatch does not seem to be an option.
Is this a valid way to do it? Is there a more efficient/elegant way to implement that?
Thanks
I think the ExecutorService itself will be able to meet your requirements.
Call invokeAll(...) with your tasks: it blocks until all of them have completed, so once you can iterate over all of the returned Futures, every task is finished.
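A hedged sketch of that per-batch pattern; moreBatches(), nextBatch() and the Integer result type are hypothetical placeholders for your own batching logic:

// moreBatches() and nextBatch() are hypothetical stand-ins for your own batching code.
static void runBatches(ExecutorService pool) throws InterruptedException, ExecutionException {
    while (moreBatches()) {
        List<Callable<Integer>> batch = nextBatch();

        // invokeAll blocks until every task in the batch has completed,
        // so it doubles as the waitForCompletion step.
        List<Future<Integer>> futures = pool.invokeAll(batch);

        for (Future<Integer> f : futures) {
            Integer result = f.get(); // already done: returns immediately or rethrows the task's exception
            // use result to decide what goes into the next batch
        }
    }
}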
As the other answers point out, there doesn't seem to be any part of your use case that requires a custom ExecutorService.
It seems to me that all you need to do is submit a batch, wait for them all to finish while ignoring interrupts on the main thread, then submit another batch perhaps based on the results of the first batch. I believe this is just a matter of:
ExecutorService service = ...;
Collection<Future<?>> futures = new HashSet<Future<?>>();
for (Callable<?> callable : tasks) {
    Future<?> future = service.submit(callable);
    futures.add(future);
}
for (Future<?> future : futures) {
    try {
        future.get();
    } catch (InterruptedException e) {
        // Figure out if the interruption means we should stop.
    } catch (ExecutionException e) {
        // The task threw an exception; decide whether to retry or propagate.
    }
}
// Use the results of futures to figure out a new batch of tasks.
// Repeat the process with the same ExecutorService.
I agree with @ckuetbach that the default Java Executors should provide you with all of the functionality you need to execute a "batch" of jobs.
If I were you I would just submit a bunch of jobs, call shutdown(), wait for them to finish with ExecutorService.awaitTermination(), and then start up a new ExecutorService for the next batch. Doing this to save on "thread creations" is premature optimization unless you are doing this hundreds of times a second or something.
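A hedged sketch of that simpler approach, with a hypothetical runBatch helper and an illustrative pool size and timeout:

// Hypothetical helper: runs one batch on its own pool and waits for it to drain.
static void runBatch(List<Runnable> batch) throws InterruptedException {
    ExecutorService pool = Executors.newFixedThreadPool(4);
    for (Runnable job : batch) {
        pool.execute(job);
    }
    pool.shutdown();                             // no new jobs; already-submitted ones keep running
    if (!pool.awaitTermination(30, TimeUnit.MINUTES)) {
        pool.shutdownNow();                      // give up and interrupt any stragglers
    }
}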
If you really are stuck on using the same ExecutorService for each of the batches then you can allocate a ThreadPoolExecutor yourself, and be in a loop looking at ThreadPoolExecutor.getActiveCount(). Something like:
BlockingQueue<Runnable> jobQueue = new LinkedBlockingQueue<Runnable>();
ThreadPoolExecutor executor = new ThreadPoolExecutor(NUM_THREADS, NUM_THREADS,
        0L, TimeUnit.MILLISECONDS, jobQueue);

// submit your batch of jobs ...

// need to wait a bit for the jobs to start
Thread.sleep(100);
// keep waiting while anything is still running or still queued
while (executor.getActiveCount() > 0 || jobQueue.size() > 0) {
    // to slow the spin
    Thread.sleep(1000);
}
// continue on to submit the next batch
My code:
String[] torrentFiles = new File("/root/torrents/").list();
if(torrentFiles == null || torrentFiles.length == 0)
{
    System.exit(0);
}
ex = Executors.newFixedThreadPool(3);
for(String torrentFile : torrentFiles)
{
ex.submit(new DownloadTorrent("/root/torrents/" + torrentFile));
}
ex.shutdown();
try
{
ex.awaitTermination(30, TimeUnit.MINUTES);
}
catch(InterruptedException ex1)
{
Logger.getLogger(Main.class.getName()).log(Level.SEVERE, null, ex1);
}
But sometimes a torrent download takes an unknown amount of time, and awaitTermination does not work the way I want. I need to stop all executing threads instantly after half an hour, but as far as I know awaitTermination only relies on the interrupt() method, which works only when a thread is looping or waiting, so the timeout achieves nothing in that case. So, how do I do this?
Instant thread termination is never guaranteed unless the thread periodically checks its isInterrupted() flag (or is waiting in an interruptible method, i.e. one that throws InterruptedException).
Consider implementing your worker threads so that they periodically check isInterrupted(). It may look something like this:
public void run() {
    byte[] data;
    do {
        data = receiveDataChunk(timeout);
        processData(data);
    } while (!isInterrupted() && data != null);
}
ExecutorService.shutdownNow() will try to stop all the executing threads.
Here is a quote from the javadoc:
List<Runnable> shutdownNow()
Attempts to stop all actively executing tasks, halts the processing of waiting tasks, and returns a list of the tasks that were awaiting execution.
There are no guarantees beyond best-effort attempts to stop processing actively executing tasks. For example, typical implementations will cancel via Thread.interrupt(), so if any tasks mask or fail to respond to interrupts, they may never terminate.
Since downloading a torrent probably involves blocking IO operations, simply calling cancel()/shutdownNow() won't be enough, because blocking IO operations are not guaranteed to terminate when their respective threads are interrupted.
You also need to close the underlying sockets in order to cancel blocking IO, see How to terminate a thread blocking on socket IO operation instantly?.
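A hedged sketch of that idea; DownloadTask, the host and the port are made up here, the point is only that the task keeps a reference to its socket so a cancel can close it:

import java.io.IOException;
import java.io.InputStream;
import java.net.Socket;

// Illustrative only: DownloadTask is not the DownloadTorrent class from the question.
class DownloadTask implements Runnable {

    private volatile Socket socket;

    @Override
    public void run() {
        try (Socket s = new Socket("tracker.example.org", 6881)) {
            socket = s;
            InputStream in = s.getInputStream();
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {  // blocks here while downloading
                // process `read` bytes ...
            }
        } catch (IOException e) {
            // closing the socket from cancel() makes the blocked read() throw,
            // which is how this thread actually gets to exit
        }
    }

    void cancel() {
        Socket s = socket;
        if (s != null) {
            try {
                s.close();                            // unblocks the read() above
            } catch (IOException ignored) {
            }
        }
    }
}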
ExecutorService.submit(...) returns a Future<?> that has a cancel() method. You should keep track of these and call cancel() when you want each task to stop.
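For example, a hedged sketch reusing the names from the question (torrentFiles, DownloadTorrent); cancel(true) only helps if the task actually reacts to interruption or to its socket being closed:

ExecutorService ex = Executors.newFixedThreadPool(3);
List<Future<?>> futures = new ArrayList<>();

for (String torrentFile : torrentFiles) {
    futures.add(ex.submit(new DownloadTorrent("/root/torrents/" + torrentFile)));
}

ex.shutdown();
try {
    if (!ex.awaitTermination(30, TimeUnit.MINUTES)) {
        for (Future<?> f : futures) {
            f.cancel(true); // interrupts the worker; only effective if the task honours interruption
        }
    }
} catch (InterruptedException e) {
    Thread.currentThread().interrupt();
}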
I am using this code I have created.
It generates many PDF files from many HTML templates using wkhtmltopdf,
so I want to increase the performance of creating hundreds of them without keeping the client waiting; this is only one part of the implementation.
getListOfCallables returns the list of callables sized to the optimal
number of threads to use when creating the fixed pool.
I can't afford to have lots of lingering threads lying around; it left my EC2
CPU stuck at 100%.
I used:
shutdown()
shutdownNow() in the else branch after awaitTermination
shutdownNow() in the exception handler
List fileGenerationHtmlToPdfList = getListOfCallables(paths, name, options);
ExecutorService executorService = Executors.newFixedThreadPool(fileGenerationHtmlToPdfList.size());
List<Future<ArrayList<File>>> futures = null;
try {
    futures = executorService.invokeAll(fileGenerationHtmlToPdfList);
    try {
        for (Future f : futures) {
            files.addAll((ArrayList<File>) f.get());
        }
    } catch (InterruptedException ex) {
        Logger.getLogger(FileUtil.class.getName()).log(Level.SEVERE, "Interrupted Exception ", ex);
    } catch (ExecutionException ex) {
        Logger.getLogger(FileUtil.class.getName()).log(Level.SEVERE, "Execution Exception ", ex);
    }
} catch (InterruptedException ex) {
    Logger.getLogger(FileUtil.class.getName()).log(Level.SEVERE, "Interrupted Exception ", ex);
}

executorService.shutdown(); // try shutdown
try {
    if (executorService.awaitTermination(5, TimeUnit.SECONDS)) {
        Logger.getLogger(FileUtil.class.getName()).log(Level.SEVERE, "Done ShutDowned");
    } else {
        executorService.shutdownNow();
    }
} catch (InterruptedException ex) {
    executorService.shutdownNow();
    Logger.getLogger(FileUtil.class.getName()).log(Level.SEVERE, "Interrupted Exception ", ex);
}
Now I have to stop the threads in a pool. I am doing it this way; it may not be a good idea, so please comment if it isn't.
boolean isTerminated = mPoolThreads.isTerminated();
while (!isTerminated) {
    mPoolThreads.shutdownNow();
    isTerminated = mPoolThreads.isTerminated();
    Log.i(Constants.LOG_TAG, "Stop threads: the threads are not terminated yet");
}
Log.w(Constants.LOG_TAG, "Stop threads: Terminated");