I'm using ExecutorService to submit a batch of tasks. I'm doing something like this:
ListeningExecutorService exec = MoreExecutors.listeningDecorator(Executors.newFixedThreadPool(threads));
List<ListenableFuture<Whatever>> futures = new ArrayList<>();
for (int i = 0; i < 100; i++) {
    futures.add(exec.submit(new MyTask(i)));
}
ListenableFuture<List<Whatever>> listListenableFuture = Futures.successfulAsList(futures);
try {
    List<Whatever> responses = listListenableFuture.get(2000, TimeUnit.MILLISECONDS);
    for (Whatever response : responses) {
        LOG.info("Yay!");
    }
} catch (TimeoutException e) {
    LOG.info("Timeout Exception");
} catch (Exception e) {
    // Nay!
}
The problem here is: if one of the tasks takes longer than 2000 ms, the get call throws a TimeoutException and I get nothing back, even though some of the tasks may already have finished by that point.
So I want to retrieve the responses (partial or complete) of the tasks that have finished by the time the 2000 ms timeout expires. For example:
(time relative to the START_TIME of the batch call)
Task-1: 1000ms
Task-2: 3000ms
Task-3: 1800ms
Output:
Timeout Exception
Desired Output:
Yay! <- corresponds to task-1
Yay! <- corresponds to task-3
One solution I thought of is to fetch the futures individually and give each one a timeout of MAX(0, TIMEOUT - (TIME_NOW - START_TIME)). This might work, but it doesn't seem like a clean solution to me.
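For reference, what I had in mind looks roughly like this (a sketch only, reusing the futures list and the Whatever type from the snippet above):
long deadline = System.currentTimeMillis() + 2000;
List<Whatever> responses = new ArrayList<>();
for (ListenableFuture<Whatever> future : futures) {
    // Each future only gets whatever is left of the overall 2000 ms budget.
    long remaining = Math.max(0, deadline - System.currentTimeMillis());
    try {
        responses.add(future.get(remaining, TimeUnit.MILLISECONDS));
    } catch (TimeoutException e) {
        future.cancel(true); // this task missed the budget
    } catch (InterruptedException | ExecutionException e) {
        // Nay!
    }
}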
You could use a decorating callable that handles the timeout.
Suppose this is the original callable:
class OriginalCallable implements Callable<String> {
    @Override
    public String call() throws Exception {
        return "";
    }
}
You can then construct a decorating callable from the original callable and the executor:
class DecorateCallable implements Callable<String> {
    ExecutorService executorService;
    OriginalCallable callable;

    public DecorateCallable(ExecutorService executorService, OriginalCallable callable) {
        this.executorService = executorService;
        this.callable = callable;
    }

    @Override
    public String call() throws Exception {
        Future<String> future = executorService.submit(callable);
        try {
            return future.get(2000, TimeUnit.MILLISECONDS);
        } catch (TimeoutException | InterruptedException e) {
            // timed out or interrupted: fall through and return null
        }
        return null;
    }
}
If you decide to use this, you need to double your pool size:
Executors.newFixedThreadPool(threads * 2);
and add a check like if (future.get() != null) before putting results into the final result set, for example:
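A rough sketch of how the decorated callables could be used (the originals list and the other names here are only illustrative):
ExecutorService pool = Executors.newFixedThreadPool(threads * 2);
List<Future<String>> futures = new ArrayList<>();
for (OriginalCallable original : originals) {
    futures.add(pool.submit(new DecorateCallable(pool, original)));
}
List<String> results = new ArrayList<>();
for (Future<String> future : futures) {
    try {
        String value = future.get(); // the wrapper has already applied the timeout
        if (value != null) {
            results.add(value);
        }
    } catch (InterruptedException | ExecutionException e) {
        // ignore failed tasks
    }
}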
If you combine Futures.successfulAsList with Futures.getChecked, the exception from a failed task gets swallowed and that task's slot in the result list is null. Check out the following code, where one of the tasks throws a TimeoutException and the corresponding entry in the list is null.
import java.io.IOException;
import com.google.common.util.concurrent.*;
import java.time.*;
import java.util.*;
import java.util.concurrent.*;
import java.util.stream.*;

public class App {
    public static void main(String[] args) throws InterruptedException, ExecutionException {
        ListeningExecutorService listeningExecutorService =
                MoreExecutors.listeningDecorator(Executors.newFixedThreadPool(5));

        ListenableFuture<String> future1 =
                listeningExecutorService.submit(() -> {
                    throw new TimeoutException("Timeout exception");
                });
        ListenableFuture<String> future2 =
                listeningExecutorService.submit(() -> "Hello World");

        ListenableFuture<List<String>> combined = Futures.successfulAsList(future1, future2);
        try {
            String greeting = Futures.getChecked(combined, IOException.class, 2000L, TimeUnit.MILLISECONDS)
                    .stream()
                    .collect(Collectors.joining(" "));
            System.out.println(greeting);
        } catch (IOException e) {
            System.out.println("Exception: " + e.getMessage());
        } finally {
            listeningExecutorService.shutdown();
        }
    }
}
TL;DR: I want to perform an asynchronous call to a REST API. The standard call would give me a CompletableFuture<Response>; however, because the API limits how many calls it allows in a certain amount of time, I want to be able to queue up calls so that they 1. execute in order and 2. execute only when I am not exceeding the API's limits at that moment, otherwise wait.
Long version:
I am using Retrofit to perform REST calls to an API, and Retrofit returns a CompletableFuture<WhateverResponseClassIDeclare> when I call it. However, due to limitations of the API I am calling, I want tight control over when and in what order my calls go out. In particular, too many calls in a certain timeframe would get my IP banned. Similarly, I want to maintain the order of my calls, even if they won't be executed immediately. The goal is to call a wrapper of the API that returns a CompletableFuture just like the original API but performs those in-between steps asynchronously.
I was playing around with BlockingQueues, Functions, Callables, Suppliers and everything in between, but I couldn't get it to work yet.
Below is my currently non-functional code, created as a mockup to test the concept.
import java.util.concurrent.BlockingDeque;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.LinkedBlockingDeque;
import java.util.function.Function;

public class Sandbox2 {
    public static void main(String[] args) throws ExecutionException, InterruptedException {
        MockApi mockApi = new MockApi();
        CompletableFuture<Integer> result1 = mockApi.requestAThing("Req1");
        CompletableFuture<Integer> result2 = mockApi.requestAThing("Req2");
        CompletableFuture<Integer> result3 = mockApi.requestAThing("Req3");
        System.out.println("Result1: " + result1.get());
        System.out.println("Result2: " + result2.get());
        System.out.println("Result3: " + result3.get());
    }

    public static class MockApi {
        ActualApi actualApi = new ActualApi();
        BlockingDeque<Function<String, CompletableFuture<Integer>>> queueBlockingDeque = new LinkedBlockingDeque<>();

        public CompletableFuture<Integer> requestAThing(String req1) {
            Function<String, CompletableFuture<Integer>> function = new Function<String, CompletableFuture<Integer>>() {
                @Override
                public CompletableFuture<Integer> apply(String s) {
                    return actualApi.requestHandler(s);
                }
            };
            return CompletableFuture
                    .runAsync(() -> queueBlockingDeque.addLast(function))
                    .thenRun(() -> waitForTheRightMoment(1000))
                    .thenCombine(function)   // does not compile - this is the part I can't get to work
        }

        private void waitForTheRightMoment(int time) {
            try {
                Thread.sleep(time);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }

    public static class ActualApi {
        public CompletableFuture<Integer> requestHandler(String request) {
            return CompletableFuture.supplyAsync(() -> {
                try {
                    Thread.sleep(1000);
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
                return Integer.parseInt(request.substring(3));
            });
        }
    }
}
Pre JDK 9 (JDK 1.8)
You can make use of a ScheduledExecutorService, which accepts tasks to be executed asynchronously on a pre-configured thread pool after a fixed delay or at a fixed rate.
You can obtain such a service as follows:
private final ScheduledExecutorService executorService = Executors.newSingleThreadScheduledExecutor();
Once an instance of ScheduledExecutorService is created, you can start submitting items (requests) to be executed as follows:
executorService.schedule(
        () -> actualApi.requestHandler(req),
        delay,
        unit
);
Note that a direct call like this won't give you a CompletableFuture<Integer>; it gives you a ScheduledFuture<CompletableFuture<Integer>>, on which you would have to block to get the wrapped result.
Instead, block on the request's result inside the scheduled task and complete a CompletableFuture that you hand back to the caller:
public <T> CompletableFuture<T> scheduleCompletableFuture(
        final CompletableFuture<T> command,
        final long delay,
        final TimeUnit unit) {
    final CompletableFuture<T> completableFuture = new CompletableFuture<>();
    this.executorService.schedule(
            () -> {
                try {
                    return completableFuture.complete(command.get());
                } catch (Throwable t) {
                    return completableFuture.completeExceptionally(t);
                }
            },
            delay,
            unit
    );
    return completableFuture;
}
Here is a revised version of your implementation:
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;

public class Sandbox2 {
    public static void main(String[] args) throws ExecutionException, InterruptedException {
        MockApi mockApi = new MockApi();
        CompletableFuture<Integer> result1 = mockApi.requestAThing("Req1");
        CompletableFuture<Integer> result2 = mockApi.requestAThing("Req2");
        CompletableFuture<Integer> result3 = mockApi.requestAThing("Req3");
        System.out.println("Result1: " + result1.get());
        System.out.println("Result2: " + result2.get());
        System.out.println("Result3: " + result3.get());
    }

    public static class MockApi {
        private final AtomicLong delay = new AtomicLong(0);
        private final ScheduledExecutorService executorService = Executors.newSingleThreadScheduledExecutor();

        public CompletableFuture<Integer> requestAThing(String req1) {
            return this.scheduleCompletableFuture(new ActualApi().requestHandler(req1), delay.incrementAndGet(), TimeUnit.SECONDS);
        }

        public <T> CompletableFuture<T> scheduleCompletableFuture(
                final CompletableFuture<T> command,
                final long delay,
                final TimeUnit unit) {
            final CompletableFuture<T> completableFuture = new CompletableFuture<>();
            this.executorService.schedule(
                    () -> {
                        try {
                            return completableFuture.complete(command.get());
                        } catch (Throwable t) {
                            return completableFuture.completeExceptionally(t);
                        }
                    },
                    delay,
                    unit
            );
            return completableFuture;
        }
    }

    public static class ActualApi {
        public CompletableFuture<Integer> requestHandler(String request) {
            return CompletableFuture.supplyAsync(() -> {
                try {
                    Thread.sleep(1000);
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
                return Integer.parseInt(request.substring(3));
            });
        }
    }
}
JDK 9 and onward
If you are using JDK 9 or later, you can make use of the built-in delayed executor:
CompletableFuture<String> future = new CompletableFuture<>();
future.completeAsync(() -> {
    try {
        // do something and return its result
        return "..."; // placeholder for the computed value
    } catch (Throwable e) {
        // do something on error
        return null;
    }
}, CompletableFuture.delayedExecutor(1, TimeUnit.SECONDS));
Your MockApi#requestAThing would then be cleaner and shorter, and you no longer need a custom ScheduledExecutorService:
public static class MockApi {
    private final AtomicLong delay = new AtomicLong(0);

    public CompletableFuture<Integer> requestAThing(String req1) {
        CompletableFuture<Void> future = new CompletableFuture<>();
        return future.completeAsync(() -> null, CompletableFuture.delayedExecutor(delay.incrementAndGet(), TimeUnit.SECONDS))
                .thenCombineAsync(new ActualApi().requestHandler(req1), (nil, result) -> result);
    }

    // ...
}
You might consider using bucket4j
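For illustration only, a rate-limited call with bucket4j might look roughly like this (a sketch assuming the io.github.bucket4j 8.x builder API; the RateLimitedApi class name and the 10-calls-per-second limit are made up):
import io.github.bucket4j.Bandwidth;
import io.github.bucket4j.Bucket;

import java.time.Duration;
import java.util.concurrent.Callable;

public class RateLimitedApi {
    // At most 10 calls per second; calls beyond that block until a token is available.
    private final Bucket bucket = Bucket.builder()
            .addLimit(Bandwidth.simple(10, Duration.ofSeconds(1)))
            .build();

    public <T> T call(Callable<T> apiCall) throws Exception {
        bucket.asBlocking().consume(1); // waits for a token before performing the call
        return apiCall.call();
    }
}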
I have found a way to produce my desired behaviour. By limiting my Executor to a single Thread I can queue up calls and they will follow the order I queued them up in.
I will supply the code of my mock classes below for anyone interested:
import java.util.Random;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class Sandbox2 {
    public static void main(String[] args) throws ExecutionException, InterruptedException {
        MockApi mockApi = new MockApi();
        CompletableFuture<Integer> result1 = mockApi.requestAThing("Req1");
        System.out.println("Request1 queued up");
        CompletableFuture<Integer> result2 = mockApi.requestAThing("Req2");
        System.out.println("Request2 queued up");
        CompletableFuture<Integer> result3 = mockApi.requestAThing("Req3");
        System.out.println("Request3 queued up");

        // Some other logic happens here
        Thread.sleep(10000);

        System.out.println("Result1: " + result1.get());
        System.out.println("Result2: " + result2.get());
        System.out.println("Result3: " + result3.get());
        System.exit(0);
    }

    public static class MockApi {
        ActualApi actualApi = new ActualApi();
        private ExecutorService executorService = Executors.newSingleThreadExecutor();

        public CompletableFuture<Integer> requestAThing(String req1) {
            return CompletableFuture.supplyAsync(() -> {
                try {
                    System.out.println("Waiting with " + req1);
                    waitForTheRightMoment(new Random().nextInt(1000) + 1000);
                    System.out.println("Done Waiting with " + req1);
                    return actualApi.requestHandler(req1).get();
                } catch (InterruptedException | ExecutionException e) {
                    e.printStackTrace();
                }
                return null;
            }, executorService);
        }

        private void waitForTheRightMoment(int time) {
            try {
                Thread.sleep(time);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }

    public static class ActualApi {
        public CompletableFuture<Integer> requestHandler(String request) {
            return CompletableFuture.supplyAsync(() -> {
                try {
                    Thread.sleep(new Random().nextInt(1000) + 1000);
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
                System.out.println("Request Handled " + request);
                return Integer.parseInt(request.substring(3));
            });
        }
    }
}
I am trying to write a simple function that long-polls multiple messages to the downstream dependency without exhausting it, and only exits when all messages have succeeded.
I came up with a way to wrap each message poll in a Callable and use an ExecutorService to submit the list of callables.
public void poll(final List<Long> messageIdList) {
    ExecutorService executorService = Executors.newFixedThreadPool(messageIdList.size());
    List<MessageStatusCallable> callables = messageIdList.stream()
            .map(messageId -> new MessageStatusCallable(messageId))
            .collect(Collectors.toList());
    boolean allSuccess = false;
    try {
        allSuccess = executorService.invokeAll(callables).stream().allMatch(success -> {
            try {
                return success.get().equals(Boolean.TRUE);
            } catch (InterruptedException | ExecutionException e) {
                e.printStackTrace();
                return false;
            }
        });
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
}
private class MessageStatusCallable implements Callable<Boolean> {
    private Long messageId;

    public MessageStatusCallable(Long messageId) {
        this.messageId = messageId;
    }

    /**
     * Computes a result, or throws an exception if unable to do so.
     *
     * @return computed result
     * @throws Exception if unable to compute a result
     */
    @Override
    public Boolean call() throws Exception {
        String messageStatus = downstreamService.getMessageStatus(messageId);
        while (messageStatus == null || !messageStatus.equals(STATUS_VALUE_SUCCEEDED)) {
            messageStatus = messageLogToControlServer.getMessageStatus(messageId);
            Thread.sleep(TimeUnit.MICROSECONDS.toMillis(100));
        }
        LOG.info("Message: " + messageId + " Succeeded");
        return true;
    }
}
I wonder if there is a better way to achieve this since Thread.sleep is blocking and ugly.
I'm not sure this is the best solution but it occurred to me you could use a CountDownLatch and ScheduledExecutorService.
public void poll(final List<Long> messageIdList) throws InterruptedException {
    CountDownLatch latch = new CountDownLatch(messageIdList.size());
    ScheduledExecutorService executorService = Executors.newScheduledThreadPool(POOL_SIZE);
    try {
        for (Long messageId : messageIdList) {
            executorService.scheduleWithFixedDelay(
                    () -> {
                        String messageStatus = downstreamService.getMessageStatus(messageId);
                        if (STATUS_VALUE_SUCCEEDED.equals(messageStatus)) {
                            latch.countDown();
                            // Throwing stops this task from being rescheduled.
                            throw new CompletionException("Success - killing the task", null);
                        }
                    },
                    0, 100, TimeUnit.MILLISECONDS);
        }
        latch.await();
    } finally {
        executorService.shutdown();
    }
}
I probably also wouldn't have the Runnable as a lambda other than for brevity in the answer.
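Spelled out as a named class instead of the lambda, the poller might look like this (a sketch only; it assumes the class is nested somewhere it can see downstreamService and STATUS_VALUE_SUCCEEDED, and it receives the latch through its constructor):
class MessagePoller implements Runnable {
    private final Long messageId;
    private final CountDownLatch latch;

    MessagePoller(Long messageId, CountDownLatch latch) {
        this.messageId = messageId;
        this.latch = latch;
    }

    @Override
    public void run() {
        String messageStatus = downstreamService.getMessageStatus(messageId);
        if (STATUS_VALUE_SUCCEEDED.equals(messageStatus)) {
            latch.countDown();
            // Throwing prevents the scheduler from running this poller again.
            throw new CompletionException("Success - killing the task", null);
        }
    }
}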
What is the best-practice approach to launch a pool of thousands of tasks (where up to 4 should be able to execute in parallel) and automatically time them out if they individually take more than 3 seconds?
While I found that ExecutorService seems helpful (see the SSCCE from another post below), I don't see how to make this work for multiple tasks running in parallel, since future.get(3, TimeUnit.SECONDS) executes on the same thread that launches the tasks, so there is no opportunity to launch multiple tasks in parallel:
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class Test {
    public static void main(String[] args) throws Exception {
        ExecutorService executor = Executors.newSingleThreadExecutor();
        Future<String> future = executor.submit(new Task());
        try {
            System.out.println("Started..");
            System.out.println(future.get(3, TimeUnit.SECONDS));
            System.out.println("Finished!");
        } catch (TimeoutException e) {
            future.cancel(true);
            System.out.println("Terminated!");
        }
        executor.shutdownNow();
    }
}

class Task implements Callable<String> {
    @Override
    public String call() throws Exception {
        Thread.sleep(4000); // Just to demo a long running task of 4 seconds.
        return "Ready!";
    }
}
Thanks!
If you have to monitor each task to kill it when it exceeds the timeout period, either
the task itself has to keep track of time and quit appropriately, OR
you have to create a second watchdog thread for every task. The watchdog thread sets a timer and sleeps, waking up after the timeout interval expires and then terminating the task if it's still running (sketched below).
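A minimal sketch of the watchdog idea, using a scheduled executor in place of a sleeping thread per task (the pool sizes and the submitWithTimeout name are illustrative, not from the question):
import java.util.concurrent.*;

public class WatchdogExample {
    private static final ExecutorService WORKERS = Executors.newFixedThreadPool(4);
    private static final ScheduledExecutorService WATCHDOG =
            Executors.newSingleThreadScheduledExecutor();

    // Submits the task and arranges for it to be interrupted if it runs past the timeout.
    public static <T> Future<T> submitWithTimeout(Callable<T> task, long timeout, TimeUnit unit) {
        Future<T> future = WORKERS.submit(task);
        WATCHDOG.schedule(() -> future.cancel(true), timeout, unit);
        return future;
    }
}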
This is a tricky one. Here’s what I came up with:
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Collection;
import java.util.concurrent.Callable;
import java.util.concurrent.CancellationException;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.logging.Level;
import java.util.logging.Logger;

public class TaskQueue<T> {
    private static final Logger logger = Logger.getLogger(TaskQueue.class.getName());

    private final Collection<Callable<T>> tasks;
    private final int maxTasks;
    private int addsPending;
    private final Collection<T> results = new ArrayList<T>();
    private final ScheduledExecutorService executor;

    public TaskQueue() {
        this(4);
    }

    public TaskQueue(int maxSimultaneousTasks) {
        maxTasks = maxSimultaneousTasks;
        tasks = new ArrayDeque<>(maxTasks);
        executor = Executors.newScheduledThreadPool(maxTasks * 3);
    }

    private void addWhenAllowed(Callable<T> task)
            throws InterruptedException, ExecutionException {
        synchronized (tasks) {
            while (tasks.size() >= maxTasks) {
                tasks.wait();
            }
            tasks.add(task);
            if (--addsPending <= 0) {
                tasks.notifyAll();
            }
        }

        Future<T> future = executor.submit(task);
        executor.schedule(() -> future.cancel(true), 3, TimeUnit.SECONDS);

        try {
            T result = future.get();
            synchronized (tasks) {
                results.add(result);
            }
        } catch (CancellationException e) {
            logger.log(Level.FINE, "Canceled", e);
        } finally {
            synchronized (tasks) {
                tasks.remove(task);
                if (tasks.isEmpty()) {
                    tasks.notifyAll();
                }
            }
        }
    }

    public void add(Callable<T> task) {
        synchronized (tasks) {
            addsPending++;
        }
        executor.submit(new Callable<Void>() {
            @Override
            public Void call() throws InterruptedException, ExecutionException {
                addWhenAllowed(task);
                return null;
            }
        });
    }

    public Collection<T> getAllResults() throws InterruptedException {
        synchronized (tasks) {
            while (addsPending > 0 || !tasks.isEmpty()) {
                tasks.wait();
            }
            return new ArrayList<T>(results);
        }
    }

    public void shutdown() {
        executor.shutdown();
    }
}
I suspect it could be done more cleanly using Locks and Conditions instead of synchronization.
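For what it's worth, a rough sketch of that capacity gate rewritten with a ReentrantLock and Condition (the SlotGate name and shape are made up for illustration; the fields mirror the TaskQueue above):
import java.util.ArrayDeque;
import java.util.Collection;
import java.util.concurrent.Callable;
import java.util.concurrent.locks.Condition;
import java.util.concurrent.locks.ReentrantLock;

class SlotGate<T> {
    private final int maxTasks;
    private final Collection<Callable<T>> tasks = new ArrayDeque<>();
    private final ReentrantLock lock = new ReentrantLock();
    private final Condition spaceAvailable = lock.newCondition();

    SlotGate(int maxTasks) {
        this.maxTasks = maxTasks;
    }

    void acquire(Callable<T> task) throws InterruptedException {
        lock.lock();
        try {
            while (tasks.size() >= maxTasks) {
                spaceAvailable.await();      // wait for a slot to free up
            }
            tasks.add(task);
        } finally {
            lock.unlock();
        }
    }

    void release(Callable<T> task) {
        lock.lock();
        try {
            tasks.remove(task);
            spaceAvailable.signalAll();      // let waiting submitters re-check
        } finally {
            lock.unlock();
        }
    }
}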
There is a method that I need to run repeatedly at a specific interval, so I was testing Java's ExecutorService, but my method is not being invoked repeatedly. Could you please tell me why?
These are my classes:
FutureTask.java
package com;
import java.lang.reflect.Method;
import java.util.concurrent.*;
public class FutureTask {
private static ExecutorService executor = Executors.newCachedThreadPool();
private static FutureTask _instance = new FutureTask();
public static FutureTask getInstance() {
return _instance;
}
private static int timoutsec = 15;
public Object submiteTask(final Object obj, final Method method,
final Object[] params) throws Exception {
return submiteTask(obj, method, params, -1);
}
public Object submiteTask(final Object obj, final Method method,
final Object[] params, int timeoutSeconds) throws Exception {
if (null != obj && method != null) {
Callable<Object> task = new Callable<Object>() {
public Object call() {
try {
method.setAccessible(true);
Object resultObj = method.invoke(obj, params);
return resultObj;
} catch (Exception e) {
}
return null;
}
};
Future<Object> future = executor.submit(task);
try {
Object result = null;
if (timeoutSeconds < 0) {
result = future.get(timoutsec, TimeUnit.SECONDS);
} else {
result = future.get(timeoutSeconds, TimeUnit.SECONDS);
}
return result;
} catch (TimeoutException e) {
} catch (Exception e) {
} finally {
future.cancel(true);
}
}
return null;
}
public static void main(String args[]) {
try {
FutureTask.getInstance().submiteTask(
new TestingFutureTaskUtil(),
TestingFutureTaskUtil.class.getDeclaredMethod(
"updateMethodCalled",
new Class<?>[] { String.class }),
new Object[] { "UBSC!OC1010" }, 1);
} catch (Exception e) {
e.printStackTrace();
}
}
}
TestingFutureTaskUtil.java
package com;
public class TestingFutureTaskUtil {
    public void updateMethodCalled(String symbol) {
        System.out.println("updateMethodCalled" + symbol);
    }
}
Thanks in advance.
You only submit one job, so updateMethodCalled is only called once.
You are using a normal ExecutorService, which doesn't support scheduling tasks. You need to use a ScheduledExecutorService.
You need to change the following:
private static ScheduledExecutorService executor = Executors.newScheduledThreadPool(poolSize);
and:
ScheduledFuture<?> future = executor.scheduleAtFixedRate(task, timeoutSeconds, timeoutSeconds, TimeUnit.SECONDS);
(Note that scheduleAtFixedRate takes a Runnable rather than a Callable, so the task has to be adapted accordingly.) Now the task will be executed every timeoutSeconds seconds. Afterwards you can return the ScheduledFuture and get the updated values from it.
Maybe it is just because of the example, but I would create the Callable outside and hand it to FutureTask; then you don't need reflection. Also, the way you are doing the asynchronous call is wrong, because the calling thread always waits for the computation to finish, so you gain no benefit from running the method in another thread. Maybe you need to rethink the whole design of what you are doing.
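Putting those two suggestions together, a minimal reflection-free sketch of the periodic update might look like this (the 15-second period and the PeriodicUpdate class name are only illustrative):
package com;

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class PeriodicUpdate {
    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        TestingFutureTaskUtil util = new TestingFutureTaskUtil();
        // Run immediately, then again every 15 seconds.
        scheduler.scheduleAtFixedRate(
                () -> util.updateMethodCalled("UBSC!OC1010"),
                0, 15, TimeUnit.SECONDS);
    }
}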
I have the following code snippet that basically scans through the list of tasks that need to be executed; each task is then given to the executor for execution.
The JobExecutor in turn creates another executor (for doing db stuff: reading and writing data to a queue) and completes the task.
JobExecutor returns a Future<Boolean> for the tasks submitted. When one of the tasks fails, I want to gracefully interrupt all the threads and shut down the executor, catching all the exceptions. What changes do I need to make?
public class DataMovingClass {
    private static final AtomicInteger uniqueId = new AtomicInteger(0);
    private static final ThreadLocal<Integer> uniqueNumber = new IDGenerator();

    ThreadPoolExecutor threadPoolExecutor = null;
    private List<Source> sources = new ArrayList<Source>();

    private static class IDGenerator extends ThreadLocal<Integer> {
        @Override
        public Integer get() {
            return uniqueId.incrementAndGet();
        }
    }

    public void init() {
        // load sources list
    }

    public boolean execute() {
        boolean success = true;
        threadPoolExecutor = new ThreadPoolExecutor(10, 10,
                10, TimeUnit.SECONDS, new ArrayBlockingQueue<Runnable>(1024),
                new ThreadFactory() {
                    public Thread newThread(Runnable r) {
                        Thread t = new Thread(r);
                        t.setName("DataMigration-" + uniqueNumber.get());
                        return t;
                    }
                }, new ThreadPoolExecutor.CallerRunsPolicy());

        List<Future<Boolean>> result = new ArrayList<Future<Boolean>>();
        for (Source source : sources) {
            result.add(threadPoolExecutor.submit(new JobExecutor(source)));
        }

        for (Future<Boolean> jobDone : result) {
            try {
                if (!jobDone.get(100000, TimeUnit.SECONDS) && success) {
                    // in case of successful DbWriterClass, we don't need to change it.
                    success = false;
                }
            } catch (Exception ex) {
                // handle exceptions
            }
        }
        return success;
    }

    public class JobExecutor implements Callable<Boolean> {
        private ThreadPoolExecutor threadPoolExecutor;
        Source jobSource;

        public JobExecutor(Source source) {
            this.jobSource = source;
            threadPoolExecutor = new ThreadPoolExecutor(10, 10, 10, TimeUnit.SECONDS, new ArrayBlockingQueue<Runnable>(1024),
                    new ThreadFactory() {
                        public Thread newThread(Runnable r) {
                            Thread t = new Thread(r);
                            t.setName("Job Executor-" + uniqueNumber.get());
                            return t;
                        }
                    }, new ThreadPoolExecutor.CallerRunsPolicy());
        }

        public Boolean call() throws Exception {
            boolean status = true;
            System.out.println("Starting Job = " + jobSource.getName());
            try {
                // do the specified task
            } catch (InterruptedException intrEx) {
                logger.warn("InterruptedException", intrEx);
                status = false;
            } catch (Exception e) {
                logger.fatal("Exception occurred while executing task " + jobSource.getName(), e);
                status = false;
            }
            System.out.println("Ending Job = " + jobSource.getName());
            return status;
        }
    }
}
When you submit a task to the executor, it returns you a Future (backed by a FutureTask).
Future.get() will re-throw any exception thrown by the task, wrapped in an ExecutionException.
So when you iterate through the List<Future> and call get() on each, catch ExecutionException and invoke an orderly shutdown.
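A sketch of that loop, reusing result, threadPoolExecutor and logger from the question's code:
boolean success = true;
for (Future<Boolean> jobDone : result) {
    try {
        if (!jobDone.get(100000, TimeUnit.SECONDS)) {
            success = false;
        }
    } catch (ExecutionException e) {
        logger.fatal("A task failed, shutting down", e);
        success = false;
        threadPoolExecutor.shutdownNow(); // interrupts the jobs that are still running
        break;
    } catch (TimeoutException | InterruptedException e) {
        success = false;
        threadPoolExecutor.shutdownNow();
        break;
    }
}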
Since you are submitting tasks to ThreadPoolExecutor, the exceptions are getting swallowed by FutureTask.
Have a look at this code
Inside FutureTask$Sync:
void innerRun() {
    if (!compareAndSetState(READY, RUNNING))
        return;
    runner = Thread.currentThread();
    if (getState() == RUNNING) { // recheck after setting thread
        V result;
        try {
            result = callable.call();
        } catch (Throwable ex) {
            setException(ex);
            return;
        }
        set(result);
    } else {
        releaseShared(0); // cancel
    }
}

protected void setException(Throwable t) {
    sync.innerSetException(t);
}
From the above code, it is clear that the setException method catches Throwable. For this reason, FutureTask swallows all exceptions when you use the submit() method on ThreadPoolExecutor.
As per the Java documentation, you can override the afterExecute() method in ThreadPoolExecutor:
protected void afterExecute(Runnable r, Throwable t)
Sample code as per documentation:
class ExtendedExecutor extends ThreadPoolExecutor {
    // ...
    protected void afterExecute(Runnable r, Throwable t) {
        super.afterExecute(r, t);
        if (t == null && r instanceof Future<?>) {
            try {
                Object result = ((Future<?>) r).get();
            } catch (CancellationException ce) {
                t = ce;
            } catch (ExecutionException ee) {
                t = ee.getCause();
            } catch (InterruptedException ie) {
                Thread.currentThread().interrupt(); // ignore/reset
            }
        }
        if (t != null)
            System.out.println(t);
    }
}
You can catch exceptions in three ways:
Future.get(), as suggested in the accepted answer
wrap the entire run() or call() method body in a try/catch (Exception) block
override ThreadPoolExecutor's afterExecute method, as shown above
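For completeness, option 2 simply means doing the catching inside the task itself, for example (reusing jobSource and logger from the question):
public Boolean call() {
    try {
        // do the specified task
        return Boolean.TRUE;
    } catch (Exception e) {
        logger.fatal("Task " + jobSource.getName() + " failed", e);
        return Boolean.FALSE;
    }
}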
To gracefully interrupt other threads, have a look at the SE questions below:
How to stop next thread from running in a ScheduledThreadPoolExecutor
How to forcefully shutdown java ExecutorService
Subclass ThreadPoolExecutor and override its protected afterExecute (Runnable r, Throwable t) method.
If you're creating a thread pool via the java.util.concurrent.Executors convenience class (which you're not), take a look at its source to see how it invokes ThreadPoolExecutor.
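For reference, Executors.newFixedThreadPool(n) essentially builds new ThreadPoolExecutor(n, n, 0L, TimeUnit.MILLISECONDS, new LinkedBlockingQueue<Runnable>()), so a sketch of a subclass that mirrors that construction and adds the afterExecute hook could look like this (the LoggingThreadPoolExecutor name is made up):
import java.util.concurrent.CancellationException;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Future;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

// A fixed-size pool (like Executors.newFixedThreadPool) that logs task failures.
public class LoggingThreadPoolExecutor extends ThreadPoolExecutor {
    public LoggingThreadPoolExecutor(int nThreads) {
        super(nThreads, nThreads, 0L, TimeUnit.MILLISECONDS, new LinkedBlockingQueue<Runnable>());
    }

    @Override
    protected void afterExecute(Runnable r, Throwable t) {
        super.afterExecute(r, t);
        if (t == null && r instanceof Future<?>) {
            try {
                ((Future<?>) r).get(); // already completed by the time afterExecute runs
            } catch (CancellationException ce) {
                t = ce;
            } catch (ExecutionException ee) {
                t = ee.getCause();
            } catch (InterruptedException ie) {
                Thread.currentThread().interrupt();
            }
        }
        if (t != null) {
            System.out.println("Task failed: " + t);
        }
    }
}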