Throw Exception in Function apply for CompletableFutures - java

I have some tasks created as follows (this is just for demonstration; normally these are network calls):
public class RandomTask implements Function<String, String> {
private int number;
private int waitTime;
private boolean throwError;
public RandomTask(int number, int waitTime, boolean throwError) {
this.number = number;
this.waitTime = waitTime;
this.throwError = throwError;
}
@Override
public String apply(String s) {
System.out.println("Job " + number + " started");
try {
Thread.sleep(waitTime);
if (throwError) {
throw new InterruptedException("Something happened");
}
} catch (InterruptedException e) {
System.out.println("Error " + e.getLocalizedMessage());
}
return "RandomTask " + number + " finished";
}
}
Then I have a Chain class where I chain some tasks together per job.
static CompletableFuture<String> start(ExecutorService executorService) {
CompletableFuture<String> future2 = CompletableFuture.supplyAsync(() -> "Foo", executorService)
.thenApplyAsync(new RandomTask(3, 100, false), executorService)
.thenApplyAsync(new RandomTask(4, 100, false), executorService);
return future2;
}
I then start 2 chains as follows:
CompletableFuture<Void> combinedFuture = CompletableFuture.allOf(Chain1.start(fixedThreadPool), Chain2.start(fixedThreadPool));
try {
combinedFuture.get();
} catch (InterruptedException | ExecutionException e) {
e.printStackTrace();
}
That way the two chains start off at the same time.
Now I want to throw an exception in a task and catch it where I call combinedFuture.get() so that I know which task has failed in my chain.
The thing is that I can't adapt the Function to throw a checked exception, because CompletableFuture complains about it. I tried with:
@FunctionalInterface
public interface CheckedFunction<T, R> {
R apply(T t) throws InterruptedException;
}
But this doesn't work. Is this not possible or how can I achieve my goal?

“That way the two chains start off at the same time.” indicates that you have a fundamentally wrong understanding of how CompletableFuture works.
Asynchronous operations are submitted to the executor service right when you create them, or as soon as their prerequisites are available. So in the case of supplyAsync, which has no dependencies, the asynchronous operation starts right within the supplyAsync invocation.
All a construct like CompletableFuture.allOf(job1, job2).get() does is create a new stage that depends on both jobs and wait for its completion, so the net result is simply waiting for both jobs to complete. It does not start the jobs; they are already running. Waiting for a completion has no influence on the process of completing.
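A minimal illustration of this eager start (the class name, pool size and timings here are arbitrary):
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class EagerStartDemo {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(2);
        // the asynchronous operation starts right here, inside supplyAsync(...)
        CompletableFuture<String> job = CompletableFuture.supplyAsync(() -> {
            System.out.println("job started at " + System.nanoTime());
            return "done";
        }, pool);
        Thread.sleep(500); // nobody has called get()/join() yet, the job runs anyway
        System.out.println("waiting starts at " + System.nanoTime());
        System.out.println(job.join()); // join() only waits, it does not start anything
        pool.shutdown();
    }
}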
Chaining a CompletableFuture with a custom function type allowing checked exceptions can be done as
public static <T,R> CompletableFuture<R> thenApplyAsync(
CompletableFuture<T> f, CheckedFunction<? super T, ? extends R> cf,
Executor e) {
CompletableFuture<R> r = new CompletableFuture<>();
f.whenCompleteAsync((v,t) -> {
try {
if(t != null) r.completeExceptionally(t);
else r.complete(cf.apply(v));
} catch(Throwable t2) {
r.completeExceptionally(t2);
}
}, e);
return r;
}
To use this method, instead of chaining calls on the CompletableFuture, you have to nest them. E.g.
static CompletableFuture<String> start(ExecutorService executorService) {
CompletableFuture<String> future2 =
thenApplyAsync(thenApplyAsync(
CompletableFuture.supplyAsync(() -> "Foo", executorService),
new RandomTask(3, 100, false), executorService),
new RandomTask(4, 100, false), executorService);
return future2;
}
given
public class RandomTask implements CheckedFunction<String, String> {
private int number, waitTime;
private boolean throwError;
public RandomTask(int number, int waitTime, boolean throwError) {
this.number = number;
this.waitTime = waitTime;
this.throwError = throwError;
}
@Override
public String apply(String s) throws InterruptedException {
System.out.println("Job " + number + " started");
Thread.sleep(waitTime);
if (throwError) {
throw new InterruptedException("Something happened in "+number);
}
return "RandomTask " + number + " finished";
}
}
You can still create two tasks and wait for both like
CompletableFuture.allOf(Chain1.start(fixedThreadPool), Chain2.start(fixedThreadPool))
.join();
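To then find out which task failed at the point where you wait, which was the original goal, you can keep the references to the individual chain futures and inspect the cause of the CompletionException thrown by join() (java.util.concurrent.CompletionException); a sketch:
CompletableFuture<String> chain1 = Chain1.start(fixedThreadPool);
CompletableFuture<String> chain2 = Chain2.start(fixedThreadPool);
try {
    CompletableFuture.allOf(chain1, chain2).join();
} catch (CompletionException e) {
    // the cause chain leads to the exception thrown by the failing RandomTask,
    // e.g. the InterruptedException with "Something happened in 3"
    System.out.println("A task failed: " + e.getCause());
}
// the individual futures tell you which chain was affected
System.out.println("chain1 failed: " + chain1.isCompletedExceptionally());
System.out.println("chain2 failed: " + chain2.isCompletedExceptionally());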

Related

Java: How can I queue up asynchronous calls to be executed when a certain condition is met?

TL;DR: I want to perform an asynchronous call to a REST API. The standard call would give me a CompletableFuture<Response>; however, because the API limits how many calls it allows in a certain amount of time, I want to be able to queue up calls to 1. execute them in order and 2. execute them only when I am not exceeding the API's limits at that moment, otherwise wait.
Long version:
I am using Retrofit to perform REST calls to an API, and Retrofit returns a CompletableFuture<WhateverResponseClassIDeclare> when I call it. However, due to limitations of the API I am calling, I want tight control over when and in what order my calls go out to it. In detail, too many calls in a certain timeframe would get me IP banned. Similarly, I want to maintain the order of my calls, even if they won't get executed immediately. The goal is to call a wrapper of the API that returns a CompletableFuture just like the original API but performs those in-between steps asynchronously.
I was playing around with BlockingQueues, Functions, Callables, Suppliers and everything in between, but I couldn't get it to work yet.
Below is my currently NON-FUNCTIONAL code, created as a mock-up to test the concept.
import java.util.concurrent.BlockingDeque;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.LinkedBlockingDeque;
import java.util.function.Function;
public class Sandbox2 {
public static void main(String[] args) throws ExecutionException, InterruptedException {
MockApi mockApi = new MockApi();
CompletableFuture<Integer> result1 = mockApi.requestAThing("Req1");
CompletableFuture<Integer> result2 = mockApi.requestAThing("Req2");
CompletableFuture<Integer> result3 = mockApi.requestAThing("Req3");
System.out.println("Result1: " + result1.get());
System.out.println("Result2: " + result2.get());
System.out.println("Result3: " + result3.get());
}
public static class MockApi {
ActualApi actualApi = new ActualApi();
BlockingDeque<Function<String, CompletableFuture<Integer>>> queueBlockingDeque = new LinkedBlockingDeque();
public CompletableFuture<Integer> requestAThing(String req1) {
Function<String, CompletableFuture<Integer>> function = new Function<String, CompletableFuture<Integer>>() {
@Override
public CompletableFuture<Integer> apply(String s) {
return actualApi.requestHandler(s);
}
};
return CompletableFuture
.runAsync(() -> queueBlockingDeque.addLast(function))
.thenRun(() -> waitForTheRightMoment(1000))
.thenCombine(function)
}
private void waitForTheRightMoment(int time) {
try {
Thread.sleep(time);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
}
public static class ActualApi {
public CompletableFuture<Integer> requestHandler(String request) {
return CompletableFuture.supplyAsync(() -> {
try {
Thread.sleep(1000);
} catch (InterruptedException e) {
e.printStackTrace();
}
return Integer.parseInt(request.substring(3));
});
}
}
}
Pre JDK 9 (JDK 1.8)
You can make use of a ScheduledExecutorService, which accepts items to execute asynchronously on a pre-configured thread pool at a fixed rate or after a fixed delay.
You can obtain such a service as follows:
private final ScheduledExecutorService executorService = Executors.newSingleThreadScheduledExecutor();
Once an instance of ScheduledExecutorService is created, you can start submitting items (requests) to be executed as follows:
executorService.schedule(
() -> actualApi.requestHandler(req),
delay,
unit
);
Note that a direct schedule() call won't yield a CompletableFuture<Integer> but a ScheduledFuture<CompletableFuture<Integer>>, on which you would have to block to get the wrapped result.
Instead, you would need to block on the final request result inside the ScheduledExecutorService task and then complete a CompletableFuture that is handed back to the caller:
public <T> CompletableFuture<T> scheduleCompletableFuture(
final CompletableFuture<T> command,
final long delay,
final TimeUnit unit) {
final CompletableFuture<T> completableFuture = new CompletableFuture<>();
this.executorService.schedule(
(() -> {
try {
return completableFuture.complete(command.get());
} catch (Throwable t) {
return completableFuture.completeExceptionally(t);
}
}),
delay,
unit
);
return completableFuture;
}
Here is a revised version of your implementation:
public class Sandbox2 {
public static void main(String[] args) throws ExecutionException, InterruptedException {
MockApi mockApi = new MockApi();
CompletableFuture<Integer> result1 = mockApi.requestAThing("Req1");
CompletableFuture<Integer> result2 = mockApi.requestAThing("Req2");
CompletableFuture<Integer> result3 = mockApi.requestAThing("Req3");
System.out.println("Result1: " + result1.get());
System.out.println("Result2: " + result2.get());
System.out.println("Result3: " + result3.get());
}
public static class MockApi {
private final AtomicLong delay = new AtomicLong(0);
private final ScheduledExecutorService executorService = Executors.newSingleThreadScheduledExecutor();
public CompletableFuture<Integer> requestAThing(String req1) {
return this.scheduleCompletableFuture(new ActualApi().requestHandler(req1), delay.incrementAndGet(), TimeUnit.SECONDS);
}
public <T> CompletableFuture<T> scheduleCompletableFuture(
final CompletableFuture<T> command,
final long delay,
final TimeUnit unit) {
final CompletableFuture<T> completableFuture = new CompletableFuture<>();
this.executorService.schedule(
(() -> {
try {
return completableFuture.complete(command.get());
} catch (Throwable t) {
return completableFuture.completeExceptionally(t);
}
}),
delay,
unit
);
return completableFuture;
}
}
public static class ActualApi {
public CompletableFuture<Integer> requestHandler(String request) {
return CompletableFuture.supplyAsync(() -> {
try {
Thread.sleep(1000);
} catch (InterruptedException e) {
e.printStackTrace();
}
return Integer.parseInt(request.substring(3));
});
}
}
}
JDK 9 and onward
If you are using a JDK 9 version, you may make use of the supported delayed Executor:
CompletableFuture<String> future = new CompletableFuture<>();
future.completeAsync(() -> {
try {
return "result"; // do something and return the computed value
} catch (Throwable e) {
throw new CompletionException(e); // do something on error
}
}, CompletableFuture.delayedExecutor(1, TimeUnit.SECONDS));
Your MockApi#requestAThing would then be cleaner and shorter, and you no longer need a custom ScheduledExecutorService wrapper:
public static class MockApi {
private final AtomicLong delay = new AtomicLong(0);
public CompletableFuture<Integer> requestAThing(String req1) {
CompletableFuture<Void> future = new CompletableFuture<>();
return future.completeAsync(() -> null, CompletableFuture.delayedExecutor(delay.incrementAndGet(), TimeUnit.SECONDS))
.thenCombineAsync(new ActualApi().requestHandler(req1), (nil, result) -> result);
}
// ...
}
You might consider using bucket4j
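bucket4j is only a pointer here; a rough sketch of how it could wrap the mock API from the question, assuming an older bucket4j-style API (Bucket4j.builder(), Bandwidth.simple, tryConsume; newer versions use Bucket.builder()), so verify the exact classes against your bucket4j version:
import io.github.bucket4j.Bandwidth;
import io.github.bucket4j.Bucket;
import io.github.bucket4j.Bucket4j;

import java.time.Duration;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class RateLimitedMockApi {
    private final ActualApi actualApi = new ActualApi();              // ActualApi as defined in the question
    private final ExecutorService executor = Executors.newSingleThreadExecutor(); // keeps submission order
    private final Bucket bucket = Bucket4j.builder()
            .addLimit(Bandwidth.simple(10, Duration.ofMinutes(1)))    // example limit: 10 calls per minute
            .build();

    public CompletableFuture<Integer> requestAThing(String req) {
        return CompletableFuture.supplyAsync(() -> {
            try {
                while (!bucket.tryConsume(1)) {
                    Thread.sleep(100); // wait until a token is available
                }
                return actualApi.requestHandler(req).get();
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        }, executor);
    }
}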
I have found a way to produce my desired behaviour. By limiting my Executor to a single Thread I can queue up calls and they will follow the order I queued them up in.
I will supply the code of my mock classes below for anyone interested:
import java.util.Random;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
public class Sandbox2 {
public static void main(String[] args) throws ExecutionException, InterruptedException {
MockApi mockApi = new MockApi();
CompletableFuture<Integer> result1 = mockApi.requestAThing("Req1");
System.out.println("Request1 queued up");
CompletableFuture<Integer> result2 = mockApi.requestAThing("Req2");
System.out.println("Request2 queued up");
CompletableFuture<Integer> result3 = mockApi.requestAThing("Req3");
System.out.println("Request3 queued up");
//Some other logic happens here
Thread.sleep(10000);
System.out.println("Result1: " + result1.get());
System.out.println("Result2: " + result2.get());
System.out.println("Result3: " + result3.get());
System.exit(0);
}
public static class MockApi {
ActualApi actualApi = new ActualApi();
private ExecutorService executorService = Executors.newSingleThreadExecutor();
public CompletableFuture<Integer> requestAThing(String req1) {
CompletableFuture<Integer> completableFutureCompletableFuture = CompletableFuture.supplyAsync(() -> {
try {
System.out.println("Waiting with " + req1);
waitForTheRightMoment(new Random().nextInt(1000) + 1000);
System.out.println("Done Waiting with " + req1);
return actualApi.requestHandler(req1).get();
} catch (InterruptedException e) {
e.printStackTrace();
} catch (ExecutionException e) {
e.printStackTrace();
}
return null;
}, executorService);
return completableFutureCompletableFuture;
}
private void waitForTheRightMoment(int time) {
try {
Thread.sleep(time);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
}
public static class ActualApi {
public CompletableFuture<Integer> requestHandler(String request) {
return CompletableFuture.supplyAsync(() -> {
try {
Thread.sleep(new Random().nextInt(1000) + 1000);
} catch (InterruptedException e) {
e.printStackTrace();
}
System.out.println("Request Handled " + request);
return Integer.parseInt(request.substring(3));
});
}
}
}

Concurrent polling downstream dependencies and wait until all of them succeed

I am trying to write a simple function that long-polls multiple messages to the downstream dependency without exhausting it, and only exits when all messages have succeeded.
I came up with a way to wrap each message poll in a Callable and use an ExecutorService to submit the list of callables.
public void poll(final List<Long> messageIdList) {
ExecutorService executorService = Executors.newFixedThreadPool(messageIdList.size());
List<MessageStatusCallable> callables = messageIdList.stream()
.map(messageId -> new MessageStatusCallable(messageId)).collect(Collectors.toList());
boolean allSuccess = false;
try {
allSuccess = executorService.invokeAll(callables).stream().allMatch(success -> {
try {
return success.get().equals(Boolean.TRUE);
} catch (InterruptedException e) {
e.printStackTrace();
return false;
} catch (ExecutionException e) {
e.printStackTrace();
return false;
}
});
} catch (InterruptedException e) {
e.printStackTrace();
}
}
private class MessageStatusCallable implements Callable<Boolean> {
private Long messageId;
public MessageStatusCallable(Long messageId) {
this.messageId = messageId;
}
/**
* Computes a result, or throws an exception if unable to do so.
*
* @return computed result
* @throws Exception if unable to compute a result
*/
@Override
public Boolean call() throws Exception {
String messageStatus = downstreamService.getMessageStatus(messageId);
while(messageStatus == null || !messageStatus.equals(STATUS_VALUE_SUCCEEDED)) {
messageStatus = messageLogToControlServer.getMessageStatus(messageId);
Thread.sleep(TimeUnit.MICROSECONDS.toMillis(100));
}
LOG.info("Message: " + messageId + " Succeded");
return true;
}
}
I wonder if there is a better way to achieve this since Thread.sleep is blocking and ugly.
I'm not sure this is the best solution but it occurred to me you could use a CountDownLatch and ScheduledExecutorService.
public void poll(final List<Long> messageIdList) throws InterruptedException {
CountDownLatch latch = new CountDownLatch(messageIdList.size());
ScheduledExecutorService executorService = Executors.newScheduledThreadPool(POOL_SIZE);
try {
for (Long messageId : messageIdList) {
executorService.scheduleWithFixedDelay(
() -> {
String messageStatus = downstreamService.getMessageStatus(messageId);
if (STATUS_VALUE_SUCCEEDED.equals(messageStatus)) {
latch.countDown();
throw new CompletionException("Success - killing the task", null);
}
},
0, 100, TimeUnit.MILLISECONDS);
}
latch.await();
} finally {
executorService.shutdown();
}
}
I probably also wouldn't have the Runnable as a lambda other than for brevity in the answer.

How can I cancel the Future of a multi-threaded busy task?

In my code I have to run a task that makes heavy use of recursion and parallel stream processing in order to go deep into a tree of possible game moves and decide what the best move is. This takes a lot of time, so to prevent the user from waiting too long for the computer to "think" I want to set a timeout of, say, 1000 milliseconds. If the best move is not found within 1000 msec then the computer will play a random move.
My problem is that although I call cancel on the Future (with mayInterruptIfRunning set to true), the task is not interrupted and the busy threads keep running in the background.
I tried to periodically check isInterrupted() on the current thread and then bail out, but this didn't help.
Any ideas?
Below is my code:
public Move bestMove() {
ExecutorService executor = Executors.newSingleThreadExecutor();
Callable<Move> callable = () -> bestEntry(bestMoves()).getKey();
Future<Move> future = executor.submit(callable);
try {
return future.get(1000, TimeUnit.MILLISECONDS);
} catch (InterruptedException e) {
System.exit(0);
} catch (ExecutionException e) {
throw new RuntimeException(e);
} catch (TimeoutException e) {
future.cancel(true);
return randomMove();
}
return null;
}
private Move randomMove() {
Random random = new Random();
List<Move> moves = state.possibleMoves();
return moves.get(random.nextInt(moves.size()));
}
private <K> Map.Entry<K, Double> bestEntry(Map<K, Double> map) {
List<Map.Entry<K, Double>> list = new ArrayList<>(map.entrySet());
Collections.sort(list, (e1, e2) -> (int) (e2.getValue() - e1.getValue()));
return list.get(0);
}
private <K> Map.Entry<K, Double> worstEntry(Map<K, Double> map) {
List<Map.Entry<K, Double>> list = new ArrayList<>(map.entrySet());
Collections.sort(list, (e1, e2) -> (int) (e1.getValue() - e2.getValue()));
return list.get(0);
}
private Map<Move, Double> bestMoves() {
Map<Move, Double> moves = new HashMap<>();
state.possibleMoves().stream().parallel().forEach(move -> {
if (!Thread.currentThread().isInterrupted()) {
Game newState = state.playMove(move);
Double score = newState.isTerminal() ? newState.utility()
: worstEntry(new (newState).worstMoves()).getValue();
moves.put(move, score);
}
});
return moves;
}
private Map<Move, Double> worstMoves() {
Map<Move, Double> moves = new HashMap<>();
state.possibleMoves().stream().parallel().forEach(move -> {
if (!Thread.currentThread().isInterrupted()) {
Game newState = state.playMove(move);
Double score = newState.isTerminal() ? -newState.utility()
: bestEntry(new (newState).bestMoves()).getValue();
moves.put(move, score);
}
});
return moves;
}
ps: I also tried without "parallel()" but again there is still a single thread left running.
Thanks in advance.
Thank you all for your answers. I think I found a simpler solution.
First of all, I think the reason future.cancel(true) didn't work is that it probably only sets the interrupted flag on the thread that started the task (that is, the thread that is associated with the future).
However, because the task itself uses parallel stream processing, it spawns workers on different threads which never get interrupted, and therefore I cannot periodically check the isInterrupted() flag.
The "solution" (or maybe more of a work-around) that I found is to keep my own interrupted flag in my algorithm's objects and manually set it to true when the task is cancelled. Because all threads work on the same instances, they all have access to the interrupted flag and they obey it.
Future.cancel just marks the thread as interrupted, so your code must handle it as follows:
public static void main(String[] args) throws InterruptedException {
final ExecutorService executor = Executors.newSingleThreadExecutor();
final Future<Integer> future = executor.submit(() -> count());
try {
System.out.println(future.get(1, TimeUnit.SECONDS));
} catch (Exception e){
future.cancel(true);
e.printStackTrace();
}finally {
System.out.printf("status=finally, cancelled=%s, done=%s%n", future.isCancelled(), future.isDone());
executor.shutdown();
}
}
static int count() throws InterruptedException {
while (!Thread.interrupted());
throw new InterruptedException();
}
As you can see, count() keeps checking whether the thread is still allowed to keep running. You have to understand that there is actually no guarantee that a running thread can be stopped if it doesn't want to be.
Reference:
Why set the interrupt bit in a Callable
how to suspend thread using thread's id?
UPDATE 2017-11-18 23:22
I wrote a FutureTask extension that has the ability to try to stop the thread even if the code doesn't respect the interrupt signal. Keep in mind that it is unsafe because the Thread.stop method is deprecated; nevertheless it works, and if you really need it you can use it (please read the Thread.stop deprecation notes first; for example, if you are using locks, calling .stop can cause deadlocks).
Test code
public static void main(String[] args) throws InterruptedException {
final ExecutorService executor = newFixedSizeExecutor(1);
final Future<Integer> future = executor.submit(() -> count());
try {
System.out.println(future.get(1, TimeUnit.SECONDS));
} catch (Exception e){
future.cancel(true);
e.printStackTrace();
}
System.out.printf("status=finally, cancelled=%s, done=%s%n", future.isCancelled(), future.isDone());
executor.shutdown();
}
static int count() throws InterruptedException {
while (true);
}
Custom Executor
static ThreadPoolExecutor newFixedSizeExecutor(final int threads) {
return new ThreadPoolExecutor(threads, threads, 0L, TimeUnit.MILLISECONDS, new LinkedBlockingQueue<>()){
protected <T> RunnableFuture<T> newTaskFor(Callable<T> callable) {
return new StoppableFutureTask<>(new FutureTask<>(callable));
}
};
}
static class StoppableFutureTask<T> implements RunnableFuture<T> {
private final FutureTask<T> future;
private Field runnerField;
public StoppableFutureTask(FutureTask<T> future) {
this.future = future;
try {
final Class clazz = future.getClass();
runnerField = clazz.getDeclaredField("runner");
runnerField.setAccessible(true);
} catch (Exception e) {
throw new Error(e);
}
}
@Override
public boolean cancel(boolean mayInterruptIfRunning) {
final boolean cancelled = future.cancel(mayInterruptIfRunning);
if(cancelled){
try {
((Thread) runnerField.get(future)).stop();
} catch (Exception e) {
throw new Error(e);
}
}
return cancelled;
}
@Override
public boolean isCancelled() {
return future.isCancelled();
}
@Override
public boolean isDone() {
return future.isDone();
}
@Override
public T get() throws InterruptedException, ExecutionException {
return future.get();
}
@Override
public T get(long timeout, TimeUnit unit) throws InterruptedException, ExecutionException, TimeoutException {
return future.get(timeout, unit);
}
@Override
public void run() {
future.run();
}
}
output
java.util.concurrent.TimeoutException
at java.util.concurrent.FutureTask.get(FutureTask.java:205)
at com.mageddo.spark.sparkstream_1.Main$StoppableFutureTask.get(Main.java:91)
at com.mageddo.spark.sparkstream_1.Main.main(Main.java:20)
status=finally, cancelled=true, done=true
Process finished with exit code 0

How to execute an Array of CompletableFuture and compose their results

I am investigating Java 8 CompletableFuture and have read (and seen) that I should employ thenCompose instead of thenApply.
I have converted my code to use thenCompose, but I have a feeling I did so in an incorrect manner.
Here is my controlling code...
final CompletableFuture<List<String>> extractor = get(htmlPageSource);
@SuppressWarnings("unchecked")
final CompletableFuture<List<Documentable>>[] completableFutures =
new CompletableFuture[ENDPOINT.EXTRACTABLES.size()];
int index = 0;
for( ENDPOINT endpoint : ENDPOINT.EXTRACTABLES ) {
final CompletableFuture<List<Documentable>> metaData =
extractor.thenComposeAsync(
s -> endpoint.contactEndpoit(s), executorService );
completableFutures[index++] = metaData.exceptionally(x -> failedList(x));
}
CompletableFuture
.allOf( completableFutures )
.thenComposeAsync( dummy -> combineDocuments( completableFutures ))
.thenAccept ( x -> finish( x ))
.exceptionally( x -> failed( x ));
private List<Documentable> failedList(final Throwable x) {
LOGGER.error("failedList", x);
final List<Documentable> metaData = new ArrayList<>();
return metaData;
}
private Void failed(final Throwable x) {
LOGGER.error("failed", x);
return null;
}
Which I believe is acceptable
However, the code that makes me uneasy is this:
WWW_SITE_ONE("https://example.site.one/") {
@Override
public <T extends Documentable> CompletionStage<List<T>> contactEndpoit( final List<String> elements) {
LOGGER.info("WWW_SITE_ONE " + Thread.currentThread().getName());
final List<T> SITE_ONEs = new ArrayList<>();
for (final String element : elements) {
try {
final String json = Jsoup.connect(ENDPOINT.WWW_SITE_ONE.getBaseUrl() + element).ignoreContentType(true).ignoreHttpErrors(true).maxBodySize(0).timeout(60000).execute().body();
if (json.contains("errors")) {
continue;
}
final T SITE_ONE = OBJECT_READER_SITE_ONE.readValue(json);
SITE_ONEs.add(SITE_ONE);
}
catch( final Throwable e ) {
LOGGER.error("WWW_SITE_ONE failed", e);
throw new RuntimeException(e);
}
}
return CompletableFuture.supplyAsync(() -> SITE_ONEs);
}
},
WWW_SITE_TWO("https://example.site.two/") {
@Override
public <T extends Documentable> CompletionStage<List<T>> contactEndpoit(final List<String> elements) {
LOGGER.info("WWW_SITE_TWO " + Thread.currentThread().getName());
final List<T> SITE_TWOs = new ArrayList<>();
for (final String element : elements) {
try {
final String json = Jsoup.connect(ENDPOINT.WWW_SITE_TWO.getBaseUrl() + element).ignoreContentType(true).ignoreHttpErrors(true).maxBodySize(0).timeout(60000).execute().body();
if (json.equals("Resource not found.")) {
continue;
}
final T SITE_TWO = OBJECT_READER_SITE_TWO.readValue(json);
SITE_TWOs.add(SITE_TWO);
}
catch (final Throwable e) {
LOGGER.error("WWW_SITE_TWO failed", e);
throw new RuntimeException(e);
}
}
return CompletableFuture.supplyAsync(() -> SITE_TWOs);
}
},
WWW_SITE_THREE("https://example.site.three/") {
@Override
public <T extends Documentable> CompletionStage<List<T>> contactEndpoit(final List<String> elements) {
LOGGER.info("WWW_SITE_THREE " + Thread.currentThread().getName());
final List<T> SITE_THREEs = new ArrayList<>();
for (final String element : elements) {
try {
final String SITE_THREEJsonString = Jsoup
.connect( ENDPOINT.WWW_SITE_THREE.getBaseUrl() + element)
.ignoreContentType(true)
.ignoreHttpErrors(true)
.maxBodySize(0)
.timeout(60000)
.execute()
.body();
final SITE_THREE SITE_THREE_Json = OBJECT_READER_SITE_THREE.readValue(SITE_THREEJsonString);
final T SITE_THREE = (T) SITE_THREE_Json;
if (SITE_THREE_Json.getHitCount() > 0) {
SITE_THREEs.add(SITE_THREE);
}
}
catch (final Throwable e) {
LOGGER.error("WWW_SITE_THREE failed", e);
throw new RuntimeException(e);
}
}
return CompletableFuture.supplyAsync(() -> SITE_THREEs);
}
};
It's where I am returning CompletableFuture.supplyAsync(() -> SITE_THREEs);
Is this the correct approach?
Or does this start another asynchronous thread to simply return my List<>?
As the name suggests, supplyAsync will perform an asynchronous operation, executing the Supplier’s get() method, hence the body of the lambda expression, in a background thread, regardless of how trivial it is. Since the implementation of supplyAsync has no way to check how trivial the code encapsulated by the Supplier is, it has to work this way.
Instead of CompletableFuture.supplyAsync(() -> SITE_THREEs), you should use CompletableFuture.completedFuture(SITE_THREEs) which returns a future that has already been completed with the result, hence, not requiring additional actions.
If the method only returns completed stages or throws an exception, you may also change it to return the result value instead of a CompletionStage and use thenApply instead of thenCompose, simplifying your code—unless you want to keep the option of introducing asynchronous operations in a future version of that method.
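Applied to the code in the question, the two options would look roughly like the fragments below (names taken from the question; this is a sketch, not a complete refactoring):
// Option 1: keep the CompletionStage signature but skip the extra background task
return CompletableFuture.completedFuture(SITE_THREEs);

// Option 2: return the value itself ...
// public <T extends Documentable> List<T> contactEndpoit(final List<String> elements) { ...; return SITE_THREEs; }
// ... and switch the controlling code from thenComposeAsync to thenApplyAsync:
final CompletableFuture<List<Documentable>> metaData =
    extractor.thenApplyAsync(s -> endpoint.contactEndpoit(s), executorService);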

Handling Exceptions for ThreadPoolExecutor

I have the following code snippet that basically scans through the list of tasks that need to be executed; each task is then given to the executor for execution.
The JobExecutor in turn creates another executor (for doing DB stuff: reading and writing data to a queue) and completes the task.
JobExecutor returns a Future<Boolean> for the tasks submitted. When one of the tasks fails, I want to gracefully interrupt all the threads and shut down the executor by catching all the exceptions. What changes do I need to make?
public class DataMovingClass {
private static final AtomicInteger uniqueId = new AtomicInteger(0);
private static final ThreadLocal<Integer> uniqueNumber = new IDGenerator();
ThreadPoolExecutor threadPoolExecutor = null ;
private List<Source> sources = new ArrayList<Source>();
private static class IDGenerator extends ThreadLocal<Integer> {
@Override
public Integer get() {
return uniqueId.incrementAndGet();
}
}
public void init(){
// load sources list
}
public boolean execute() {
boolean success = true;
threadPoolExecutor = new ThreadPoolExecutor(10,10,
10, TimeUnit.SECONDS, new ArrayBlockingQueue<Runnable>(1024),
new ThreadFactory() {
public Thread newThread(Runnable r) {
Thread t = new Thread(r);
t.setName("DataMigration-" + uniqueNumber.get());
return t;
}// End method
}, new ThreadPoolExecutor.CallerRunsPolicy());
List<Future<Boolean>> result = new ArrayList<Future<Boolean>>();
for (Source source : sources) {
result.add(threadPoolExecutor.submit(new JobExecutor(source)));
}
for (Future<Boolean> jobDone : result) {
try {
if (!jobDone.get(100000, TimeUnit.SECONDS) && success) {
// in case of successful DbWriterClass, we don't need to change
// it.
success = false;
}
} catch (Exception ex) {
// handle exceptions
}
}
return success;
}
public class JobExecutor implements Callable<Boolean> {
private ThreadPoolExecutor threadPoolExecutor ;
Source jobSource ;
public JobExecutor(Source source) {
this.jobSource = source;
threadPoolExecutor = new ThreadPoolExecutor(10,10,10, TimeUnit.SECONDS, new ArrayBlockingQueue<Runnable>(1024),
new ThreadFactory() {
public Thread newThread(Runnable r) {
Thread t = new Thread(r);
t.setName("Job Executor-" + uniqueNumber.get());
return t;
}// End method
}, new ThreadPoolExecutor.CallerRunsPolicy());
}
public Boolean call() throws Exception {
boolean status = true ;
System.out.println("Starting Job = " + jobSource.getName());
try {
// do the specified task ;
}catch (InterruptedException intrEx) {
logger.warn("InterruptedException", intrEx);
status = false ;
} catch(Exception e) {
logger.fatal("Exception occurred while executing task "+jobSource.getName(),e);
status = false ;
}
System.out.println("Ending Job = " + jobSource.getName());
return status ;
}
}
}
When you submit a task to the executor, it returns you a Future (a FutureTask instance).
Future.get() will re-throw any exception thrown by the task, wrapped in an ExecutionException.
So when you iterate through the List<Future> and call get() on each, catch ExecutionException and invoke an orderly shutdown.
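A sketch of that waiting loop, based on the execute() method in the question (the shutdownNow() call and the merged catch blocks are illustrative additions; note that an ExecutionException only reaches this code if JobExecutor.call() lets exceptions propagate instead of converting them into a false return value):
boolean success = true;
try {
    for (Future<Boolean> jobDone : result) {
        if (!jobDone.get(100000, TimeUnit.SECONDS)) {
            success = false;
        }
    }
} catch (ExecutionException ex) {
    // one task threw: log the real cause, stop waiting and interrupt everything still running
    logger.error("Task failed, shutting down", ex.getCause());
    success = false;
    threadPoolExecutor.shutdownNow();
} catch (InterruptedException | TimeoutException ex) {
    logger.error("Wait aborted", ex);
    success = false;
    threadPoolExecutor.shutdownNow();
}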
Since you are submitting tasks to ThreadPoolExecutor, the exceptions are getting swallowed by FutureTask.
Have a look at this code
Inside FutureTask$Sync
void innerRun() {
if (!compareAndSetState(READY, RUNNING))
return;
runner = Thread.currentThread();
if (getState() == RUNNING) { // recheck after setting thread
V result;
try {
result = callable.call();
} catch (Throwable ex) {
setException(ex);
return;
}
set(result);
} else {
releaseShared(0); // cancel
}
}
protected void setException(Throwable t) {
sync.innerSetException(t);
}
From the above code, it is clear that the setException method catches Throwable. For this reason, FutureTask swallows all exceptions if you use the submit() method on ThreadPoolExecutor.
As per the Java documentation, you can override the afterExecute() method in ThreadPoolExecutor:
protected void afterExecute(Runnable r,
Throwable t)
Sample code as per documentation:
class ExtendedExecutor extends ThreadPoolExecutor {
// ...
protected void afterExecute(Runnable r, Throwable t) {
super.afterExecute(r, t);
if (t == null && r instanceof Future<?>) {
try {
Object result = ((Future<?>) r).get();
} catch (CancellationException ce) {
t = ce;
} catch (ExecutionException ee) {
t = ee.getCause();
} catch (InterruptedException ie) {
Thread.currentThread().interrupt(); // ignore/reset
}
}
if (t != null)
System.out.println(t);
}
}
You can catch exceptions in three ways:
Future.get(), as suggested in the accepted answer
wrap the entire run() or call() method body in a try/catch block
override ThreadPoolExecutor's afterExecute method, as shown above
To gracefully interrupt other threads, have a look at the SE questions below:
How to stop next thread from running in a ScheduledThreadPoolExecutor
How to forcefully shutdown java ExecutorService
Subclass ThreadPoolExecutor and override its protected afterExecute (Runnable r, Throwable t) method.
If you're creating a thread pool via the java.util.concurrent.Executors convenience class (which you're not), take a look at its source to see how it invokes ThreadPoolExecutor.
