Own ExecutorService used to create CompletableFuture does not terminate - java

I'm trying to use my own ExecutorService to create a set of CompletableFutures to chain several process steps. These steps might throw exceptions.
When they do, it seems to me the threads in the ExecutorService are not released, although I'm trying to handle this case.
class Scratch {
    public static void main(String[] args) throws InterruptedException {
        ExecutorService executor = Executors.newFixedThreadPool(10);
        AtomicInteger counter = new AtomicInteger();
        Supplier<?> throwingException = () -> { throw new RuntimeException("throw " + counter.incrementAndGet()); };
        Function<String, CompletableFuture<?>> process =
                url -> CompletableFuture.supplyAsync(throwingException, executor)
                        .exceptionally(Scratch::log);
        var collect = IntStream.range(1, 10).mapToObj(i -> "url" + i)
                .map(process)
                .toArray(CompletableFuture[]::new);
        final CompletableFuture<Void> together = CompletableFuture.allOf(collect);
        System.out.println("joining");
        together.exceptionally(Scratch::log).join();
        System.out.println("finished");
        if (executor.awaitTermination(5, TimeUnit.SECONDS)) {
            System.out.println("exiting cleanly");
        } else {
            System.out.println("not terminated");
        }
        executor.submit(() -> System.out.println("still executing"));
    }

    static <T> T log(Throwable t) {
        System.out.println(t.getMessage());
        return null;
    }
}
Output is
java.lang.RuntimeException: throw 1
joining
java.lang.RuntimeException: throw 2
java.lang.RuntimeException: throw 3
java.lang.RuntimeException: throw 4
java.lang.RuntimeException: throw 5
java.lang.RuntimeException: throw 6
java.lang.RuntimeException: throw 7
java.lang.RuntimeException: throw 8
java.lang.RuntimeException: throw 9
finished
not terminated
The process started by this also doesn't terminate (which is how I noticed).
It seems to me this should mean there are no threads left in the ExecutorService at this point, but that doesn't seem to be the case; if we lower the thread pool capacity, it will still run all submitted tasks, and if we submit another task after the failed termination (e.g. executor.submit(() -> System.out.println("still executing"));), it will get executed.
If we don't pass our own ExecutorService to CompletableFuture::supplyAsync, the process terminates as expected.
I also tried other versions of handling the exceptional state (like using together.whenComplete()), but that has the same result.
Why is this happening, and how can I make sure the ExecutorService terminates correctly?
EDIT: I realized that it's not the exception that's causing the problem; this occurs with any task provided to CompletableFuture with your own executor service, which makes total sense given Eugene's reply. I'm changing the question title.

There are two things going on here. The first is that when you execute without an explicit Executor, your actions run in the common ForkJoinPool. That pool uses daemon threads, which do not keep the VM alive, so when your main is over, the VM exits.
The second point is in the documentation of awaitTermination, actually:
Blocks until all tasks have completed execution after a shutdown request, or the timeout occurs, or the current thread is interrupted, whichever happens first.
Since you never call shutdown(), awaitTermination simply times out, and because your pool creates non-daemon threads, the process does not exit.
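So the fix is to request a shutdown before awaiting termination; a minimal sketch of the end of your main (keeping the 5-second timeout from the question):
executor.shutdown(); // orderly shutdown: no new tasks accepted, already-submitted tasks still run
if (executor.awaitTermination(5, TimeUnit.SECONDS)) {
    System.out.println("exiting cleanly");
} else {
    System.out.println("not terminated");
}
Note that after shutdown() the later executor.submit(...) call would be rejected with a RejectedExecutionException.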

Related

Java CompletableFuture using allOf : if one thread throws exception, how to immediately stop execution of all threads?

// assume: serviceCall1 throws an exception after 1s, serviceCall2 runs 10s without exception
CompletableFuture<String> serviceCall1Future = serviceCall1.execute();
CompletableFuture<String> serviceCall2Future = serviceCall2.execute();
CompletableFuture<Void> allOffFuture = CompletableFuture.allOf(serviceCall1Future, serviceCall2Future);
// does not work, will be called after thread 2 has finished
allOffFuture.exceptionally(ex -> { allOffFuture.cancel(true); return null; });
try {
    // waiting for threads to finish
    allOffFuture.join();
} catch (CompletionException e) {
    // does not work, here we come after thread 2 has finished
    allOffFuture.cancel(true);
}
If one thread throws an exception, in my case it doesn't make any sense for the other thread(s) to keep on running, so I want them both (all, in case of more than 2 threads) to stop. How can I achieve that?
I guess something like this should work:
CompletableFuture<String> serviceCall1Future = serviceCall1.execute();
CompletableFuture<String> serviceCall2Future = serviceCall2.execute();
CompletableFuture<String> foo1 = serviceCall1Future.whenComplete((result,exception) -> {if(exception != null) serviceCall2Future.cancel(true);});
CompletableFuture<String> foo2 = serviceCall2Future.whenComplete((result,exception) -> {if(exception != null) serviceCall1Future.cancel(true);});
CompletableFuture<Void> allOffFuture = CompletableFuture.allOf(foo1, foo2);
// ... rest of your code
This cancels the other future when one of them completes with an exception.
If you are using an ExecutorService with CompletableFuture, you can use its shutdown methods, shutdown() or shutdownNow().
If you want to shut down the ExecutorService immediately, you can call the shutdownNow() method. This attempts to stop all executing tasks right away and skips all submitted but not-yet-processed tasks. There are no guarantees about the executing tasks: perhaps they stop, perhaps they execute until the end. It is a best-effort attempt. Here is an example of calling ExecutorService shutdownNow():
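(A minimal, self-contained sketch; the class name and the submitted task are just for illustration.)
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

class ShutdownNowDemo {
    public static void main(String[] args) {
        ExecutorService executor = Executors.newFixedThreadPool(2);
        executor.submit(() -> System.out.println("task running"));
        // best effort: interrupts running tasks and returns the
        // tasks that were submitted but never started
        List<Runnable> neverStarted = executor.shutdownNow();
        System.out.println(neverStarted.size() + " task(s) never started");
    }
}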
See -> https://jenkov.com/tutorials/java-util-concurrent/executorservice.html#executorservice-shutdown

CompletableFuture along with reading using FileReader, the program doesn't exit

Background
Building a data pipeline where each message received is to be processed asynchronously.
Trying to simulate the behavior by
Reading message from file
Processing with CompletableFuture
Code
BufferedReader reader = null;
ExecutorService service = Executors.newFixedThreadPool(4);
try {
    String filepath = str[0];
    FileReaderAsync fileReaderAsync = new FileReaderAsync();
    reader = new BufferedReader(new FileReader(filepath));
    Random r = new Random();
    String line;
    while ((line = reader.readLine()) != null) {
        Integer val = Integer.valueOf(line.trim());
        int randomInt = r.nextInt(5);
        Thread.sleep(randomInt * 100);
        CompletableFuture.supplyAsync(() -> {
            System.out.println("Square : " + val);
            return val * val;
        }, service)
        .thenApplyAsync(value -> {
            System.out.println(":::::::Double : " + value);
            return 2 * value;
        }, service)
        .thenAccept(value -> {
            System.out.println("Answer : " + value);
        });
    }
} catch (Exception e) {
    e.printStackTrace();
} finally {
    try {
        reader.close();
    } catch (Exception e) {
        throw new RuntimeException(e.getMessage());
    }
}
For simplicity I'm just pasting the main method code; assume variables are declared and in scope.
Issues
Code
The program works fine but does not exit. I tried commenting out the async logic and just reading the file; that works fine and ends too.
Design
In a streaming pipeline, will this async model work for each incoming message if each message is passed to a CompletableFuture for processing?
Or will it block until the current message has been processed?
Is it required to introduce another queue and consume from it, instead of consuming incoming messages as they flow in?
Edit 1
Added
public void shutdown() {
    service.shutdown();
}
and
reader.close();
fileReaderAsync.shutdown();
which did the trick.
Problem
You're using a thread pool created by:
ExecutorService service = Executors.newFixedThreadPool(4);
which by default is configured to use non-daemon threads. As documented by java.lang.Thread:
When a Java Virtual Machine starts up, there is usually a single non-daemon thread (which typically calls the method named main of some designated class). The Java Virtual Machine continues to execute threads until either of the following occurs:
The exit method of class Runtime has been called and the security manager has permitted the exit operation to take place.
All threads that are not daemon threads have died, either by returning from the call to the run method or by throwing an exception that propagates beyond the run method.
In other words, any non-daemon thread that is still alive will also keep the JVM alive.
Solution
There are at least two solutions to your problem.
Shutdown the Thread Pool
You can shutdown the thread pool when you're finished with it.
service.shutdown(); // Calls ExecutorService#shutdown()
The #shutdown() method starts a graceful shutdown. It prevents any new tasks from being submitted but allows any already-submitted tasks to complete. Once all tasks are complete the pool will terminate (i.e. all threads will be allowed to die). If you want to wait for all tasks to complete before continuing then you can call #awaitTermination(long,TimeUnit) after calling #shutdown() / #shutdownNow().
If you want to try and immediately shutdown the pool then call #shutdownNow(). Any currently-executing tasks will be cancelled and any submitted-but-not-yet-started tasks are simply not executed (and are in fact returned to you in a list). Note whether a task responds to cancellation depends on how that task was implemented.
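As a sketch, the usual way to combine these calls (the timeout value is just illustrative; this mirrors the shutdown pattern described in the ExecutorService javadoc):
service.shutdown();                  // stop accepting new tasks
try {
    // give already-submitted tasks a chance to finish
    if (!service.awaitTermination(30, TimeUnit.SECONDS)) {
        service.shutdownNow();       // cancel whatever is still running
    }
} catch (InterruptedException e) {
    service.shutdownNow();
    Thread.currentThread().interrupt();
}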
Use Daemon Threads
A daemon thread will not keep the JVM alive. You can configure the thread pool to use daemon threads via a ThreadFactory.
ExecutorService service = Executors.newFixedThreadPool(4, r -> {
    Thread t = new Thread(r); // may want to name the threads
    t.setDaemon(true);
    return t;
});
Note you should still shutdown the thread pool when finished with it, regardless.
You have 4 threads in the pool, but the Thread.sleep() blocks the main thread. Your program reads a line, blocks for up to a few hundred milliseconds, and then fires async code that does not require any async-ness at all and in fact creates a huge overhead.
Do not use Thread.sleep() in an async program.
But I tried to get the idea of your code and I can offer this:
public int calcWork(final int x) {
    return x * x;
}

public void iter_async_rec(final BufferedReader reader) {
    final String line;
    try {
        line = reader.readLine();
    } catch (IOException e) {
        throw new UncheckedIOException(e);
    }
    if (line != null) {
        int i = Integer.parseInt(line.trim()); // checks required
        CompletableFuture.supplyAsync(() -> calcWork(i))
                .thenAcceptAsync(result -> System.out.println(result))
                .thenRunAsync(() -> iter_async_rec(reader));
    }
}
In addition: most of the time it is best to just use the standard executors. The given sample will not improve speed; quite the contrary.
Maybe also have a look at the reactive idea: reactivejava

Behaviour of ForkJoinPool in CompletableFuture.supplyAsync()

I'm comparing the behaviour of CompletableFuture.supplyAsync() in two cases: when I set a custom ExecutorService, and when I let my Supplier be executed by the default executor (used when none is specified), which is ForkJoinPool.commonPool().
Let's see the difference:
public class MainApplication {
    public static void main(final String[] args) throws ExecutionException, InterruptedException {
        Supplier<String> action1 = () -> {
            try {
                Thread.sleep(3000);
            } finally {
                return "Done";
            }
        };
        Function<String, String> action2 = (input) -> {
            try {
                Thread.sleep(1000);
            } catch (InterruptedException e) {
                e.printStackTrace();
            } finally {
                return input + "!!";
            }
        };
        final ExecutorService executorService = Executors.newFixedThreadPool(4);
        CompletableFuture.supplyAsync(action1, executorService)
                .thenApply(action2)
                .thenAccept(res -> System.out.println(res));
        System.out.println("This is the end of the execution");
    }
}
In this case I'm passing executorService to my supplyAsync() and it prints:
This is the end of the execution
Done!!
So "Done" gets printed after the end of the main execution.
BUT if I use instead:
CompletableFuture.supplyAsync(action1)
so that I don't pass my custom executorService and the CompletableFuture class uses ForkJoinPool.commonPool() under the hood, then "Done" is not printed at all:
This is the end of the execution
Process finished with exit code 0
Why?
In both cases when you do
CompletableFuture.supplyAsync(action1, executorService)
        .thenApply(action2)
        .thenAccept(res -> System.out.println(res));
you don't wait for the task to complete. But then your program is going to exit, and there is a difference in how the common fork-join pool:
ForkJoinPool.commonPool()
and a regular executor service:
final ExecutorService executorService = Executors.newFixedThreadPool(4);
react when the program is about to exit.
This is what the doc says about the fork-join common pool; you should pay attention to this:
However this pool and any ongoing processing are automatically
terminated upon program System.exit(int). Any program that relies on
asynchronous task processing to complete before program termination
should invoke commonPool().awaitQuiescence, before exit.
And here is a link to the ExecutorService docs; you may pay attention to this:
The shutdown() method will allow previously submitted tasks to execute
before terminating
I think that may be the difference you are asking about.
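For the common-pool case, a minimal sketch of the awaitQuiescence suggestion from the quoted docs (reusing action1 and action2 from the question; the timeout value is just illustrative):
CompletableFuture.supplyAsync(action1)
        .thenApply(action2)
        .thenAccept(res -> System.out.println(res));
// wait for the common pool's daemon threads to finish their work before main returns
ForkJoinPool.commonPool().awaitQuiescence(5, TimeUnit.SECONDS);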
ForkJoinPool uses daemon threads, which do not prevent the JVM from exiting. On the other hand, the threads in an ExecutorService created by Executors are non-daemon threads, hence they keep the JVM from exiting until you explicitly shut down the thread pool.
Also notice that in your example you need to shut down the pool at the end in order to terminate the JVM.
executorService.shutdown();
So, one solution would be to keep the main thread waiting for a few seconds until your computation has completed, like so:
Thread.sleep(4000);
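A sturdier alternative to sleeping, as a minimal sketch reusing action1, action2 and executorService from the question, is to keep a reference to the last stage, block on it, and then shut the pool down:
CompletableFuture<Void> done = CompletableFuture.supplyAsync(action1, executorService)
        .thenApply(action2)
        .thenAccept(res -> System.out.println(res));
done.join();                // block until the whole chain has completed
executorService.shutdown(); // let the pool's non-daemon threads die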

ExecutorService stopped processing one thread out of two

I have a list of 40000 records that need to be processed in a for loop. Since I have a two-processor system, I have created a fixed thread pool like this:
int threads = Runtime.getRuntime().availableProcessors();
ExecutorService service = Executors.newFixedThreadPool(threads);
I divided my ArrayList into two sublists. For each of these sublists, I created a Callable that performs the same function (it iterates over the sublist and does some processing).
I submitted both of these Callables using executorServiceObject.submit(callable) and added the returned Future objects to my list of Future objects.
Here is my question:
I have written a System.out.println("Processed Item " + item.id) // consider item the reference variable for the current iteration
Everything was fine for some time and I could see two threads working simultaneously. But after some time, one of the threads stopped processing and only one thread is running. (I know this because I can see on the console that the ids given to thread 2 are not being printed anymore.)
Does anyone know how this happened? I mean, why did the ExecutorService stop running the 2nd thread?
Thanks for your help in advance.
Adding sample code as I should have done before:
public List<Output> processInputs(List<Input> inputs)
        throws InterruptedException, ExecutionException {
    int threads = Runtime.getRuntime().availableProcessors();
    ExecutorService service = Executors.newFixedThreadPool(threads);
    List<Future<Output>> futures = new ArrayList<Future<Output>>();
    for (final Input input : inputs) {
        Callable<Output> callable = new Callable<Output>() {
            public Output call() throws Exception {
                Output output = new Output();
                // process your input here and compute the output
                return output;
            }
        };
        futures.add(service.submit(callable));
    }
    service.shutdown();
    List<Output> outputs = new ArrayList<Output>();
    for (Future<Output> future : futures) {
        outputs.add(future.get());
    }
    return outputs;
}
Everything was fine for some time and I could see two threads working simultaneously. But after some time, one of the threads stopped processing and only one thread is running. (I know this because I can see on the console that the ids given to thread 2 are not being printed anymore.)
I suspect that your processing thread has thrown an exception. The Future.get() method can throw ExecutionException "if the computation threw an exception".
// the following might throw an exception if the background job threw
outputs.add(future.get());
If there was an NPE, an IOException, etc. thrown by your "process your input" code, then that exception is thrown by the Callable and stored in the Future, so it can be thrown by the get() method, wrapped in an ExecutionException. This is useful so the thread that is waiting can get and handle (log, etc.) the exception thrown by the background thread.
Instead of just having your processInputs(...) method throw the exception to the caller, where it might be getting lost, I'd do something like the following in your for loop:
try {
    outputs.add(future.get());
} catch (InterruptedException ie) {
    // always a good pattern if the thread that is waiting was interrupted
    Thread.currentThread().interrupt();
    return outputs; // bail out with whatever we have so far
} catch (ExecutionException ee) {
    // somehow log the error
    logger.error("Computation failed to process", ee);
    // now continue and get the next future in the list
}
If you don't catch and properly handle that ExecutionException then the processing exception will also kill the thread that calls processInputs(...).

What happens to remaining threads of invokeAny ExecutorService

When invokeAny successfully returns, what happens to the remaining threads? Do they get killed automatically? If not, how can I make sure each thread is stopped and returned back to the thread pool?
ExecutorService executorService = Executors.newFixedThreadPool(10);
executorService.invokeAny(callables);
Just elaborating more on the topic.
What happens to remaining threads
If the threads are executing methods which throw InterruptedException, then they receive the exception. Otherwise their interrupted flag is set to true.
Do they get killed automatically?
Not really.
- If they are running in an infinite loop, then you need to make sure you do not swallow the InterruptedException, and exit the thread in the catch block.
- If you are not expecting the exception, then you need to keep checking the flag using Thread.interrupted() or Thread.currentThread().isInterrupted() and exit when it becomes true.
- If you are not running an infinite loop, then the threads will complete their tasks and stop, but their results will not be considered.
In the following code, both task1 and task2 keep running even though the service is stopped and the main method exits:
public class Test {
    public static void main(String[] args) throws Exception {
        Callable<String> task1 = () -> {
            for (;;) {
                try {
                    Thread.sleep(9000);
                    System.out.println(Thread.currentThread().getName() +
                            " is still running..");
                } catch (InterruptedException e) {
                    System.out.println(Thread.currentThread().getName()
                            + " has swallowed the exception.");
                    // it is a good practice to break the loop here or return
                }
            }
        };
        Callable<String> task2 = () -> {
            for (;;) {
                if (Thread.interrupted()) {
                    // it is a good practice to break the loop here or return
                    System.out.println(Thread.currentThread().getName() +
                            " is interrupted but it is still running..");
                }
            }
        };
        List<Callable<String>> tasks = List.of(task1, task2, () -> "small task done!");
        ExecutorService service = Executors.newFixedThreadPool(4);
        String result = service.invokeAny(tasks);
        System.out.println(result);
        service.shutdownNow();
        System.out.println("main thread done");
    }
}
Output:
small task done!
pool-1-thread-2 is interrupted but it is still running..
pool-1-thread-1 has swallowed the exception.
pool-1-thread-1 has swallowed the exception.
main thread done
pool-1-thread-1 is still running..
pool-1-thread-1 is still running..
Upon return from invokeAny, the remaining tasks that have not yet completed are cancelled (i.e. their threads are interrupted).
Here is the documentation of it:
Upon normal or exceptional return, tasks that have not completed are cancelled.
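For contrast, a cooperative version of task2 that honours the interrupt (as the comments in the example above suggest) could look roughly like this:
Callable<String> task2 = () -> {
    while (!Thread.currentThread().isInterrupted()) {
        // do a chunk of work, then re-check the interrupted flag
    }
    System.out.println(Thread.currentThread().getName() + " is interrupted, exiting.");
    return "cancelled";
};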
