CompletableFuture order of execution - Java

I am trying to understand the non-blocking callback nature of CompletableFuture in Java.
CompletableFuture.supplyAsync(() -> {
    try {
        //Thread.sleep(20000);
        System.out.println("supplyAsync Thread name " + Thread.currentThread().getName());
        return "str";
    } catch (Exception e) {
        return "";
    }
}).thenApply(str -> {
    System.out.println("thenApply Thread name " + Thread.currentThread().getName());
    return str;
}).thenApply(str1 -> {
    System.out.println("thenApply Thread name " + Thread.currentThread().getName());
    return str1;
}).thenAccept(str3 -> {
    System.out.println("thenAccept Thread name " + Thread.currentThread().getName());
});
System.out.println("Thread name " + Thread.currentThread().getName());
With the above code I always see the output below:
supplyAsync Thread name ForkJoinPool.commonPool-worker-1
thenApply Thread name main
thenApply Thread name main
thenAccept Thread name main
Thread name main
This order seems to suggest that the main thread waits for all of the future's stages to finish. Isn't that indicative of blocking behaviour?
When I added the Thread.sleep call to the supplyAsync block, the output was:
Thread name main
None of the print statements from the future's stages seem to have been executed. My understanding of non-blocking is that it should be possible to print:
Thread name main
supplyAsync Thread name ForkJoinPool.commonPool-worker-1
thenApply Thread name main
thenApply Thread name main
thenAccept Thread name main
If similar code were executed in JavaScript, the above console output would be possible.
Is the above possible in Java?

No, it is not blocking. By default, CompletableFuture runs its tasks on ForkJoinPool.commonPool(), whose worker threads are daemon threads. When the main thread finishes, the JVM exits without waiting for daemon threads, which is why in the second case you only see Thread name main.
However, you can pass your own executor to the CompletableFuture:
CompletableFuture.supplyAsync(() -> {
    try {
        Thread.sleep(2000);
        System.out.println("supplyAsync Thread name " + Thread.currentThread().getName());
        System.out.println("Is daemon Thread: " + Thread.currentThread().isDaemon());
        return "str";
    } catch (Exception e) {
        return "";
    }
}, Executors.newFixedThreadPool(4)).thenApply(str -> { // non-daemon pool threads keep the JVM alive
    System.out.println("thenApply Thread name " + Thread.currentThread().getName());
    return str;
}).thenApply(str1 -> {
    System.out.println("thenApply Thread name " + Thread.currentThread().getName());
    return str1;
}).thenAccept(str3 -> {
    System.out.println("thenAccept Thread name " + Thread.currentThread().getName());
});
System.out.println("Thread name " + Thread.currentThread().getName());
If you run the above code you can get output like:
Thread name main
supplyAsync Thread name pool-1-thread-1
Is daemon Thread: false
thenApply Thread name pool-1-thread-1
thenApply Thread name pool-1-thread-1
thenAccept Thread name pool-1-thread-1
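
As a side note, a minimal sketch of my own (not part of the question or the answer above) shows that the desired "main prints first" behaviour is also possible with the default common pool, as long as something keeps the JVM alive until the daemon workers are done, for example by joining the last stage at the very end. Once the slow supplyAsync stage completes, the dependent non-async stages run on the worker thread that completed it, not on main.
import java.util.concurrent.CompletableFuture;

public class CommonPoolNonBlockingDemo {
    public static void main(String[] args) {
        CompletableFuture<Void> pipeline = CompletableFuture.supplyAsync(() -> {
            sleep(2000); // slow work on a daemon common-pool thread
            System.out.println("supplyAsync Thread name " + Thread.currentThread().getName());
            return "str";
        }).thenApply(str -> {
            System.out.println("thenApply Thread name " + Thread.currentThread().getName());
            return str;
        }).thenAccept(str ->
                System.out.println("thenAccept Thread name " + Thread.currentThread().getName()));

        // main is not blocked by the pipeline and prints immediately...
        System.out.println("Thread name " + Thread.currentThread().getName());

        // ...and only blocks here, which also keeps the JVM (and its daemon workers) alive.
        pipeline.join();
    }

    private static void sleep(long millis) {
        try {
            Thread.sleep(millis);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}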

Related

How to share data from Main Thread with Thread on the Common Pool

My idea is to share data from the main thread with the child threads (from the thread pool).
I found InheritableThreadLocal.
Below is my example:
public static void main(String[] args) {
    InheritableThreadLocal<String> inheritableThreadLocal = new InheritableThreadLocal<>();
    System.out.println("Thread name: " + Thread.currentThread().getName()
            + " - Set data into inheritableThreadLocal: " + "ABCXYZ");
    inheritableThreadLocal.set("ABCXYZ");
    System.out.println();

    List<Integer> listIndex = List.of(1, 2);
    listIndex.stream()
            .parallel()
            .map(index -> {
                System.out.println("Thread name: " + Thread.currentThread().getName()
                        + " - Get data from inheritableThreadLocal : " + inheritableThreadLocal.get());
                return inheritableThreadLocal.get();
            })
            .collect(Collectors.toList());
}
When testing, I see strange behavior running this example on different Java versions.
Java 11.0.12:
Thread name: main - Set data into inheritableThreadLocal: ABCXYZ
Thread name: main - Get data from inheritableThreadLocal : ABCXYZ
Thread name: ForkJoinPool.commonPool-worker-3 - Get data from inheritableThreadLocal : ABCXYZ
Java 17.0.1:
Thread name: main - Set data into inheritableThreadLocal: ABCXYZ
Thread name: ForkJoinPool.commonPool-worker-1 - Get data from inheritableThreadLocal : null
Thread name: main - Get data from inheritableThreadLocal : ABCXYZ
I do not know why the output differs between Java versions like this (on Java 17 the thread from the pool cannot find any data in the inheritableThreadLocal).
Could anyone give me the answer?
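
If the underlying goal is just to make main-thread data visible to the pool workers, one workaround (a minimal sketch of my own, sidestepping the version difference rather than explaining it) is to capture the value in an effectively final local, so nothing depends on when or by which thread the common-pool workers were created:
import java.util.List;
import java.util.stream.Collectors;

public class ShareDataByCapture {
    public static void main(String[] args) {
        // Captured by the lambda, so every worker sees it regardless of
        // which thread created the worker and when.
        final String shared = "ABCXYZ";

        List<String> result = List.of(1, 2).parallelStream()
                .map(index -> {
                    System.out.println("Thread name: " + Thread.currentThread().getName()
                            + " - Got data: " + shared);
                    return shared;
                })
                .collect(Collectors.toList());

        System.out.println(result);
    }
}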

How do I time out a method after a fixed time in Java using a concurrent executor and Future?

Using a Java concurrent executor, the future's cancel method is not stopping the current task.
I have followed this solution for timing out and stopping processing of the current task, but it doesn't stop the processing.
I am trying this with a cron job. Every 30 seconds my cron job executes, and I am setting a 10-second timeout. The debugger reaches future.cancel, but it does not stop the current task.
Thank you.
@Scheduled(cron = "*/30 * * * * *")
public boolean cronTest()
{
    System.out.println("Inside cron - start ");
    DateFormat dateFormat = new SimpleDateFormat("yyyy/MM/dd HH:mm:ss");
    Date date = new Date();
    System.out.println(dateFormat.format(date));
    System.out.println("Inside cron - end ");
    ExecutorService executor = Executors.newCachedThreadPool();
    Callable<Object> task = new Callable<Object>() {
        public Object call() {
            int i=1;
            while(i<100)
            {
                System.out.println("i: "+ i++);
                try {
                    TimeUnit.SECONDS.sleep(1);
                }
                catch(Exception e)
                {
                }
            }
            return null;
        }
    };
    Future<Object> future = executor.submit(task);
    try {
        Object result = future.get(10, TimeUnit.SECONDS);
    } catch (Exception e) {
    } finally {
        future.cancel(true);
        return true;
    }
}
The expected result is that the cron job runs every 30 seconds, times out after 10 seconds, and then waits roughly 20 seconds for the next cron run to start. It should not continue the older loop, because of the 10-second timeout.
Current result is:
Inside cron - start
2019/07/25 11:09:00
Inside cron - end
i: 1
i: 2
i: 3
i: 4 ... up to i: 31
Inside cron - start
2019/07/25 11:09:30
Inside cron - end
i: 1
i: 32
i: 2
i: 3
i: 33
...
Expected result is:
Inside cron - start
2019/07/25 11:09:00
Inside cron - end
i: 1
i: 2
i: 3
i: 4 ... up to i: 10
Inside cron - start
2019/07/25 11:09:30
Inside cron - end
i: 1
i: 2
i: 3 ... up to i: 10
The first problem is in this part of the code:
catch(Exception e)
{
}
When you invoke future.cancel(true), your thread is interrupted via Thread.interrupt().
This means that when the thread is sleeping, it is woken up and throws InterruptedException, which is caught by that catch block and ignored. To fix this problem you have to handle the exception:
catch(InterruptedException e) {
    break; //breaking from the loop
}
catch(Exception e)
{
}
The second problem: Thread.interrupt() may be invoked while the thread is not sleeping. In this case InterruptedException is not thrown. Instead, the interrupted flag of the thread is raised. What you have to do is to check for this flag from time to time, and if it's raised, handle interruption. The basic code for it would look like:
try {
    if (Thread.currentThread().isInterrupted()) {
        break;
    }
    TimeUnit.SECONDS.sleep(1);
}
...
// rest of the code
UPDATE:
Here's the full code of Callable:
Callable<Object> task = new Callable<Object>() {
    public Object call() {
        int i=1;
        while(i<100)
        {
            System.out.println("i: "+ i++);
            try {
                if (Thread.currentThread().isInterrupted()) {
                    break; //breaking from the while loop
                }
                TimeUnit.SECONDS.sleep(1);
            } catch(InterruptedException e) {
                break; //breaking from the while loop
            } catch(Exception e)
            {
            }
        }
        return null;
    }
};
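One optional refinement (my addition, not part of the original answer): re-assert the interrupt flag before breaking, so that code further up the call stack can still see that the task was interrupted.
} catch(InterruptedException e) {
    Thread.currentThread().interrupt(); // restore the flag for the caller
    break; //breaking from the while loop
}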

Does the call to join in the following CompletableFuture example block the process?

I am trying to understand CompletableFuture and the chaining of calls that return completed futures, and I have created the example below, which roughly simulates two calls to a database.
The first method is supposed to return a CompletableFuture with a list of user IDs, and then I need to call another method, providing a user ID, to get the user (a String in this case).
To summarise:
1. fetch the IDs
2. fetch a list of the users corresponding to those IDs.
I created simple methods that simulate the responses with sleeping threads.
Please check the code below:
public class PipelineOfTasksExample {
    private Map<Long, String> db = new HashMap<>();

    PipelineOfTasksExample() {
        db.put(1L, "user1");
        db.put(2L, "user2");
        db.put(3L, "user3");
        db.put(4L, "user4");
    }

    private CompletableFuture<List<Long>> returnUserIdsFromDb() {
        try {
            Thread.sleep(500);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        System.out.println("building the list of Ids" + " - thread: " + Thread.currentThread().getName());
        return CompletableFuture.supplyAsync(() -> Arrays.asList(1L, 2L, 3L, 4L));
    }

    private CompletableFuture<String> fetchById(Long id) {
        CompletableFuture<String> cfId = CompletableFuture.supplyAsync(() -> db.get(id));
        try {
            Thread.sleep(500);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        System.out.println("fetching id: " + id + " -> " + db.get(id) + " thread: " + Thread.currentThread().getName());
        return cfId;
    }

    public static void main(String[] args) {
        PipelineOfTasksExample example = new PipelineOfTasksExample();
        CompletableFuture<List<String>> result = example.returnUserIdsFromDb()
                .thenCompose(listOfIds ->
                        CompletableFuture.supplyAsync(
                                () -> listOfIds.parallelStream()
                                        .map(id -> example.fetchById(id).join())
                                        .collect(Collectors.toList())
                        )
                );
        System.out.println(result.join());
    }
}
My question is: does the join call (example.fetchById(id).join()) ruin the non-blocking nature of the process? If so, how can I solve this problem?
Thank you in advance
Your example is a bit odd, as you are slowing down the main thread in returnUserIdsFromDb() before any asynchronous operation even starts; likewise, fetchById slows down the caller rather than the asynchronous operation, which defeats the entire purpose of asynchronous operations.
Further, instead of .thenCompose(listOfIds -> CompletableFuture.supplyAsync(() -> …)) you can simply use .thenApplyAsync(listOfIds -> …).
So a better example might be
public class PipelineOfTasksExample {
    private final Map<Long, String> db = LongStream.rangeClosed(1, 4).boxed()
            .collect(Collectors.toMap(id -> id, id -> "user" + id));

    PipelineOfTasksExample() {}

    private static <T> T slowDown(String op, T result) {
        LockSupport.parkNanos(TimeUnit.MILLISECONDS.toNanos(500));
        System.out.println(op + " -> " + result + " thread: "
                + Thread.currentThread().getName() + ", "
                + POOL.getPoolSize() + " threads");
        return result;
    }

    private CompletableFuture<List<Long>> returnUserIdsFromDb() {
        System.out.println("trigger building the list of Ids - thread: "
                + Thread.currentThread().getName());
        return CompletableFuture.supplyAsync(
                () -> slowDown("building the list of Ids", Arrays.asList(1L, 2L, 3L, 4L)),
                POOL);
    }

    private CompletableFuture<String> fetchById(Long id) {
        System.out.println("trigger fetching id: " + id + " thread: "
                + Thread.currentThread().getName());
        return CompletableFuture.supplyAsync(
                () -> slowDown("fetching id: " + id, db.get(id)), POOL);
    }

    static ForkJoinPool POOL = new ForkJoinPool(2);

    public static void main(String[] args) {
        PipelineOfTasksExample example = new PipelineOfTasksExample();
        CompletableFuture<List<String>> result = example.returnUserIdsFromDb()
                .thenApplyAsync(listOfIds ->
                        listOfIds.parallelStream()
                                .map(id -> example.fetchById(id).join())
                                .collect(Collectors.toList()),
                        POOL
                );
        System.out.println(result.join());
    }
}
which prints something like
trigger building the list of Ids - thread: main
building the list of Ids -> [1, 2, 3, 4] thread: ForkJoinPool-1-worker-1, 1 threads
trigger fetching id: 2 thread: ForkJoinPool-1-worker-0
trigger fetching id: 3 thread: ForkJoinPool-1-worker-1
trigger fetching id: 4 thread: ForkJoinPool-1-worker-2
fetching id: 4 -> user4 thread: ForkJoinPool-1-worker-3, 4 threads
fetching id: 2 -> user2 thread: ForkJoinPool-1-worker-3, 4 threads
fetching id: 3 -> user3 thread: ForkJoinPool-1-worker-2, 4 threads
trigger fetching id: 1 thread: ForkJoinPool-1-worker-3
fetching id: 1 -> user1 thread: ForkJoinPool-1-worker-2, 4 threads
[user1, user2, user3, user4]
which might be a surprising number of threads at first glance.
The answer is that join() may block the thread, but if this happens inside a worker thread of a Fork/Join pool, this situation will be detected and a new compensation thread will be started, to ensure the configured target parallelism.
As a special case, when the default Fork/Join pool is used, the implementation may pick up new pending tasks within the join() method, to ensure progress within the same thread.
So the code will always make progress, and there's nothing wrong with calling join() occasionally if the alternatives are much more complicated, but there is some danger of excessive resource consumption if it is used too liberally. After all, the reason to use thread pools is to limit the number of threads.
The alternative is to use chained dependent operations where possible.
public class PipelineOfTasksExample {
    private final Map<Long, String> db = LongStream.rangeClosed(1, 4).boxed()
            .collect(Collectors.toMap(id -> id, id -> "user" + id));

    PipelineOfTasksExample() {}

    private static <T> T slowDown(String op, T result) {
        LockSupport.parkNanos(TimeUnit.MILLISECONDS.toNanos(500));
        System.out.println(op + " -> " + result + " thread: "
                + Thread.currentThread().getName() + ", "
                + POOL.getPoolSize() + " threads");
        return result;
    }

    private CompletableFuture<List<Long>> returnUserIdsFromDb() {
        System.out.println("trigger building the list of Ids - thread: "
                + Thread.currentThread().getName());
        return CompletableFuture.supplyAsync(
                () -> slowDown("building the list of Ids", Arrays.asList(1L, 2L, 3L, 4L)),
                POOL);
    }

    private CompletableFuture<String> fetchById(Long id) {
        System.out.println("trigger fetching id: " + id + " thread: "
                + Thread.currentThread().getName());
        return CompletableFuture.supplyAsync(
                () -> slowDown("fetching id: " + id, db.get(id)), POOL);
    }

    static ForkJoinPool POOL = new ForkJoinPool(2);

    public static void main(String[] args) {
        PipelineOfTasksExample example = new PipelineOfTasksExample();
        CompletableFuture<List<String>> result = example.returnUserIdsFromDb()
                .thenComposeAsync(listOfIds -> {
                    List<CompletableFuture<String>> jobs = listOfIds.parallelStream()
                            .map(id -> example.fetchById(id))
                            .collect(Collectors.toList());
                    return CompletableFuture.allOf(jobs.toArray(new CompletableFuture<?>[0]))
                            .thenApply(_void -> jobs.stream()
                                    .map(CompletableFuture::join).collect(Collectors.toList()));
                },
                POOL
                );
        System.out.println(result.join());
        System.out.println(ForkJoinPool.commonPool().getPoolSize());
    }
}
The difference is that first all asynchronous jobs are submitted, and then a dependent action calling join on them is scheduled to be executed only when all jobs have completed, so these join invocations will never block. Only the final join call at the end of the main method may block the main thread.
So this prints something like
trigger building the list of Ids - thread: main
building the list of Ids -> [1, 2, 3, 4] thread: ForkJoinPool-1-worker-1, 1 threads
trigger fetching id: 3 thread: ForkJoinPool-1-worker-1
trigger fetching id: 2 thread: ForkJoinPool-1-worker-0
trigger fetching id: 4 thread: ForkJoinPool-1-worker-1
trigger fetching id: 1 thread: ForkJoinPool-1-worker-0
fetching id: 4 -> user4 thread: ForkJoinPool-1-worker-1, 2 threads
fetching id: 3 -> user3 thread: ForkJoinPool-1-worker-0, 2 threads
fetching id: 2 -> user2 thread: ForkJoinPool-1-worker-1, 2 threads
fetching id: 1 -> user1 thread: ForkJoinPool-1-worker-0, 2 threads
[user1, user2, user3, user4]
showing that no compensation threads had to be created, so the number of threads matches the configured target parallelism.
Note that if the actual work is done in a background thread rather than within the fetchById method itself, you no longer need a parallel stream, as there is no blocking join() call. For such scenarios, just using stream() will usually result in higher performance.
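As an illustration of that last point (a sketch of the described variation, not code from the answer), only the stream creation inside thenComposeAsync changes; fetchById already submits its work asynchronously, so a sequential stream is enough:
List<CompletableFuture<String>> jobs = listOfIds.stream()   // instead of parallelStream()
        .map(id -> example.fetchById(id))
        .collect(Collectors.toList());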

RetryExecutor: How to wait for all tasks to finish?

I am using RetryExecutor from : https://github.com/nurkiewicz/async-retry
Below is my code:
ScheduledExecutorService executorService = Executors.newScheduledThreadPool(10);
RetryExecutor retryExecutor = new AsyncRetryExecutor(executorService)
        .retryOn(IOException.class)
        .withExponentialBackoff(500, 2)
        .withMaxDelay(5_000) // 5 seconds
        .withUniformJitter()
        .withMaxRetries(5);
I have submitted a few tasks to the retryExecutor:
retryExecutor.getWithRetry(ctx -> {
        if (ctx.getRetryCount() == 0)
            System.out.println("Starting download from : " + url);
        else
            System.out.println("Retrying (" + ctx.getRetryCount() + ") download from : " + url);
        return downloadFile(url);
    }
).whenComplete((result, error) -> {
    if (result != null && result) {
        System.out.println("Successfully downloaded!");
    } else {
        System.out.println("Download failed. Error : " + error);
    }
});
Now, how do I wait for all submitted tasks to finish?
I want to wait until all retries (if any) are finished.
I don't think it will be as simple as executorService.shutdown().
getWithRetry returns a CompletableFuture, so you can simply wait on the future it gives you back:
CompletableFuture<DownloadResult> downloadPromise =
        retryExecutor.getWithRetry(...)
                     .whenComplete(...);

DownloadResult downloadResult = downloadPromise.get();
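If you kick off several downloads, one option (a sketch assuming a hypothetical urls list plus the retryExecutor and downloadFile from the question, with downloadFile returning a Boolean) is to collect the returned futures and wait on all of them at once:
List<CompletableFuture<Boolean>> downloads = urls.stream()
        .map(url -> retryExecutor.getWithRetry(ctx -> downloadFile(url)))
        .collect(Collectors.toList());

// Completes only when every download, including all of its retries, has finished;
// join() throws a CompletionException if any of them ultimately failed.
CompletableFuture.allOf(downloads.toArray(new CompletableFuture<?>[0])).join();

executorService.shutdown();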

How can I observe my fast source on a thread-pool queue?

I need help making an observable start on the main thread and then move on to a pool of threads, allowing the source to continue emitting new items (regardless of whether earlier items are still being processed in the thread pool).
This is my example:
public static void main(String[] args) {
    Observable<Integer> source = Observable.range(1, 10);
    source.map(i -> sleep(i, 10))
            .doOnNext(i -> System.out.println("Emitting " + i + " on thread " + Thread.currentThread().getName()))
            .observeOn(Schedulers.computation())
            .map(i -> sleep(i * 10, 300))
            .subscribe(i -> System.out.println("Received " + i + " on thread " + Thread.currentThread().getName()));
    sleep(-1, 30000);
}

private static int sleep(int i, int time) {
    try {
        Thread.sleep(time);
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
    return i;
}
which always prints:
Emitting 1 on thread main
Emitting 2 on thread main
Emitting 3 on thread main
Received 10 on thread RxComputationScheduler-1
Emitting 4 on thread main
Emitting 5 on thread main
Emitting 6 on thread main
Received 20 on thread RxComputationScheduler-1
Emitting 7 on thread main
Emitting 8 on thread main
Emitting 9 on thread main
Received 30 on thread RxComputationScheduler-1
Emitting 10 on thread main
Received 40 on thread RxComputationScheduler-1
Received 50 on thread RxComputationScheduler-1
Received 60 on thread RxComputationScheduler-1
Received 70 on thread RxComputationScheduler-1
Received 80 on thread RxComputationScheduler-1
Received 90 on thread RxComputationScheduler-1
Received 100 on thread RxComputationScheduler-1
Although items are emitted on the main thread as intended, I want them to move on to the computation/IO thread pool afterwards.
Should be something like this:
I don't think you were slowing down the source emissions enough; they were emitted so quickly that all items were produced before observeOn() had a chance to schedule any of them downstream.
Try sleeping for 500 ms instead of 10 ms. You will then see the interleaving you would expect.
public class JavaLauncher {
    public static void main(String[] args) {
        Observable<Integer> source = Observable.range(1, 10);
        source.map(i -> sleep(i, 500))
                .doOnNext(i -> System.out.println("Emitting " + i + " on thread " + Thread.currentThread().getName()))
                .observeOn(Schedulers.computation())
                .map(i -> sleep(i * 10, 250))
                .subscribe(i -> System.out.println("Received " + i + " on thread " + Thread.currentThread().getName()));
        sleep(-1, 30000);
    }

    private static int sleep(int i, int time) {
        try {
            Thread.sleep(time);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        return i;
    }
}
OUTPUT
Emitting 1 on thread main
Emitting 2 on thread main
Emitting 3 on thread main
Received 10 on thread RxComputationThreadPool-3
Emitting 4 on thread main
Received 20 on thread RxComputationThreadPool-3
Emitting 5 on thread main
Emitting 6 on thread main
Received 30 on thread RxComputationThreadPool-3
Emitting 7 on thread main
Emitting 8 on thread main
Received 40 on thread RxComputationThreadPool-3
Emitting 9 on thread main
Emitting 10 on thread main
Received 50 on thread RxComputationThreadPool-3
Received 60 on thread RxComputationThreadPool-3
Received 70 on thread RxComputationThreadPool-3
Received 80 on thread RxComputationThreadPool-3
Received 90 on thread RxComputationThreadPool-3
Received 100 on thread RxComputationThreadPool-3
UPDATE - Parallelized version: observeOn hands all downstream work to a single computation worker, so to process items concurrently across the pool, each item can be wrapped in its own inner Observable that is subscribed on the scheduler via flatMap:
public class JavaLauncher {
    public static void main(String[] args) {
        Observable<Integer> source = Observable.range(1, 10);
        source.map(i -> sleep(i, 250))
                .doOnNext(i -> System.out.println("Emitting " + i + " on thread " + Thread.currentThread().getName()))
                .flatMap(i ->
                        Observable.just(i)
                                .subscribeOn(Schedulers.computation())
                                .map(i2 -> sleep(i2 * 10, 500))
                )
                .subscribe(i -> System.out.println("Received " + i + " on thread " + Thread.currentThread().getName()));
        sleep(-1, 30000);
    }

    private static int sleep(int i, int time) {
        try {
            Thread.sleep(time);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        return i;
    }
}
OUTPUT
Emitting 1 on thread main
Emitting 2 on thread main
Emitting 3 on thread main
Received 10 on thread RxComputationThreadPool-3
Emitting 4 on thread main
Received 20 on thread RxComputationThreadPool-4
Received 30 on thread RxComputationThreadPool-1
Emitting 5 on thread main
Received 40 on thread RxComputationThreadPool-2
Emitting 6 on thread main
Received 50 on thread RxComputationThreadPool-3
Emitting 7 on thread main
Received 60 on thread RxComputationThreadPool-4
Emitting 8 on thread main
Received 70 on thread RxComputationThreadPool-1
Emitting 9 on thread main
Received 80 on thread RxComputationThreadPool-2
Emitting 10 on thread main
Received 90 on thread RxComputationThreadPool-3
Received 100 on thread RxComputationThreadPool-4
