How to wait until an RxJava2 Flowable finishes all its tasks? - java

I am trying to learn the basics of the RxJava2 library and right now I am stuck at the following point:
I have generated myFlowable via Flowable.generate(...) and now I need to wait until all of its tasks have finished executing before I can proceed further.
This is the code to showcase the problem:
myFlowable.parallel()
        .runOn(Schedulers.computation())
        .map(val -> myCollection.add(val))
        .sequential()
        .subscribe(val -> {
            System.out.println("Thread from subscribe: " + Thread.currentThread().getName());
            System.out.println("Value from subscribe: " + val.toString());
        });
System.out.println("Before sleep - Number of objects: " + myCollection.size());
try {
    Thread.sleep(1000);
    System.out.println("After sleep - Number of objects: " + myCollection.size());
} catch (InterruptedException e) {
    e.printStackTrace();
}
I run through all my tasks and add the results to a collection. If I check the collection size right after the myFlowable block, it is different from what I get after a small Thread.sleep(). Is there any way to tell that all the tasks have finished executing so that we can proceed further? Any help or guidance will be greatly appreciated.

Since RxJava is asynchronous, the Java code below the observable keeps running while the observable itself runs on a different thread. That is why, if you want to be notified when the Flowable has finished emitting data, you should do it inside the RxJava stream; for that there is the .doOnComplete operator.
Here is an example of how to detect when the stream has finished:
Flowable.range(0, 100).parallel()
        .runOn(Schedulers.computation())
        .map(integer -> {
            return integer;
        })
        .sequential()
        .doOnComplete(() -> {
            System.out.println("finished");
        })
        .subscribe(integer -> System.out.println(integer));

You could use an AtomicBoolean, initialize it to false and set it to true using doFinally().
doFinally() is called after the Flowable signals onError or onComplete, or after it gets disposed by the downstream.
Then sleep the main thread until the completed flag is true.
Using your example:
AtomicBoolean completed = new AtomicBoolean(false);
myFlowable.parallel()
        .runOn(Schedulers.computation())
        .map(val -> myCollection.add(val))
        .sequential()
        .doFinally(() -> completed.set(true))
        .subscribe(val -> {
            ...
        });
...
try {
    while (!completed.get()) {
        Thread.sleep(1000);
        ...
    }
    ...
} catch (InterruptedException e) {
    e.printStackTrace();
}

Use Flowable::blockingSubscribe(): it "runs the current Flowable to a terminal event, ignoring any values and rethrowing any exception".
http://reactivex.io/RxJava/3.x/javadoc/io/reactivex/rxjava3/core/Flowable.html#blockingSubscribe--
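Applied to the code from the question, a minimal sketch could look like the following (RxJava 2 imports assumed; Flowable.range and a concurrent queue are stand-ins for the asker's generated myFlowable and myCollection):
import io.reactivex.Flowable;
import io.reactivex.schedulers.Schedulers;

import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

public class BlockingSubscribeExample {
    public static void main(String[] args) {
        Queue<Integer> myCollection = new ConcurrentLinkedQueue<>();
        Flowable<Integer> myFlowable = Flowable.range(0, 100); // stand-in for the generated Flowable

        myFlowable.parallel()
                .runOn(Schedulers.computation())
                .map(val -> { myCollection.add(val); return val; })
                .sequential()
                .blockingSubscribe(); // blocks the calling thread until onComplete or onError

        // By the time we get here, every element has been processed.
        System.out.println("Number of objects: " + myCollection.size());
    }
}
Because blockingSubscribe() only returns after the terminal event, the size check no longer needs a Thread.sleep().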

Related

Java Concurrency: Need to make 2 webservice calls simultaneously - is this correct?

I want to make web calls to 2 different services simultaneously. At the end, I zip the 2 Response objects into one stream. I'm using a Callable, but I'm not sure I'm going about this in the correct way. It seems as though I'm still going to be blocked by the first get() call to the Future, right? Can someone tell me if I'm on the right track? This is what I have so far:
// submit the 2 calls to the thread pool
ExecutorService executorService = Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors());
Future<Mono<Response<ProcessInstance>>> processFuture =
        executorService.submit(() -> getProcessInstances(processDefinitionKey, encryptedIacToken));
Future<Mono<Response<Task>>> taskFuture =
        executorService.submit(() -> getTaskResponses(processDefinitionKey, encryptedIacToken, 100, 0));
// get the result of the 2 calls
Optional<Tuple2<Response<ProcessInstance>, Response<Task>>> tuple;
try {
    Mono<Response<ProcessInstance>> processInstances = processFuture.get();
    Mono<Response<Task>> userTasks = taskFuture.get();
    tuple = processInstances.zipWith(userTasks).blockOptional();
} catch (InterruptedException e) {
    log.error("Exception while processing response", e);
    // Restore interrupted state...
    Thread.currentThread().interrupt();
    return emptyProcessResponseList;
} catch (ExecutionException e) {
    log.error("Exception while processing response", e);
    return emptyProcessResponseList;
}
Given: You need to wait until both tasks are complete.
If processFuture ends first, you'll immediately fall through and wait until taskFuture ends. If taskFuture ends first you'll block until processFuture ends, but the taskFuture.get() call will return instantly since that task is done. In either case the result is the same.
You could use CompletableFutures instead and then CompletableFuture.allOf() but for something this simple what you have works fine. See also Waiting on a list of Future
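If you did want the CompletableFuture.allOf() route, a minimal sketch could look like this (the supplyAsync suppliers are placeholders standing in for the asker's getProcessInstances/getTaskResponses calls, which aren't shown here):
import java.util.concurrent.CompletableFuture;

public class AllOfExample {
    public static void main(String[] args) {
        // Placeholder async calls standing in for the two web service requests.
        CompletableFuture<String> processFuture =
                CompletableFuture.supplyAsync(() -> "process instances");
        CompletableFuture<String> taskFuture =
                CompletableFuture.supplyAsync(() -> "tasks");

        // allOf() completes once both futures have completed,
        // so the join() calls below return immediately afterwards.
        CompletableFuture.allOf(processFuture, taskFuture).join();

        System.out.println(processFuture.join() + " / " + taskFuture.join());
    }
}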
Your code will block until the processFuture is finished, then it will block until the taskFuture is finished.
The callables will be processed in parallel, so here you are saving time (assuming thread pool size >= 2).

CompletableFuture does not complete on exception

I'm kinda new to using the CompletableFuture API and I have a question regarding the usage of allOf. From what I read, the combined future should be in a completed state, and the allOf logic should be executed, when all associated futures complete, including those that completed exceptionally. But here's my sample code, for which the allOf block never gets executed:
public static void test() {
    CompletableFuture<String> r1 = CompletableFuture.supplyAsync(() -> {
        try {
            Thread.sleep(1000);
            throw new RuntimeException("blahh !!!");
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    });
    CompletableFuture<String> r2 = CompletableFuture.supplyAsync(() -> "55");
    CompletableFuture<String> r3 = CompletableFuture.supplyAsync(() -> "56");
    CompletableFuture.allOf(r1, r2, r3).thenRun(() -> { System.out.println(Thread.currentThread() + " --- End."); });
    Stream.of(r1, r2, r3).forEach(System.out::println);
    try {
        System.out.println(Thread.currentThread() + " --- SLEEPING !!!");
        Thread.sleep(3000);
        System.out.println(Thread.currentThread() + " --- DONE !!!");
    } catch (Exception e) {
        //e.printStackTrace();
    }
    Stream.of(r1, r2, r3).forEach(System.out::println);
}
The problem is not that your allOf CompletableFuture never completes. It does.
What causes your code not to run is thenRun's expectation:
Returns a new CompletionStage that, when this stage completes normally, executes the given action. See the CompletionStage documentation for rules covering exceptional completion.
You probably already know that when one of allOf's futures completes exceptionally, the resulting future also completes exceptionally:
Returns a new CompletableFuture that is completed when all of the given CompletableFutures complete. If any of the given CompletableFutures complete exceptionally, then the returned CompletableFuture also does so, with a CompletionException holding this exception as its cause.
In short, don't use thenRun if you want to run an action on your allOf future irrespective of how it completes. As an alternative, you can use whenComplete:
CompletableFuture.allOf(r1, r2, r3)
.whenComplete((a, ex) -> System.out.println(Thread.currentThread() + " --- End."));
You can also use a combination of thenRun + exceptionally, one of which will run:
CompletableFuture<Void> all = CompletableFuture.allOf(r1, r2, r3);
all.thenRun(() -> {
    System.out.println(Thread.currentThread() + " --- End.");
});
all.exceptionally(ex -> {
    System.out.println(ex);
    return null;
});

Understanding RxJava observable when underlying data source has new values

I am trying to experiment with RxJava observable and observer code. My objective is to check how things work when the underlying source receives new data values. My code is:
List<Integer> numbers = new ArrayList<>();
Runnable r = new Runnable() {
    @Override
    public void run() {
        int i = 100;
        while (i < 110) {
            numbers.add(i);
            try {
                Thread.sleep(10);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
            i++;
        }
    }
};
numbers.add(0);
numbers.add(1);
numbers.add(2);
Observable.fromIterable(numbers)
        .observeOn(Schedulers.io())
        .subscribe(i -> System.out.println("Received " + i + " on " + Thread.currentThread().getName()),
                e -> e.printStackTrace());
try {
    Thread.sleep(1000);
} catch (InterruptedException e) {
    e.printStackTrace();
}
Thread t = new Thread(r);
t.start();
try {
    Thread.sleep(10000);
} catch (InterruptedException e) {
    e.printStackTrace();
}
So I have a list of numbers. I then have a runnable which adds new numbers to this list with a time gap between the additions. I don't start the thread yet. I add 0, 1, 2 to the list and then create an observable from it, scheduling the observer on a thread from the pool, and finally subscribing to the observable. As the subscription happens, the observable emits the values 0, 1, 2 and the observer is invoked (the lambda passed to subscribe is executed). Then I introduce a delay of 1 sec on the main thread, spawn a new thread using the runnable I created earlier, and add a final delay so that the application doesn't exit immediately.
What I expect is that as new numbers are added to the list, the observer is invoked, printing the message. But that doesn't happen. Surely I have got something wrong in my understanding. Do I also need to put the observable on a scheduler?
Observable.fromIterable() is a "one time" load of the values for the observable each time a subscription is built. What happens "after" the subscription is built has no effect anymore. If you use the subscribe(onNext, onError, onComplete) overload with an onComplete argument, you will see that the subscription has been fully consumed once the three initial values have been printed.
You can use a Subject (something like a PublishSubject) whose onNext() method lets you push "new values" while the subscriptions that were built earlier are still active (and not completed). That way you can build the subscriptions first and keep calling onNext() with new values on the subject until you are done and call onComplete().
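For illustration, a minimal sketch of that approach, assuming RxJava 2 (the PublishSubject replaces the list, and the later onNext() calls stand in for the values the runnable would have added):
import io.reactivex.schedulers.Schedulers;
import io.reactivex.subjects.PublishSubject;

public class SubjectExample {
    public static void main(String[] args) throws InterruptedException {
        PublishSubject<Integer> numbers = PublishSubject.create();

        // Subscribe first; the subscription stays active until onComplete() is called.
        numbers.observeOn(Schedulers.io())
                .subscribe(i -> System.out.println("Received " + i + " on " + Thread.currentThread().getName()));

        numbers.onNext(0);
        numbers.onNext(1);
        numbers.onNext(2);

        // Values pushed later are still delivered to the existing subscriber.
        Thread.sleep(100);
        numbers.onNext(100);
        numbers.onComplete();

        Thread.sleep(100); // keep the JVM alive long enough for the io() thread to print
    }
}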

RxJava: Merging Observable with Completable does not work

I have an Observable that at some point has to write things to a cache, and we would like to wait until the writes are done before finishing the whole operation on the observable (for reporting purposes).
For test purposes, the cache-write Completable looks like this:
Completable.create(emitter ->
        new Thread(() -> {
            try {
                Thread.sleep(2000);
                doSomething();
                emitter.onComplete();
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }).start());
Since I have several cache writes, I try to merge them in a container class:
public class CacheInsertionResultsTracker {
    private Completable cacheInsertResultsCompletable;

    public CacheInsertionResultsTracker() {
        this.cacheInsertResultsCompletable = Completable.complete();
    }

    public synchronized void add(Completable cacheInsertResult) {
        this.cacheInsertResultsCompletable = this.cacheInsertResultsCompletable.mergeWith(cacheInsertResult);
    }

    public Completable getCompletable() {
        return this.cacheInsertResultsCompletable;
    }
}
And I try to merge it with the Observable in the following way:
CacheInsertionResultsTracker tracker = new ...;
observable
.doOnNext(next->tracker.add(next.writeToCache(...)))
.mergeWith(Completable.defer(()->tracker.getCompletable()))
.subscribe(
// on next
this::logNextElement
// on error
this::finishWithError
// on complete
this::finishWithSuccess
);
How can I make sure that by the time finishWithSuccess is called, doSomething() has completed?
The problem is that the Completable reference is updated every time I add a new one, and that happens after mergeWith has already run...
The solution that seems to work for our use case is to use concatWith + defer:
observable
        .doOnNext(next -> tracker.add(next.writeToCache(...)))
        .concatWith(Completable.defer(() -> tracker.getCompletable()))
        .subscribe(
                // on next
                this::logNextElement,
                // on error
                this::finishWithError,
                // on complete
                this::finishWithSuccess
        );
concatWith ensures that the subscription to the Completable happens only after the Observable is done, and defer postpones obtaining the final Completable until that subscription happens (so all the Completables have already been added to the tracker).
Based on the comments, you could replace the Completable cache with a ReplaySubject<Completable>, apply a timeout to detect inactivity, and have the observable sequence end.
ReplaySubject<Completable> cache = ReplaySubject.create();

cache.onNext(completable);

observable.mergeWith(
        cache.flatMapCompletable(v -> v)
                .timeout(10, TimeUnit.MILLISECONDS, Completable.complete())
)
Edit:
Your updated example implies you want to run Completables in response to items in the main observable, isolated to that sequence, and wait for all of them to complete. This is a typical use case for flatMap:
observable.flatMap(
        next -> next.writeToCache(...).andThen(Observable.just(next))
)
        .subscribe(
                this::logNextElement,
                // on error
                this::finishWithError,
                // on complete
                this::finishWithSuccess
        );

How to ask CompletableFuture to use non-daemon threads?

I have written the following code:
System.out.println("Main thread:" + Thread.currentThread().getId());
CompletableFuture<Void> future = CompletableFuture.runAsync(() -> {
    try {
        System.out.println("Before sleep thread:" + Thread.currentThread().getId() + " isDaemon:" + Thread.currentThread().isDaemon());
        Thread.sleep(100);
        System.out.println("After sleep");
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
});
future.whenComplete((r, e) -> System.out.println("whenCompleted thread:" + Thread.currentThread().getId()));
and this one prints:
Main thread:1
Before sleep thread:11 isDaemon:true
and finishes.
How can I change this behaviour?
P.S. I don't see anything related in the runAsync javadoc.
The javadoc for runAsync() says:
Returns a new CompletableFuture that is asynchronously completed by a task running in the ForkJoinPool.commonPool() after it runs the given action.
There is another version of runAsync() where you can pass an Executor, such as your own ExecutorService.
Thus: when the default commonPool() doesn't do what you want, create your own ExecutorService instead.
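A minimal sketch of that approach (the single-thread pool is just an illustrative choice; Executors.defaultThreadFactory() creates non-daemon threads, so the JVM stays alive until the pool is shut down):
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class NonDaemonExample {
    public static void main(String[] args) {
        // This pool uses non-daemon threads, unlike ForkJoinPool.commonPool().
        ExecutorService executor = Executors.newSingleThreadExecutor();

        CompletableFuture<Void> future = CompletableFuture.runAsync(() -> {
            try {
                System.out.println("Before sleep, isDaemon: " + Thread.currentThread().isDaemon());
                Thread.sleep(100);
                System.out.println("After sleep");
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }, executor);

        // Shut the pool down once the future completes so the JVM can exit.
        future.whenComplete((r, e) -> executor.shutdown());
    }
}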
Add this line:
ForkJoinPool.commonPool().awaitTermination(5, TimeUnit.SECONDS);
to the main method after running your future. It will block until all tasks in the pool have completed. From the javadoc:
Blocks until all tasks have completed execution after a shutdown request, or the timeout occurs, or the current thread is interrupted, whichever happens first.
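Applied to the code from the question, that line would go at the end of main, roughly like this (note that for the common pool, awaitTermination is documented to behave like awaitQuiescence, so it returns once the submitted tasks are done or the timeout elapses):
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.TimeUnit;

public class AwaitTerminationExample {
    public static void main(String[] args) throws InterruptedException {
        CompletableFuture<Void> future = CompletableFuture.runAsync(() -> {
            try {
                System.out.println("Before sleep, isDaemon: " + Thread.currentThread().isDaemon());
                Thread.sleep(100);
                System.out.println("After sleep");
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        future.whenComplete((r, e) -> System.out.println("whenComplete thread: " + Thread.currentThread().getId()));

        // Waits until the common pool has no pending tasks or the timeout elapses.
        ForkJoinPool.commonPool().awaitTermination(5, TimeUnit.SECONDS);
    }
}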
