Project Reactor: doOnNext (or the other doOnXXX operators) async - java

Is there any method like doOnNext, but async?
For example, I need to do some long-running logging (sending a notification by email) for a particular element.
Scheduler myParallel = Schedulers.newParallel("my-parallel", 4);
Flux<Integer> ints = Flux.just(1, 2, 3, 4, 5)
        .publishOn(myParallel)
        .doOnNext(v -> {
            // For example, we need to do something time-consuming only for 3
            if (v.equals(3)) {
                try {
                    Thread.sleep(3000);
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            }
            System.out.println("LOG FOR " + v);
        });
ints.subscribe(System.out::println);
But why should I wait for the logging of 3? I want to do this logic asynchronously.
For now I have only this solution:
Thread.sleep(10000);
Scheduler myParallel = Schedulers.newParallel("my-parallel", 4);
Scheduler myParallel2 = Schedulers.newParallel("my-parallel2", 4);
Flux<Integer> ints = Flux.just(1, 2, 3, 4, 5)
        .publishOn(myParallel)
        .doOnNext(v -> {
            Mono.just(v).publishOn(myParallel2).subscribe(value -> {
                if (value.equals(3)) {
                    try {
                        Thread.sleep(3000);
                    } catch (InterruptedException e) {
                        e.printStackTrace();
                    }
                }
                System.out.println("LOG FOR " + value);
            });
        });
ints.subscribe(System.out::println);
Is there any "nice" solution for this?

If you're absolutely sure you don't care whether or not the email sending succeeds, then you could use "subscribe-inside-doOnNext", but I'm pretty confident that would be a mistake.
In order to have your Flux propagate an onError signal if the "logging" fails, the recommended approach is to use flatMap.
The good news is that since flatMap merges results from the inner publishers immediately into the main sequence, you can still emit each element immediately AND trigger the email. The only caveat is that the whole thing will only complete once the email-sending Mono has also completed. You can also check within the flatMap lambda whether the logging needs to happen at all (rather than inside the inner Mono):
// assuming sendEmail returns a Mono<Void> and takes care of offloading any blocking send onto another Scheduler
source // we assume elements are also published on the relevant Scheduler in `source`
    .flatMap(v -> {
        // if we can decide right away whether or not to send the email, better do it here
        if (shouldSendEmailFor(v)) {
            // we want to immediately re-emit the value, then trigger the email and wait for it to complete
            return Mono.just(v)
                .concatWith(
                    // since Mono<Void> never emits onNext, it is ok to cast it to V,
                    // which makes it compatible with concat, keeping the whole thing a Flux<V>
                    sendEmail(v).cast(V.class)
                );
        } else {
            return Mono.just(v);
        }
    });
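To make this concrete for the original Flux<Integer>, here is a minimal sketch, assuming a hypothetical sendEmail(int) helper that returns a Mono<Void> and offloads the slow send onto another Scheduler:
Flux<Integer> ints = Flux.just(1, 2, 3, 4, 5)
        .flatMap(v -> {
            if (v == 3) {
                // re-emit v right away, then wait for the email Mono to complete
                return Mono.just(v)
                        .concatWith(sendEmail(v).cast(Integer.class));
            }
            return Flux.just(v);
        });
ints.subscribe(System.out::println);

// hypothetical helper; Thread.sleep stands in for the slow email call
static Mono<Void> sendEmail(int v) {
    return Mono.<Void>fromRunnable(() -> {
        try {
            Thread.sleep(3000);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        System.out.println("LOG FOR " + v);
    }).subscribeOn(Schedulers.boundedElastic());
}
The main sequence still prints 1..5 without waiting, but the whole Flux only completes once the email Mono for 3 has also completed.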

Flux<Integer> ints = Flux.just(1, 2, 3, 4, 5)
        .flatMap(integer -> {
            if (integer != 3) {
                return Mono.just(integer)
                        .map(integer1 -> {
                            System.out.println(integer1);
                            return integer;
                        })
                        .subscribeOn(Schedulers.parallel());
            } else {
                return Mono.just(integer)
                        .delayElement(Duration.ofSeconds(3))
                        .map(integer1 -> {
                            System.out.println(integer1);
                            return integer;
                        })
                        .subscribeOn(Schedulers.parallel());
            }
        });
ints.subscribe();

Related

Project Reactor - Parallel Execution

I have the below Flux,
@Test
public void fluxWithRange_CustomTest() {
    Flux<Integer> intFlux = Flux.range(1, 10).flatMap(i -> {
        if (i % 2 == 0) {
            try {
                Thread.sleep(5000);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
            return Mono.just(i);
        } else {
            return Mono.just(i);
        }
    }, 2).subscribeOn(Schedulers.newBoundedElastic(2, 2, "test")).log();

    StepVerifier.create(intFlux).expectNext(1, 2, 3, 4, 5, 6, 7, 8, 9, 10).verifyComplete();
}
I was expecting this to run in parallel; however, it just executes on a single thread.
The subscribeOn method only provides a way to move execution to a different thread when "someone" subscribes to your Flux. What it means is that when you use the StepVerifier you are subscribing to the Flux, and because you defined a Scheduler, the execution is moved to one of the threads provided by that Scheduler. This does not imply that the Flux is going to jump between multiple threads.
The behaviour you are expecting can be achieved by adding a second subscribeOn, but this time on the Mono you are using inside the flatMap. When flatMap now subscribes to that inner Mono, it will use another thread.
If you change your code to something like this:
@Test
public void fluxWithRange_CustomTest() throws InterruptedException {
    Flux<Integer> intFlux = Flux.range(1, 10)
            .flatMap(i -> subFlux(i), 2)
            .subscribeOn(Schedulers.newBoundedElastic(2, 2, "test")).log();

    StepVerifier.create(intFlux).expectNext(1, 2, 3, 4, 5, 6, 7, 8, 9, 10).verifyComplete(); // This now fails.
}

private Mono<Integer> subFlux(int i) {
    Mono<Integer> result = Mono.create(sink -> {
        if (i % 2 == 0) {
            try {
                Thread.sleep(5000);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
        sink.success(i);
    });
    return result.subscribeOn(Schedulers.newBoundedElastic(2, 2, "other"));
}

Use Fallback Observable x number of times

I have an Observable which implements Error handling in the onErrorResumeNext method.
getMyObservable(params)
    .take(1)
    .doOnError(e -> {
    })
    .onErrorResumeNext(throwable -> {
        if (throwable.getMessage().contains("401")) {
            return getMyObservable(params);
        } else {
            sendServerCommunicationError();
            return Observable.error(throwable);
        }
    })
    .subscribe(result -> {
        // ...
    });
getMyObservable() returns a web service request from a generated client. The use case is: if we receive a 401 we may need to refresh the client with a new UserToken. That is why we use the fallback Observable in onErrorResumeNext() and cannot just use retry.
I have some questions:
Why do I need to implement doOnError? If I don't implement it, I sometimes get an "onError not implemented" exception. I thought that when I use onErrorResumeNext, this method is automatically used in case of an error.
How can I achieve that, for specific errors (like 401), I use a fallback Observable with some backoff time, and after 5 attempts produce an error? Can I combine retryWhen and onErrorResumeNext somehow, or is it done differently?
Why do I need to implement doOnError?
You don't; doOnError is not an error handler but a peek into the error channel. You have to implement an error handler in subscribe:
.subscribe(result -> {
        // ...
    },
    error -> {
        // ...
    });
How can I achieve that on specific Errors (like 401) I use a fallback Observable with some backoff time and after 5 Times
Use retryWhen:
Observable.defer(() -> getMyObservable(params))
    .retryWhen(errors -> {
        AtomicInteger count = new AtomicInteger();
        return errors.flatMap(error -> {
            if (error.toString().contains("401")) {
                int c = count.incrementAndGet();
                if (c <= 5) {
                    return Observable.timer(c, TimeUnit.SECONDS);
                }
                return Observable.error(new Exception("Failed after 5 retries"));
            }
            return Observable.error(error);
        });
    })
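To also address the second half of the question (combining this with onErrorResumeNext), one possible arrangement is to let retryWhen handle the 401 backoff and place onErrorResumeNext after it to deal with whatever error finally escapes, reusing the asker's own getMyObservable(params) and sendServerCommunicationError(). This is a sketch, not a drop-in solution:
Observable.defer(() -> getMyObservable(params))
    .retryWhen(errors -> {
        AtomicInteger count = new AtomicInteger();
        return errors.flatMap(error -> {
            if (error.toString().contains("401") && count.incrementAndGet() <= 5) {
                // linear backoff: 1s, 2s, ... 5s before each retry
                return Observable.timer(count.get(), TimeUnit.SECONDS);
            }
            return Observable.error(error);
        });
    })
    .onErrorResumeNext(throwable -> {
        // a non-401 error, or the 401 that survived 5 retries
        sendServerCommunicationError();
        return Observable.error(throwable);
    })
    .subscribe(
        result -> { /* handle result */ },
        error -> { /* final failure */ });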

Chain CompletableFuture and stop on first success

I'm consuming an API that returns CompletableFutures for querying devices (similar to digitalpetri modbus).
I need to call this API with a couple of options to query a device and figure out what it is - this is basically trial and error until it succeeds. These are embedded device protocols that I cannot change, but you can think of the process as working similar to the following:
Are you an apple?
If not, then are you a pineapple?
If not, then are you a pen?
...
While the API uses futures, in reality the communications are serial (going over the same physical piece of wire), so they will never execute concurrently. Once I know what the device is, I want to be able to stop trying and let the caller know what it is.
I already know that I can get the result of only one of the futures with anyOf (see below), but that may result in additional attempts that should be avoided.
Is there a pattern for chaining futures where you stop once one of them succeeds?
The following is similar, but wasteful of very limited resources:
List<CompletableFuture<String>> futures = Arrays.asList(
        CompletableFuture.supplyAsync(() -> "attempt 1"),
        CompletableFuture.supplyAsync(() -> "attempt 2"),
        CompletableFuture.supplyAsync(() -> "attempt 3"));

CompletableFuture<String>[] futuresArray = futures.toArray(new CompletableFuture[0]);
CompletableFuture<Object> c = CompletableFuture.anyOf(futuresArray);
Suppose that you have a method that is "pseudo-asynchronous" as you describe, i.e. it has an asynchronous API but requires some locking to perform:
private final static Object lock = new Object();

private static CompletableFuture<Boolean> pseudoAsyncCall(int input) {
    return CompletableFuture.supplyAsync(() -> {
        synchronized (lock) {
            System.out.println("Executing for " + input);
            try {
                Thread.sleep(1000);
            } catch (InterruptedException e) {
                throw new RuntimeException(e);
            }
            return input > 3;
        }
    });
}
Given a List<Integer> of inputs that you want to check against this method, you can check each of them in sequence with recursive composition:
public static CompletableFuture<Integer> findMatch(List<Integer> inputs) {
    return findMatch(inputs, 0);
}

private static CompletableFuture<Integer> findMatch(List<Integer> inputs, int startIndex) {
    if (startIndex >= inputs.size()) {
        // no match found -- an exception could be thrown here if preferred
        return CompletableFuture.completedFuture(null);
    }
    return pseudoAsyncCall(inputs.get(startIndex))
            .thenCompose(result -> {
                if (result) {
                    return CompletableFuture.completedFuture(inputs.get(startIndex));
                } else {
                    return findMatch(inputs, startIndex + 1);
                }
            });
}
This would be used like this:
public static void main(String[] args) {
    List<Integer> inputs = Arrays.asList(0, 1, 2, 3, 4, 5);
    CompletableFuture<Integer> matching = findMatch(inputs);
    System.out.println("Found match: " + matching.join());
}
Output:
Executing for 0
Executing for 1
Executing for 2
Executing for 3
Executing for 4
Found match: 4
As you can see, it is not called for input 5, while your API (findMatch()) remains asynchronous.
I think the best you can do is, after your retrieval of the result,
futures.forEach(f -> f.cancel(true));
This will not affect the one that produced the result, and it tries its best to stop the others. Since (IIUC) you get them from an outside source, there's no guarantee it will actually interrupt their work.
However, since
this class has no direct control over the computation that causes it to be completed, cancellation is treated as just another form of exceptional completion
(from CompletableFuture doc), I doubt it will do what you actually want.
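A sketch of that idea, using the futures from the question: race them with anyOf, then make a best-effort attempt to cancel the rest as soon as one completes. As the quote above says, cancel(true) only completes the remaining futures exceptionally and cannot interrupt work they don't control:
List<CompletableFuture<String>> futures = Arrays.asList(
        CompletableFuture.supplyAsync(() -> "attempt 1"),
        CompletableFuture.supplyAsync(() -> "attempt 2"),
        CompletableFuture.supplyAsync(() -> "attempt 3"));

CompletableFuture<Object> first = CompletableFuture.anyOf(futures.toArray(new CompletableFuture[0]));

// once any future completes, try to cancel the others (best effort only)
first.whenComplete((result, error) -> futures.forEach(f -> f.cancel(true)));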

RxJava 2 Observable that onComplete resubmits itself

I'm new to RxJava. I'm trying to create an observable that, when it completes, will start all over again until I call dispose, but I'm facing an OutOfMemory error after a while. Below is a simplified example of what I'm trying to do:
public void start() throws RuntimeException {
    log.info("\t * Starting {} Managed Service...", getClass().getSimpleName());
    try {
        executeObserve();
        log.info("\t * Starting {} Managed Service...OK!", getClass().getSimpleName());
    } catch (Exception e) {
        log.info("Managed Service {} FAILED! Reason is {} ", getClass().getSimpleName(), e.getMessage(), e);
    }
}
start is invoked once at the initialization phase; executeObserve is as follows (in a simplified form). Notice that on onComplete I "resubmit" executeObserve:
public void executeObserve() throws RuntimeException {
    Observable<Book> booksObserve = manager.getAsObservable();
    booksObserve
        .map(Book::getAllOrders)
        .flatMap(Observable::fromIterable)
        .toList()
        .subscribeOn(Schedulers.io())
        .subscribe(collectedISBN ->
            Observable.fromIterable(collectedISBN)
                .buffer(10)
                // ...some more steps here...
                .toList()
                .toObservable()
                // resubmit
                .doOnComplete(this::executeObserve)
                .subscribe(validISBN -> {
                    // do something with the valid ones
                })
        );
}
My guess is that this is not the way to go if I want to resubmit my tasks, but I was not able to find any documentation.
the booksObserve is implemented as follows
public Observable<Book> getAsObservable() {
    return Observable.create(e -> {
        try (CloseableResultSet<Book> rs = (CloseableResultSet<Book>) datasource.retrieveAll()) {
            for (Book r : rs) {
                e.onNext(r);
            }
            e.onComplete();
        } catch (Exception ex) {
            e.onError(ex);
        }
    });
}
What is the correct way to constantly resubmit an operation until we call dispose or equivalent? I'm using RxJava 2
You have created an endless recursion; the loop will create more and more resources and at some point it will blow up with an OutOfMemory/StackOverflow exception.
In order to repeat the Observable's work you should use the repeat() operator; it resubscribes to the Observable when it receives onComplete().
Besides that, some general comments on your code:
Why are you nesting the second Observable inside the subscriber? You are breaking the chain; you can just continue the chain instead of creating a new Observable in the Subscriber.
Moreover, it seems (assuming Observable.fromIterable(collectedISBN) uses the collectedISBN received in onNext(); otherwise, where does it come from?) that you're collecting all items into a list and then flattening it again using fromIterable, so you can just continue the stream, something like this:
booksObserve
    .map(Book::getAllOrders)
    .flatMap(Observable::fromIterable)
    .buffer(10)
    // ...some more steps here...
    .toList()
    .toObservable()
    // resubmit
    .doOnComplete(this::executeObserve)
    .subscribeOn(Schedulers.io())
    .subscribe(validISBN -> {
        // do something with the valid ones
    });
Anyhow, with the nested Observable, the repeat() operator would just repeat the nested one and not the entire stream (which is what you actually want), as it is not connected to it.
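Applied to the un-nested chain above, a sketch of the repeat() version could look like this; repeat() resubscribes to the whole upstream (including booksObserve, i.e. a fresh getAsObservable() query) every time it completes, until the Disposable is disposed:
Disposable subscription = booksObserve
    .map(Book::getAllOrders)
    .flatMap(Observable::fromIterable)
    .buffer(10)
    // ...some more steps here...
    .toList()
    .toObservable()
    .repeat()                      // instead of doOnComplete(this::executeObserve)
    .subscribeOn(Schedulers.io())
    .subscribe(validISBN -> {
        // do something with the valid ones
    });
// later: subscription.dispose() stops the repetition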
In continuation to my question: repeat(), as @yosriz suggested, is the proper way to go. The following simple snippet demonstrates that the Observable source will be called on each repeat:
Observable<Integer> recursiveObservable = Observable.create(emitter -> {
    System.out.println("Calling to emit data");
    Lists.newArrayList(1, 2, 3, 4, 5, 6, 7, 8, 9, 0).forEach(emitter::onNext);
    emitter.onComplete();
});

recursiveObservable
    .buffer(2)
    .repeat()
    .subscribe(integers -> {
        System.out.println(integers);
        TimeUnit.SECONDS.sleep(1);
    });

Generate infinite sequence of Natural numbers using RxJava

I am trying to write a simple program using RxJava to generate an infinite sequence of natural numbers. So far, I have found two ways to generate a sequence of numbers, using Observable.timer() and Observable.interval(). I am not sure if these functions are the right way to approach this problem. I was expecting a simple function like the one we have in Java 8 to generate infinite natural numbers.
IntStream.iterate(1, value -> value +1).forEach(System.out::println);
I tried using IntStream with Observable but that does not work correctly. It sends an infinite stream of numbers only to the first subscriber. How can I correctly generate an infinite natural number sequence?
import rx.Observable;
import rx.functions.Action1;

import java.util.stream.IntStream;

public class NaturalNumbers {
    public static void main(String[] args) {
        Observable<Integer> naturalNumbers = Observable.<Integer>create(subscriber -> {
            IntStream stream = IntStream.iterate(1, val -> val + 1);
            stream.forEach(naturalNumber -> subscriber.onNext(naturalNumber));
        });

        Action1<Integer> first = naturalNumber -> System.out.println("First got " + naturalNumber);
        Action1<Integer> second = naturalNumber -> System.out.println("Second got " + naturalNumber);
        Action1<Integer> third = naturalNumber -> System.out.println("Third got " + naturalNumber);

        naturalNumbers.subscribe(first);
        naturalNumbers.subscribe(second);
        naturalNumbers.subscribe(third);
    }
}
The problem is that on naturalNumbers.subscribe(first); the OnSubscribe you implemented is called, and you are doing a forEach over an infinite stream, which is why your program never terminates.
One way to deal with it is to subscribe them asynchronously on a different thread. To easily see the results, I had to introduce a sleep into the Stream processing:
Observable<Integer> naturalNumbers = Observable.<Integer>create(subscriber -> {
    IntStream stream = IntStream.iterate(1, i -> i + 1);
    stream.peek(i -> {
        try {
            // Added to visibly see printing
            Thread.sleep(50);
        } catch (InterruptedException e) {
        }
    }).forEach(subscriber::onNext);
});

final Subscription subscribe1 = naturalNumbers
        .subscribeOn(Schedulers.newThread())
        .subscribe(first);
final Subscription subscribe2 = naturalNumbers
        .subscribeOn(Schedulers.newThread())
        .subscribe(second);
final Subscription subscribe3 = naturalNumbers
        .subscribeOn(Schedulers.newThread())
        .subscribe(third);

Thread.sleep(1000);
System.out.println("Unsubscribing");
subscribe1.unsubscribe();
subscribe2.unsubscribe();
subscribe3.unsubscribe();
Thread.sleep(1000);
System.out.println("Stopping");
Observable.Generate is exactly the operator to solve this class of problem reactively. I also assume this is a pedagogical example, since using an iterable for this is probably better anyway.
Your code produces the whole stream on the subscriber's thread. Since it is an infinite stream the subscribe call will never complete. Aside from that obvious problem, unsubscribing is also going to be problematic since you aren't checking for it in your loop.
You want to use a scheduler to solve this problem - certainly do not use subscribeOn since that would burden all observers. Schedule the delivery of each number to onNext - and as a last step in each scheduled action, schedule the next one.
Essentially this is what Observable.generate gives you - each iteration is scheduled on the provided scheduler (which defaults to one that introduces concurrency if you don't specify it). Scheduler operations can be cancelled and avoid thread starvation.
Rx.NET solves it like this (actually there is an async/await model that's better, but not available in Java afaik):
static IObservable<int> Range(int start, int count, IScheduler scheduler)
{
    return Observable.Create<int>(observer =>
    {
        return scheduler.Schedule(0, (i, self) =>
        {
            if (i < count)
            {
                Console.WriteLine("Iteration {0}", i);
                observer.OnNext(start + i);
                self(i + 1);
            }
            else
            {
                observer.OnCompleted();
            }
        });
    });
}
Two things to note here:
The call to Schedule returns a subscription handle that is passed back to the observer
The Schedule is recursive - the self parameter is a reference to the scheduler used to call the next iteration. This allows for unsubscription to cancel the operation.
Not sure how this looks in RxJava, but the idea should be the same. Again, Observable.generate will probably be simpler for you as it was designed to take care of this scenario.
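For reference, a rough RxJava 1.x equivalent of the recursive-scheduling idea could look like the following sketch (imports from rx.* and java.util.concurrent.atomic assumed). The Scheduler.Worker is added to the Subscriber so that unsubscribing also cancels the next scheduled emission:
Observable<Integer> naturalNumbers = Observable.create(subscriber -> {
    Scheduler.Worker worker = Schedulers.newThread().createWorker();
    subscriber.add(worker);                 // unsubscribe() also disposes the worker
    AtomicInteger next = new AtomicInteger(1);
    worker.schedule(new Action0() {
        @Override
        public void call() {
            if (!subscriber.isUnsubscribed()) {
                subscriber.onNext(next.getAndIncrement());
                worker.schedule(this);      // recursively schedule the next emission
            }
        }
    });
});
Each emission is a separately scheduled action, so unsubscription simply prevents the next one from being scheduled, which is the same property the Rx.NET sample relies on.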
When creating infinite sequences, care should be taken to:
subscribe and observe on different threads; otherwise you will only serve a single subscriber
stop generating values as soon as the subscription terminates; otherwise runaway loops will eat your CPU
The first issue is solved by using subscribeOn(), observeOn() and various schedulers.
The second issue is best solved by using library provided methods Observable.generate() or Observable.fromIterable(). They do proper checking.
Check this:
Observable<Integer> naturalNumbers =
        Observable.<Integer, Integer>generate(() -> 1, (s, g) -> {
            logger.info("generating {}", s);
            g.onNext(s);
            return s + 1;
        }).subscribeOn(Schedulers.newThread());

Disposable sub1 = naturalNumbers
        .subscribe(v -> logger.info("1 got {}", v));
Disposable sub2 = naturalNumbers
        .subscribe(v -> logger.info("2 got {}", v));
Disposable sub3 = naturalNumbers
        .subscribe(v -> logger.info("3 got {}", v));

Thread.sleep(100);
logger.info("unsubscribing...");
sub1.dispose();
sub2.dispose();
sub3.dispose();
Thread.sleep(1000);
logger.info("done");
