RxJava: Combining multiple Observers after filter - Java

Following is my current code:
private final List<Disposable> subscriptions = new ArrayList<>();

for (Instrument instrument : instruments) {
    // Waiting for the OrderBook to generate reliable results.
    GenericBook book =
        service
            .getBook(instrument.getData())
            .filter(gob -> onBookUpdate(gob))
            .blockingFirst();

    subscriptions.add(
        service
            .getBook(instrument.getData())
            .subscribe(
                gob -> {
                    try {
                        onBookUpdate(gob);
                    } catch (Exception e) {
                        logger.error("Error on subscription:", e);
                    }
                },
                e -> logger.error("Error on subscription:", e)));
}
So, for each instrument, it first blocks and waits until onBookUpdate(gob) returns true (onBookUpdate returns a boolean).
Once the first onBookUpdate returns true, I push that subscriber into the subscriptions variable.
This is slow because I have to wait for each instrument before moving on to the next one.
My goal is to run all of these in parallel, wait for them all to finish, and then push them into the subscriptions variable.
I tried zip but it didn't work:
List<Observable<GenericOrderBook>> obsList = null;
for (Instrument instrument : instruments) {
    // This throws a NullPointerException.
    obsList.add(service
        .getBook(instrument.getData())
        .filter(gob -> onBookUpdate(gob))
        .take(1));
}
// Somehow wait over here until every instrument gets its first onBookUpdate as true.
String o = Observable.zip(obsList, (i) -> i[0]).blockingLast();
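For reference, the NullPointerException comes from obsList never being initialized. A sketch of the same attempt with a real list (still waiting for every instrument's first matching update before continuing; the zipper is changed to a simple placeholder):

List<Observable<GenericOrderBook>> obsList = new ArrayList<>();
for (Instrument instrument : instruments) {
    obsList.add(service
        .getBook(instrument.getData())
        .filter(gob -> onBookUpdate(gob))
        .take(1));
}
// zip subscribes to all the inner observables, so blockingLast() returns once each
// instrument has produced its first book for which onBookUpdate(...) is true.
Observable.zip(obsList, results -> results.length).blockingLast();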

When using observables and the like, one should embrace them wholeheartedly. Part of embracing them is separating the configuration and construction of your pipeline from its execution.
In other words, configure your pipeline upfront and then, when the data is available, send the data through it.
Furthermore, embracing observables implies avoiding for-loops.
I'm not 100% sure what your use case is, but what I'd suggest is to create a pipeline that takes an instrument as input and returns a subscription...
So something like
service.getBook(instrument.getData())
    .flatMap(gob -> {
        onBookUpdate(gob);
        return Observable.just(gob);
    });
That will return an Observable that you can subscribe to and add the result to the subscriptions.
Then create a seed observable that pumps the instrument objects into it.
Not sure of some of the details of your API, so come back to me if this is not clear or I've made a wrong assumption.
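A rough sketch of that shape (keeping your service, instruments, onBookUpdate and subscriptions names and assuming RxJava 2; adjust to your API):

Disposable all =
    Observable.fromIterable(instruments)                          // the seed observable of instruments
        .flatMap(instrument -> service.getBook(instrument.getData()))
        .subscribe(
            gob -> {
                try {
                    onBookUpdate(gob);
                } catch (Exception e) {
                    logger.error("Error on subscription:", e);
                }
            },
            e -> logger.error("Error on subscription:", e));

subscriptions.add(all);

Because flatMap subscribes to each instrument's book stream as it is emitted, there is no per-instrument blocking step.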

I am assuming instruments is a List. If so, you can do something like this:
Observable
    .fromIterable(instruments)
    // Emits the items from the instruments list one by one and passes each to getBook()
    .flatMap(instrument -> service.getBook(instrument.getData()))
    .filter(gob -> onBookUpdate(gob))
    // onComplete will be called if no items pass the filter
    .switchIfEmpty(Observable.empty())
    .subscribe(
        onBookUpdateResponse -> {
            // Do what you want with each matching update
        },
        error -> logger.error("Error on subscription:", error)
    );
Hope this helps.

Related

What's the easiest way to wait for Mono completion in the background?

We are given a Mono that handles some action (say, a database update) and returns a value.
We want to add that Mono (transformed) to a special list that contains actions to be completed, for example during shutdown.
That Mono may be eagerly subscribed after being added to the list, to start processing now, or .subscribe() might not be called at all, meaning it will only be subscribed during shutdown.
During shutdown we can iterate over the list in the following way:
for (Mono mono : specialList) {
    Object value = mono.block(); // (do something with value)
}
How do I transform the original Mono so that, when the shutdown code executes and the Mono was previously subscribed, the action will not be triggered again, but instead it will either wait for it to complete or replay its stored return value?
OK, it looks like it is as simple as calling mono.cache(). This is how I used it in practice:
public Mono<Void> addShutdownMono(Mono<Void> mono) {
    Mono<Void> cached = mono.cache();
    Mono<Void> newMono = cached.doFinally(signal -> shutdownMonos.remove(cached));
    shutdownMonos.add(cached);
    return newMono;
}

public Function<Mono<Void>, Mono<Void>> asShutdownAwaitable() {
    return mono -> addShutdownMono(mono);
}

database.doSomeAction()
    .as(asShutdownAwaitable())
    .subscribe(); // Or don't subscribe at all, deferring until shutdown
Here is the actual shutdown code.
It was also important to me that they execute in the order they were added, in case the user chose not to eagerly subscribe them; that's the reason for Flux.concat instead of Flux.merge.
public void shutdown() {
    Flux.concat(
            Lists.transform(new ArrayList<>(shutdownMonos), mono -> mono.onErrorResume(err -> {
                logger.error("Async exception during shutdown, ignoring", err);
                return Mono.empty();
            })))
        .blockLast();
}
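For what it's worth, the property of cache() being relied on here is that the underlying work runs at most once and its outcome is replayed to later subscribers. A tiny standalone sketch (my own illustration with a hypothetical action, not the actual database call):

import java.time.Duration;
import java.util.concurrent.atomic.AtomicInteger;

import reactor.core.publisher.Mono;

public class CacheDemo {
    public static void main(String[] args) {
        AtomicInteger executions = new AtomicInteger();

        Mono<Integer> action = Mono
            .fromCallable(executions::incrementAndGet)  // stands in for the database update
            .delayElement(Duration.ofMillis(100))
            .cache();                                   // remember the outcome for later subscribers

        action.subscribe();                 // eager subscription: the action starts now
        Integer replayed = action.block();  // "shutdown" path: waits for, or replays, the same run

        System.out.println(replayed);           // 1
        System.out.println(executions.get());   // 1 -- the action ran only once
    }
}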

How does the flatMapDelayError flow work in Flux to make parallel calls?

I want to call two different downstream services and zip the results. I have tried the below; I want to confirm whether there is a better approach, and how flatMapDelayError works. I am not blocking the call anywhere. Will these two calls happen in parallel, or will the second call wait for the first one? I want both calls to happen in parallel.
Flux<EmpAddressDetail> empAddDetail = getTimeoutDuration()
    .flatMapDelayError(duration -> timeoutWrappedEmpDetailFlux(Service.getemeEmpAddress(empno),
        duration, Exception.ErrorCode.TIMED_OUT), CONCURRENCY, PREFETCH);

Flux<Employee> empInfo = getTimeoutDuration()
    .flatMap(duration -> mapEmpTypes(empTypes)
        .map(empTypedata -> Tuples.of(duration, empTypedata)))
    .flatMapDelayError(durationEmpTuple -> getEmpdetails(empno, durationEmpTuple.getT1(), durationEmpTuple.getT2())
        .filter(empdetails -> requestTypes.contains(empdetails.getType()))
        .doOnNext(empdetails -> empdetails.setEmpId(empno)), CONCURRENCY, PREFETCH);

return Flux.zip(empInfo, empAddDetail).map(tuple -> {
    Employee emp = tuple.getT1();
    emp.setAddressDetail(tuple.getT2());
    return emp;
});
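A general property worth keeping in mind here: zip subscribes to all of its sources when the zipped publisher itself is subscribed, so non-blocking sources proceed concurrently and the total latency is bounded by the slowest source, not the sum. A minimal sketch with hypothetical delayed sources (my own illustration, not the services above):

import java.time.Duration;

import reactor.core.publisher.Mono;

public class ZipConcurrencyDemo {
    public static void main(String[] args) {
        // Two hypothetical non-blocking "service calls", each taking about one second.
        Mono<String> addressCall = Mono.delay(Duration.ofSeconds(1)).map(tick -> "addressDetail");
        Mono<String> employeeCall = Mono.delay(Duration.ofSeconds(1)).map(tick -> "employee");

        long start = System.currentTimeMillis();
        String combined = Mono.zip(addressCall, employeeCall)
            .map(tuple -> tuple.getT2() + " + " + tuple.getT1())
            .block();
        long elapsed = System.currentTimeMillis() - start;

        System.out.println(combined);
        System.out.println("elapsed ms: " + elapsed); // roughly 1000, not 2000: both sources ran concurrently
    }
}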

RxJava: Merging Observable with Completable does not work

I have an Observable that at some point has to write things to the cache, and we would like to wait until those writes are done before finishing the whole operation on the observable (for reporting purposes).
For the purpose of testing, the cache-write Completable looks like this:
Completable.create(
    emitter ->
        new Thread(
                () -> {
                    try {
                        Thread.sleep(2000);
                        doSomething();
                        emitter.onComplete();
                    } catch (InterruptedException e) {
                        e.printStackTrace();
                    }
                })
            .start());
Since I have several cache writes, I try to merge them in a container class:
public class CacheInsertionResultsTracker {

    private Completable cacheInsertResultsCompletable;

    public CacheInsertionResultsTracker() {
        this.cacheInsertResultsCompletable = Completable.complete();
    }

    public synchronized void add(Completable cacheInsertResult) {
        this.cacheInsertResultsCompletable = this.cacheInsertResultsCompletable.mergeWith(cacheInsertResult);
    }

    public Completable getCompletable() {
        return this.cacheInsertResultsCompletable;
    }
}
And I try to merge it with the Observable in the following way:
CacheInsertionResultsTracker tracker = new ...;

observable
    .doOnNext(next -> tracker.add(next.writeToCache(...)))
    .mergeWith(Completable.defer(() -> tracker.getCompletable()))
    .subscribe(
        // on next
        this::logNextElement,
        // on error
        this::finishWithError,
        // on complete
        this::finishWithSuccess
    );
How could I make sure that, by the time finishWithSuccess is called, doSomething has completed?
The problem is that the Completable reference is updated every time I add a new one, and it happens after the mergeWith runs...
The solution that seems to work for our use case is to use concatWith + defer:
observable
    .doOnNext(next -> tracker.add(next.writeToCache(...)))
    .concatWith(Completable.defer(() -> tracker.getCompletable()))
    .subscribe(
        // on next
        this::logNextElement,
        // on error
        this::finishWithError,
        // on complete
        this::finishWithSuccess
    );
Concat ensures that the subscription to the Completable happens only after the Observable is done, and defer postpones obtaining the final Completable until that subscription happens (so all the objects have already been added to the tracker).
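To make the defer part concrete, here is a tiny standalone illustration (my own sketch, assuming RxJava 2, not part of the pipeline above) of the difference between resolving the Completable at assembly time and resolving it at subscription time:

import java.util.concurrent.atomic.AtomicReference;

import io.reactivex.Completable;

public class DeferDemo {
    public static void main(String[] args) {
        AtomicReference<Completable> latest = new AtomicReference<>(Completable.complete());

        Completable eager = latest.get();                           // resolved now: the initial empty Completable
        Completable deferred = Completable.defer(() -> latest.get()); // resolved only when subscribed

        // A "late" cache write added after the pipeline was assembled.
        latest.set(Completable.fromAction(() -> System.out.println("late cache write")));

        eager.subscribe();    // prints nothing: it only ever saw the initial Completable
        deferred.subscribe(); // prints "late cache write": defer picks up the latest state
    }
}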
Based on the comments, you could replace the Completable cache with a ReplaySubject<Completable>, apply a timeout to detect inactivity, and have the observable sequence end.
ReplaySubject<Completable> cache = ReplaySubject.create();

cache.onNext(completable);

observable.mergeWith(
    cache.flatMapCompletable(v -> v)
        .timeout(10, TimeUnit.MILLISECONDS, Completable.complete())
)
Edit:
Your updated example implies you want to run Completables in response to items in the main observable, isolated to that sequence, and wait for all of them to complete. This is a typical use case for flatMap:
observable.flatMap(
        next -> next.writeToCache(...).andThen(Observable.just(next))
    )
    .subscribe(
        this::logNextElement,
        // on error
        this::finishWithError,
        // on complete
        this::finishWithSuccess
    );

Limit for `onErrorContinue(...)` in Flux?

I have a (possibly infinite) Flux source that is supposed to first store each message (e.g. into a database) and then asynchronously forward the messages (e.g. using Spring WebClient).
The forwards, in case of failure, are supposed to log an error without completing the source Flux.
However, I realized that forwards within the flow (flatMap(...)) block execution of the source Flux after exactly 256 messages that cause exceptions (e.g. reactor.retry.RetryExhaustedException).
Here is a representative example that fails in the assert, since only 256 messages are processed:
@Test
@SneakyThrows
public void sourceBlockAfter256Exceptions() {
    int numberOfRequests = 500;
    Set<Integer> sink = new HashSet<>();

    Flux
        .fromStream(IntStream.range(0, numberOfRequests).boxed())
        .map(sink::add)
        .flatMap(i -> Mono
            // normally the forwards are contained here, e.g. by means of Mono.when(...).thenReturn(...).retryWhen(...):
            .error(new Exception("any"))
        )
        .onErrorContinue((throwable, o) -> log.error("Error", throwable))
        .subscribe();

    Thread.sleep(3000);

    Assertions.assertEquals(numberOfRequests, sink.size());
}
Doing the forward within the subscribe(...) doesn't block the source Flux, but that's certainly not a solution, since I can't afford to lose messages.
Questions:
What has happened here? (probably related to some state stored in just one bit)
How can I do this correctly?
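One commonly suggested workaround (an assumption on my side, not from the original post) is to resolve the failure inside the inner publisher with onErrorResume, so the inner Mono completes empty instead of pushing an error through flatMap, whose default concurrency of 256 matches the observed limit. A sketch mirroring the test above:

@Test
@SneakyThrows
public void innerOnErrorResumeDoesNotBlockTheSource() {
    int numberOfRequests = 500;
    Set<Integer> sink = new HashSet<>();

    Flux
        .fromStream(IntStream.range(0, numberOfRequests).boxed())
        .map(sink::add)
        .flatMap(i -> Mono
            .error(new Exception("any"))
            // resolve the error where it happens, so the inner Mono completes empty
            .onErrorResume(throwable -> {
                log.error("Error", throwable);
                return Mono.empty();
            })
        )
        .subscribe();

    Thread.sleep(3000);

    Assertions.assertEquals(numberOfRequests, sink.size());
}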
EDIT:
According to the discussion below, I've constructed an example that uses FluxMessageChannel (which, to my understanding, is made for infinite streams and definitely not expected to block after 256 errors) and shows exactly the same behaviour:
@Test
@SneakyThrows
public void maxConnectionWithChannelTest() {
    int numberOfRequests = 500;
    Set<Integer> sink = new HashSet<>();

    FluxMessageChannel fluxMessageChannel = MessageChannels.flux().get();
    fluxMessageChannel.subscribeTo(
        Flux
            .fromStream(IntStream
                .range(0, numberOfRequests).boxed()
                .map(i -> MessageBuilder.withPayload(i).build())
            )
            .map(Message::getPayload)
            .map(sink::add)
            .flatMap(i -> Mono.error(new Exception("whatever")))
    );

    Flux
        .from(fluxMessageChannel)
        .subscribe();

    Thread.sleep(3000);

    Assert.assertEquals(numberOfRequests, sink.size());
}
EDIT:
I just raised an issue in the reactor core project: https://github.com/reactor/reactor-core/issues/2011

RxJava polling + manual refresh

I have a list I want to refresh every minute.
For example the user list here : https://github.com/android10/Android-CleanArchitecture/blob/master/domain/src/main/java/com/fernandocejas/android10/sample/domain/interactor/GetUserList.java
I add a periodic refresh using repeatWhen:
public Observable<List<User>> buildUseCaseObservable(Void unused) {
    return this.userRepository
        .users()
        .repeatWhen(new Function<Observable<Object>, ObservableSource<?>>() {
            @Override
            public ObservableSource<?> apply(Observable<Object> objectObservable) throws Exception {
                return objectObservable.delay(1, TimeUnit.MINUTES);
            }
        });
}
It works fine this way, calling onNext every minute.
But if I want to refresh this list immediately (because of a user action or a notification), I don't know how to do that.
Should I cancel/dispose the observable and start a new one?
Thanks
From your code I understand that the users list is generated and emitted upon subscription.
Here are some solutions I can think of, instead of unsubscribing and resubscribing upon the event to which you want to react immediately:
Instead of using the repeatWhen operator, use the interval creation operator combined with flatMap to subscribe to a new Observable every minute, and use the merge operator to add a reaction to the other event you are interested in. Something like this:
@Test
public void intervalObservableAndImmediateReaction() throws InterruptedException {
    Observable<String> obs = Observable.interval(1, TimeUnit.SECONDS)
        .cast(Object.class)
        .mergeWith(
            Observable.just("mockedUserClick")
                .delay(500, TimeUnit.MILLISECONDS))
        .flatMap(
            timeOrClick -> Observable.just("Generated upon subscription")
        );

    obs.subscribe(System.out::println);

    Thread.sleep(3000); // to see the prints before ending the test
}
or adjusted to your needs (but the principle is the same):
Observable.interval(1, TimeUnit.MINUTES)
    .mergeWith(RxView.clicks(buttonView))
    .flatMap(timeOrClick -> this.userRepository.users());
You can use the flatMap operator as before, even while keeping your current working implementation and without merging with an interval: just keep your working code and, in another area of the program, chain it to the RxBinding of your choosing:
RxView.touches(yourViewVariable)
    .flatMap(motionEvent -> this.userRepository.users())
    .subscribe(theObserver);
Note that in this solution the subscriptions to the two observables are made independently. You'll probably be better off using different observers, or managing a subject or something along those lines (see the sketch below). A small test I ran showed one subscriber handled subscribing to two different observables with no problem (in RxJava 1; I didn't check RxJava 2 yet), but it feels iffy to me.
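As a rough sketch of the "manage a subject" idea (my own addition, assuming RxJava 2 plus the userRepository and User type from the question; showUsers is a hypothetical consumer): merge a manually driven PublishSubject with a periodic interval so both triggers feed the same pipeline.

PublishSubject<Object> manualRefresh = PublishSubject.create();

Observable<List<User>> users = Observable
    .interval(0, 1, TimeUnit.MINUTES)   // emits immediately, then every minute
    .cast(Object.class)
    .mergeWith(manualRefresh)           // the manual trigger shares the same pipeline
    .flatMap(trigger -> userRepository.users());

Disposable subscription = users.subscribe(this::showUsers);

// In a click handler or notification callback:
manualRefresh.onNext(new Object());

A single subscription then serves both the periodic and the manual refresh, so there is nothing to dispose and restart.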
If you aren't concerned with adjusting the refresh time after one of the other observables emits data you can do something like the following:
// Specific example of a user manually requesting a refresh
val request = Observable.create<String> { emitter ->
    refresh.setOnClickListener {
        emitter.onNext("Click Request")
    }
}
    .observeOn(Schedulers.io())
    .flatMap {
        userRepository.users()
    }

// Refresh based off of your original work; could use something like interval as well
val interval = userRepository.users()
    .subscribeOn(Schedulers.io())
    .repeatWhen { objectObservable ->
        objectObservable.delay(1, TimeUnit.MINUTES)
    }

// Combine them so that both emissions are received; you can even add another source
Observable.merge(request, interval)
    .observeOn(AndroidSchedulers.mainThread())
    .subscribe({
        contents.text = it.toString()
    }, {
        contents.text = it.toString()
    }, {
        println(contents.text)
    })
Then you don't have to dispose and resubscribe every time.
