RxJava with Vert.x: "Cannot have multiple subscriptions" exception - Java

I'm trying to avoid Vert.x callback hell with RxJava, but I get "rx.exceptions.OnErrorNotImplementedException: Cannot have multiple subscriptions". What's wrong here?
public class ShouldBeBetterSetter extends AbstractVerticle {
    @Override
    public void start(Future<Void> startFuture) throws Exception {
        Func1<AsyncMap<String, Long>, Observable<Void>> obtainAndPutValueToMap = stringLongAsyncMap -> {
            Long value = System.currentTimeMillis();
            return stringLongAsyncMap.putObservable("timestamp", value)
                    .doOnError(Throwable::printStackTrace)
                    .doOnNext(aVoid -> System.out.println("successfully put"));
        };
        Observable<AsyncMap<String, Long>> clusteredMapObservable =
                vertx.sharedData().<String, Long>getClusterWideMapObservable("mymap")
                        .doOnError(Throwable::printStackTrace);
        vertx.periodicStream(3000).toObservable()
                .flatMap(l -> clusteredMapObservable.flatMap(obtainAndPutValueToMap))
                .forEach(o -> {
                    System.out.println("just printing.");
                });
    }
}
A working verticle (without Rx) can be found here:
https://gist.github.com/IvanZelenskyy/9d50de8980b7bdf1e959e19593f7ce4a

vertx.sharedData().getClusterWideMapObservable("mymap") returns an observable that supports only a single subscriber - hence the exception. One solution worth a try is:
Observable<AsyncMap<String, Long>> clusteredMapObservable =
        Observable.defer(
                () -> vertx.sharedData().<String, Long>getClusterWideMapObservable("mymap")
        );
That way, every time clusteredMapObservable.flatMap() is called, it subscribes to a new observable returned by Observable.defer().
EDIT
In case it's OK to reuse the same AsyncMap, as pointed out by @Ivan Zelenskyy, the solution can be
Observable<AsyncMap<String, Long>> clusteredMapObservable =
        vertx.sharedData().<String, Long>getClusterWideMapObservable("mymap").cache();

What's happening is that on each periodic emission, the forEach re-subscribes to the clusteredMapObservable variable you defined above.
To fix it, just move the call to vertx.sharedData().<String,Long>getClusterWideMapObservable("mymap") inside your periodic stream's flatMap.
Something like this:
vertx.periodicStream(3000).toObservable()
        .flatMap(l -> vertx.sharedData().<String, Long>getClusterWideMapObservable("mymap")
                .doOnError(Throwable::printStackTrace)
                .flatMap(obtainAndPutValueToMap))
        .forEach(o -> {
            System.out.println("just printing.");
        });
UPDATE
If you don't like a lambda inside a lambda, then don't use one. Here's an update without it:
vertx.periodicStream(3000).toObservable()
        .flatMap(l -> {
            return vertx.sharedData().<String, Long>getClusterWideMapObservable("mymap");
        })
        .doOnError(Throwable::printStackTrace)
        .flatMap(obtainAndPutValueToMap)
        .forEach(o -> {
            System.out.println("just printing.");
        });
PS - Your call to .flatMap(obtainAndPutValueToMap) is also a lambda inside a lambda - you've just moved it into a function.

Related

Infinite flux of finite fluxes. Or: how to reconnect to a websocket stream without completing the flux

I'm not sure if what I'm trying to do is anywhere near best practice. If not, I'd like to know!
I'm building a system that reacts to external events posted on websockets. After authentication I can request events to be emitted and listen for them on a single flux. I can attach actuators to that flux to do stuff (think home automation, button presses turn on lights).
A problem occurs when the websocket connection drops: the flux completes and I have to re-attach all the listening actuators to a new flux on a new connection. I thought it might be easier to have an infinite flux that just starts emitting again when there is a new connection with new events (auth and requesting the events have to be done again during startup of the new connection, of course).
I came up with a solution that does what I want it to do, but it feels like a big hack:
private Flux<String> finite() {
    return Flux.interval(Duration.ofMillis(10)).handle((i, sink) -> {
        if (i == 10) {
            sink.complete();
        } else {
            sink.next(i);
        }
    }).map(it -> "hi :" + it).take(100);
}

public static class Holder {
    @Setter
    FluxSink<Object> sink;

    public void next(EmitterProcessor<String> output) {
        sink.next(output);
    }
}

EmitterProcessor<String> output;
Holder holder;

@DisplayName("Should create infinite flux of finite fluxes")
@Test
public void infiniteFlux() throws InterruptedException {
    AtomicInteger count1 = new AtomicInteger(0);
    AtomicInteger count2 = new AtomicInteger(0);
    holder = new Holder();

    ConnectableFlux flux = Flux.create(fluxSink -> holder.setSink(fluxSink)).map(it -> (Flux) it).flatMap(it -> it).publish();
    flux.connect();
    flux.subscribe((it) -> count1.incrementAndGet());
    Thread.sleep(150);

    output = EmitterProcessor.create();
    finite().subscribeWith(output);
    holder.next(output);
    Thread.sleep(150);

    flux.subscribe((it) -> count2.incrementAndGet());
    output = EmitterProcessor.create();
    finite().subscribeWith(output);
    holder.next(output);
    Thread.sleep(150);

    assertEquals(20, count1.get());
    assertEquals(10, count2.get());
}
Can this maybe be done with a processor instead? Or should I forget about it and reconnect everything on completion of the websocket connection?
This is my current websocket implementation. I don't really like that I need to "know" all the types of events when I start listening, but that is because I'm still not 100% sure how to handle sending and receiving with websockets:
public void startListening(Flux<HaEventRequest> eventRequests) {
    Mono<String> login = Mono.just(loginPayload());
    Flux<String> subscribe = eventRequests
            .doOnNext(req -> log.info("Registering: " + req.getEventType()))
            .map(Json::write);
    Flux<String> input = login.concatWith(subscribe);

    EmitterProcessor<HaEventResponse> output = EmitterProcessor.create();
    Mono<Void> sessionMono = client.execute(URI.create(wsUrl), session -> session.send(input.map(session::textMessage))
            .thenMany(session.receive()
                    .map(WebSocketMessage::getPayloadAsText)
                    .map(message -> Json.read(message, HaEventResponse.class))
                    .filter(eventResponse -> eventResponse.getEvent() != null)
                    .subscribeWith(output)
                    .then())
            .then());

    eventFlux = output.doOnSubscribe(s -> sessionMono.subscribe())
            .onErrorContinue((throwable, o) -> log.error("Error occurred during web request. Dumping it: ", throwable));
}
public Flux<HaEventResponse> streamEvents(String eventType) {
    return eventFlux.filter(eventResponse -> eventResponse.getEvent()
            .getEventType()
            .equals(eventType));
}
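Not part of the original post, but one pattern that might avoid the holder/processor plumbing entirely is to defer the connection so each subscription builds a fresh one, and then resubscribe whenever it terminates. In this sketch connectOnce() is a hypothetical method that performs the login/subscribe handshake and returns the finite per-connection Flux described above:

// Sketch only: connectOnce() is assumed to return the finite Flux of one websocket session.
Flux<HaEventResponse> events = Flux.defer(this::connectOnce) // build a fresh connection per subscription
        .repeat()   // resubscribe when a connection completes normally
        .retry()    // resubscribe when it fails with an error
        .share();   // one live connection shared by all downstream actuators

Downstream subscribers of events never see a completion signal as long as repeat/retry keeps reconnecting, which is the "infinite flux of finite fluxes" behaviour the test above simulates by hand.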

Filter a List and return the response from a CompletableFuture java async operation

Hello, I have to filter the result of a CompletableFuture and store it in an object variable so I can work with that object after the filter. The method that extracts the list from the database is located in salonService:
public CompletableFuture<List<SalonDTO>> listAllSalons() {
    return salonRepository.findAllAsync()
            .thenApply(salonList -> ObjectMapperUtils.mapAll(salonList, SalonDTO.class));
}
Then I'm trying to filter the data in the following way:
public CompletableFuture<List<SalonDTO>> listKidsByGuardian1() {
    return salonService.listAll()
            .thenApply(salonDTOList -> {
                findsalonByChildAge(salonDTOList);
                return salonDTOList;
            });
}
private SalonDTO findsalonByChildAge(List<SalonDTO> salonDTOList) {
    salonDTOList.stream()
            .filter(salon -> salon.getMinAge() > 13);
}
I'm not very familiar with the CompletableFuture class, so I don't understand how I can get a simple object out of this async operation. Besides that, it is not easy to debug these async methods. Any advice?
Thanks!
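A minimal sketch (not from the original post) of how the filtering could be completed inside thenApply. It assumes SalonDTO exposes getMinAge() as in the question and reuses the listAllSalons() method above; listTeenSalons is a hypothetical name:

// Sketch only: the intermediate filter() does nothing until a terminal operation
// such as collect() runs, so collect the filtered list and return it.
public CompletableFuture<List<SalonDTO>> listTeenSalons() {
    return salonService.listAllSalons()
            .thenApply(salonDTOList -> salonDTOList.stream()
                    .filter(salon -> salon.getMinAge() > 13)
                    .collect(Collectors.toList()));
}

// To get the plain list (blocking the calling thread) you can call join():
// List<SalonDTO> salons = listTeenSalons().join();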

RxJava 2 - Observable that accepts more Observables at any time?

I'm currently using RxJava 2 and have a use case where multiple Observables need to be consumed by a single Camel route subscriber.
Using this solution as a reference, I have a partly working solution: RxJava - Merged Observable that accepts more Observables at any time?
I'm planning to use a PublishProcessor<T> that will be subscribed to one Camel reactive stream subscriber, and then maintain a ConcurrentHashSet<Flowable<T>> to which I can dynamically add new Observables.
I'm currently stuck on how I can add/manage Flowable<T> instances with the PublishProcessor.
I'm really new to RxJava, so any help is appreciated! This is what I have so far:
PublishProcessor<T> publishProcessor = PublishProcessor.create();
CamelReactiveStreamsService camelReactiveStreamsService =
        CamelReactiveStreams.get(camelContext);
Subscriber<T> subscriber =
        camelReactiveStreamsService.streamSubscriber("t-class", T.class);

Set<Flowable<T>> flowableSet = Collections.newSetFromMap(new ConcurrentHashMap<Flowable<T>, Boolean>());

public void add(Flowable<T> flowableOrder) {
    flowableSet.add(flowableOrder);
}

public void subscribe() {
    publishProcessor.flatMap(x -> flowableSet.forEach(/* TODO */))
            .subscribe(subscriber);
}
You can have a single Processor and subscribe to more than one observable stream. You would need to manage the subscriptions by adding and removing them as you add and remove observables.
PublishProcessor<T> publishProcessor = PublishProcessor.create();
Map<Flowable<T>, Disposable> subscriptions = new ConcurrentHashMap<>();

void addObservable( Flowable<T> flowable ) {
    subscriptions.computeIfAbsent( flowable, fkey ->
            // Subscribing with explicit consumers returns a Disposable and keeps a
            // completing source from terminating the shared processor.
            flowable.subscribe( publishProcessor::onNext, publishProcessor::onError ) );
}

void removeObservable( Flowable<T> flowable ) {
    Disposable d = subscriptions.remove( flowable );
    if ( d != null ) {
        d.dispose();
    }
}

void close() {
    for ( Disposable d : subscriptions.values() ) {
        d.dispose();
    }
}
Use the flowable as the key to the map, and add or remove subscriptions.
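A hypothetical usage sketch (using String payloads and System.out in place of the Camel subscriber; the names hub and ticks are made up) to show the wiring:

PublishProcessor<String> hub = PublishProcessor.create();
hub.subscribe(System.out::println);                        // the single downstream consumer

Flowable<String> ticks = Flowable.interval(1, TimeUnit.SECONDS).map(i -> "tick " + i);
Disposable d = ticks.subscribe(hub::onNext, hub::onError); // dynamically attach a source
// later: d.dispose() detaches this source without completing the hub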

RxJava: Invoking multiple unrelated methods sequentially

I use RxJava in my project and I have a situation where two methods are called one after the other and both return void. Each of these methods internally uses RxJava.
Pseudo:
void sendMsg_1() {
    ...
    // Fetches data from the DB using RxJava to send a message to the client.
    ...
}
void sendMsg_2() {
    ...
    // Uses RxJava to send a message to the client.
    ...
}
Invoking code:
sendMsg_1();
sendMsg_2();
In practice, sendMsg_2 is faster and the client gets its message before sendMsg_1 sends its own. This is not good for me; I would like the output of Msg1 to be sent before Msg2.
How can I do that?
Should I artificially return a dummy observable just so I can use .flatMap, as follows:
sendMsg_1()
        .flatMap(msgObj -> {
            return sendMsg_2();
        }).subscribe();
Is there a better way?
Thank you!
This is the right way to wrap void methods, the RxJava way:
public rx.Completable func_A() {
    return Completable.create(subscriber -> {
        // func_A logic
        if (ok) subscriber.onCompleted();
        else subscriber.onError(throwable);
    });
}

func_A()
        .doOnCompleted(() -> func_B())
        .subscribe();
If it helps anyone...enjoy :-)
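A possible follow-up sketch (not from the original answer): if func_B is also converted to return a Completable, the two can be chained so that func_B is only subscribed after func_A completes:

// Sketch, assuming both methods return rx.Completable (RxJava 1.x).
func_A()
        .concatWith(func_B())                 // func_B starts only after func_A completes
        .doOnError(Throwable::printStackTrace)
        .subscribe();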

How do I create and complete a play.libs.F.Promise?

I'd like to create a play.libs.F.Promise from a call to an async third-party service so I can chain the call and return a Promise<Result> instead of blocking inside the controller. Something like this:
final Promise<String> promise = new Promise();
service.execute(new Handler() {
    public void onSuccess(String result) {
        promise.complete(result);
    }
});
return promise;
Unfortunately, there does not appear to be a way to create an empty play.libs.F.Promise, and there is no method to complete a promise either.
You have to use an F.RedeemablePromise.
RedeemablePromise<String> promise = RedeemablePromise.empty();
promise.map(string ->
        // This applies once the redeemable promise succeeds
        ok(string + " World!")
);

// In another thread, you may now complete the RedeemablePromise.
promise.success("Hello");
// OR you can fail the promise
promise.failure(new Exception("Problem"));
Assuming the current version of Play and play.libs.F.Promise, a promise can be created in two ways: 1) using a Scala Future and a Callable, or 2) using a Play Function0 (replace A with any class):
import static akka.dispatch.Futures.future;

// Using 1)
Promise<A> promise = Promise.wrap(future(
        new Callable<A>() {
            public A call() {
                // Do whatever
                return new A();
            }
        }, Akka.system().dispatcher()));

// Using 2) - This is described in the Play 2.2.1 documentation:
// http://www.playframework.com/documentation/2.2.1/JavaAsync
Promise<A> promise2 = Promise.promise(
        new Function0<A>() {
            public A apply() {
                // Do whatever
                return new A();
            }
        }
);
EDIT: When you can't modify the async block because it's provided by a third party, you can create an empty Scala promise (not a Play framework promise) instead. The Future backing that Scala Promise can then be wrapped into a play.libs.F.Promise as follows:
import akka.dispatch.Futures;

final scala.concurrent.Promise<String> promise = Futures.promise();
service.execute(new Handler() {
    public void onSuccess(String result) {
        promise.success(result);
    }
});
return Promise.wrap(promise.future());
You can return an empty promise by doing the following:
return F.Promise.pure(null);
You can create an F.Promise like this:
F.Promise<User> userPromise = F.Promise.promise(() -> getUserFromDb());
and use its value when it is ready:
userPromise.onRedeem(user -> showUserData(user));
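A small sketch (assuming Play 2.3's Java API, as in the answers above; getUserFromDb() and getName() are hypothetical) of how such a promise is typically returned from a controller action instead of blocking:

public static F.Promise<Result> showUser() {
    return F.Promise.promise(() -> getUserFromDb())       // runs off the request thread
            .map(user -> ok("Hello " + user.getName()));  // map to a Result when ready
}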
