I am working on Spring reactive and need to make multiple sequential calls to another REST API using WebClient.
The issue is that I can make the calls, but I cannot read the responses without subscribe or block.
I can't use subscribe or block, since this is reactive, non-blocking code. Is there any way I can merge the responses while reading them and return the result as a Flux?
Below is the piece of code where I am stuck.
public Mono<DownloadDataLog> getDownload(Token dto, Mono<DataLogRequest> request) {
    Mono<GraphQlCustomerResponse> profileResponse = customerProfileHandler.getMyUsageHomeMethods(dto, null);
    DownloadDataLog responseObj = new DownloadDataLog();
    ArrayList<Mono<List<dataUsageLogs>>> al = new ArrayList<>();
    return Mono.zip(profileResponse, request).flatMap(tuple2 -> {
        Flux<List<Mono<DataLogGqlRequest>>> userequest = prepareUserRequest(getListOfMdns(tuple2.getT1()),
                tuple2.getT2());
        Flux.from(userequest).flatMap(req -> {
            for (Mono<DataLogGqlRequest> logReq : req) {
                al.add(service.execute(logReq, dto));
            }
            responseObj.setAl(al);
            return Mono.empty();
        }).subscribe();
        return Mono.just(responseObj);
    });
}
private Mono<DataLogGqlRequest> prepareInnerRequest(Mono<DataLogGqlRequest> itemRequest, int v1, int v2) {
    return itemRequest.flatMap(req -> {
        DataLogGqlRequest userRequest = new DataLogGqlRequest();
        userRequest.setBillDate(req.getBillDate());
        userRequest.setMdnNumber(req.getMdnNumber());
        userRequest.setCposition(v1 + "");
        userRequest.setPposition(v2 + "");
        return Mono.just(userRequest);
    });
}
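One way to avoid the inner subscribe() is to keep every call in a single chain and let WebFlux subscribe when the caller does. A minimal sketch, reusing the names from the snippet above; it assumes service.execute(logReq, dto) returns Mono<List<dataUsageLogs>> as implied by the list type, and that DownloadDataLog gets a hypothetical setResolvedLogs setter holding the resolved lists instead of a list of Monos:
public Mono<DownloadDataLog> getDownload(Token dto, Mono<DataLogRequest> request) {
    Mono<GraphQlCustomerResponse> profileResponse = customerProfileHandler.getMyUsageHomeMethods(dto, null);
    return Mono.zip(profileResponse, request)
            // switch to a Flux of per-MDN request batches
            .flatMapMany(tuple2 -> prepareUserRequest(getListOfMdns(tuple2.getT1()), tuple2.getT2()))
            // flatten each List<Mono<DataLogGqlRequest>> into individual requests
            .flatMap(Flux::fromIterable)
            // execute every call; nothing is subscribed here, the caller's subscription drives it
            .flatMap(logReq -> service.execute(logReq, dto))
            // gather the resolved List<dataUsageLogs> values into one list
            .collectList()
            .map(allLogs -> {
                DownloadDataLog responseObj = new DownloadDataLog();
                responseObj.setResolvedLogs(allLogs); // hypothetical setter for List<List<dataUsageLogs>>
                return responseObj;
            });
}
Here flatMap subscribes to each inner publisher as part of the outer chain, so the only subscriber is whoever consumes the returned Mono<DownloadDataLog>.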
I'm new to reactive programming and Spring WebFlux. I have a method that gets a value from Redis and expires the key under certain conditions, but the expire call never takes effect.
My current implementation:
private Mono<MyObject> getMyObjectFromCache(String url) {
    RMapReactive<String, String> rMap = redissonReactiveClient.getMap(url);
    return rMap.readAllMap()
            .flatMap(m -> rMap.remainTimeToLive()
                    .flatMap(ttl -> {
                        final long renewalThreshold = 60 * 60 * 1000;
                        if (ttl <= renewalThreshold) {
                            System.out.println("start expiring");
                            // it doesn't work without subscribe()
                            rMap.expire(2, TimeUnit.HOURS);
                        }
                        return Mono.just(JSONObject.parseObject(JSON.toJSONString(m), MyObject.class));
                    }));
}
The expire method returns Mono<Boolean>.
public Mono<MyObject> getMyObjInfo(String url) {
    // something else
    return getMyObjectFromCache(url).switchIfEmpty(Mono.defer(() -> getMyObjectFromRemoteService(url)));
}
CustomGatewayFilter
@Override
public Mono<Void> filter(ServerWebExchange exchange, GatewayFilterChain chain) {
    ServerHttpRequest request = exchange.getRequest();
    ServerHttpResponse response = exchange.getResponse();
    URI uri = request.getURI();
    return getMyObjInfo(uri.getPath())
            .flatMap(api -> {
                // something else
                return chain.filter(exchange.mutate().request(request).build());
            });
}
When I test the filter, it only prints "start expiring", but the key is not actually expired.
If I add subscribe or block it works, but obviously that is not a good idea; I shouldn't break the reactor chain.
Could I please have the correct way to write this?
Thanks
In reactive code you need to combine all async operations into a flow, chaining publishers (Mono/Flux) with the various reactive operators (assembly time) and then subscribing to it (subscription time). You are right that calling subscribe explicitly is a bad practice and should be avoided; Spring WebFlux subscribes to the provided flow behind the scenes.
In your code you are breaking the flow by not chaining rMap.expire(2, TimeUnit.HOURS);. You could rewrite the code like this:
private Mono<MyObject> getMyObjectFromCache(String url) {
    RMapReactive<String, String> rMap = redissonReactiveClient.getMap(url);
    return rMap.readAllMap()
            .flatMap(m -> rMap.remainTimeToLive()
                    .flatMap(ttl -> {
                        final long renewalThreshold = 60 * 60 * 1000;
                        if (ttl <= renewalThreshold) {
                            System.out.println("start expiring");
                            // expire is now part of the chain, so it runs when the flow is subscribed
                            return rMap.expire(2, TimeUnit.HOURS);
                        }
                        return Mono.just(false);
                    })
                    // thenReturn emits the mapped object once the TTL/expire step has completed
                    .thenReturn(JSONObject.parseObject(JSON.toJSONString(m), MyObject.class)));
}
I have the following fragment where I want to return the result of a Flux inside a ResponseEntity<Response>:
@GetMapping("/{id}")
public Mono<ResponseEntity<Response>> findByDocumentClient(@PathVariable("id") String document) {
    return Mono.just(new ResponseEntity<>(new Response(technomechanicalService.findByDocumentClient(document), HttpStatus.OK.value(), null),
            HttpStatus.OK))
            .onErrorResume(error -> {
                return Mono.just(new ResponseEntity<>(new Response(null, HttpStatus.BAD_REQUEST.value(), error.getMessage()),
                        HttpStatus.BAD_REQUEST));
            });
}
The Response object is as follows:
public class Response {

    private Object body;
    private Integer status;
    private String descStatus;

    public Response(Object body, Integer status, String descStatus) {
        this.body = body;
        this.status = status;
        this.descStatus = descStatus;
    }
}
When I consume the GET method from Postman, the service responds with the following:
{
    "body": {
        "scanAvailable": true,
        "prefetch": -1
    },
    "status": 200,
    "descStatus": null
}
Why does it generate this response? Why is the list of objects not returned?
It's because you are trying to code imperatively (traditional Java) and you are serializing a Mono, not the actual value returned from the database. You should be coding functionally, since Reactor/WebFlux uses this style of development.
A Mono<T> is a producer that produces elements when someone subscribes to it. The subscriber is the one that started the call, in this case the client/browser.
That's why you need to return a Mono<ResponseEntity>: when the client subscribes, it will emit a ResponseEntity.
So let's look at your code:
@GetMapping("/{id}")
public Mono<ResponseEntity<Response>> findByDocumentClient(@PathVariable("id") String document) {
    return Mono.just(new ResponseEntity<>(new Response(technomechanicalService.findByDocumentClient(document), HttpStatus.OK.value(), null),
            HttpStatus.OK))
            .onErrorResume(error -> {
                return Mono.just(new ResponseEntity<>(new Response(null, HttpStatus.BAD_REQUEST.value(), error.getMessage()),
                        HttpStatus.BAD_REQUEST));
            });
}
The first thing you do is put your response straight into a Mono using Mono#just. In WebFlux, a Mono is something that can emit something, and as soon as you put something in one you are also telling the server that it can freely change which thread performs the execution. So we basically want to get into a Mono as quickly as possible so we can leverage WebFlux's thread-agnostic abilities.
Then this line:
technomechanicalService.findByDocumentClient(document)
returns a Mono<T> and you place that in your Response body. So it tries to serialize that into JSON; while you think it takes its internal value and serializes that, it is actually serializing the Mono itself.
So let's rewrite your code (I'm leaving out the error handling for now since I'm writing this on mobile):
@GetMapping("/{id}")
public Mono<ServerResponse> findByDocumentClient(@PathVariable("id") String document) {
    // We place our path variable in a mono so we can leverage
    // webflux thread agnostic abilities
    return Mono.just(document)
            // We access the value by flatMapping and do our call to
            // the database which will return a Mono<T>
            .flatMap(doc -> technomechanicalService.findByDocumentClient(doc)
                    // We flatmap again over the db response to a ServerResponse
                    // with the db value as the body
                    .flatMap(value -> ServerResponse.ok().body(value)));
}
All this is super basic Reactor/WebFlux stuff. I assume this is your first time using WebFlux; if so, I highly recommend going through the Reactor getting-started guide to learn how the basics work, because otherwise you will have a very hard time with Reactor, and later on with understanding WebFlux.
Agree with @Toerktumlare's answer, quite comprehensive.
@Juan David Báez Ramos, based on your answer (it would have been better as a comment), it seems what you want is to put the technomechanicalService.findByDocumentClient(document) result as the body of a Response object.
If so, you can use the Flux API's collectList() operator.
Example code:
@GetMapping("/{id}")
public Mono<ResponseEntity<Response>> findByDocumentClient(@PathVariable("id") String document) {
    return technomechanicalService.findByDocumentClient(document)
            .collectList()
            .map(listOfDocuments -> {
                return new ResponseEntity<>(
                        new Response(listOfDocuments, HttpStatus.OK.value(), null), HttpStatus.OK);
            })
            .onErrorResume(error -> {
                return Mono.just(new ResponseEntity<>(
                        new Response(null, HttpStatus.BAD_REQUEST.value(), error.getMessage()),
                        HttpStatus.BAD_REQUEST));
            });
}
I am developing a prototype for a new project. The idea is to provide a reactive Spring Boot microservice to bulk index documents in Elasticsearch. Elasticsearch provides a High Level REST Client which offers an async method to bulk process indexing requests. Async delivers callbacks using listeners, as mentioned here. The callbacks receive index responses (per request) in batches. I am trying to send these responses back to the client as a Flux. I have come up with something based on this blog post.
Controller
@RestController
public class AppController {

    @SuppressWarnings("unchecked")
    @RequestMapping(value = "/test3", method = RequestMethod.GET)
    public Flux<String> index3() {
        ElasticAdapter es = new ElasticAdapter();
        JSONObject json = new JSONObject();
        json.put("TestDoc", "Stack123");
        Flux<String> fluxResponse = es.bulkIndex(json);
        return fluxResponse;
    }
}
ElasticAdapter
@Component
class ElasticAdapter {
    String indexName = "test2";
    private final RestHighLevelClient client;
    private final ObjectMapper mapper;
    private int processed = 1;

    Flux<String> bulkIndex(JSONObject doc) {
        return bulkIndexDoc(doc)
                .doOnError(e -> System.out.print("Unable to index {}" + doc + e));
    }

    private Flux<String> bulkIndexDoc(JSONObject doc) {
        return Flux.create(sink -> {
            try {
                doBulkIndex(doc, bulkListenerToSink(sink));
            } catch (JsonProcessingException e) {
                sink.error(e);
            }
        });
    }

    private void doBulkIndex(JSONObject doc, BulkProcessor.Listener listener) throws JsonProcessingException {
        System.out.println("Going to submit index request");
        BiConsumer<BulkRequest, ActionListener<BulkResponse>> bulkConsumer =
                (request, bulkListener) ->
                        client.bulkAsync(request, RequestOptions.DEFAULT, bulkListener);
        BulkProcessor.Builder builder =
                BulkProcessor.builder(bulkConsumer, listener);
        builder.setBulkActions(10);
        BulkProcessor bulkProcessor = builder.build();
        // Submitting 5,000 index requests (repeating the same JSON)
        for (int i = 0; i < 5000; i++) {
            IndexRequest indexRequest = new IndexRequest(indexName, "person", i + 1 + "");
            String json = doc.toJSONString();
            indexRequest.source(json, XContentType.JSON);
            bulkProcessor.add(indexRequest);
        }
        System.out.println("Submitted all docs");
    }

    private BulkProcessor.Listener bulkListenerToSink(FluxSink<String> sink) {
        return new BulkProcessor.Listener() {
            @Override
            public void beforeBulk(long executionId, BulkRequest request) {
            }

            @SuppressWarnings("unchecked")
            @Override
            public void afterBulk(long executionId, BulkRequest request, BulkResponse response) {
                for (BulkItemResponse bulkItemResponse : response) {
                    JSONObject json = new JSONObject();
                    json.put("id", bulkItemResponse.getResponse().getId());
                    json.put("status", bulkItemResponse.getResponse().getResult());
                    sink.next(json.toJSONString());
                    processed++;
                }
                if (processed >= 5000) {
                    sink.complete();
                }
            }

            @Override
            public void afterBulk(long executionId, BulkRequest request, Throwable failure) {
                failure.printStackTrace();
                sink.error(failure);
            }
        };
    }

    public ElasticAdapter() {
        // Logic to initialize Elasticsearch Rest Client
    }
}
I used FluxSink to create the Flux of responses to send back to the client. At this point, I have no idea whether this is correct or not.
My expectation is that the calling client should receive the responses in batches of 10 (because the bulk processor processes them in batches of 10: builder.setBulkActions(10);). I tried to consume the endpoint using the Spring WebFlux WebClient, but I was unable to get it working. This is what I tried:
WebClient
public class FluxClient {

    public static void main(String[] args) {
        WebClient client = WebClient.create("http://localhost:8080");
        Flux<String> responseFlux = client.get()
                .uri("/test3")
                .retrieve()
                .bodyToFlux(String.class);
        responseFlux.subscribe(System.out::println);
    }
}
Nothing is printed on the console as I expected. I tried using System.out.println(responseFlux.blockFirst());, and it prints all the responses as a single batch at the end, not in batches.
If my approach is correct, what is the correct way to consume it? In the solution I have in mind, this client will reside in another web app.
Notes: My understanding of the Reactor API is limited. The version of Elasticsearch used is 6.8.
So I made the following changes to your code.
In ElasticAdapter:
public Flux<Object> bulkIndex(JSONObject doc) {
    return bulkIndexDoc(doc)
            .subscribeOn(Schedulers.elastic(), true)
            .doOnError(e -> System.out.print("Unable to index {}" + doc + e));
}
Invoked subscribeOn(Scheduler, requestOnSeparateThread) on the Flux. I got to know about it from https://github.com/spring-projects/spring-framework/issues/21507
In FluxClient,
Flux<String> responseFlux = client.get()
        .uri("/test3")
        .headers(httpHeaders -> {
            httpHeaders.set("Accept", "text/event-stream");
        })
        .retrieve()
        .bodyToFlux(String.class);
responseFlux.delayElements(Duration.ofSeconds(1)).subscribe(System.out::println);
Added "Accept" header as "text/event-stream" and delayed Flux elements.
With the above changes, was able to get the response in real time from the server.
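As a side note, the same streaming behaviour can also be requested from the server side by declaring the event-stream media type on the mapping itself, so the client does not have to set the Accept header. A minimal sketch assuming the controller from the question, with the ElasticAdapter injected as a Spring bean instead of instantiated with new:
@RestController
public class AppController {

    private final ElasticAdapter es;

    public AppController(ElasticAdapter es) {
        this.es = es;
    }

    // produces = text/event-stream tells WebFlux to flush each emitted String
    // as a Server-Sent Event, so the client sees bulk responses as they arrive
    // instead of one aggregated body at the end
    @GetMapping(value = "/test3", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public Flux<String> index3() {
        JSONObject json = new JSONObject();
        json.put("TestDoc", "Stack123");
        return es.bulkIndex(json);
    }
}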
I have a Play application that handles WebSocket requests. The routes file contains this line:
GET /testsocket controllers.HomeController.defaultRoomSocket
An already working, synchronous version looks like this: (adapted from 2.7.x docs)
public WebSocket defaultRoomSocket() {
    return WebSocket.Text.accept(
            request -> ActorFlow.actorRef(MyWebSocketActor::props, actorSystem, materializer));
}
As stated in https://www.playframework.com/documentation/2.7.x/JavaWebSockets#Accepting-a-WebSocket-asynchronously I changed the signature to
public CompletionStage<WebSocket> defaultRoomSocket() {
    // returning a CompletionStage here, using the "ask pattern"
    // to get the needed Flow from another Actor
}
From here I run into the following problem:
Cannot use a method returning java.util.concurrent.CompletionStage[play.mvc.WebSocket] as a Handler for requests
Furthermore, WebSocket has no type parameter, as the documentation suggests. What is the appropriate way to accept a WebSocket request asynchronously?
The documentation indeed needs to be updated; I think some bits were missed in the refactoring of the websockets in #5055.
To get async processing, you should use the acceptOrResult method, whose callback returns a CompletionStage instead of a Flow. It can then resolve to either a Result or an Akka Flow, expressed with a functional programming helper (F.Either). In fact, here's how the accept method is implemented:
public WebSocket accept(Function<Http.RequestHeader, Flow<In, Out, ?>> f) {
    return acceptOrResult(
            request -> CompletableFuture.completedFuture(F.Either.Right(f.apply(request))));
}
As you can see, all it does is call the async version with a completedFuture.
To fully make it async and get to what I think you're trying to achieve, you'd do something like this:
public WebSocket ws() {
    return WebSocket.Json.acceptOrResult(request -> {
        if (sameOriginCheck(request)) {
            final CompletionStage<Flow<JsonNode, JsonNode, NotUsed>> future = wsFutureFlow(request);
            final CompletionStage<Either<Result, Flow<JsonNode, JsonNode, ?>>> stage = future.thenApply(Either::Right);
            return stage.exceptionally(this::logException);
        } else {
            return forbiddenResult();
        }
    });
}

@SuppressWarnings("unchecked")
private CompletionStage<Flow<JsonNode, JsonNode, NotUsed>> wsFutureFlow(Http.RequestHeader request) {
    long id = request.asScala().id();
    UserParentActor.Create create = new UserParentActor.Create(Long.toString(id));
    return ask(userParentActor, create, t).thenApply((Object flow) -> {
        final Flow<JsonNode, JsonNode, NotUsed> f = (Flow<JsonNode, JsonNode, NotUsed>) flow;
        return f.named("websocket");
    });
}

private CompletionStage<Either<Result, Flow<JsonNode, JsonNode, ?>>> forbiddenResult() {
    final Result forbidden = Results.forbidden("forbidden");
    final Either<Result, Flow<JsonNode, JsonNode, ?>> left = Either.Left(forbidden);
    return CompletableFuture.completedFuture(left);
}

private Either<Result, Flow<JsonNode, JsonNode, ?>> logException(Throwable throwable) {
    logger.error("Cannot create websocket", throwable);
    Result result = Results.internalServerError("error");
    return Either.Left(result);
}
(This is taken from the play-java-websocket-example, which might be of interest.)
As you can see, it first goes through a few stages before returning either a WebSocket connection or an HTTP status.
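For completeness, here is a minimal sketch of how the Text socket from the question could be accepted asynchronously. Note that the controller method still returns WebSocket, not CompletionStage<WebSocket>; the asynchrony lives inside acceptOrResult. authorize(request) is a hypothetical async check returning CompletionStage<Boolean>, standing in for whatever ask-pattern call produces the decision:
public WebSocket defaultRoomSocket() {
    return WebSocket.Text.acceptOrResult(request ->
            // authorize(request) is hypothetical; replace it with your own async lookup
            authorize(request).thenApply(allowed -> {
                if (allowed) {
                    Flow<String, String, ?> flow =
                            ActorFlow.<String, String>actorRef(MyWebSocketActor::props, actorSystem, materializer);
                    return F.Either.<Result, Flow<String, String, ?>>Right(flow);
                }
                return F.Either.<Result, Flow<String, String, ?>>Left(Results.forbidden("not allowed"));
            }));
}
Results.forbidden here plays the same role as the forbiddenResult() helper above.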
I want to make an asynchronous REST call, for which I'm using Spring WebClient and getting back a Mono. I'm also doing some database calls in parallel, but they can't be done reactively for some reason.
Map<String, Object> models = new HashMap<>();
Mono<User> users = this.webClient...;
users.map(resp -> new UserState(userRequest, resp))
        .subscribe(response -> {
            models.put("userState", response);
        });
Iterable<Product> messages = this.productRepository.findAll();
models.put("products", messages);
// Wait for users.subscribe to finish <<<<<<<<<<<<< HERE
return new ModelAndView("messages/list", models);
How do I wait for the subscribe to finish before returning the ModelAndView? This would have been easy if I were using a Future, where I can call get() whenever I want.
You can wrap the blocking call in a Mono executed on a separate scheduler, zip it with the Mono containing UserState data and transform their combination into a Mono<ModelAndView> (which can be returned from Spring controller methods). The calls will be executed in parallel, results will be combined when both calls are completed.
You can define a single bounded scheduler per application specifically for blocking calls and provide it as a constructor argument to any class that makes blocking calls.
The code will look as follows:
@Configuration
class SchedulersConfig {

    @Bean
    Scheduler parallelScheduler(@Value("${blocking-thread-pool-size}") int threadsCount) {
        // Schedulers.parallel() takes no arguments; newParallel lets us size the pool explicitly
        return Schedulers.newParallel("blocking-pool", threadsCount);
    }
}
@RestController
class Controller {

    final Scheduler parallelScheduler;
    ...

    Mono<User> userResponse = // webClient...
    Mono<Iterable<Product>> productsResponse = Mono.fromSupplier(productRepository::findAll)
            .subscribeOn(parallelScheduler);

    return Mono.zip(userResponse, productsResponse, (user, products) ->
            new ModelAndView("messages/list",
                    ImmutableMap.of(
                            "userState", new UserState(userRequest, user),
                            "products", products
                    ))
    );
}
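As a side note, if the Reactor version in use provides it (3.3+), a bounded elastic scheduler arguably matches the "bounded scheduler for blocking calls" idea more closely than a parallel one, which is intended for CPU-bound work. A possible variant of the configuration, reusing the same (assumed) property name:
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import reactor.core.scheduler.Scheduler;
import reactor.core.scheduler.Schedulers;

@Configuration
class BlockingSchedulerConfig {

    @Bean
    Scheduler blockingScheduler(@Value("${blocking-thread-pool-size}") int threadsCount) {
        // caps both the number of threads and the number of queued tasks,
        // which suits wrapping blocking repository calls
        return Schedulers.newBoundedElastic(threadsCount, 10_000, "blocking-pool");
    }
}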
Update based on the comment:
If you just need to execute the HTTP call asynchronously and then join it with the database results, you can do the following:
Map<String, Object> models = new HashMap<>();
Mono<User> userMono = webClient...;
CompletableFuture<User> userFuture = userMono.toFuture();
Iterable<Product> messages = productRepository.findAll();
User user = userFuture.join();
models.put("products", messages);
models.put("userState", new UserState(userRequest, user));
return new ModelAndView("messages/list", models);