The idea is that for a REST API I have to return the updated data from the server: e.g., my server method should return Single<Data>, but the DB layer only exposes a method that returns Completable, which results in the following code:
public Single<Player> updateShirtNumber(int number) {
    Player player = new Player();
    player.setShirtNumber(number);
    return playersDao.savePlayerCompletable(player).andThen(Single.just(player));
}
How can I avoid this andThen(Single.just(player)) kind of thing?
You can't avoid this unless you want to (and can) change the DB API (and, basically, not use the reactive approach in the first place, but in that case you don't need RxJava).
Think of savePlayerCompletable as an asynchronous method. Before you return the value from the REST API you want to make sure that the save operation has finished. That's what andThen does.
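A minimal sketch of that ordering (an illustration added here, not part of the original answer, assuming RxJava 2/3 and the Player/playersDao types from the question): andThen subscribes to the Single only after the Completable has completed, so the emitted player is guaranteed to have been saved.
Player player = new Player();
player.setShirtNumber(10);

playersDao.savePlayerCompletable(player)                          // the DB write runs first
        .doOnComplete(() -> System.out.println("save finished"))  // fires before the Single emits
        .andThen(Single.just(player))                             // only then is the player emitted
        .subscribe(p -> System.out.println("returned shirt " + p.getShirtNumber()));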
I am trying to get a JSON string from a Mono. When I use the block() method to get the object it works fine, but when I use map/flatMap, I don't see the following lines of code being executed. And I can see that the account Mono is not empty.
private String getJsonString(Mono<Account> account) {
    account.map(it -> {
        // call is not coming here
        String json = mapper.writeValueAsString(it);
        System.out.println(json);
        return json;
    });
}
Am I doing anything wrong here?
If you read the official documentation, you will see that:
Nothing happens until you subscribe
Now, to understand who the subscriber is in a Spring Boot WebFlux based microservice, have a look at this Stack Overflow question.
Now, if you think you can mix blocking and reactive implementations in the same service, unfortunately it doesn't work like that. To see why, you have to understand the event-loop model that Reactor works on. Calling a blocking method at any point in the flow does you no good and is equivalent to using the old blocking spring-web approach, because the thread in which the request is being processed gets blocked while it waits for the outcome of the I/O operation / network call.
Coming to your question in the comment:
But when I use flatMap in my controller to call the handler method, it goes to the service method with an Object, not a Mono? serviceRequest --> Mono --> Object: how does this work?
Let me give you a simple example for this:
Suppose you have an employee application, where you want to fetch details of an employee for a given id.
Now in your controller, you will have an endpoint like this:
@GetMapping("/{tenant}/api/employee/{id}")
public Mono<ResponseEntity<EmployeeEntity>> getEmployeeDetails(@PathVariable("id") Long employeeId) {
    return employeeService.getDetails(employeeId)
            .map(ResponseEntity::ok);
}
Now in your service,
public Mono<EmployeeEntity> getDetails(Long employeeId) {
    return employeeRepository.findById(employeeId);
}
And your repository will look like this:
@Repository
public interface EmployeeRepository extends ReactiveCrudRepository<EmployeeEntity, Long> {
}
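To tie this back to the original getJsonString: a minimal sketch (my illustration, not part of the original answer, assuming a Jackson ObjectMapper named mapper) would return the Mono instead of mapping it and discarding the result, so that the framework can do the subscribing:
private Mono<String> getJsonString(Mono<Account> account) {
    return account.map(it -> {
        try {
            return mapper.writeValueAsString(it); // now runs once something subscribes
        } catch (JsonProcessingException e) {
            throw new IllegalStateException("could not serialize Account", e);
        }
    });
}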
Please bear with me, I don't usually use Spring and haven't used newer versions of Java (and by newer I mean anything past probably 1.4).
Anyway, I have an issue where I have to make REST calls to do a search using multiple parallel requests. I've been looking around online and I see you can use CompletableFuture.
So I created my method to get the objects I need from the REST call, like:
@Async
public CompletableFuture<QueryObject[]> queryObjects(String url) {
    QueryObject[] objects = restTemplate.getForObject(url, QueryObject[].class);
    return CompletableFuture.completedFuture(objects);
}
Now I need to call that with something like:
CompletableFuture<QueryObject[]> page1 = queryController.queryObjects("http://myrest.com/ids=[abc, def, ghi]");
CompletableFuture<QueryObject[]> page2 = queryController.queryObjects("http://myrest.com/ids=[jkl, mno, pqr]");
The problem is that each call can only take three ids at a time, and there could be a variable number of ids in the list. So I parse the id list and create query strings like the above. The issue I'm having is that while I can make the queries, I don't have separate objects that I can then call CompletableFuture.allOf on.
Can anyone tell me the right way to do this? I've been at it for a while now and I'm not getting any further.
Happy to provide more info if the above isn't sufficient.
You are not getting any benefit from CompletableFuture the way you're using it right now.
The restTemplate method you're using is synchronous, so it has to finish and return a result before proceeding. Because of that, wrapping the final result in a CompletableFuture doesn't cause it to be executed asynchronously (nor in parallel). You just wrap a response that you have already retrieved.
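If you do want to keep the blocking RestTemplate, one common workaround (a sketch I'm adding, not part of the original answer; QueryObject and restTemplate are taken from the question) is to off-load each call onto your own executor with CompletableFuture.supplyAsync and then gather the results with allOf:
ExecutorService pool = Executors.newFixedThreadPool(4);

CompletableFuture<QueryObject[]> page1 = CompletableFuture.supplyAsync(
        () -> restTemplate.getForObject("http://myrest.com/ids=[abc, def, ghi]", QueryObject[].class), pool);
CompletableFuture<QueryObject[]> page2 = CompletableFuture.supplyAsync(
        () -> restTemplate.getForObject("http://myrest.com/ids=[jkl, mno, pqr]", QueryObject[].class), pool);

CompletableFuture.allOf(page1, page2).join();   // wait for all pages
List<QueryObject> all = Stream.of(page1, page2)
        .map(CompletableFuture::join)           // safe: every future is already done
        .flatMap(Arrays::stream)
        .collect(Collectors.toList());
pool.shutdown();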
If you want to benefit from asynchronous execution, you can use, for example, the AsyncRestTemplate or the WebClient.
A simplified code example:
public ListenableFuture<ResponseEntity<QueryObject[]>> queryForObjects(String url) {
    return asyncRestTemplate.getForEntity(url, QueryObject[].class);
}

public List<QueryObject> queryForList(String[] elements) {
    // fire all requests first so they actually run concurrently...
    List<CompletableFuture<ResponseEntity<QueryObject[]>>> futures = Arrays.stream(elements)
            .map(element -> queryForObjects("http://myrest.com/ids=[" + element + "]"))
            .map(ListenableFuture::completable)
            .collect(Collectors.toList());

    // ...then wait for the results and flatten them
    return futures.stream()
            .map(CompletableFuture::join)
            .filter(Objects::nonNull)
            .map(HttpEntity::getBody)
            .flatMap(Arrays::stream)
            .collect(Collectors.toList());
}
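A possible usage sketch (my addition, not from the original answer) for the "three ids at a time" requirement: split the id list into chunks of three, build one element per chunk, and pass the chunks to queryForList.
List<String> ids = Arrays.asList("abc", "def", "ghi", "jkl", "mno", "pqr", "stu");
List<String> chunks = new ArrayList<>();
for (int i = 0; i < ids.size(); i += 3) {
    // each chunk becomes one "a, b, c" element and therefore one request
    chunks.add(String.join(", ", ids.subList(i, Math.min(i + 3, ids.size()))));
}
List<QueryObject> results = queryForList(chunks.toArray(new String[0]));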
While working with Spring WebFlux, I'm trying to insert some data into the Realm Object Server, which interacts with Java apps via a REST API. Basically I have a set of students, each of whom has a set of subjects, and my objective is to persist those subjects in a non-blocking manner. I use one microservice, exposed via a REST endpoint, that provides me with a Flux of student roll numbers; for that Flux I use another microservice, exposed via a REST endpoint, that gets me the Flux of subjects; and each of those subjects I want to persist in the Realm server via yet another REST endpoint. I wanted to make this all non-blocking, which is why I wanted my code to look like this:
void foo() {
    studentService.getAllRollnumbers().flatMap(rollnumber -> {
        return subjectDirectory.getAllSubjects().map(subject -> {
            return dbService.addSubject(subject);
        });
    });
}
But this doesn't work for some reason. Once I call block() on things, however, they fall into place, like this:
Flux<Done> foo() {
    List<Integer> rollNumbers = studentService.getAllRollnumbers().collectList().block();
    rollNumbers.forEach(rollNumber -> {
        List<Subject> subjects = subjectDirectory.getAllSubjects().collectList().block();
        subjects.forEach(subject -> { dbService.addSubject(subject).block(); });
    });
    return Flux.just(new NotUsed());
}
getAllRollnumbers() returns a Flux of integers.
getAllSubjects() returns a Flux of Subject.
addSubject() returns a Mono of a DBResponse POJO.
What I can gather is that the thread executing this function expires before much of it gets triggered. Please help me make this code work in an async, non-blocking manner.
You are not subscribing to the Publisher at all in the first instance; that is why it is not executing. You can do this:
studentService.getAllRollnumbers().flatMap(rollnumber -> {
    return subjectDirectory.getAllSubjects().flatMap(subject -> {
        return dbService.addSubject(subject); // flatMap, so the inner Mono is subscribed as well
    });
}).subscribe();
However, it is usually better to let the framework take care of the subscription, but without seeing the rest of the code I can't advise further.
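For illustration, a hedged sketch (my addition; it assumes a WebFlux controller and the service names from the question) of letting the framework subscribe by returning the Flux from an endpoint:
@PostMapping("/subjects/sync")
public Flux<DBResponse> syncSubjects() {
    return studentService.getAllRollnumbers()
            .flatMap(rollnumber -> subjectDirectory.getAllSubjects())
            .flatMap(subject -> dbService.addSubject(subject)); // WebFlux subscribes when it serves the request
}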
I'm kind of forced to switch to reactive programming (and in a short time frame), using WebFlux, and I'm having a really hard time understanding it. Maybe because of the lack of examples, or because I've never done functional programming.
Anyway, my question is: where do I have to use Mono/Flux, and where can I work with normal objects? E.g., my controller is waiting for a @Valid User object; should that be a @Valid Mono, or something like Mono<@Valid User>? If, let's say, it were just a User object, I'd pass it to my service layer, and I want to encode the password before saving it to the reactive MongoDB. Should I write:
user.setPassword(...);
return reactiveMongoDbRepository.save(user); // returns Mono<User>, which is returned by the controller to the view
Or should it be something like:
return Mono.just(user).map.flatmap(setPasswordSomewhereInThisHardToFollowChain).then.whatever.doOnSuccess(reactiveMongoDbRepository::save);
In other words, am I forced to use this pipeline style EVERYWHERE to maintain reactiveness, or is doing some steps the imperative way, like unwrapping the object, working on it, and wrapping it back, OK?
I know my question may seem silly, but I don't have the big picture at all; reading books about it hasn't really helped yet, so please be gentle with me. :)
Use pipelining when you require sequential, asynchronous and lazy execution. In all other cases (when you are running non-blocking code) you're free to choose any approach, and it's generally better to use the simplest one.
Sequential non-blocking code can be organised into plain functions that you integrate into the reactive pipeline using map/filter/doOnNext/... operators.
For example, consider the following Order price calculation code.
class PriceCalculator {
    private final Map<ProductCode, Price> prices;

    PriceCalculator(Map<ProductCode, Price> prices) {
        this.prices = prices;
    }

    PricedOrder calculatePrice(Order order) { // doesn't deal with Mono/Flux stuff
        double price = order.orderLines.stream()
                .map(orderLine -> prices.get(orderLine.productCode))
                .mapToDouble(Price::doubleValue)
                .sum();
        return new PricedOrder(order, price);
    }
}
class PricingController {
    public Mono<PricedOrder> getPricedOrder(ServerRequest request) {
        OrderId orderId = new OrderId(request.pathVariable("orderId"));
        Mono<Order> order = orderRepository.get(orderId);
        return order.map(priceCalculator::calculatePrice);
    }
}
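Applied to your password example, a rough sketch (my addition, assuming a passwordEncoder bean plus the User and reactiveMongoDbRepository types from your question): keep the imperative password-encoding step in a plain method and plug it into the pipeline with map, while the asynchronous save stays a flatMap.
public Mono<User> register(User user) {
    return Mono.just(user)
            .map(this::encodePassword)                 // plain, CPU-only, non-blocking step
            .flatMap(reactiveMongoDbRepository::save); // async step stays in the pipeline
}

private User encodePassword(User user) {
    user.setPassword(passwordEncoder.encode(user.getPassword()));
    return user;
}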
I have a webapp in which I have to return the results of a MongoDB find() from my Java back-end to the front-end.
I am using the async Java driver, and the only way I can think of to return the results from Mongo is something like this:
public String getDocuments(){
    ...
    collection.find(query).map(Document::toJson)
        .into(new HashSet<String>(), new SingleResultCallback<HashSet<String>>() {
            @Override
            public void onResult(HashSet<String> strings, Throwable throwable) {
                // here I have to get all the Json Documents in the set,
                // make a whole json string and wake the main thread
            }
        });
    // here I have to put the main thread to wait until I get the data in
    // the onResult() method so I can return the string back to the front-end
    ...
    return jsonString;
}
Is this assumption right, or is there another way to do it?
Asynchronous APIs (any API based on callbacks, not necessarily MongoDB) can be a true blessing for multithreaded applications. But to really benefit from them, you need to design your whole application architecture in an asynchronous fashion. This is not always feasible, especially when it is supposed to fit into a given framework which isn't built on callbacks.
So sometimes (like in your case) you just want to use an asynchronous API in a synchronous fashion. In that case, you can use the class CompletableFuture.
This class provides (among others) the two methods T get() and complete(T value). The get method blocks until complete is called to provide the return value (should complete be called before get, get returns immediately with the provided value).
public String getDocuments(){
    ...
    CompletableFuture<String> result = new CompletableFuture<>(); // <-- create an empty, uncompleted Future
    collection.find(query).map(Document::toJson)
        .into(new HashSet<String>(), new SingleResultCallback<HashSet<String>>() {
            @Override
            public void onResult(HashSet<String> strings, Throwable throwable) {
                // here I have to get all the Json Documents in the set and
                // make a whole json string
                result.complete(wholeJsonString); // <-- resolves the future
            }
        });
    return result.get(); // <-- blocks until result.complete is called
}
The get() method of CompletableFuture also has an overload with a timeout parameter. I recommend using it to prevent your program from accumulating hanging threads when the callback is never called for whatever reason. It is also a good idea to wrap your whole callback body in a try block and complete the future in a finally (or catch) block, to make sure the result always gets resolved even when there is an unexpected error during your callback.
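Putting those two recommendations together, a rough sketch (my addition, not part of the original answer; it resolves errors through the callback's Throwable and a catch block, and uses the timeout overload of get; collection and query come from the question's elided context):
public String getDocuments() throws Exception { // get(timeout, unit) throws checked exceptions
    CompletableFuture<String> result = new CompletableFuture<>();
    collection.find(query).map(Document::toJson)
        .into(new HashSet<String>(), (strings, throwable) -> {
            try {
                if (throwable != null) {
                    result.completeExceptionally(throwable);
                } else {
                    result.complete("[" + String.join(",", strings) + "]");
                }
            } catch (Exception e) {
                result.completeExceptionally(e); // the future is always resolved
            }
        });
    return result.get(30, TimeUnit.SECONDS); // fail instead of hanging forever
}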
Yes, you're right.
That's the correct behaviour of the Mongo async driver (see MongoIterable.into).
However, why don't you use the sync driver in this situation? Is there any reason to use the async method?
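For comparison, a minimal sketch (my addition) of what that suggestion would look like with the synchronous driver, assuming a com.mongodb.client.MongoCollection<Document> named collection and the same query:
public String getDocuments() {
    Set<String> strings = collection.find(query)
            .map(Document::toJson)
            .into(new HashSet<String>()); // blocks until the cursor is drained
    return "[" + String.join(",", strings) + "]";
}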