Reactive Spring Boot API wrapping Elasticsearch's async bulk indexing - java

I am developing a prototype for a new project. The idea is to provide a Reactive Spring Boot microservice to bulk index documents in Elasticsearch. Elasticsearch provides a High Level REST Client which offers an async method to bulk process indexing requests. The async method delivers callbacks using listeners, as mentioned here. The callbacks receive index responses (per request) in batches. I am trying to send this response back to the client as a Flux. I have come up with something based on this blog post.
Controller
@RestController
public class AppController {

    @SuppressWarnings("unchecked")
    @RequestMapping(value = "/test3", method = RequestMethod.GET)
    public Flux<String> index3() {
        ElasticAdapter es = new ElasticAdapter();
        JSONObject json = new JSONObject();
        json.put("TestDoc", "Stack123");
        Flux<String> fluxResponse = es.bulkIndex(json);
        return fluxResponse;
    }
}
ElasticAdapter
@Component
class ElasticAdapter {

    String indexName = "test2";
    private final RestHighLevelClient client;
    private final ObjectMapper mapper;
    private int processed = 1;

    Flux<String> bulkIndex(JSONObject doc) {
        return bulkIndexDoc(doc)
                .doOnError(e -> System.out.print("Unable to index {}" + doc + e));
    }

    private Flux<String> bulkIndexDoc(JSONObject doc) {
        return Flux.create(sink -> {
            try {
                doBulkIndex(doc, bulkListenerToSink(sink));
            } catch (JsonProcessingException e) {
                sink.error(e);
            }
        });
    }

    private void doBulkIndex(JSONObject doc, BulkProcessor.Listener listener) throws JsonProcessingException {
        System.out.println("Going to submit index request");
        BiConsumer<BulkRequest, ActionListener<BulkResponse>> bulkConsumer =
                (request, bulkListener) ->
                        client.bulkAsync(request, RequestOptions.DEFAULT, bulkListener);
        BulkProcessor.Builder builder =
                BulkProcessor.builder(bulkConsumer, listener);
        builder.setBulkActions(10);
        BulkProcessor bulkProcessor = builder.build();
        // Submitting 5,000 index requests (repeating the same JSON)
        for (int i = 0; i < 5000; i++) {
            IndexRequest indexRequest = new IndexRequest(indexName, "person", i + 1 + "");
            String json = doc.toJSONString();
            indexRequest.source(json, XContentType.JSON);
            bulkProcessor.add(indexRequest);
        }
        System.out.println("Submitted all docs");
    }

    private BulkProcessor.Listener bulkListenerToSink(FluxSink<String> sink) {
        return new BulkProcessor.Listener() {

            @Override
            public void beforeBulk(long executionId, BulkRequest request) {
            }

            @SuppressWarnings("unchecked")
            @Override
            public void afterBulk(long executionId, BulkRequest request, BulkResponse response) {
                for (BulkItemResponse bulkItemResponse : response) {
                    JSONObject json = new JSONObject();
                    json.put("id", bulkItemResponse.getResponse().getId());
                    json.put("status", bulkItemResponse.getResponse().getResult());
                    sink.next(json.toJSONString());
                    processed++;
                }
                if (processed >= 5000) {
                    sink.complete();
                }
            }

            @Override
            public void afterBulk(long executionId, BulkRequest request, Throwable failure) {
                failure.printStackTrace();
                sink.error(failure);
            }
        };
    }

    public ElasticAdapter() {
        // Logic to initialize the Elasticsearch REST client
    }
}
I used FluxSink to create the Flux of responses to send back to the client. At this point, I have no idea whether this is correct or not.
My expectation is that the calling client should receive the responses in batches of 10 (because the bulk processor processes them in batches of 10: builder.setBulkActions(10);). I tried to consume the endpoint using the Spring WebFlux WebClient, but was unable to get it working. This is what I tried:
WebClient
public class FluxClient {

    public static void main(String[] args) {
        WebClient client = WebClient.create("http://localhost:8080");
        Flux<String> responseFlux = client.get()
                .uri("/test3")
                .retrieve()
                .bodyToFlux(String.class);
        responseFlux.subscribe(System.out::println);
    }
}
Nothing is printing on the console as I expected. I tried using System.out.println(responseFlux.blockFirst());. It prints all the responses as a single batch at the end, and not in batches at all.
If my approach is correct, what is the correct way to consume it? For the solution I have in mind, this client will reside in another webapp.
Note: my understanding of the Reactor API is limited. The Elasticsearch version used is 6.8.

So I made the following changes to your code.
In ElasticAdapter,
public Flux<Object> bulkIndex(JSONObject doc) {
    return bulkIndexDoc(doc)
            .subscribeOn(Schedulers.elastic(), true)
            .doOnError(e -> System.out.print("Unable to index {}" + doc + e));
}
Invoked subscribeOn(Scheduler, requestOnSeparateThread) on the Flux. I got to know about it from https://github.com/spring-projects/spring-framework/issues/21507.
In FluxClient,
Flux<String> responseFlux = client.get()
        .uri("/test3")
        .headers(httpHeaders -> {
            httpHeaders.set("Accept", "text/event-stream");
        })
        .retrieve()
        .bodyToFlux(String.class);

responseFlux.delayElements(Duration.ofSeconds(1)).subscribe(System.out::println);
Added "Accept" header as "text/event-stream" and delayed Flux elements.
With the above changes, was able to get the response in real time from the server.
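As a server-side alternative to setting the Accept header on the client, the same streaming behaviour can usually be achieved by declaring the event-stream media type on the controller mapping. A minimal sketch of that variant, assuming the original Flux<String> signature from the question and org.springframework.http.MediaType on the classpath:

@RestController
public class AppController {

    // Hypothetical variant: with text/event-stream declared as the produced media type,
    // WebFlux streams each element as a Server-Sent Event, so the plain
    // bodyToFlux(String.class) client receives items as they are emitted.
    @RequestMapping(value = "/test3", method = RequestMethod.GET,
            produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public Flux<String> index3() {
        ElasticAdapter es = new ElasticAdapter();
        JSONObject json = new JSONObject();
        json.put("TestDoc", "Stack123");
        return es.bulkIndex(json);
    }
}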

Related

How to make parallel calls to the same service using spring Flux

I am working on Spring Reactive and need to make multiple calls sequentially to another REST API using WebClient.
The issue is that I am able to make the calls to the other REST API, but I am not able to read the responses without subscribe or block.
I can't use subscribe or block because the code must stay reactive. Is there any way I can merge the responses while reading them and send the result as a Flux?
Below is the piece of code where I am stuck.
public Mono<DownloadDataLog> getDownload(Token dto, Mono<DataLogRequest> request) {
    Mono<GraphQlCustomerResponse> profileResponse = customerProfileHandler.getMyUsageHomeMethods(dto, null);
    DownloadDataLog responseObj = new DownloadDataLog();
    ArrayList<Mono<List<dataUsageLogs>>> al = new ArrayList<>();
    return Mono.zip(profileResponse, request).flatMap(tuple2 -> {
        Flux<List<Mono<DataLogGqlRequest>>> userequest = prepareUserRequest(getListOfMdns(tuple2.getT1()),
                tuple2.getT2());
        Flux.from(userequest).flatMap(req -> {
            for (Mono<DataLogGqlRequest> logReq : req) {
                al.add(service.execute(logReq, dto));
            }
            responseObj.setAl(al);
            return Mono.empty();
        }).subscribe();
        return Mono.just(responseObj);
    });
}

private Mono<DataLogGqlRequest> prepareInnerRequest(Mono<DataLogGqlRequest> itemRequest, int v1, int v2) {
    return itemRequest.flatMap(req -> {
        DataLogGqlRequest userRequest = new DataLogGqlRequest();
        userRequest.setBillDate(req.getBillDate());
        userRequest.setMdnNumber(req.getMdnNumber());
        userRequest.setCposition(v1 + "");
        userRequest.setPposition(v2 + "");
        return Mono.just(userRequest);
    });
}
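No answer is attached to this question, but the usual reactive approach is to keep the downstream calls inside the pipeline and let collectList assemble the result, instead of calling subscribe and copying into a shared list. A minimal, self-contained sketch of that pattern, using placeholder String types and a stubbed remote call rather than the question's domain objects:

import java.time.Duration;
import java.util.List;

import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

public class MergeCallsSketch {

    // Stand-in for service.execute(...): one remote call per request.
    static Mono<String> callRemote(String request) {
        return Mono.just("response-for-" + request).delayElement(Duration.ofMillis(100));
    }

    // Fans out one call per request, then merges every response into a single Mono
    // carrying the full list, with no subscribe() or block() inside the pipeline.
    static Mono<List<String>> downloadAll(Flux<String> requests) {
        return requests
                .flatMap(MergeCallsSketch::callRemote) // calls run concurrently and are merged
                .collectList();                        // completes once every call has responded
    }

    public static void main(String[] args) {
        // block() only at the demo's outermost edge; in a controller you would return the Mono.
        List<String> merged = downloadAll(Flux.just("req1", "req2", "req3")).block();
        System.out.println(merged);
    }
}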

Spring Webflux: Calling an endpoint inside flatmap not in parallel

In the following code in a Spring WebFlux application, I am calling an endpoint "myfunction" which internally calls another endpoint. If the list contains 3 values, I will hit the "cancel" endpoint 3 times. Here is the question: I want to hit the endpoint one by one, meaning only once I get the response for the 1st value in the list do I want to hit it for the second value, and so on. I know it is a reactive framework, but is there still a way to do this without using delayElements?
@RestController
@RequestMapping("test")
@Slf4j
public class MyRestController {

    private final WebClient webClient;

    public MyRestController(WebClient webClient) {
        this.webClient = webClient.mutate().baseUrl("http://localhost:7076/test/").build();
    }

    @GetMapping("/myfunction")
    public void callTest() {
        Flux.fromIterable(List.of("e1", "e2", "e3"))
                //.delayElements(Duration.ofMillis(1000))
                .flatMap(event -> {
                    log.info(event);
                    return sendCancelRequest(event);
                }).subscribe(log::info);
    }

    public Mono<String> sendCancelRequest(String event) {
        return webClient.get()
                .uri(uriBuilder -> uriBuilder.path("cancel").queryParam("event", event).build())
                .accept(MediaType.APPLICATION_JSON)
                .retrieve()
                .bodyToMono(String.class);
    }

    @GetMapping("/cancel")
    public Mono<String> callMe(@RequestParam String event) {
        //try{Thread.sleep(5000);}catch (Exception e){}
        return Mono.just(event + " cancelled");
    }
}
For example:
Only once I get the response for "e1" do I want to call "e2", and so on in sequence, since the response matters for subsequent values in the list. Please assist here, guys!
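No answer is included for this question, but the standard operator for strictly sequential processing is concatMap, which only subscribes to the next inner publisher after the previous one completes. A self-contained sketch of the idea, with sendCancelRequest stubbed out instead of going through WebClient:

import java.time.Duration;
import java.util.List;

import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

public class SequentialCallsSketch {

    // Stand-in for the WebClient call in the question; each "cancel" takes a moment.
    static Mono<String> sendCancelRequest(String event) {
        return Mono.just(event + " cancelled").delayElement(Duration.ofMillis(200));
    }

    public static void main(String[] args) {
        Flux.fromIterable(List.of("e1", "e2", "e3"))
                // concatMap, unlike flatMap, waits for each inner Mono to complete
                // before subscribing to the next one, so the calls run one by one.
                .concatMap(SequentialCallsSketch::sendCancelRequest)
                .doOnNext(System.out::println)
                .blockLast(); // block only to keep the demo's main thread alive
    }
}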

Value is not updating server side events spring boot

I'm new to Spring Boot and I'm trying to create a streaming API. But when I change the content of the value, the change doesn't get picked up by the Server-Sent Events emitter.
@RestController
public class SseController {

    ExecutorService executor = Executors.newSingleThreadExecutor();
    RestTemplate restTemplate = new RestTemplate();
    String result;

    //This method only returns a single value. But I want it to return a value every 5 seconds.
    @GetMapping("/emitter")
    public SseEmitter eventEmitter() {
        SseEmitter emitter = new SseEmitter();
        executor.execute(() -> {
            try {
                //I'm sending the value to the client using this method
                emitter.send(result);
            } catch (Exception e) {
                emitter.completeWithError(e);
            } finally {
                emitter.complete();
            }
        });
        executor.shutdown();
        return emitter;
    }

    //I'm updating the value here.
    @Scheduled(fixedRate = 5000)
    private void fetchData() {
        result = restTemplate.getForObject("https://api.wazirx.com/api/v2/tickers", String.class);
    }
}
I have tried to use the Thread.sleep() method, but after some time it gives me an async timeout error.
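No answer is included here, but the likely cause is that the emitter sends the value once and is then completed immediately in the finally block, so later updates to result are never pushed. Below is a hedged sketch of one way to keep the emitter open and re-send the current value periodically; it is meant as a possible replacement for the question's eventEmitter() method and assumes the java.util.concurrent ScheduledExecutorService, ScheduledFuture and TimeUnit imports:

@GetMapping("/emitter")
public SseEmitter eventEmitter() {
    SseEmitter emitter = new SseEmitter(Long.MAX_VALUE); // effectively no timeout; pick a real limit in production
    ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

    // Push whatever fetchData() last stored, every 5 seconds, without completing the emitter.
    ScheduledFuture<?> task = scheduler.scheduleAtFixedRate(() -> {
        try {
            emitter.send(result);
        } catch (Exception e) {
            emitter.completeWithError(e); // client disconnected or send failed: end the stream
        }
    }, 0, 5, TimeUnit.SECONDS);

    // Stop the periodic task when the client disconnects or the emitter finishes.
    emitter.onCompletion(() -> { task.cancel(true); scheduler.shutdown(); });
    emitter.onTimeout(() -> { task.cancel(true); scheduler.shutdown(); });

    return emitter;
}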

How to control the response with the Elasticsearch async api

How can I control the IndexResponse when using the Elasticsearch async API with the HighLevelRestClient v7.5?
Maybe I need to mock the Low Level REST Client and use that mock for my High Level REST Client? 🤔
@Test
void whenIndexResponseHasFailuresDoItShouldReturnFalse() {
    // arrange
    var indexResponse = mock(IndexResponse.class);
    when(indexResponse.getResult()).thenReturn(Result.UPDATED);
    var restHighLevelClient = mock(RestHighLevelClient.class);
    when(restHighLevelClient.indexAsync())
        //do something here??
    var indexRequest = new IndexRequest(...);
    // act
    var myHelper = new MyHelper(restHighLevelClient);
    var result = myHelper.doIt(indexRequest)
            .get();
    // assert
    assertThat(result).isFalse();
}
class MyHelper {
    //injected RestHighLevelClient

    CompletableFuture<Boolean> doIt(Customer customer) {
        var result = new CompletableFuture<Boolean>();
        var indexRequest = new IndexRequest(...);

        restHighLevelClient.indexAsync(indexRequest, RequestOptions.DEFAULT,
                new ActionListener<IndexResponse>() {
                    @Override
                    public void onResponse(IndexResponse indexResponse) { //want to control indexResponse
                        if (indexResponse.getResult() == Result.UPDATED) {
                            result.complete(false);
                        } else {
                            result.complete(true);
                        }
                    }

                    @Override
                    public void onFailure(Exception e) {
                        ...
                    }
                });

        return result;
    }
}
Update
Sample project using Oleg's answer
Mock RestHighLevelClient then inside indexAsync mock IndexResponse and pass it to the ActionListener.
RestHighLevelClient restHighLevelClient = mock(RestHighLevelClient.class);
when(restHighLevelClient.indexAsync(any(), any(), any())).then(a -> {
    ActionListener<IndexResponse> listener = a.getArgument(2);
    IndexResponse response = mock(IndexResponse.class);
    when(response.getResult()).then(b -> {
        return Result.UPDATED;
    });
    listener.onResponse(response);
    return null;
});

MyHelper myHelper = new MyHelper(restHighLevelClient);
Boolean result = myHelper.doIt(null).get();
assertFalse(result);
Also, configure Mockito to support mocking final methods, otherwise an NPE will be thrown when mocking indexAsync.
Option 1
Instead of using the mockito-core artifact, include the mockito-inline artifact in your project
Option 2
Create a file src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker with mock-maker-inline as the content

render http response in callback

I have been reading the Micronaut documentation but I cannot find a way to render the HTTP response in a callback, as I can do for instance with JAX-RS Jersey.
Here is what I want to achieve:
#Get("/scalaFuture")
public void getScalaFuture() {
Futures.successful(new SpringBootEntityDaoDTO())
.onComplete(result -> {
if (result.isSuccess()) {
return HttpResponse.ok(result.get());
} else {
return HttpResponse.serverError(result.failed().get());
}
}, ExecutorContextUtil.defaultExecutionContext());
}
Basically, render the response in the callback of the future.
Something similar to what I do with JAX-RS in the Observable callback using AsyncResponse:
@POST
@Path("/bla")
public void foo(@Suspended final AsyncResponse asyncResponse) {
    Observable<EntityDaoDTO> observable = observableFosConnectorManager.execute("EntityAggregateRoot", "database", getEntityDaoDTO(), null, MethodDTO.CREATE);
    observable
            .subscribeOn(Schedulers.computation())
            .subscribe(result -> {
                EntityPayLoad entityPayLoad = new EntityPayLoad();
                entityPayLoad.setTitle(result.getTitle());
                entityPayLoad.setDescription(result.getDescription());
                asyncResponse.resume(Response.status(Response.Status.OK.getStatusCode()).entity(entityPayLoad).build());
            }, t -> asyncResponse.resume(Response.status(Response.Status.INTERNAL_SERVER_ERROR.getStatusCode()).build()),
               () -> getLogger().info(null, "Subscription done"));
}
Regards
Micronaut allows different return types including reactive responses.
For example, you can return a CompletableFuture:
#Controller("/people")
public class PersonController {
Map<String, Person> inMemoryDatastore = new ConcurrentHashMap<>();
#Post("/saveFuture")
public CompletableFuture<HttpResponse<Person>> save(#Body CompletableFuture<Person> person) {
return person.thenApply(p -> {
inMemoryDatastore.put(p.getFirstName(), p);
return HttpResponse.created(p);
}
);
}
}
Convert your Scala Future to a Java CompletableFuture: https://stackoverflow.com/a/46695386/2534803
https://docs.micronaut.io/latest/guide/index.html#_binding_using_completablefuture
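Putting the two links together, here is a hedged sketch of what the question's endpoint could look like once the Scala Future is bridged to a CompletableFuture. It assumes the scala-java8-compat FutureConverters helper is on the classpath, and reuses the question's SpringBootEntityDaoDTO type and Futures.successful call:

@Get("/scalaFuture")
public CompletableFuture<HttpResponse<SpringBootEntityDaoDTO>> getScalaFuture() {
    scala.concurrent.Future<SpringBootEntityDaoDTO> scalaFuture =
            Futures.successful(new SpringBootEntityDaoDTO());

    // toJava(...) wraps the Scala Future in a CompletionStage; Micronaut renders the
    // HttpResponse once the stage completes, and a failed future surfaces through
    // Micronaut's normal error handling.
    return scala.compat.java8.FutureConverters.toJava(scalaFuture)
            .toCompletableFuture()
            .thenApply(dto -> HttpResponse.ok(dto));
}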
