I have a web service that makes HTTP calls to another service. The web service breaks one-to-many requests down and makes the resulting one-to-one requests in parallel. To test performance, I have kept the throughput to the backend constant. For example, I was able to achieve a throughput of 1000 req/sec with a 99th percentile latency of 100 ms. So, to test requests that get broken down into 2 backend requests per request to the web service, I sent 500 req/sec, but the 99th percentile latency rose to 150 ms. Am I creating thread contention and/or making blocking HTTP calls with the following code?
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentMap;
import java.util.stream.Collectors;

public class Foo {

    private HTTPClient myHTTPClient = new HTTPClient("http://my_host.com"); // JAX-RS HTTP client

    private interface Handler<REQ, RES> {
        RES work(REQ req);
    }

    private <REQ, RES> CompletableFuture<RES> getAsync(REQ req, Handler<REQ, RES> handler) {
        CompletableFuture<RES> future = CompletableFuture.supplyAsync(() -> handler.work(req));
        return future;
    }

    public RouteCostResponse getRouteCost(Point source, List<Point> destinations) {
        Map<String, Request> requests = new HashMap<>();
        // create request bodies and keep track of request ids
        for (Point destination : destinations) {
            requests.put(destination.getId(), new RouteCostRequest(source, destination));
        }

        // create futures
        ConcurrentMap<String, CompletableFuture<RouteCost>> futures = requests.entrySet().parallelStream()
                .collect(Collectors.toConcurrentMap(
                        entry -> entry.getKey(),
                        entry -> getAsync(entry.getValue(), route -> myHTTPClient.getRoute(route))
                ));

        // retrieve results
        ConcurrentMap<String, RouteCost> result = futures.entrySet().parallelStream()
                .collect(Collectors.toConcurrentMap(
                        entry -> entry.getKey(),
                        entry -> entry.getValue().join()
                ));

        RouteCostResponse response = new RouteCostResponse(result);
        return response;
    }
}
There is no thread contention with the following code, though it seems I have run into I/O issues. The key is to use an explicit thread pool, e.g. a ForkJoinPool or Executors.newFixedThreadPool.
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.ForkJoinPool;
import java.util.stream.Collectors;

public class Foo {

    private HTTPClient myHTTPClient = new HTTPClient("http://my_host.com"); // JAX-RS HTTP client
    private static final ForkJoinPool pool = new ForkJoinPool(1000);

    private interface Handler<REQ, RES> {
        RES work(REQ req);
    }

    private <REQ, RES> CompletableFuture<RES> getAsync(REQ req, Handler<REQ, RES> handler) {
        // run the blocking HTTP call on the explicit pool instead of the common ForkJoinPool
        CompletableFuture<RES> future = CompletableFuture.supplyAsync(() -> handler.work(req), pool);
        return future;
    }

    public RouteCostResponse getRouteCost(Point source, List<Point> destinations) {
        Map<String, Request> requests = new HashMap<>();
        // create request bodies and keep track of request ids
        for (Point destination : destinations) {
            requests.put(destination.getId(), new RouteCostRequest(source, destination));
        }

        // create futures
        ConcurrentMap<String, CompletableFuture<RouteCost>> futures = requests.entrySet().stream()
                .collect(Collectors.toConcurrentMap(
                        entry -> entry.getKey(),
                        entry -> getAsync(entry.getValue(), route -> myHTTPClient.getRoute(route))
                ));

        // retrieve results
        ConcurrentMap<String, RouteCost> result = futures.entrySet().stream()
                .collect(Collectors.toConcurrentMap(
                        entry -> entry.getKey(),
                        entry -> entry.getValue().join()
                ));

        RouteCostResponse response = new RouteCostResponse(result);
        return response;
    }
}
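If you would rather use the Executors.newFixedThreadPool option mentioned above, a minimal sketch of the same getAsync helper follows; the pool size of 200 and the use of java.util.function.Function instead of the custom Handler are illustrative choices, not part of the original code:

import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.function.Function;

public class FixedPoolFoo {

    // size the pool for I/O-bound work (many threads parked on sockets), not for CPU count
    private static final ExecutorService ioPool = Executors.newFixedThreadPool(200);

    private static <REQ, RES> CompletableFuture<RES> getAsync(REQ req, Function<REQ, RES> call) {
        // the second argument keeps the blocking HTTP call off the common ForkJoinPool
        return CompletableFuture.supplyAsync(() -> call.apply(req), ioPool);
    }

    public static void shutdown() {
        ioPool.shutdown();
    }
}

Either way, remember that every pool thread is parked while its HTTP call is in flight, so the pool size effectively caps your outbound concurrency.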
The issue I need to solve (in Java) is:
- Fetch an array of games from an API.
- Iterate over the N games (async / concurrent), making an individual HTTP request for every game in order to get its details.
- Store the details in a gamesWithDetails array.
- Done: I have my gamesWithDetails array.
I cannot fetch the details of all the games with a single request, I have to hit the API endpoint every time per game. So I want to execute these requests asynchronously from each other.
This is a working example in JavaScript in case it's useful. However I'd like to make it work for Spring Boot.
axios.get(`https://la2.api.riotgames.com/lol/match/v4/matchlists/by-account/${data.accountId}`, {
  headers: { "X-Riot-Token": "asdasdasdasdadasdasdasd" }
})
.then(resp => {
  const promises = [];
  for (match of resp.data.matches) {
    promises.push(
      axios.get(`https://la2.api.riotgames.com/lol/match/v4/matches/${match.gameId}`, {
        headers: { "X-Riot-Token": "asdasdasdasdasdasdasdasd" }
      })
    );
  }
  Promise.all(promises)
    .then(matchesDetails => {
      matchesDetails.forEach(({ data }) => console.log(data.gameId));
    });
})
Basically you will want to do something like this:
package com.example.demo;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.client.RestTemplate;

import java.util.List;
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Executor;
import java.util.concurrent.Executors;
import java.util.stream.Collectors;

public class GamesProcessor {

    private static final String GAME_URI_BASE = "https://la2.api.riotgames.com/lol/match/v4/matches/";
    private static final String ACCOUNT_URI_BASE = "https://la2.api.riotgames.com/lol/match/v4/matchlists/by-account/";

    private Executor executor = Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors() - 1);

    @Autowired
    private RestTemplate restTemplate;

    public void processGames(String accountId) throws JsonProcessingException, ExecutionException, InterruptedException {
        String responseAsString = restTemplate.getForObject(ACCOUNT_URI_BASE + accountId, String.class);
        ObjectMapper objectMapper = new ObjectMapper();
        if (responseAsString != null) {
            Map<String, Object> response = objectMapper.readValue(responseAsString, new TypeReference<Map<String, Object>>() {
            });
            List<Map<String, Object>> matches = (List<Map<String, Object>>) ((Map<String, Object>) response.get("data")).get("matches");
            List<CompletableFuture<Void>> futures = matches.stream()
                    .map(m -> (String) m.get("gameId"))
                    .map(gameId -> CompletableFuture.supplyAsync(() -> restTemplate.getForObject(GAME_URI_BASE + gameId, String.class), executor)
                            .thenAccept(r -> {
                                System.out.println(r); // do whatever you wish with the response here
                            }))
                    .collect(Collectors.toList());
            // now we execute all requests asynchronously
            CompletableFuture.allOf(futures.toArray(new CompletableFuture[0])).get();
        }
    }
}
Please note that this is not refined code, just a quick example of how to achieve this. Ideally you would replace the JSON processing that I've done "by hand" using a Map with a response bean that matches the structure of the response you get from the service you are calling.
A quick walk-through:
String responseAsString = restTemplate.getForObject(ACCOUNT_URI_BASE + accountId, String.class);
This executes the first REST request and gets it as a String (the JSON response). You will want to properly map this using a Bean object instead. Then this is processed using the ObjectMapper provided by Jackson and transformed into a map so you can navigate the JSON and get the matches.
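For illustration, such a bean could look like the following sketch; the class and field names are assumptions derived from the map navigation above (data.matches[].gameId), not a verified contract of the actual API:

import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import java.util.List;

// Hypothetical DTOs mirroring the map-based navigation shown above.
@JsonIgnoreProperties(ignoreUnknown = true)
class MatchlistResponse {
    private MatchData data;
    public MatchData getData() { return data; }
    public void setData(MatchData data) { this.data = data; }
}

@JsonIgnoreProperties(ignoreUnknown = true)
class MatchData {
    private List<MatchSummary> matches;
    public List<MatchSummary> getMatches() { return matches; }
    public void setMatches(List<MatchSummary> matches) { this.matches = matches; }
}

@JsonIgnoreProperties(ignoreUnknown = true)
class MatchSummary {
    private String gameId;
    public String getGameId() { return gameId; }
    public void setGameId(String gameId) { this.gameId = gameId; }
}

With beans like these in place, restTemplate.getForObject(ACCOUNT_URI_BASE + accountId, MatchlistResponse.class) would replace both the String round-trip and the manual ObjectMapper step.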
List<CompletableFuture<Void>> futures = matches.stream()
        .map(m -> (String) m.get("gameId"))
        .map(gameId -> CompletableFuture.supplyAsync(() -> restTemplate.getForObject(GAME_URI_BASE + gameId, String.class), executor)
                .thenAccept(r -> {
                    System.out.println(r); // do whatever you wish with the response here
                }))
        .collect(Collectors.toList());
Once we have all the matches we will use the Stream API to transform them into CompletableFutures that will be executed asynchronously. Each thread will make another request in order to get the response for each individual matchId.
System.out.println(r);
This will be executed for each response that you get for each matchId, just like in your example. This should also be replaced by a proper bean matching the output for clearer processing.
Note that the futures in List<CompletableFuture<Void>> start running on the executor as soon as they are created; the final CompletableFuture.allOf(futures.toArray(new CompletableFuture[0])).get() simply blocks until every one of them has completed.
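If you need the game details afterwards instead of just printing them, a small variation of the same stream (using the same assumed restTemplate and executor) keeps the response bodies around:

// Keep the response bodies instead of discarding them with thenAccept.
List<CompletableFuture<String>> detailFutures = matches.stream()
        .map(m -> (String) m.get("gameId"))
        .map(gameId -> CompletableFuture.supplyAsync(
                () -> restTemplate.getForObject(GAME_URI_BASE + gameId, String.class), executor))
        .collect(Collectors.toList());

// join() waits for each future; all of them were already running in parallel on the executor.
List<String> gamesWithDetails = detailFutures.stream()
        .map(CompletableFuture::join)
        .collect(Collectors.toList());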
Quite an interesting question, since JavaScript implements the famous event loop, which makes its functions asynchronous and non-blocking. Spring's RestTemplate blocks the thread of execution until the response comes back, therefore wasting a lot of resources (the one-thread-per-request model).
@Slacky's answer is technically right, as you asked about asynchronous HTTP requests, but I'd like to share a better option which is both asynchronous and non-blocking, meaning a single thread is able to handle hundreds or even thousands of requests and their responses (reactive programming).
The way to implement the equivalent of your JavaScript example in Spring Boot is to use the Project Reactor WebClient class, which is a non-blocking, reactive client for performing HTTP requests.
It is also worth mentioning that Java, being statically typed, requires you to declare classes to represent your data, in this case something like (using Lombok for brevity):

@Data
class Match {
    private String gameId;
    // ...
}

@Data
class MatchDetails {
    // ...
}

Here is the code following @Slacky's answer naming convention to make the comparison easier.
public class GamesProcessor {

    private static final String BASE_URL = "https://la2.api.riotgames.com";
    private static final String GAME_URI = "/lol/match/v4/matches/%s";
    private static final String ACCOUNT_URI = "/lol/match/v4/matchlists/by-account/%s";

    public static List<MatchDetails> processGames(String accountId) {
        final WebClient webClient = WebClient
                .builder()
                .baseUrl(BASE_URL)
                .defaultHeader("X-Riot-Token", "asdasdasdasdadasdasdasd")
                .build();

        // Issues the first request to get the list of matches
        List<Match> matches = webClient
                .get()
                .uri(String.format(ACCOUNT_URI, accountId))
                .accept(MediaType.APPLICATION_JSON)
                .retrieve()
                .bodyToMono(new ParameterizedTypeReference<List<Match>>() {})
                .block(); // blocks to wait for the response

        // Processes the list of matches asynchronously and collects all responses in a list of match details
        return Flux.fromIterable(matches)
                .flatMap(match -> webClient
                        .get()
                        .uri(String.format(GAME_URI, match.getGameId()))
                        .accept(MediaType.APPLICATION_JSON)
                        .retrieve()
                        .bodyToMono(MatchDetails.class))
                .collectList()
                .block(); // blocks to wait for all responses
    }
}
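If the surrounding code is reactive end to end, the two block() calls can be dropped entirely; here is a minimal sketch under the same assumed Match/MatchDetails classes and URI constants (the concurrency limit of 20 is just an example, and the WebClient is passed in for brevity):

// Fully non-blocking variant: returns a Mono that can be handed straight to a WebFlux controller.
public static Mono<List<MatchDetails>> processGamesReactive(String accountId, WebClient webClient) {
    return webClient.get()
            .uri(String.format(ACCOUNT_URI, accountId))
            .accept(MediaType.APPLICATION_JSON)
            .retrieve()
            .bodyToMono(new ParameterizedTypeReference<List<Match>>() {})
            .flatMapMany(Flux::fromIterable)
            .flatMap(match -> webClient.get()
                    .uri(String.format(GAME_URI, match.getGameId()))
                    .accept(MediaType.APPLICATION_JSON)
                    .retrieve()
                    .bodyToMono(MatchDetails.class), 20) // at most 20 detail requests in flight at once
            .collectList();
}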
I am developing a prototype for a new project. The idea is to provide a reactive Spring Boot microservice to bulk-index documents in Elasticsearch. Elasticsearch provides a High Level REST Client with an async method to bulk-process indexing requests. Async delivers callbacks using listeners, as mentioned here. The callbacks receive index responses (per request) in batches. I am trying to send these responses back to the client as a Flux. I have come up with something based on this blog post.
Controller
@RestController
public class AppController {

    @SuppressWarnings("unchecked")
    @RequestMapping(value = "/test3", method = RequestMethod.GET)
    public Flux<String> index3() {
        ElasticAdapter es = new ElasticAdapter();
        JSONObject json = new JSONObject();
        json.put("TestDoc", "Stack123");
        Flux<String> fluxResponse = es.bulkIndex(json);
        return fluxResponse;
    }
}
ElasticAdapter
@Component
class ElasticAdapter {

    String indexName = "test2";
    private final RestHighLevelClient client;
    private final ObjectMapper mapper;
    private int processed = 1;

    Flux<String> bulkIndex(JSONObject doc) {
        return bulkIndexDoc(doc)
                .doOnError(e -> System.out.print("Unable to index {}" + doc + e));
    }

    private Flux<String> bulkIndexDoc(JSONObject doc) {
        return Flux.create(sink -> {
            try {
                doBulkIndex(doc, bulkListenerToSink(sink));
            } catch (JsonProcessingException e) {
                sink.error(e);
            }
        });
    }

    private void doBulkIndex(JSONObject doc, BulkProcessor.Listener listener) throws JsonProcessingException {
        System.out.println("Going to submit index request");
        BiConsumer<BulkRequest, ActionListener<BulkResponse>> bulkConsumer =
                (request, bulkListener) ->
                        client.bulkAsync(request, RequestOptions.DEFAULT, bulkListener);
        BulkProcessor.Builder builder =
                BulkProcessor.builder(bulkConsumer, listener);
        builder.setBulkActions(10);
        BulkProcessor bulkProcessor = builder.build();
        // Submitting 5,000 index requests (repeating the same JSON)
        for (int i = 0; i < 5000; i++) {
            IndexRequest indexRequest = new IndexRequest(indexName, "person", i + 1 + "");
            String json = doc.toJSONString();
            indexRequest.source(json, XContentType.JSON);
            bulkProcessor.add(indexRequest);
        }
        System.out.println("Submitted all docs");
    }

    private BulkProcessor.Listener bulkListenerToSink(FluxSink<String> sink) {
        return new BulkProcessor.Listener() {

            @Override
            public void beforeBulk(long executionId, BulkRequest request) {
            }

            @SuppressWarnings("unchecked")
            @Override
            public void afterBulk(long executionId, BulkRequest request, BulkResponse response) {
                for (BulkItemResponse bulkItemResponse : response) {
                    JSONObject json = new JSONObject();
                    json.put("id", bulkItemResponse.getResponse().getId());
                    json.put("status", bulkItemResponse.getResponse().getResult());
                    sink.next(json.toJSONString());
                    processed++;
                }
                if (processed >= 5000) {
                    sink.complete();
                }
            }

            @Override
            public void afterBulk(long executionId, BulkRequest request, Throwable failure) {
                failure.printStackTrace();
                sink.error(failure);
            }
        };
    }

    public ElasticAdapter() {
        // Logic to initialize the Elasticsearch REST client
    }
}
I used FluxSink to create the Flux of responses to send back to the client. At this point, I have no idea whether this is correct or not.
My expectation is that the calling client should receive the responses in batches of 10 (because the bulk processor processes them in batches of 10: builder.setBulkActions(10);). I tried to consume the endpoint using the Spring WebFlux WebClient, but I was unable to get it working. This is what I tried:
WebClient
public class FluxClient {

    public static void main(String[] args) {
        WebClient client = WebClient.create("http://localhost:8080");
        Flux<String> responseFlux = client.get()
                .uri("/test3")
                .retrieve()
                .bodyToFlux(String.class);
        responseFlux.subscribe(System.out::println);
    }
}
Nothing is printed on the console as I expected. I tried System.out.println(responseFlux.blockFirst()); it prints all the responses as a single batch at the end and not in batches.
If my approach is correct, what is the correct way to consume it? In the solution I have in mind, this client will reside in another web app.
Notes: My understanding of the Reactor API is limited. The Elasticsearch version used is 6.8.
So, I made the following changes to your code.
In ElasticAdapter,
public Flux<Object> bulkIndex(JSONObject doc) {
    return bulkIndexDoc(doc)
            .subscribeOn(Schedulers.elastic(), true)
            .doOnError(e -> System.out.print("Unable to index {}" + doc + e));
}
I invoked subscribeOn(Scheduler, requestOnSeparateThread) on the Flux; I learned about it from https://github.com/spring-projects/spring-framework/issues/21507
In FluxClient,
Flux<String> responseFlux = client.get()
        .uri("/test3")
        .headers(httpHeaders -> {
            httpHeaders.set("Accept", "text/event-stream");
        })
        .retrieve()
        .bodyToFlux(String.class);
responseFlux.delayElements(Duration.ofSeconds(1)).subscribe(System.out::println);
I added the "Accept" header as "text/event-stream" and delayed the Flux elements.
With the above changes, I was able to get the responses in real time from the server.
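An alternative to setting the Accept header on the client is to declare the streaming media type on the server side, so WebFlux emits the elements as Server-Sent Events. A minimal sketch of the controller method, assuming the same /test3 endpoint and that the adapter is injected rather than instantiated inside the handler:

// Declaring text/event-stream makes WebFlux flush items as they are emitted
// instead of aggregating them into one response body.
@RequestMapping(value = "/test3", method = RequestMethod.GET, produces = MediaType.TEXT_EVENT_STREAM_VALUE)
public Flux<String> index3() {
    JSONObject json = new JSONObject();
    json.put("TestDoc", "Stack123");
    return elasticAdapter.bulkIndex(json); // elasticAdapter assumed to be an @Autowired field
}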
I want to make an asynchronous REST call using Spring WebClient, which returns a Mono. I'm also making some database calls in parallel, but for certain reasons they can't be done reactively.
Map<String, Object> models = new HashMap<>();
Mono<User> users = this.webClient...;
users.map(resp -> new UserState(userRequest, resp))
     .subscribe(response -> {
         models.put("userState", response);
     });
Iterable<Product> messages = this.productRepository.findAll();
models.put("products", messages);
// Wait for users.subscribe to finish <<<<<<<<<<<<< HERE
return new ModelAndView("messages/list", models);
How do I wait for the subscribe to finish before returning the ModelAndView? This would have been easy with a Future, where I could call get() whenever I want.
You can wrap the blocking call in a Mono executed on a separate scheduler, zip it with the Mono containing UserState data and transform their combination into a Mono<ModelAndView> (which can be returned from Spring controller methods). The calls will be executed in parallel, results will be combined when both calls are completed.
You can define a single bounded scheduler per application specifically for blocking calls and provide it as a constructor argument to any class that makes blocking calls.
The code will look as follows:
@Configuration
class SchedulersConfig {

    @Bean
    Scheduler parallelScheduler(@Value("${blocking-thread-pool-size}") int threadsCount) {
        // a dedicated, bounded scheduler for blocking calls
        return Schedulers.newParallel("blocking-calls", threadsCount);
    }
}

@RestController
class Controller {

    final Scheduler parallelScheduler;
    ...

    Mono<User> userResponse = // webClient...
    Mono<Iterable<Product>> productsResponse = Mono.fromSupplier(productRepository::findAll)
            .subscribeOn(parallelScheduler);

    return Mono.zip(userResponse, productsResponse, (user, products) ->
            new ModelAndView("messages/list",
                    ImmutableMap.of(
                            "userState", new UserState(userRequest, user),
                            "products", products
                    ))
    );
}
Update based on the comment:
If you just need to execute the HTTP call asynchronously and then join it with the database results, you can do the following:
Map<String, Object> models = new HashMap<>();
Mono<User> userMono = webClient...;
CompletableFuture<User> userFuture = userMono.toFuture();
Iterable<Product> messages = productRepository.findAll();
User user = userFuture.join();
models.put("products", messages);
models.put("userState", new UserState(userRequest, user));
return new ModelAndView("messages/list", models);
How do I end chained requests in Rx Vert.x?
HttpClient client = Vertx.vertx().createHttpClient();
HttpClientRequest request = client.request(HttpMethod.POST,
        "someURL")
        .putHeader("content-type", "application/x-www-form-urlencoded")
        .putHeader("content-length", Integer.toString(jsonData.length())).write(jsonData);
request.toObservable().
        // flatMap HttpClientResponse -> Observable<Buffer>
        flatMap(httpClientResponse -> { // something
            return httpClientResponse.toObservable();
        }).
        map(buffer -> { return buffer.toString(); }).
        // flatMap data -> Observable<HttpClientResponse>
        flatMap(postData -> client.request(HttpMethod.POST,
                "someURL")
                .putHeader("content-type", "application/x-www-form-urlencoded")
                .putHeader("content-length", Integer.toString(postData.length())).write(postData).toObservable()).
        // flatMap HttpClientResponse -> Observable<Buffer>
        flatMap(httpClientResponse -> {
            return httpClientResponse.toObservable();
        })......// other operators
request.end();
Notice that I call .end() on the top-level request. How do I end the request that is inside the flatMap? Do I even need to end it?
There are multiple ways to ensure that request.end() gets called, but I would dig into the Vert.x documentation, or its source code since it is open source, to see whether it calls end() for you. Otherwise, one option could be:
final HttpClientRequest request = ...
request.toObservable()
        .doOnUnsubscribe(new Action0() {
            @Override
            public void call() {
                request.end();
            }
        });
I think you can do something like the following code.
The main idea is that you don't directly use the HttpClientRequest as obtained from the Vert.x client. Instead, you create another Flowable that invokes end() as soon as the first subscription is received.
Here, for instance, you obtain the request through a pair of custom methods, in this case request1() and request2(). They both use doOnSubscribe() to trigger the end() you need. Read its description on the ReactiveX page.
This example uses Vert.x and RxJava 2; I hope you can use this setup.
import io.reactivex.Flowable;
import io.vertx.core.http.HttpMethod;
import io.vertx.reactivex.core.Vertx;
import io.vertx.reactivex.core.buffer.Buffer;
import io.vertx.reactivex.core.http.HttpClient;
import io.vertx.reactivex.core.http.HttpClientRequest;
import io.vertx.reactivex.core.http.HttpClientResponse;
import org.junit.Test;

public class StackOverflow {

    @Test
    public void test() {
        Buffer jsonData = Buffer.buffer("..."); // the json data.
        HttpClient client = Vertx.vertx().createHttpClient(); // the vertx client.

        request1(client)
                .flatMap(httpClientResponse -> httpClientResponse.toFlowable())
                .map(buffer -> buffer.toString())
                .flatMap(postData -> request2(client, postData))
                .forEach(httpResponse -> {
                    // do something with the returned data
                });
    }

    private Flowable<HttpClientResponse> request1(HttpClient client) {
        HttpClientRequest request = client.request(HttpMethod.POST, "someURL");
        return request
                .toFlowable()
                .doOnSubscribe(subscription -> request.end());
    }

    private Flowable<HttpClientResponse> request2(HttpClient client, String postData) {
        HttpClientRequest request = client.request(HttpMethod.POST, "someURL");
        // do something with postData
        return request
                .toFlowable()
                .doOnSubscribe(subscription -> request.end());
    }
}
I'm using Apache HttpClient within Spring MVC 3.2.2 to send 5 GET requests synchronously, as illustrated below.
How can I send all of these asynchronously (in parallel) and wait for the requests to return in order to return a parsed payload string from all GET requests?
public String myMVCControllerGETdataMethod(Model model)
{
    // Send 1st request
    HttpClient httpclient = new DefaultHttpClient();
    HttpGet httpget = new HttpGet("http://api/data?type=1");
    ResponseHandler<String> responseHandler = new BasicResponseHandler();
    String responseBody = httpclient.execute(httpget, responseHandler);

    // Send 2nd request
    HttpClient httpclient2 = new DefaultHttpClient();
    HttpGet httpget2 = new HttpGet("http://api/data?type=2");
    ResponseHandler<String> responseHandler2 = new BasicResponseHandler();
    String responseBody2 = httpclient2.execute(httpget2, responseHandler2);

    // o o o more GETs here

    // Perform some work here...and wait for all requests to return
    // Parse info out of multiple requests and return
    String results = doWorkwithMultipleDataReturned();
    model.addAttribute(results, results);
    return "index";
}
Just in general, you need to encapsulate your units of work in a Runnable or java.util.concurrent.Callable and execute them via java.util.concurrent.Executor (or org.springframework.core.task.TaskExecutor). This allows each unit of work to be executed separately, typically in an asynchronous fashion (depending on the implementation of the Executor).
So for your specific problem, you could do something like this:
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.Executor;
import java.util.concurrent.FutureTask;

import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.BasicResponseHandler;
import org.apache.http.impl.client.DefaultHttpClient;
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.RequestMapping;

@Controller
public class MyController {

    //inject this
    private Executor executor;

    @RequestMapping("/your/path/here")
    public String myMVCControllerGETdataMethod(Model model) throws InterruptedException {
        //define all async requests and give them to the injected Executor
        List<GetRequestTask> tasks = new ArrayList<GetRequestTask>();
        tasks.add(new GetRequestTask("http://api/data?type=1", this.executor));
        tasks.add(new GetRequestTask("http://api/data?type=2", this.executor));
        //...
        //do other work here
        //...
        //now wait for all async tasks to complete
        while(!tasks.isEmpty()) {
            for(Iterator<GetRequestTask> it = tasks.iterator(); it.hasNext();) {
                GetRequestTask task = it.next();
                if(task.isDone()) {
                    String request = task.getRequest();
                    String response = task.getResponse();
                    //PUT YOUR CODE HERE
                    //possibly aggregate request and response in Map<String,String>
                    //or do something else with request and response
                    it.remove();
                }
            }
            //avoid tight loop in "main" thread
            if(!tasks.isEmpty()) Thread.sleep(100);
        }
        //now you have all responses for all async requests

        //the following from your original code
        //note: you should probably pass the responses from above
        //to this next method (to keep your controller stateless)
        String results = doWorkwithMultipleDataReturned();
        model.addAttribute(results, results);
        return "index";
    }

    //abstraction to wrap Callable and Future
    class GetRequestTask {
        private GetRequestWork work;
        private FutureTask<String> task;

        public GetRequestTask(String url, Executor executor) {
            this.work = new GetRequestWork(url);
            this.task = new FutureTask<String>(work);
            executor.execute(this.task);
        }

        public String getRequest() {
            return this.work.getUrl();
        }

        public boolean isDone() {
            return this.task.isDone();
        }

        public String getResponse() {
            try {
                return this.task.get();
            } catch(Exception e) {
                throw new RuntimeException(e);
            }
        }
    }

    //Callable representing the actual HTTP GET request
    class GetRequestWork implements Callable<String> {
        private final String url;

        public GetRequestWork(String url) {
            this.url = url;
        }

        public String getUrl() {
            return this.url;
        }

        public String call() throws Exception {
            return new DefaultHttpClient().execute(new HttpGet(getUrl()), new BasicResponseHandler());
        }
    }
}
Note that this code has not been tested.
For your Executor implementation, check out Spring's TaskExecutor and task:executor namespace. You probably want a reusable pool of threads for this use-case (instead of creating a new thread every time).
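If you prefer Java config over the task:executor XML namespace, a minimal sketch of such a reusable pool could look like this; the pool sizes and bean name are illustrative, not prescriptive:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;

@Configuration
public class ExecutorConfig {

    // ThreadPoolTaskExecutor implements Executor, so it can be injected straight into the controller above.
    @Bean
    public ThreadPoolTaskExecutor taskExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(5);    // illustrative sizes; tune for expected request concurrency
        executor.setMaxPoolSize(10);
        executor.setQueueCapacity(50);
        executor.setThreadNamePrefix("http-get-");
        return executor;
    }
}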
You should use AsyncHttpClient. You can make any number of requests, and it will call you back when it gets a response. You can configure how many connections it can create. All the threading is handled by the library, so it's a lot easier than managing the threads yourself.
Take a look at the example here: https://github.com/AsyncHttpClient/async-http-client
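A minimal sketch of how that could look with the org.asynchttpclient 2.x API (the URLs and the connection limit of 50 are illustrative):

import org.asynchttpclient.AsyncHttpClient;
import org.asynchttpclient.Dsl;
import org.asynchttpclient.Response;
import java.util.concurrent.CompletableFuture;

public class AsyncHttpClientExample {
    public static void main(String[] args) throws Exception {
        try (AsyncHttpClient client = Dsl.asyncHttpClient(Dsl.config().setMaxConnections(50))) {
            // execute() is non-blocking; the library's own I/O threads handle the requests.
            CompletableFuture<String> first = client.prepareGet("http://api/data?type=1")
                    .execute().toCompletableFuture().thenApply(Response::getResponseBody);
            CompletableFuture<String> second = client.prepareGet("http://api/data?type=2")
                    .execute().toCompletableFuture().thenApply(Response::getResponseBody);
            // join() waits for the responses; both requests were already in flight in parallel.
            String body1 = first.join();
            String body2 = second.join();
            System.out.println(body1.length() + " / " + body2.length());
        }
    }
}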
Move your request code to a separate method:
private String executeGet(String url) {
    try {
        HttpClient httpclient = new DefaultHttpClient();
        HttpGet httpget = new HttpGet(url);
        ResponseHandler<String> responseHandler = new BasicResponseHandler();
        return httpclient.execute(httpget, responseHandler);
    } catch (IOException e) {
        // wrap the checked exception so the method can be called from lambdas below
        throw new UncheckedIOException(e);
    }
}
And submit them to ExecutorService:
ExecutorService executorService = Executors.newCachedThreadPool();
Future<String> firstCallFuture = executorService.submit(() -> executeGet(url1));
Future<String> secondCallFuture = executorService.submit(() -> executeGet(url2));
String firstResponse = firstCallFuture.get();
String secondResponse = secondCallFuture.get();
executorService.shutdown();
Or
Future<String> firstCallFuture = CompletableFuture.supplyAsync(() -> executeGet(url1));
Future<String> secondCallFuture = CompletableFuture.supplyAsync(() -> executeGet(url2));
String firstResponse = firstCallFuture.get();
String secondResponse = secondCallFuture.get();
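If there are more than two URLs, the same idea scales with CompletableFuture.allOf. A short sketch reusing the executeGet helper above (the URL list and pool size are illustrative, and the snippet belongs inside a method that handles the checked exceptions):

// Fan out an arbitrary list of URLs, then wait for all of them.
List<String> urls = Arrays.asList("http://api/data?type=1", "http://api/data?type=2", "http://api/data?type=3");
ExecutorService pool = Executors.newFixedThreadPool(urls.size());

List<CompletableFuture<String>> futures = urls.stream()
        .map(url -> CompletableFuture.supplyAsync(() -> executeGet(url), pool))
        .collect(Collectors.toList());

CompletableFuture.allOf(futures.toArray(new CompletableFuture[0])).join(); // wait for every request
List<String> bodies = futures.stream().map(CompletableFuture::join).collect(Collectors.toList());
pool.shutdown();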
Or use RestTemplate as described in How to use Spring WebClient to make multiple calls simultaneously?
For parallel execution of multiple requests with a single HttpClient instance,
configure a PoolingHttpClientConnectionManager for parallel execution.
HttpClientBuilder builder = HttpClientBuilder.create();
PlainConnectionSocketFactory plainConnectionSocketFactory = new PlainConnectionSocketFactory();
Registry<ConnectionSocketFactory> registry = RegistryBuilder.<ConnectionSocketFactory>create()
.register("http", plainConnectionSocketFactory).build();
PoolingHttpClientConnectionManager ccm = new PoolingHttpClientConnectionManager(registry);
ccm.setMaxTotal(BaseConstant.CONNECTION_POOL_SIZE); // For Example : CONNECTION_POOL_SIZE = 10 for 10 thread parallel execution
ccm.setDefaultMaxPerRoute(BaseConstant.CONNECTION_POOL_SIZE);
builder.setConnectionManager((HttpClientConnectionManager) ccm);
HttpClient objHttpClient = builder.build();
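A short usage sketch that fans requests out over the pooled client; the thread count, URLs, and surrounding exception handling are illustrative and assume the objHttpClient built above:

// Reuse the single pooled client from several worker threads.
ExecutorService pool = Executors.newFixedThreadPool(BaseConstant.CONNECTION_POOL_SIZE);
List<String> urls = Arrays.asList("http://api/data?type=1", "http://api/data?type=2");

List<Future<String>> results = urls.stream()
        .map(url -> pool.submit(() -> objHttpClient.execute(new HttpGet(url), new BasicResponseHandler())))
        .collect(Collectors.toList());

for (Future<String> result : results) {
    System.out.println(result.get()); // blocks until that particular response is available
}
pool.shutdown();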