My question is pretty similar to this one - Java - AsyncHttpClient - Fire and Forget - but I am using Jersey / JAX-RS in my case.
How do you configure Jersey JAX-RS asynchronous calls to achieve "fire-and-forget" behavior, where it is imperative not to block the current working thread no matter what?
For example, if there are no available threads to process the request, skip it completely and move on; do not block the calling thread.
So given this test client here:
Client client = ClientBuilder.newClient();
Future<Response> future1 = client.target("http://example.com/customers/123")
.request()
.async().get();
Cool, that works great for a GET. But what about a fire-and-forget PUT or POST? How would I change this to act more "fire-and-forget"?
client.target("http://example.com/customers/123")
.request()
.async().put(Entity.json(myCustomer));
A fire-and-forget client could be configured in many ways: for example, it could buffer requests into an in-memory queue up to a configurable size and simply start discarding new entries once the queue is full.
Another example would be N worker threads, and if they are all busy you just drop the HTTP request.
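Something along these lines is roughly what I have in mind (a rough sketch; I'm assuming the JAX-RS 2.1 ClientBuilder.executorService method here, and the pool size, queue size, and DiscardPolicy are just placeholders for whatever limits make sense):
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

// Bounded pool: at most 4 workers and 1000 queued requests; anything beyond
// that is silently dropped instead of blocking the calling thread.
ExecutorService fireAndForgetPool = new ThreadPoolExecutor(
        4, 4, 60L, TimeUnit.SECONDS,
        new ArrayBlockingQueue<>(1000),
        new ThreadPoolExecutor.DiscardPolicy());

Client client = ClientBuilder.newBuilder()
        .executorService(fireAndForgetPool) // JAX-RS 2.1
        .build();

// The returned Future is deliberately ignored.
client.target("http://example.com/customers/123")
        .request()
        .async()
        .put(Entity.json(myCustomer));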
What are the different common JAX-RS async parameters that I should configure? Any gotchas?
After spending the day learning about the Java Concurrency API, I still don't quite get how I could create the following functionality with the help of the CompletableFuture and ExecutorService classes:
When I get a request on my REST endpoint I need to:
Start an asynchronous task (includes DB query, filtering, etc.), which will give me a list of String URLs at the end
In the meantime, respond to the REST caller with HTTP OK that the request was received and I'm working on it.
When the asynchronous task is finished, I need to send HTTP requests (with the payload the REST caller gave me) to the URLs I got from the job. At most the number of URLs would be around 100, so I need these to happen in parallel.
Ideally I have some synchronized counter which counts how many of the HTTP requests were a success/failure, and I can send this information back to the REST caller (the URL I need to send it back to is provided inside the request payload).
I have the building blocks (methods like getMatchingObjectsFromDB(callerPayload), getURLs(resultOfGetMatchingObjects), sendHttpRequest(url, methodType), etc.) written already; I just can't quite figure out how to tie step 1 and step 3 together. I would use CompletableFuture.supplyAsync() for step 1, then I would need CompletableFuture.thenCompose() to start step 3, but it's not clear to me how parallelism can be done with this API. It is rather intuitive with ExecutorService executor = Executors.newWorkStealingPool(); though, which creates a thread pool based on how much processing power is available, and tasks can be submitted via the invokeAll() method.
How can I use CompletableFuture and ExecutorService together? Or how can I guarantee parallel execution of a list of tasks with CompletableFuture? A demonstrating code snippet would be much appreciated. Thanks.
You should use join() to wait for all threads to finish.
Create a Map<String, Boolean> result to store your request results.
In your controller:
public void yourControllerMethod() {
CompletableFuture.runAsync(() -> yourServiceMethod());
}
In your service:
// Execute your logic to get List<String> urls
List<CompletableFuture<Void>> futures = urls.stream()
    .map(url -> CompletableFuture.supplyAsync(() -> requestUrl(url))
        .thenAcceptAsync(requestResult -> result.put(url, requestResult)))
    .collect(toList()); // You now have a list of CompletableFutures
Then use .join() to wait for all threads (remember that your service is already executed in its own thread):
CompletableFuture.allOf(futures.toArray(new CompletableFuture[0])).join();
Then you can determine which requests succeeded or failed by accessing the result map.
Edit
Please also post your updated code so that others may understand you as well.
I've read your code and here are the needed modifications:
When this for loop was not commented out, the receiver webserver got the same request twice.
I don't understand the purpose of this for loop.
Sorry, in my previous answer I did not clean it up. That's just a temporary idea from my head that I forgot to remove at the end :D
Just remove it from your code
// allOf() only accepts arrays, so the List needed to be converted
/* The code never gets past this part (I know allOf() is a blocking call), even long after the receiver got the HTTP request
with the correct payload. I'm not sure yet where exactly the code gets stuck */
Your map should be a ConcurrentHashMap because you're modifying it concurrently later.
Map<String, Boolean> result = new ConcurrentHashMap<>();
If your code still does not work as expected, I suggest to remove the parallelStream() part.
CompletableFuture and parallelStream both use the common ForkJoinPool. I think the pool is exhausted.
And you should create your own pool for your CompletableFuture:
Executor pool = Executors.newFixedThreadPool(10);
And execute your request using that pool:
CompletableFuture.supplyAsync(yourSupplier, pool).thenAcceptAsync(yourConsumer, pool);
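Concretely, reusing the urls, result, and requestUrl names from the snippets above, that could look something like this (a sketch, not tested):
ExecutorService pool = Executors.newFixedThreadPool(10);

List<CompletableFuture<Void>> futures = urls.stream()
    .map(url -> CompletableFuture.supplyAsync(() -> requestUrl(url), pool)
        .thenAcceptAsync(success -> result.put(url, success), pool))
    .collect(Collectors.toList());

// Wait for every request to finish, then release the pool's threads.
CompletableFuture.allOf(futures.toArray(new CompletableFuture[0])).join();
pool.shutdown();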
For the sake of completeness, here are the relevant parts of the code after clean-up and testing (thanks to Mạnh Quyết Nguyễn):
Rest controller class:
@POST
@Path("publish")
public Response publishEvent(PublishEvent eventPublished) {
/*
Payload verification, etc.
*/
//First send the event to the right subscribers, then send the resulting hashmap<String url, Boolean subscriberGotTheRequest> back to the publisher
CompletableFuture.supplyAsync(() -> EventHandlerService.propagateEvent(eventPublished)).thenAccept(map -> {
if (eventPublished.getDeliveryCompleteUri() != null) {
String callbackUrl = Utility
.getUri(eventPublished.getSource().getAddress(), eventPublished.getSource().getPort(), eventPublished.getDeliveryCompleteUri(), isSecure,
false);
try {
Utility.sendRequest(callbackUrl, "POST", map);
} catch (RuntimeException e) {
log.error("Callback after event publishing failed at: " + callbackUrl);
e.printStackTrace();
}
}
});
//return OK while the event publishing happens in async
return Response.status(Status.OK).build();
}
Service class:
private static List<EventFilter> getMatchingEventFilters(PublishEvent pe) {
//query the database, filter the results based on the method argument
}
private static boolean sendRequest(String url, Event event) {
//send the HTTP request to the given URL, with the given Event payload, return true if the response is positive (status code starts with 2), false otherwise
}
static Map<String, Boolean> propagateEvent(PublishEvent eventPublished) {
// Get the event relevant filters from the DB
List<EventFilter> filters = getMatchingEventFilters(eventPublished);
// Create the URLs from the filters
List<String> urls = new ArrayList<>();
for (EventFilter filter : filters) {
String url;
try {
boolean isSecure = filter.getConsumer().getAuthenticationInfo() != null;
url = Utility.getUri(filter.getConsumer().getAddress(), filter.getPort(), filter.getNotifyUri(), isSecure, false);
} catch (ArrowheadException | NullPointerException e) {
e.printStackTrace();
continue;
}
urls.add(url);
}
Map<String, Boolean> result = new ConcurrentHashMap<>();
Stream<CompletableFuture> stream = urls.stream().map(url -> CompletableFuture.supplyAsync(() -> sendRequest(url, eventPublished.getEvent()))
.thenAcceptAsync(published -> result.put(url, published)));
CompletableFuture.allOf(stream.toArray(CompletableFuture[]::new)).join();
log.info("Event published to " + urls.size() + " subscribers.");
return result;
}
Debugging this was a bit harder than usual; sometimes the code just magically stopped. To fix this, I only put code parts into the async task that were absolutely necessary, and I made sure the code in the task only used thread-safe constructs. Also, I was a dumb-dumb at first: my methods inside EventHandlerService used the synchronized keyword, which resulted in the CompletableFuture inside the service method never executing, since it runs on a thread pool by default.
A piece of logic marked with synchronized becomes a synchronized block, allowing only one thread to execute it at any given time.
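To illustrate that pitfall with a contrived sketch (a hypothetical class, not the actual service code): if the asynchronous task needs a lock that the waiting thread still holds, the whole thing just hangs.
import java.util.concurrent.CompletableFuture;

class EventService {

    static synchronized void propagate() {
        // schedules sendOne() on the common pool, then waits for it
        CompletableFuture<Void> task = CompletableFuture.runAsync(EventService::sendOne);
        task.join(); // hangs: sendOne() needs the class lock this thread still holds
    }

    static synchronized void sendOne() {
        // send a single HTTP request
    }
}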
I have the following code:
List<ObjectA> allObjects = (List<ObjectA>) objArepository.findAll();
for (ObjectA objA : allObjects) {
    String location = objA.getUrl();
    Client client = utils.createClient();
    WebTarget webTarget = client.target(location).path("/testUrl/" + someString);
    Invocation.Builder requestBuilder = webTarget.request();
    Response response;
    try {
        response = requestBuilder.invoke();
    } catch (ProcessingException e) {
        // handle or log the failed call
    }
}
Instead of the for loop, which sends those calls serially, I would like to send all those calls in parallel. The problem is that I didn't find any examples for that, and I am missing an idea of how to do it in Java.
Use ExecutorService.
executorService.invokeAll can execute a list of tasks in parallel and wait for them to complete.
ExecutorService executor = getExecutorService();
List<Request> requests = getRequests();
List<Callable> tasks = requests.stream()
.map(r -> new Processor(r))
.collect(Collectors.toList());
executor.invokeAll(tasks);
If you need asynchronous calls, use executorService.submit or executorService.execute
Update
According to the comment, I'll add a few more words about the code above.
getExecutorService() returns an ExecutorService created elsewhere, maybe a singleton, since the creation of an ExecutorService is quite expensive.
getRequests() returns a list of requests; Request can be anything you want to process, such as ObjectA in the question.
executorService.invokeAll accepts a list of Callables, so you have to wrap your requests in Callables. Processor is a Callable that processes a Request.
Actually, I think the code is quite descriptive and an ordinary Java programmer can understand it.
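For completeness, here is a self-contained sketch of the same idea wired to the Jersey calls from the question (the Processor class, the pool size, and the hard-coded path segment are my assumptions):
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.stream.Collectors;
import javax.ws.rs.client.Client;
import javax.ws.rs.client.ClientBuilder;
import javax.ws.rs.core.Response;

// Hypothetical Callable that performs one HTTP call and returns its Response.
class Processor implements Callable<Response> {
    private final String location;

    Processor(String location) {
        this.location = location;
    }

    @Override
    public Response call() {
        Client client = ClientBuilder.newClient();
        return client.target(location).path("/testUrl/someString").request().get();
    }
}

// In the calling code (e.g. inside your service class):
static void callAllInParallel(List<ObjectA> allObjects) throws InterruptedException {
    ExecutorService executor = Executors.newFixedThreadPool(8); // ideally a shared, reused pool

    List<Callable<Response>> tasks = allObjects.stream()
            .map(objA -> new Processor(objA.getUrl()))
            .collect(Collectors.toList());

    // Runs the tasks in parallel and blocks until every one has completed.
    List<Future<Response>> responses = executor.invokeAll(tasks);
    executor.shutdown();
}
If you do not want to wait at all, submit each task individually with executor.submit(task) instead of invokeAll.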
I have an observable that:
emits data after a few seconds.
can be triggered several times.
the operation can't be executed in parallel. So we need a buffer.
I understand that this isn't clear, so let me explain with an example:
Observable<IPing> pingObservable = Observable.defer(() ->
new PingCommand(account, folders)
.post()
.asObservable()
);
This is the main feature. It shouldn't be called again while a previous call is executing, but it should remember that the user requested it again. So I created a close buffer as a PublishSubject:
closeBuffer = PublishSubject.create();
Now I'm wondering how to merge them.
I have tried this:
Observable.defer(() -> new PingCommand(account, folders)
.post()
.asObservable()
.buffer(() -> closeBuffer)
.flatMap(Observable::from)
.first()
);
But it is not working as I want.
Edit:
I will try to explain that better:
I'm sending a POST to the server, and we can wait for a response for several MINUTES (because it is Exchange ActiveSync PUSH). I cannot ping again while one request is in flight, so I have to wait until that request is done. I don't need to buffer those observables, just the information that a user requested a ping, and then send the request once the first one is done. I'm just learning reactive programming, so I don't really know how to use complicated features like backpressure.
This is how I want this problem to be solved (pseudo code)
??????<Result> request
= ????.???()
.doOnNext( result -> { … })
.doOnSubscribe(() -> { … })
.doOnCompleted(() -> { … })
.…
//__________________________________________________________
Observable<Result> doAsyncWork(Data data) { … } // this is API function
//__________________________________________________________
// api usage example
Subscription s1 = doAsyncWork(someData).subscribe() // start observing async work; executed doOnSubscribe
Subscription s2 = doAsyncWork(someData).subscribe() // wait for async work result …
//__________________________________________________________
// after some time pass, maybe from other thread
Subscription s1 = doAsyncWork(someData).subscribe() // wait for async work result …
//__________________________________________________________
// async work completes, all subscribers obtain the same result; executed doOnCompleted
//__________________________________________________________
// again
Subscription s1 = doAsyncWork(someData).subscribe() // start observing async work; executed doOnSubscribe
// async work completes, subscriber obtains result; executed doOnCompleted
Obviously, I can use if instead but I want to know how to do it in a proper way.
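Something like this is roughly what I would do with a plain state check (a rough sketch, assuming RxJava 1.x; IPing and PingCommand are from my code above), but I suspect there is a more idiomatic Rx way:
import java.util.concurrent.atomic.AtomicReference;
import rx.Observable;

// At most one ping runs at a time; callers that arrive while it is in flight
// simply subscribe to the same pending result.
private final AtomicReference<Observable<IPing>> inFlight = new AtomicReference<>();

Observable<IPing> ping() {
    return Observable.defer(() -> {
        Observable<IPing> current = inFlight.get();
        if (current != null) {
            return current; // reuse the request that is already running
        }
        Observable<IPing> fresh = new PingCommand(account, folders)
                .post()
                .asObservable()
                .doOnTerminate(() -> inFlight.set(null)) // allow the next ping afterwards
                .cache(); // all subscribers share the single result
        inFlight.set(fresh);
        return fresh;
    });
}
(There is still a small window between get() and set() where two callers could both start a request; that is exactly the kind of thing I hope the "proper" Rx solution avoids.)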
I'm looking for an example like this but with a synchronous call. My program needs data from an external source and should wait until the response returns (or until a timeout).
The Play WS library is meant for asynchronous requests and this is good!
Using it ensures that your server is not going to be blocked waiting for some response (your client might be blocked, but that is a different topic).
Whenever possible you should always opt for the async WS call. Keep in mind that you still get access to the result of the WS call:
public static Promise<Result> index() {
final Promise<Result> resultPromise = WS.url(feedUrl).get().map(
new Function<WS.Response, Result>() {
public Result apply(WS.Response response) {
return ok("Feed title:" + response.asJson().findPath("title"));
}
}
);
return resultPromise;
}
You just need to handle it a bit differently - you provide a mapping function - basically you are telling Play what to do with the result when it arrives. And then you move on and let Play take care of the rest. Nice, isn't it?
Now, if you really really really want to block, then you would have to use another library to make the synchronous request. There is a sync variant of the Apache HTTP Client - https://hc.apache.org/httpcomponents-client-ga/index.html
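For illustration, a minimal blocking GET with the Apache client could look roughly like this (a sketch against the 4.x API; feedUrl is reused from the example above):
import java.io.IOException;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.util.EntityUtils;

static String fetchBlocking(String feedUrl) throws IOException {
    try (CloseableHttpClient client = HttpClients.createDefault();
         CloseableHttpResponse response = client.execute(new HttpGet(feedUrl))) {
        // Blocks the calling thread until the full response has arrived.
        return EntityUtils.toString(response.getEntity());
    }
}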
I also like the Unirest library (http://unirest.io/java.html) which actually sits on top of the Apache HTTP Client and provides a nicer and cleaner API - you can then do stuff like:
Unirest.post("http://httpbin.org/post")
.queryString("name", "Mark")
.field("last", "Polo")
.asJson()
As both are publicly available, you can add them as dependencies to your project by stating this in the build.sbt file.
Alternatively, you can just block the call and wait until you get the response, with a timeout if you want.
WS.Response response = WS.url(url)
.setHeader("Authorization","BASIC base64str")
.setContentType("application/json")
.post(requestJsonNode)
.get(20000); //20 sec
JsonNode resNode = response.asJson();
In newer versions of Play, the response does not have an asJson() method anymore. Instead, Jackson (or any other JSON mapper) must be applied to the body String:
final WSResponse r = ...;
Json.mapper().readValue(r.getBody(), Type.class);