How do applications handle asynchronous responses via callbacks in Java?

I have been doing Java for a few years, but I have not had much experience with asynchronous programming.
I am working on an application that makes SOAP web service calls to some synchronous web services, and currently my consuming application is implemented synchronously as well, i.e. my application's threads block while waiting for the response.
I am trying to learn how to handle these SOAP calls in an asynchronous way - just for the hell of it - but I have some high-level questions which I can't seem to find any answers to.
I am using CXF, but my question is not specifically about CXF or SOAP; it is higher-level, about asynchronous application architecture.
What I want to know (working through a scenario), at a high level, is:
So I have a Thread (A) running in my JVM that makes a call to a remote web service
It registers a callback method and returns a Future
Thread (A) has done its bit and gets returned to its pool once it has returned the Future
The remote web service response returns and Thread (B) gets allocated and calls the callback method (which generally populates the Future with a result I believe)
Q1. I can't get my head out of the blocking-thread model: if Thread (A) is no longer listening on that network socket, then how does the response that comes back from the remote service get allocated to Thread (B)? Is it simply treated as a new request coming into the server/container, which then allocates a thread to service it?
Q2. Closely related to Q1, I imagine: if no thread has the Future or the handler (with its callback method) on its stack, then how does the response from the remote web service get associated with the callback method it needs to call?
Or, to ask it another way, how does Thread (B) (now dealing with the response) get given a reference to the Future/callback object?
Very sorry my question is so long - and thanks to anyone who gave their time to read through it! :)

I don't see why you'd add all this complexity with asynchronous threading.
The way to design an asynchronous SOAP service:
You have one service sending out a response to a given client / clients.
Those clients work on the response given asynchronously.
When done, they would call another SOAP method to return their response.
The response would just be stored in a queue (e.g. a database table), without any extra logic. You'd have a "Worker" service working on the incoming tasks. If a response is needed again, another method on the other remote service would be called. I would store the requests as events in the database, to be handled asynchronously later by an EventHandler (a rough sketch of such a worker follows below). See
Hexagonal Architecture:
https://www.youtube.com/watch?v=fGaJHEgonKg
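As a rough sketch of that worker idea, here is what the "store the request, let a worker process it and call back" shape could look like. This is only illustrative: an in-memory queue stands in for the database table, and WorkItem, process() and notifyCaller() are made-up names, not part of any framework.

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class RequestWorker implements Runnable {

    private final BlockingQueue<WorkItem> queue = new LinkedBlockingQueue<>();

    // called by the SOAP endpoint: store the request and return immediately
    public void accept(WorkItem item) {
        queue.offer(item);
    }

    @Override
    public void run() {
        while (!Thread.currentThread().isInterrupted()) {
            try {
                WorkItem item = queue.take();   // blocks the worker thread, not the caller
                Object result = process(item);  // do the actual work
                notifyCaller(item, result);     // e.g. call the client's SOAP callback method
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
    }

    private Object process(WorkItem item) { return item; }        // placeholder
    private void notifyCaller(WorkItem item, Object result) { }   // placeholder

    public static class WorkItem { }
}

In a real system the queue would be the database table and the worker would poll it (or be driven by an event), but the split is the same: the caller only stores the request, and a separate worker does the slow part and reports back.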

Your Q1 and Q2 seem to have more to do with multithreading than they have to do with asynchronous calls.
The magic of asynchronous web service calls is that you don't have to worry about multithreading to handle blocking while waiting for a response.
It's a bit unclear from the question what the specific problem statement is (i.e., what you are hoping to have your application do while blocking, or instead of blocking), but here are a couple of ways that you could use asynchronous web service calls to allow you to do other work.
For the following cases, assume that the dispatch() method calls Dispatch.invokeAsync(T msg, AsyncHandler<T> handler) and returns a Future<?>:
1) Dispatch multiple web service requests, so that they run in parallel:
If you have multiple services to consume and they can all execute independently, dispatch them all at once and process the responses when you have received them all.
ArrayList<Future<?>> futures = new ArrayList<Future<?>>();
futures.add(serviceToConsume1.dispatch());
futures.add(serviceToConsume2.dispatch());
futures.add(serviceToConsume3.dispatch());
// now wait until all services return
for (Future<?> f : futures) {
    f.get();
}
// now use responses to continue processing
2) Polling:
Future<?> f = serviceToConsume.dispatch();
while (!f.isDone()) {
    // do other work here
}
// now use response to continue processing

Related

Long running processes inside service invoked from actor

We are using Akka actors in our application. Actors invoke services, and in some cases those services call REST APIs exposed by a third party. The third-party APIs are taking a very long time to respond, and during peak time the system throughput suffers because threads are blocked while they wait for the client API to respond.
Sometimes during peak time, because threads are waiting, messages just sit in the Akka mailbox for a long time and are only picked up once the blocked threads become available again.
I am looking for a solution whereby I can improve the throughput of the system and free up threads so that actor messages can start processing.
I am thinking of changing the REST API call from a blocking to a non-blocking call using a Future, and creating a dedicated actor for this kind of REST API invocation. The actor would periodically check whether the future has completed, then send a completion message, after which the rest of the process can continue. This way the number of blocked threads should drop and they would be available for actor message processing.
I would also like to have a separate execution context for actors which are doing blocking operations; as of now we have the global execution context.
I need further input on this.
Firstly, you are correct that you should not block when processing a message, so you need to start an asynchronous process and return immediately from the message handler. The Actor can then send another message when the process completes.
You don't say what technology you are using for the REST API calls, but libraries like Akka HTTP will return a Future when making an HTTP request, so you shouldn't need to add your own Future around it. E.g.:
def getStatus(): Future[Status] =
  Http().singleRequest(HttpRequest(uri = "http://mysite/status")).flatMap { res =>
    Unmarshal(res).to[Status]
  }
If for some reason you are using a synchronous library, wrap the call in Future(blocking{ ??? }) to notify the execution context that it needs to create a new thread. You don't need a separate execution context unless you need to control the scheduling of threads for this operation.
You also don't need to poll the Future; simply add an onComplete handler that sends an internal message back to your actor, which then processes the result of the API call.
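If you are on the Java side of Akka, the same pattern looks roughly like the sketch below. This is only an illustration using Akka's classic Java API and CompletableFuture (the Java analogue of onComplete is whenComplete); ThirdPartyClient, the message classes and the pool size are all made up. The blocking call runs on a dedicated pool and the completion callback sends the result back to the actor as an ordinary message, so the actor's thread is never blocked.

import akka.actor.AbstractActor;
import akka.actor.ActorRef;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.Executor;
import java.util.concurrent.Executors;

public class ApiCallActor extends AbstractActor {

    // dedicated pool for the blocking third-party calls, kept away from the actor dispatcher
    private static final Executor BLOCKING_IO = Executors.newFixedThreadPool(16);

    private final ThirdPartyClient client = new ThirdPartyClient(); // hypothetical blocking client

    @Override
    public Receive createReceive() {
        return receiveBuilder()
            .match(FetchStatus.class, this::onFetchStatus)
            .match(StatusReceived.class, msg -> { /* continue processing msg.status */ })
            .match(StatusFailed.class, msg -> { /* handle msg.cause */ })
            .build();
    }

    private void onFetchStatus(FetchStatus msg) {
        ActorRef self = getSelf();
        CompletableFuture
            .supplyAsync(client::getStatus, BLOCKING_IO)   // blocking call runs off the dispatcher
            .whenComplete((status, error) -> {             // Java analogue of Scala's onComplete
                if (error != null) self.tell(new StatusFailed(error), ActorRef.noSender());
                else self.tell(new StatusReceived(status), ActorRef.noSender());
            });
        // this handler returns immediately, so the mailbox keeps draining while the call is in flight
    }

    // hypothetical stand-in for the slow third-party REST client
    public static class ThirdPartyClient {
        public String getStatus() { return "OK"; }   // imagine a long blocking HTTP call here
    }

    // made-up message types
    public static class FetchStatus { }
    public static class StatusReceived { public final String status; public StatusReceived(String s) { status = s; } }
    public static class StatusFailed { public final Throwable cause; public StatusFailed(Throwable t) { cause = t; } }
}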

Asynchronous HTTP request in a RESTful web service

I have a Java RESTful web service implemented, and one method in that WS makes an HTTP request which takes around 3-4 minutes. I want to know if I can get any benefit from making that call asynchronous.
Could the thread be used by another request, or will it be blocked anyway by the main call?
Edit: I am making a request P to my web service A (a synchronous request only). That request is handled by thread T1. When request P calls the URL that takes 3-4 minutes, would I get any benefit from making that call (to the URL that takes 3-4 minutes) asynchronous? Benefits like thread T1 becoming able to handle new requests?
If the answer is no, then is there another benefit in making that call asynchronous?
It's not good to block an HTTP request for such a long time, because HTTP is synchronous.
Instead of blocking, it would be better to make the operation asynchronous and return 202 Accepted. For getting the result you have two choices:
polling (the client periodically polls for the result)
callback (notify the client with the help of a callback URL)
For further reading, look at this blog post: https://www.adayinthelifeof.nl/2011/06/02/asynchronous-operations-in-rest/ or Best way to create REST API for long lasting tasks?.
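As a minimal illustration of the polling variant (a sketch only: ReportResource, generateReport() and the in-memory job map are made up, and a real service would persist job state somewhere durable), the slow call is handed to an executor and the client gets 202 Accepted plus a URL it can poll:

import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import javax.ws.rs.GET;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.core.Response;

@Path("/reports")
public class ReportResource {

    private static final ExecutorService WORKERS = Executors.newFixedThreadPool(4);
    private static final Map<String, Future<String>> JOBS = new ConcurrentHashMap<>();

    @POST
    public Response startReport(String request) {
        String id = UUID.randomUUID().toString();
        Future<String> job = WORKERS.submit(() -> generateReport(request)); // the 3-4 minute call runs here
        JOBS.put(id, job);
        return Response.accepted()                       // 202 Accepted
                .header("Location", "/reports/" + id)    // where the client should poll
                .build();
    }

    @GET
    @Path("/{id}")
    public Response pollReport(@PathParam("id") String id) throws Exception {
        Future<String> job = JOBS.get(id);
        if (job == null) {
            return Response.status(404).build();
        }
        if (!job.isDone()) {
            return Response.status(202).entity("still processing").build();
        }
        return Response.ok(job.get()).build();
    }

    private String generateReport(String request) {
        return "report for " + request;   // stand-in for the slow downstream HTTP call
    }
}

The callback variant is the same shape, except that instead of the client polling /reports/{id}, the worker calls the callback URL supplied by the client when the job finishes.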

Multithreading with Jersey

Here are two links which seem to be contradicting each other. I'd sooner trust the docs:
Link 1
Request processing on the server works by default in a synchronous processing mode
Link 2
It already is multithreaded.
My question:
Which is correct? Can it be both synchronous and multithreaded?
Why do the docs say the following?:
in cases where a resource method execution is known to take a long time to compute the result, server-side asynchronous processing model should be used
If the docs are correct, why is the default synchronous? Client-side JavaScript requests are asynchronous by default for the sake of user experience, so it would make sense for the server-side default to be asynchronous too.
If the client does not need to serve requests in a specific order, then who cares how "EXPENSIVE" the operation is? Shouldn't all operations simply be asynchronous?
Request processing on the server works by default in a synchronous processing mode
Each request is processed on a separate thread. The request is considered synchronous because that request holds up the thread until the request is finished processing.
It already is multithreaded.
Yes, the server (container) is multithreaded. For each request that comes in, a thread is taken from the thread pool and tied to that particular request.
in cases where a resource method execution is known to take a long time to compute the result, server-side asynchronous processing model should be used
Yes, so that we don't hold up the container thread. There are only so many threads in the container's thread pool to handle requests; if we hold them all up with long-running requests, the container may run out of threads and new requests will be blocked from coming in. With asynchronous processing, Jersey hands the thread back to the container and handles the request processing itself in its own thread pool until the processing is complete, then sends the response up to the container, which sends it back to the client (see the sketch at the end of this answer).
If the client does not need to serve requests in a specific order, then who cares how "EXPENSIVE" the operation is.
Not really sure what the client has to do with anything here. Or at least in the context of how you're asking the question. Sorry.
Shouldn't all operations simply be asynchronous?
Not necessarily, if all the requests are quick. You could make an argument for it, but that would require performance testing and numbers you can put up against each other to make a decision from. Every system is different.
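Here is a minimal sketch of that server-side asynchronous processing model, using JAX-RS 2 / Jersey's @Suspended AsyncResponse (the resource, the executor and expensiveComputation() are made up for illustration). The resource method returns immediately, the container thread goes back to the pool, and the suspended response is resumed from another thread when the work finishes:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.container.AsyncResponse;
import javax.ws.rs.container.Suspended;

@Path("/slow")
public class SlowResource {

    private static final ExecutorService EXECUTOR = Executors.newFixedThreadPool(10);

    @GET
    public void get(@Suspended final AsyncResponse asyncResponse) {
        EXECUTOR.submit(() -> {
            String result = expensiveComputation();  // long-running work, off the container thread
            asyncResponse.resume(result);            // completes the suspended request
        });
        // the container thread is released as soon as this method returns
    }

    private String expensiveComputation() {
        return "done";   // stand-in for a slow computation or downstream call
    }
}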

Java / Scala Future driven by a callback

Short Version:
How can I create a Promise<Result> which is completed on a trigger of a callback?
Long Version:
I am working on an application which deals with third-party SOAP services. A request from a user delegates to multiple SOAP services simultaneously, aggregates the results and sends them back to the user.
The system needs to be scalable and should allow multiple concurrent users. As each user request ends up triggering about 10 web service calls, and each call blocks for about 1 second, the system needs to be designed around non-blocking I/O.
I am using Apache CXF within Play Framework (Java) for this system. I have managed to generate the asynchronous WS client proxies and enable the async transport. What I am unable to figure out is how to return a Future to Play's thread when I have delegated to multiple web service proxies and the results will be obtained as callbacks.
Option 1: Using async method calls returning Java Future.
As described in this scala.concurrent.Future wrapper for java.util.concurrent.Future thread, there is no way to convert a Java Future to a Scala Future. The only way to get a result from the Java Future is to call Future.get(), which blocks the caller. Since CXF's generated proxies return Java Futures, this option is ruled out.
Option 2: Use Scala Future.
Since CXF generates the proxy interfaces, I am not sure if there is any way I can intervene and return a Scala Future (AFAIK Akka uses Scala Futures) instead of Java Future?
Option 3: Use the callback approach.
The async methods generated by CXF which return a Java Future also take a callback object, which I suppose will provide a callback when the result is ready. To use this approach, I would need to return a Future which waits until I receive the callback.
I think Option 3 is the most promising, although I have no idea how to return a Promise which will be completed on receiving a callback. I could possibly have a thread spinning in a while(true) loop, waiting until the result is available; then again, I don't know how I can wait like that without blocking the thread.
In a nutshell, I am trying to build a system which makes a lot of SOAP web service calls, where each call blocks for a significant time. The system could easily run out of threads when there are a lot of concurrent web service calls. I am looking for a solution based on non-blocking I/O which can allow many ongoing web service calls at the same time.
Option 3 looks good :) A couple of imports to start with...
import scala.concurrent.{Await, Promise}
import scala.concurrent.duration.Duration
and, just to illustrate the point, here's a mocked CXF API that takes the callback:
def fetch(url: String, callback: String => Unit) = {
  callback(s"results for $url")
}
Create a promise, call API with promise as callback:
val promise = Promise[String]()
fetch("http://corp/api", result => promise.success(result))
Then you can pass promise.future, which is an instance of Future, into your Play app.
To test it, you can do this:
Await.result(promise.future, Duration.Inf)
which will block awaiting the result, at which point you should see "results for http://corp/api" in the console.

Asynchronous Pages equivalent in Java for REST Services

We are writing some REST services using Jersey. Our service makes some underlying service calls which happen to be dead slow, which results in each request holding a thread for 3-4 seconds. While investigating, I came across Asynchronous Pages in .NET, which assign a thread from the thread pool to each request, return the thread to the pool once the I/O operation starts, and take a new thread when the I/O operation finishes to do the rest of the processing.
Does anything similar exist in Jersey, so that we can serve more concurrent connections instead of holding one thread for each connection until it completes? I don't want to POST a request, return a GUID and then keep polling for the status of the request from the client, since I don't control the client code.
Thanks,
GG
Take a look at the Atmosphere Framework, specifically the atmosphere-jersey module, which brings asynchronous annotations to Jersey.
Take a look at one of the samples, or read this quick tutorial. Atmosphere's Jersey module does exactly what you are looking for, without requiring you to manipulate threads or anything like that. Come to our mailing list in case you need more help.
(I am the Atmosphere Creator and Lead)
