Is it possible to take an HttpServletRequest away from its thread, release the thread (i.e. return it to the pool), but keep the underlying connection with the browser working until I get the results from a time-consuming operation (say, processing an image)? When the result is ready, another method should be called asynchronously and be given the request as well as the data as parameters.
Usually, long polling works in a blocking fashion, where the current thread is held for the duration of the request, which reduces the scalability of the server-side app in terms of concurrent connections.
Yes, you can do this with Servlet 3.0.
Below is a sample that writes an alert every 30 seconds (not tested):
@WebServlet(urlPatterns = "/async", asyncSupported = true)
public class AsyncServlet extends HttpServlet {

    private final Timer timer = new Timer("ClientNotifier");

    public void doGet(HttpServletRequest req, HttpServletResponse res) {
        // Suspend the request; respond from the timer thread 30 seconds later
        final AsyncContext aCtx = req.startAsync(req, res);
        timer.schedule(new TimerTask() {
            public void run() {
                try {
                    // read the unread alerts count
                    int unreadAlertCount = alertManager.getUnreadAlerts(username);
                    // write the unread alerts count
                    aCtx.getResponse().getWriter().print(unreadAlertCount);
                } catch (Exception e) {
                    // log the failure and fall through to complete()
                } finally {
                    aCtx.complete();
                }
            }
        }, 30000);
    }
}
Below is a sample that writes based on an event. The alertManager has to be implemented so that it notifies the AlertNotificationHandler when the client has to be alerted.
@WebServlet(urlPatterns = "/async", asyncSupported = true)
public class AsyncServlet extends HttpServlet {

    public void doGet(HttpServletRequest req, HttpServletResponse res) {
        final AsyncContext asyncCtx = req.startAsync(req, res);
        alertManager.register(new AlertNotificationHandler() {
            public void onNewAlert() { // notified on new alerts
                try {
                    int unreadAlertCount = alertManager.getUnreadAlerts();
                    ServletResponse response = asyncCtx.getResponse();
                    writeResponse(response, unreadAlertCount); // write the unread alerts count
                } catch (Exception ex) {
                    asyncCtx.complete(); // closes the response
                }
            }
        });
    }
}
Yes, it's possible using the Servlet 3.0 specification. One implementation I can recommend is the Jetty server.
I have written a servlet that calls a Java class to generate random strings. I have also written an HTML file with 1000 IFRAMEs pointing to the servlet address. If multiple requests go to a servlet, each request is processed in a separate thread, so in this case 1000 threads would be created (1000 requests). The problem is that it takes too much time to process, and it cannot handle more than about 1000 requests. It becomes even slower if I do complex calculations at the back end. What changes need to be made at the servlet level (multithreading) or at the Tomcat level (if possible) for a faster response? Any suggestions?
Servlet Code
@WebServlet("/test")
public class MyServlet extends HttpServlet {

    private static final long serialVersionUID = 1L;
    private static PrintWriter out = null;
    private UserOperation operation = null;
    private static int counter = 0;

    public MyServlet() {
        super();
        operation = new UserOperation();
    }

    protected void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
        counter++;
        out = response.getWriter();
        String output = operation.getResult();
        System.out.println(counter + ":" + output);
        out.print(output);
    }

    protected void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
        doGet(request, response);
    }
}
UserOperation.java
import java.util.Random;

public class UserOperation {

    private Random rand = null;
    private int max = 9;
    private int min = 0;
    private static final String[] RESULT = {"ONE", "TWO", "THREE", "FOUR", "FIVE", "SIX", "SEVEN", "EIGHT", "NINE", "TEN"};

    public UserOperation() {
        rand = new Random();
    }

    public String getResult() {
        int randNum = rand.nextInt((max - min) + 1) + min;
        return RESULT[randNum];
    }
}
HTML File
1000 IFRAME calls. The line below has been copied and pasted 1000 times in the HTML file:
<IFRAME src="http://localhost:8080/MultipleRequest/test" width="30px" height="30px"></IFRAME>
Try making your UserOperation class implement the java.lang.Runnable interface or extend the java.lang.Thread class for multithreading.
Also try increasing the Java heap size by adding the -Xmx parameter with an appropriate limit.
You can use an async servlet with servlet 3.0; there's an example here:
https://www.javaworld.com/article/2077995/java-concurrency/java-concurrency-asynchronous-processing-support-in-servlet-3-0.html
There's still the issue that an ongoing connection requires a port on the server. A request will take up a port regardless; whether it takes up a thread is what you can mitigate with event-driven code in an async servlet, but the port usage remains. The only way to mitigate a port shortage is to scale your application across multiple servers.
First of all, your servlet is dangerous and not thread safe. Fix that first.
@WebServlet("/test")
public class MyServlet extends HttpServlet {

    private static final long serialVersionUID = 1L;
    private final AtomicInteger counter = new AtomicInteger(0);
    private final UserOperation operation;

    public MyServlet() {
        super();
        operation = new UserOperation();
    }

    protected void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
        int count = this.counter.incrementAndGet();
        PrintWriter out = response.getWriter();
        String output = operation.getResult();
        System.out.println(count + ":" + output);
        out.print(output);
    }

    protected void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
        doGet(request, response);
    }
}
Now that it is thread safe: your assumption that 1000 threads are created is wrong. Tomcat has a default request-handling thread pool with a size of 200, so it will handle 200 requests concurrently, not 1000 as you think.
Next to that you shouldn't be using 1000 iframes for testing as your browser is limited to generally 6 concurrent requests to the server. So I actually doubt that you receive 1000 concurrent requests, but only 6 at a time from the browser. Chrome could be even limiting it to 1 at a time as it sees the same URL and tries to cache the results 1 by 1.
So in short I doubt that your servlet is slow but that your test method makes it appear to be slow! Use a proper tool like Apache JMeter to send 1000 requests to your server and then test the performance.
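As a rough sketch of what such a load test does, the harness below fires many jobs through a bounded pool of "client" threads and counts completions. This is only an illustration: JMeter remains the proper tool, the HTTP round trip is simulated here with a short sleep, and the class and method names are made up for this example.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicInteger;

public class LoadHarness {

    /**
     * Fires `requests` jobs through a pool of `concurrency` workers and
     * returns how many completed. In a real load test each job would be an
     * HTTP call to the servlet; a short sleep stands in for the round trip.
     */
    public static int runLoad(int requests, int concurrency) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(concurrency);
        AtomicInteger completed = new AtomicInteger();
        CountDownLatch done = new CountDownLatch(requests);
        for (int i = 0; i < requests; i++) {
            pool.submit(() -> {
                try {
                    Thread.sleep(2); // placeholder for the HTTP round trip
                    completed.incrementAndGet();
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                } finally {
                    done.countDown();
                }
            });
        }
        done.await();   // wait for all simulated requests to finish
        pool.shutdown();
        return completed.get();
    }
}
```

The point of the `concurrency` knob is that it mimics what a browser never does with 1000 IFRAMEs: true parallel clients.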
Alternatively, make the servlet asynchronous (Servlet 3.0) so that the work runs off the request thread:
@WebServlet(urlPatterns = {"/AsyncServlet"}, asyncSupported = true)
public class AsyncServlet extends HttpServlet {

    private final AtomicInteger counter = new AtomicInteger();
    private final UserOperation operation = new UserOperation();

    @Override
    public void doGet(HttpServletRequest request, HttpServletResponse response) {
        response.setContentType("text/html;charset=UTF-8");
        final AsyncContext acontext = request.startAsync();
        acontext.start(new Runnable() {
            public void run() {
                int count = counter.incrementAndGet();
                try {
                    HttpServletResponse res = (HttpServletResponse) acontext.getResponse();
                    PrintWriter out = res.getWriter();
                    String output = operation.getResult();
                    System.out.println(count + ":" + output);
                    out.print(output);
                    acontext.complete();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        });
    }

    public class UserOperation {

        private Random rand = null;
        private int max = 9;
        private int min = 0;
        private final String[] RESULT = {"ONE", "TWO", "THREE", "FOUR", "FIVE", "SIX", "SEVEN", "EIGHT", "NINE", "TEN"};

        public UserOperation() {
            rand = new Random();
        }

        public String getResult() {
            int randNum = rand.nextInt((max - min) + 1) + min;
            return RESULT[randNum];
        }
    }
}
I have a servlet used for a long process that takes minutes to complete. Upon receiving a request, the long process is executed inside a thread so that the response can be sent back to the client immediately, avoiding timeout issues:
public class TestServlet extends HttpServlet {

    public void doPost(HttpServletRequest req, HttpServletResponse resp) throws ServletException, IOException {
        // Thread safe code
        Thread thread = new Thread() {
            public void run() {
                try {
                    Thread.sleep(10000); // simulate long processing
                } catch (InterruptedException v) {
                }
            }
        };
        thread.start();
    }
}
This means that every time I receive a request, a new thread is created. In order not to run into the risk of attacks, I need to control how many threads are allowed. This means having a pool in the context, and implementing a fail-fast if all threads are busy.
I was looking at the Executor interface. My question is: how can I implement this thread pool executor so that it is accessible from all requests received and acts as a queue for the threads? Should I declare the executor as an instance variable in the servlet, as shown below?
public class TestServlet extends HttpServlet {

    // non-thread safe variables
    // declare executor here

    public void doPost(HttpServletRequest req, HttpServletResponse resp) throws ServletException, IOException {
        // instantiate executor in case it is null
        Thread thread = new Thread() {
            public void run() {
                try {
                    Thread.sleep(10000); // simulate long processing
                } catch (InterruptedException v) {
                }
            }
        };
        // add thread to the executor
    }
}
Or is it possible to declare this executor at context level?
I was also looking at the Tomcat Executor, which I believe Tomcat itself uses to manage its threads. Would it be possible to add these threads to that executor as well?
Usually doing explicit thread management in an app server is a bad idea. You could set up the servlet to run in a new thread itself, and thus avoid farming things out to another thread inside the servlet. I haven't looked up whether Tomcat lets you configure the maximum number of simultaneous instances of a servlet allowed, so that might remain an issue.
If you do explicitly use Thread.sleep(), don't swallow the InterruptedException like that; it's the wrong thing to do. Look up the right approach (handle it and re-interrupt the thread).
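To make the two points above concrete, here is one way a shared, bounded pool could look. This is a sketch, not the only design: the class name, pool sizes, and queue length are illustrative, and in a real servlet container you would typically create and shut this down from a ServletContextListener rather than a static holder. Saturation fails fast with a RejectedExecutionException thanks to AbortPolicy, and the worker restores the interrupt flag instead of swallowing it.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Future;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

/**
 * A bounded pool shared by all requests: at most 4 worker threads and a
 * backlog of 10 waiting jobs; anything beyond that is rejected immediately.
 */
public class BoundedWorkPool {

    private static final ExecutorService POOL = new ThreadPoolExecutor(
            4, 4,                                     // fixed pool of 4 threads
            0L, TimeUnit.MILLISECONDS,
            new ArrayBlockingQueue<Runnable>(10),     // bounded backlog
            new ThreadPoolExecutor.AbortPolicy());    // fail fast when full

    /** Throws RejectedExecutionException when the pool and queue are full. */
    public static Future<?> submit(Runnable job) {
        return POOL.submit(job);
    }

    public static void shutdown() {
        POOL.shutdown();
    }

    /** Example worker body: sleep, but re-interrupt instead of swallowing. */
    public static void simulateLongProcessing() {
        try {
            Thread.sleep(10000);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt(); // restore the interrupt flag
        }
    }
}
```

A doPost would then call `BoundedWorkPool.submit(...)` and translate a rejection into an HTTP 503, which gives the fail-fast behaviour the question asks for.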
The code below is a Servlet 3.1 non-blocking IO demo:
UploadServlet:
@WebServlet(name = "UploadServlet", urlPatterns = {"/UploadServlet"}, asyncSupported = true)
public class UploadServlet extends HttpServlet {

    protected void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
        AsyncContext context = request.startAsync();
        // set up async listener
        context.addListener(new AsyncListener() {
            public void onComplete(AsyncEvent event) throws IOException {
                event.getSuppliedResponse().getOutputStream().print("Complete");
            }

            public void onError(AsyncEvent event) {
                System.out.println(event.getThrowable());
            }

            public void onStartAsync(AsyncEvent event) {
            }

            public void onTimeout(AsyncEvent event) {
                System.out.println("my asyncListener.onTimeout");
            }
        });
        ServletInputStream input = request.getInputStream();
        ReadListener readListener = new ReadListenerImpl(input, response, context);
        input.setReadListener(readListener);
    }

    protected void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
    }
}
ReadListenerImpl:
public class ReadListenerImpl implements ReadListener {

    private ServletInputStream input = null;
    private HttpServletResponse res = null;
    private AsyncContext ac = null;
    private Queue<String> queue = new LinkedBlockingQueue<String>();

    ReadListenerImpl(ServletInputStream in, HttpServletResponse r, AsyncContext c) {
        input = in;
        res = r;
        ac = c;
    }

    public void onDataAvailable() throws IOException {
        System.out.println("Data is available");
        StringBuilder sb = new StringBuilder();
        int len = -1;
        byte[] b = new byte[1024];
        // must check isReady() before every read
        while (input.isReady() && (len = input.read(b)) != -1) {
            String data = new String(b, 0, len);
            sb.append(data);
        }
        queue.add(sb.toString());
    }

    public void onAllDataRead() throws IOException {
        System.out.println("Data is all read");
        // now all data is read, set up a WriteListener to write
        ServletOutputStream output = res.getOutputStream();
        WriteListener writeListener = new WriteListenerImpl(output, queue, ac);
        output.setWriteListener(writeListener);
    }

    public void onError(final Throwable t) {
        ac.complete();
        t.printStackTrace();
    }
}
WriteListenerImpl:
public class WriteListenerImpl implements WriteListener {

    private ServletOutputStream output = null;
    private Queue<String> queue = null;
    private AsyncContext context = null;

    WriteListenerImpl(ServletOutputStream sos, Queue<String> q, AsyncContext c) {
        output = sos;
        queue = q;
        context = c;
    }

    public void onWritePossible() throws IOException {
        // must check isReady() before every write
        while (queue.peek() != null && output.isReady()) {
            String data = queue.poll();
            output.print(data);
        }
        if (queue.peek() == null) {
            context.complete();
        }
    }

    public void onError(final Throwable t) {
        context.complete();
        t.printStackTrace();
    }
}
The code above works fine. I want to know how it differs from a blocking IO servlet, and how the code above actually works.
Reading input data:
In the blocking scenario when you read data from the input stream each read blocks until data is available. This could be a long time for a remote client sending large data which means the thread is held for a long time.
For example consider inbound data being received over 2 minutes at regular intervals in 13 chunks. In blocking read you read the first chunk, hold the thread for ~10 seconds, read the next chunk, hold the thread for ~10 seconds etc. In this case the thread might spend less than a second actually processing data and almost 120 seconds blocked waiting for data. Then if you have a server with 10 threads you can see that you would have a throughput of 10 clients every 2 minutes.
In the non-blocking scenario the ReadListener reads data while isReady() returns true (it must check isReady() before each read), but when isReady() returns false the ReadListener returns and the thread is relinquished. Then, when more data arrives, onDataAvailable() is called and the ReadListener reads again until isReady() returns false.
In the same example, this time the thread reads the data and returns, is woken up 10 seconds later, reads the next data and returns, is woken up 10 seconds later, reads data and returns, and so on. This time, while it has still taken 2 minutes to read the data, the thread(s) needed to do this were only active for less than a second and were available for other work. So while the specific request still takes 2 minutes, the server with 10 threads can now process many more requests every 2 minutes.
Sending response data:
The scenario is similar for sending data and is useful when sending large responses. For example sending a large response in 13 chunks may take 2 minutes to send in the blocking scenario because the client takes 10 seconds to acknowledge receipt of each chunk and the thread is held while waiting. However in the non-blocking scenario the thread is only held while sending the data and not while waiting to be able to send again. So, again for the particular client the response is not sent any more quickly but the thread is held for a fraction of the time and the throughput of the server which processes the request can increase significantly.
So the examples here are contrived but used to illustrate a point. The key being that non-blocking i/o does not make a single request any faster than with blocking i/o, but increases server throughput when the application can read input data faster than the client can send it and/or send response data faster than the client can receive it.
I have a Java web application that is expected to have many users placing a heavy load on it. At the same time, there are scheduled tasks that require a lot of processing, and I was looking for an automated way to start and pause this work according to the load from web requests. Are there any ready-made solutions for this?
Use a javax.servlet.Filter with a static counter, incremented and decremented by the filter. This way you know the current load (= number of requests being processed currently)
Use #Startup with #Singleton and #Schedule to have a regular task (or any other scheduler like Quartz), for example every 5 minutes
In that task, check the load. If it's low enough, start the real task(s).
You could monitor the current load in running tasks as well, and pause or exit, for example.
This works, if the real task is processing the contents of a queue for example.
Otherwise you may have to do some book-keeping if the frequency of the first task is higher than that of the real task, or you have to make sure that the real task runs only once per day.
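That once-per-day book-keeping can be a simple compare-and-set on the last run date. The sketch below is illustrative (the class and method names are made up); the CAS guarantees that even if the scheduled check fires many times, only one caller "wins" for a given day.

```java
import java.time.LocalDate;
import java.util.concurrent.atomic.AtomicReference;

/** Lets a task run at most once per calendar day, however often it is polled. */
public class OncePerDayGuard {

    private final AtomicReference<LocalDate> lastRun = new AtomicReference<>();

    /** Returns true exactly once per day; all later calls that day return false. */
    public boolean tryAcquire(LocalDate today) {
        LocalDate previous = lastRun.get();
        if (today.equals(previous)) {
            return false; // already ran today
        }
        // only one concurrent caller wins the CAS for a given day
        return lastRun.compareAndSet(previous, today);
    }
}
```

The scheduled method would then combine both checks: `if (LoadFilter.getLoad() < 10 && guard.tryAcquire(LocalDate.now())) { runTask(); }`.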
Example code:
The filter:
@WebFilter("/*")
public class LoadFilter implements Filter {

    private final static Logger log = Logger.getLogger(LoadFilter.class.getName());
    private final static AtomicInteger load = new AtomicInteger();

    public static int getLoad() {
        return load.get();
    }

    public void init(final FilterConfig fc) throws ServletException {
        log.info("Hello from init()");
    }

    public void doFilter(ServletRequest req, ServletResponse resp, FilterChain chain)
            throws IOException, ServletException {
        final int currentLoad = load.incrementAndGet();
        try {
            log.info("Current load (enter): " + currentLoad);
            chain.doFilter(req, resp);
        } finally {
            final int newLoad = load.decrementAndGet();
            log.info("Current load (exit): " + newLoad);
        }
    }

    public void destroy() {
        log.info("Bye from destroy()");
    }
}
and the EJB
@Singleton
@Startup
public class StartupSingleton {

    @Schedule(second = "*/10", minute = "*", hour = "*", persistent = false)
    public void execute() {
        // Check the load; if it is low, run the task(s)
        if (LoadFilter.getLoad() < 10) {
            // Run tasks
        }
    }
}
I'm working on an Android project (API level 10) which needs to send and receive http messages to/from a server.
I implemented a class named NetworkManager which provides different methods, one for each http request (e.g.: loginRequest(user pass), RegistrationRequest(user.....) ).
All these methods generate a JSON object that is passed to a method called sendMessage, which actually establishes the connection, sends the request, and receives the response (also a JSON object).
Of course network calls are time consuming, so I first decided to use an AsyncTask to display a progressDialog while the network operation is being performed.
The problem is that I need to get the response value retrieved by the background thread before executing, on the main thread, any other operation that involves the result.
At the same time I would like to make a common and reusable implementation of the AsyncTask.
E.g.: I have a login activity which shows 2 EditTexts (username, password) and a Login button. When I press the login button, a progressDialog must appear, and must be disposed once the doInBackground task completes. Of course I could do it this way:
onClick(View v) // called when the login button is pressed
{
    onPreExecute()
    {
        // Show the progress dialog
    }
    doInBackground()
    {
        // Retrieve the login response (an integer message code) using sendLoginRequest(username, password);
        // return the response
    }
    onPostExecute(int response)
    {
        // Dispose the progress dialog, then loginSuccessful ? start new activity : show error toast
    }
}
But this way I would have to implement an AsyncTask for every request I need to send, which is what I would like to avoid: with N requests I would have to create N classes that extend AsyncTask.
Thank you!
What I would suggest is to use INTERFACES for handling the response of the HTTP request.
The background thread, be it an AsyncTask or a plain Thread, needs to handle both:
response
exception
Think of it like this:
MainThread - Hey BackgroundThread, do this operation and let me know when you are done.
MainThread - OK, while BackgroundThread executes its operation, let me show a progress dialog.
BackGroundThread - I am done with my work. Hey MainThread, here, catch your response or exception.
MainThread - Let me stop showing the progress bar.
So we need to simulate this callback mechanism in code, and also make sure we implement a reusable architecture.
Something like this:
Define an interface:
public interface HttpRequestResponse {
    public void onSuccess(HttpResponse response);
    public void onException(Exception exception);
}
class HttpRequestResponseHandler {

    private ActionItem action;
    private HttpRequestResponse hrr;
    private Executor executor;

    public enum ActionItem {
        LOGIN_REQUEST,
        REGISTRATION_REQUEST
    }

    public HttpRequestResponseHandler(ActionItem action, HttpRequestResponse hrr) {
        this.action = action;
        this.hrr = hrr;
    }

    public void execute() {
        executor = new Executor();
        executor.execute();
    }

    private class Executor extends AsyncTask<Void, Void, Void> {
        @Override
        protected Void doInBackground(Void... params) {
            switch (action) {
                case LOGIN_REQUEST:
                    doLogin();
                    break;
                case REGISTRATION_REQUEST:
                    doRegistration();
                    break;
            }
            return null;
        }
    }

    private void doLogin() {
        HttpResponse response = null;
        Exception exception = null;
        try {
            response = makeHttpRequestHere(); // the actual blocking HTTP call
        } catch (Exception e) {
            exception = e;
        }
        if (exception != null) {
            hrr.onException(exception);
        } else {
            hrr.onSuccess(response);
        }
    }
}
Now, somewhere in your activity code, do this:
HttpRequestResponse hrr = new HttpRequestResponse() {
    @Override
    public void onSuccess(HttpResponse response) {
        hideProgressDialog();
        handleResponse(response);
    }

    @Override
    public void onException(Exception exception) {
        hideProgressDialog();
        showErrorDialog(exception.getMessage());
    }
};

HttpRequestResponseHandler hrrh = new HttpRequestResponseHandler(ActionItem.LOGIN_REQUEST, hrr);
hrrh.execute();
showProgressDialog();
Hope all this leads to what you want. It's been a long answer and it took quite an effort to figure out. :)
Why not just use AsyncTask.THREAD_POOL_EXECUTOR.execute(Runnable run)?
It wraps a thread-pool-based executor with a parallelism level of #cores + 1.
Then you can simply invoke:
AsyncTask.THREAD_POOL_EXECUTOR.execute(new Runnable() {
    public void run() {
        doLogin();
    }
});
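For completeness, the same hand-off can be done without AsyncTask at all, using a plain java.util.concurrent pool sized the same way. This is a sketch: the class name and the Callable passed in are illustrative, and on Android the result would still have to be posted back to the main thread (e.g. via a Handler) before touching any views.

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class LoginExecutor {

    // Same parallelism the answer above describes: #cores + 1
    private static final ExecutorService POOL =
            Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors() + 1);

    /** Runs the blocking network call off the main thread and returns a Future. */
    public static <T> Future<T> submit(Callable<T> networkCall) {
        return POOL.submit(networkCall);
    }

    public static void shutdown() {
        POOL.shutdown();
    }
}
```

Usage mirrors the snippet above: `LoginExecutor.submit(() -> doLogin())` instead of handing a Runnable to AsyncTask's pool.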