I am uploading a file to a web server using a file upload API. It works well for a single user, but if multiple users upload files simultaneously, how can I improve my code using threads?
What type of web server are you using? Typically, web servers process separate requests on separate threads, so you shouldn't have to do anything special; your web service code will be inherently multi-threaded.
According to the Servlet API, each request is processed by its own thread, so you shouldn't have any issues.
However, if you're trying to maximize the number of users your server can service, you might take a look at the advanced connectors for Tomcat or whatever container you are using:
http://tomcat.apache.org/tomcat-6.0-doc/aio.html
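For reference, here is a minimal sketch of such an upload handler using the Servlet 3.0 multipart API; the container runs each request on its own worker thread, so no explicit threading is needed in your code. The URL pattern, form field name, and target directory are assumptions for illustration only.

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.annotation.MultipartConfig;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.Part;

// Hypothetical upload servlet: the container dispatches every request on its own worker thread,
// so concurrent users are handled without any explicit thread management here.
@WebServlet("/upload")
@MultipartConfig(location = "/tmp/uploads") // target directory is an assumption
public class FileUploadServlet extends HttpServlet {

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        Part filePart = req.getPart("file"); // form field named "file" is an assumption
        // Use a unique name so simultaneous uploads never overwrite each other.
        filePart.write("upload_" + System.nanoTime() + ".bin");
        resp.getWriter().println("upload complete");
    }
}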
I am building an application that plays Opus audio files stored on a NAS mounted on the application server. The application reads these files and sends them to the HTTP response output stream.
I am using Spring MVC and the web application runs on Tomcat.
I get the audio files using:
<mvc:resources mapping="/audios/**" location="file:/path/to/the/mounted/nas"/>
and I serve them in an <audio> tag.
Now I would like to make the application very responsive and multi-user friendly. If I have 20 people playing many audio files simultaneously, the system gets very slow and I do not know what the best approach is.
Could you please suggest how I can improve the application?
Thanks
You first have to investigate where the actual delay is happening. Is it in getting the resource from the file system within your code, or is it because Tomcat cannot withstand the load? To make sure it's not a code issue, you can log the code execution time and check whether everything is fine there. Here is one way to do that: How do I time a method's execution in Java?
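For example, a minimal timing sketch along the lines of the linked answer (the method being measured is just a placeholder for your own file-reading code):

public class Timing {
    public static void main(String[] args) throws Exception {
        long start = System.nanoTime();
        doWork();                                   // replace with the code path you suspect
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        System.out.println("doWork took " + elapsedMs + " ms");
    }

    // Placeholder for the file-system read you want to measure.
    private static void doWork() throws InterruptedException {
        Thread.sleep(250);
    }
}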
Now, if it's the Tomcat server that cannot withstand the load, which looks like the more probable issue, you might want to consider a cluster of multiple Tomcat instances behind a load-balancer configuration with the Apache web server. That way the web server will distribute the request traffic across all the running Tomcat nodes, which will reduce the latency.
You can find out how to configure load balancing at these links:
http://www.tutorialspoint.com/white-papers/load-balancing-and-scalability-via-tomcat-clusters.htm
https://tomcat.apache.org/connectors-doc/generic_howto/loadbalancers.html
The following programs exist:
1. I have a Java application which accepts bio-potential data every second or two and stores it in the database. It is a socket server that accepts this data from multiple clients and spawns a new thread to process it and store it in the DB.
2. I have a JSP page on a Tomcat server which reads historic client data from the database (stored by application 1) and displays it on the page.
The socket server program in 1. above is not running inside the Tomcat server.
The new requirement now is: display all of the human data coming in live on the JSP page.
Now the problem:
I will now need to pass the live data from the socket server (which is stand-alone) to the JSP, which runs on a Tomcat server.
Possible solutions:
APPROACH 1: Run the socket server inside Tomcat instead of stand-alone and store the frequently incoming data in a Java object, so the JSP can access this object every second and display it on a graph.
PROBLEM: The stand-alone Java application has no need to run inside a Tomcat server except for the fact that the JSP needs access to the live data. Also, I have read that this is not the best way.
APPROACH 2: Expose the stand-alone Java application as a web service and communicate with the JSP using a REST architecture.
PROBLEM: The complication of this method is that it does not have the flexibility offered by WebSockets or server-sent events (SSE) for automatically pushing the latest data. The JSP will have to keep polling for new data every second, which is also not a very good option.
I need suggestions on which is the better method for accomplishing my task. Or is there a third, better way that I have completely missed?
I have a java application which accepts bio potential data every second or two and stores it in the database
You already have the answer: just display the required data from this database on your JSP page. That will be the easiest solution.
I understand that you're trying to display realtime data, but JSP itself is not designed for realtime output; you will have a delay anyway, and because you already have the required data in the database, there is no need to transport it to the Tomcat server.
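If the page still needs to feel live, one common compromise is to poll a small JSON endpoint that reads only the most recent rows. A rough Spring MVC sketch, assuming Jackson is on the classpath, with the URL, table, and column names purely as placeholders:

import java.util.List;
import java.util.Map;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.ResponseBody;

// Hypothetical controller: the page polls /latest-readings every second or two
// and redraws the graph from the JSON it gets back.
@Controller
public class LiveDataController {

    private final JdbcTemplate jdbcTemplate;

    @Autowired
    public LiveDataController(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    @RequestMapping("/latest-readings")
    @ResponseBody
    public List<Map<String, Object>> latestReadings() {
        // Table and column names are assumptions; adjust to your schema.
        return jdbcTemplate.queryForList(
            "SELECT client_id, reading, recorded_at FROM bio_readings ORDER BY recorded_at DESC LIMIT 50");
    }
}

The JSP (or a bit of JavaScript on it) can then request /latest-readings on a short interval and redraw the graph, without the socket server ever talking to Tomcat directly.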
For my Spring-based web application, I now have the requirement to send out weekly e-mails to my application's users.
What are elegant solutions to this requirement?
Up until now, I have come up with the following possible solutions:
a dedicated cron job that I schedule to run once a week, running independently of my web application's JVM process and outside of the web application's Servlet container. This process takes care of sending out those weekly e-mails. To send personalized e-mails, it reuses domain classes (such as my User class) that I have already developed for my web application. This dedicated process accesses my application's MySQL database concurrently with the running Spring Web MVC servlet.
a scheduled mechanism inside my Spring Web MVC servlet or inside my Servlet container.
In this setup, the e-mail sending happens inside the same JVM and the same servlet container as my web-serving Spring Web MVC servlet. Maybe this setup has (irrelevant?) advantages such as "database connection pool sharing", "transaction sharing", and "class sharing" with the servlet hosted inside the same environment.
Using or not using Spring Batch, for any of the setups conceived above. I have no experience with Spring Batch right now, so I cannot judge whether it is or isn't an adequate tool for my requirement.
Maybe there are other solutions as well?
I am especially interested in answers that can give insights and guide in making an educated decision.
It is irrelevant for this particular question whether e-mails get sent with my own infrastructure or with a third party e-mail SaaS service.
From your description, the code for generating newsletters must share a common code base with your main application, so the natural solution is to develop this code within your main application. The open question is how this code gets triggered:
From CRON. You start a script from CRON that triggers the function within your application somehow. This "somehow" may be a process listening on a specific port or, quite naturally for a web application, a dedicated URL that triggers the newsletter (see the sketch after this list). Just make sure that URL can't be hit from outside, only from localhost (check the caller IP, for example). You must, however, deal with the situation where your app is down (restarting, for example) when CRON launches the script.
From within the application. For example, using Quartz. The minus is that you need to include a new library and create database tables for Quartz. The plus is that Quartz will handle the situation when a task was scheduled at a moment the application was down, because it stores information about what was launched in the DB.
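As a rough sketch of the first option, a trigger URL that refuses anything not coming from localhost; the URL, servlet name, and loopback check are assumptions to adapt to your setup:

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Hypothetical trigger servlet: cron calls e.g. "curl http://localhost:8080/app/internal/send-newsletter".
@WebServlet("/internal/send-newsletter")
public class NewsletterTriggerServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        String caller = req.getRemoteAddr();
        // Only accept requests originating from the same machine.
        if (!"127.0.0.1".equals(caller) && !"0:0:0:0:0:0:0:1".equals(caller)) {
            resp.sendError(HttpServletResponse.SC_FORBIDDEN);
            return;
        }
        // Delegate to your existing application code that builds and sends the e-mails, e.g.:
        // newsletterService.sendWeeklyNewsletter();   // hypothetical service
        resp.getWriter().println("newsletter job triggered");
    }
}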
We always use cron to fire a JMS message to a queue and have a dedicated process which consumes these messages. You can add the email contents to the message or just use the message as a trigger. The nice thing about this approach is that you can fire in a JMS message from anywhere and have multiple handlers for lots of different email scenarios. The only downside is installing a JMS broker, if you don't already have one...
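A hedged sketch of what the producing side of that setup might look like with Spring's JmsTemplate; the queue name and payload are assumptions, and the consumer would live in your dedicated mailer process:

import org.springframework.jms.core.JmsTemplate;

// Hypothetical producer: cron (or any other part of the system) drops a trigger
// message onto a queue; a separate consumer process builds and sends the e-mails.
public class NewsletterTrigger {

    private final JmsTemplate jmsTemplate;

    public NewsletterTrigger(JmsTemplate jmsTemplate) {
        this.jmsTemplate = jmsTemplate;
    }

    public void requestWeeklyNewsletter() {
        // The message can be a simple trigger or carry the e-mail content itself.
        jmsTemplate.convertAndSend("newsletter.requests", "send-weekly");
    }
}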
I am building a Spring-MVC based web application which is required to send a weekly newsletter to a small group of people. I am using Spring's built-in scheduling mechanism: http://static.springsource.org/spring/docs/3.0.x/reference/scheduling.html
Yes, in this setup the e-mail sending happens inside the same JVM and the same servlet container, and the solution is quite easy and handy to implement. I am still observing the stability and reliability of this mechanism and cannot give more feedback about it yet.
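For completeness, the kind of declaration that setup boils down to with annotation-driven scheduling (cron expression, class, and method names are only placeholders):

import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

// Hypothetical scheduled bean; requires <task:annotation-driven/> or @EnableScheduling
// in the Spring configuration.
@Component
public class WeeklyNewsletterJob {

    // Every Monday at 06:00; adjust the cron expression to taste.
    @Scheduled(cron = "0 0 6 * * MON")
    public void sendWeeklyNewsletter() {
        // Call into the same domain/service classes the web application already uses.
    }
}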
From my understanding of the Heroku platform, they allow only one dyno per app that can handle HTTP requests. Within a Java app, my goal is to have a periodic process run once or twice a day that gets information from Facebook's servers and processes it accordingly. If a background worker dyno can't handle web requests, then how should I go about writing a recurring process within the web dyno?
In this sense, "handle HTTP requests" refers to the listen side of the connection not the send side. On Heroku, an application can have a single web process* that listens for HTTP connections and many other processes that initiate / send HTTP requests (or connect to other non-HTTP systems).
*Note: You can allocate as many dynos as you want to run each process.
To run a process that periodically makes requests to an external service (like Facebook) you can use the Heroku Scheduler Add-on. Then you could either store the results in one of the many relational or NoSQL data-storage add-ons or possibly send the results to other processes via a messaging add-on like CloudAMQP.
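A rough sketch of the kind of standalone task the Scheduler add-on could invoke, for example via a command like java -cp target/classes FacebookSyncTask; the Graph API URL, token handling, and the storage step are all assumptions:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

// Hypothetical one-off task run by the Heroku Scheduler add-on once or twice a day.
public class FacebookSyncTask {

    public static void main(String[] args) throws Exception {
        // Placeholder endpoint; a real job would call the Facebook Graph API with a valid token.
        URL url = new URL("https://graph.facebook.com/me?access_token=" + System.getenv("FB_TOKEN"));
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        StringBuilder body = new StringBuilder();
        try (BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                body.append(line);
            }
        }
        // Store or forward the result here (database add-on, message queue, etc.).
        System.out.println("Fetched " + body.length() + " bytes from Facebook");
    }
}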
If the platform is limiting you, why use it?
There are other platforms that will let you perform this.
You can easily deploy Java applications to the cloud using OpenShift and then use the Java APIs with no limitations whatsoever, as far as I know.
OpenShift is also using git, same way Heroku does.
They are both PaaS, so the concept is quite the same, and I think it will be easy for you to try out OpenShift.
And you can also check Google App Engine to see if it has such limitations.
Sorry for not helping you with the specific question, I simply know that at least "one competitor" does not have such a limitation.
This question is somewhat related to our web application and has been bugging me for the last few months. We use a Linux server for the database and the application, and we have our own custom-built Java web server. If we make any change to the application's source code, we build a new jar file and replace the existing jar with the new one. Now, for the update to take effect in the live application, we just open an HTML file which contains this kind of code:
<frameset rows="100%">
  <frame src="http://mydomain.com:8001/RESTART">
</frameset>
How does opening this URL make the application use the new jar file?
The webserver is instructed to give the /RESTART URL special treatment. This can either be through a mapping to a deployed servlet, or through a hardcoded binding to a web container action.
It is very common to have URLs with special meaning (usually protected by a password) allowing for remote maintenance, but there is no common rule set. You can see snapshots of the Tomcat administration console at http://linux-sxs.org/internet_serving/c516.html
EDIT: I noticed you mentioned a "custom built web server". If this web server does not provide servlets or JSPs - in other words, does not conform to the Servlet API - you may consider raising the flag about switching to a web server which does.
The Servlet API is a de facto industry standard which allows you to cherry-pick from a wide array of web servers, from the smallest for embedded devices to the largest enterprise servers spanning multiple physical machines, without changing your code. This means that the hard work of making your application scale has been done by others. In addition, they have probably made the web server as fast as possible, and if not, you can pick another where they did.
You're sending an HTTP GET to whatever's listening on that port (presumably your web server). The servlet spec supports pre- and post-request filters, so the server may have one set up to capture this particular request and handle it in a special fashion.
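For illustration, here is a hedged sketch of how a servlet filter could capture such a special URL; the reload action itself is only a placeholder, since a custom server will have its own mechanism:

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.annotation.WebFilter;
import javax.servlet.http.HttpServletRequest;

// Hypothetical filter giving /RESTART special treatment before normal request handling.
@WebFilter("/RESTART")
public class RestartFilter implements Filter {

    @Override
    public void init(FilterConfig filterConfig) {
    }

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        String path = ((HttpServletRequest) request).getRequestURI();
        if (path.endsWith("/RESTART")) {
            // Placeholder: trigger whatever reloads the new jar, e.g. redeploy the context
            // or signal the class loader to restart.
            response.getWriter().println("restart triggered");
            return; // do not pass the request further down the chain
        }
        chain.doFilter(request, response);
    }

    @Override
    public void destroy() {
    }
}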