I'm writing a Java application that runs in Tomcat on multi-core hardware.
The application executes an algorithm and returns the answer to the user. The problem is that even when I run two requests simultaneously, the Tomcat process uses at most one CPU core.
As far as I understand, each request in Tomcat is executed in a separate thread, and the JVM should be able to run each thread on a separate CPU core.
What could be limiting the JVM or Tomcat to no more than one core?
Thanks in advance.
Are you sure that two threads are being created? You could simply print the name of the current thread as a quick test.
What happens if you run the algorithm in a standalone app?
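For the quick test, something like this minimal servlet (a sketch using the javax.servlet API; the class name and log format are just for illustration) prints which worker thread handles each request:

```java
import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Hypothetical test servlet: logs which Tomcat worker thread handles each request.
// If two simultaneous requests print different names (e.g. http-8080-1, http-8080-2),
// Tomcat is already using separate threads and the bottleneck is elsewhere.
public class ThreadNameServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        String name = Thread.currentThread().getName();
        System.out.println("Handling request on thread: " + name);
        resp.getWriter().println("Handled by thread: " + name);
    }
}
```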
All the processor management is taken care of by the server (and the operating system) itself. It is not guaranteed that two simultaneous requests will use two CPUs.
Are you executing any synchronized blocks/methods which would force serial execution? The Tomcat connector configuration in server.xml controls the request thread pool, but the default is 200 threads, IIRC.
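For instance, a pattern like the following (hypothetical names) would funnel every request through one lock, so the threads queue up and only one core stays busy at a time:

```java
// Hypothetical example of code that forces serial execution:
// every request thread blocks on the same lock, so only one
// request's computation runs at any moment.
public class AlgorithmService {

    private static final Object LOCK = new Object();

    public Result run(Input input) {
        synchronized (LOCK) {          // all request threads serialize here
            return computeExpensive(input);
        }
    }

    private Result computeExpensive(Input input) {
        // ... CPU-intensive algorithm ...
        return new Result();
    }

    // Placeholder types so the sketch is self-contained.
    public static class Input {}
    public static class Result {}
}
```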
Here is the procedure for setting up load balancing in Tomcat: http://tomcat.apache.org/tomcat-5.5-doc/balancer-howto.html
I think this will work with Tomcat 6 too, since they mention the balancer webapp ships with Tomcat 5.0 and later.
Related
Some of my servlets involve heavy processing and are time-consuming, while other servlets are simple. If the number of heavy servlet calls gets very high, the simple servlets cannot be served. So I want to limit the heavy servlets to using only up to 50% of the total CPU. Is this possible? If so, please explain how/where to configure this in Tomcat.
Encapsulate the heavy code away from the rest of the application as its own WAR,
then
create a second instance of Tomcat and configure its priority via the operating system
(Ubuntu (Linux) system priority),
or
try to set threadPriority on the Executor (thread pool) in Tomcat; a rough sketch follows.
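A sketch of the server.xml approach, assuming the heavy webapp gets its own connector; names, ports and pool sizes are placeholders, while the Executor element and its threadPriority attribute are standard Tomcat configuration:

```xml
<!-- server.xml sketch: a dedicated, low-priority thread pool for the heavy webapp's connector.
     Names, ports and sizes below are placeholders. -->
<Executor name="heavyAppExecutor"
          namePrefix="heavy-exec-"
          maxThreads="50"
          minSpareThreads="4"
          threadPriority="1"/>   <!-- java.lang.Thread.MIN_PRIORITY -->

<Connector port="8081"
           protocol="HTTP/1.1"
           executor="heavyAppExecutor"
           connectionTimeout="20000"/>
```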
When you deploy many applications to a Java application server, do those applications all run in the same JVM, i.e. the JVM that's started when the application server starts up?
Do you have the option to run each of those applications in a separate JVM? If so, why would you want to do this?
A Java application server runs in a single JVM, so every app deployed under an application server instance runs in the same VM as every other application, while each app has its own class loader.
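You can see this from inside each webapp with something like the following sketch (the class name is hypothetical; the JMX runtime name typically contains the process id):

```java
import java.lang.management.ManagementFactory;

// Call this from each deployed webapp (e.g. in a startup listener).
// All apps on the same server instance report the same JVM process name,
// but each reports a different webapp class loader.
public class DeploymentInfo {
    public static void print() {
        String jvmName = ManagementFactory.getRuntimeMXBean().getName(); // e.g. "12345@hostname"
        ClassLoader loader = DeploymentInfo.class.getClassLoader();
        System.out.println("JVM process: " + jvmName);
        System.out.println("Webapp class loader: " + loader);
    }
}
```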
Go through this question's answers; hopefully all your queries will be answered:
Why have one JVM per application?
I am afraid you can't run them in different JVMs, because the app server has to manage the objects' life cycle. That's what Java EE is all about. That is also why Java EE states that you should not create threads in your app: you want the container to take care of concurrency for you.
Of course, in a clustered environment you can have several JVMs, but the same app server + container arrangement still holds on each of them.
Yes, if the application server is not clustered.
Otherwise the applications could run on different host machines and JVMs.
I'd like to run a web container where each webapp runs in its own process (JVM). Incoming requests get forwarded by a proxy webapp running on port 80 to individual webapps, each (webapp) running on its own port in its own JVM.
This will solve three problems:
Webapps using JNI (where the JNI code changes between restarts) cannot be restarted. There is no way to guarantee that the old webapp has been garbage-collected before loading the new webapp, so when the code invokes System.loadLibrary() the JVM throws: java.lang.UnsatisfiedLinkError: Native Library x already loaded in another classloader.
Libraries leak memory every time a webapp is reloaded, eventually forcing a full server restart. Tomcat has made headway in addressing this problem but it will never be completely fixed.
Faster restarts. The mechanism I'm proposing would allow near-instant webapp restarts. We no longer have to wait for the old webapp to finish unloading, which is the slowest part.
I've posted an RFE here and here. I'd like to know what you think.
Does any existing web container do this today?
I'm closing this question because I seem to have run into a dead end: http://tomcat.10.n6.nabble.com/One-process-per-webapp-td2084881.html
As a workaround, I'm manually launching a separate Jetty instance per webapp.
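That workaround looks roughly like this with embedded Jetty (a sketch in the Jetty 9 style; ports, paths and the class name are placeholders) - one JVM per webapp, each on its own port, with the front proxy forwarding to it:

```java
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.webapp.WebAppContext;

// Launch one JVM per webapp: each process runs a single embedded Jetty
// on its own port, and the proxy webapp on port 80 forwards requests to it.
public class SingleWebAppLauncher {
    public static void main(String[] args) throws Exception {
        int port = Integer.parseInt(args[0]);       // e.g. 8081
        String warPath = args[1];                   // e.g. /opt/apps/myapp.war
        String contextPath = args[2];               // e.g. /myapp

        Server server = new Server(port);
        WebAppContext webapp = new WebAppContext();
        webapp.setWar(warPath);
        webapp.setContextPath(contextPath);
        server.setHandler(webapp);

        server.start();
        server.join();
    }
}
```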
Can't you just deploy one app per container and then use DNS entries and reverse proxies to do the exact same thing? I believe Weblogic has something like this in the form of managed domains.
No, AFAIK, none of them do, probably because Java web containers emphasize following the servlet API - which spins off a thread per http request. What you want would be a fork at the JVM level - and that simply isn't a standard Java idiom.
If I understand correctly, you are asking for the standard features of enterprise-quality servers such as IBM's WebSphere Network Deployment (disclaimer: I work for IBM), where you can distribute applications across many JVMs, and those JVMs can in fact be distributed across many physical machines.
I'm not sure that your fundamental premise is correct though. It's not necessary to restart a whole JVM in order to deploy a new version of an application. Many app servers will use a class-loader strategy that allows them to discard a version of an app and load a new one.
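The class-loader strategy amounts to giving each deployed version its own loader and dropping the old one; a rough sketch of the idea (the path and class name are hypothetical):

```java
import java.net.URL;
import java.net.URLClassLoader;

// Illustration of class-loader-based redeployment: each app version gets its
// own class loader; once the container drops all references to the old loader,
// the old version becomes eligible for GC without restarting the JVM.
public class RedeploySketch {
    public static void main(String[] args) throws Exception {
        URL[] appClasspath = { new URL("file:///opt/apps/myapp/v2/") }; // placeholder path

        URLClassLoader newVersion =
                new URLClassLoader(appClasspath, RedeploySketch.class.getClassLoader());
        Class<?> entryPoint = Class.forName("com.example.AppMain", true, newVersion); // hypothetical class
        System.out.println("Loaded " + entryPoint + " with " + entryPoint.getClassLoader());

        // Discarding the loader (and everything loaded through it) is what
        // lets the previous app version be collected.
        newVersion.close();
    }
}
```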
We have several Java web applications that need to be deployed on the same machine, on Tomcat. The web applications are not related to each other. Some of them do intensive I/O and CPU operations and consume a lot of memory.
Under the above conditions, which approach is recommended: a single Tomcat with multiple webapps, or multiple Tomcats, each running a single webapp?
If all webapps are deployed on the same Tomcat, is there a way to guarantee minimum resources per webapp, i.e. a minimum amount of memory, number of threads, etc.?
Thanks,
Arnon.
What we did at our company is run one application per instance of Tomcat. We originally started with multiple applications per instance, and occasionally one application would affect another, especially if you had to restart the Tomcat instance.
One thing that might be worth evaluating is Spring's TC Server.
http://www.springsource.com/developer/tcserver
Similar to #tjg184's experience, I would recommend running one Tomcat per application instance. If you have a decent configuration and process-management system, the incremental cost is not all that high, and it gives you the best isolation possible short of separate VMs for each Tomcat instance. You could start with a single Tomcat and some solid monitoring, and then see if you need to move to one Tomcat per app.
I am working on a web-based application which is deployed on a Tomcat server. In our local dev environment, when we start the Tomcat server it spawns only one java process, which keeps running. However, an issue has been reported in production where the CPU usage of the java process has gone up and multiple java processes have been spawned.
There is no other Java application running, so these must have been spawned by Tomcat itself. Why is there only one java process in our development environment while multiple java processes are spawned by Tomcat in production, and how can we correct it?
Regards,
Dev
Unlike Apache HTTPD, Tomcat doesn't spawn processes on its own (it uses multiple threads to serve multiple clients), so you should look elsewhere. For example, how do you deploy your application to Tomcat? Could it be something like a buggy deployment script?
Also, what other applications run on this Tomcat container?
What you see are most likely multiple threads, which the version of top or ps on the production box shows as separate entries, while the version on your local box does not.
In production you most likely face a much higher workload, so requests are served in parallel, while on the local box Tomcat gets away with fewer threads.
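If you want to confirm that, a small diagnostic along these lines (a sketch; the class name is hypothetical, and you could expose it via a servlet or JMX) will show a single JVM process with many live threads:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.RuntimeMXBean;
import java.lang.management.ThreadMXBean;

// Prints the JVM process identity and the number of live threads.
// On the production box you should see one process name but a high thread count,
// matching the extra entries that top/ps display.
public class JvmDiagnostics {
    public static void print() {
        RuntimeMXBean runtime = ManagementFactory.getRuntimeMXBean();
        ThreadMXBean threads = ManagementFactory.getThreadMXBean();
        System.out.println("JVM process: " + runtime.getName());        // e.g. "4321@prod-host"
        System.out.println("Live threads: " + threads.getThreadCount());
        System.out.println("Peak threads: " + threads.getPeakThreadCount());
    }
}
```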