Multiple java processes in Tomcat - java

I am working on a web-based application which is deployed on a Tomcat server. In our local dev environment, when we start the Tomcat server it spawns only one java process, which keeps running. However, an issue has been reported in production where the CPU usage for the java process has gone up and there are multiple java processes which have been spawned.
There is no other java application running, so these must have been spawned by Tomcat itself. Why does our development environment show only one java process while in production multiple java processes have been spawned by Tomcat, and how can we correct it?
Regards,
Dev

Unlike Apache HTTPD, Tomcat doesn't spawn processes on its own (it uses multiple threads to serve multiple clients), so you should look elsewhere. For example, how do you deploy your application to Tomcat? Could it be something like a buggy deployment script?
Also, what other applications run on this Tomcat container?

What you see are most likely multiple threads that the version of top or ps on the production box displays as separate entries, while you don't see them on the local one.
In production you most likely face a much higher workload, so requests are served in parallel, while on the local box Tomcat gets away with fewer threads.
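If you want to confirm from inside the JVM that these are threads of one process rather than separate processes, a minimal sketch along these lines (the class name is made up; any main class or servlet would do) prints the process identity and the live thread names:

    import java.lang.management.ManagementFactory;

    public class ThreadReport {
        public static void main(String[] args) {
            // A single JVM process: the runtime name usually looks like "pid@hostname"
            System.out.println("JVM process: " + ManagementFactory.getRuntimeMXBean().getName());

            // Many threads inside that one process
            for (Thread t : Thread.getAllStackTraces().keySet()) {
                System.out.println("thread: " + t.getName() + " (daemon=" + t.isDaemon() + ")");
            }
        }
    }

Running jstack against the Tomcat pid shows the same picture from the outside.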

Related

Java Application Servers and JVM

When you deploy many applications to a java application server, do those applications all run in the same JVM i.e. the JVM that's started when the application server starts up?
Do you have the option to run each of those applications in a separate JVM? If so why would you want to do this?
A java application server runs in a single JVM, so every app deployed under an application server instance runs in the same VM as every other application, while each app has its own class loader.
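A quick way to see this from inside a deployed webapp is a throwaway servlet along these lines (the servlet name is hypothetical, and it assumes the classic javax.servlet API); every webapp deployed to the same server instance reports the same JVM but a different class loader:

    import java.io.IOException;
    import java.lang.management.ManagementFactory;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class WhoAmIServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
            // Same JVM (same pid) for every webapp in this server instance...
            resp.getWriter().println("JVM: " + ManagementFactory.getRuntimeMXBean().getName());
            // ...but each webapp gets its own web-application class loader
            resp.getWriter().println("ClassLoader: " + getClass().getClassLoader());
        }
    }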
Go through this question's answers; hopefully all your queries will be answered:
Why have one JVM per application?
I am afraid you can't run them in different JVMs, because the app server has to manage the objects' life cycle. That's what JEE is all about. Also, that's why JEE states that you should not use threads in your app: you want the container to take care of the concurrency for you.
Of course, in a clustered environment you can have several JVMs, but the picture is still the same for each app server + container instance.
Yes, if the application server is not clustered.
Otherwise the applications could run on different host machines and JVMs.

Is it bad to have two webapps on a machine each on different servlet container than both on same?

I need to run two webapps on an Ubuntu VPS, but one (my own webapp) prefers to run in Tomcat and the other (a Solr web service) preferably runs in Jetty. I think running a separate server for each webapp would consume more resources (memory consumption, for example, would be higher) than running both webapps on the same server, wouldn't it?
What other downsides might there be if I run both Tomcat and Jetty on a single machine for production use?
A couple of things I can think of, some of which you've mentioned:
more resources are consumed (memory usage of two containers running is generally going to be more than one container running). Also, there could be implications for things like database connections and caching if you have two containers instead of one.
containers have to run on different ports (but there are ways to make it appear to the site visitor that they're on the same port)
don't forget that containers generally listen on more than one port, so you'll have to make sure you avoid conflicts (think of tomcat shutdown port, etc).
Having said that, I run tomcat and jetty side by side all the time on my dev machine and things work just fine. But development, not performance, is my major goal when running on my dev machine.
Running Tomcat and Jetty on the same machine is not so heavy performance-wise; I have both running (almost) all the time on my dev machine. In production I am using only Tomcat (running multiple web services in the same container, mainly my web services plus Solr) and I do not see major performance differences between the two scenarios. So I would say that it depends on your needs. Personally I prefer to simplify my production setup and avoid having multiple containers on multiple ports on my servers. Furthermore, Tomcat does seem to be the more popular Java container (http://zeroturnaround.com/rebellabs/the-great-java-application-server-debate-with-tomcat-jboss-glassfish-jetty-and-liberty-profile/), though that does not mean it is the best one for all scenarios. I personally tend to stick with one Tomcat in production.
Yes, it does consume more resources to run two instead of one, but for development purposes it shouldn't be a problem as long as your personal computer isn't terribly underpowered. Other than that, you will have to resolve any port conflicts that come up when you start the second process. The error messages should tell you which port is in conflict, and all port numbers are configurable.
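If you want to spot port conflicts before starting the second container, a small sketch like this checks whether the ports are free by trying to bind them (the port list is only an example covering Tomcat's default HTTP, shutdown and AJP ports; adjust it to whatever your two containers are configured to use):

    import java.io.IOException;
    import java.net.ServerSocket;

    public class PortCheck {
        public static void main(String[] args) {
            int[] ports = {8080, 8005, 8009}; // example: Tomcat defaults for HTTP, shutdown, AJP
            for (int port : ports) {
                try (ServerSocket socket = new ServerSocket(port)) {
                    System.out.println("port " + port + " is free");
                } catch (IOException e) {
                    System.out.println("port " + port + " is already in use");
                }
            }
        }
    }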

Running webapps in separate processes

I'd like to run a web container where each webapp runs in its own process (JVM). Incoming requests get forwarded by a proxy webapp running on port 80 to individual webapps, each (webapp) running on its own port in its own JVM.
This will solve three problems:
Webapps using JNI (where the JNI code changes between restarts) cannot be restarted. There is no way to guarantee that the old webapp has been garbage-collected before loading the new webapp, so when the code invokes System.loadLibrary() the JVM throws: java.lang.UnsatisfiedLinkError: Native Library x already loaded in another classloader.
Libraries leak memory every time a webapp is reloaded, eventually forcing a full server restart (a typical leak pattern is sketched after this list). Tomcat has made headway in addressing this problem, but it will never be completely fixed.
Faster restarts. The mechanism I'm proposing would allow near-instant webapp restarts. We no longer have to wait for the old webapp to finish unloading, which is the slowest part.
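For illustration, a typical pattern behind such leaks, sketched as a hypothetical context listener (assuming the javax.servlet API): any thread, ThreadLocal or static cache that outlives the webapp keeps the old web-application class loader reachable, so it can never be collected after a reload.

    import javax.servlet.ServletContextEvent;
    import javax.servlet.ServletContextListener;

    public class LeakyListener implements ServletContextListener {
        @Override
        public void contextInitialized(ServletContextEvent sce) {
            // A background thread started by the webapp and never stopped:
            // it keeps the webapp's class loader reachable, so on reload the
            // old class loader (and everything it loaded) cannot be collected.
            Thread poller = new Thread(() -> {
                while (true) {
                    try {
                        Thread.sleep(60_000);
                    } catch (InterruptedException e) {
                        return;
                    }
                    // ...poll some external resource...
                }
            }, "leaky-poller");
            poller.setDaemon(true);
            poller.start();
        }

        @Override
        public void contextDestroyed(ServletContextEvent sce) {
            // Missing: interrupting and joining the poller thread here.
        }
    }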
I've posted an RFE here and here. I'd like to know what you think.
Does any existing web container do this today?
I'm closing this question because I seem to have run into a dead end: http://tomcat.10.n6.nabble.com/One-process-per-webapp-td2084881.html
As a workaround, I'm manually launching a separate Jetty instance per webapp.
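For reference, that workaround can be scripted with embedded Jetty along these lines (a sketch only, assuming a Jetty 9-style API with jetty-server and jetty-webapp on the classpath; the launcher class name, war path and port argument are placeholders):

    import org.eclipse.jetty.server.Server;
    import org.eclipse.jetty.webapp.WebAppContext;

    public class SingleWebappLauncher {
        public static void main(String[] args) throws Exception {
            // One JVM per webapp: each launcher instance owns its own port
            Server server = new Server(Integer.parseInt(args[1]));

            WebAppContext webapp = new WebAppContext();
            webapp.setContextPath("/");
            webapp.setWar(args[0]); // path to this webapp's .war file

            server.setHandler(webapp);
            server.start();
            server.join();
        }
    }

A front-end proxy on port 80 can then route to each instance's port, which is essentially the setup described in the question.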
Can't you just deploy one app per container and then use DNS entries and reverse proxies to do exactly the same thing? I believe WebLogic has something like this in the form of managed domains.
No, AFAIK, none of them do, probably because Java web containers emphasize following the servlet API - which spins off a thread per http request. What you want would be a fork at the JVM level - and that simply isn't a standard Java idiom.
If I understand correctly, you are asking for standard features of enterprise-grade servers such as IBM's WebSphere Network Deployment (disclaimer: I work for IBM), where you can distribute applications across many JVMs, and those JVMs can in fact be distributed across many physical machines.
I'm not sure that your fundamental premise is correct though. It's not necessary to restart a whole JVM in order to deploy a new version of an application. Many app servers will use a class-loader strategy that allows them to discard a version of an app and load a new one.

Resource management in tomcat

We have several Java web applications that need to be deployed on the same machine, on Tomcat. The web applications are not related to each other. Some of them do intensive I/O and CPU operations and consume a lot of memory.
Under the above conditions, which approach is recommended - having a single tomcat with multiple webapps, or multiple tomcats each running a single webapp ?
If all webapps are deployed on the same Tomcat, is there a way to guarantee minimum resources per webapp, i.e. a minimum amount of memory, number of threads, etc.?
Thanks,
Arnon.
What we did at our company is run one application per instance of Tomcat. We originally started with multiple applications per instance, and it occasionally happened that one application would affect another, especially if you had to restart the Tomcat instance.
One thing that might be worth evaluating is Spring's TC Server.
http://www.springsource.com/developer/tcserver
Similar to #tjg184's experience, I would recommend running one Tomcat per application instance. If you have a decent config and process management system, the incremental cost is not all that high, and it gives you the best isolation possible short of a separate VM for each Tomcat instance. You could start with a single Tomcat and some solid monitoring and then see if you need to move to one Tomcat per app.
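One reason per-webapp guarantees are hard inside a single Tomcat: heap, threads and CPU are accounted at the JVM level, not per webapp. A throwaway context listener like this (hypothetical name, assuming the javax.servlet API) makes that visible, since every webapp in the same Tomcat logs the same JVM-wide numbers:

    import java.lang.management.ManagementFactory;
    import javax.servlet.ServletContextEvent;
    import javax.servlet.ServletContextListener;

    public class ResourceReportListener implements ServletContextListener {
        @Override
        public void contextInitialized(ServletContextEvent sce) {
            // These figures belong to the whole JVM, shared by every webapp in it
            sce.getServletContext().log("max heap (bytes): " + Runtime.getRuntime().maxMemory());
            sce.getServletContext().log("available processors: " + Runtime.getRuntime().availableProcessors());
            sce.getServletContext().log("live threads (JVM-wide): "
                    + ManagementFactory.getThreadMXBean().getThreadCount());
        }

        @Override
        public void contextDestroyed(ServletContextEvent sce) {
            // nothing to clean up
        }
    }

Running one Tomcat per app, by contrast, lets you give each JVM its own -Xmx and its own connector thread pool.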

tomcat multithreading problem

I'm writing a java application that runs in Tomcat, on a multi-core hardware.
The application executes an algorithm and returns the answer to the user. The problem is that even when I run two requests simultaneously, the tomcat process uses at most one CPU core.
As far as I understand, each request in Tomcat is executed in a separate thread, and the JVM should be able to run each thread on a separate CPU core.
What could be the problem that bounds the JVM or Tomcat to use no more than one core?
Thanks in advance.
Are you sure that two threads are being created? You could simply print the name of the thread as a quick test.
What happens if you run the algorithm in a standalone app?
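That quick test could look something like the servlet below (a hypothetical sketch, assuming the javax.servlet API); two simultaneous requests should print two different worker thread names if Tomcat really is handling them in parallel:

    import java.io.IOException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class AlgorithmServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
            // Each concurrent request should report a different worker thread
            String worker = Thread.currentThread().getName();
            System.out.println("request handled by: " + worker);
            resp.getWriter().println("handled by " + worker);
            // runAlgorithm();  // the actual CPU-bound work would go here
        }
    }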
All the processor management is taken care of by the server itself. It is not guaranteed that if you send two requests, two CPUs will be used.
Are you executing any synchronized blocks/methods which would force serial execution? The Tomcat connector configuration in server.xml controls the request thread pool, but the default is 200 threads, IIRC.
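To illustrate the synchronized point: if the algorithm funnels every request through a single shared lock, requests execute one after another no matter how many cores or worker threads are available. A contrived sketch (class and method names are made up):

    public class Algorithm {
        // A single shared lock like this serializes all requests, so only one
        // core does useful work at a time even with many Tomcat threads.
        public static synchronized long computeSerialized(long n) {
            long result = 0;
            for (long i = 0; i < n; i++) {
                result += i * i;
            }
            return result;
        }

        // Without the shared lock, concurrent requests can run on separate cores.
        public static long computeConcurrently(long n) {
            long result = 0;
            for (long i = 0; i < n; i++) {
                result += i * i;
            }
            return result;
        }
    }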
Here is the procedure to do load balancing in Tomcat: http://tomcat.apache.org/tomcat-5.5-doc/balancer-howto.html
I think this will work with Tomcat 6 too, as they mention that the balancer webapp is shipped with Tomcat 5.0 and later.
