I have a question about multithreading in JBoss 7. My situation is more or less similar to the one described here: I have a piece of code that runs in many threads, and now I have to move it to a JBoss server. The only difference is that I don't use EJB, just Spring + Hibernate. So my question is: should I follow the same steps as in the linked answer, or is there another way to create multiple threads? (While exploring JBoss I saw a place to configure thread factories and thread pools, and I don't know whether that is something I can use in my app.)
No, not really. It's "illegal" to create your own threads under Java EE and the Servlet specification. The link you pointed to describes the easiest way to do threading under a web/Java EE container: use @Asynchronous. Note that you can return a Future from such a method if you want to wait for the results.
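A minimal sketch of what that looks like on a session bean (bean and method names here are only illustrative):

```java
import java.util.concurrent.Future;
import javax.ejb.AsyncResult;
import javax.ejb.Asynchronous;
import javax.ejb.Stateless;

@Stateless
public class ReportBean {

    // The container runs this on one of its own managed threads;
    // the caller gets a Future back immediately.
    @Asynchronous
    public Future<String> generateReport(long reportId) {
        String result = doExpensiveWork(reportId); // placeholder for the real work
        return new AsyncResult<String>(result);
    }

    private String doExpensiveWork(long reportId) {
        return "report-" + reportId;
    }
}
```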
Related
I am new to J2EE and I am working on a couple of tasks. One of them is:
I have a web application that works like a reporting toolbox, hosted on Apache Tomcat 7. I need a heavyweight job to be scheduled to run every hour or at other intervals. I googled and found Apache Sling, which is a kind of separate application server for content-centric applications. I want to know whether there is another solution that can be done within Apache Tomcat or not.
It's also important that the solution be standard and reliable.
There's ScheduledExecutorService, which is part of the standard Java API. See the new*Scheduled* factory methods in Executors.
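For an hourly job, a minimal sketch (the job itself is a placeholder) could be:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class HourlyReportRunner {

    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

        // Run the job once an hour, starting one hour from now.
        scheduler.scheduleAtFixedRate(new Runnable() {
            @Override
            public void run() {
                runHeavyReport(); // stand-in for the real work
            }
        }, 1, 1, TimeUnit.HOURS);
    }

    private static void runHeavyReport() {
        System.out.println("running report...");
    }
}
```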
For a more heavyweight / configurable option there's Quartz. One of Quartz's nice features is its support for cron expressions.
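A rough sketch against the Quartz 2.x API, firing at the top of every hour (the job class is illustrative):

```java
import org.quartz.CronScheduleBuilder;
import org.quartz.Job;
import org.quartz.JobBuilder;
import org.quartz.JobDetail;
import org.quartz.JobExecutionContext;
import org.quartz.Scheduler;
import org.quartz.Trigger;
import org.quartz.TriggerBuilder;
import org.quartz.impl.StdSchedulerFactory;

public class ReportJob implements Job {

    @Override
    public void execute(JobExecutionContext context) {
        // heavyweight work goes here
    }

    public static void main(String[] args) throws Exception {
        JobDetail job = JobBuilder.newJob(ReportJob.class)
                .withIdentity("reportJob")
                .build();

        // "0 0 * * * ?" = at minute 0 of every hour
        Trigger trigger = TriggerBuilder.newTrigger()
                .withSchedule(CronScheduleBuilder.cronSchedule("0 0 * * * ?"))
                .build();

        Scheduler scheduler = StdSchedulerFactory.getDefaultScheduler();
        scheduler.scheduleJob(job, trigger);
        scheduler.start();
    }
}
```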
You can also use Spring Batch. Here's a link that can help you understand this framework.
http://projects.spring.io/spring-batch/faq.html
In case none of those packages work for you, one option would be to implement a ServletContextListener. It is a listener that is notified when your web application starts. The only drawback is that you have to manage all the scheduling yourself.
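A sketch of what that might look like on Servlet 3.0 / Tomcat 7, driving the job with a ScheduledExecutorService (listener and method names are illustrative):

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;
import javax.servlet.annotation.WebListener;

@WebListener
public class ReportSchedulerListener implements ServletContextListener {

    private ScheduledExecutorService scheduler;

    @Override
    public void contextInitialized(ServletContextEvent sce) {
        scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(new Runnable() {
            @Override
            public void run() {
                runReport();
            }
        }, 1, 1, TimeUnit.HOURS);
    }

    @Override
    public void contextDestroyed(ServletContextEvent sce) {
        // Shut the pool down so the webapp can undeploy cleanly.
        scheduler.shutdownNow();
    }

    private void runReport() {
        // heavyweight work goes here
    }
}
```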
We are currently developing a small application which needs to communicate with a machine interface via a proprietary TCP protocol.
For this low-level communication we used Netty to implement the necessary encoders and decoders. Since we also need some Java EE things like web services, JPA etc., we thought about integrating the Netty server into a Java EE 6 application. We would use an @ApplicationScoped CDI managed bean, where the bootstrapping is triggered in a @PostConstruct method and the unregistering is done in the @PreDestroy callback, roughly as sketched below.
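(Sketch only, written against the Netty 4 bootstrap API; MachineProtocolInitializer stands in for the channel initializer holding our encoders/decoders.)

```java
import io.netty.bootstrap.ServerBootstrap;
import io.netty.channel.Channel;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.nio.NioServerSocketChannel;
import javax.annotation.PostConstruct;
import javax.annotation.PreDestroy;
import javax.enterprise.context.ApplicationScoped;

@ApplicationScoped
public class NettyServerLifecycle {

    private EventLoopGroup bossGroup;
    private EventLoopGroup workerGroup;
    private Channel serverChannel;

    @PostConstruct
    void start() {
        bossGroup = new NioEventLoopGroup(1);
        workerGroup = new NioEventLoopGroup();
        ServerBootstrap bootstrap = new ServerBootstrap()
                .group(bossGroup, workerGroup)
                .channel(NioServerSocketChannel.class)
                .childHandler(new MachineProtocolInitializer()); // our encoders/decoders
        serverChannel = bootstrap.bind(9000).syncUninterruptibly().channel();
    }

    @PreDestroy
    void stop() {
        if (serverChannel != null) {
            serverChannel.close().syncUninterruptibly();
        }
        workerGroup.shutdownGracefully();
        bossGroup.shutdownGracefully();
    }
}
```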
So the main question is:
Would this lead to problems, since as far as I know it is technically not allowed to start threads in a Java EE environment (I think Netty starts some threads here)?
If yes, what kind of problems? Since we don't need clustering, we would just use a standard Java EE 6 app server like GlassFish.
Most people will recommend against it, since improper termination and resource lock-ups can lead to catastrophic results. However, if you know what you're doing, there is no reason not to.
That said, based on what you need it for, I would recommend looking into the Java Connector Architecture (JCA) first. It already provides established contracts for connection, transaction, security, life-cycle, and work management. So you have a much better chance of writing a good implementation, and you transfer thread management to the container. See this and this to get you started.
I am considering using Java 6's embedded HTTP server for some sort of IPC with a Java daemon. It works pretty well, and it's nice that it's already bundled with all Java 6 installations. No need for additional libraries.
However, I would like to know if someone has tried this in production environments under heavy load. Does it perform well? Should I be looking for something more robust such as Tomcat or Jetty?
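For reference, the kind of usage I have in mind is roughly this (the handler and path are just an example):

```java
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpHandler;
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;

public class DaemonHttpEndpoint {

    public static void main(String[] args) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(8980), 0);
        server.createContext("/status", new HttpHandler() {
            @Override
            public void handle(HttpExchange exchange) throws IOException {
                byte[] body = "ok".getBytes("UTF-8");
                exchange.sendResponseHeaders(200, body.length);
                OutputStream out = exchange.getResponseBody();
                out.write(body);
                out.close();
            }
        });
        server.start(); // setExecutor() can be called first to supply a thread pool
    }
}
```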
Well, as much as it saddens me to say bad things about Java, I'd really not recommend it for production use, or any kind of heavy-use scenario. Even though it works well for small stuff like unit/integration tests, it has big memory issues when it is used intensively, especially when you use it for a large number of connections at once. I've had similar issues to the ones described here:
http://neopatel.blogspot.com/2010/05/java-comsunnethttpserverhttpserver.html
And Jetty is not that good for heavy production usage for pretty much the same reason. I'd go with Tomcat if I were you.
As an alternative, I believe you could consider the Java Message Service (JMS) for inter-process communication and just have a JMS broker running (like ActiveMQ).
If you want something that ships with Java, have a look at RMI or RMI/IIOP.
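A bare-bones RMI sketch (the interface and names here are only illustrative; the interface would normally live in its own file shared with clients):

```java
import java.rmi.Remote;
import java.rmi.RemoteException;
import java.rmi.registry.LocateRegistry;
import java.rmi.registry.Registry;
import java.rmi.server.UnicastRemoteObject;

// Shared between the daemon and its clients.
interface DaemonControl extends Remote {
    String status() throws RemoteException;
}

public class DaemonControlServer implements DaemonControl {

    @Override
    public String status() {
        return "ok";
    }

    public static void main(String[] args) throws Exception {
        DaemonControl stub = (DaemonControl) UnicastRemoteObject.exportObject(new DaemonControlServer(), 0);
        Registry registry = LocateRegistry.createRegistry(1099);
        registry.rebind("DaemonControl", stub);
    }
}
```

A client then calls LocateRegistry.getRegistry(host).lookup("DaemonControl") and invokes the interface as if it were local.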
I'm a long-time client-side (Swing) developer and I operated pretty much by myself in the same job for a long time. Working from home in a vacuum, I was pretty much completely isolated from the community. I recently took a position as a server-side Java guy for a startup, and I'm learning a ton of stuff but I'm the only Java person and am pretty much on my own again. Having never done server-side Java before, so much of this stuff is completely new and I feel like I have no idea what the normal best-practices are, or I don't have an intuitive feel for what tools to use for what jobs. I keep reading and reading various Internet sources (SO is awesome!) trying to bulk up my knowledge, but some things seem hard to search for because they don't have any obvious keywords. Hopefully some of you gurus here can point me in the right direction.
I'm in charge of implementing our backend REST service, which for now supports our website and an iPhone app. We're doing a social media site, eventually with many different clients. Currently the only clients of the service are our own website and our own iPhone app. I'm using Jersey, Spring, Tomcat, and RDS (Amazon's MySQL) on Amazon's EC2 platform. Our media storage is via S3. I've picked up all of these things pretty quickly and so far so good -- things are working fine with the website and the iPhone app. Cool.
Our next step is adding some long-running server-side processing. This processing is basically CPU-intensive stuff that doesn't involve any communication until it's done. I'm trying to figure out what the best way to handle this is. I'm thinking of using Amazon's SQS to queue up jobs in response to the REST events that should trigger them, but I can't figure out how I should handle the dequeuing and processing. I know I need some threads somewhere that take jobs off the SQS queue and process them, and then tell the REST service that the job is done. But where do these threads live?
In a plain "java -jar jobconsumer.jar" process on another EC2 instance that starts a small thread pool. Maybe use Spring to wire up this piece and start it running?
In a webapp deployed in a container like Tomcat on another EC2 instance? I don't really know what benefits I would get from this, but somehow running in a container like this seems more stable? Does this sort of container even really support long-running processing loops, or is it just good at responding to HTTP events?
Now that I write it out like that, I don't really see why I would want to use a container. It just seems like an over-complication. However, the Java community seems so centered on these types of containerized, "managed" environments that to not use a container seems somehow wrong. I feel like maybe I'm not understanding what some of the major benefits of these containers are? I mean, beyond the obvious benefits of the web-facing Servlet and JSP specs. Would any of the functionality of those specs help me out with something like this?
For a regular Java web app, you almost certainly want to be using one of the Servlet containers such as Tomcat - it takes care of accepting connections, parsing and serialising HTTP messages, JSPs, SSL, authentication, etc for you.
For a non-web app, the argument for using Tomcat (or similar) is weaker, but there are a few reasons to still consider it:
straightforward to add JSPs for querying and managing the app or add a web API in future
easy distribution of releases (one .war vs. an unholy mess of jars and config files)
hot deployment (although I've yet to see anyone using this for anything serious)
In terms of long-running processing loops, Servlet containers don't help you out beyond notifying your ServletContextListener when the app starts, so you can kick off any long-running tasks.
It's worth noting that if you're already using Spring, it's relatively easy to switch from a stand-alone app to a container using ContextLoaderListener, so it shouldn't be a problem if you decide later that you need the web stuff.
We recently faced a similar question, as we are hosting a large distributed service on EC2.
In short, we are very happy with Jetty 7 as a container. We use it for our user-facing-www, public-api, and internal-backend-api services. In some cases we use it for non-api services such as a workqueue, simply to expose a bit of status & health info for our monitoring.
The great thing about Jetty (any version) is that it can be configured in ~5 lines of code, with zero external config files etc. It's not a container specifically, but an HTTP server that you can embed, for example:
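(A minimal embedded-Jetty sketch; StatusServlet is just a placeholder for whatever servlet or handler you mount.)

```java
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.servlet.ServletContextHandler;
import org.eclipse.jetty.servlet.ServletHolder;

public class EmbeddedServer {

    public static void main(String[] args) throws Exception {
        Server server = new Server(8080);

        ServletContextHandler context = new ServletContextHandler(ServletContextHandler.SESSIONS);
        context.setContextPath("/");
        context.addServlet(new ServletHolder(new StatusServlet()), "/status"); // your servlet
        server.setHandler(context);

        server.start();
        server.join(); // block until the server is stopped
    }
}
```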
We use Guice for dependency injection, which also favors config-file-less implementations.
Long-lived Java processes are nothing to worry about - you basically bring up your servers / threads / thread pools in your main method and don't call System.exit until you want to shut down explicitly.
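In your case, the "java -jar jobconsumer.jar" option could be little more than a loop like this (sketched against the AWS SDK for Java v1; the queue URL, pool size, and job handling are placeholders):

```java
import com.amazonaws.services.sqs.AmazonSQS;
import com.amazonaws.services.sqs.AmazonSQSClientBuilder;
import com.amazonaws.services.sqs.model.Message;
import com.amazonaws.services.sqs.model.ReceiveMessageRequest;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class JobConsumer {

    // placeholder queue URL
    private static final String QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/jobs";

    public static void main(String[] args) {
        final AmazonSQS sqs = AmazonSQSClientBuilder.defaultClient();
        ExecutorService workers = Executors.newFixedThreadPool(4);

        while (true) {
            ReceiveMessageRequest request = new ReceiveMessageRequest(QUEUE_URL)
                    .withWaitTimeSeconds(20)      // long polling
                    .withMaxNumberOfMessages(10);

            for (final Message message : sqs.receiveMessage(request).getMessages()) {
                workers.submit(new Runnable() {
                    @Override
                    public void run() {
                        process(message.getBody()); // the CPU-intensive work
                        // delete (acknowledge) only after successful processing
                        sqs.deleteMessage(QUEUE_URL, message.getReceiptHandle());
                    }
                });
            }
        }
    }

    private static void process(String jobPayload) {
        // ...
    }
}
```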
I need to scale calls into Tomcat and it's been suggested to launch threads internally. Has anyone needed to do this and, if so, what solutions did they come up with?
Creating your own threads inside an application server is generally discouraged because the server should manage threads for better scalability. You can also run into problems if the container makes assumptions about what's available in a thread context, such as security information (e.g., authenticated Subject). That typically happens if you spawn a thread and then use a server resource from that thread which is unknown to the container.
Check to see if there is a way to get container-managed threads from Tomcat. WebLogic and WebSphere support the CommonJ WorkManager, which allows you to schedule work on container-managed threads. Spring can also use CommonJ, but I'm not sure if that support is available on Tomcat.
You shouldn't really launch threads from within your webapp unless you have a very specific need to do so. Without more details on your problem it is hard to tell if this is the right approach to solve your problem.
You might want to take a look at Quartz, which "is a full-featured, open source job scheduling system that can be integrated with, or used along side virtually any J2EE or J2SE application".
Your question is a bit vague. Tomcat itself already uses a thread pool to service HTTP requests. You can increase the number of threads through Tomcat configuration - look to the Tomcat wiki for info on this.
If you mean that in your code you want to launch threads, then I advise perusing the java.util.concurrent API introduced in Java 5. Also read "Java Concurrency in Practice", which is the text on this subject.
What is the problem you are trying to solve with threads?
If you have long-running tasks you should use JMS + a full Java EE container.
If you are trying to handle excess load, you could consider two Tomcat instances; however, if you are using HTTP sessions you will need to investigate session replication.
If you are forced to use Tomcat, consider using the Executors framework in java.util.concurrent.
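For instance, a minimal sketch (class and method names are placeholders; the key points are a bounded pool and an explicit shutdown):

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class WorkOffloader {

    // Size the pool deliberately; an unbounded number of threads defeats the purpose.
    private final ExecutorService pool = Executors.newFixedThreadPool(4);

    public Future<String> submitJob(final String input) {
        return pool.submit(new Callable<String>() {
            @Override
            public String call() {
                return expensiveComputation(input);
            }
        });
    }

    public void shutdown() {
        pool.shutdown(); // call this when the webapp is undeployed
    }

    private String expensiveComputation(String input) {
        return input.toUpperCase();
    }
}
```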
As others have asked, you should give more details as to what you're trying to accomplish.
Otherwise: Tomcat uses thread pools, so increase the number of threads in the pool. Use a newer version of Tomcat -- 6.x. Use Java 6.0_10. If needed, tune the application using a profiler and adjust the JVM settings.
The J2EE abstraction for managed multithreading is JCA. In particular, take a look at the WorkManager and Work classes. See also this article. Spring also provides a JCA-backed work manager abstraction.
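On the Spring side, the idea is to code against the TaskExecutor abstraction and let configuration decide whether a container-backed WorkManager or a plain thread pool sits behind it. A minimal sketch (class names are illustrative; the exact WorkManager-backed executor class varies by Spring version, so treat the wiring as an assumption):

```java
import org.springframework.core.task.TaskExecutor;

public class ReportService {

    private final TaskExecutor taskExecutor;

    // Inject a WorkManager-backed TaskExecutor inside a full app server,
    // or a plain thread-pool executor when running stand-alone.
    public ReportService(TaskExecutor taskExecutor) {
        this.taskExecutor = taskExecutor;
    }

    public void generateAsync(final long reportId) {
        taskExecutor.execute(new Runnable() {
            @Override
            public void run() {
                runReport(reportId);
            }
        });
    }

    private void runReport(long reportId) {
        // long-running work
    }
}
```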