Requests to servlet wait in queue - java

I am developing a web app built in GWT using GWT RPC. The application is designed for instant messaging, for which I use Redis messaging.
When a servlet waits on a message while subscribed to the channel in Redis, everything works as planned. But when the number of waiting requests on the server exceeds 5, the 6th request does not start to be processed and waits in a queue until one of the previous requests completes. I wasn't sure whether the problem was in Redis (I am using the Jedis library), so I tried calling sleep on the current thread directly, but it behaved the same:
public class TestServiceImpl extends RemoteServiceServlet implements TestService {

    @Override
    public void syncWait(Date time) {
        try {
            Thread.sleep(10000L);
        } catch (Exception e) {
            getLogger().error("sleep error", e);
        }
    }
}
It's not just about one particular servlet: when 5 requests are open, the 6th doesn't even load static content. I tried it on Jetty, GlassFish, and Tomcat.
I also tried changing the thread pool settings in GlassFish and set maxthread-count to 200, but it didn't help.
Could you please advise on how to increase the number of requests processed per session and per server?

For this you REALLY want to use one of the servlet Comet implementations and an NIO connector. I'm not intimately familiar with Glassfish or Jetty, but on Tomcat you can use a combination of APR (see http://tomcat.apache.org/tomcat-6.0-doc/apr.html) and advanced IO (see http://tomcat.apache.org/tomcat-6.0-doc/aio.html) to do what you want.
Please note that using Tomcat's Advanced IO is more complicated (and less well documented) than the standard Servlet 2.5 api.
Resin, Tomcat 7, and GlassFish (I believe) support the Servlet 3.0 spec, which offers similar asynchronous features; you may want to take a look at that.
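With Servlet 3.0, the usual pattern is to mark the servlet as async-capable and release the container thread while you wait for the message. Here is a minimal sketch; the 10-second sleep stands in for the Redis subscribe from the question, and the URL pattern and class name are made up:

import javax.servlet.AsyncContext;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

@WebServlet(urlPatterns = "/wait", asyncSupported = true)
public class AsyncWaitServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) {
        // Detach the request from the container thread so the thread
        // returns to the pool instead of blocking for 10 seconds.
        final AsyncContext async = req.startAsync();
        async.setTimeout(30000L);

        // The blocking wait (a sleep here, the Redis subscribe in the real app)
        // happens on a container-managed async thread, not a request thread.
        async.start(() -> {
            try {
                Thread.sleep(10000L);
                async.getResponse().getWriter().write("done");
            } catch (Exception e) {
                // log as appropriate
            } finally {
                async.complete();
            }
        });
    }
}

With this in place, long-held requests no longer consume request-processing threads, so the 6th (and 60th) client can still be served static content.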

Related

Standard way to run a Java program continuously

What is the standard (industry standard) way to keep a Java program running continuously?
Use case (real-time processing):
A continuously running Kafka producer
A continuously running Kafka consumer
A continuously running service to process a stream of objects
I found a few questions on Stack Overflow, for example:
https://stackoverflow.com/a/29930409/2653389
But my question is specifically about the industry standard way to achieve this.
First of all, there is no specified standard.
Possible options:
Java EE WEB application
Spring WEB application
Application with Spring Kafka (@KafkaListener)
A Kafka producer will potentially accept some commands. In real-life scenarios I have worked with applications that run continuously with listeners; on receiving requests they trigger jobs, batches, and so on.
This could be achieved using, for example:
A web server accepting HTTP requests
A standalone Spring application with @KafkaListener
The consumer could be a Spring application with @KafkaListener:
@KafkaListener(topics = "${some.topic}")
public void accept(Message message) {
    // process
}
A Spring application with @KafkaListener will run indefinitely by default. The listener containers created for @KafkaListener annotations are registered with an infrastructure bean of type KafkaListenerEndpointRegistry. This bean manages the containers' lifecycles; it will auto-start any containers that have autoStartup set to true. KafkaMessageListenerContainer uses a TaskExecutor to run the main KafkaConsumer loop.
See the documentation for more information.
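If you ever need to stop or restart a listener at runtime, that registry can be injected and used directly. A small sketch, assuming the @KafkaListener above was given id = "myListener" (the id value is hypothetical):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.config.KafkaListenerEndpointRegistry;
import org.springframework.kafka.listener.MessageListenerContainer;
import org.springframework.stereotype.Component;

@Component
public class ListenerLifecycle {

    @Autowired
    private KafkaListenerEndpointRegistry registry;

    public void stopListener() {
        // "myListener" must match the id attribute on the @KafkaListener annotation
        MessageListenerContainer container = registry.getListenerContainer("myListener");
        if (container != null) {
            container.stop();
        }
    }

    public void startListener() {
        MessageListenerContainer container = registry.getListenerContainer("myListener");
        if (container != null) {
            container.start();
        }
    }
}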
If you decide to go without any framework or application server, one possible solution is to create a listener in a separate thread:

import org.apache.kafka.clients.consumer.Consumer;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ConsumerListener implements Runnable {

    // properties and topics are assumed to be defined elsewhere
    private final Consumer<String, String> consumer = new KafkaConsumer<>(properties);

    @Override
    public void run() {
        try {
            consumer.subscribe(topics);
            while (true) {
                // consume
            }
        } finally {
            consumer.close();
        }
    }
}
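To actually run that listener without a framework, one minimal approach is to submit it to an executor and shut it down from a JVM shutdown hook. A rough sketch, assuming ConsumerListener is the class above (a production consumer would also call consumer.wakeup() for a clean exit):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class ConsumerMain {

    public static void main(String[] args) {
        ExecutorService executor = Executors.newSingleThreadExecutor();

        // ConsumerListener is the Runnable from the snippet above
        executor.submit(new ConsumerListener());

        // Stop the executor when the JVM is asked to terminate (Ctrl+C, kill)
        Runtime.getRuntime().addShutdownHook(new Thread(executor::shutdownNow));
    }
}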
When you start your program with "java -jar" it will run until you stop it. That is fine for simple personal use and for testing your code.
Also, on UNIX systems there is a tool called "screen" with which you can run your jar like a daemon.
The industry standard is application servers, from the simple Jetty up to enterprise WebSphere or WildFly (formerly JBoss). Application servers allow you to run an application continuously, communicate with a front end if necessary, and so on.

Website in OSGi that consumes REST web service running in background

I have spent quite a few days now trying to figure out how to add a website in OSGi.
I have a Restlet web service running with the Jetty extension, using Jetty as a connector. This provides different resources under multiple URLs.
But I would also like to have a small website running on the system that can be accessed by the user. I wanted to use some HTML, JavaScript, and CSS and present the current data status with some graphs and pictures.
I assume that since Jetty is running in the background I should be able to deploy this website on Jetty and maybe call the server resources provided by Restlet from JavaScript.
So far nothing has worked except the Restlet services.
My question is: is it possible to add a WAB bundle and expect it to work (since Jetty is running in the background)? Or is there a better way to add a website in OSGi?
Or
The only option I have now is, since it is possible to return an HTML form as a representation, to put all my JavaScript code inside the HTML form and send it as the response to a GET request (which I believe is a mess).
Everything will run on a Raspberry Pi, so I can only afford a very small footprint. I am using Equinox, Restlet 2.3.0, and Jetty 9.2.6.
I would really appreciate it if someone knows a link where I could get information on getting at least a sample page running in OSGi. I have tried many with no luck.
I recommend having a look at how it is done in Apache Karaf (https://github.com/apache/karaf). More on Apache Karaf and web containers here: http://karaf.apache.org/manual/latest/users-guide/webcontainer.html
In fact, Jetty is used by Restlet under the hood through its connector feature. Because of this, it is not convenient (and not the correct approach) to register applications dynamically against it.
That said, Restlet is really flexible and dynamic. This means you can dynamically handle bundles that contain Restlet applications in a similar way to WAB bundles, i.e. attach them to the virtual hosts of a component.
Here is the way to implement this:
Create a bundle that makes the Restlet component available in the OSGi container. You should leverage a FrameworkListener to ensure that all connectors, converters, and so on are registered with the Restlet engine before the component is started:
private Component component;

public void start(BundleContext bundleContext) throws Exception {
    bundleContext.addFrameworkListener(new FrameworkListener() {
        public void frameworkEvent(FrameworkEvent event) {
            if (event.getType() == FrameworkEvent.STARTED) {
                component = new Component();
                (...)
                component.start();
            }
        }
    });
}

public void stop(BundleContext bundleContext) throws Exception {
    component.stop();
}
When the component is started, you can look for bundles that are present in the container and that contain Restlet applications. For each bundle of this kind, you can register a dedicated OSGi service that exposes the internal Restlet application you want to register against the component.
ServiceReference[] restletAppRefs = bundleContext.getServiceReferences(
        "restletApplication", null);
if (restletAppRefs != null) {
    for (ServiceReference restletAppRef : restletAppRefs) {
        RestletApplicationService descriptor
                = (RestletApplicationService) bundleContext.getService(restletAppRef);
        String path = descriptor.getPath();
        Application restletApplication = descriptor.getApplication();
        // Register the application against the component
        (...)
    }
}
Registering applications against the component is done like this:
try {
    VirtualHost virtualHost = getExistingVirtualHost(
            component, hostDomain, hostPort);
    if (virtualHost == null) {
        virtualHost = new VirtualHost();
        virtualHost.setHostDomain(hostDomain);
        virtualHost.setHostPort(hostPort);
        component.getHosts().add(virtualHost);
    }
    Context context = component.getContext().createChildContext();
    virtualHost.setContext(context);
    virtualHost.attachDefault(application);
    component.updateHosts();
    application.start();
} catch (Exception ex) {
    (...)
}
You also need to take into account the dynamics of OSGi: bundles can come and go after the OSGi container itself has started. You can leverage a ServiceListener to handle this:
bundleContext.addServiceListener(new ServiceListener() {
    public void serviceChanged(ServiceEvent event) {
        if (isServiceClass(event, RestletApplicationService.class)) {
            int type = event.getType();
            if (type == ServiceEvent.REGISTERED) {
                // Register the Restlet application against the component
            } else if (type == ServiceEvent.UNREGISTERING) {
                // Unregister the Restlet application
            }
        }
    }
});
Hope it helps you,
Thierry
There are many options available to you - OSGi doesn't really impose many restrictions on what you can do. If you want to use OSGi's capabilities then here are a couple of ideas:
One option would be to deploy a WAB. You'll need to ensure that your framework has the necessary OSGi Services running though. Just because some bundle is using Jetty internally it doesn't follow that the necessary OSGi services are running.
The bundle org.apache.felix.http.jetty does provide the necessary services to deploy a WAB. Version 2.2.2 is 1.3MB on disk, and embeds its own copy of Jetty. Other implementations are available (e.g. Pax-Web, as used in Karaf - which also embeds Jetty)
Another option would be to use the OSGi HttpService directly (again you'd need to include a bundle which implements this service, like the Felix one mentioned). A call to org.osgi.service.http.HttpService.registerResources() will serve up static content from within your bundle, as shown in the sketch below.
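For illustration, a minimal bundle activator using that call might look like this; the alias /site and the in-bundle folder /web are assumptions, not fixed names:

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceReference;
import org.osgi.service.http.HttpService;

public class StaticContentActivator implements BundleActivator {

    private ServiceReference<HttpService> httpRef;

    @Override
    public void start(BundleContext context) throws Exception {
        httpRef = context.getServiceReference(HttpService.class);
        if (httpRef != null) {
            HttpService http = context.getService(httpRef);
            // Serve resources packaged under /web inside the bundle at the alias /site,
            // e.g. /web/index.html becomes http://host:port/site/index.html
            http.registerResources("/site", "/web", null);
        }
    }

    @Override
    public void stop(BundleContext context) throws Exception {
        if (httpRef != null) {
            HttpService http = context.getService(httpRef);
            if (http != null) {
                http.unregister("/site");
            }
            context.ungetService(httpRef);
        }
    }
}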
If this additional footprint is a real concern then you might want to look at how you can get Restlet to use the OSGi HttpService, rather than providing its own via embedded Jetty.
Yet another option would be to take a Jetty-centric view of things. Restlet's embedded Jetty is probably not configured to serve arbitrary content from disk. You could look at either re-configuring the embedded Jetty to do this, or consider deploying Restlet to a 'standard' Jetty install. Personally, I'd try the latter.

Play framework 1.2.5 application slow startup

I am using Play framework 1.2.5 and Hibernate 3.25 to develop my web application. I am facing problems with the application startup; it is very slow :(
In any Java EE servlet-driven application, we use a ServletContextListener to initialize the session factories (which is a really time-consuming job). Once the application is deployed, the session factories are initialized, and all of this has to complete before the application is ready for the end user. This way, when the user triggers the first request, the response time is fast.
But Play framework does not follow the servlet architecture, so I am not sure how to implement something similar to a ServletContextListener that creates all the session factories before the application is ready for the end user.
Without this, the application is really very slow on the first request.
I am sure there is probably something in Play Framework that does the same, but I am not aware of it.
Please let me know about this.
You can use a Job to initialise your application.
For example, you could have a bootstrap job annotated with @OnApplicationStart which would take care of loading your static data or initialising your cache or factories.
@OnApplicationStart
public class Bootstrap extends Job {

    public void doJob() {
        // Load static data
        // Initialise cache
        // Initialise factories
        // ...
        // ready to serve application
    }
}
You're probably running the application in development mode, where everything is compiled and initialized lazily, on the first request. The production mode compiles everything before actually starting the server. See http://www.playframework.org/documentation/1.2.5/production
JB should be correct. In short, you can start the server with the --%prod option:
play run --%prod
or
play start --%prod

Getting thread from Container?

On most application servers, the J2EE EJB specification forbids creating threads "by hand", since these resources should be managed by the server.
But is there any way to get threads from Tomcat, GlassFish, JBoss, etc., and thus access their thread pool?
You can use the commonj WorkManager. It was a proposal by IBM and BEA to provide a standard means to accomplish this task (access to container-managed threads).
Although it was not included in the actual specification, there are implementations available for most containers; a short usage sketch follows the links below.
Use in Weblogic
Use in WebSphere
Implementation for Tomcat, JBOSS and maybe others
Spring integration
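For illustration only, commonj usage looks roughly like the sketch below. The JNDI name java:comp/env/wm/default is container-specific (a WebLogic-style default) and is just an example here:

import javax.naming.InitialContext;

import commonj.work.Work;
import commonj.work.WorkManager;

public class ReportTrigger {

    public void runInContainerThread() throws Exception {
        // The JNDI name is container specific; check your server's documentation.
        InitialContext ctx = new InitialContext();
        WorkManager workManager = (WorkManager) ctx.lookup("java:comp/env/wm/default");

        workManager.schedule(new Work() {
            public void run() {
                // the actual background task, executed on a container-managed thread
            }

            public void release() {
                // called when the container asks the work to stop early
            }

            public boolean isDaemon() {
                return false; // short-lived work, not a long-running daemon
            }
        });
    }
}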
The legal way to get threads from the container is to use JCA (Java Connector Architecture). The component you implement with this technology is called a "resource adapter" and is packaged as a rar file.
The implementation is pretty verbose but not too complicated in simple cases. So, good luck.
I've seen at least one utility class for getting hold of Tomcat's thread pool, but it's not wise to go this route. Those threads are created to service your EJB's or servlet's requests, not for you to support the EJB or servlet. Each one you take up is just another thread that won't be available to service requests to the container.
You could probably just throw in a static thread pool and use a static initializer to get around the EJB spec on this one, but you'd obviously have to make sure the thread code works well, otherwise it could really bork your EJB.
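If you do go the static-pool route, a rough sketch is below; note that you are then responsible for shutting the pool down yourself (for example from a ServletContextListener or a @PreDestroy method), and that this still sidesteps the spirit of the EJB spec:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public final class SharedPool {

    // Created once when the class is loaded, shared by all callers.
    private static final ExecutorService POOL = Executors.newFixedThreadPool(4);

    private SharedPool() {
    }

    public static void submit(Runnable task) {
        POOL.submit(task);
    }

    public static void shutdown() {
        // Call this during application shutdown, otherwise the worker
        // threads can keep the container from stopping cleanly.
        POOL.shutdown();
    }
}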

Servlets + JAX-WS

I'm trying to expose a web service method via JAX-WS annotations. Many examples I've seen reference the Endpoint.publish() method to quickly stand up the service in a standalone app (example from Java Web Services: Up and Running, 1st Edition):
public class TimeServerPublisher {

    public static void main(String[] args) {
        // 1st argument is the publication URL
        // 2nd argument is an SIB instance
        Endpoint.publish("http://127.0.0.1:9876/ts", new TimeServerImpl());
    }
}
One thing that I'm missing is how to accomplish essentially the same thing but in an existing app. Would I make a servlet to handle this? What is the proper way to publish this service in an existing WAR file?
In a container you don't have to publish like this; the container does the publishing for you. If you plan to use it in a JBoss server, try JBossWS; for Tomcat or any other server, Axis2 may be the better choice.
Read more from the following links.
http://jbossws.jboss.org/mediawiki/index.php?title=JBossWS
http://ws.apache.org/axis2/
This depends on what WS stack you are using.
If you are using Java 6, which includes the JAX-WS reference implementation, then you can consult the documentation about JAX-WS RI WAR contents.
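For what it's worth, on a Java EE 5+ container (GlassFish, JBoss, etc.) it is usually enough to drop an annotated endpoint class into the WAR and let the container publish it; no Endpoint.publish() call is needed. On plain Tomcat with the JAX-WS RI you additionally need the RI's deployment descriptor and servlet entries described in that documentation. A minimal annotated endpoint, shown here without the separate SEI from the book example, might look like:

import javax.jws.WebMethod;
import javax.jws.WebService;

// Packaged inside the WAR; a Java EE container detects the annotation and
// publishes the endpoint at a URL derived from the service and WAR names.
@WebService(serviceName = "TimeServer")
public class TimeServerImpl {

    @WebMethod
    public String getTimeAsString() {
        return new java.util.Date().toString();
    }
}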
As @Jerrish and @andri comment, there are different approaches and solutions, depending on your concerns.
The underlying idea is that you don't need to hard-code the configuration (port, etc.) under which your web service will be published. The best approach could be to set this via configuration files (XML, properties, etc.) or using @Annotations.
For example, if you're accustomed to using frameworks like Guice or Spring, you know that it is possible (and recommended) to initialize objects, factories, datasources, etc. when the application context starts; publishing web services is just another task that can be done at that time, because they will then be available as soon as the application starts.
By the way, I've had good experiences with CXF, and another option could be Spring Web Services, another powerful solution for creating web services.
