I have built a service discovery layer on top of Zookeeper for finding Thrift services in a distributed environment. I'm looking for the best way to run those services in a production environment now.
Currently, it's done by packaging a WAR that gets deployed to Tomcat. During servlet initialization, the Spring ApplicationContext is created, which in turn creates a TThreadPoolServer inside Tomcat.
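For context, the wiring is roughly equivalent to the listener sketch below (simplified; SearchService/SearchServiceImpl are placeholders for our Thrift-generated service and its implementation, and the port is arbitrary):

    import javax.servlet.ServletContextEvent;
    import javax.servlet.ServletContextListener;

    import org.apache.thrift.server.TServer;
    import org.apache.thrift.server.TThreadPoolServer;
    import org.apache.thrift.transport.TServerSocket;

    // Simplified sketch of the current setup: a listener (registered in web.xml)
    // spins up a TThreadPoolServer when the webapp starts.
    public class ThriftServerListener implements ServletContextListener {

        private TServer server;

        @Override
        public void contextInitialized(ServletContextEvent event) {
            try {
                SearchService.Processor<SearchService.Iface> processor =
                        new SearchService.Processor<SearchService.Iface>(new SearchServiceImpl());
                server = new TThreadPoolServer(
                        new TThreadPoolServer.Args(new TServerSocket(9090)).processor(processor));
                // serve() blocks, so it runs in its own thread alongside Tomcat's pools
                new Thread(new Runnable() {
                    public void run() {
                        server.serve();
                    }
                }).start();
            } catch (Exception e) {
                throw new RuntimeException("Could not start Thrift server", e);
            }
        }

        @Override
        public void contextDestroyed(ServletContextEvent event) {
            if (server != null) {
                server.stop();
            }
        }
    }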
I don't like this for a couple of reasons:
It makes Tomcat sort of useless, and it feels like a hack to facilitate easy deployment
It bypasses Tomcat's thread pooling and all of the logic that has gone into figuring out the best way to distribute requests
In the process of trying to find the best strategy to handle this, I have come up with a few alternatives:
Launch the Thrift services as a standalone JAR (I don't like this, mainly because I would need to reinvent the logic that app container developers have spent a lot of time working out)
Host Thrift over HTTP, thus utilizing the Tomcat thread pool and request-distribution logic (I'm iffy about this one due to the, albeit minor, performance hit it would incur; a rough sketch of this option follows the list)
Use a different type of application container for hosting these services
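If I do go the HTTP route, I believe Thrift's TServlet is the piece to use. A rough sketch (SearchService/SearchServiceImpl are again placeholders for the generated service and its implementation, and the binary protocol is just one possible choice):

    import org.apache.thrift.protocol.TBinaryProtocol;
    import org.apache.thrift.server.TServlet;

    // Hypothetical servlet for option 2: Thrift-over-HTTP via libthrift's TServlet.
    // Map it in web.xml (or with @WebServlet) like any other servlet; clients would
    // then use THttpClient as the transport.
    public class SearchThriftServlet extends TServlet {
        public SearchThriftServlet() {
            super(new SearchService.Processor<SearchService.Iface>(new SearchServiceImpl()),
                  new TBinaryProtocol.Factory());
        }
    }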
Does anyone have suggestions on how they have handled hosting distributed services like this before? Am I better off just using HTTP inside of Tomcat?
I've tried using Tomcat as a host for a Thrift server and found that it doesn't bring any additional value: the features of a servlet container (request routing, etc.) aren't necessary in this scenario. On the other hand, Tomcat adds complexity and moving parts (e.g., hard-to-resolve PermGen issues).
Using Thrift over HTTP causes a significant performance impact, especially in high-load scenarios with a lot of client connections.
So I ended up with standalone Thrift services running under the Supervisor daemon (http://supervisord.org/). It makes managing a distributed deployment really convenient. When it's necessary to expose the Thrift API over HTTP (for example, for JS clients), we use a thin async proxy implemented in Vert.x (http://vertx.io/).
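For what it's worth, each standalone service is just a plain main() entry point along these lines (service name, JAR name and port are placeholders); supervisord simply runs the JAR and restarts the process if it dies:

    import org.apache.thrift.server.TServer;
    import org.apache.thrift.server.TThreadPoolServer;
    import org.apache.thrift.transport.TServerSocket;

    // Placeholder standalone entry point; supervisord runs "java -jar search-service.jar"
    // and takes care of restarts and log capture.
    public final class SearchServiceMain {
        public static void main(String[] args) throws Exception {
            SearchService.Processor<SearchService.Iface> processor =
                    new SearchService.Processor<SearchService.Iface>(new SearchServiceImpl());
            TServer server = new TThreadPoolServer(
                    new TThreadPoolServer.Args(new TServerSocket(9090)).processor(processor));
            server.serve(); // blocks until the process is stopped
        }
    }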
How can I run Node.js, Java and PHP applications on the same server and the same port? I've been trying to set this up but haven't been able to get it working.
You need some kind of HTTP proxy layer in front of all this, typically Apache httpd or nginx. From there you can configure different paths to go to different applications as necessary.
The configuration directives vary considerably depending on the solution you're using, but you can have / go through to PHP and /node go through to Node, while /java goes somewhere else entirely. Just make sure your sub-components are using non-conflicting paths so they can all play nicely together, or you'll have to do a lot of ugly URL rewriting.
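As a rough illustration only (the upstream ports and paths are assumptions, and the directives differ if you use Apache httpd instead), an nginx config along these lines does the path-based routing:

    # Illustrative sketch: one public port, three backends on local ports.
    server {
        listen 80;

        location /node/ {
            proxy_pass http://127.0.0.1:3000/;   # Node.js app
        }

        location /java/ {
            proxy_pass http://127.0.0.1:8080/;   # Tomcat (Java app)
        }

        location / {
            proxy_pass http://127.0.0.1:8000/;   # PHP app served by its own backend
        }
    }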
You could use Varnish Cache as a load director and set up different backends for each of those servers. Then you could parse the incoming URLs to route to the appropriate application server. You can absolutely run all of those app servers on the same machine, with Varnish listening on one port and all the other services listening on other ports. It would be easy to firewall those services from external access as well.
Running each service on a different machine is also entirely possible and easy. We've used this solution numerous times in different environments because Varnish is extremely lightweight and reliable, and it does not have the overhead of a web server such as Apache or nginx, which, while good options, can be overkill here.
You also get the added benefit of the robust caching it provides. Bonus!
I am really curious about how professional programmers scale up a web application. I have done a significant amount of research but failed to find information about the stages of scaling; that might be because server performance depends on so many factors. However, I am pretty sure some approximate guidelines can be laid down.
For instance,
1.) How many concurrent requests can a single Tomcat server handle with a decent implementation and decent hardware?
2.) At what point should a load-balancer server be involved?
3.) When does a full Java EE stack (JBoss/GlassFish) begin to make sense?
I feel that this is somewhat opinion-based but, ultimately, "it depends".
For example, how much load can Tomcat handle? It depends. If you're sending a static HTML page for every request then the answer is "a lot". If you're trying to compute the first 100,000 prime numbers every time then probably not so much.
In general, it is best to design your application for clustered/distributed use. Don't count on too much in the session - keeping sessions in sync across nodes can be expensive. Do your best to make every method truly stateless. That can be hard sometimes, as the consumer (e.g., a web site) may have to pass a bit more information on each call so that any of the clustered machines knows the current state of a request. And so on.
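As a toy illustration of that last point (the class names and the store interface are made up for the example), a method that receives everything it needs with each call and keeps shared state in an external store can be served by any node in the cluster:

    import java.util.HashMap;
    import java.util.Map;

    // Sketch of the "pass the state with each call" idea: instead of keeping the
    // cart in HttpSession, the client sends a cartId and any node can rebuild the
    // state from shared storage (database, Redis, etc.).
    public class CartService {

        // Placeholder for whatever shared store you actually use.
        public interface CartStore {
            Map<String, Integer> load(String cartId);
            void save(String cartId, Map<String, Integer> cart);
        }

        private final CartStore store;

        public CartService(CartStore store) {
            this.store = store;
        }

        // Stateless: everything the method needs arrives as arguments, so the
        // load balancer is free to send the next call to a different node.
        public Map<String, Integer> addItem(String cartId, String itemId, int quantity) {
            Map<String, Integer> cart = store.load(cartId);
            if (cart == null) {
                cart = new HashMap<String, Integer>();
            }
            Integer current = cart.get(itemId);
            cart.put(itemId, current == null ? quantity : current + quantity);
            store.save(cartId, cart);
            return cart;
        }
    }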
I moved a web app from Tomcat to GlassFish and then WildFly when I wanted to take advantage of some additional Java EE functionality - specifically JMS, CDI, and JPA. I could have used TomEE and bolted those in, but a unified environment with a unified management UI was a nice benefit too. You may never need to do that, though: you can add the parts you want (e.g., CDI and JPA) fairly easily to Tomcat.
Note that I didn't move from Tomcat to a full EE server for performance - I wanted to take advantage of a larger part of the EE stack. While WildFly has some management interfaces that make managing a cluster a bit easier, I could still have used Tomcat with no problem.
So, again, "it depends". If you don't need more of the EE stack than Tomcat provides, a full EE server may very well be overkill. Putting a set of Tomcat servers behind an Apache httpd load balancer (or an Amazon one), on top of a database that is also clustered, isn't too bad to implement. If that is sufficient for you, I'd stick with that. Don't jump to WildFly etc. just for performance, as you are unlikely to see a huge change in either direction.
In the past I've created client-server web applications using JavaScript, AJAX, Node, Express and MongoDB, but now I'm required to create a client-server desktop application. It will basically consist of a desktop program that connects to a server program by making requests. The server program will respond to the client with the requested data, which it fetches from the database.
Since I'm really new to this kind of application in Java, I have no idea how to create one. The project will be large, so we can't hand-code the entire server; we probably need a framework on the server side that listens for requests, but I haven't found one so far. For example, Play Framework seems to work only for web applications. Which frameworks are useful for this purpose? Is this the right approach for this kind of application? How would I connect the client and server applications?
Please do not suggest "use sockets". This will be quite a big, "serious" project, and we need high-level tools. We don't know how this kind of project is usually built, so please explain a little which patterns are typically used. Examples of concrete programs, ideally with open source code, would help us understand. A list of the requirements for such a project would also be very useful.
Note: I'm not asking for an exhaustive list of frameworks we could use. I'm rather asking what kinds of tools (with concrete examples) we should use, how to combine them, and how to structure such a project.
You could write the server-side application in Node.js or whatever other server-side language you prefer, and implement it using REST services. Your Java desktop application would then just communicate with the server using HTTP (REST/SOAP etc.).
That way, if you later wanted to switch to something like .NET for the desktop application, you would be free to do so without changing anything on the server side. You would also be able to implement a mobile app, tablet app or another web application and easily reuse the entire server-side implementation.
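For example, the desktop side can start out as plain HTTP calls; a minimal sketch (the URL is an assumption, and in practice you would likely use a client library such as Jersey client plus a JSON mapper):

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Minimal sketch of the desktop client calling a REST endpoint over HTTP.
    public class RestClientExample {
        public static void main(String[] args) throws Exception {
            URL url = new URL("http://localhost:8080/api/customers/42");   // assumed endpoint
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("GET");
            conn.setRequestProperty("Accept", "application/json");

            // Read the response body line by line
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(), "UTF-8"));
            StringBuilder body = new StringBuilder();
            String line;
            while ((line = in.readLine()) != null) {
                body.append(line);
            }
            in.close();
            conn.disconnect();

            System.out.println("Server replied: " + body);   // parse JSON with your mapper of choice
        }
    }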
Another option is to use ServerSocket for the Java server side and then connect to that from the client, but you seem to know about and dislike that option.
Another option for connecting the two sides of the application would be some kind of pub/sub middleware messaging service. Check out JMS as the framework; you will need a JMS implementation such as ActiveMQ, WebSphere MQ or one of the many other implementations available, several of which are free. See: http://docs.oracle.com/javaee/6/tutorial/doc/bncdq.html
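A minimal JMS sketch using ActiveMQ as the provider (the broker URL and queue name are assumptions) would look something like:

    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.MessageProducer;
    import javax.jms.Queue;
    import javax.jms.Session;
    import javax.jms.TextMessage;

    import org.apache.activemq.ActiveMQConnectionFactory;

    // Sends a single text message to a queue; a server-side consumer would
    // listen on the same queue and reply on another one.
    public class JmsSendExample {
        public static void main(String[] args) throws Exception {
            ConnectionFactory factory = new ActiveMQConnectionFactory("tcp://localhost:61616");
            Connection connection = factory.createConnection();
            connection.start();
            try {
                Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
                Queue queue = session.createQueue("app.requests");
                MessageProducer producer = session.createProducer(queue);

                TextMessage message = session.createTextMessage("hello from the desktop client");
                producer.send(message);
            } finally {
                connection.close();
            }
        }
    }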
It's a difficult question to answer, but those are three high-level options:
Use web technologies (HTTP REST or SOAP) to connect client to server
Use ServerSockets and Socket connections and do everything manually
Use a messaging framework such as JMS
We will be developing a server-side Linux service using Java.
Does anybody have experience with a framework or an "application server" for this purpose?
I don't mean an application server in the common sense; this has nothing to do with the web, HTTP or the like. It's a server application that listens on a socket, processes requests and answers them, using a custom protocol. So the usual suspects like GlassFish, Tomcat, Jetty, etc. aren't really what I need.
Edit: I'm looking for features like startup handling, automatic service recovery, and maybe database connections.
Any help is appreciated.
You could use Netty for developing your TCP/IP-based client-server application. It has very good documentation and arguably better performance too. You could also look into Apache MINA, but IMHO its documentation is not as good. QuickServer is another option if you have time to do some R&D on it.
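To give an idea of what a Netty-based server looks like, here is a minimal Netty 4 sketch (the port and the echo handler are placeholders; your custom protocol would add its own decoder/encoder to the pipeline):

    import io.netty.bootstrap.ServerBootstrap;
    import io.netty.channel.ChannelFuture;
    import io.netty.channel.ChannelHandlerContext;
    import io.netty.channel.ChannelInboundHandlerAdapter;
    import io.netty.channel.ChannelInitializer;
    import io.netty.channel.EventLoopGroup;
    import io.netty.channel.nio.NioEventLoopGroup;
    import io.netty.channel.socket.SocketChannel;
    import io.netty.channel.socket.nio.NioServerSocketChannel;

    // Bare-bones TCP server that echoes incoming bytes back to the client.
    public class EchoServer {
        public static void main(String[] args) throws Exception {
            EventLoopGroup bossGroup = new NioEventLoopGroup(1);
            EventLoopGroup workerGroup = new NioEventLoopGroup();
            try {
                ServerBootstrap bootstrap = new ServerBootstrap();
                bootstrap.group(bossGroup, workerGroup)
                         .channel(NioServerSocketChannel.class)
                         .childHandler(new ChannelInitializer<SocketChannel>() {
                             @Override
                             protected void initChannel(SocketChannel ch) {
                                 // replace this handler with your protocol's codec and logic
                                 ch.pipeline().addLast(new ChannelInboundHandlerAdapter() {
                                     @Override
                                     public void channelRead(ChannelHandlerContext ctx, Object msg) {
                                         ctx.writeAndFlush(msg);
                                     }
                                 });
                             }
                         });
                ChannelFuture future = bootstrap.bind(9000).sync();
                future.channel().closeFuture().sync();   // block until the server socket closes
            } finally {
                bossGroup.shutdownGracefully();
                workerGroup.shutdownGracefully();
            }
        }
    }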
I think you are wrong: the listed usual suspects (well, not Tomcat) are in my opinion a perfect fit. In short, what you need is an application server with Java EE 6 Full Profile support; I would recommend GlassFish or WildFly, or WebLogic on the commercial end. The reason is simple: JCA (Java Connector Architecture). It is the Java EE specification for connecting to third-party, legacy, or custom-developed systems. We have used it successfully to implement communication over an application-specific socket-based protocol, and even for Sun RPC and RADIUS (telco) protocols. There are several examples on the web of how to use it; with the latest Java EE spec there are finally a few examples of socket communication as well. The specification itself (JCA 1.6) is very well written, and after reading it you should be able to use it.
Because it is part of Java EE, the container will handle all the pooling, startup, monitoring, and so on. Give it a try.
I've recently learned about Tomcat 7's feature that allows multiple versions of the same webapp to be deployed at the same time:
http://www.tomcatexpert.com/blog/2011/05/31/parallel-deployment-tomcat-7
http://www.javacodegeeks.com/2011/06/zero-downtime-deployment-and-rollback.html
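As I understand it from those links, the feature is driven purely by a version suffix in the WAR name, something like:

    webapps/myapp##001.war   (old version keeps serving its existing sessions)
    webapps/myapp##002.war   (new version receives new sessions)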
Our sites regularly get 10,000-20,000 user sessions per day, and quite a lot of them are transactional/stateful webapps. Parallel deployment seems perfect for what we want, but I haven't really heard much about people's experiences using it on their servers.
If you use this feature of tomcat 7 in production, have you had any issues with it so far? Have you had to make any changes to your webapps to "play nice" with this Tomcat feature?
I haven't used this feature in production. My first thoughts are:
What if you apply database schema changes? You'll have two applications running on the same schema with different database handling (for example, different JPA entities).
What if you have scheduled tasks? They'll run in parallel, so your application must be ready for that.
What if you apply some very important bugfixes? You'll have the fixed and the buggy application running together, and both will make changes to the database until all old sessions expire.
Why would you want your users to see the old version of the application after you've applied new features or bugfixes?
Your application must be prepared the same way you would prepare it to run in a cluster with sticky sessions. It's just the same, only within a single Tomcat.
Are you sure your application can be redeployed on Tomcat without the well-known PermGen issues? I hear it can be done now, but I still restart Tomcat with each redeploy.
We didn't have much luck getting this to work consistently in our test environment, so there's no way we'd consider it for production.
The question is, do you need the ability to do hot upgrades in your environment? Often this is theoretically nice but not needed.