How can I get a WebSphere project to run under tc server? - java

Just wondering if anyone has ever converted a WebSphere project to run under tc server? I run on a Mac and would love to be able to run my application locally without having to run WebSphere in a VM.
I realize there are differences which would have to be accounted for, and that's really my question - what would I have to change? I also realize that even if this is possible, one couldn't depend on the tc server configuration before going to production - it would have to be tested in Websphere first.

As we know, an application server provides many more features than Tomcat. So first, you would have to check whether your application uses any of those features. If it does, you would then have to see whether that missing functionality can be plugged in somehow. For example, you could be using the transactional capability of the application server; Tomcat doesn't come with transaction support, so you would need to plug in a third-party component for that. To make this less intrusive, the code should be configurable so that, based on the environment, it knows whether to use the capabilities of the application server or the capabilities of the plugged-in components when running on Tomcat.
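As a rough sketch of that switch (assuming Atomikos as the stand-alone JTA provider and a made-up system property for choosing the environment; neither is prescribed by the answer above):

```java
import javax.naming.InitialContext;
import javax.naming.NamingException;
import javax.transaction.UserTransaction;

public class TransactionLookup {

    /**
     * Returns a UserTransaction appropriate for the current environment.
     * On a full application server the standard JNDI name is used;
     * on Tomcat we fall back to a bundled stand-alone JTA provider.
     * The "app.environment" property name is a made-up convention.
     */
    public static UserTransaction userTransaction() throws NamingException {
        String env = System.getProperty("app.environment", "appserver");
        if ("tomcat".equalsIgnoreCase(env)) {
            // Stand-alone JTA implementation shipped with the app (Atomikos, as an example)
            return new com.atomikos.icatch.jta.UserTransactionImp();
        }
        // Container-managed transactions via the standard JNDI name
        return (UserTransaction) new InitialContext()
                .lookup("java:comp/UserTransaction");
    }
}
```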
Advantage of this: It is certainly a productivity boost to develop against Tomcat rather than an application server, as the time it takes to start and stop Tomcat is far less than one coffee break (to me, one coffee break is the time an application server takes to stop and start, even when you have the minimum hardware it specifies).
Warning: The downside of this approach is that while developing you won't see how your components interact with the classes in the production environment. It therefore becomes mandatory that your QA environment and above use the real application server. This way you can avoid surprises in production.

Related

How can I simulate / test deployment on different Java Application Servers?

I am currently developing something in Java using Jetty as my web container. How can I test if my WAR/EAR/etc file can be deployed on other servers (including enterprise level servers)?
I want to try and sell my application to enterprises, but I'm not sure how I can know whether or not the application will deploy. I've only ever used open-source servers before.
Also, how do I spec my 'system requirements' if I don't have access to enterprise-grade hardware?
Edit: I mean, if I look at something like this: http://wiki.gxtechnical.com/commwiki/servlet/hwiki?Oracle+Application+Server+Deployment, it looks deceptively easy. But going in blind is quite scary!
You need to download, configure, and install each server and try it out. Many have free or limited development versions.
Depending on the complexity of your application, you will very likely encounter issues with each container. I can't say if any of them will be in conflict (i.e. fix it to work with container X makes it break in container Y), but it is possible.

Play framework 2.1 application deployment

I've created my first Play application. Which is the most suitable deployment method for production? Should I copy the whole project to the production server and run play start, or should I make a WAR out of my application and deploy it in Tomcat/JBoss? Which is the most recommended way? I'm getting confused, coming from its Rails-like behaviour. Note that this is supposed to be a big-data application and it may also serve heavy request loads later on, so we are thinking about scalability, availability, and performance aspects too. The application is to be deployed in a cloud.
Thanks.
As others have stated, using the dist command is the easiest way to deploy Play for a one-off application. However, to elaborate, I have here some other options and my experience with them:
When I have an app that I update frequently, I usually install Play on the server and perform updates through Git. Doing so, after every update, I simply run play stop (to stop the running server), sometimes I then run play clean to clear out any potentially corrupted libraries or binaries, then I run play stage to ensure all prerequisites are present and to perform compilation, and then finally play start to run the server for the updated app. It seems like a lot, but it is easy to automate via a quick bash script.
Another method is to deploy Play behind a front-end web server such as Apache, Nginx, etc. This is mostly useful if you want to perform some sort of load balancing, but not required as Play comes bundled with its own server. Docs: http://www.playframework.com/documentation/2.1.1/HTTPServer
Creating a WAR archive using the play2war plugin is another way to deploy, but I wouldn't recommend it unless you are giving it to someone who already has a major infrastructure built upon the servlet containers you mentioned (as many large companies do). Using a servlet container adds a level of complexity that Play is supposed to remove by nature (hence the integrated server). There are no notable performance gains that I am aware of using this method over the two previously described.
Of course, there is always play dist, which creates the package for you; you upload it to your server and run play start from there. This is probably the easiest option. Docs: http://www.playframework.com/documentation/2.1.1/ProductionDist
For performance and scalability, the Netty server in Play will perform anywhere from very adequately to exceptionally for what you require. Here's a reputable link showing Netty with the fastest performance of all frameworks, and a "stock" Play app coming in somewhere in the middle of the field but way ahead of Rails/Django in terms of performance: http://www.techempower.com/blog/2013/04/05/frameworks-round-2/.
Don't forget, you can always change your deployment architecture down the road to run behind a front-end server as described above if you need more load balancing and such for availability. That is a trivial change with Play. I still would not recommend the WAR deployment option unless, like I said, you already have a large installed base of servlet containers in use that someone is forcing you to serve your app with.
Scalability and performance also have a lot to do with other factors, such as your use of caching, the database configuration, use of concurrency (which Play is good at), and the quality of the underlying hardware or cloud platform. For instance, Instagram and Pinterest serve millions of people every day on a Python/Django stack which has mediocre performance by all popular benchmarks. They mitigate that with lots of caching and high-performing databases (which are usually the bottleneck in large applications).
At the risk of making this answer too long, I'll just add one last thing. I, too, used to fret over performance and scalability, thinking I needed the most powerful stack and configuration around to run my apps. That just isn't the case any more unless you're talking like Google or Facebook scale where every algorithm has to be finely tuned as it will be bombarded a billion times every day. Hardware (or cloud) resources are cheap but developer/sysadmin time isn't. You should consider ease of use and maintainability for deployment of your app over raw performance comparisons, even though in the case of Play the best performing deployment configuration is arguably the easiest option as well.
You don't need to use Play's console for running the application; it consumes some resources, and its main goal is fast launching during the development stage.
The best option is using the dist command as described in the docs. Thanks to this, you don't even need to install Play on the target machine, as dist creates a ready-to-use stand-alone application containing all required elements (including the built-in server, so you don't need to deploy it as a WAR in any container).
If you're planning to use a cloud, you should also check offers e.g. from Heroku or CloudBees, which allow you to deploy your application just by... pushing changes via a git repository, which is a very convenient way. Check the documentation's home page and scroll down to the "Deploying to..." links for more details.

Can I have more than one Java application server on one server?

Is it possible to install more than one Java application server on one server or VPS?
I want to install JBoss, Tomcat, WebLogic, and maybe more.
How is this possible, and what are the benefits and disadvantages?
It is possible, if you want to try all of them. You just have to select a different port for each service.
Is there a real use case? You could have an application bound to a particular application server, and you could want to run several different applications.
Yes. The usual problem is that a given port can only be used by a single process.
Hence you cannot have both JBoss and Weblogic on port 8080. Also a lot of extra ports are needed for normal operation. This is at best tedious.
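As a quick illustration of that constraint, a small probe like the one below (just a sketch, not tied to any particular server; the port list is only an example) tells you whether a port is already taken before you try to bind a second container to it:

```java
import java.io.IOException;
import java.net.ServerSocket;

public class PortProbe {
    /** Returns true if the given TCP port can still be bound on this host. */
    static boolean isFree(int port) {
        try (ServerSocket socket = new ServerSocket(port)) {
            return true;            // we could bind it, so nothing else holds it
        } catch (IOException e) {
            return false;           // already in use (e.g. by another app server)
        }
    }

    public static void main(String[] args) {
        int[] typicalPorts = {8080, 8443, 8005};   // common defaults; adjust for your servers
        for (int port : typicalPorts) {
            System.out.println("Port " + port + (isFree(port) ? " is free" : " is in use"));
        }
    }
}
```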
This is possible. You need to take care of the ports these servers use.
You also need to check what resources (CPUs/cores/RAM) you have on your server. Your system should have ample resources to run multiple servers.
I don't know what your use case is, but if possible I would prefer having the various webapps on a single Java application server.
As already mentioned above, it is possible, but it is a configuration management nightmare.
If it is for compatibility testing, I would look at EC2 or similar time-based hosting, put one app server in one image, and spin up each image in turn, shutting it down after the test is finished.
The money that costs is repaid tenfold by not having to edit all kinds of configuration files and debug weird conflicts.
Yes, it is possible.
Pros:
You don't need extra servers to run your appservers, so you save on physical/virtual machines. This helps a lot when you're prototyping something, and also in functional testing, because you can share servers between applications.
Some applications may need incompatible appserver settings, so you must run them in different appservers, side by side.
Downtime of one appserver doesn't affect other appservers.
Cons:
You must make sure that every appserver gets enough share of CPU, memory, etc.
You must assign port numbers to each appserver
You are making each environment's performance dependent on each other's.
So, it is something that you do mainly to experiment/develop/test. In production environments you have to be much more careful when running appservers side by side.

Benefits of Tomcat (or equivalent) for a simple service

I'll need to develop a Java service that is simple because:
It only communicates via a TCP socket, no HTTP.
It runs on a dedicated server (there are no other services except the basic SSH and such)
Should I make this a standalone service (maybe in something like Java Service Wrapper) or make it run in a container like Tomcat? What are the benefits and detriments of both?
If you aren't working with HTTP, you will have to build your own connectors for Tomcat. When I've written these types of applications, I've just written them as standard Java applications. On Windows machines, I use a service wrapper that allows them to be part of the Windows startup process. On non-Windows machines, you just need to add a startup script.
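For what it's worth, the skeleton of such a standalone TCP service is tiny; the port and the echo "protocol" below are placeholders, and a service wrapper or startup script then only has to launch this main class:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

public class SimpleTcpService {
    public static void main(String[] args) throws Exception {
        int port = 9000;  // placeholder port
        try (ServerSocket server = new ServerSocket(port)) {
            System.out.println("Listening on port " + port);
            while (true) {
                try (Socket client = server.accept();
                     BufferedReader in = new BufferedReader(
                             new InputStreamReader(client.getInputStream()));
                     PrintWriter out = new PrintWriter(client.getOutputStream(), true)) {
                    String line = in.readLine();          // read one request line
                    out.println("echo: " + line);          // placeholder protocol: echo it back
                }
            }
        }
    }
}
```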
Using a container (regardless of which one) buys you all the details about starting, stopping, scaling, logging, etc., which you would otherwise have to handle yourself, and it is always harder than you think (at least when you reach production).
Scalability in particular is something you need to consider now; later it will be much harder to change your mind.
So, if somebody already wrote most of what you need, then use that.
Tomcat doesn't sound like a good choice to me in your situation. AFAIK it's primarily made for Servlets and JSPs, and you have neither. You also don't need to deploy multiple applications on your app. server etc. (so no benefit from ".war").
If you need dependency injection, connection pooling, logging, network programming framework etc., there are a lot of good solutions out there and they don't need tomcat.
For example, in my case I went for a standalone app. that used Spring, Hibernate, Netty, Apache Commons DBCP, Log4j etc. These can be easily setup, and this way you have a lot more freedom.
Should you need a HTTP server, maybe embedding Jetty is another option. With this option too, you have more control over the app. and this can potentially simplify your implementation compared to using a tomcat container.
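If you do go the embedded Jetty route, the core of it is only a few lines. This sketch follows the Jetty 9 embedded API (class names can differ slightly between Jetty versions):

```java
import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.eclipse.jetty.server.Request;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.server.handler.AbstractHandler;

public class EmbeddedJetty {
    public static void main(String[] args) throws Exception {
        Server server = new Server(8080);            // embedded HTTP server on port 8080
        server.setHandler(new AbstractHandler() {
            @Override
            public void handle(String target, Request baseRequest,
                               HttpServletRequest request, HttpServletResponse response)
                    throws IOException, ServletException {
                response.setContentType("text/plain");
                response.getWriter().println("service is up");
                baseRequest.setHandled(true);        // mark the request as fully handled
            }
        });
        server.start();
        server.join();                               // block until the server stops
    }
}
```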
Tomcat doesn't really buy you much if you don't use HTTP.
However, I was forced to move a non-HTTP server to Tomcat for the following reasons:
We need some simple web pages to display the status/stats of the server, so I need a web server. Java 6 comes with a simple HTTP server (see the sketch after this list), but Tomcat is more robust.
Our operations tools are geared to run Tomcat only, and a standalone app just falls off the radar in their monitoring system.
We use DBCP for database pooling, and everyone seems more comfortable using it under Tomcat.
The memory footprint of Tomcat (a few MB) is not an issue for us, and we haven't seen any performance change since moving to Tomcat.
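For the first point, the simple HTTP server bundled with Java 6 (com.sun.net.httpserver) really is only a few lines for a basic status page; the port, path, and message below are placeholders:

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpHandler;
import com.sun.net.httpserver.HttpServer;

public class StatusPage {
    public static void main(String[] args) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(8081), 0);
        server.createContext("/status", new HttpHandler() {
            @Override
            public void handle(HttpExchange exchange) throws IOException {
                // Placeholder body; real stats would be gathered from the service here
                byte[] body = "server OK, uptime stats would go here".getBytes("UTF-8");
                exchange.sendResponseHeaders(200, body.length);
                OutputStream os = exchange.getResponseBody();
                os.write(body);
                os.close();
            }
        });
        server.start();   // serves http://localhost:8081/status
    }
}
```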
A container can save you from reinventing the wheel in terms of startup, monitoring, logging, configuration, deployment, etc. Also it makes your service more understandable to non-developers.
I wouldn't necessarily go for Tomcat; check out GlassFish and Geronimo, as they are more modular, and you can have just the bits you need and exclude the HTTP server.
We faced a similar decision a while back, and some parts of the system ended up being JSW-based while the others ended up as .war files. The .war option is simpler (well, more standard for sure) to build and configure.

Common practices if we discover a problem after deploying a web application?

I recently had a problem where my Java code works perfectly OK on my local machine, but it just wouldn't work when I deployed it onto the web server, especially the DB part. The worst part is that the server is not my machine, so I had to go back and forth to check the versions of software, the DB accounts, the settings, and so on...
I have to admit that I did not do a good job with the logging mechanism in the system. However, as a newbie programmer with little experience, I have to accept my learning curve. Therefore, here comes a very general but important question:
According to your experience, where would it be most likely to go wrong when it is working perfectly on the development machine but totally surprises you on the production machine?
Thank you for sharing your experience.
The absolute number one cause of problems which occur in production but not in development is Environment.
Your production machine is, more likely than not, configured very differently from your development machine. You might be developing your Java application on a Windows PC whilst deploying to a Linux-based server, for example.
It's important to try and develop against the same applications and libraries as you'll be deploying to in production. Here's a quick checklist (a small sketch for printing these facts from inside the JVM follows it):
Ensure the JVM version you're using in development is the exact same one on the production machine (java -version).
Ensure the application server (e.g. Tomcat, Resin) is the same version in production as you're using in development.
Ensure the version of the database you're using is the same in production as in development.
Ensure the libraries (e.g. the database driver) installed on the production machine are the same versions as you're using in development.
Ensure the user has the correct access rights on the production server.
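As mentioned above, a tiny class like this, run both in development and on the production box, makes the comparison concrete (which properties you print is up to you):

```java
public class EnvDump {
    public static void main(String[] args) {
        // Key facts worth comparing between development and production
        String[] keys = {
            "java.version", "java.vendor", "java.vm.name",
            "os.name", "os.arch", "user.name", "file.encoding", "user.timezone"
        };
        for (String key : keys) {
            System.out.println(key + " = " + System.getProperty(key));
        }
    }
}
```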
Of course you can't always get everything the same -- a lot of Linux servers now run in a 64-bit environment, whilst this isn't always the case (yet!) with standard developer machines. But, the rule still stands that if you can get your environments to match as closely as possible, you will minimise the chances of this sort of problem.
Ideally you would build a staging server (which can be a virtual machine, as opposed to a real server) which has exactly (or as close as possible to) the same environment as the production server.
If you can afford a staging server, the deployment process should be something like this:
Ensure application runs locally in development and ensure all unit and functional tests pass in development
Deploy to staging server. Ensure all tests pass.
Once happy, deploy to production
You're most likely running under a different user account. So the environment that you inherit as a developer will be vastly different from that of a production user (which is likely to be a very cut-down environment). Your PATH/LD_LIBRARY_PATH (or Windows equivalents) will be different, permissions will have changed, etc. Plus the installed software will be different.
I would strongly recommend maintaining a test box and a test user account that is set up with the same software, permissions and environments as the production user. Otherwise you really can't guarantee anything. You really need to manage and control the production and test servers wrt. accounts/installed software etc. Your development box will always be different, but you need to be aware of the differences.
Finally a deployment sanity check is always a good idea. I usually implement a test URL that can be checked as soon as the app is deployed. It will perform database queries or whatever other key functions are required, and report unambiguously as to what's working/not working via a traffic light mechanism.
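A sketch of such a sanity-check URL as a servlet; the JNDI datasource name and the URL mapping are placeholders, and the "traffic light" here is just GREEN/RED text:

```java
import java.io.IOException;
import java.io.PrintWriter;
import java.sql.Connection;
import javax.naming.InitialContext;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.sql.DataSource;

// Map this servlet to something like /healthcheck in web.xml (mapping is an assumption)
public class HealthCheckServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        resp.setContentType("text/plain");
        PrintWriter out = resp.getWriter();
        try {
            // Placeholder JNDI name; use whatever your container actually defines
            DataSource ds = (DataSource) new InitialContext()
                    .lookup("java:comp/env/jdbc/appDataSource");
            try (Connection conn = ds.getConnection()) {
                out.println("GREEN: database connection OK");
            }
        } catch (Exception e) {
            resp.setStatus(HttpServletResponse.SC_SERVICE_UNAVAILABLE);
            out.println("RED: database check failed: " + e.getMessage());
        }
    }
}
```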
Specifically, you can check all the configuration files (*.xml / *.properties) in your application and ensure that you are not hard-coding any paths/variables in your app.
You should maintain different config files for each environment and verify the installation guide with the environment admin (if one exists).
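For example, a sketch of loading one properties file per environment instead of hard-coding values (the property name and file naming convention are only illustrative):

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class AppConfig {
    /**
     * Loads config-<env>.properties from the classpath, where <env> comes from
     * a system property (e.g. -Dapp.env=prod). Names are illustrative only.
     */
    public static Properties load() throws IOException {
        String env = System.getProperty("app.env", "dev");
        String resource = "config-" + env + ".properties";
        try (InputStream in = AppConfig.class.getClassLoader().getResourceAsStream(resource)) {
            if (in == null) {
                throw new IOException("Missing configuration file: " + resource);
            }
            Properties props = new Properties();
            props.load(in);
            return props;
        }
    }
}
```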
Other than that, check the versions of all software, the dependency list, etc., as described by others.
A production machine will likely be missing some of the libraries and tools you have on your development machine, or there may be older versions of them. Under some circumstances this may interfere with the normal functioning of the software.
The database connection situation may also be different, meaning users, roles, and access levels.
One common (albeit easy to detect) problem is conflicting libraries, especially if you're using Maven or Ivy for dependency management and don't double check all the managed dependencies at least once before deploying.
We've had numerous incompatible versions of logging frameworks, and even Servlet/JSP API .jars a few times too many, in our test deployment environment. It's also always a good idea to check what the shared libraries folder of your Tomcat (or equivalent) contains; we've had some database datasource class conflicts because someone had put PostgreSQL's JDBC jar in the shared folder and the project came with its own jar for JDBC connectivity.
I always try to get an exact copy of the server my product runs on. After a few apps, and of course a lot of bugs, I created for myself a list of common bugs/hints. Another solution I tested for my last project was to get the software running on that server and try to configure it the same way locally; strange effects can happen with that ^^
Last but not least, I always test my apps on different machines.
In my experience there is no definite answer to this question. Following are some of the issues I faced.
Automatic updates were not turned on on the dev server (Windows) but were turned on on the production server (which is wrong in the first place!), so one of my web applications crashed due to an applied patch.
Some batch jobs were running on the production app server that changed data my application was using.
I'm not the one who does the deployment for my company, so most of the time the people who deploy miss some registry entries or add wrong ones. Simple, but very hard to detect (maybe just for me ;-) ); once I took hours to identify a stray space in one of the registry values. Now we have a very long release document which has all the details about all servers used by the application, and there is a checklist for the current release which the engineers who deploy the application fill in.
Will add more if I remember any.
Beyond just a staging server, another strategy for making sure the environments you deploy into are the same is to set them up automatically. That is, you use a tool like Puppet to install all the dependencies the server needs, and you run your install process before every installation so that all configuration is reset. That way you can ensure the configuration of the box is what you set it to during the development process, and you have the configuration of the production environment in source control.
