Managing many clients with the same Java project with small per-client changes - java

I have several Java projects with small differences between them for each client. Those differences are:
Differences in jrxml reports
Differences in properties files
Changes in static classes
When I update the clients, I commit to SVN, generate the WAR for each project (approx. 90 MB) using Jenkins, upload it via FTP and install it on each server.
The problem I'm having is the time it takes me to do it this way, which is between 3 and 4 hours for 6 projects every week.
Is it possible to handle all clients with a single project (a single WAR) and keep the differences outside the WAR?
What would be the best way to do this?
Is this recommended, or is there a better way to handle it?

Is it possible to handle all clients with a single project (a single WAR) and keep the differences outside the WAR?
Sure but... bear with me for a second.
With 3 to 4 hours of build/deployment time, I'd say the bandwidth between the Jenkins server and your production servers could be the issue. If upgrading it is not viable, you will need to optimize your project setup.
Usually 90% of the size of a WAR file comes from the collection of libraries your application relies on to do its job (Spring, Hibernate, Struts and so on...).
Supposing you currently store them in your WEB-INF/lib folder, you could consider extracting those and installing them in your application server's shared classpath, removing them from your WAR.
It would not shock me if, after this operation, you needed no further optimizations...
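On Tomcat, for example, that extraction could look roughly like the sketch below (the jar names and paths are assumptions; the shared.loader line is Tomcat's standard mechanism in conf/catalina.properties, other application servers have their own equivalent):

# Sketch: move the heavyweight libraries out of the WAR into a shared folder
mkdir -p "$CATALINA_HOME/shared/lib"
mv webapp/WEB-INF/lib/spring-*.jar webapp/WEB-INF/lib/hibernate-*.jar \
   "$CATALINA_HOME/shared/lib/"
# Then enable the shared loader in conf/catalina.properties:
#   shared.loader=${catalina.home}/shared/lib/*.jar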

I found that when I need to save time, this command saves me every time:
rsync -razpv --delete /folder/name server_name:/folder/name
you can also add:
--exclude 'file/inside/folder'
Use it cleanly and wisely: a file-copying step in some of my builds that always took 20 minutes now takes 10 seconds when there is no change, and less than a minute when there is a change.
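As an illustration, a multi-server deploy built around that command might be no more than this (server names and paths are made up; the rsync flags are the ones above):

#!/bin/bash
# Push the exploded webapp to each client server; rsync only sends changes
for server in client1.example.com client2.example.com; do
    rsync -razpv --delete --exclude 'WEB-INF/local.properties' \
        /builds/myapp/ "$server:/opt/tomcat/webapps/myapp/"
done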
Also: have you thought about a configuration management tool (Puppet/Chef/Ansible) to do some of the work?

Related

Liferay 'build services' and deploys are too slow

I'm using the Liferay platform to develop a company portal (version 6.1.1). This portal already has a considerable implementation and database size (174 tables).
As expected, the build-services step and the deploys have been getting slower as the project grows.
The problem is that with the current implementation it takes about 20 minutes to run 'build services' and about 3-4 minutes to perform a deploy, even if I change a simple string in the code. And after every 3 deploys it's necessary to restart the server because it seems to freeze.
My machine specs are:
Intel core i5-3210M
8GB RAM
64bits
And this are the memory args of my liferay server:
-Xms1024m -Xmx1024m -XX:PermSize=1024m -XX:MaxPermSize=1024m
As you can imagine, these waiting times cause a huge drop in development productivity.
My questions are: is this normal? If so, what kind of alternatives do I have for a future portal implementation?
Thank you.
174 tables is quite a lot - more than Liferay itself brings. I'd recommend spreading your application out into separately deployable plugins - the services don't (technically) need to live in the same plugin; Service Builder allows you to use the services across different plugins.
Proper dependency management should help you isolate the functionality that you'll extract into separate applications. Declare which application needs which other application deployed before it, and you can access the services cross-context.
To answer your comment-question, here's a sample with only two projects: create them both with Service Builder. Let's call them common-portlet and app1-portlet. Obviously, app1-portlet uses components (and services) from common-portlet.
In app1-portlet, edit docroot/WEB-INF/liferay-plugin-package.properties and add the line
required-deployment-contexts=common-portlet
This will make sure that app1-portlet is only deployed when common-portlet is available. Also, common-service.jar, the API of common-portlet generated by Service Builder, will automatically be put on the classpath of app1-portlet; in other words, you can call the services you implemented in common-portlet.
Assuming your more abstract portlets have a more stable interface (typically this indicates a proper architecture), changes to app1-portlet (or app2-portlet etc.) will only affect the portlet you changed. Even if you have a change in common-portlet, Service Builder will be relatively quick; on interface changes you still need to recompile everything, but that's the nature of dependencies. If you don't change your interfaces, you'll only need a redeploy.
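If you script your project setup, adding that declaration is a one-liner (the path follows the plugin layout described above):

# Declare that app1-portlet must be deployed after common-portlet
echo 'required-deployment-contexts=common-portlet' \
    >> app1-portlet/docroot/WEB-INF/liferay-plugin-package.properties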

How to incremental update Java EE web application?

I have a typical web application deployed in Tomcat. The requirement is to provide an incremental update mechanism instead of full-package delivery (a WAR file) when updating the application.
For example, suppose I finish a bug fix that changed a JAR file, an XML file and a JPG file. I call these 3 files a patch, and I am supposed to deliver just the patch file. Even when customers want to roll back to the original version, I have to provide a way to roll back the patch.
The whole process is supposed to be automatic.
From my perspective, the requirement doesn't make sense. Full-package delivery is an easy and reliable way to update a web application; I don't want to introduce a complex and error-prone alternative.
Do you have any ideas on how to implement the incremental update requirement? Thanks!
When you deploy the .war or .ear, the application server usually unpacks it into an internal directory. You can change files in that directory directly, with finer granularity. However, for the changes to take effect consistently, you will need to restart the server.
Your perspective is indeed fully correct. Nowadays file sizes don't play a significant role; I don't see the problem with whole updates. Why isn't the customer happy with whole updates?
Note: if what they want is dynamic updates, i.e. without restarting the server, then that is a completely different problem, and mostly impossible for production systems in Java (though doable during development, with solutions like JRebel).
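As a rough sketch of that exploded-directory approach, using the question's three-file patch (the Tomcat-style paths are assumptions):

# Copy the patched files straight into the unpacked webapp...
cp patch/app-core.jar /opt/tomcat/webapps/myapp/WEB-INF/lib/
cp patch/settings.xml /opt/tomcat/webapps/myapp/WEB-INF/classes/
cp patch/header.jpg   /opt/tomcat/webapps/myapp/images/
# ...then restart the server so the changes take effect consistently
/opt/tomcat/bin/shutdown.sh && /opt/tomcat/bin/startup.sh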
You can create a Java program that uses a delta-sync protocol, i.e. only the files that were updated need to be uploaded. If you have used Dropbox, you will understand pretty well:
Dropbox uses delta sync to update files and synchronize data.
Either way, for the time being you can use Dropbox by installing it on your client's machine (mapped to the server's WAR folder) and on your local machine, and sharing that folder. Then, whenever you change files on your local machine, it will automatically upload and sync those CHANGED (patch) files to your client's machine.
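If a third-party sync service is not acceptable, rsync gives you the same only-changed-files behavior, and its --backup-dir option also covers the rollback requirement; a sketch with assumed paths:

# Send only the files that changed; keep the overwritten originals
STAMP=$(date +%Y%m%d-%H%M%S)
rsync -rzpv --backup --backup-dir="/var/backups/myapp/$STAMP" \
    build/exploded-war/ client-server:/opt/tomcat/webapps/myapp/
# Rollback = copy the saved originals from /var/backups/myapp/$STAMP back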

How to improve our build and deployment process?

Our build/deploy process is very tedious, largely manual and error-prone. Could you suggest some improvements?
So let me describe our deployment strategy and build process.
We are developing a system called Application Server (AS for short). It is essentially a servlet-based web application hosted on the JBoss Web server. The AS can be installed in two "environments". Each environment is a directory with the webapp's code. This directory is placed on network storage. The storage is mounted on several production servers where JBoss instances are installed. The directory is linked into JBoss' webapps directory; thus all JBoss instances use the same code for an environment. JBoss configuration is separate from the environment and updated per instance.
So we have two types of patches: webapp patches (for the different environments) and configuration patches (for per-instance configuration).
A patch is an executable file. In fact it is a bash script with an embedded binary RPM package. Installation is pretty straightforward: you just execute the file and optionally answer some questions. An important point is that the patch is not the system as a whole - it contains only some classes with fixes and/or scripts that modify configuration files. Classes are copied into WEB-INF/classes (the AS is deployed as an exploded directory).
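For illustration only, the skeleton of such a self-executing patch file might look like this (a simplified sketch, not our actual script):

#!/bin/bash
# Everything below the __PAYLOAD__ marker is the embedded binary RPM
PAYLOAD_LINE=$(awk '/^__PAYLOAD__$/ { print NR + 1; exit }' "$0")
tail -n +"$PAYLOAD_LINE" "$0" > /tmp/patch.rpm
rpm -Uvh /tmp/patch.rpm   # the RPM's own scripts back up and copy files
exit 0
__PAYLOAD__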
The way we build those patches is:
We take some previous patch files and copy them.
We modify the content of the patch. The most important part is the RPM spec: there we change the name of the patch, change its prerequisite RPM packages, and write down the actual bash commands for backing up, copying and modifying files. This is one of the most annoying parts, because we cannot always get the actual change-set. That is especially true for new complex features that span multiple change requests and commits. Also, writing those commands for a change-set is tedious and error-prone.
For webapp patches we also modify the spec for the other environment. Usually the two are identical except for the RPM package name.
We put all RPM-related files into VCS.
We modify build.xml, adding a couple of targets for building the new patch. The modification is done by copy-pasting and editing.
We modify CruiseControl's config by copy-pasting a project and changing the Ant targets in it.
At last, we build the system.
Also, I'm interested in any references on patch preparation and deployment practices, preferably for Java applications. I haven't succeeded in googling any.
The place I work had similar problems, but perhaps not as complex.
We responded by eliminating the concept of a patch altogether. We stopped patching and started simply installing the whole app (even when we make just a small change).
We now have CruiseControl build complete install kits that carry the build timestamp in the install-kit name. This is a CruiseControl build artifact.
CruiseControl auto-installs them on a test server and runs some automated smoke tests. We then run manual tests on the test server. Then we install the artifact on a staging server, and finally a production server.
Getting rid of patching caused some people to splutter, "isn't that wasteful if you're just changing a couple of things?" and "why would you overwrite all the software just to patch something?"
But the truth is that good source control, automated install-kit building, and one-step installation have saved us tons of time. It does take a few seconds longer to install, but we can do it far more repeatably and with less developer labor.
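The kit-naming step itself is trivial, which is part of the appeal; conceptually it is just this (the file names are assumptions):

# Stamp the install kit with the build time; the CI server archives it
STAMP=$(date +%Y%m%d-%H%M%S)
cp target/myapp-installer.tar.gz "artifacts/myapp-installer-$STAMP.tar.gz"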

java web application best practices

I'm trying to figure out the optimal way to develop and release a fairly simple web application, and I'm running into several problems. I'll outline the decisions I've made, because somewhere I've clearly gone off the rails. Hugely grateful for any help!
I have what I think is a fairly simple web application. It contains a couple of jsps that reference a couple of java beans, and the usual static html, js, css and images.
Decision 1) I wanted to have a clear and clean release procedure, such that I could develop on my local machine and then release reliably to a production machine. I therefore made the decision to package the application into a war file (including all the static resources), to minimize the separate bits and pieces I would need to release. So far so good?
Decision 2) I wanted things on my local machine to be as similar as possible to the production environment. So in my html, for example, I may have a reference to a static file such as http://static.foo.com/file . To keep this code working seamlessly on dev and prod, I decided to put static.foo.com in my /etc/hosts when developing locally, so that all the urls work correctly without changing anything.
Decision 3) I decided to use eclipse and maven to give me a best practice environment for administering and building my project.
So I have a nice tight set up now, except that:
Every time I want to change anything in development, like one line in an HTML file, I have to rebuild the entire project and then wait for Tomcat to load the WAR before I can see whether it's what I wanted. So my questions are:
1) Is there a way to connect up Eclipse and Tomcat so that I don't have to rebuild the WAR each time, i.e. Tomcat looks straight at my actual workspace to serve up the static files?
2) I think I'm maybe making things harder by using /etc/hosts to reflect production URLs - is there a better way that doesn't involve manually changing over URLs? (Relative URLs are fine of course, but where you have many subdomains, say one for static files and one for dynamic, you have to write out the full path, surely?)
3) Is this really best practice? How do people set things up so that they balance the requirement for an automated, all-encompassing build process on the one hand with the speed and flexibility to develop JavaScript, HTML and CSS quickly on the other - as quickly as if one just pointed Apache at the directory and developed live? What do people find works?
Many thanks!
Edit: Thanks all for your great responses! If I could mark them all as correct, I would. This has really helped me out. What I'm hearing is that best practice is to conserve the structure of the webapp in development, and run it in as close an environment to production as possible. It seems the difference between people is the extent to which they are prepared to hot-deploy resources into the servlet container, circumventing the build process for a little extra speed or convenience. That makes sense. Thanks again.
This is much like what I have to do at work, although we use ant (for now?). Also, while I use an IDE (or two), I refuse to have one as part of my build process, EVER. Somebody needs to be able to understand and tune your build.
Is there a way to connect up eclipse and tomcat so that I don't have to rebuild the war each time?
1) I think you're relying too much on your IDE. Usually I have an Ant build.xml with a couple of targets: one is "build war", the other is "update jsps". Building the war compiles all the code, packages it, deploys it to Tomcat and restarts everything. Updating the JSPs doesn't restart the server; it's just a straight copy from my local files to Tomcat's deployed instance. No restart necessary since they're JSPs. Takes about half a second.
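That "update jsps" target boils down to a straight copy; expressed as a shell command rather than Ant, it would be something like this (paths assumed):

# Copy changed JSPs over the deployed instance - no server restart needed
cp src/main/webapp/*.jsp "$CATALINA_HOME/webapps/myapp/"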
where you have many subdomains, say one for static files and one for dynamic, you have to write out the full path, surely?
2) No way, Jose. Do you mean that any time the server name changes, you have to recompile your code? If you need to support dynamic URLs, you might just want to bite the bullet and take a look at a framework to do the heavy lifting for you. I'm partial to Stripes (which supports dynamic URL rewriting out of the box)... there are others.
To answer #1, I would suggest the following:
Spend some time learning Maven so you can build your .war without Eclipse. It's not that hard with the proper archetype. See here for more details: http://maven.apache.org/guides/mini/guide-webapp.html
Maven can generate Eclipse projects either through mvn eclipse:eclipse or by using the m2 plugin.
For deployment to your local machine and to production, use the Maven cargo plugin: http://cargo.codehaus.org/Maven2+plugin and http://blank.jasonwhaley.com/2010/03/automated-deployment-with-cargo-drive.html
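Once the cargo plugin is configured in your pom.xml, the local build-and-deploy cycle is a single command (cargo:redeploy is one of the plugin's standard goals; this assumes your container is already configured in the plugin section):

# Rebuild the .war and push it to the container configured for cargo
mvn clean package cargo:redeploy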
To answer question #2: there's nothing wrong with modifying your /etc/hosts file to mimic production. Just have a quick script that lets you add/remove those entries and flush your DNS cache. I do exactly that all the time. (Be sure to make your browser clear its cache frequently as well, through the relevant setting.)
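Such a script can be as small as the following toggle (the hostname is the question's example; requires sudo):

#!/bin/bash
# Add or remove the production-like hosts entry
if grep -q 'static.foo.com' /etc/hosts; then
    sudo sed -i '/static\.foo\.com/d' /etc/hosts
else
    echo '127.0.0.1 static.foo.com' | sudo tee -a /etc/hosts
fi
# Remember to flush the DNS cache afterwards (the command varies per OS)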
To answer question #3: yes, this is how you should be doing things. Each build should result in a single deployable artifact that you can deploy to any of your environments in one step. You need to make sure you can do this without your IDE, and use the IDE only as a tool to help you during the development phase.
Others have already answered you; I'll just comment on this (too long for a comment, btw, so I'm making it an answer):
Every time I want to change anything in development, like one line in an html file, I have to rebuild the entire project and then wait for tomcat to load the war before I can see if it's what I wanted.
If you change one line in an html file, there's no need to rebuild the entire project.
Note that I always rebuild the full .war and redeploy my .war but this takes less than two seconds (less than one second to rezip the .war [that's really what a .war is, a zipped file] and less than one second to redeploy it) because:
you don't need to recompile your entire project when you simply change one line in an html file
Same when you change one .java file: you can simply recompile that one file and re-war.
I wrote my own Ant build file from scratch (no Maven here) and I've got several targets. I can force a "clean build" that recompiles everything, but typically I'm simply repackaging and redeploying the .war.
You can check it for yourself: build a .war, unzip it in, say, directory dir1, then modify one .html (or one .java/.class file) and build a new .war and unzip that new .war in, say, dir2.
Then compare dir1 and dir2: now fix your build process so that you can create that second .war without needing to recompile everything.
Changing one .html, .java, .jsp, .css, .js / whatever file and redeploying a new .war should be a matter of seconds (less than two seconds if you didn't throw the kitchen sink in your Webapp).
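You can even skip the full rezip: the jar tool can update a single entry in an existing archive in well under a second (paths assumed; run from the directory containing the changed file so the entry name matches):

# Replace just the changed file inside the existing .war, then redeploy
jar -uf target/myapp.war index.html
cp target/myapp.war "$CATALINA_HOME/webapps/"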
Note that on the very same project, another developer here prefers to "hot deploy" / replace the files directly in the exploded webapp (I prefer to redeploy a .war every time, and because my complete repackage/redeploy takes less than two seconds I'm fine with it that way).
You don't need to reconstruct the WAR file if your project is a Dynamic Web App in Eclipse and the Tomcat server is configured properly. Follow the instructions below:
1) Check out this guide on how to configure the Tomcat server with Eclipse:
http://greatwebguy.com/programming/eclipse/make-eclipse-and-tomcat-play-nice-together/
2) Use relative paths in your application, not absolute paths.
3) If you follow the above 2 steps properly, you'll have a good environment for development.
During development, you should configure Eclipse and Tomcat so that no rebuild/redeploy is required: just modify HTML/CSS/JSP etc., save, and refresh the browser to see the result.
But before deploying to the production site, you should do a clean full build and test it carefully.
Domains: they should be in a config file; dev and prod should have different config files, as sketched below.
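A crude way to wire the per-environment config file in at deploy time (the file names and the APP_ENV variable are assumptions):

# Select the config matching the target environment (dev or prod)
ENV="${APP_ENV:-dev}"
cp "conf/app-$ENV.properties" \
   "$CATALINA_HOME/webapps/myapp/WEB-INF/classes/app.properties"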

How can I delete Hudson's built artifacts?

We are using Hudson for our Continuous Integration server and it's great. We have 2 issues with it, which are mildly related.
https://hudson.dev.java.net/issues/show_bug.cgi?id=2736 The build order in Hudson means that the downstream dependencies get built a lot more than they need to be. Hopefully this issue will be addressed soon.
Since these things are getting built so frequently, the build history is massive. We really don't need 1000 build items in the history for some of the jobs.
My question is about point 2. I would like something like a job or plugin to delete old artifacts. Keeping, say, the last 20 builds of everything around would be fine. At the moment it seems unbounded, which isn't great from an operations perspective.
UPDATE: As per Norbert's answer, it is in the job configuration. In the 1.300 UI, there is a "Discard Old Builds" checkbox, which allows this to be configured.
There is such a configuration option in our Hudson build server. In the project config I can choose between a number of builds and a period of time to keep. I don't think I installed a plugin for this.
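If you also need to clean up the history already on disk, old builds live under the job directory and can be pruned by hand; a cautious sketch assuming the default layout (back up first, and prefer the "Discard Old Builds" option where possible):

# Keep only the 20 most recent build directories for one job
cd "$HUDSON_HOME/jobs/myjob/builds" || exit 1
ls -1t | tail -n +21 | xargs -r rm -rf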
