Packaging up a project for deployment (Java)

I have a Java application (a quite large one, with many external .jar dependencies as well as dependencies on images) and I need to package it up so that someone can double-click to run it, for example, or something similarly easy.
It uses the Java Persistence API, so it requires an SQL connection, which is specified in the Persistence.xml file in the Java project.
How can I package this up? I was thinking:
the installation process should validate that the user has MySQL installed and if not, direct them to install it
the installation process could ask the user to enter credentials for any database, and then I could update the Persistence.xml settings at run time (a rough sketch of this follows below)
These were two ideas I had, but I wasn't sure if there is a known solution to this problem. Any help would be much appreciated!
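For the second idea, this is roughly the sketch I have in mind; the persistence-unit name "app-unit" is just a placeholder for whatever is in my Persistence.xml, and the javax.persistence.jdbc.* keys are the standard JPA 2.0 property names that override the values in the file:

import java.util.HashMap;
import java.util.Map;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

public class DatabaseBootstrap {

    // "app-unit" is a placeholder for the persistence-unit name in Persistence.xml
    public static EntityManagerFactory connect(String host, String db,
                                               String user, String password) {
        Map<String, String> props = new HashMap<>();
        // standard JPA 2.0 properties; values given here override Persistence.xml
        props.put("javax.persistence.jdbc.url", "jdbc:mysql://" + host + "/" + db);
        props.put("javax.persistence.jdbc.user", user);
        props.put("javax.persistence.jdbc.password", password);
        return Persistence.createEntityManagerFactory("app-unit", props);
    }
}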

I think you should take a look at embedded database solutions, such as H2. You can also package your application with Maven's shade plugin, or with the assembly plugin using its jar-with-dependencies descriptor.
This spares you from having to check for a database server running on the client machine, and it also gives you a proper means of bundling the application into one nice JAR, albeit a slightly large one.
Maven is a build ecosystem and toolset designed for building Java applications, executing code, and generally doing whatever else you can imagine doing with (and to) your code.
It has a rich API for developing plugins, and many developers have taken advantage of it. There are numerous plugins for building, launching, and packaging your application, as well as for managing its dependencies.
Maven's "shading" comes in the form of the maven-shade-plugin, which helps you create a single JAR file from all your dependencies. There is also the maven-assembly-plugin, which offers a predefined jar-with-dependencies descriptor for the same purpose.
H2, on the other hand, is a full-fledged RDBMS that can run embedded in your application. The website is http://www.h2database.com/html/main.html, and it includes a tutorial.
You can find information on embedding the database here:
How to embed H2 database into jar file delivered to the client?
Embedding the Java h2 database programmatically
h2 (embedded mode ) database files problem
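To give an idea of how little is involved, opening an embedded H2 database is just a JDBC call with an embedded-mode URL; the path below is only an example, and H2 creates the database file on first use:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class EmbeddedDb {

    // embedded mode: the database runs inside the application's JVM,
    // so there is no separate server to install on the client machine
    public static Connection open() throws SQLException {
        // "./data/appdb" is an illustrative path; H2 creates the file on first use
        return DriverManager.getConnection("jdbc:h2:./data/appdb", "sa", "");
    }
}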
I would also suggest you use a combination of H2/Hibernate/Spring which is a very easy setup and provides you with really rich features and an easy-to-use API.
I hope this helps you :)

Building a sophisticated installer that checks lots of dependencies, and runs on lots of different platforms (which I assume you want) is complicated.
I suggest that you look at an installer generator; see What is the best installation tool for java?
Another alternative that I've seen in a few products is to write a (non-GUI) installer or configurer in a scripting language like Perl.

I wrote an installer using Ant, but it has no GUI. I have also used IzPack (a good option), so I think it depends on how smart you want the installer to be, whether it will be used by you or by a non-technical person, etc.

Related

Java dependency management from a personal dependency storage/maintenance destination

I have seen many interesting (and duplicated) questions here about "sharing or using classes between projects".
I see this as quite practical but the proposed solutions I have read about definitely assume certain prerequisites such as:
shared eclipse workspaces
projects that can be made as dependencies of one another
common servers such that classpaths can be added with local URLs
While likely acceptable solutions, I am looking for an alternative with perhaps greater flexibility and portability.
I am thinking of learning how to use gradle (or maybe maven, I haven't fully committed to one or the other yet). And from what I understand it may be possible to manage shared classes with one of these dedicated dependency management programs.
Theoretically, is this possible? Can I set up a Gradle- or Maven-enabled Java project to handle and keep up to date personal classes on a local server, a folder on a portable drive, or a cloud mirror?
The way I understand dependency management at the moment (on a superficial level; I know the devil is in the details) is that, for a project with dependency management configured, Gradle/Maven will handle classpath additions and the actual version-specific comparison, retrieval, and storage of JARs from external sources (and maybe even compilation, but I don't know about that).
Rather than go through the steps of setting up classpaths to JARs that I would have to keep current and compile myself, as proposed in many other answers, I am considering creating a dummy project on a server where I can put generic classes, and then pointing numerous individual Gradle/Maven-enabled Java projects at it. (I think most people would keep these as stand-alone classes, but I might need to keep them in a dummy project so they can be developed and debugged in context from a main class. I am somewhat new to Java architecture, so if the only thing that makes this solution impossible is pointing to a "project" instead of a "library", I can adjust from there, assuming I am even applying the concept of a "library" appropriately.)
Other info:
I would like this to simplify personal dependency management in both the NetBeans and Eclipse IDEs and to work cross-platform (Linux and Windows are what I plan to test it on)
So you're looking for portability, and you don't want to compile your java class that you want to share between projects. And you don't mind a local deployment.
The first thing that comes to mind for me is Git - I'm not sure if Gradle/Maven deal in the gritty underworld of the uncompiled. Composer will pull in git repos for php, so that got me thinking.
If you're happy with one-way sharing of code among projects, Git has submodules that let you do that.
But searching around, apparently there's a git script that goes one step further - Git Subtrees. I also found an intriguing tutorial that will allow you to make changes to common code that you change while working on any particular project that shares it - so obviously be careful - but check it out and see if the Subtrees script might suit your needs.
Actually, I don't see too much sense in dependency management at the "class" level. Typically you would bundle your classes in a JAR file, which in turn can be considered a unit with a particular functional range. Such a JAR is suitable to be put under dependency management.
If you are new to such tools, I'd recommend Maven. It is widely used in the Java world and well-integrated in common IDEs. If you stick to its conventions, it will take care of your whole build process from compiling, testing to packaging. There are a lot of plugins available that let you customize practically everything in a simple XML based configuration. You'll have your first project running in 30min and your current project migrated in another 30min.
To share your code with others, you still need a repository to which you can upload your Maven-built artifacts. Depending on your preference there are many possibilities: shove it to Amazon S3, publish to Maven Central, or install your own Sonatype Nexus in your private network.

Batch Java Help

My company is trying to determine the best strategy for implementing batch Java programs. We have a few hundred (and growing) separate Java programs. Most of them are individual Jasper reports, but some are bigger batch Java jobs. Currently, each Java project is packaged as an independent JAR file using Eclipse's export option. Those JARs are then deployed to our Linux server manually, where they are tested. If they pass testing, they are migrated up through QA and on to production through a home-grown source code control system.
Is this the best strategy for doing batch Java? Ongoing maintenance can be a hassle since searching JAR files is not easy, and different developers are creating new Java projects (new reports) every week.
Importing existing projects from the JAR files into Eclipse is a tricky process as well. We would like these things to be easier. We have thought about packaging all the code into one big project and writing an interface to execute the desired "package" (i.e., program), maybe using a web server.
What are other people/companies doing out there with their batch Java programs? Are there any best practices out there on this stuff? Any help/ideas/working models would be appreciated.
I would say that you should be able to create one web-based app for accessing Jasper reports, rather than a bunch of batch processes. Then, when you need to deploy a new report, just deploy a minor update that accesses a new compiled Jasper report file.
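As a rough sketch of what such a web app would do per request (the report path and parameter name are illustrative placeholders), filling and exporting a compiled report with the JasperReports API boils down to two calls:

import java.sql.Connection;
import java.util.HashMap;
import java.util.Map;
import net.sf.jasperreports.engine.JRException;
import net.sf.jasperreports.engine.JasperExportManager;
import net.sf.jasperreports.engine.JasperFillManager;
import net.sf.jasperreports.engine.JasperPrint;

public class ReportRunner {

    // "reports/sales.jasper" and the parameter name are illustrative placeholders
    public static void runToPdf(Connection connection, String outputPath) throws JRException {
        Map<String, Object> params = new HashMap<>();
        params.put("REPORT_TITLE", "Monthly Sales");
        // fill the compiled report template with data from the JDBC connection
        JasperPrint filled = JasperFillManager.fillReport("reports/sales.jasper", params, connection);
        // export the filled report to a PDF file
        JasperExportManager.exportReportToPdfFile(filled, outputPath);
    }
}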
That said, you should be checking your code, not your binaries, into a Subversion or Git repository. Dump the "home grown" source control repository. Life is too short to try to home grow stuff like that. Just use Git or Subversion, they're proven, simple, and functional. When you import a new project, just pull it down from Subversion, don't try to import the JAR file from your Eclipse IDE.
Put your JAR files into a Maven repository such as Nexus, and deploy to QA and Production from there. Create automated builds for every project (be that with Maven or something else). Don't depend upon an IDE to export your JAR files. IDEs change, and exporting from an IDE introduces more opportunity for human error. Also, different developers will prefer different IDEs. By standardizing on something like Maven, you're a bit more IDE-agnostic.
My company has standardized Java batch execution on IBM WebSphere Extended Deployment.
Here is an article introducing techniques for programming and deploying Java batch: http://www.ibm.com/developerworks/websphere/techjournal/0801_vignola/0801_vignola.html
Introduction to batch programming using WebSphere Extended Deployment Compute Grid
Christopher Vignola, WebSphere Architect, IBM
Commonly thought of as a legacy "mainframe" technology, batch processing is showing itself to be a venerable workload style with growing demand in Java™ and distributed environments. This article introduces an exciting new capability for Java batch processing from IBM®, the leader in batch processing systems for the last 40 years. This content is part of the IBM WebSphere Developer Technical Journal.
WebSphere Extended Deployment Compute Grid provides a simple abstraction of a batch job step and its inputs and outputs. The programming model is concise and straightforward to use. The built-in checkpoint/rollback mechanism makes it easy to build robust, restartable Java batch applications.
The Batch Simulator utility provided with this article offers an alternative test environment that runs inside your Eclipse (or Rational Application Developer) development environment. Its xJCL generator can help jump-start you to the next phase of testing in the Compute Grid unit test server.
But even if you are not interested in the product, the article is a must read anyway.

What are the best methods for deploying java code to production?

Currently we have a Java Restlet API with dependencies controlled via Maven. When we update the API we run mvn assembly:assembly, which runs the unit tests etc. and produces a single JAR file. We then upload this to the production server and run it using nohup.
Is there a better or more automated way of doing this? Is this where something like Hudson would come in?
Thanks
My experience is with webapp deployment, but the same should hold true here. Use Maven, Cargo, Nexus (or Artifactory), Hudson and probably Jira in conjunction for product releases.
Automated release processes are more reliable because there is no human factor involved that may forget a step.
We also use Liquibase for database versioning, and if you are dealing with database changes in your application deployment, you'll find that Liquibase gives you a lot of confidence when running alter scripts.
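For what it's worth, embedding Liquibase in an application or deployment step takes only a few lines; this is a minimal sketch, assuming a change log at db/changelog.xml on the classpath (the path is a placeholder):

import java.sql.Connection;
import liquibase.Liquibase;
import liquibase.database.Database;
import liquibase.database.DatabaseFactory;
import liquibase.database.jvm.JdbcConnection;
import liquibase.resource.ClassLoaderResourceAccessor;

public class SchemaUpdater {

    // "db/changelog.xml" is a placeholder for your Liquibase change log on the classpath
    public static void update(Connection jdbcConnection) throws Exception {
        Database database = DatabaseFactory.getInstance()
                .findCorrectDatabaseImplementation(new JdbcConnection(jdbcConnection));
        Liquibase liquibase = new Liquibase("db/changelog.xml",
                new ClassLoaderResourceAccessor(), database);
        liquibase.update("");   // applies any change sets not yet run against this database
    }
}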
I would suggest going through the following resources:
Automated Deployment with Maven - going the whole nine yards If you can, literally follow this pattern.
Maven 2 Effective Implementation -- this book really helped us a lot.
There are several Maven plugins to help deployment. The most general of them is Cargo, but there are also app server specific plugins for some concrete servers like JBoss.
Most companies I have worked for (actually, all) have had some sort of custom in-house built deployment system; even if build was done using a standard framework (like Maven in use at my current company).
Part of this is because there are many aspects that tie closely to company-specific infrastructure, capacity management and monitoring systems; and so even though there are open-source systems, there is usually something that needs to be tweaked.
It sounds like you are running your app on its own--it isn't part of any application server. If you aren't using an application server, there are probably some ways to get cargo and maven to deploy it for you, but you may be better off just using some shell scripts to deploy and run the application.
However, as your application grows, you may find a need for an application server like Jetty, JBoss, Glassfish, Tomcat, etc. When this happens, take a look at the cargo plugin for Maven because it will allow you to do something like:
mvn cargo:redeploy
That will package up your application, send it to the server, and restart the app. If you want Hudson to do this for you automatically, you can add it as a build target.
Cargo can save you a lot of time when you have to frequently update an application server.

Managing a Large OSGi Application

I have a large, growing OSGi application with a number of bundles. I am curious to know the best way to manage this type of application. Currently, I am using Eclipse and Maven, but although this is great for building bundles (via maven-bundle-plugin), as of now it has not been easy to manage the entire application.
What I would like to do is either have ONE run configuration or ONE pom.xml that can be launched and the entire application/project be built and launched. Also, I would like to have something that would be good for debugging.
I have heard of PAX Construct and have it installed in Eclipse, but so far it has been of little help (maybe I'm not using it correctly).
I am sure there are people out there with large OSGi applications that are being managed correctly. Any advice that could be shared would help tremendously.
Thank you,
Stephen
A run configuration is possible via Pax Runner. It lets you choose OSGi platform implementation, specify profiles (pre-packaged sets of bundles for some role, e.g. web, log, ds, etc.) and has good provisioning support, for instance it can load bundles from Maven repository. As a result, you can have a run configuration like
--platform=felix
--log=INFO
--profiles=scalamodules,ds,config,log
mvn:com.my/bundle/1.0.1-SNAPSHOT#update
# other bundles
In case your application is very large or you have different applications, there is also a way to create your own profiles.
Well...
It all depends on what you mean by "managing" the application.
For dev-time launching, building and debugging, the Eclipse IDE should fit the bill just perfectly.
Maven... I can't speak for it, as I've never used it myself.
We have a pretty large Eclipse-based application (several, actually), and on the dev side of things we are not using anything special besides Eclipse and its integrated SCM.
On the CC build server, we also use headless Eclipse to do the building and packaging.
Now the setup of the workspace has gotten a bit out of hand of late, with all the dependencies and intermediate build steps, so we are investigating Buckminster for managing the materialization of target platform and workspace resources.
If that works out, we'll probably move to building with Bucky as well - it sure looks promising.
(I do not have any experience with PAX, but at a glance, it looks promising as well...)
I'm quite new to OSGi, but wouldn't it be possible to use the OBR service in such a way that you have one OBR repository file which lists the bundles you need, and let the OBR service figure out the dependencies and populate your OSGi host for you?
This area, I think, has very poor support at the moment. OSGi doesn't really define anything about deployment or packaging, so it's up to other frameworks (e.g. Eclipse) to come up with their own way of doing it.
If you are building an RCP (Eclipse-based) application, then the Eclipse tooling does all this stuff, right down to creating exes etc. However, builds are mainly done from the Eclipse workspace; headless builds are trickier. The Tycho project is trying to make this more sensible by joining the Maven and Eclipse build cycles, but it is still focused on RCP applications rather than generic OSGi.
If you not doing RCP, which is my situation as well, then you probably have to roll your own solution, as I haven't found any general solution. Here's an outline of what we do:
We define one POM project that lists all the bundles contained in your application. All this project does is list the references -- let's call it the 'bundle-list' project.
Then we use Pax provisioning to run the project in development mode. This is achieved by making the 'bundle-list' POM the parent of the provisioning POM of the Pax project (usually in the 'provision' folder). Then, when you start Pax, it uses the list of bundles from that project to start OSGi. The bundle references in the 'bundle-list' project have to be marked with 'provided' scope for this to work.
Then, to create a distribution, we have another project. This project also has the 'bundle-list' project as its parent. It uses various plugins to create a distribution, including downloading the bundle JARs. The distribution includes scripts that start up OSGi, but these are hand-written; there are no Pax systems here.
This works well for us to keep the list of bundles in one place, but there are still a lot of hand-written scripts, and there are issues sharing configuration between the two systems -- e.g. config files, bundle start levels, etc.

What's the best way to add a self-update feature to a Java Swing application?

I'm trying to figure out a way to add a self-update feature to a Java/Swing application I'm working on.
Basically I've got a bunch of jar files with extra functionality to be re-deployed to the installed users when they change. Nothing complicated, just check if a new version has been released, download them over HTTP, and then optionally offer to restart the app to the user.
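To illustrate, the check-and-download part I have in mind is something along these lines (the URL layout and folder names are only placeholders):

import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class UpdateChecker {

    // baseUrl and file names are illustrative; adapt to your own layout
    public static boolean fetchIfNewer(String baseUrl, String installedVersion) throws Exception {
        String latest;
        try (InputStream in = new URL(baseUrl + "/latest-version.txt").openStream()) {
            latest = new String(in.readAllBytes()).trim();
        }
        if (latest.equals(installedVersion)) {
            return false;   // already up to date
        }
        Path updateDir = Files.createDirectories(Paths.get("update"));
        try (InputStream in = new URL(baseUrl + "/app-" + latest + ".jar").openStream()) {
            Files.copy(in, updateDir.resolve("app-" + latest + ".jar"),
                    StandardCopyOption.REPLACE_EXISTING);
        }
        return true;    // caller can now offer the user a restart
    }
}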
I had a look at webstart, and it could work. But this particular app does some funky stuff with classloading and GC memory settings that don't look like they are supported via webstart, or will at least complicate matters. (It's a tweaked build of JMeter)
I also went down the road of adding in this plugin handler http://swing-fx.blogspot.com/2008/06/add-auto-update-and-plugins-to-your.html, but it is very alpha, and tries to do too much with the usual bugs you get with alpha stuff.
I did the exact same thing. But that was long back so there are probably better tools today.
What I found out I needed was a loader. The loader's main program did not have the app JARs on its classpath. It first downloaded an update if required, then created a custom classloader with the app JARs on the classpath and invoked the main method of the application's main class. It is not very complicated. IIRC I needed to do this because the JARs could not be overwritten on Windows if they were already on the classpath.
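A minimal sketch of such a loader, assuming the application JARs sit in a lib folder and using a hypothetical main class name, could look like this:

import java.io.File;
import java.lang.reflect.Method;
import java.net.URL;
import java.net.URLClassLoader;

public class Loader {

    public static void main(String[] args) throws Exception {
        // ... download any pending updates into lib/ first ...
        File[] jars = new File("lib").listFiles((dir, name) -> name.endsWith(".jar"));
        URL[] urls = new URL[jars.length];
        for (int i = 0; i < jars.length; i++) {
            urls[i] = jars[i].toURI().toURL();
        }
        // the app JARs are only on this classloader's classpath, not the loader's own
        URLClassLoader appLoader = new URLClassLoader(urls, Loader.class.getClassLoader());
        // "com.example.app.Main" is a hypothetical application main class
        Class<?> mainClass = Class.forName("com.example.app.Main", true, appLoader);
        Method main = mainClass.getMethod("main", String[].class);
        main.invoke(null, (Object) args);
    }
}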
Hope this helps.
We had a Swing app 6 years ago that had self-update. Like you suggested:
1) it downloaded the latest JARs over HTTP,
2) copied them to a folder,
3) since the Swing app was launched using a .BAT file, after the user said YES we would shut down the Swing app and look for any files in the update folder; if there were any, we launched another .BAT file to copy the new JARs to the required directory,
4) then relaunched the Swing app.
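The copy step in 3) could just as well be done by a small Java helper launched from the .BAT file; a rough sketch, with illustrative folder names:

import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class ApplyUpdates {

    public static void main(String[] args) throws IOException {
        Path updateDir = Paths.get("update");   // where new JARs were downloaded
        Path libDir = Paths.get("lib");         // where the app loads its JARs from
        if (!Files.isDirectory(updateDir)) {
            return;   // nothing to do
        }
        try (DirectoryStream<Path> jars = Files.newDirectoryStream(updateDir, "*.jar")) {
            for (Path jar : jars) {
                Files.copy(jar, libDir.resolve(jar.getFileName()),
                        StandardCopyOption.REPLACE_EXISTING);
                Files.delete(jar);
            }
        }
    }
}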
Updates, plugins, separation of concerns etc. are exactly what OSGi is about -- you might want to take a look at it. It won't come for free (read: there is a steep initial learning curve, especially when you are currently using classloading tricks), but at least there are good open source implementations (Felix -- see felix.apache.org, Equinox -- see www.eclipse.org, and others).
For these implementations auto-updaters are available -- if you write your modules correctly, it's possible to update at runtime without restarting.
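As a rough illustration (independent of any particular auto-updater), the core OSGi call involved is Bundle.update(InputStream); the URL below is a placeholder:

import java.io.InputStream;
import java.net.URL;
import org.osgi.framework.Bundle;
import org.osgi.framework.BundleContext;

public class BundleUpdater {

    // replaces the bundle's content in place; the framework swaps in the new
    // revision without a JVM restart (a refresh may be needed for dependents)
    public void updateBundle(BundleContext context, long bundleId, String newJarUrl) throws Exception {
        Bundle bundle = context.getBundle(bundleId);
        try (InputStream in = new URL(newJarUrl).openStream()) {
            bundle.update(in);
        }
    }
}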
I believe you should look again at Java WebStart, or at least detail the "funky classloading" which you think is going to cause problems (as it might also cause problems with any solution proposed here).
IIRC, you can set command line parameters using Java WebStart ( http://java.sun.com/j2se/1.5.0/docs/guide/javaws/developersguide/syntax.html#resources ).
I would definitely first try out Webstart. We've had lots of success launching even the early Eclipse RCP apps using Webstart, and you can probably not get more funky classloading issues than with the OSGI framework (Eclipse Equinox).
Could you perhaps give some more detail in your question about your classloading approach?
Regarding the GC and other VM settings: these are easy to specify in your JNLP (Java Network Launching Protocol) files used by Webstart for launching apps.
Java Web Start is a good choice. The GC stuff is not important, but the classloading could be a problem. Once you are trusted by the user, you can be granted AllPermissions and you will be able to do custom classloading. Maybe it would be good to reconsider the funky classloading stuff -- is it really necessary? Also look at NetBeans; its auto-update mechanism might provide some inspiration.
