I'm looking for a tool to run Java server processes as daemon services on Linux (and potentially on Windows and other OSes). I'm also looking for best practices for building production-capable scripts and launch configuration.
I'm familiar with best practices for a project's build: using Apache Maven, or something like Apache Ant + Ivy, to manage the build process, external dependencies, build artifacts and assemblies.
When it comes to creating a project's assembly containing configuration and launch scripts along with all the compiled code and dependencies, I'm unclear what the best choice is. Is there a good open source project I could look at as an example that bundles a service wrapper and configuration scripts with its build process?
I've been able to use Maven with the Jetty launch plugin to run my web applications, the Terracotta Maven plugin to test multiple clustered JVM server nodes, and Maven's exec:java to run my custom Java servers. But I'm not sure that using Maven in that capacity is really "production" quality; it also means my production servers depend on building the servers from source and downloading dependencies from potentially unavailable servers.
Here are the things I'm looking for in a Java service launcher solution:
Should run as a Linux service or Windows service process
Can be built using a Maven plugin or Ant script and allow me to process configuration files and scripts
Should be able to include all of my project's dependencies (as declared in my Maven or Ant/Ivy build)
Should be able to pull in a full Java Web Application server (e.g. Jetty 7) and be configured with my custom Web application's war
Should be able to handle a standard Java daemon service (custom java server)
One of the options I've been looking at is the Java Service Wrapper, which can be used via the Maven appassembler plugin.
Also, using the Maven assembly plugin with custom assembly descriptors allows me to tailor the build output.
The Java Service Wrapper seems to be quite common. I've seen it used by a few projects, most notably Nexus.
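To make that concrete, here is a minimal sketch of how the appassembler route could look in a pom.xml. The main class, daemon id and repository layout below are placeholders, and the element names should be double-checked against the appassembler plugin documentation for the version you use:

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>appassembler-maven-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>generate-daemons</goal>
      </goals>
      <configuration>
        <!-- copy dependencies into a flat lib/ repository next to the scripts -->
        <repositoryLayout>flat</repositoryLayout>
        <daemons>
          <daemon>
            <id>my-java-server</id>                      <!-- placeholder -->
            <mainClass>com.example.MyServer</mainClass>  <!-- placeholder -->
            <platforms>
              <!-- generate Java Service Wrapper scripts for Linux and Windows -->
              <platform>jsw</platform>
            </platforms>
          </daemon>
        </daemons>
      </configuration>
    </execution>
  </executions>
</plugin>

The assembly plugin can then zip the generated wrapper binaries, scripts, lib/ repository and your own configuration files into a single distributable archive.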
The preferred (aka "best practice") way to implement Linux services of all kinds is to create a shell script that can start, stop and restart the service, put it into /etc/init.d, and then add the appropriate symlinks to it from the relevant "rc.*" directories. Refer to the Linux man pages for init(8), chkconfig(8) and so on.
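A bare-bones sketch of such a script follows. The paths, service name and Java invocation are placeholders, and a real script should also run the server as a dedicated user and check whether the process is already running:

#!/bin/sh
# /etc/init.d/myserver -- minimal init script sketch (placeholders throughout)
# chkconfig: 345 90 10
# description: My custom Java server

JAVA=/usr/bin/java
APP_HOME=/opt/myserver
PIDFILE=/var/run/myserver.pid

case "$1" in
  start)
    echo "Starting myserver"
    nohup "$JAVA" -jar "$APP_HOME/myserver.jar" > "$APP_HOME/console.log" 2>&1 &
    echo $! > "$PIDFILE"
    ;;
  stop)
    echo "Stopping myserver"
    [ -f "$PIDFILE" ] && kill "$(cat "$PIDFILE")" && rm -f "$PIDFILE"
    ;;
  restart)
    "$0" stop
    sleep 2
    "$0" start
    ;;
  *)
    echo "Usage: $0 {start|stop|restart}"
    exit 1
    ;;
esac

Once the script is in place, chkconfig --add myserver (or update-rc.d on Debian-based systems) creates the rc.* symlinks for you.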
What is the 'best practice' way of separating Maven deployment configuration from the build config?
I have a war project that is built by Jenkins. I'd like Jenkins to deploy this to Elastic Beanstalk, but alas, the best solution available at the moment is to use the beanstalk-maven-plugin.
I'm not sure it makes sense for the pom.xml to include information about deployment; after all, at build time that .war could end up anywhere.
In this situation, is there some way of using Maven modules to store the beanstalk-maven-plugin config in a separate POM from that of the actual software project?
I think you have two solutions.
Just add the beanstalk-maven-plugin definition to your regular pom.xml. The configuration can be stored in a separate properties file or provided via system properties on the command line (-D options). Add the beanstalk goal to the Maven command line in Jenkins, so each build is deployed to Beanstalk. Alternatively, you can define another project in Jenkins that runs only the deployment, without compilation; you can run that deployment project on a schedule or via project dependencies in Jenkins.
Create yet another Maven project that just runs the Beanstalk plugin. I personally do not see serious advantages in doing this.
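A rough sketch of the first option, keeping the deployment bits in a profile so ordinary builds are unaffected. The plugin coordinates are those of the br.com.ingenieux beanstalk-maven-plugin; the parameter names below are illustrative and should be checked against the plugin documentation:

<profiles>
  <profile>
    <id>beanstalk-deploy</id>
    <build>
      <plugins>
        <plugin>
          <groupId>br.com.ingenieux</groupId>
          <artifactId>beanstalk-maven-plugin</artifactId>
          <configuration>
            <!-- values supplied via -D options or a properties file, not hard-coded in the POM -->
            <applicationName>${beanstalk.applicationName}</applicationName>
            <environmentName>${beanstalk.environmentName}</environmentName>
            <s3Bucket>${beanstalk.s3Bucket}</s3Bucket>
          </configuration>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>

The Jenkins build step would then invoke something like mvn clean package beanstalk:upload-source-bundle beanstalk:update-environment -Pbeanstalk-deploy -Dbeanstalk.applicationName=myapp (the exact goals depend on the plugin version).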
I think about three things:
a. I'm not sure (I admit I was a bit busy trying to get 0.2.7-RC7 out), but I think Elastic Beanstalk configuration files are supported for Java.
So it could perhaps be a good idea to separate the configuration that way (I admit managing configuration in Beanstalker is boring).
b. Another option is using the maven-war-plugin's overlay feature to create a war which depends on your other war.
c. In my own case, if you ask, I do have a separate deployment profile in Maven, and that feature often comes in handy.
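As an illustration of option b, a thin deployment war can overlay the real application war; all coordinates here are placeholders:

<!-- deploy-war/pom.xml (sketch) -->
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>deploy-war</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>war</packaging>

  <dependencies>
    <!-- the real application; the maven-war-plugin merges ("overlays") it into this war -->
    <dependency>
      <groupId>com.example</groupId>
      <artifactId>my-app</artifactId>
      <version>1.0-SNAPSHOT</version>
      <type>war</type>
    </dependency>
  </dependencies>
</project>

This module then carries only the environment-specific configuration and the deployment plugin setup (depending on the war plugin version, you may need a web.xml or failOnMissingWebXml set to false).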
I'm currently building a desktop Java application in a very clumsy manner. The application is deployed on Windows, Mac and Linux. Here's my build process now:
On Windows:
Update local repository
Fire up Eclipse
Refresh the project
Double click the .jardesc file to generate an executable jar file
Commit the executable jar to source control
Open up the .nsi script and click the build button (I have the NSIS plugin installed) to produce the .exe installer
Upload installer to ftp server to publish
On Mac:
Update local repository
Run shell script to generate .dmg file using .jar in source control
Upload to ftp server to publish
On Linux:
Update local repository
Run shell script to generate .deb file using .jar in source control
Upload to ftp server to publish
I'd also like to include some extra steps in my build in the future, such as:
Setting build date
Setting the HEAD git commit-id
Performing some code obfuscation
Any suggestions on how I can streamline and speed up this process?
If you are serious about having a good build system, then I'd recommend learning and using Maven, which provides:
Comprehensive project build lifecycle management based on a declarative project definition (pom.xml)
A huge range of plugins, which I expect will be able to handle all the specific build steps you require
Very good integration with Eclipse
Full dependency management (including automatic resolution and download of dependencies)
This is not for the faint-hearted (Maven is a complex beast), but in the long run it is a great solution.
First step would be to just get everything building without Eclipse.
You might also want to consider using something like Jenkins to automate some of this. You'll still require build scripts.
A solution could look like this:
Update repository.
Jenkins detects update and builds the jar.
Jenkins saves the jar to some location.
Then you can have separate builds for each OS, also running in Jenkins. These could be triggered automatically on successful completion of the first build. These would each:
Pick up the jar from the previous build.
Publish the OS specific binary to an FTP site.
Ant is a good start, but you may also want to look at Apache Ivy or Maven, as these will help a bit with managing your build outputs and dependencies.
You should have a look at Ant: https://ant.apache.org/
Apache Ant is a Java library and command-line tool whose mission is to drive processes described in build files as targets and extension points dependent upon each other. The main known usage of Ant is the build of Java applications.
Also, a long list of build systems: https://en.wikipedia.org/wiki/List_of_build_automation_software
I have a multi-module Maven project, and I created a new module that depends on 3 other modules. (I already have a web app Maven module that produces a .war file; now I need this.)
This module's output is a .jar, and it also has a few resources:
spring context xml file
properties file
Now I want to produce a production-ready folder so I can upload it to my server. I am hoping Maven can do this for me.
I need the following layout:
myjar.jar
/libs/ (the 3 other Maven modules that are dependencies)
/resources
Also, there are some generic dependencies that my parent pom.xml has, like slf4j/log4j, that I also need to package.
It would be cool if I could add a switch to mvn that produces this, like:
mvn clean install production
I plan on running this on my server via the command line.
I think what you are looking for is a Maven Assembly:
https://maven.apache.org/plugins/maven-assembly-plugin/
You can use profiles to disable the generation of the assembly by default (this can speed up the development process).
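A sketch of an assembly descriptor that produces the layout from the question (the module's own jar at the top level, dependencies under libs/, configuration under resources/). The descriptor format should be checked against the assembly plugin version you use:

<!-- src/main/assembly/production.xml -->
<assembly>
  <id>production</id>
  <formats>
    <format>dir</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <dependencySets>
    <dependencySet>
      <!-- the 3 sibling modules plus inherited deps such as slf4j/log4j -->
      <outputDirectory>libs</outputDirectory>
      <useProjectArtifact>false</useProjectArtifact>
    </dependencySet>
  </dependencySets>
  <fileSets>
    <fileSet>
      <!-- the module's own jar goes to the top level -->
      <directory>${project.build.directory}</directory>
      <outputDirectory>/</outputDirectory>
      <includes>
        <include>*.jar</include>
      </includes>
    </fileSet>
    <fileSet>
      <!-- spring context xml and properties files -->
      <directory>src/main/resources</directory>
      <outputDirectory>resources</outputDirectory>
    </fileSet>
  </fileSets>
</assembly>

Bind the assembly plugin's single goal to the package phase inside a profile (as the next answer suggests), and running mvn clean install with that profile enabled produces the directory under target/.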
@puce is right that you may be best off using the Assembly Plugin. What you can't easily do is add another lifecycle, 'production', to Maven. If you have time you could write a plugin to do this, but you might be better off using a profile called 'production' or 'prod-deploy' to enable the copying into place on the server.
mvn clean install -Pprod-deploy
One thing to remember with Maven is that it is very good at building projects using its conventions, but it is pretty bad at scripting things to happen outside of the build lifecycle.
I have on several occasions used external scripting tools such as Ant, Python, Bash or Groovy to first run the build using mvn and then script the deployment in a more natural way.
The intention of Maven is building, not deployment to production. For that purpose I would recommend tools like Chef or Puppet. From a technical point of view it is of course possible to handle such things via Maven. It is also possible to build on a CI solution like Jenkins; furthermore, it is possible to run a script from Jenkins to do the deployment to production.
If I were using Python I would probably use pip as a nice installer for continuous delivery, with its repository integration and scripting capabilities.
Is there anything similar in Java which would be useful for continuous deployment?
Can someone recommend how they do full continuous deployment in Java?
I'm going to have multiple servers with complex configurations and multiple huge clusters with databases, NoSQL stores and so on (some of the projects use Maven, while others are just downloaded packages). Does anyone have a recommendation for that?
Again, I think pip is a very nice installer and could help me. Does anyone have experience with Ubuntu Juju?
However, if I used Ubuntu Juju that would mean I would have to use Ubuntu-based servers and not CentOS.
There's a kind of bright line between Java app build and Java app deployment. Build CI in Java is pretty straightforward with a variety of tools available - build scripting (Ant, Maven, Gradle, etc), continuous builds (Jenkins, Go, Anthill, etc), and repositories (Nexus, Artifactory, etc). Dependency management for libraries is a hairball for Java, so definitely use Maven or Ivy for it.
Deployment is a much wilder and less mature world. The environments are potentially far more complex, and often include messy non-Java things like relational databases. You can hand-roll scripts, or use ControlTier or Capistrano or something like that (which will still involve some hand-rolling).
I'm not completely clear on what pip does, but here is my toolchain for CI/CD.
You need a build tool:
Maven (does a lot of stuff, including downloading dependencies and driving you crazy)
Ant (will poke you until you die with XML brackets)
Gradle and others (pretty much everybody, including Ant, uses or can use Ivy for downloading dependencies from repositories)
You need a CI server:
Jenkins
various commercial options (TeamCity, Bamboo ...)
For the deployment part you need something to deploy your apps.
This really depends on the build tool you use (which should be able to do the deployment). Maven has some plugins for this afaik, but I think you will have to google for your app server and your build tool to find a solution for your specific need.
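For example, deploying a war to a remote Tomcat can be done with the Cargo Maven plugin; the container id, hostname and credential properties below are placeholders and should be checked against the Cargo documentation:

<plugin>
  <groupId>org.codehaus.cargo</groupId>
  <artifactId>cargo-maven2-plugin</artifactId>
  <configuration>
    <container>
      <containerId>tomcat7x</containerId>
      <type>remote</type>
    </container>
    <configuration>
      <type>runtime</type>
      <properties>
        <!-- target server and Tomcat manager credentials (placeholders) -->
        <cargo.hostname>app-server.example.com</cargo.hostname>
        <cargo.remote.username>${tomcat.user}</cargo.remote.username>
        <cargo.remote.password>${tomcat.password}</cargo.remote.password>
      </properties>
    </configuration>
  </configuration>
</plugin>

With something like that in place, mvn cargo:redeploy (typically run from the CI server after the tests pass) pushes the latest war to the container.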
Probably what you are looking for is building a deployment pipeline. Check a video example here: http://www.youtube.com/watch?v=6CEQOuHM86Y
There are multiple ways to achieve it. I'll tell you my preferred one.
Components you will need:
VCS server (SVN, Git)
CI Server (Jenkins, Hudson, TeamCity)
Build Tool (Maven, Ant, Gradle)
Artifact Repository (Artifactory, Nexus)
Deployment Tool (Rundeck, Puppet, Deployinator, Capistrano)
Target Environment(s) (an application server like Tomcat or JBoss)
Workflow:
1) The CI server polls the VCS server for changes
2) When a change is found (i.e. a commit), it starts a job execution that produces an artifact (the CI server will compile and run the tests). Internally the CI server uses a build tool like Maven.
3) The CI server uploads the artifact to an artifact repository
4) The deployment tool reads the artifact repository and provides a list of artifacts that can be deployed, plus the list of target environments; a developer or ops person can select a combination of the two and deploy the artifact to the selected server.
Take some criteria into consideration when picking the tools. If you have a really big infrastructure (200+ target environments), robust solutions like Puppet make sense. If you have a modest one (say, 10 target environments), then Rundeck may be enough. Also take into consideration that some of the listed tools are not free (Puppet Enterprise is not free beyond ten nodes, for example).
Step two of "The Joel Test: 12 Steps to Better Code" states "Can you make a build in one step?". My answer to this is currently no. My application is structured as follows:
+
+-MyApp         // a vanilla Java application
+-MyWebApp      // a dynamic Java web application (deployed to Tomcat; it
                // launches a thread contained in MyApp)
+-MyCommonStuff // common classes shared between MyApp and MyWebApp,
                // e.g. database access code & business classes
In order to build and deploy my software I perform the following steps:
1. Check out MyApp, MyWebApp and MyCommonStuff from svn
2. Build MyCommonStuff.jar and copy it to a "libs" directory
3. Build MyApp and copy it to a "libs" directory
4. Build MyWebApp.war (the Ant build.xml file specifies where MyApp.jar and MyCommonStuff.jar are located)
5. The deploy portion of build.xml uses Tomcat deployment tasks to deploy to a Tomcat server
My question is: does the Joel rule above apply to this scenario, i.e. should there be a "master" build script which executes steps 1 to 5?
Should the script just be a normal #!/bin/sh script, or are there tools I can leverage? My preference would be to stick to Ant and Linux console commands.
Thanks
You can (and should) use Maven 2. It supports everything required (via plugins); you just need to conform to its directory conventions.
In addition I'd suggest a continuous integration engine, which will take your Maven configuration and execute and deploy everything. Hudson and TeamCity are good options.
An alternative to Maven, if you just want to use Ant, is Ivy. This is just a dependency manager, a bit like Maven but without all the other stuff Maven does.
I would suggest using one of the two. If you have a project with dependencies like this, you're going to make it so much easier for yourself if you store them in a central repository and use a dependency manager to include them!
You should create a global Ant script that calls each of the smaller Ant builds through the <ant> task.
Edit after reading other answers: you should also consider Maven. But if Maven is really overkill and you just want to launch the whole build in one step, use a global build.xml.
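A minimal sketch of such a master build.xml, chaining the three module builds from the question; the directory and target names are assumptions about your existing per-module build files:

<project name="master-build" default="deploy" basedir=".">

  <target name="common">
    <ant dir="MyCommonStuff" target="jar" inheritAll="false"/>
  </target>

  <target name="app" depends="common">
    <ant dir="MyApp" target="jar" inheritAll="false"/>
  </target>

  <target name="webapp" depends="common">
    <ant dir="MyWebApp" target="war" inheritAll="false"/>
  </target>

  <target name="deploy" depends="app,webapp">
    <!-- reuse the existing Tomcat deployment target in MyWebApp/build.xml -->
    <ant dir="MyWebApp" target="deploy" inheritAll="false"/>
  </target>

</project>

Running ant -f build-all.xml (after an svn update, or from a CI server that does the checkout for you) then covers steps 1 to 5 in one command.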