I am trying to find a solution for the following puzzle. I have Java projects, managed by Maven, which need some native dependencies to work (to run unit and integration tests). Those are provided in the form of deb packages, which need to be installed prior to running a build.
I use Jenkins for CI. The native dependencies cannot be installed on the Jenkins nodes, because they conflict with other builds and they change often. What I do now is create a Jenkins job of type 'freestyle' rather than 'maven', and use pbuilder to create a clean sandbox, install everything that is necessary, and invoke the Maven build.
This works well, but I am losing the Jenkins Maven goodies like automatic upstream projects, triggering a build when a dependency changes, etc. Jenkins simply does not know that Maven is there.
Finally, my question: is there a way to achieve both, i.e. isolate the build so the installed libraries do not affect other builds, and still leverage the 'magic' Jenkins applies to Maven builds and their dependencies?
You could split your build into three jobs, each triggering the next one (a sketch follows the list):
Create the needed environment
Run the Maven job
Clean up
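Since you already use pbuilder, each of the three jobs can be a thin shell wrapper. A rough sketch, assuming a Debian-based node; install-deps.sh and run-maven.sh are hypothetical wrapper scripts, and your-native-deps is a placeholder package name:

# Job 1: create the sandbox and bake the native dependencies into it
sudo pbuilder create
sudo pbuilder execute --save-after-exec -- install-deps.sh  # script runs apt-get install -y your-native-deps inside the chroot

# Job 2: run the Maven build inside the sandbox, bind-mounting the Jenkins workspace
sudo pbuilder execute --bindmounts "$WORKSPACE" -- run-maven.sh

# Job 3: clean up the build environment
sudo pbuilder clean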
Even a freestyle job has the "Invoke top-level Maven targets" build step. You could use that to get the "Maven goodies" while also having the ability to run other build steps.
There is an option "Use private Maven repository" which makes the build use a .m2/repository folder located inside the workspace. If you need to split the work into multiple jobs, you can use a custom/shared workspace between those jobs.
Even in a Maven-style job there is an option to use a private repository, so that one job does not affect another.
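If you want the same isolation explicitly on the command line, that checkbox is roughly equivalent to pointing Maven's local repository at a workspace-local folder, e.g.:

mvn -Dmaven.repo.local="$WORKSPACE/.repository" clean verify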
The problem can be solved by using distributed Jenkins builds. Slave agents can be configured to provision a clean environment (e.g. via VMs, Docker, ...) for each build and tear it down after the build is done. This way the Jenkins job can be of the Maven type, and any changes done by a pre-build step will not affect other builds.
Consider Docker. It lets you run processes in isolated environments, which is just what you want, and it integrates easily with Jenkins.
As a bonus, you can also use that Docker container to run local builds in the same environment they run in on Jenkins.
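A minimal sketch of the idea, assuming the official Maven image (which is Debian-based); your-native-deps is a placeholder for the real deb packages:

# Dockerfile: bake the native dependencies into a throwaway build image
FROM maven:3-jdk-8
RUN apt-get update && apt-get install -y your-native-deps

Build the image once, then run every Maven build inside a fresh container:

docker build -t myproject-build .
docker run --rm -v "$PWD":/build -w /build myproject-build mvn clean verify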
Is it possible to execute a lifecycle phase (e.g. integration-test) on an artefact that has been installed into the local repo?
My use case is as follows. I have a multi-module project with many modules dedicated to various types of integration testing (compliance tests, performance tests, etc.). I need to invoke these integration tests multiple times with different environment configurations. These configurations are expressed as Maven profiles and parameterised using properties. I want to avoid recompiling the project over and over again.
I would like to have one CI build job performing the mvn install, then separate CI jobs performing the integration tests, triggered once the build job has passed. The integration test jobs would simply invoke the integration-test lifecycle phase on the installed artefact, setting the profile and passing the parameters.
I have tried pointing mvn at the .pom file within the local repo, but this does not work. It fails because it cannot find classes within the artefact's own JAR file (as if it were not being put on the classpath), a problem that doesn't occur if I have my integration job check out the tree and invoke the pom.xml within the source tree.
mvn -f ~/.m2/repo/x/y/z/myproj-perftests-x.x.x-SNAPSHOT.pom integration-test -Pmyprofile -Dparam1=blah
No, it is not possible. Maven plugins (normally) only work with project sources.
If your only concern is recompiling the project again and again, consider splitting it into a core part and an integration tests part. Then, when running the integration tests, you only need to recompile the integration tests part.
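A sketch of that split, reusing names from the question (the group id and the myproj-core artifact are made up): the integration test module declares the already-installed core artifact as an ordinary dependency, so building it never recompiles the core.

<!-- myproj-perftests/pom.xml -->
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>myproj-perftests</artifactId>
  <version>x.x.x-SNAPSHOT</version>
  <dependencies>
    <!-- resolved from the local/remote repository, not rebuilt -->
    <dependency>
      <groupId>com.example</groupId>
      <artifactId>myproj-core</artifactId>
      <version>x.x.x-SNAPSHOT</version>
    </dependency>
  </dependencies>
</project>

The integration test CI job then checks out only this module and runs mvn integration-test -Pmyprofile -Dparam1=blah; Maven pulls myproj-core from the repository instead of recompiling it.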
I currently have a Jenkins instance installed on a development box. It builds fine and deploys to our development environment without any issues.
During the build process my project makes use of a single properties file containing details such as a database connection URL (details such as these will obviously vary depending on the environment I'm pointing to).
What I would like to know is what is the best way to configure my project so that when I want to release to Production the WAR file built by Jenkins contains the Production properties instead of Development?
(Note I am also using Maven in my project).
I know 3 options:
We used Maven profiles for that in the past, but they have the disadvantage that the Maven release plugin doesn't work with profiles, so we had to change the versions manually and were unable to deploy the artifacts to a remote repository like Nexus.
Another option is Maven's assembly plugin. That can be used together with the release plugin, as far as I know.
We decided to write a simple tool that changes the WAR files after the Maven build process. It runs in a separate Jenkins job. The idea is that building and configuring are two separate steps. The artifacts coming out of Maven are always in a default configuration, and if we need the configuration for the production release, we start a Jenkins job that does the configuration of the WAR files.
You can create different Maven profiles, like dev and prod, and in each profile's settings use/filter the corresponding resource files, e.g. .../(dev|test|prod)/project.properties. Then in Jenkins, when you build for a different platform, build with -Pdev or -Pprod to get the WAR for the right target.
You may want to check the Maven profile and Maven resource filtering documentation for detailed configuration.
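A minimal sketch of that setup; the directory layout and the env property name are examples, not a fixed convention:

<!-- pom.xml -->
<profiles>
  <profile>
    <id>dev</id>
    <properties>
      <env>dev</env>
    </properties>
  </profile>
  <profile>
    <id>prod</id>
    <properties>
      <env>prod</env>
    </properties>
  </profile>
</profiles>
<build>
  <resources>
    <!-- environment-specific files, picked by the active profile -->
    <resource>
      <directory>src/main/environments/${env}</directory>
    </resource>
    <resource>
      <directory>src/main/resources</directory>
    </resource>
  </resources>
</build>

With src/main/environments/dev/project.properties and .../prod/project.properties side by side, mvn clean package -Pprod builds the WAR with the production properties.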
Something not related: connect to the database via JNDI if possible.
I have a multi-module Maven project, and I created a new module that depends on 3 other modules. (I already have a web app Maven module that produces a .war file; now I need this.)
This module's output is a .jar, and it also has a few resources, which are:
Spring context XML file
properties file
Now I want to produce a production-ready folder so I can upload it to my server. I am hoping Maven can do this for me.
I need the following layout:
myjar.jar
/libs/ (the 3 other Maven modules that are dependencies)
/resources
Also, there are some generic dependencies that my parent pom.xml has, like slf4j/log4j, that I also need to package.
It would be cool if I could add a switch to mvn that would produce this, like:
mvn clean install production
I plan on running this on my server via the command line.
I think what you are looking for is a Maven Assembly:
https://maven.apache.org/plugins/maven-assembly-plugin/
You can use profiles to disable the generation of the assembly by default (this can speed up the development process).
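A sketch of a descriptor that produces the layout from the question; the file path and ids are examples, not a drop-in config:

<!-- src/main/assembly/production.xml -->
<assembly>
  <id>production</id>
  <formats>
    <format>dir</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <files>
    <!-- the module's own jar at the top level -->
    <file>
      <source>${project.build.directory}/${project.build.finalName}.jar</source>
      <outputDirectory>/</outputDirectory>
    </file>
  </files>
  <dependencySets>
    <!-- the 3 sibling modules plus slf4j/log4j etc. from the parent go into /libs -->
    <dependencySet>
      <outputDirectory>/libs</outputDirectory>
      <useProjectArtifact>false</useProjectArtifact>
    </dependencySet>
  </dependencySets>
  <fileSets>
    <fileSet>
      <directory>src/main/resources</directory>
      <outputDirectory>/resources</outputDirectory>
    </fileSet>
  </fileSets>
</assembly>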
#puce is right in that you may be best off using the Assembly Plugin. What you can't easily do is add another lifecycle 'production' to Maven. If you have time you could write a plugin to do this, but you might be better off using a profile called 'production' or 'prod-deploy' to enable the copying into place on the server.
mvn clean install -Pprod-deploy
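A sketch of such a profile, binding the assembly plugin (see #puce's answer) to the package phase so the production layout is only built on demand; the descriptor path is a placeholder:

<profile>
  <id>prod-deploy</id>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-assembly-plugin</artifactId>
        <executions>
          <execution>
            <id>production-layout</id>
            <phase>package</phase>
            <goals>
              <goal>single</goal>
            </goals>
            <configuration>
              <descriptors>
                <descriptor>src/main/assembly/production.xml</descriptor>
              </descriptors>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</profile>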
One thing to remember with Maven is that it is very good at building projects using its conventions, but it is pretty bad at scripting things that happen outside of the build lifecycle.
I have on several occasions used external scripting tools such as Ant/Python/Bash or Groovy to first run the build using mvn and then script the deployment in a more natural language.
The intention of Maven is building, not deployment to production. For that purpose I would recommend tools like Chef or Puppet. From a technical point of view it is of course possible to handle such things via Maven. It is also possible to build on a CI solution like Jenkins; furthermore, it is possible to run a script from Jenkins that does the deployment to production.
We want to use Hudson/Jenkins to build our project which is currently realized entirely in Eclipse. From what I can tell, there are various ways to go from A to B, or E to H, as it were: export as Ant script, export as Maven script, export as Runnable JAR while creating an Ant script for that, etc.
All of the above seem to have in common that between "This runs in Eclipse" and "Hudson produces something that runs" there are multiple independent steps: for example, you can change your project, commit to SVN and trigger a Hudson build, but unless you specifically remember to "Export as Ant Script" in between, it will fail.
Is there a "one in all" solution ? I'm not worried about the amount of clicks, but instead about the various steps in between that, to make matters worse, are only needed sometimes. In short: I am looking for something that goes from "I can click on the 'Run' button and it works" to "Hudson produces something that works" without every developer having to remember every optional step in between.
Ideas?
Edit: All of the answers so far seem to suffer from the same issue: it's all parallel development. You have your Eclipse Run Configuration, and you have your Maven/Ant/whatever build. If you change your run config, you then have to remember to change your Maven/Ant/whatever build, commit it, and then HOPE that all other developers notice the change during their daily SVN update, manually open the file, inspect the changes and duplicate those changes in their own run configs. That seems like it's just begging for bugs and mistakes. Isn't there anything that's properly integrated with the Eclipse Run Configurations?
Hudson can build Maven or Ant projects, so the first step is to get a reproducible build with either tool, which you only need to set up once. Then you need to take that pom.xml or build.xml file and actually commit it to Subversion. This is necessary since Hudson won't open Eclipse; it will use the command line to execute the build instead.
Then you can set up a new Hudson job that will watch Subversion for any changes. Your developers can use their normal workflow, where they use Eclipse to do builds and commit changes to source control when they're ready. Hudson will see the commit, pull down a fresh copy of the code base, do its own compile, and report back any problems.
Personally I prefer Maven2, since I know Hudson has solid integration with it and will do things like run your JUnit tests. Eclipse used to be painful with Maven, but now there's the m2eclipse plugin.
I'd try http://www.ant4eclipse.org/.
It allows you to build your Eclipse project from an Ant file. From the first paragraph here: http://www.ant4eclipse.org/node/6 it sounds very much like what you want. With ant4eclipse, Ant can access your Eclipse project, and then it should be able to build through Hudson.
The aim of the ant4eclipse project is to avoid (or at least: to reduce) the redundancy of Eclipse and Ant configurations. More precisely: it consists of Ant tasks that are able to read and work with some of Eclipse's configuration files.
Migrate to Maven; Hudson has great first-class integration with Maven.
Maven 3 + Archiva make a very potent build system. Of course there are other repository managers, but Archiva does just enough for what I need.
Once you get Maven, you really wonder how you did without it. A dedicated private repository manager helps greatly, which is why Archiva is important to the mix.
I was wondering if there is a standard way (i.e. a plugin) to apply a set of patches during a Maven build. Patching the code base in a dedicated step before building gets tedious as soon as you have different builds or generated sources.
To give an example, this script should deploy 3 different versions from a fresh SVN checkout:
#!/bin/bash
# checkout project
svn checkout http://example-project.googlecode.com/svn/tag/v1_0 example-project-read-only
cd example-project-read-only
# build example-project-1.0
mvn deploy
# build example-project-1.0-a3
mvn -Dmaven.patch.dir=/path/to/patchesA -Dmaven.patch.buildSuffix=a3 clean patch:patch deploy
# build example-project-1.0-b0
mvn -Dmaven.patch.dir=/path/to/patchesB -Dmaven.patch.buildSuffix=b0 clean patch:patch deploy
Currently I'm doing similar things with another build script that I'd like to get rid of. Therefore I'm considering writing such a plugin if it's not available yet. (Maybe with dedicated patch artifacts for easy distribution as an added bonus?)
The Maven Patch Plugin might help.
The Patch Plugin has a single goal that can apply either a single declared patch or a directory of patches. Application of an entire patch directory can be configured with various patch-inclusion, -exclusion, and -ordering options.
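A configuration sketch along those lines; the directory and strip values are examples, so check the plugin documentation for the full option list:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-patch-plugin</artifactId>
  <configuration>
    <!-- apply every patch found in this directory -->
    <patchDirectory>src/main/patches</patchDirectory>
    <strip>1</strip>
  </configuration>
  <executions>
    <execution>
      <phase>process-sources</phase>
      <goals>
        <goal>apply</goal>
      </goals>
    </execution>
  </executions>
</plugin>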
I haven't heard of any such plugin. However, I imagine that you could do something with profiles that apply patches and conditionalize the build directory. Sounds interesting.