Jenkins CI Job Workspace custom changes - java

I have set up a CI environment for deploying to dev using Jenkins/Maven/SVN. For deployment to a JBoss EAP server, I am using the JBoss AS Maven plugin. Since the project teams do not have the plugin and the jboss-as:deploy configuration in their respective poms, once I create the job in Jenkins and a local workspace gets created for the project, I manually update the pom.xml with the plugin and the server configuration. Is there any way to automate this process? I don't have the option of asking all project teams to commit the plugin to their pom files. The issues I am facing:
Project teams commit changes at the same lines where I added the JBoss AS plugin in the pom, which causes an SVN merge conflict in my Jenkins workspace and the build fails because the pom is corrupted.
For every job, I have to manually update the POM in the Jenkins workspace.
If I have to deploy a project to multiple app servers, I have to create multiple jobs.
Can anyone suggest a solution?

I would create another distribution component and keep it separate, e.g. as a dedicated deployment module. I would put the change you mentioned into its pom.xml and simply maintain two projects: the developers' one, which stays untouched, and the one used for deployment.
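A minimal sketch of what such a deployer pom could look like, kept entirely outside the teams' repositories. The team artifact (com.example:team-app), the server details, and the plugin version are placeholders, and the exact goal and parameter names should be checked against your version of the org.jboss.as.plugins:jboss-as-maven-plugin:

<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example.ci</groupId>
  <artifactId>team-app-deployer</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>pom</packaging>

  <dependencies>
    <!-- placeholder: the team's war, produced by their own build/job -->
    <dependency>
      <groupId>com.example</groupId>
      <artifactId>team-app</artifactId>
      <version>1.0-SNAPSHOT</version>
      <type>war</type>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <plugin>
        <groupId>org.jboss.as.plugins</groupId>
        <artifactId>jboss-as-maven-plugin</artifactId>
        <version>7.9.Final</version>
        <configuration>
          <!-- placeholder server details; 9999 is the usual management port -->
          <hostname>dev-jboss.example.com</hostname>
          <port>9999</port>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>

The Jenkins job would then check out only this deployer project and invoke the plugin's goal for deploying a dependency (jboss-as:deploy-artifact, if your plugin version provides it, which takes the dependency's groupId and artifactId in its configuration). The teams' poms are never touched, so no merge conflicts can occur, and one deployer pom per target server would also cover the multiple-app-server case.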

How to set up a Jenkins build with two separate repositories (poms)?

My project structure looks as follows:
Core
  pom.xml
Project A
  pom.xml
Project B
  pom.xml
Core, Project A, and Project B are separate Git repositories. I need this structure because all of my projects should use the same core settings (and when I have to change something in the core, all projects are updated without any trouble).
I have added the following dependency to each project's (Project A + Project B) pom:
<!-- Core -->
<dependency>
  <groupId>my-group</groupId>
  <artifactId>my-core</artifactId>
  <version>${my-core.version}</version>
</dependency>
In Eclipse on my local machine it works like a charm. The project finds the core and I am able to run all of my Selenium tests.
Now I want to set up Jenkins so that one job contains Project A + Core and another job contains Project B + Core, but I have no idea how to do that. I have already searched for solutions, but I don't want to set up a Nexus, for example. So is there an easy way to include my core + project in Jenkins?
Looking forward to your answers!
You can use the Multiple SCMs Plugin to retrieve all repositories into the job's workspace and then:
Build and install the core project using Maven's install phase, so that my-core becomes available in the local Maven repository on the Jenkins machine (see the sketch after this list).
Build Project A
Build Project B
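As a small illustration (the version value below is just a placeholder), the property used in the dependency above then only has to point at whatever version the Core job installs:

<properties>
  <!-- must resolve to the version that the Core job's "mvn clean install"
       puts into the local repository on the Jenkins machine -->
  <my-core.version>1.0.0-SNAPSHOT</my-core.version>
</properties>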
In the SCM section of the Jenkins job configuration screen, choose 'Multiple SCMs'. You'll then see a drop-down list of the available SCM plugins, which can be configured similarly to the way build steps are configured for a freestyle job. When chosen, each SCM plugin will present its normal set of configuration options.
Make sure each SCM checkout is done in a separate sub-directory of the workspace to avoid conflicts between the SCM providers. If using Mercurial, this option is hidden in the 'Advanced' section, so don't forget to configure it.
If changing the SCM provider for an existing job, I recommend wiping out the workspace.

Jenkins building for different environments

I currently have a Jenkins instance installed on a Development box. This builds fine and deploys to our development environment without any issues.
During the build process my project makes use of a single properties file containing details such as a database connection URL (details such as these will obviously vary depending on the environment I'm pointing to).
What I would like to know is: what is the best way to configure my project so that when I want to release to production, the WAR file built by Jenkins contains the production properties instead of the development ones?
(Note I am also using Maven in my project).
I know 3 options:
We have used Maven profiles for that in the past, but they have the disadvantage that the Maven release plugin doesn't work well with profiles, so we had to change the versions manually and were unable to deploy the artifacts to a remote repository like Nexus.
Another option is the maven-assembly-plugin. As far as I know, it can be used together with the release plugin (a sketch follows after these options).
We decided to write a simple tool that changes the WAR files after the Maven build. It runs in a separate Jenkins job. The idea is that building and configuring are two separate steps: the artifacts coming out of Maven are always in a default configuration, and if we need the configuration for the production release, we start a Jenkins job that reconfigures the WAR files.
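A rough sketch of how the assembly plugin could be bound to the build, assuming a hypothetical descriptor at src/assembly/prod.xml that pulls in the production property files:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>2.4</version>
  <configuration>
    <descriptors>
      <!-- hypothetical descriptor that packs the production configuration -->
      <descriptor>src/assembly/prod.xml</descriptor>
    </descriptors>
  </configuration>
  <executions>
    <execution>
      <id>prod-package</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>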
You can create different Maven profiles, like dev and prod, and in each profile select or filter the corresponding resource files, e.g. .../(dev|test|prod)/project.properties. Then in Jenkins, when you build for a given environment, build with -Pdev or -Pprod to get the WAR for the right target (a sketch follows below).
You may want to look up Maven profiles and Maven resource filtering for the detailed configuration.
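A minimal sketch of that setup, assuming the environment-specific files live under hypothetical src/main/environment/(dev|prod)/ directories:

<profiles>
  <profile>
    <id>dev</id>
    <activation>
      <activeByDefault>true</activeByDefault>
    </activation>
    <properties>
      <env>dev</env>
    </properties>
  </profile>
  <profile>
    <id>prod</id>
    <properties>
      <env>prod</env>
    </properties>
  </profile>
</profiles>

<build>
  <resources>
    <resource>
      <directory>src/main/resources</directory>
    </resource>
    <!-- picks up the dev or prod copy of project.properties -->
    <resource>
      <directory>src/main/environment/${env}</directory>
    </resource>
  </resources>
</build>

Running mvn clean package -Pprod would then produce a WAR containing the prod copy of project.properties.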
Something unrelated: connect to the database via JNDI if possible.

Resolve/Download dependencies after deployment to remote server

I have a big, oversized WAR file due to lots of external dependencies, and I also have internet connection speed issues. Because of this, I don't want to keep the dependency JARs in my WAR, so that I can reduce the WAR size and upload my updated WARs from my dev machine to the remote server faster.
I would instead like the Maven project to download the dependencies on the remote Tomcat server itself once the WAR has been uploaded there and starts running. How do I configure Maven to do that?
There is a pretty simple solution: Build the project on the server.
An easy way to do this is to put all the sources into a version control system like Mercurial or Git.
In addition to giving you a history and an automated backup, DVCS have insanely efficient algorithms to update remote copies (they just transfer the changes, so if you change a single line, only one line is sent over the wire).
Building on your server also means that you get very fast downloads of dependencies on the server (which probably has very good download rates). And local deployment will be very, very fast.
Last but not least: When you use version control, you will be able to go back to the last stable version quickly when something goes wrong.
As Aarom says, you should build the project on the server directly.
There are two requirements:
You need to have a command line access on the remote server.
Maven must be installed on the remote server.
Then you can upload the sources of your project to the remote server (without dependencies).
Go to the root directory of your project and run your build command (mvn package or whatever custom build command you use).
That's it: you have the .war on the remote server with all the dependencies bundled; you can then remove the source files.
#user01
Install all desired 3rd-party JARs in Tomcat's lib folder.
Set the scope of those dependencies to "provided" in your Maven pom.xml (see the sketch after this list).
Install Maven on your remote server.
Install a CI server such as Jenkins, Continuum, Bamboo, Hudson, CruiseControl, etc. I'd suggest Jenkins.
Hopefully, you are using revision control software such as SVN, Git, Mercurial, Bazaar, or CVS. If not, then I'd suggest setting up Git or SVN for your source code repository.
Configure the scm tag in your pom.xml to point to your project's location within your source code repository.
Configure your CI server to get your pom.xml from your source code repository. Your CI server will read the scm tag and the URLs you've configured within it, check your project out, and then build your project.
You can either have Jenkins deploy the built WAR artifact to Tomcat via the Jenkins Deploy Plugin, or you can use a Maven plugin such as the tomcat7-maven-plugin or Cargo.
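As a rough sketch of the "provided" scope step, using commons-lang3 purely as a stand-in for whatever JAR you copy into Tomcat's lib folder:

<dependency>
  <!-- the jar sits in Tomcat's lib folder, so it is needed for compilation
       but must not be packaged into the war -->
  <groupId>org.apache.commons</groupId>
  <artifactId>commons-lang3</artifactId>
  <version>3.12.0</version>
  <scope>provided</scope>
</dependency>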

Java EE deployment in Intellij IDEA

I have a fairly complex Java EE project which builds fine from Maven. After importing it into IDEA, I set up a working deployment of the frontend WAR and the backend EAR to a local WebLogic 12c server. The project also has several 'common' artifacts packaged as JARs and used by both the frontend and backend artifacts. For deployment, I used the exploded artifacts to save some time on packaging/unpacking, and everything works fine up to that point.
During development, I edit some Java sources and try to redeploy the updated artifacts to the running WebLogic. I press Shift+F10, choose Redeploy artifacts, I see IDEA building the project, the project redeploys on the server, and more often than not I do not see any of the recent changes. Even if I choose Rebuild project explicitly and then try to redeploy the artifacts afterwards, no changes can be seen on the server. The only reliable way to make my changes appear in the deployed artifacts is to invoke Maven from the command line calling the package goal, and then redeploy from IDEA. (No JRebel is installed or in use, either in the IDE or on WebLogic, and I'd like it to stay that way.)
Is that expected behaviour from IDEA? Could this be something specific to our project or something global? Should IDEA be able to discover which modules need to be rebuilt and repackaged and then redeploy the EAR/WAR artifacts properly to the server? Does it need any help from my side to achieve that?
What do your run/debug configurations look like? Check the following, if they are not already set up this way.
In the quick menu, Edit Configurations > Run/Debug Configurations window:
Server tab:
On 'Update' action: Restart server
Before launch: add 'Run Maven clean' and 'Run Maven install'
Deployment tab:
Add your EARs here.
I'm not sure this specifically answers your question, but what I can suggest is to try configuring the weblogic-maven-plugin; then you can execute the deployment from the command line or from IDEA with Maven support.
http://docs.oracle.com/cd/E21764_01/web.1111/e13702/maven_deployer.htm
http://www.youtube.com/watch?v=hagaMr6UL6U
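For reference, a rough sketch of what the plugin configuration could look like. The coordinates, version, and parameter names follow the 11g/12c documentation linked above and may differ for your WebLogic release, so treat all values below as assumptions:

<plugin>
  <groupId>com.oracle.weblogic</groupId>
  <artifactId>weblogic-maven-plugin</artifactId>
  <version>12.1.1.0</version>
  <configuration>
    <!-- placeholder admin server URL and credentials -->
    <adminurl>t3://localhost:7001</adminurl>
    <user>weblogic</user>
    <password>welcome1</password>
    <upload>true</upload>
    <action>deploy</action>
    <remote>false</remote>
    <verbose>true</verbose>
    <source>${project.build.directory}/${project.build.finalName}.ear</source>
    <name>${project.build.finalName}</name>
  </configuration>
</plugin>

With that in place, the plugin's deploy goal can be invoked from the command line or from IDEA's Maven panel.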
Even if your final goal is to have the whole build process done by IntelliJ's build and deployment options, I would first try the following:
If your project is set up properly in Maven, you should be able to load your Maven pom.xml within IntelliJ. All the Maven build commands and deployment settings you were running through the mvn command line will show up in IntelliJ's Maven panel in a nice tree structure.
If this works out, then clearly one of the libraries built through your IntelliJ build is not being deployed to the right location. You need to narrow down which of the JARs, the EAR, or the WAR should be affected by a single change you make, and then check whether the file's timestamp is updated in the location it is deployed to.
To wrap up, my humble suggestion is to use either the Maven or Gradle IntelliJ panels for your Java EE projects, as you then achieve your stated goal of having the project built completely through IntelliJ IDEA. Also, whatever plugin you add to your Maven build shows up in IDEA's Maven/Gradle panel. It is a fairly straightforward approach, and you get powerful and flexible build and deployment tooling within IntelliJ, as you wanted.

Using Maven with Sourceforge.net

For a new project that uses Maven, I would like to add a distributionManagement configuration to the pom.xml that connects the project with the SourceForge.net file upload system.
I have found this information (from 2007); is it still valid, or do you know of updated resources?
http://docs.codehaus.org/display/MAVENUSER/MavenAndSourceforge
Related question: How can I deploy artifacts from a Maven build to the SourceForge File Release System?
This looks correct. However, note that it only describes deploying the site artifacts, not the project artifacts (JAR and POM). And while it's possible that you could use Maven to deploy your artifacts, I'm not sure that you'd want to: the Maven directory structure is different from the SourceForge structure (of one directory per release).
If you're looking to deploy your project releases to Maven Central, read this: http://maven.apache.org/guides/mini/guide-central-repository-upload.html
The process has changed in the last year or so. At one time you could request that your project be added to a nightly rsync job, but apparently now you have to deploy directly to a recognized repository. Given the number of times that rsync job would fail, it's no wonder they decided to change the process ...
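If you still want to try deploying through Maven, a minimal sketch of the distributionManagement side, assuming SourceForge's SFTP host web.sourceforge.net and a placeholder project path (verify both against the current SourceForge documentation):

<distributionManagement>
  <repository>
    <id>sourceforge-releases</id>
    <!-- placeholder path; the File Release System expects one directory per release -->
    <url>sftp://web.sourceforge.net/home/frs/project/myproject/releases</url>
  </repository>
</distributionManagement>

<build>
  <extensions>
    <!-- wagon provider for the sftp:// protocol -->
    <extension>
      <groupId>org.apache.maven.wagon</groupId>
      <artifactId>wagon-ssh</artifactId>
      <version>2.10</version>
    </extension>
  </extensions>
</build>

The credentials for the sourceforge-releases server id would then go into a matching <server> entry in settings.xml.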
