Jenkins without Automated Tests - java

I know that Jenkins is focused on continuous building/testing and on monitoring batch jobs for a project. I have a legacy project with the following conditions:
It has a development team.
It uses SVN for source code management.
Some cron jobs handle certain operations.
Compiling and building don't take much time; there are no very complex dependencies.
It doesn't have any automated tests/JUnit classes and never will.
I'd like to ask experienced Jenkins users: is it still worth using Jenkins for central building and management of the project?

Even for simple projects a continuous integration environment is useful. For one, it helps developers verify they have committed all their changes and have not forgotten anything that would break the build.
Along those lines, it's also useful to have a single location from which to build and grab artifacts to deploy. Using developer systems for this always has the potential to introduce errors due to environmental differences between developers, such as JDK versions.

Related

Trying to understand what Gradle and Maven actually do

What exactly do Gradle and Maven "do" that Eclipse or STS doesn't? From what I've read, they build, run, etc., all of which can be done in Eclipse easily.
If I have an existing project that I've created, built, and currently run in Eclipse via Tomcat, what would I use Gradle for?
There's not a lot of benefit to using Maven or Gradle on a small project that you never share with anyone; an IDE can do the build just as well. But as the number of developers increases and the complexity of the build increases, it becomes very useful to separate the build instructions from the IDE. Let's drill into these a little bit.
With the increase in developers, you don't want everyone to have to come by and use your IDE to get a build done. That would be really annoying! So they're on their own machines, but then they tend to have different setups (how dare they have different user account names!) and probably have their IDE installations set up a little differently too. So we need some kind of build instructions that people can use to get things going, and it helps if everyone can use the same build instructions repeatably so that you don't get too many instances of “but it works on my machine!”. It's also very helpful if those instructions are simple enough to use that a new programmer to the team can get up to speed rapidly.
But the other thing that often happens as projects grow in scope is that their builds become more complicated. They very often gain additional dependencies (they didn't start out needing a high-performance date parser and MIME-type identifier, but they've become required since then, and you don't want to have to write all those from scratch), and that means you've got to make sure that when the build is done, the right version of those dependencies is used. But that's not the only way that complexity increases. It's also very often the case that you find you're using more automatically-generated code. You might find yourself working with XML schemas or WSDL a lot, or maybe you're using Hibernate, or Spring, or … well, there's lots of ways in which things can get complicated, OK? Getting the various steps of the build right, reliably, in these sorts of scenarios can be a bit tricky, but encoding them as instructions to something like Maven makes life a lot easier once you've taken the jump in the first place. (It gets even more important when you start trying to deal with projects which need many different sub-programs that work in concert; some of those are plain hard to build even with Maven or Gradle or any other tool.)
And then there's the possibility of offloading work to a build server, running tests automatically, managing dependencies cleanly, etc. IDEs don't handle these all that well by themselves; where they do a reasonable job of it, it's usually because they're using a tool like Maven under the covers to do the heavy lifting.
tl;dr
You don't have to make your code work with a build system, but it helps in many ways if you do.
Maven and Gradle can do many things that Eclipse doesn't. However, the most important thing they do is decouple the building and testing processes from the IDE you choose (e.g. Eclipse). When you work in a large environment, with many programmers, you usually cannot control the IDE they use. So it's better to use a tool like Maven or Gradle to standardize these tasks. The same goes for the code examples in a book: instead of the authors having to provide instructions for configuring every IDE to run them, they provide the Maven or Gradle files, so readers can build and test them with whatever IDE they're using.
Another very important feature that Maven and Gradle give you is that dependencies are managed without the need to keep the binaries you depend on under version control. Instead of including the libraries you depend on as part of the project, you declare the dependencies in a text file (which is under version control) and fetch them from a repository.
However, you may only see the real advantages of using tools like Maven or Gradle (and even Jenkins or Hudson) when you think of large-scale projects, developed over many months by teams of many developers.
Gradle and Maven are build tools. Maven came first and is a bit older; Gradle is newer and has redefined how projects are built and maintained. In my opinion it's also much easier to use, more readable and easier to maintain. I prefer Gradle ;)
You use Eclipse or STS (or any other IDE) for development. And once you finish this process, you need to provide a configured artifact (WAR, EAR, whatever...) to production and deploy it there. These artifacts have a well-defined format, and the application won't be run from Eclipse or STS in the production environment. It's tiresome and error-prone to prepare such artifacts by hand.
Gradle or Maven can take the responsibility of building and preparing these artifacts (in fact such tools can do much more) off your shoulders; they automate this process.

Which Continuous Integration System is the easiest to reestablish on a new machine from backup?

I have had the pleasure of using Hudson (now Jenkins) for a few years now, and I like the general attitude of the system - it's really a nice program - but its focus is elsewhere than the one thing I need to get working well.
Namely: if something happens to our build server and we need to rebuild it from backup tapes (of the whole work area), this is not easily done with Jenkins. As we do not rely on any Jenkins-specific features as such, I was wondering whether other CI systems have a better method in place.
Basically we have multiple Eclipse projects next to each other when extracted from git. Each build entry points to an Ant script in one of the projects, which is then built. We need full flexibility with the version of Ant and Java used. This can perfectly well be described in a "launch configuration" file somewhere in the project, so all that is needed is to point at (or perhaps even auto-discover) said launch file.
It would be better if the history could be restored too, etc., but I'd really just like to be able to get the jobs back up and running.
Any recommendations?
(Note 2013-02-18: We migrated our build process to Maven. This simplified the Jenkins configuration greatly and made this question less important. It is still nice to know whether you can bootstrap a CI configuration easily from backup tapes or from scratch, based on information stored in the various pom files.)
It's fairly easy to backup Jenkins.
Back up all your config files. I have a Jenkins job that runs once an hour, scans for any config.xml changes or additions, and adds/updates them in our Perforce server. In addition it gives me the ability to roll back to an old config, if I mess up a config badly enough, just by doing a sync in Perforce. (A rough sketch of this idea follows after these steps.)
Back up your plugins. I just back up the .hpi files, again into Perforce. That way I don't have to remember what plugins I have on my server, should I need to rebuild it.
Back up your workspace dir. This one I don't do, as I don't care about my CI builds, and my nightly builds are stored on another server. I found the Jenkins copy to be very, very slow, so I copy all artifacts/sources inside my build scripts instead, to an archive server that is already backed up nightly by IT (it's a SAN). The only problem with this is that I can't point a Jenkins job's artifacts to a separate location and have Jenkins link them as artifacts. I haven't found a plugin to do this, and the plugin I've been writing to do it is coming along slowly. But it's not too bad of a tradeoff.
These three allow me to restore Jenkins (without log files, which I don't really need as long as I have the output; tee.exe is your friend) in pretty short order should my server ever die.
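The Perforce integration above is site-specific, but the first step is easy to approximate. Below is a rough, hypothetical Java sketch that walks JENKINS_HOME, finds every config.xml, and copies the changed ones into a backup directory (which you could then check into Perforce/SVN/git). The paths and the environment-variable fallback are assumptions made up for the example, not part of Jenkins itself.

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;
    import java.util.stream.Stream;

    // Hypothetical sketch: copy every config.xml under JENKINS_HOME into a backup
    // directory, preserving the relative layout, but only when the source is newer.
    // A real setup would then check the backup directory into Perforce/SVN/git.
    public class JenkinsConfigBackup {

        public static void main(String[] args) throws IOException {
            Path jenkinsHome = Paths.get(
                    System.getenv().getOrDefault("JENKINS_HOME", "/var/lib/jenkins"));
            Path backupDir = Paths.get(args.length > 0 ? args[0] : "/backup/jenkins-config");

            try (Stream<Path> files = Files.walk(jenkinsHome)) {
                files.filter(p -> p.getFileName().toString().equals("config.xml"))
                     .forEach(p -> copyIfNewer(jenkinsHome, backupDir, p));
            }
        }

        private static void copyIfNewer(Path root, Path backupDir, Path source) {
            try {
                Path target = backupDir.resolve(root.relativize(source));
                if (Files.notExists(target)
                        || Files.getLastModifiedTime(source)
                                .compareTo(Files.getLastModifiedTime(target)) > 0) {
                    Files.createDirectories(target.getParent());
                    Files.copy(source, target,
                            StandardCopyOption.REPLACE_EXISTING,
                            StandardCopyOption.COPY_ATTRIBUTES);
                    System.out.println("Backed up " + source);
                }
            } catch (IOException e) {
                // Keep going; one unreadable file should not abort the whole backup run.
                System.err.println("Skipping " + source + ": " + e.getMessage());
            }
        }
    }

Run it from cron (or as another Jenkins job) and point your version control tool at the backup directory; the sketch deliberately leaves the check-in step out.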
This probably is not a direct answer to your question, but...
We've been using Jenkins (formerly Hudson) for just over a year and a half now. We have not used any Jenkins-specific feature that I would not get on other CI systems. However, after initially going through configurations of buildbot and CruiseControl (sorry, please don't hate me), I found that the plethora of available plugins and the general ease of use of Jenkins made the choice a no-brainer. We now have 86 projects building, and have two servers - one for development, and one for QA and Release builds.
As for backups, we actually back up the entire Jenkins home folder once a week (for both servers). Once every month, we back up all our virtual machines (entire virtual hard disk).
Note that both our servers and all slaves are VMs.
This way, if something goes wrong, I can restore at least the job configurations easily - the oldest it will be is one week.
The repository has all the latest code anyway, and the virtual machines can be brought back to a version which is (at most) one month old.
Also, Jenkins has a backup plugin - https://wiki.jenkins-ci.org/display/JENKINS/Backup+Plugin - which makes it quite easy to back up at least the configuration.
Considering that backup is not the primary function of a CI system, chances are all of them are going to have their own quirks for backup and restore.
Just my 2 paisa/cents/pence.

What is the best way to handle Java web application versioning?

I have a standard Java application that handles both REST and UI calls. What is the best way for me to create and manage an application version (major.minor.release.build)? I'm using Subversion, Maven, Bamboo (continuous build) and Spring in the stack. I would like the version to be tied together with SVN, Bamboo and Maven. And I would like to be able to log the version on start-up, likely using some Spring bean.
There must be a framework/pattern out there to help with this. I'd rather not roll my own.
Thank you!
Why not use Semantic Versioning? It is what most people expect nowadays, it is pretty well defined and it is out there. Good enough for me.
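If you also want to represent and compare such versions inside the application (the question mentions a major.minor.release.build scheme), a tiny value class is enough. This is only an illustrative sketch of numeric, segment-by-segment comparison; pre-release qualifiers such as -SNAPSHOT are deliberately ignored, and the class name is made up:

    import java.util.Arrays;

    // Sketch only: a dotted numeric version such as "1.4.2" or "1.4.2.117",
    // compared segment by segment, the way Semantic Versioning orders the
    // numeric parts. Qualifiers such as "-SNAPSHOT" are not handled here.
    public final class AppVersion implements Comparable<AppVersion> {

        private final String raw;
        private final int[] parts;

        public AppVersion(String version) {
            this.raw = version;
            this.parts = Arrays.stream(version.split("\\."))
                               .mapToInt(Integer::parseInt)
                               .toArray();
        }

        @Override
        public int compareTo(AppVersion other) {
            int length = Math.max(parts.length, other.parts.length);
            for (int i = 0; i < length; i++) {
                int a = i < parts.length ? parts[i] : 0;            // missing segment counts as 0
                int b = i < other.parts.length ? other.parts[i] : 0;
                if (a != b) {
                    return Integer.compare(a, b);
                }
            }
            return 0;
        }

        @Override
        public String toString() {
            return raw;
        }

        public static void main(String[] args) {
            // Segments are compared numerically, not as strings, so 1.10.0 > 1.9.3.
            System.out.println(new AppVersion("1.10.0").compareTo(new AppVersion("1.9.3")) > 0); // true
        }
    }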
Maven has a release plugin. This is a bear to set up at first, but once it is working it works well. It does all the nitty-gritty of making sure everything is cleanly checked in and tagged properly, and does the magic with the version numbers. It is not a task to look forward to, but at least now it is properly done. It pays to set up a Maven repository. We use Nexus and can recommend it, but I have heard good things about Artifactory too.
During testing we do not rely on the Maven version too much but on the build number, which we put in a discrete place on the web pages and similar artifacts so we can quickly determine exactly which build we're talking about. We use Hudson, which provides the build number in an environment variable, but Bamboo surely provides that too. The filtered-copy functionality makes that pretty straightforward.
Hudson tags the VCS (we use git, but that does not matter) with the build number, and the Maven release plugin tags the releases.
You can include the SCM revision number in your artifact using the Maven buildnumber plugin (http://mojo.codehaus.org/buildnumber-maven-plugin/), e.g. in a filtered resource such as a properties file.
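To then log the version on start-up, as the question asks, the application only has to read that filtered file. Here is a minimal sketch, assuming the build writes version and buildNumber keys into a version.properties on the classpath; the file name, keys and bean name are made-up examples, not anything the plugin mandates:

    import java.io.IOException;
    import java.io.InputStream;
    import java.util.Properties;

    import javax.annotation.PostConstruct;

    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;
    import org.springframework.stereotype.Component;

    // Sketch only: assumes the Maven build (resource filtering plus the
    // buildnumber plugin) has produced a version.properties on the classpath
    // containing, for example:
    //   version=${project.version}
    //   buildNumber=${buildNumber}
    @Component
    public class BuildInfoLogger {

        private static final Logger log = LoggerFactory.getLogger(BuildInfoLogger.class);

        @PostConstruct
        public void logBuildInfo() throws IOException {
            Properties props = new Properties();
            try (InputStream in = getClass().getResourceAsStream("/version.properties")) {
                if (in == null) {
                    log.warn("version.properties not found on the classpath");
                    return;
                }
                props.load(in);
            }
            log.info("Starting application version {} (build {})",
                     props.getProperty("version", "unknown"),
                     props.getProperty("buildNumber", "unknown"));
        }
    }

Because the bean only reads a properties file, the same approach works whether the number comes from Bamboo, Hudson or the buildnumber plugin; only the build-side filtering changes.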
If you are using Artifactory as your binary repo then it can also tag your binary artifacts with a build number and have full traceability from your artifact to the CI server build that created it. Currently this is supported with Hudson, TeamCity and Bamboo.

Java EE: Need better deployment system

We are currently using JDeveloper to build our production EARs. One problem with this is that if the developer doesn't add new files to a VCS, then that developer is the only one capable of making EARs, and therefore he can use unversioned files as well.
What would be a good system that separates this, so that EAR files can be correctly produced without depending on the local developer's workspace? (This would also ensure that developers add their files to a VCS before being allowed to make a deployment/check-in.)
One problem with this is that if the developer doesn't add new files to a VCS, then that developer is the only one capable of making EARS,
If the developer doesn't use the VCS, this is not your only problem:
You cannot reproduce things in another environment, you're tied to the developer machine (but you're aware of that). What if he is sick?
Not versioning files means you don't have any history of modifications and that you don't know what you put into production ("Hmm, what is in this version? Wait, let's open the EAR to check that.").
And last but not least, in case of hardware failure (e.g. a hard drive crash), you can say good bye to everything that is not in the VCS.
So, the first thing to fix is to ALWAYS version files, even if there is only one developer, as working alone doesn't save you from the problems mentioned above. These points should be kept in mind (the developer needs to be aware of them to understand their importance).
To make sure this happens, you should actually not rely on the developer machine to build the EAR but rather rely on an "external" process that would be the reference. You want to avoid this syndrome:
(Image: the "Works on my machine" certification badge - http://img716.imageshack.us/img716/9901/worksonmymachinestarbur.png)
To put such a process in place, you need to automate the build (i.e. your whole build can be run in one command) and to break the dependency with your IDE. In other words, do not use the IDE to build your EAR but rather use a tool like Maven or Ant (which are IDE agnostic). That would be the second thing to fix.
Once you have automated your build process, you can go one step further and run it continuously: this is called Continuous Integration (CI) and allows you to get frequent, ideally immediate, feedback about changes (to avoid big-bang integration problems). That would be the third thing to fix.
Given your current toolset (which is far from ideal; there is not much community support for the tools you are using), my recommendation would be to use Ant (or Maven if you have some knowledge of it) for the build and Hudson for continuous integration (because it's extremely easy to install and use, and it has a Dimensions plugin).
Here's my recommendation:
Get a better IDE - IntelliJ, Eclipse, or NetBeans. Nobody uses JDeveloper.
Have developers check into a central version control system like Subversion or Git.
Set up a continuous integration facility using Ant and either CruiseControl or Hudson to automate your builds.
What we do is use CruiseControl. It does two things: it lets us do continuous integration builds, so that we have nightly builds as well as lightweight builds that get built every time a change is checked in.
We also use it to address your issue more specifically. When we want to ship, we use CruiseControl to kick off a build that is tagged with the proper production build version. It will grab the code from our version control system (we use SVN) and build from that, so it is not dependent on developers' local environments.
One thing you might also want to consider is creating a production branch to build out of. So, production EARs for a particular release are always built from that branch. This way you even have a bit more control over what goes into the build.
Instead of doing builds from developer workspaces, set up Maven and have something like Hudson run your Maven build. The artifacts of this build (your EAR) get deployed.

Deployment solution automatic compile

At the moment I use SVN to manage Java source code. Is there a solution out there whereby I can check in code and have the new code automatically compiled into a JAR file? Somehow the check-in would need to trigger the compile process.
You need a continuous integration tool. Hudson would be a good choice (I've been using it for the past year and it works really well).
You could do this with a CI server; the server can sit on another machine (the build server), get the latest commit, build it, and compile it to a JAR. (Of course, this isn't really ideal on large projects, due to lots of changes being submitted and so on, but for personal projects it should work to a reasonable degree.)
There are a number of continuous integration programs which do this very thing. Two I've used are Bamboo (commercial) and CruiseControl (open source). Bamboo is about 1000x easier to set up than CruiseControl, and really pays for itself after you've created a hundred build plans or so. With CruiseControl you will have killed yourself long before you get to your hundredth build plan.
