I have an automation job (Java, Selenium and Cucumber) on Jenkins. I'd like to know if it's possible to store my feature files somewhere on Jenkins and configure my Java project to read the features from there. That way, anyone who needs to edit a feature file can simply access it on Jenkins and edit it; otherwise they would have to access the Java project, edit the feature, and commit and push it to the Git repository (too complicated).
This should work if you change the features option of @CucumberOptions in your Cucumber runner class. Just make it point to a different folder containing the feature files.
Yes, it's possible. Jenkins checks the code out from the repo onto a slave node, on which the test is executed. The feature files need to be located on this node. The build tool configuration (Maven etc.) should include the feature file locations in the test phase or in the test plugins used, and @CucumberOptions would then point the runner at those feature files.
Note that @CucumberOptions is checked into the repo, so you want to keep it as generic as possible and control the feature files on the Jenkins node.
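For example, a generic runner class could look like the sketch below. The folder /var/lib/jenkins/features and the glue package are placeholders; point them at wherever the feature files live on your node. The imports shown are for the older cucumber.api packages, which match the era of this question; newer Cucumber-JVM versions use io.cucumber.junit instead.

```java
import org.junit.runner.RunWith;
import cucumber.api.junit.Cucumber;
import cucumber.api.CucumberOptions;

// Sketch: the features path points at a folder on the Jenkins node rather
// than into the repository. Both the path and the glue package below are
// placeholders, not values from the question.
@RunWith(Cucumber.class)
@CucumberOptions(
        features = "/var/lib/jenkins/features",   // external folder on the node
        glue = "com.example.stepdefinitions"      // hypothetical step-definition package
)
public class ExternalFeaturesRunner {
}
```

Anyone with access to that folder on the node can then edit the .feature files without touching the Java project.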
I am setting up automation using Java, Cucumber and JUnit. However, due to firewall and proxy restrictions at my company, I am not able to create a Maven project from the archetype templates. So I initially created a plain Java project with a source folder src/test/java and placed the feature file, step definitions and test runner class in that source folder. However, when running the test runner class, I get a "test class not found error for selected project".
I have tried all sorts of solutions, such as moving the JRE entry to the bottom of the build path, giving the full path to the feature file, and adding the output folder in the run configuration, but none of it seems to work.
So, finally, I would like to know whether we can run Cucumber with only Java and JUnit, in the absence of Maven.
Kindly help me in resolving this issue.
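For what it's worth, Cucumber itself does not depend on Maven; Maven only downloads the jars and drives the build. A rough sketch of compiling and running the same tests with plain javac/java, assuming the Cucumber and JUnit jars (and their dependencies) have been downloaded by hand into a lib/ folder — jar contents, paths and the runner class name are all placeholders:

```shell
# Sketch only: assumes lib/ holds the Cucumber, JUnit and dependency jars,
# downloaded manually (no Maven involved).
CP="lib/*:src/test/java"
mkdir -p bin
javac -cp "$CP" -d bin $(find src/test/java -name '*.java')
# Run the JUnit runner class (the one carrying the @CucumberOptions annotation)
java -cp "$CP:bin" org.junit.runner.JUnitCore com.example.TestRunner
```

The "test class not found" error in Eclipse is usually a build-path problem rather than a missing-Maven problem, so running from the command line like this can also help isolate it.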
Possible to execute Java Maven Project in MS TFS?
I have linked the project inside MS TFS using the POM file, but it doesn't seem to open any browser instances.
My POM file has a Surefire plugin used to execute the TestNG XML, but that doesn't seem to work either.
The tests finish too quickly and the results are inconsistent. Is this even possible?
Thanks for your help
Use your build server to make life easier. The easiest way to create a build definition for a Maven build is to use Eclipse with the TFS plugin. You connect to your project, create a new build definition, choose Maven, and it will construct a TFSBuild.proj file (yes, the really old Upgrade Template). At the very bottom of that TFSBuild.proj file, you'll need to edit it to look something like this:
Note the "Goals" entry - this can be modified to your specific goal.
In the build definition, the Configuration Folder Path will have the value of the location of your TFSBuild.Proj file - just the folder. By default, it will be created in $/YourProject/TeamBuildTypes/YourBuildName. In Source Settings, the location of the POM file is mapped to $(SourceDir).
If you are using the TFS2015 or later with vNext build system, you can refer to this link for detailed steps: Build your Java app with Maven.
I'm currently building a desktop java application in a very clumsy manner. The application is deployed on Windows, Mac and Linux. Here's my build process now:
On Windows:
Update local repository
Fire up Eclipse
Refresh the project
Double click the .jardesc file to generate an executable jar file
Commit the executable jar to source control
Open up the .nsi script and click the build button (I have the NSIS plugin installed) to produce the .exe installer
Upload installer to ftp server to publish
On Mac:
Update local repository
Run shell script to generate .dmg file using .jar in source control
Upload to ftp server to publish
On Linux:
Update local repository
Run shell script to generate .deb file using .jar in source control
Upload to ftp server to publish
I'd also like to include some extra steps in my build in the future, such as:
Setting build date
Setting the HEAD git commit-id
Performing some code obfuscation
Any suggestions on how I can streamline and speed up this process?
If you are serious about having a good build system, then I'd recommend learning and using Maven, which provides:
Comprehensive project build lifecycle management based on a declarative project definition (pom.xml)
A huge range of plugins, which I expect will be able to handle all the specific build steps you require
Very good integration with Eclipse
Full dependency management (including automatic resolution and download of dependencies)
This is not for the faint-hearted (Maven is a complex beast), but in the long run it is a great solution.
First step would be to just get everything building without Eclipse.
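For the extra steps listed in the question (build date, git commit id, obfuscation), there are Maven plugins for each. A hedged sketch of the relevant pom.xml fragment — the versions are only illustrative, and the ProGuard plugin in particular needs further configuration to be useful:

```xml
<!-- Sketch: illustrative pom.xml fragment; plugin versions are examples. -->
<build>
  <plugins>
    <!-- Records the git commit id into ${buildNumber} -->
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>buildnumber-maven-plugin</artifactId>
      <version>1.4</version>
      <executions>
        <execution>
          <phase>validate</phase>
          <goals><goal>create</goal></goals>
        </execution>
      </executions>
    </plugin>
    <!-- Stamps the build date and commit id into the jar manifest -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-jar-plugin</artifactId>
      <configuration>
        <archive>
          <manifestEntries>
            <Build-Date>${maven.build.timestamp}</Build-Date>
            <SCM-Revision>${buildNumber}</SCM-Revision>
          </manifestEntries>
        </archive>
      </configuration>
    </plugin>
    <!-- Code obfuscation, e.g. via the ProGuard plugin -->
    <plugin>
      <groupId>com.github.wvengen</groupId>
      <artifactId>proguard-maven-plugin</artifactId>
      <version>2.0.14</version>
    </plugin>
  </plugins>
</build>
```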
You might also want to consider using something like Jenkins to automate some of this. You'll still require build scripts.
A solution could look like this:
Update repository.
Jenkins detects update and builds the jar.
Jenkins saves the jar to some location.
Then you can have separate builds for each OS, also running in Jenkins. These could be triggered automatically on successful completion of the first build. These would each:
Pick up the jar from the previous build.
Publish the OS specific binary to an FTP site.
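On a modern Jenkins, that chain of jobs can also be expressed as one pipeline. A rough sketch follows; the agent labels, script names and Maven usage are all assumptions, not details from the question:

```groovy
// Hypothetical Jenkinsfile sketch of the job chain described above.
pipeline {
    agent none
    stages {
        stage('Build jar') {
            agent any
            steps {
                sh 'mvn -B package'   // or your Ant target
                stash includes: 'target/*.jar', name: 'app-jar'
            }
        }
        stage('Package installers') {
            parallel {
                stage('Windows') {
                    agent { label 'windows' }     // hypothetical label
                    steps {
                        unstash 'app-jar'
                        bat 'build-installer.cmd' // hypothetical NSIS wrapper
                    }
                }
                stage('Mac') {
                    agent { label 'mac' }
                    steps {
                        unstash 'app-jar'
                        sh './make-dmg.sh'        // hypothetical script
                    }
                }
                stage('Linux') {
                    agent { label 'linux' }
                    steps {
                        unstash 'app-jar'
                        sh './make-deb.sh'        // hypothetical script
                    }
                }
            }
        }
    }
}
```

Each OS-specific stage runs on an agent of the matching platform, picking up the jar produced by the first stage, which mirrors the separate-builds setup described above.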
Ant is a good start, but you may also want to look at Apache Ivy or Maven, as these will help a bit with managing your build outputs and dependencies.
You should have a look at Ant: https://ant.apache.org/
Apache Ant is a Java library and command-line tool whose mission is to drive processes described in build files as targets and extension points dependent upon each other. The main known usage of Ant is the build of Java applications.
Also, a long list of build systems: https://en.wikipedia.org/wiki/List_of_build_automation_software
Step two of "The Joel Test: 12 Steps to Better Code" states "Can you make a build in one step?". My answer to this is currently no. My application is structured as follows:
+
+-MyApp // this is just a vanilla Java Application
+-MyWebApp // this is a dynamic Java web application (deployed to Tomcat;
           // launches a thread contained in MyApp)
+-MyCommonStuff // these are common classes shared between MyApp and MyWebApp
// Ex. Database access code & business classes
In order to build and deploy my software I perform the following steps:
1. Checkout MyApp, MyWebApp, MyCommonStuff from svn
2. build MyCommonStuff.jar and copy to a "libs" directory
3. build MyApp and copy to a "libs" directory
4. build MyWebApp.war (Ant build.xml file specifies where MyApp.jar and MyCommonStuff.jar are located)
5. The deploy portion of build.xml uses Tomcat deployment tasks to deploy to a Tomcat server.
My question is: does the Joel rule above apply to this scenario? I.e., should there be a "master" build script which executes steps 1 to 5?
Should the script just be a normal #!/bin/sh script, or are there tools I can leverage? My preference would be to stick to using Ant and Linux console commands.
Thanks
You can (and should) use Maven 2. It supports everything required (via plugins). You just need to conform to its directory conventions.
In addition, I'd suggest a Continuous Integration engine, which will take your Maven configuration and execute and deploy everything. Hudson and TeamCity are good options.
An alternative to Maven, if you just want to use Ant, is Ivy. This is just a dependency manager, a bit like Maven but without all the other stuff Maven does.
I would suggest using one of the two. If you have a project with dependencies like this, you're going to make it so much easier for yourself if you store them in a central repository and use a dependency manager to include them!
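With Ivy, the shared jars from the question would be declared as dependencies rather than copied around by hand. A minimal ivy.xml sketch — the organisation name and revision scheme are placeholders:

```xml
<!-- Sketch: minimal ivy.xml for MyWebApp declaring the shared jars as
     dependencies pulled from a repository. Names/revisions are assumptions. -->
<ivy-module version="2.0">
  <info organisation="com.example" module="MyWebApp"/>
  <dependencies>
    <dependency org="com.example" name="MyCommonStuff" rev="latest.integration"/>
    <dependency org="com.example" name="MyApp" rev="latest.integration"/>
  </dependencies>
</ivy-module>
```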
You could write a global Ant script that calls all the little Ant parts through the Ant "ant" task.
Edit after reading other answers: you should also consider Maven. But if Maven is really overkill and you just want to launch the whole build in one step, use a global build.xml.
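A global build.xml along those lines might look like the sketch below. The project directories match the layout from the question, but the sub-project target names ("jar", "war", "deploy") and dist/ paths are assumptions about what the existing build files provide:

```xml
<!-- Sketch: master build file that drives steps 2-5 in one invocation.
     Sub-project target names and output paths are assumptions. -->
<project name="master" default="deploy">
  <target name="build-common">
    <ant dir="MyCommonStuff" target="jar"/>
    <copy file="MyCommonStuff/dist/MyCommonStuff.jar" todir="libs"/>
  </target>
  <target name="build-app" depends="build-common">
    <ant dir="MyApp" target="jar"/>
    <copy file="MyApp/dist/MyApp.jar" todir="libs"/>
  </target>
  <target name="build-war" depends="build-app">
    <ant dir="MyWebApp" target="war"/>
  </target>
  <target name="deploy" depends="build-war">
    <ant dir="MyWebApp" target="deploy"/>
  </target>
</project>
```

With that in place, "ant deploy" from the top-level directory covers everything except the initial svn checkout, which a one-line wrapper script can handle.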
I have a series of Eclipse projects containing a number of plugins and features that are checked into CVS. I now need to run an automated build of these plugins. Ideally I'd like to do it without having to hardcode large numbers of Eclipse library locations by hand, which has been the problem with the automatically generated Ant files that Eclipse provides. The build also needs to run headlessly.
Does anyone have experience of this sort of set-up with Eclipse, and recommendations for how to achieve it?
There are a few options for you to look at, depending on which build scripting language you're using:
For Maven2, the way forward seems to be Spring Dynamic Modules. Other options are Pax Construct, m2eclipse, Maven BND
For Ant/Gant, Eclipse PDE Build, Ant4Eclipse
For command line or both the above, Buckminster.
At my current client we use Buckminster, which wraps PDE Build, and call it from Ant/CruiseControl. We've got code coming in from multiple repositories, all being built into a single RCP product.
Also, these questions may be of help.
The standard way to make an Eclipse Build is to use the PDE Build Plugin.
http://help.eclipse.org/help32/index.jsp?topic=/org.eclipse.pde.doc.user/guide/tasks/pde_feature_build.htm
http://wiki.eclipse.org/index.php/PDEBuild
The PDE Build plugin is normally included with the Eclipse IDE and contains a series of templates. The templates help you set up a system that will:
fetch: Checks out all plugins and features using a map file that contains the locations of the plugins
generate: Creates a build process for every plugin checked out
process: Compiles the plugins
assemble: Jars and packs the plugins
postBuild: Lets you set up automatic tests and deployment
Theoretically, all you need to do is modify a customTargets.xml file, write a map file that contains a reference to every plugin that you need to check out, and modify a build.properties file to indicate properties such as the CVS server location.
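For reference, a map file entry looks roughly like this (plugin id, tag, CVS server and module path are all placeholders; check the PDE Build documentation for the exact syntax of your Eclipse version):

```
plugin@com.example.myplugin=v1_0_0,:pserver:anonymous@cvs.example.org:/cvsroot/project,,com.example.myplugin
feature@com.example.myfeature=v1_0_0,:pserver:anonymous@cvs.example.org:/cvsroot/project,,com.example.myfeature
```

The fetch phase reads these lines to decide what to check out and which tag to use, so the build never needs plugin locations hardcoded in Ant files.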
I had a similar problem to the one you have. The build mechanism is divided into several steps. You can customize the preFetch target of the customTargets.xml file so some "bulk" libraries are imported from specific trees in the repository and add them to the build directory, so you don't have to specify every single plugin in the map.
You can use Tycho to build your eclipse plugins with Maven. This is how the M2eclipse plugin is built. Find out more at http://m2eclipse.sonatype.org
You could write some sort of a script that finds those libraries for you and puts them into a format understandable by Ant.
For example, it could build an eclipse.libraries.properties file, which you could then read in using:
<property file="eclipse.libraries.properties" />
You could also use the FileSet attribute:
http://ant.apache.org/manual/Types/fileset.html
Or even a combination of both.
1) Call Ant Script
2) Ant Script calls bash (or whatever scripting language) script which builds eclipse.libraries.properties
3) Ant loads eclipse.libraries.properties
4) Ant goes on with the build
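Steps 1-3 of that flow might be sketched in Ant like this; the shell script name is a placeholder for whatever discovers the Eclipse libraries, and the property name used in the final javac is likewise an assumption:

```xml
<!-- Sketch: generate the properties file with a shell script, then load it
     before the real build runs. Script and property names are hypothetical. -->
<target name="resolve-eclipse-libs">
  <exec executable="bash" failonerror="true">
    <arg value="find-eclipse-libraries.sh"/>
  </exec>
  <property file="eclipse.libraries.properties"/>
</target>

<target name="build" depends="resolve-eclipse-libs">
  <!-- the rest of the build can now use the loaded properties -->
  <javac srcdir="src" destdir="bin" classpath="${eclipse.libs.classpath}"/>
</target>
```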