Currently I am working on a project that integrates GitLab + Jenkins + Maven.
This is a Maven Java project, and we have unit tests (UT) and integration tests.
I designed a pipeline for the CI that looks like this:
1. Build the Core package and run UT
2. Build the WebPage and run UT
3. Run integration tests written in Cucumber
4. Deploy to a staging server
On paper this looks good, but now that I am trying to implement it I am running into some issues.
Is there a simple way to save the Java packages per Git branch? Each branch will compile and create a JAR file in step 1 that will be needed in step 2.
In step 3, how can I use the WAR built in step 2 to run the tests? Currently everything is inside the Maven lifecycle, and I cannot find a way to split it.
Thanks
You can use Jenkins to perform this job.
I don't think you want to be putting build assets into source control. You should be able to use Jenkins to run the individual steps of your Maven build and use the intermediate JAR and WAR files it creates in its working directory. If you can't split them into separate steps within Maven for some reason, you may need to simply run the whole Maven build and then run the tests against the working files, which it shouldn't remove.
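For example, a minimal declarative Jenkinsfile sketch of that idea (the module names core, webpage and integration-tests are hypothetical; in a multibranch pipeline each branch gets its own builds, so archived artifacts are already kept per branch):

pipeline {
    agent any
    stages {
        stage('Build Core + UT') {
            steps {
                sh 'mvn -pl core -am clean install'   // installs the core JAR so later steps can resolve it
            }
        }
        stage('Build WebPage + UT') {
            steps {
                sh 'mvn -pl webpage package'          // resolves the core JAR installed above
            }
        }
        stage('Integration Tests') {
            steps {
                sh 'mvn -pl integration-tests verify' // Cucumber tests bound to the Failsafe plugin
            }
        }
        stage('Archive + Deploy to Staging') {
            steps {
                archiveArtifacts artifacts: '**/target/*.jar, **/target/*.war'  // kept per branch/build in Jenkins
                // deployment command or plugin step goes here
            }
        }
    }
}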
Related
I am working on understanding Maven and learning how to build a Java app with it.
So when I run:
mvn package
It builds my JAR as expected, but I see in the console output that Maven also runs tests (it always says that the tests ran and there were no failures).
I researched this on the web and learned that Maven uses a plugin called Maven Surefire. But I can't understand what that plugin does to my code, or what the tests "mean". What do the tests do with my code, and how does it work behind the console?
The Maven Surefire plugin runs the tests you have written. These usually live in the src/test/java folder. If you have none, the plugin does nothing.
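For example, a minimal JUnit test class like the following, placed under src/test/java, is what Surefire would pick up during mvn package (this class is purely illustrative; real tests would exercise your own production classes):

package com.example;

import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Surefire's default includes match classes named Test*, *Test, *Tests or *TestCase.
public class StringUtilsTest {

    @Test
    public void upperCasesGreeting() {
        // In a real project this assertion would call one of your production classes.
        assertEquals("HELLO", "hello".toUpperCase());
    }
}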
Is this only one question? :D
So, there are different things going on here.
You create an application with Java. To test the individual components / packages / classes that you create, most people use JUnit or TestNG. You usually have dedicated test classes that verify your production code behaves as intended, without you having to click through everything on every change.
When you then use Maven to run your build, the pom.xml file defines a packaging - in your case "jar", since you create a JAR file. The packaging determines which default plugins are bound to the defined Maven phases. You probably recognize package here: Maven executes all phases up to and including package, along with the registered / configured plugins.
To execute those tests, Maven provides the Surefire plugin, which supports running JUnit or TestNG tests. If you follow the directory conventions (your tests reside in src/test/java) and the Surefire naming conventions, Maven will execute those tests in every build (which is the best practice). If you also want to write integration tests, there is the Failsafe plugin; it is not enabled by default and runs in different Maven phases.
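As a rough sketch, enabling Failsafe in the pom.xml looks something like this (the version is just an example; by default it picks up classes named *IT during the integration-test and verify phases):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <version>3.2.5</version>  <!-- use the current version for your build -->
  <executions>
    <execution>
      <goals>
        <goal>integration-test</goal>  <!-- runs the *IT classes -->
        <goal>verify</goal>            <!-- fails the build afterwards if any test failed -->
      </goals>
    </execution>
  </executions>
</plugin>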
So the tests just run your production code - in fact they just do what you implement in the tests. They don't alter it in any way.
The Maven introduction documentation has step-by-step explanations: Maven in 5 Minutes and the Getting Started Guide.
Starting from scratch, this is probably a lot, so don't rush it. The build setup and test setup are very important things to have.
I'm doing a POC for a Java project in an Azure DevOps CI/CD pipeline. I created a Maven project with Selenium tests (TestNG) that run against a demo website which is independent of my project. I want to run unit tests in the build pipeline and UI Selenium tests in the release pipeline.
The Visual Studio Test task seems to be the building block I need. I think you can differentiate between unit tests and UI tests using the 'Test files' field, like **\unit*Test.dll, **\ui*Test. Unfortunately, this task is not available/compatible with Java projects.
I was able to run the Selenium tests with the Maven task and the Surefire plugin during the build, but remember, I only want to run unit tests during the build.
I actually was able to run the Selenium tests in the release pipeline via a workaround which was:
Copy the whole project to the artifacts directory during the build (Copy Files task), so it is available to the release.
Add a Maven task to the release pipeline
Trigger the Selenium tests in pom.xml
Normally, you would only copy artifacts to the artifact directory, so I think doing that is a huge hack.
Another problem is that Maven will build the project during both build and release, which is wasteful. To dial back the waste, some savvy Maven configuration might help. I was thinking about skipping compilation and only resolving dependencies during release, but I don't know where to find the Maven dependencies in the DevOps ecosystem.
Am I missing something, or does Azure DevOps maybe not support Java all that well?
I do follow a method for Maven Selenium tests on Azure DevOps. In the build pipeline I build my tests in such a way that it produces a JAR with all the dependencies and test classes in it; I also use TestNG in my approach. Next I copy my build artifact to Artifactory, which completes the build. During the release I download the artifact from Artifactory, check which environment I want to run against, and inject the right TestNG suite by running java -jar myfile.jar testngIT.xml. This runs my tests faster and better.
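A hedged sketch of the release-side invocation (the jar and suite file names are the ones from the description above; the -cp form works even if the jar's manifest doesn't declare a main class):

# After downloading myfile.jar from Artifactory in the release stage:
java -cp myfile.jar org.testng.TestNG testngIT.xml   # runs the suite for the chosen environment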
You can try adding a test task to your release pipeline, just as in the build pipeline.
Then add a copy task to the build pipeline to copy the test code and files into the build artifacts, and publish them to the release pipeline.
The steps below are just for reference (in the classic view); a YAML sketch of the same pattern follows the list. Hope it can be of some help.
1. Add a Copy Files task to the build pipeline to copy all the test files and dependent settings files to a test folder in the artifacts.
2. Publish the artifacts to the release pipeline.
3. In the release pipeline, add a task to execute the tests, just like the way you do it in the build pipeline.
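If you use YAML pipelines rather than the classic editor, the build-side part of this pattern might look roughly like the following (task inputs and paths are illustrative, not a definitive setup):

steps:
  - task: CopyFiles@2
    inputs:
      Contents: |
        pom.xml
        src/test/**
      TargetFolder: '$(Build.ArtifactStagingDirectory)/ui-tests'
  - task: PublishBuildArtifacts@1
    inputs:
      PathtoPublish: '$(Build.ArtifactStagingDirectory)/ui-tests'
      ArtifactName: 'ui-tests'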
I've created a Maven project with Selenium and Cucumber. I'm trying to use Jira Xray in a continuous integration setup. Basically I take exported feature files and want to execute them on the command line using Bamboo.
I think my main problem is that I'm not sure how to feed feature files into a compiled Maven project that contains the step definitions.
I have features defined in src/test/resources/shouty
If I only want to run the location.feature using Maven, then I can use the command
mvn test -Dcucumber.options="src/test/resources/shouty/location.feature"
What you want to do is specify the feature in the CI job using Maven, as above.
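In Bamboo you could then parameterise that command with a plan variable, so each exported feature can be passed in without changing the job (the variable name featureFile is just an example):

mvn test -Dcucumber.options="${bamboo.featureFile}"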
I was introduced to the concept of CI lately and have been trying to work with Jenkins CI. I am stuck on one thing: how to trigger executable TestNG files in Jenkins CI. For example, locally on our machines we just run testng.xml to execute a couple of test cases. In the same way, how can we trigger this XML file to run in Jenkins CI?
In most cases with Jenkins you wouldn't use an executable. Normally you'd run the test runner (JUnit/NUnit etc.), which Jenkins is fully capable of running on its own.
You can use this article to run TestNG tests using Maven:
Running TestNG tests using maven
After the configuration is complete, just add an Invoke top-level Maven targets step to the Build Steps in Jenkins (the Maven plugin should be installed). The target should be test in this case.
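The configuration typically boils down to pointing Surefire at your suite file in the pom.xml, roughly like this (assuming testng.xml sits in the project root):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <suiteXmlFiles>
      <suiteXmlFile>testng.xml</suiteXmlFile>  <!-- the same file you run locally -->
    </suiteXmlFiles>
  </configuration>
</plugin>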
If you face any errors during configuration, try to Google them.
If you are not using any build tool like Maven or Ant, you can invoke TestNG from the command line as well and specify your suite file. Make sure to set the correct classpath: http://testng.org/doc/documentation-main.html#running-testng
You can put this as a build step in Jenkins.
Add a compilation step prior to this one. I haven't ever tried it - I have always used Ant or Maven - but that is where I would start exploring.
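A rough sketch of those two steps as shell build steps in Jenkins, assuming the TestNG jar and other dependencies live in a lib folder (paths are illustrative; on Windows use ; as the classpath separator):

javac -cp "lib/*" -d out $(find src/test/java -name "*.java")   # compile the test sources
java -cp "lib/*:out" org.testng.TestNG testng.xml                # run the suite file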
Step two of "The Joel Test: 12 Steps to Better Code" states "Can you make a build in one step?". My answer to this is currently no. My application is structured as follows:
+
+-MyApp          // a vanilla Java application
+-MyWebApp       // a dynamic Java web application (deployed to Tomcat; launches
                 // a thread contained in MyApp)
+-MyCommonStuff  // common classes shared between MyApp and MyWebApp,
                 // e.g. database access code & business classes
In order to build and deploy my software I perform the following steps:
1. Checkout MyApp, MyWebApp, MyCommonStuff from svn
2. build MyCommonStuff.jar and copy to a "libs" directory
3. build MyApp and copy to a "libs" directory
4. build MyWebApp.war (Ant build.xml file specifies where MyApp.jar and MyCommonStuff.jar are located)
5. The deploy portion of build.xml uses Tomcat deployment tasks to deploy to a Tomcat server.
My question is: does the Joel rule above apply to this scenario? I.e., should there be a "master" build script that executes steps 1 through 5?
Should the script just be a normal #!/bin/sh script, or are there tools I can leverage? My preference would be to stick with Ant and Linux console commands.
Thanks
You can (and should) use Maven 2. It supports everything required (via plugins); you just need to conform to its directory conventions.
In addition, I'd suggest a continuous integration engine, which will take your Maven configuration and build and deploy everything. Hudson and TeamCity are good options.
An alternative to Maven, if you just want to use Ant, is Ivy. This is just a dependency manager, a bit like Maven but without all the other stuff Maven does.
I would suggest using one of the two. If you have a project with dependencies like this, you're going to make it so much easier for yourself if you store them in a central repository and use a dependency manager to include them!
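With Ivy, each module would declare its dependencies in an ivy.xml and pull the published MyCommonStuff jar from the shared repository instead of copying it by hand - a minimal sketch, with placeholder organisation and revision values:

<ivy-module version="2.0">
  <info organisation="com.example" module="MyWebApp"/>
  <dependencies>
    <dependency org="com.example" name="MyCommonStuff" rev="1.0"/>
    <dependency org="com.example" name="MyApp" rev="1.0"/>
  </dependencies>
</ivy-module>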
You should write a global Ant script that calls all the little Ant builds through Ant's ant task.
Edit after reading other answers: you should also consider Maven. But if Maven is really overkill and you just want to launch the whole build in one step, use a global build.xml.
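A minimal sketch of such a master build.xml, assuming each project already has its own build file exposing the target names shown here (adjust directories and targets to your layout):

<project name="master" default="deploy">
  <target name="common">
    <ant dir="MyCommonStuff" target="jar"/>   <!-- step 2: build MyCommonStuff.jar -->
  </target>
  <target name="app" depends="common">
    <ant dir="MyApp" target="jar"/>           <!-- step 3: build MyApp -->
  </target>
  <target name="webapp" depends="app">
    <ant dir="MyWebApp" target="war"/>        <!-- step 4: build MyWebApp.war -->
  </target>
  <target name="deploy" depends="webapp">
    <ant dir="MyWebApp" target="deploy"/>     <!-- step 5: deploy to the Tomcat server -->
  </target>
</project>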