Background:
We have a rather large REST API written in Java that we're testing with a combination of unit and functional tests. There are many variations required when testing it, particularly at the functional level. While the unit tests live in-tree, the functional tests are in a separate code repository.
We are currently using JaCoCo for test coverage and TestNG for running our unit tests, though I believe answers to my question should be applicable to other tool combinations.
We have several different jobs in Jenkins that are triggered by a check-in to the primary project. These include jobs that run tools like Coverity as well as several different functional test jobs. These jobs are triggered by the initial commit, which is not considered "green" until all of the downstream jobs complete successfully.
The Problem:
How do we take coverage reports (like the JaCoCo binaries and the TestNG XML files) and combine them to show total code coverage across all of our tests? Specifically, we know how to combine them when they are present in the same job/directory, but these files are spread across multiple Jenkins jobs which may be running at different times.
In my experience, the most commonly accepted way of handling this is to use the Promoted Builds Plugin to trigger all jobs, then pull their artifacts down to the triggering job when they complete. I don't feel this scales very well, however, when you have more than one or two jobs that you're attempting to roll up. This is especially true when you may have more than one variation on the master project (old releases, etc.).
I understand it is possible to fingerprint files in Jenkins so that we know that -.jar is the same version used in jobs A, B, and C. Does a plugin exist that can retrieve all files matching a pattern based on the existence of a different fingerprinted file?
One alternative solution (which would probably be run from an Ant/Groovy script) is to push the test data to a directory somewhere, keyed by the git commit hash, and have a roll-up job retrieve all such data based on the commit hash of the base project.
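As a sketch of that idea (hypothetical paths; it assumes each job can resolve the commit hash it built, and uses JaCoCo's command-line interface, jacococli.jar, which ships with newer JaCoCo releases):

# In each test job: stash the coverage binary under the commit hash
HASH=$(git rev-parse HEAD)
cp target/jacoco.exec /shared/coverage/$HASH/$JOB_NAME-jacoco.exec

# In the roll-up job: merge everything recorded for that hash
java -jar jacococli.jar merge /shared/coverage/$HASH/*-jacoco.exec --destfile jacoco-merged.exec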
Are there any simple ways to do this? Has anyone figured out any better ways to solve this problem?
Thanks,
Michael
I faced a similar issue and tweaked the jacoco-maven-plugin configuration to merge the JaCoCo results: basically, I merged jacoco-unit.exec and jacoco-it.exec into one binary and published that merged result on Jenkins via a pipeline step.
pom.xml:
<plugin>
    <inherited>false</inherited>
    <groupId>org.jacoco</groupId>
    <artifactId>jacoco-maven-plugin</artifactId>
    <version>${jacoco.agent.version}</version>
    <executions>
        <execution>
            <id>merge-results</id>
            <phase>post-integration-test</phase>
            <goals>
                <goal>merge</goal>
            </goals>
            <configuration>
                <fileSets>
                    <fileSet>
                        <directory>${project.parent.build.directory}</directory>
                        <includes>
                            <include>jacoco-*.exec</include>
                        </includes>
                    </fileSet>
                </fileSets>
                <destFile>${project.parent.build.directory}/jacoco.exec</destFile>
            </configuration>
        </execution>
    </executions>
</plugin>
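Since the merge execution is bound to the post-integration-test phase, a regular build that reaches that phase should perform the merge automatically (assuming the unit and integration test runs have already written the jacoco-*.exec files into the parent build directory):

mvn clean verify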
Jenkinsfile:
echo 'Publish Jacoco trend'
step([$class: 'JacocoPublisher',
      execPattern: '**/jacoco.exec',
      classPattern: '**/classes',
      sourcePattern: '**/src/main/java'])
However, you still have to fetch the JaCoCo binaries from the other Jenkins builds with another build step, or specify their locations explicitly.
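For that fetching step, one option is the Copy Artifact plugin. A sketch (the job names are hypothetical placeholders for your actual downstream jobs):

// Pull archived jacoco-*.exec files from downstream jobs into this
// workspace before merging/publishing (requires the Copy Artifact plugin)
copyArtifacts projectName: 'functional-tests-job',
              selector: lastSuccessful(),
              filter: '**/jacoco-*.exec',
              target: 'coverage/functional'
copyArtifacts projectName: 'unit-tests-job',
              selector: lastSuccessful(),
              filter: '**/jacoco-*.exec',
              target: 'coverage/unit'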
I am looking for the best way to measure code coverage for Cucumber tests (cucumber-jvm).
I found Cobertura, but I don't really know how to use and configure it to measure code coverage for acceptance tests, and I can't find anything useful on how to do that... (For the moment, I have just added the Cobertura Maven plugin, but I don't know what configuration it needs.)
Do you have any ideas?
If you think I should use any other tool than Cobertura, please tell me :)
Thank you
Before you try to use Cobertura, make sure you understand what it does and whether that applies to your case. Cobertura in fact IS a tool that measures code coverage, BUT it is important to understand how it does that.
Cobertura (and jcoverage, which it's based on) calculates the percentage of code covered by tests, meaning that it actually checks which lines of code were touched! That is very different from the functional (or business-domain) test coverage described by BDD tools like Cucumber, which you are using.
That said, to use Cobertura you have two options:
Single run
Just include it in your dependencies in pom.xml and run: mvn cobertura:cobertura
Integrate into Maven lifecycle
Add the plugin to your pom.xml:
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>cobertura-maven-plugin</artifactId>
    <version>2.6</version>
    <configuration>
        <formats>
            <format>html</format>
            <format>xml</format>
        </formats>
    </configuration>
</plugin>
and run mvn clean site-deploy to execute the plugin.
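If you don't need to deploy the site, a plain site build should also generate the report, assuming the plugin is declared in the <reporting> section (the HTML output lands under target/site/cobertura by default):

mvn clean site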
I'm running my tests with Jenkins and Maven and have different test suites in several TestNG XML files.
Now I manage them directly in the pom in this way:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>${surefire.version}</version>
    <configuration>
        <suiteXmlFiles>
            <suiteXmlFile>Testing_Fuzzy_Logic.xml</suiteXmlFile>
            <suiteXmlFile>Testing_ACL.xml</suiteXmlFile>
        </suiteXmlFiles>
    </configuration>
</plugin>
But I'd like to be able to select a specific test suite when starting a Jenkins build, and I'd like all the suites to be included in the build by default if I don't select a specific one. So my question is:
What is the proper way to expose all my TestNG XML files as Jenkins parameters and pass them to the pom file so Maven uses them when running the tests?
Something like this: a Jenkins string variable holding the suite file names, which is then wired into the pom somehow (the original post illustrated both with screenshots).
The issue has been solved by the example above. At first there was an error in a file name, which confused me; I fixed it, and my example works fine now.
So one solution is a string TEST_NG_XML parameter added to a parameterized Jenkins build. All test suites are set there by default, separated by commas; when needed, any of them can be removed (see the examples above and the sketch below).
But if there are any other solutions, I'll be delighted to see them.
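A minimal sketch of that wiring, assuming a Jenkins string parameter named TEST_NG_XML whose default value lists every suite, and relying on Surefire's standard surefire.suiteXmlFiles user property rather than hard-coding <suiteXmlFiles> in the pom:

# Jenkins build step; TEST_NG_XML defaults to
# "Testing_Fuzzy_Logic.xml,Testing_ACL.xml" and can be trimmed per build
mvn test -Dsurefire.suiteXmlFiles="$TEST_NG_XML"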
On Jenkins, you can pass top-level Maven goals in the following format:
test -DargLine="-DTest_NG_XML=src/test/resources/testng.xml"
In this way, you are passing the value of your Test_NG_XML variable while running the Maven test phase. In general, execute the following Maven goal on Jenkins:
-DargLine="-Dparameter=value"
test -DsuiteXmlFile=src/test/resources/testng1.xml,src/test/resources/testng2.xml
How can one be sure of the order of execution of test suites passed this way?
I was facing the same issue; I split it into separate Jenkins builds, each passing a separate suite, to be absolutely sure a certain suite runs before the other.
I would like to simplify it, possibly by passing suites as indicated above... but did you verify the order of execution? Thanks
I think I'm getting memory leaks when using Jenkins to execute my unit tests. If I try to execute more than ~60 unit tests, most tests start to fail with java.lang.OutOfMemoryError: PermGen space. Often, but not always, the stack trace seems to start in or near org.powermock.core.classloader.MockClassLoader, although it's not consistent. The Maven Surefire plugin configuration is pretty straightforward:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.18</version>
    <executions>
        <execution>
            <phase>test</phase>
            <configuration>
                <reuseForks>false</reuseForks>
                <argLine>-XX:PermSize=512m -XX:MaxPermSize=1024m</argLine>
            </configuration>
        </execution>
    </executions>
</plugin>
In Jenkins, MAVEN_OPTS is also set to -XX:MaxPermSize=1024m.
I saw some documents suggesting it might be related to the fact that I was using an older version of powermock, so I upgraded to 1.6.0, but I am still experiencing this error.
I can't reproduce the problem locally; it only seems to happen on the Jenkins server.
I'm not sure how to reliably resolve this: limiting the number of test cases that execute seems to work OK, but I have 150+ test cases to execute, and running batches of 50 tests at a time on the server does not seem like a very good solution. I might be able to give it a bit more memory, but it seems like it already has enough, and I don't think Surefire needs that much memory when it runs locally. There might be a way to play around with some of the other Surefire settings, but I'm not sure which ones I'd need to adjust, or how. Has anyone else ever seen this, or have a suggestion for how to resolve it?
This might be relevant: The development environment is IBM's RAD, and the workspace is launched with the option -Xgcpolicy:gencon, which as far as I can tell is specific to IBM's implementation of the JVM. Might this be the reason that the unit tests run fine when I run maven from RAD, but not from Jenkins? If so, what would be an equivalent option for the standard (Oracle) JVM, which Jenkins is using?
The problem is solved. I never figured out where the memory leaks were, but I noticed that in the console, maven would fork for surefire but never included the arguments I passed via <argLine>. When I added the same arguments to the maven command as:
mvn test -DargLine="-XX:MaxPermSize=1024m -Xmx768m"
all the tests executed fine, with no OutOfMemory issues. So I think that the <argLine> element might not work correctly.
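For what it's worth, a likely explanation (my reading of the pom above, not something confirmed in the original post): the <configuration> block sits inside a custom <execution>, so it never applies to Surefire's built-in default-test execution, and the forked JVM starts without those arguments. Moving the configuration up to the plugin level should make the <argLine> take effect:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.18</version>
    <!-- plugin-level configuration applies to the default-test execution -->
    <configuration>
        <reuseForks>false</reuseForks>
        <argLine>-XX:PermSize=512m -XX:MaxPermSize=1024m</argLine>
    </configuration>
</plugin>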
I am trying to run the Maven Clover plugin to generate a report as well as generate a NON-instrumented artifact.
<plugin>
    <groupId>com.atlassian.maven.plugins</groupId>
    <artifactId>maven-clover2-plugin</artifactId>
    <version>3.1.3</version>
    <configuration>
        <generatePdf>true</generatePdf>
        <generateHtml>true</generateHtml>
        <licenseLocation>clover.license</licenseLocation>
        <!-- the contextFilters element has to be specified within the reporting section and will not work if you specify it in the build section. -->
        <!-- contextFilters>try,static,catch</contextFilters -->
    </configuration>
</plugin>
mvn clean clover2:instrument clover2:clover install
If I run the above then, according to the Clover docs, the instrument goal runs in a separate lifecycle and does not affect the default build lifecycle. And so it does, but the problem is that I want to skip the tests during the default build lifecycle.
I tried the following, but it skipped the tests for both lifecycles.
mvn clean clover2:instrument clover2:clover install -DskipTests
If the above worked, I could simply set it up on Jenkins without creating multiple jobs for multiple Maven commands.
It is probably not the best idea to do everything in a single cryptic Maven command (in the same way that it is not the best idea to put all your code in one procedure). Why not split the command into several steps, or even jobs that trigger one another? Moreover, from a CI point of view, different kinds of jobs have different priorities for failing fast. I do understand that this is not exactly an answer.
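Following that idea, a sketch of the split (same goals as in the question; the tests run inside Clover's forked instrumented lifecycle in the first step, so -DskipTests only affects the second):

# Step 1: instrumented build + coverage report (runs the tests)
mvn clean clover2:instrument clover2:clover

# Step 2: non-instrumented artifact; tests already ran in step 1
mvn install -DskipTests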
I have a lot of Java source code that requires custom pre-processing. I'd like to be rid of it, but that's not feasible right now, so I'm stuck with it. Given that I have an unfortunate problem that shouldn't have existed in the first place, how do I solve it using Maven?
(For the full story, I'm replacing a Python-based build system with a Maven one, so one improvement at a time, please. Fixing the non-standard source code is harder, and will come later.)
Is it possible, using any existing Maven plugins, to actually alter the source files during compile time? (Obviously leaving the original, unprocessed code alone.)
To be clear, by preprocessing I mean preprocessing in the same sense that Antenna or a C compiler would preprocess the code, and by custom I mean that it's completely proprietary and looks nothing at all like C or Antenna preprocessing.
There is a Java preprocessor with Maven support: java-comment-preprocessor
This is very doable, and I've done something very similar in the past. Here is an example from a project of mine, where I used the antrun plug-in to execute an external program to process sources:
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-antrun-plugin</artifactId>
            <executions>
                <execution>
                    <id>process-sources</id>
                    <phase>process-sources</phase>
                    <configuration>
                        <tasks>
                            <!-- Put the code to run the program here -->
                        </tasks>
                    </configuration>
                    <goals>
                        <goal>run</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
Note the <phase> tag, where I indicate the lifecycle phase in which this runs. Documentation for the lifecycles in Maven is here. Another option is to write your own Maven plug-in that does this. It's a little more complex, but also doable. You would still configure it similarly to what I have documented here.
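As an illustration of what might go inside <tasks> (purely hypothetical: the preprocessor script, its arguments, and the output directory are placeholders for whatever your custom tool looks like):

<tasks>
    <!-- run the custom preprocessor over the original sources, writing the
         processed copies to a generated-sources directory so the originals
         stay untouched -->
    <exec executable="python" failonerror="true">
        <arg value="tools/preprocess.py"/>
        <arg value="${basedir}/src/main/java"/>
        <arg value="${project.build.directory}/generated-sources/preprocessed"/>
    </exec>
</tasks>

You would then typically point the compiler at the processed copies rather than the raw ones, for example by registering the output directory as a source root (the build-helper-maven-plugin's add-source goal can do this).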
Maven plugins can hook into the build process at pre-compile time, yes; as for whether or not any existing ones will help, I have no idea.
I wrote a Maven plugin a couple of years ago as part of a university project, though, and while the documentation was a bit lacking at the time, it wasn't too complicated. So you may look into rolling your own; there should be plenty of open source projects you can rip ideas or code from (ours was BSD-licenced, for instance...).