I use the JaCoCo plugin for Gradle:
apply plugin: 'kotlin'

jacoco {
    toolVersion = "0.7.9"
}

jacocoTestReport {
    reports {
        xml.enabled true
        html.enabled false
        csv.enabled false
    }
}
and then I want to build a package for production:
./gradlew build jacocoTestReport
The question is: will the generated package be instrumented by JaCoCo? If yes, how can I build a package that is NOT instrumented, i.e. ready for production, and still get a code coverage run? Do I have to run the build twice? Is it impossible to build the code once (and sign it), then test it, measure coverage etc., and deploy it if all checks pass?
JaCoCo provides two ways of performing instrumentation:
the so-called "on-the-fly" mode using a Java agent - http://www.jacoco.org/jacoco/trunk/doc/agent.html
and the so-called "offline" mode - http://www.jacoco.org/jacoco/trunk/doc/offline.html
The difference is that in the first case instrumentation happens in memory during execution, so no class or jar files are changed on disk - quoting the second link:
One of the main benefits of JaCoCo is the Java agent, which instruments classes on-the-fly. This simplifies code coverage analysis a lot as no pre-instrumentation and classpath tweaking is required.
So one of the simplifications that the Java agent brings is exactly that you don't need to worry about packaging or multiple builds. This is IMO one of the advantages of JaCoCo over other coverage tools for Java such as Cobertura and Clover.
And this is one of the reasons why it is highly recommended to use on-the-fly instrumentation - quoting http://www.jacoco.org/jacoco/trunk/doc/cli.html :
the preferred way for code coverage analysis with JaCoCo is on-the-fly instrumentation with the JaCoCo agent. Offline instrumentation has several drawbacks and should only be used if a specific scenario explicitly requires this mode.
One such specific scenario is the execution of tests on Android, because there is no way to use a Java agent there. So AFAIK the Android Plugin for Gradle, when instructed to measure coverage using JaCoCo, uses offline instrumentation and therefore requires two types of build: one with coverage and one without for release.
On the other hand, the JaCoCo Gradle Plugin, which integrates JaCoCo into Gradle for Java projects, AFAIK as of today only provides the ability to perform on-the-fly instrumentation, not offline.
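To tie this back to the Gradle snippet from the question: the jacoco Gradle plugin attaches the agent only to the forked test JVM, so the jar produced by build stays uninstrumented and a single ./gradlew build jacocoTestReport run gives you both a production-ready artifact and a coverage report. A minimal sketch, assuming the standard java and jacoco plugins (the test { jacoco { ... } } block only spells out the default agent output location):

apply plugin: 'java'   // or 'kotlin', as in the question
apply plugin: 'jacoco'

jacoco {
    toolVersion = "0.7.9"
}

// The agent is added to this forked JVM only; compiled classes and the jar stay untouched.
test {
    jacoco {
        destinationFile = file("$buildDir/jacoco/test.exec")   // default location, shown for clarity
    }
}

jacocoTestReport {
    reports {
        xml.enabled true
        html.enabled false
        csv.enabled false
    }
}

After ./gradlew build jacocoTestReport the jar under build/libs is the same uninstrumented artifact a plain build would produce, and the coverage report ends up under build/reports/jacoco/test/.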
Related
I found that JaCoCo only supports two startup methods, offline and javaagent, but my project requires me to attach JaCoCo to an already running JVM. Is there any way to achieve this?
When looking at the JaCoCo source code, I saw that CoverageTransformer states that it does not support class retransformation, so I assume it is not feasible to inject JaCoCo through agentmain?
Hi all, I have built unit tests for my controller, repository and service, but my SonarQube code coverage always shows only 0 percent. My question is: how do I get the percentage up?
Here is what I coded in my tests:
SonarQube uses existing code coverage reports from JaCoCo (in the case of Java). Usually, you would set up the JaCoCo Maven plugin (or the Gradle plugin) to gather coverage info during the test run, and SonarQube then loads this report.
See also the SonarQube docs for info on the setup. If you are using Maven/Gradle, I believe SonarQube is able to pick up the correct file automatically unless you have some special configuration.
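For a Gradle build, a minimal sketch of that wiring could look as follows; the org.sonarqube plugin version is just a placeholder, and sonar.coverage.jacoco.xmlReportPaths is the property newer SonarQube versions read (older setups point SonarQube at the .exec file instead):

plugins {
    id 'java'
    id 'jacoco'
    id 'org.sonarqube' version '3.3'   // placeholder version
}

jacocoTestReport {
    reports {
        xml.enabled true   // SonarQube reads the XML report
    }
}

// Make sure the report exists before the analysis runs
// (the task is called 'sonar' in newer plugin versions).
tasks.named('sonarqube') {
    dependsOn jacocoTestReport
}

sonarqube {
    properties {
        property 'sonar.coverage.jacoco.xmlReportPaths', "$buildDir/reports/jacoco/test/jacocoTestReport.xml"
    }
}

Running ./gradlew test jacocoTestReport sonarqube should then make the coverage figures show up; 0 percent usually means the report was never generated or SonarQube does not find it at the configured path.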
I am currently building automated tests which will be generated dynamically in their own class (e.g. TestClass1234567890.class). From those tests (which I will run with the JUnit Platform Launcher) I want to generate a JaCoCo report.
Whenever I look up examples for the JaCoCo API, they only show it with a class that implements Runnable, which does not make sense since you'd want to run it on a test class.
Can anyone point me in the correct direction on how to combine the JaCoCo API with the JUnit Platform Launcher? Any resource/example would be appreciated.
You can use the JaCoCo report Maven plugin goal:
org.jacoco:jacoco-maven-plugin:0.8.7-SNAPSHOT:report
Together with its optional parameters you can configure the report the way you like; check the goal's documentation.
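If you run the build with Maven anyway, that report goal is the easiest route. An alternative that avoids touching the JaCoCo API at all is to attach the JaCoCo agent to whatever JVM starts the JUnit Platform Launcher. A Gradle sketch of that idea, where the task name and the launcher main class are hypothetical placeholders:

apply plugin: 'java'
apply plugin: 'jacoco'

// Hypothetical task that starts your own JUnit Platform Launcher with the generated test classes.
task runGeneratedTests(type: JavaExec) {
    classpath = sourceSets.test.runtimeClasspath
    mainClass = 'com.example.GeneratedTestLauncher'   // placeholder; use main = '...' on older Gradle
}

// Attach the JaCoCo agent to that JVM; execution data goes to build/jacoco/runGeneratedTests.exec by default.
jacoco.applyTo(runGeneratedTests)

// Build a coverage report from the execution data produced above.
task generatedTestsCoverage(type: JacocoReport) {
    dependsOn runGeneratedTests
    executionData runGeneratedTests
    sourceSets sourceSets.main
}

This way the launcher code stays plain JUnit Platform Launcher API, and JaCoCo only observes the process from the outside via the agent.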
We're using AspectJ in our project and also JaCoCo for the test coverage report. Currently we're facing an issue: because AspectJ changes the byte code during the compile phase, the code coverage report is not correct. One example: because AspectJ adds an extra if-else statement, the branch coverage shows something like 1/4 even though there is no conditional branch in the source code. Is there a good way to tell JaCoCo to ignore all code generated by AspectJ?
Thanks a lot.
I am copying here the answer I just wrote on the JaCoCo mailing list:
You have two options with AspectJ if you want to avoid having it compile from source:
Use load-time weaving (LTW) with the weaving agent.
Move your aspects into a separate Maven module. Compile your Java modules with the normal Maven Compiler Plugin and the aspect module with AspectJ Maven. Then create another module which just uses AspectJ Maven in order to do binary weaving on a Java module, using both previously created artifacts as dependencies. In this scenario you need to make sure that JaCoCo offline instrumentation is bound to a phase before binary weaving is done.
The easiest way out, though, would be to test your aspects in isolation, test the Java code without aspects, and measure coverage there without any issues.
#RajeshTV:
Instructions on how to use the clover-aspectj-compiler are here:
https://confluence.atlassian.com/display/CLOVER/Clover+AspectJ+Compiler
These instructions are valid for OpenClover as well. Just download:
org.openclover:clover-aspectj-compiler:1.0.0
org.openclover:clover:4.2.0
and the aspectjrt + aspectjtools JARs
Next call them like this:
java -cp "clover-4.2.0.jar:clover-aspectj-compiler-1.0.0.jar:aspectjrt.jar:aspectjtools.jar" com.atlassian.clover.instr.aspectj.CloverAjc -d <output directory> <list of files>
It will produce *.class files in the specified directory as well as a clover.db database.
You have to call the command above from your Maven build, for instance by using the exec:exec goal.
Please note that the clover-aspectj-compiler does not have a dedicated Maven plugin to do this automatically, so it's your job to write the whole plumbing.
Is it possible to check the quality of the *Test.java source code in Sonar, e.g. a maximum method size of 100 lines?
The problem is that the Java JUnit tests grow along with the production code, and so does their complexity.
We have unit test classes with more than 1000 lines and 2 methods.
We want Sonar to check some rules for these *Test.java classes.
Since Sonar 3.1, Sonar includes a plugin that has specific PMD rules to be executed against the unit tests (a JIRA ticket was created for that). You can see them under Configuration > Quality Profiles > Coding Rules.
However, it seems that you want to run a full analysis on the test source code, like you do on the production source code, and get additional metrics (e.g. a % rules compliance and also a % rules compliance for unit tests). I don't think that Sonar provides such a feature natively. What you can do is run 2 Sonar analyses:
Your first analysis is the current one;
The second analysis will consider src/test/java as the "production" source code. Thus, this second analysis will give you the quality of your test code. For this analysis, you can specify a specific Maven profile (or an alternative pom.xml) that changes the project information (e.g. it will indicate that src/test/java is the default sourceDirectory).
I also noticed that SonarQube by default ignores the test sources for quality analysis. Using schnatterer's answer, I found a simple way to create a separate project in SonarQube that only includes the test classes as sources, thereby triggering the quality analysis on them. In the POM of the project I want to analyze, I add a profile which changes the Sonar properties accordingly:
<profiles>
    <profile>
        <id>analyze-test-classes</id>
        <properties>
            <sonar.sources>src/test/java</sonar.sources>
            <sonar.tests></sonar.tests>
            <sonar.projectName>${project.name}-tests</sonar.projectName>
            <sonar.projectKey>${project.groupId}:${project.artifactId}-tests</sonar.projectKey>
        </properties>
    </profile>
</profiles>
Running Maven with
mvn sonar:sonar -Panalyze-test-classes
will then activate this profile and create an additional project in SonarQube with the suffix -tests, which only contains the analysis of the test classes.
With SonarQube 4.5.2 (I don't know when they changed the behavior) it seems to me that unit tests are no longer excluded from the analysis. When running sonar-runner with sonar.sources=src, Sonar also creates issues for src/test/java.
One approach to using a specific quality ruleset for test code would be to run two analyses: one for the main code and another one for the test code.
This can be realized as follows:
sonar-project.properties:
sonar.projectName=testSonar
sonar.projectKey=testsonar
sonar.sources=src/main/java
sonar.projectVersion=1.0
Analyse main code: sonar-runner
Analyse test code: sonar-runner -Dsonar.projectKey=testsonar.test -Dsonar.sources=src/test/java -Dsonar.projectName="testSonar TEST"
The different quality profiles must be changed via the server (Dashboard | Project Configuration | Quality Profiles), because -Dsonar.profile is deprecated.
This should also work with analyses run through Maven or Jenkins.
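If the analysis is driven by the SonarQube Gradle plugin rather than sonar-runner, the same overrides can be put into the build script for the test-code-only run; a sketch, assuming the org.sonarqube plugin is applied and reusing the keys from the example above (the analyzeTests property is just a hypothetical switch so the override is only active for the second analysis):

// Only configure the test-code-only analysis when -PanalyzeTests is passed.
if (project.hasProperty('analyzeTests')) {
    sonarqube {
        properties {
            property 'sonar.projectKey', 'testsonar.test'
            property 'sonar.projectName', 'testSonar TEST'
            property 'sonar.sources', 'src/test/java'
            property 'sonar.tests', ''
        }
    }
}

and then run ./gradlew sonarqube -PanalyzeTests for the second analysis.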