I have been working on setting up a code coverage metric in SonarQube for a Java project, using the JaCoCo plugin.
The project builds successfully (using Ant), and jacocoreport.xml and jacocoreport.csv are generated with the right covered/missed counter values for the covered and uncovered classes and methods. However, the SonarQube dashboard reports "0.0%" as the metric value even though there is a class file with a covered method, and it provides absolutely no information on which classes have what coverage. In fact, when I click the "0.0%" coverage link, I do not see (under "List" or "Tree") the folder containing the classes I have written JUnit tests for.
I checked with another developer from a different project who has worked on this before. She tells me that her project has around 2000 unit test methods and her jacoco.exec file is 6 MB. My project, on the other hand, has one or two unit test methods, and the file is 6 KB. I am not sure whether the jacoco.exec is being generated properly.
How do I make sure the dashboard gives information on the covered/uncovered classes? Could it be failing to report because the project has just one covered class file? I don't think that should be a problem, but I am not sure what's wrong.
NOTE: While running the Sonar Scanner locally, I noticed warnings saying that some of the class files could not be accessed through the ClassLoader. The issue persisted even after I added the sonar.libraries and sonar.binaries properties.
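For context, these scanner settings would typically live in sonar-project.properties. A minimal sketch for an Ant-built project is shown below; the project key and all paths are placeholders (not taken from the post), and the property names are the legacy pre-multi-language ones that match sonar.libraries/sonar.binaries as used above:

```properties
# sonar-project.properties -- hypothetical layout, adjust paths to your build
sonar.projectKey=my:project
sonar.sources=src
# compiled classes and third-party jars, needed for bytecode-based analysis
sonar.binaries=build/classes
sonar.libraries=lib/*.jar
# point the Java coverage engine at the JaCoCo execution data
sonar.java.coveragePlugin=jacoco
sonar.jacoco.reportPath=build/jacoco/jacoco.exec
```

If sonar.binaries does not point at the same class files that were executed when jacoco.exec was recorded, the coverage cannot be matched to the sources and shows up as 0%.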
Related
I am trying to measure unit test coverage of the java.lang classes with JaCoCo.
Context: What works so far
This is not really standard, but it is quite straightforward to copy the sources provided in OpenJDK-8-sources (on Ubuntu) into ${sourceDirectory} and the classes provided in rt.jar into ${project.build.outputDirectory}; the JaCoCo Maven plugin can then see them and generate suitable output for some classes.
For example, I get some coverage with sources for sun.reflect.ByteVectorFactory.
But I can't get coverage for the classes in java.lang. When calling Byte.toString(), the method is not covered in the report, and Byte.class does not show up in the data produced by classDumpDir.
I had a look at the JaCoCo sources to see whether java.lang is explicitly ignored; I didn't find anything obvious so far, but I am not at all familiar with the JaCoCo code base.
Actual question
What can be done to work around this limitation? I am thinking of the following possibilities:
It is a hard limitation of Java and nothing can be done about it
The exclusion of java.lang is hardcoded in JaCoCo because it is a system package name, but this can be changed
by setting some hidden option or environment variable
by overwriting some files in the classpath by providing a modified version of them
by changing the JaCoCo source code
EDIT
It seems I am not alone in trying to do this kind of thing.
Mailing list thread:
https://groups.google.com/g/jacoco/c/_tuoA7DHA7E/m/BQj53OvXoUsJ
Pull request on GitHub:
https://github.com/jacoco/jacoco/pull/49
Somewhere in the middle of this, someone mentions that
In particular JDK classes which are used by the Agent
itself can probably not be tracked.
So that may explain why some of the classes are not instrumented.
I did more tests and I see that
java.lang.Byte does not get coverage
java.lang.ProcessBuilder gets coverage
java.util.ArrayList does not get coverage
java.util.Calendar gets coverage
So it seems that classes that are already loaded when the agent loads cannot be instrumented in this way.
I extracted all the classes from rt.jar and instrumented them with offline instrumentation, then replaced the classes in rt.jar with their instrumented versions and included the JaCoCo agent on the bootstrap class path (using -Xbootclasspath).
Doing this, I get the following infinite recursion:
...
at java.lang.System.getProperties(System.java)
at org.jacoco.agent.rt.internal_f3994fa.Offline.getRuntimeData(Offline.java:36)
at org.jacoco.agent.rt.internal_f3994fa.Offline.getProbes(Offline.java:60)
at java.lang.System.$jacocoInit(System.java)
at java.lang.System.getProperties(System.java)
at org.jacoco.agent.rt.internal_f3994fa.Offline.getRuntimeData(Offline.java:36)
It should be possible to catch the recursion to prevent this, but that would clearly require some modifications to the JaCoCo source code; it is therefore not currently possible to have these classes instrumented.
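For illustration only, the kind of modification needed would be a per-thread reentrancy guard around the probe lookup, so that a nested call returns a fallback instead of recursing. A self-contained sketch of that pattern follows; all names are mine, not JaCoCo's:

```java
// Sketch of a per-thread reentrancy guard: if probe initialization is entered
// again on the same thread (as in the System.getProperties() cycle above),
// return a fallback instead of recursing. Names are illustrative, not JaCoCo's.
public class ReentrancyGuard {
    private static final ThreadLocal<Boolean> IN_LOOKUP =
            ThreadLocal.withInitial(() -> Boolean.FALSE);

    static boolean[] getProbes() {
        if (IN_LOOKUP.get()) {
            return new boolean[0]; // re-entered on this thread: break the cycle
        }
        IN_LOOKUP.set(Boolean.TRUE);
        try {
            // Simulate the runtime-data lookup itself calling instrumented
            // code (e.g. System.getProperties()), which calls back in here.
            boolean[] reentrant = getProbes();
            return new boolean[] { true };
        } finally {
            IN_LOOKUP.set(Boolean.FALSE);
        }
    }

    public static void main(String[] args) {
        // Terminates with a real probe array instead of a StackOverflowError.
        System.out.println("probes.length = " + getProbes().length);
    }
}
```

Wiring a guard like this into Offline.getProbes() would of course mean patching and rebuilding JaCoCo itself.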
I have some tests to check my code. I have generated my report in SonarCloud, but I have a problem: the coverage percentage also takes into account the test classes, which are obviously not covered by other tests. Is there any option to take into account all the classes except the test ones?
First of all, I would not adapt JaCoCo. You can exclude files from the coverage report within SonarQube/SonarCloud with the sonar.coverage.exclusions property, where you can e.g. specify a pattern like **/*Test.java to exclude all Java files whose names end with Test.
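For example, in sonar-project.properties, using the pattern suggested above:

```properties
# exclude test classes from the coverage computation only;
# they are still analyzed, but no longer drag the percentage down
sonar.coverage.exclusions=**/*Test.java
```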
Additionally, you can also set this up in the SonarQube/SonarCloud UI.
Side note: I would inspect the Sonar configuration; it looks to me as if test code is being provided to Sonar as normal sources. That would cause exactly this issue, whereas Sonar normally has its own property for configuring test code. See the analysis-parameters documentation for further details, or, if you use the Gradle or Maven plugins, check the respective documentation on how to organize the source code.
Is there a way to get Cobertura to gather test coverage across several .jar files? The problem I have is that, after a refactoring, classes that were covered while they lived in the same .jar are no longer reported as covered (since they are now in a separate .jar).
So the question: for an EAR project containing several source projects (.jars), is there a way to get the actual coverage for the EAR project instead of a sum of per-jar coverages?
Basically, the tests reflect behaviour, not code structure. Since we only changed the structure, the behaviour is unchanged; therefore the tests should not need to change, and since the tests are unchanged, the coverage should not change either.
You have two good options:
1) Have your Cobertura Maven plugin in a single (parent) POM and set the aggregate property to true, which should overlay your Cobertura reports on top of each other. See this blog post for an example.
2) If you only care about the report, use a report-aggregating tool such as Sonar, which gives you not only aggregated reports across the project but also a whole host of extra metrics and useful info.
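For option 1, the parent-POM configuration might look roughly like this sketch; the version number is a placeholder:

```xml
<!-- parent pom.xml: aggregate Cobertura reports across modules -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>cobertura-maven-plugin</artifactId>
  <version>2.7</version>
  <configuration>
    <aggregate>true</aggregate>
  </configuration>
</plugin>
```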
I have read that if you organize your project as a Maven multi-module project (one module per jar) you should be able to merge the Cobertura reports into one report, but I have never tried what is written on that page.
I have two different Java projects (I'll call them Project 1 and Project 2 for simplicity's sake) loaded into Eclipse, and Project 1 is added to the build path of Project 2. I have imported the only package in the src folder of Project 1 into a test class in Project 2, and within that test class I have simple object declarations of classes from Project 1, like this:
ProjectOneClass object = new ProjectOneClass();
This code compiles without error; the compiler recognizes that these classes are on the build path. When I run the code as a Java application via JUnit 4, though, the program throws ClassNotFoundException when it reaches these lines. The code is supposed to print these ClassNotFoundExceptions to the error log, but for some reason nothing is printed. I'm using a simple
try {
    ...
} catch (ClassNotFoundException e) {
    e.printStackTrace();
}
structure, so I don't know why it's not printing to the error log.
The JUnit printout is simply:
Junit execution complete
Summary: 4 succeeded, 3 failed.
The four that succeed do not reference the imported project 1 package.
I have tried all manner of changes to the build path configuration, and nothing has shown any promise of improvement. The only thing I can think of that might fix this is the order of the build path, as specified in the Order and Export tab of the build path configuration window. Right now the order is:
project 2 packages
EAR Libraries
JRE System Libraries
JUnit4
a few JAR files (c3p0, commons-codec, ojdbc6)
project 1
I don't know for sure if the problem lies here or elsewhere, though. If anyone can help me out with this, I'd be very grateful. Thanks for reading.
I figured out the issue. Project 1 relies on several JARs that Project 2 does not have on its build path. I assumed that, since Project 1 had those JARs on its own build path, everything would still work fine. However, when creating instances of classes from Project 1, I found that Project 2 required those JARs on its own build path as well.
This would make sense to me if the classes I was instantiating were implementations/extensions of a class from one of these JARs, but they weren't.
Regardless, by adding a couple of JARs to Project 2 I fixed the problem. Thanks everyone.
You should set the Run Configuration for your Project 2 class file.
In Eclipse, right-click, select Run As -> Run Configurations, and then, in the Classpath tab, make sure Project 1 is added by clicking Add Projects.
Let's narrow down the problem and see if it relates to a JUnit launch config, or a more general problem involving the two Eclipse projects.
Create a class in Project 2 with a main(String[]) method. In that method, reference one or more of the types in Project 1 that appear to be causing your CNFE's. Run this class using the Java Application launch configuration type. (I'm assuming from your post that you're having trouble with launches based on the "JUnit Test" Launch Configuration type.) Report back.
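As a starting point, a diagnostic class along these lines might help; the Project 1 class name below is a placeholder you would replace with one of the types triggering your CNFEs. It also distinguishes a class that is simply absent from one that was found at compile time but has a missing runtime dependency:

```java
// Hypothetical diagnostic class for Project 2 (names are placeholders, not
// taken from the post). Probes whether a type is visible on the runtime
// classpath, distinguishing ClassNotFoundException (type never found) from
// NoClassDefFoundError (type found, but one of its dependencies is missing).
public class ClasspathProbe {
    static String probe(String className) {
        try {
            Class.forName(className);
            return "resolved";
        } catch (ClassNotFoundException e) {
            return "not on runtime classpath";
        } catch (NoClassDefFoundError e) {
            return "linked but a dependency is missing";
        }
    }

    public static void main(String[] args) {
        // A JDK class resolves everywhere; use it as a sanity check.
        System.out.println("java.util.ArrayList -> " + probe("java.util.ArrayList"));
        // Substitute a real Project 1 type here, e.g. your ProjectOneClass.
        System.out.println("com.example.ProjectOneClass -> "
                + probe("com.example.ProjectOneClass"));
    }
}
```

If the probe resolves the class under a plain Java Application launch but the JUnit launch still fails, the problem is specific to the JUnit launch configuration's classpath.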
Also, I've seen some funny behavior using the JUnit Test launch configuration type with JUnit 4 tests of late, although I don't think it's totally broken. So you could try a JUnit 3 configuration.
I'm writing an Ant script to do some additional checks on my Play! Framework Application.
Currently, I am executing my tests from my Ant script by simply making an exec call to "play auto-test".
<exec executable="${play.dir}/play.bat">
<arg line="auto-test"/>
</exec>
Would anyone know how to integrate Clover into a Play test suite? Obviously I am not tied to having to run my tests using the above.
I also tried writing the Ant script the traditional way of executing JUnit tests (i.e. using Ant's junit task) and ran into two problems:
When executing all my tests, only the first would execute successfully, while the others would fail for strange reasons
If I expose just one test in my suite and have it run successfully, Clover says I have code coverage of 0%. I then thought I had set up Clover incorrectly; however, I created a simple class that tested a production class that did nothing, and the coverage went up as I would expect.
So if I were to go down the junit route, I would need to know how to execute all my tests so that they run one after another successfully (it works when running them the Play way with play auto-test), and I would need to know why Clover does not seem to pick up lines of code touched by Play tests.
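For reference, the "traditional" Ant invocation described above would look something like the following sketch; the paths, fork settings, and report directory are assumptions, not taken from the actual build:

```xml
<!-- run all tests in forked JVMs, with clover.jar on the classpath -->
<junit fork="yes" forkmode="once" printsummary="on" haltonfailure="no">
  <classpath>
    <pathelement location="build/classes"/>
    <pathelement location="lib/clover.jar"/>
    <fileset dir="lib" includes="*.jar"/>
  </classpath>
  <formatter type="xml"/>
  <batchtest todir="reports">
    <fileset dir="build/test-classes" includes="**/*Test.class"/>
  </batchtest>
</junit>
```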
(I know there is a Cobertura module for Play; however, I find that Clover gives me a more accurate coverage figure.)
Update: Unfortunately I am unable to reproduce the exact error I was getting before, because I now run into compilation issues when compiling manually. I have started using the Secure module, which contains only Java source files. In my Ant script I call play precompile, which produces the bytecode for the Secure module (as well as everything else in the system, including my own code). So when I then try to compile my app code with Clover, I think the compiler gets into a bit of a tangle, because there are two versions of my classes: one produced by the precompile command (non-Clover) and one produced by my own Ant compilation (with Clover):
[javac] C:\projects\testproject\out\clover\classes\clover8583527041716431332.tmp\model\HouseTest.java:45: incompatible types
[javac] found : play.db.jpa.JPABase
[javac] required: models.House
[javac] __CLR2_5_115y15ygoxiz3dv.R.inc(1527);House found = House.findById(id);
So I essentially have two problems now:
I need to be able to compile source code that depends on Play-provided modules (e.g. CRUD, Secure) which do not ship with compiled versions; hence my attempt to get around this by calling play precompile myself in my Ant script
Once I get compilation working, I will undoubtedly hit my original issue again of not being able to execute the tests using the junit task.
Update #2: It turns out that the error was because the findById call requires a cast from JPABase to House (not that the IDE or Play seemed to care about it). So after I went in and added a cast for all of Play's find* methods, I actually got JUnit and Clover reports! However, I am now getting two kinds of errors:
Entity classes in Play can be created by extending the Model class, which provides magic methods such as the find methods mentioned above, as well as the count method. The Model superclass actually extends GenericModel, which implements those methods by throwing an UnsupportedOperationException, so Play obviously does some more magic behind the scenes to provide actual implementations. Unfortunately, my tests (and production code) rely on methods such as count, and in my Ant/JUnit scenario they throw that exception (note: everything works fine when running play auto-test).
The other error is due to the fact that I use the Spring module. In one of my classes (the root class), I call Spring.getBeanOfType(Some.class). I use auto-scanning, but in the Ant/JUnit testing environment the Spring module has not yet set up my Spring container, and therefore the call just returns null.
I have a feeling there is one magic fix that will resolve both of my issues however I am unsure what that magic fix is.
Clover does source-level instrumentation, so it needs the source code available. Anything that generates bytecode before Clover is activated will not be "clovered".
Clover for Ant intercepts the Ant compiler calls, so if you do a <clover-setup/> before any other compilation task in your Ant script, everything should be instrumented by Clover.
You can execute the resulting compiled code in any way you want, e.g. executing from script or from junit, it does not matter, as long as the code is instrumented (and of course clover.jar is available in the classpath).
Clover hard-codes the location of the clover-database into the instrumented code, so you do not have to specify anything when executing.
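A minimal Ant sequence following this advice might look like the following sketch; the clover.jar location, the initString path, and the classpath reference are placeholders:

```xml
<!-- register the Clover Ant tasks and activate instrumentation -->
<taskdef resource="cloverlib.xml" classpath="lib/clover.jar"/>
<clover-setup initString="build/clover/clover.db"/>
<!-- any <javac> after this point is intercepted and instrumented -->
<javac srcdir="src" destdir="build/classes" classpathref="compile.path"/>
```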
It would really help if you could outline how you are using Clover. You could also recheck the Clover documentation at http://confluence.atlassian.com/display/CLOVER/1.+QuickStart+Guide.