Unit test coverage on java.lang classes with JaCoCo

I am trying to get unit test coverage on the java.lang classes with JaCoCo.
Context: What works so far
This is not really standard, but it is quite straightforward to copy the sources provided by OpenJDK-8-sources (on Ubuntu) into ${sourceDirectory} and the classes provided in rt.jar into ${project.build.outputDirectory}; the JaCoCo Maven plugin can then see them and generate suitable output for some classes.
For example, I get some coverage with sources for sun.reflect.ByteVectorFactory.
But I can't get coverage for the classes in java.lang. When calling Byte.toString(), the method is not covered in the report, and Byte.class does not show up in the data produced by classDumpDir.
I had a look at the JaCoCo sources to see whether java.lang is explicitly ignored and didn't find anything obvious so far, but I am not at all familiar with the JaCoCo code base.
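For context, the kind of call being measured is nothing more exotic than an ordinary unit test touching a JDK class; a trivial, hypothetical JUnit 4 example (not taken from the actual project):

import static org.junit.Assert.assertEquals;

import org.junit.Test;

// Hypothetical test whose only purpose is to execute a method of a java.lang
// class, so that JaCoCo has something to attribute to java.lang.Byte.
public class ByteToStringTest {

    @Test
    public void byteToString() {
        assertEquals("42", Byte.toString((byte) 42));
    }
}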
Actual question
What can be done to work around this limitation? I am thinking of the following possibilities:
It is a hard limitation of Java and nothing can be done about it
The exclusion of java.lang is hardcoded in JaCoCo because it is a system package name, but this can be changed
by setting some hidden option or environment variable
by overwriting some files on the classpath with a modified version of them
by changing the JaCoCo source code
EDIT
It seems I am not alone in trying to do this kind of thing.
Mailing list thread:
https://groups.google.com/g/jacoco/c/_tuoA7DHA7E/m/BQj53OvXoUsJ
Pull request on GitHub:
https://github.com/jacoco/jacoco/pull/49
Somewhere in the middle of this thread, someone mentions that
In particular JDK classes which are used by the Agent
itself can probably not be tracked.
That may explain why some of the classes are not instrumented.

I did more tests and I see that
java.lang.Byte does not get coverage
java.lang.ProcessBuilder gets coverage
java.util.ArrayList does not get coverage
java.util.Calendar gets coverage
So it seems that classes that are already loaded when the agent loads cannot be instrumented in this way.
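One way to verify that hypothesis is a tiny diagnostic agent that lists which JDK classes are already loaded by the time an agent's premain runs; anything printed there was loaded before any -javaagent had a chance to instrument it at class-load time. This is only a sketch (the class name and manifest wiring are mine, it is not part of JaCoCo), using the standard java.lang.instrument API:

import java.lang.instrument.Instrumentation;

// Hypothetical diagnostic agent: package it with a manifest entry
// "Premain-Class: LoadedClassesAgent" and run it alongside the JaCoCo agent,
// e.g. -javaagent:loaded-classes-agent.jar -javaagent:jacocoagent.jar=...
public final class LoadedClassesAgent {

    public static void premain(String args, Instrumentation inst) {
        for (Class<?> c : inst.getAllLoadedClasses()) {
            String name = c.getName();
            // Only report the packages discussed above.
            if (name.startsWith("java.lang.") || name.startsWith("java.util.")) {
                System.out.println(name + " already loaded, modifiable="
                        + inst.isModifiableClass(c));
            }
        }
    }
}

If the conclusion above is right, this listing should include java.lang.Byte and java.util.ArrayList but not, for example, java.util.Calendar, matching the coverage results observed.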
I extracted all the classes from rt.jar and instrumented them with offline instrumentation, then replaced the originals in rt.jar with the instrumented versions and put the JaCoCo agent runtime on the bootstrap class path (using -Xbootclasspath).
Doing this, I get the following infinite recursion:
...
at java.lang.System.getProperties(System.java)
at org.jacoco.agent.rt.internal_f3994fa.Offline.getRuntimeData(Offline.java:36)
at org.jacoco.agent.rt.internal_f3994fa.Offline.getProbes(Offline.java:60)
at java.lang.System.$jacocoInit(System.java)
at java.lang.System.getProperties(System.java)
at org.jacoco.agent.rt.internal_f3994fa.Offline.getRuntimeData(Offline.java:36)
It should be possible to detect and break the recursion, but that will clearly require some modifications to the JaCoCo source code, so it is currently not possible to have these classes instrumented.
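The kind of guard hinted at here would be a re-entrancy check in the offline runtime: if the probe lookup is entered again on the same thread (because the lookup itself calls into an instrumented JDK class such as java.lang.System), hand back a throwaway probe array instead of recursing. A rough sketch of that idea only, not actual JaCoCo code (names and signature are illustrative):

// Illustrative re-entrancy guard; a real fix would have to live inside the
// Offline class that the generated $jacocoInit methods call.
public final class GuardedProbeLookup {

    // Tracks whether the current thread is already inside the probe lookup.
    private static final ThreadLocal<Boolean> IN_LOOKUP =
            ThreadLocal.withInitial(() -> Boolean.FALSE);

    public static boolean[] getProbes(long classId, String className, int probeCount) {
        if (IN_LOOKUP.get()) {
            // Re-entered via an instrumented JDK class (e.g. java.lang.System):
            // return a dummy array so execution can continue; hits recorded in
            // it are simply lost.
            return new boolean[probeCount];
        }
        IN_LOOKUP.set(Boolean.TRUE);
        try {
            return lookUpRealProbes(classId, className, probeCount);
        } finally {
            IN_LOOKUP.set(Boolean.FALSE);
        }
    }

    private static boolean[] lookUpRealProbes(long classId, String className, int probeCount) {
        // Placeholder for the real runtime-data lookup (the part that currently
        // ends up calling System.getProperties() and triggers the recursion).
        return new boolean[probeCount];
    }
}

Note that the guard itself uses java.lang.ThreadLocal, so in a build where every JDK class is instrumented it would need the same care, which is part of why this looks like a change inside JaCoCo rather than a drop-in workaround.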

Related

How to exclude test classes from Coverage Analysis using Jacoco

I have some tests to check my code. I have generated my report in SonarCloud, but I have a problem: the coverage percentage also takes the test classes into account, which are obviously not covered by other tests. Is there any option to take into account all the classes except the test ones?
First of all, I would not adapt JaCoCo; you can exclude files from the coverage report within SonarQube/SonarCloud with the sonar.coverage.exclusions property, where you can e.g. specify a pattern like **/*Test.java to exclude all Java files whose names end with Test.
Additionally, you could also set this up within the UI.
Side note: I would inspect the Sonar configuration; to me it looks like the test code is provided to Sonar as normal sources, which would cause exactly this issue. Normally Sonar has its own property for configuring test code; see the analysis parameters documentation for further details, or, if you use the Gradle or Maven plugins, check the respective documentation on how to organize the source code.

Java 9 module issue with unit tests in IntelliJ: Cannot require JUnit Jupiter [duplicate]

Context
I have a project with the following traits
IntelliJ Ultimate 2020.1
Java 13 (with module-info.java)
Kotlin 1.3.72
JUnit (+ truth)
maven (I believe this to be unimportant)
The code base is mixed: some classes are written in plain Java, others in Kotlin, and the same is true for tests. Everything works as expected, that is
all code is compiled in proper order and fully interoperable between Kotlin and Java
all tests can be executed using either mvn test or IntelliJ "Run Test"
the resulting jar can be run (for the sake of providing context)
but...
apart from the fact that everything works, IntelliJ warns me about an undeclared module dependency, but only if the test class is written in Kotlin. This warning is not displayed for test classes written in plain Java.
The warning:
Error:(9, 6) Symbol is declared in module 'org.junit.jupiter.api' which current module does not depend on
That warning normally allows one to import / require the respective module / dependency, but no solution is offered in the [alt]+[enter] dialog.
Things I have tried so far:
upgrading from JUnit 4 to 5 didn't change the situation
googling to no avail :(
making sure tests written in Kotlin are really executed when mvn test is run by making a test fail
manually running test using IntelliJ "Run Test"
converting tests back and forth from / to Kotlin
explicitly requiring the JUnit API / truth in module-info
The latter obviously prevents the warning but is no solution, since it actually produces a hard dependency. From what I found out while googling, the maven-surefire-plugin makes sure the test dependencies are included. Also: running mvn test works like a charm as stated above, so this does not seem to be part of the problem.
Seeing all the red lines when writing tests is really annoying...
(Screenshots: the suspect behavior in the Kotlin test; the same test in Java, where everything is fine.)
Question:
How can I fix that warning for Kotlin Test Classes in IntelliJ?
Note
I have come to believe this is a bug in IntelliJ but I'd be happy to be shown what I overlooked.
Since everything from compiling to running with Maven works like a charm, I excluded details regarding project structure and so on. The issue at hand is about the warning in IntelliJ, not about a broken build or non-functional jars. I'll gladly add those details in case they turn out to be necessary.
Also, since everything actually works (apart from the annoying warning), I really don't know where to continue researching and hence created this question.
This is a bug in IDEA Kotlin plugin error highlighting: https://youtrack.jetbrains.com/issue/KT-26037
Workaround: add @file:Suppress("JAVA_MODULE_DOES_NOT_DEPEND_ON_MODULE") to the test file before the package declaration.

SonarQube not displaying classes under coverage

I have been working on setting up code coverage metric on SonarQube for a Java project using jacoco plugin.
The project builds successfully (using ant), jacocoreport.xml and jacocoreport.csv files get generated with the right information on covered/missed counter values for the covered/uncovered classes and methods. However, the dashboard on SonarQube reports "0.0%" as the metric value even when there is a single class file which has a covered method, and it provides absolutely NO information on "what classes have what coverage". In fact, upon clicking "0.0%" coverage link, I do not see (under "List" or "Tree") the particular folder which contains the classes I have written jUnit Tests for.
I checked with another developer from a different project who has worked on this before. She tells me that her project has around 2000 Unit Test methods and her jacoco.exec file size is 6MB. My project, on the other hand, has 1 to 2 Unit Test method(s) and the file-size is 6KB. Not sure if the jacoco.exec is generated properly.
How do I make sure the dashboard gives information on the covered/uncovered classes? Do you think it's not reporting because the project has just one covered class file? I don't think this should be a problem, but I am not sure what's wrong.
NOTE: While running the Sonar Scanner locally, I noticed warnings saying that some of the class files couldn't be accessed through the ClassLoader. This issue didn't get resolved even after I added the Sonar.Libraries and Sonar.Binaries properties (please ignore the uppercase in this text).

Cobertura, coverage over multiple jar-files

Is there a way to get Cobertura to gather test coverage over several .jar files? The problem I have is that, after a refactoring, classes that were covered while they lived in the same .jar are no longer reported as covered, since they are now in a separate .jar.
So, the question: for an ear project containing several source projects (.jar), is there a way to get the actual coverage for the ear project instead of a sum of per-.jar coverages?
Basically, the tests reflect behaviour, not code structure. Since we only changed the structure, the behaviour isn't changed. Therefore the tests should not need to change, and since the tests are not changed, the coverage should not change.
You've got 2 good options:
1) Have your cobertura maven plugin in a single (parent) pom and set the aggregate property to true, which should overlay your cobertura reports on top of each other. See this blog post as an example.
2) If you only care about the report, use a report aggregating tool such as Sonar to not only give you aggregated reports across the project, but a whole host of extra metrics and useful info.
I have read that if you organize your project as a Maven multi-module project (one module for each jar) you should be able to merge the Cobertura reports into one report, but I have never tried what is written on that page.

Performing code coverage using Clover on a Play! Framework Application using Ant

I'm writing an Ant script to do some additional checks on my Play! Framework Application.
Currently, I am executing my tests from my Ant script by simply making an exec call to "play auto-test".
<exec executable="${play.dir}/play.bat">
    <arg line="auto-test"/>
</exec>
Would anyone know how to integrate Clover into a Play test suite? Obviously I am not tied to having to run my tests using the above.
I also tried writing the Ant script using the traditional way of executing JUnit tests (i.e. using Ant's junit target) and I got two problems:
When executing all my tests, only the first would execute successfully while the others would fail for strange reasons
If I just expose one test in my suite and have the test run successfully, it says I have code coverage of 0%. I then thought I had set up Clover incorrectly; however, I created a simple test class for a production class that did nothing, and the coverage went up as I would expect.
So if I were to go down the junit route, I would need to know how to execute all my tests so that they can run one after another successfully (it works when using the Play way of running play auto-test) and I would need to know why Clover does not seem to pick up lines of code touched by Play tests.
(I know there is a Cobertura module for Play, however, I find that Clover does a better job telling me an accurate coverage figure)
Update: Unfortunately I am unable to replicate the exact error I was getting before as I have run into compilation issues when I try and compile things manually. I've started to use the Secure module and it only contains Java source files. So in my Ant script, I call play precompile which produces the byte code for the Secure module (as well as everything else in the system including my code). So now when I try and compile my app code using Clover, I think the compiler gets into a bit of a tangle as I have two versions of my classes - one produced by the precompile command (non-clover) and one produced by my own ant compilation (with clover):
[javac] C:\projects\testproject\out\clover\classes\clover8583527041716431332.tmp\model\HouseTest.java:45: incompatible types
[javac] found : play.db.jpa.JPABase
[javac] required: models.House
[javac] __CLR2_5_115y15ygoxiz3dv.R.inc(1527);House found = House.findById(id);
So I essentially have two problems now:
I need to be able to compile my source code that also depends on Play-provided modules (e.g. CRUD, Secure) which do not come with compiled versions, hence my attempt at getting around it by calling play precompile myself in my Ant script
Once I get compilation working, I will undoubtedly have my original issue again of not being able to execute the tests using the junit target.
Update #2: It turns out that the error I got was due to the findById call requiring a cast from JPABase to House (not that the IDE or Play seemed to care about it). So after I went in and put a cast on all of Play's "find*" calls (see the short sketch after this update), I actually got JUnit and Clover reports! However... I am now getting two kinds of errors:
Entity classes in Play can be created by extending the Model class, which provides magic methods such as the find methods mentioned before as well as the count method. The Model class actually extends GenericModel, which implements those methods by throwing an UnsupportedOperationException. So obviously Play! does some more magic behind the scenes to provide actual implementations of these methods. Unfortunately, my tests (and production code) rely on methods such as count, but those methods throw the exception in my ant/junit scenario (note: everything works fine when running play auto-test).
The other error I am getting is due to the fact that I use the Spring module. In one of my classes (the root class), I call Spring.getBeanOfType(Some.class). Now I use auto-scanning, but in the ant/junit testing environment, the Spring module has yet to set up my spring container and therefore the call just returns null.
I have a feeling there is one magic fix that will resolve both of my issues however I am unsure what that magic fix is.
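For reference, the cast described in Update #2 amounts to something like the following, House being the entity from the javac error above (with the Play enhancer and play auto-test the cast is not needed, which is why the IDE never complained):

// Explicit cast that lets the plain-javac/Clover build compile:
House found = (House) House.findById(id);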
Clover does source level instrumentation, so it needs source code available. Everything you do before activating clover that generates bytecode will not be "clovered".
Clover for Ant intercepts Ant compiler calls, so if you do a <clover-setup/> before any other compilation tasks in your Ant script, everything should be instrumented by Clover.
You can execute the resulting compiled code in any way you want, e.g. executing from script or from junit, it does not matter, as long as the code is instrumented (and of course clover.jar is available in the classpath).
Clover hard-codes the location of the clover-database into the instrumented code, so you do not have to specify anything when executing.
It would really help if you could outline how you are using Clover; you could also recheck the Clover documentation at http://confluence.atlassian.com/display/CLOVER/1.+QuickStart+Guide.
