We have a single EvoSuite-generated unit-test .java file, plus other hand-crafted unit-test .java files. We run a build on our server and all unit tests (EvoSuite and hand-crafted) are executed. We then go into our Sonar dashboard and look specifically at the coverage. There is coverage reported by Sonar/JaCoCo from the hand-crafted unit tests. However, the .java file that the EvoSuite tests were meant to cover still shows 0% coverage, even though another panel in the Sonar/JaCoCo dashboard shows that the 16 EvoSuite tests were executed and how long they took in ms.
The server where the builds take place was configured for us and we do not manage it; the Sonar/JaCoCo admin setup was done by others.
I'm at a loss as to why our hand-crafted tests (some use @RunWith(JMockit.class) and some use no RunWith annotation at all) show up with coverage percentages and the EvoSuite tests do not.
Thanks,
Jim
Yes, that is an issue that has been reported a few times now... so I have just added some documentation about it :-) at:
http://www.evosuite.org/documentation/measuring-code-coverage/
There you can read why you get 0% coverage, and possible workarounds for it.
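For reference, the gist of the problem is that EvoSuite-generated tests load the class under test through their own instrumenting classloader, so the JaCoCo-instrumented version of the class is never exercised. A minimal sketch of the workaround described on that page, assuming a typical generated test (the class names and the exact parameter list come from typical generated scaffolding and may differ in your EvoSuite version):

import org.junit.runner.RunWith;
import org.evosuite.runtime.EvoRunner;
import org.evosuite.runtime.EvoRunnerParameters;

// separateClassLoader is true in the generated file; setting it to false
// makes the test exercise the JaCoCo-instrumented class, so coverage gets
// recorded. MyClass_ESTest_scaffolding is the generated scaffolding class.
@RunWith(EvoRunner.class)
@EvoRunnerParameters(separateClassLoader = false, resetStaticState = true,
                     useVFS = true, mockJVMNonDeterminism = true)
public class MyClass_ESTest extends MyClass_ESTest_scaffolding {
    // ... generated test methods unchanged ...
}

Alternatively, the tests can be regenerated with the corresponding EvoSuite option so the flag is false from the start; see the linked page for the exact option name in your release.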
Related
I have been working on setting up a code-coverage metric on SonarQube for a Java project using the JaCoCo plugin.
The project builds successfully (using ant), and the jacocoreport.xml and jacocoreport.csv files get generated with the right covered/missed counter values for the covered and uncovered classes and methods. However, the dashboard on SonarQube reports "0.0%" as the metric value even though there is a single class file with a covered method, and it provides absolutely NO information on which classes have what coverage. In fact, upon clicking the "0.0%" coverage link, I do not see (under "List" or "Tree") the particular folder containing the classes I have written JUnit tests for.
I checked with another developer from a different project who has worked on this before. She tells me that her project has around 2000 unit-test methods and her jacoco.exec file is 6 MB. My project, on the other hand, has 1 to 2 unit-test methods and the file is 6 KB. So I am not sure whether the jacoco.exec is being generated properly.
How do I make sure the dashboard gives information on the covered/uncovered classes? Do you think it's not reporting because the project has just one covered class file? I don't think that should be a problem, but I'm not sure what's wrong.
NOTE: While running the Sonar Scanner locally, I noticed warnings saying some of the class files couldn't be accessed through the ClassLoader. This issue didn't get resolved even after I added the sonar.libraries and sonar.binaries properties.
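As a sketch of the kind of scanner invocation involved: those ClassLoader warnings usually mean the scanner cannot see the compiled bytecode, and without it coverage cannot be matched to sources. The property names below are the ones used by more recent SonarJava versions (older scanners used sonar.binaries / sonar.libraries), and all paths are hypothetical placeholders for the project layout:

# sonar.java.binaries must point at the compiled .class files, and
# sonar.jacoco.reportPath at the .exec file the ant build produces.
sonar-scanner \
  -Dsonar.projectKey=my.project.key \
  -Dsonar.sources=src \
  -Dsonar.java.binaries=build/classes \
  -Dsonar.java.libraries='lib/*.jar' \
  -Dsonar.jacoco.reportPath=build/jacoco/jacoco.exec

If the warnings disappear and the .exec path is right, the dashboard should list per-class coverage regardless of whether one class or hundreds are covered.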
I have integrated JaCoCo into a Maven project.
When computing coverage for a particular class, the report shows the code-coverage percentage for each method in that class as close to 40%-50%, even though the coverage is 100%. This is the case with multiple other Java classes as well.
Any idea why this happens? I'd appreciate your help in solving this.
My co-worker found this morning that compiling a project with Cobertura enabled changes the Sonar results for the same project.
On this particular project we ran a build with sonar:sonar and then ran it again with cobertura:cobertura sonar:sonar.
The Sonar results in the comparison now show that without Cobertura we have 7/78/153/24/0 violations across the five severities, but with Cobertura this changes to 7/81/94/24/0; in particular, it finds 3 new critical violations and 15 new major violations that aren't found without Cobertura.
One of the biggest changes is that without Cobertura there are 60 violations of the rule against empty methods (many of them constructors), while with Cobertura only 3 of those are reported.
If Cobertura only prevented violations from being found, we could run the two independently; but since some violations are only found with Cobertura enabled, it seems we would have to do two separate Sonar analyses.
Is this a known interaction? Is there any workaround other than doing Cobertura and Sonar in separate builds and using both sets of results to get the best data?
Based on the comment you made, let me explain what seems to be happening:
You are using FindBugs via SonarQube (the rules you mention are FindBugs rules).
First, let's think about the two tools involved here and how they (roughly) work:
FindBugs: a static analysis tool that works on bytecode; it reads the bytecode and raises an issue when it detects a bad pattern.
Cobertura: a coverage tool. How does it work? It instruments the bytecode to place probes, and when the tests are run it keeps track of which probes were hit.
Now you can see where the issue might be: FindBugs ends up analyzing the bytecode instrumented by Cobertura. That explains why you get some new issues, and why some of the empty-method issues go away when analyzing with Cobertura: once probes are injected, those methods are no longer empty.
To avoid this issue you have to make sure your bytecode files are not instrumented when FindBugs analyzes them. But (disclaimer: I develop the Sonar Java plugin, so I might be a little biased here ;)) I would recommend you stop using FindBugs in favor of the SonarQube Java Analyzer, which won't have this issue since it approaches analysis slightly differently (see this blog post about that).
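A minimal sketch of the "separate builds" workaround mentioned in the question, assuming standard cobertura-maven-plugin and sonar-maven-plugin goals (the exact sequence is an assumption about the build, not a confirmed recipe):

# Build 1: coverage only, on instrumented bytecode
mvn clean cobertura:cobertura
# Build 2: recompile from scratch so Sonar/FindBugs analyzes
# clean, uninstrumented bytecode
mvn clean install sonar:sonar

The point is simply that FindBugs never sees classes Cobertura has rewritten; you would then combine the two sets of results, as the question anticipates.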
User error. :-(
It turns out that the user had run mvn clean prior to running sonar:sonar with Cobertura, so, as implied by benzonico, the FindBugs rules that have to analyze compiled code didn't run. Only the rules that run on source code, like the Java plugin's, generated results. That's why we were missing a bunch of rules and results.
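A short sketch of the takeaway, hedged since the exact goals in the original build aren't shown: make sure the project is compiled in the same working tree before the analysis runs.

# Compile (and package) first so the bytecode-based FindBugs rules
# have classes to analyze, then run coverage and analysis.
mvn clean install
mvn cobertura:cobertura sonar:sonar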
We still have inconsistencies between Bamboo and manual builds, but that would be a topic for a separate post.
As discussed in Open JaCoCo report in IntelliJ IDEA, when I gather code-coverage statistics using JaCoCo (rather than native IntelliJ tracing), 0.0% (i.e. "no" coverage) is always shown in the coverage window.
This is after I have done the whole "Analyze -> Show coverage data..." and selected my generated "jacoco.exec" file.
The same "jacoco.exec" file works fine with other tools, such as JaCoCo's native report-generation task and Sonar, and these produce the expected coverage report content.
Native IntelliJ instrumentation also works fine.
Can anybody tell me whether there is some essential step I must perform to get IntelliJ to accept my JaCoCo coverage?
When you add a class to the PowerMockito annotation @PrepareForTest on a test class, the corresponding class will not show any code coverage.
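For anyone hitting this, the pattern looks like the sketch below (class names are hypothetical). PowerMock loads classes listed in @PrepareForTest through its own instrumenting classloader, so JaCoCo's on-the-fly instrumentation never sees them:

import org.junit.Test;
import org.junit.runner.RunWith;
import org.powermock.core.classloader.annotations.PrepareForTest;
import org.powermock.modules.junit4.PowerMockRunner;

@RunWith(PowerMockRunner.class)
@PrepareForTest(PriceService.class) // PriceService will report 0% in JaCoCo
public class PriceServiceTest {

    @Test
    public void computesPrice() {
        // ... test body using PowerMockito.mockStatic(...) etc. ...
    }
}

The workaround commonly cited in the PowerMock documentation is to switch JaCoCo to offline instrumentation for such tests.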
Building the source code through IntelliJ before importing jacoco.exec might solve this 0%-coverage problem.
For me the solution was simply to add the root package (in my case "de", in your case probably "com") to "Packages and classes to include in coverage data".
Now I see reasonable code coverage.
I guess you are trying to see test coverage the same way it is shown by TeamCity (JaCoCo).
If so, there is a simple workaround to check all missed branches (just the general statistic of which branches were visited and how many times).
That's all my advice, folks.
"intellij idea" seems to have no way of showing correct coverage value of jacoco report created. It is indeed misleading and unfair when it shows coverage as 0.0 instead of giving an unsupport format error.
However, as an alternative, we can push jacoco report (created as part of maven build) to the sonar(qube) server using maven-sonar-plugin's target, sonar:sonar
mvn clean install sonar:sonar -Dsonar.host.url=http://<sonar-host>:9000 -Dsonar.projectKey=<sonar project key> -Dsonar.branch=<sonar project branch> -Dsonar.login=<sonar user> -Dsonar.password=<sonar pwd>
sonar.projectKey and sonar.branch properties value can be retrieved from corresponding project created in sonarqube.
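If the report is not picked up automatically, its location can also be passed explicitly. The property below is the one used by older SonarQube versions for .exec files (newer versions expect an XML report via sonar.coverage.jacoco.xmlReportPaths), and the path shown is the Maven default, so treat both as assumptions about the setup:

mvn sonar:sonar -Dsonar.jacoco.reportPath=target/jacoco.exec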
Eclipse supports incremental compiling: if I save a source file, it compiles the modified files.
Is it possible, after such an incremental compile, to also run the JUnit tests of the same package and show the failures in the error view? Then I could see failing JUnit tests and compile errors in the same view without any extra action. Are there any plugins that can do this?
You should look at these plugins:
JUnit Max: not free, developed by Kent Beck (one of the men behind the TDD practice);
MoreUnit: free, but essentially dedicated to helping you write the tests;
Infinitest: now free; this plugin is dedicated to running the tests related to the files you have just modified.
So regarding your needs, I suggest you install the MoreUnit and Infinitest plugins.
Use an External Tool Builder.
It can be triggered by source modification.
This is a customized-build feature of Eclipse (integrating an external tool builder) that may meet your need, but it takes extra effort to write the scripts, and I have never used it. Automatically running the test cases is not that convenient anyway; at least for me, a single click to see the green bar in Eclipse is enough :)
You can run all tests in a project using Alt+Shift+X, T. I think that making it any more automated than this could take a serious performance toll: incremental compilation compiles at most one file at a time, but you're talking about running potentially hundreds of tests.