Configure Sonar 6.2 code coverage on a Maven project - java

I have a pretty standard Maven multi-module project (with JUnit, Arquillian and Selenium tests). I have SonarQube 6.2 installed on a server, and for my project the Code Coverage metric on Sonar shows 0.0%. I know that is wrong, as I do have some test coverage.
I found the Generic Test Data documentation page, which explains that since 6.2 Sonar supports code coverage out of the box and that I have to pass a comma-delimited list of report paths to the sonar.coverageReportPaths parameter (provided either in my pom or on the command line, I guess).
I'm fine with that, but I cannot find an example of how to set this up for a fairly classical Java project. What kind of file do I need to put in the list? The relative paths to each of my Surefire/Failsafe reports? Do I need to generate JaCoCo reports in addition? Can I give a "generic" path like report.xml if all of my reports have the same name?
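From what I can tell, the files that page describes are not Surefire/Failsafe or JaCoCo reports but XML files in Sonar's own generic coverage format, roughly like this (just a sketch I pieced together from the docs; the path and line numbers are made up):

    <coverage version="1">
      <file path="src/main/java/com/example/MyService.java">
        <lineToCover lineNumber="12" covered="true"/>
        <lineToCover lineNumber="20" covered="false" branchesToCover="2" coveredBranches="1"/>
      </file>
    </coverage>

That makes me wonder whether I am expected to generate such files myself or whether my existing reports can be converted or used directly.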

For a standard Java project it is probably easiest to use JaCoCo to generate coverage data and then feed it to SonarJava (SonarQube's plugin for analyzing Java code). You can find the documentation here:
https://docs.sonarqube.org/display/PLUG/Code+Coverage+by+Unit+Tests+for+Java+Project
You might find mentions of a separation between unit test and integration test coverage; this has been deprecated, and there is now only a single kind of coverage.
Don't hesitate to reach out on the mailing list or ask a question if something is not clear; we are in the process of improving this documentation.
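As a rough illustration only (a sketch; the plugin version is an assumption, and how the report is picked up depends on your SonarQube and plugin versions, so check the page above), a typical Maven setup attaches the JaCoCo agent while Surefire/Failsafe run and produces a report during verify:

    <build>
      <plugins>
        <plugin>
          <groupId>org.jacoco</groupId>
          <artifactId>jacoco-maven-plugin</artifactId>
          <!-- version is an assumption; use a current release -->
          <version>0.7.9</version>
          <executions>
            <!-- sets the argLine so the agent records coverage while tests run -->
            <execution>
              <id>prepare-agent</id>
              <goals>
                <goal>prepare-agent</goal>
              </goals>
            </execution>
            <!-- writes the JaCoCo report during the verify phase -->
            <execution>
              <id>report</id>
              <phase>verify</phase>
              <goals>
                <goal>report</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>

With something along the lines of mvn clean verify sonar:sonar, the analysis then either picks up the binary target/jacoco.exec file automatically or needs the report path passed explicitly through a sonar property, depending on the versions involved; the linked page has the details.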

Related

How to exclude test classes from coverage analysis using JaCoCo

I have some tests to check my code. I have generated my report in SonarCloud, but I have a problem: the coverage percentage also takes the test classes into account, which are obviously not covered by other tests. Is there any option to take into account all the classes except the test ones?
First of all, I would not adapt JaCoCo itself; you can exclude files from the coverage report within SonarQube/SonarCloud with the sonar.coverage.exclusions property, where you can e.g. specify a pattern like **/*Test.java to exclude all Java files whose names end with Test.
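In a Maven build this could look roughly like the following in the pom (the patterns are only examples; the same value can also be passed on the command line as -Dsonar.coverage.exclusions=...):

    <properties>
      <!-- keep test classes out of the coverage calculation; adjust the patterns to your naming convention -->
      <sonar.coverage.exclusions>**/*Test.java,**/*IT.java</sonar.coverage.exclusions>
    </properties>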
Additionally, you can also set this up in the SonarQube/SonarCloud UI, under the project's coverage exclusion settings.
Side note: I would inspect the Sonar configuration; it looks to me as if test code is being provided to Sonar as normal sources. That would produce exactly this problem, whereas Sonar normally has a dedicated property for declaring test code. See the analysis parameters documentation for further details, or, if you use the Gradle or Maven plugins, check the respective documentation on how to organize the source code.
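For reference, the split between main and test code is controlled by the sonar.sources and sonar.tests analysis parameters. A sketch of how they could be declared as properties (the paths are assumptions about your layout; the Maven and Gradle plugins normally derive this automatically from the project structure, so you would only set them for a non-standard setup):

    <properties>
      <!-- main code is analyzed as sources, test code is declared separately -->
      <sonar.sources>src/main/java</sonar.sources>
      <sonar.tests>src/test/java</sonar.tests>
    </properties>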

Is there a plugin or tool I can use to generate coverage from a running application without test cases?

I have a Java-based web application with a few REST endpoints exposed. I want to check code coverage on the running VM. Is there any tool or plugin I can use for this purpose?
I tried looking into JaCoCo, but it looks like it provides code coverage only if you have configured unit/integration tests.
Sometimes it is very difficult to write test cases for all possible scenarios, so is there a way I can get code coverage without test cases?
Thanks a ton in advance. :)
After doing more searching on the internet, I found a very good link that fulfills my requirements:
https://automationrhapsody.com/code-coverage-of-manual-or-automated-tests-with-jacoco/
In short, follow the steps below to generate a code coverage report without test cases:
1. Install the JaCoCo Eclipse plugin: EclEmma Java Code Coverage.
2. Download jacocoagent.jar and put it somewhere on your computer, e.g. C:\JaCoCo\jacocoagent.jar.
3. Run your application with this VM argument: -javaagent:C:\JaCoCo\jacocoagent.jar=output=tcpserver
4. Import the coverage session: File -> Import -> Coverage Session -> select the "Agent address" radio button but leave the defaults -> enter a name and select the code under test.

SonarQube not displaying classes under coverage

I have been working on setting up the code coverage metric on SonarQube for a Java project using the JaCoCo plugin.
The project builds successfully (using Ant), and the jacocoreport.xml and jacocoreport.csv files are generated with the right covered/missed counter values for the covered and uncovered classes and methods. However, the SonarQube dashboard reports "0.0%" as the metric value, even though there is a class file with a covered method, and it provides absolutely NO information on which classes have what coverage. In fact, upon clicking the "0.0%" coverage link, I do not see (under "List" or "Tree") the particular folder containing the classes I have written JUnit tests for.
I checked with another developer from a different project who has worked on this before. She tells me that her project has around 2000 unit test methods and her jacoco.exec file is 6 MB. My project, on the other hand, has one or two unit test methods and the file is 6 KB. I am not sure whether the jacoco.exec is being generated properly.
How do I make sure the dashboard gives information on the covered/uncovered classes? Do you think it's not reporting because the project has just one covered class file? I don't think this should be a problem, but I am not sure what's wrong.
NOTE: While running the Sonar Scanner locally, I noticed warnings saying that some of the class files couldn't be accessed through the ClassLoader. This issue wasn't resolved even after I added the sonar.libraries and sonar.binaries properties.

SonarLint and other tools for Eclipse to mimic SonarQube

On my local machine I am trying to see, for my project, the code-quality issues that SonarQube shows.
I am using the Eclipse IDE.
I installed the SonarLint plugin and can see most of the issues that I see in SonarQube for my project.
But I don't see issues related to duplicated code, etc.
From what I see on the internet, SonarQube uses other third-party tools like PMD, Checkstyle and FindBugs to show issues beyond what SonarLint shows.
Who usually provides the XML rulesets for PMD, Checkstyle, etc. in a company? Is it the Sonar team or the architecture team? Or do the project team leads create one and provide it to the team?
At the time of this writing, SonarLint runs analyses file by file, so it cannot display issues involving multiple files, such as:
Duplications
Test coverage
Package-level issues (a missing package-info.java, etc.)
Furthermore, it only shows issues from SonarSource analyzers; third-party analyzers such as PMD and Checkstyle are excluded.
Finally, in connected mode it will show the same issues as you see in SonarQube. Otherwise, in standalone mode, it uses the default quality profile (= set of rules) as defined by its embedded analyzers, which may vary slightly depending on their version.
The differences that you see between SonarLint and SonarQube will come down to one or more of the reasons as explained above.
As for who usually provides the XML rulesets for PMD, Checkstyle, etc. in a company (the Sonar team, the architecture team, or the project team leads): that depends on the company, and any answer to this would be subjective.

Cobertura changes Sonar violations

My co-worker found this morning that compiling a project with Cobertura enabled changes the Sonar results for the same project.
On this particular project we ran a build with sonar:sonar and then ran it again with cobertura:cobertura sonar:sonar.
The Sonar comparison now shows that without Cobertura we have 7/78/153/24/0 violations across the five severities, but with Cobertura this changes to 7/81/94/24/0; in particular it finds 3 new critical violations and 15 new major violations that are not found without Cobertura.
One of the biggest changes is that without Cobertura there are 60 violations of the rule against empty methods (many of them constructors), while with Cobertura only 3 of those are reported.
If Cobertura only prevented violations from being found, we could run the two independently; but since some violations are only found with Cobertura enabled, it seems we would have to do two separate Sonar analyses.
Is this a known interaction? Is there any workaround other than running Cobertura and Sonar in separate builds and using both sets of results to get the best data?
Based on the comment you made, let me explain what seems to be happening:
You are using FindBugs via SonarQube (the rules you mention are FindBugs rules).
First, let's think about the two tools involved here and how they (roughly) work:
FindBugs: a static analysis tool based on bytecode; it reads bytecode and raises an issue when it detects a bad pattern.
Cobertura: a coverage tool; it instruments the bytecode to place probes and, when tests are run, keeps track of which probes were hit.
Now you can see where the issue might be: FindBugs ends up analyzing the bytecode instrumented by Cobertura. That would explain why you get some new issues, and why most of the empty-method issues disappear when analyzing with Cobertura: once probes have been inserted, those methods are no longer empty in the bytecode.
To avoid this issue you have to make sure your bytecode files are not instrumented when you analyze them with FindBugs. But (disclaimer: I develop the Sonar Java plugin, so I might be a little biased here ;) ) I would recommend dropping FindBugs in favor of the SonarQube Java Analyzer, which does not have this issue because it approaches the analysis slightly differently (see this blog post about that).
User error. :-(
It turns out that the user had run mvn clean prior to running sonar:sonar with Cobertura, so, as implied by benzonico, the FindBugs rules that have to analyze compiled code didn't run. Only the rules that run on source code, like those of the Java plugin, generated results. That's why we were missing a bunch of rules and results.
We still have inconsistencies between Bamboo and manual builds, but that would be a topic for a separate post.
