I know that I am going to have to use // CLOVER: OFF to turn off Clover test coverage. I have read https://confluence.atlassian.com/display/CLOVER/Using+Source+Directives
I have added that line before my class declaration, like:
// CLOVER: OFF
public class SampleClass{
/*
* Some definitions
*/
}
This worked for me yesterday but is failing today, and I am scratching my head trying to figure out the reason for the failure.
My Maven build failed because it did not meet the coverage percentage. I am using Maven 3.3.9 and Eclipse Neon for my project.
It's the space between CLOVER: and OFF that causes the problem. You should use the directives exactly as described in the docs:
// CLOVER:ON
// CLOVER:OFF
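With the space removed, the snippet from the question becomes:
// CLOVER:OFF
public class SampleClass{
/*
* Some definitions
*/
}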
Are you trying to exclude a whole file from instrumentation? If so, you can simply exclude classes from instrumentation at the Maven or Eclipse level. Documentation links:
https://confluence.atlassian.com/display/CLOVER/Configuring+instrumentation
https://confluence.atlassian.com/display/CLOVER/4.+Scope+of+instrumentation+in+Eclipse#id-4.ScopeofinstrumentationinEclipse-Excludingandincludingpackages
If you want to exclude one complete class from Clover, you can do this in the pom configuration.
<configuration>
<excludes>
<exclude>**/SampleClass.java</exclude>
</excludes>
</configuration>
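For context, that excludes block normally sits inside the Clover Maven plugin declaration. A minimal sketch, assuming the Atlassian clover-maven-plugin coordinates (the groupId, artifactId and version are assumptions; adjust them to the Clover release you actually use):
<plugin>
<groupId>com.atlassian.maven.plugins</groupId>
<artifactId>clover-maven-plugin</artifactId>
<version>4.1.2</version>
<configuration>
<excludes>
<exclude>**/SampleClass.java</exclude>
</excludes>
</configuration>
</plugin>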
If you want to exclude just one or more methods, use CLOVER:OFF/CLOVER:ON around them.
public class SampleClass {
private int num;
public SampleClass() {
// intentionally left blank
}
// CLOVER:OFF
public void setNum(int num) {
this.num = num;
}
// CLOVER:ON
}
Be aware that Clover will still scan the excluded code if there is nothing left to scan in the class, hence the empty constructor.
I'm using Cucumber for my API tests and save the shared context in a ScenarioContext class, which is marked with @ScenarioScoped and @Inject-ed into each Steps class. Of course, some features use steps across classes as well (which causes the bug described below, but I don't want to change that due to DRY).
Tests run fine when I execute them sequentially; however, they mix up the token I'm saving in ScenarioContext when I run them in parallel.
Has anyone faced this issue before? I tried with separate runners and with one runner, but the result was the same. Here's the corresponding config from my pom.xml:
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>3.0.0-M7</version>
<configuration>
<includes>
<include>**/*Suite.java</include>
</includes>
<forkCount>8C</forkCount>
<reuseForks>false</reuseForks>
</configuration>
</plugin>
</plugins>
</build>
The above configuration should give the highest level of separation, but my tests still fail at random (which does NOT happen with a sequential run).
I've read the following documentation and didn't find a corresponding answer:
https://maven.apache.org/surefire/maven-surefire-plugin/test-mojo.html
https://maven.apache.org/surefire/maven-surefire-plugin/examples/process-communication.html
https://maven.apache.org/surefire/maven-surefire-plugin/examples/fork-options-and-parallel-execution.html
@Alexey R. - I'm adding additional code :)
So my ScenarioContext class looks as follows:
@ScenarioScoped
public class ScenarioContext extends Base {
Map<String, Object> scenarioContext;
public ScenarioContext() throws Exception {
this.scenarioContext = new HashMap<>();
}
public void setContext(Context key, Object value) {
scenarioContext.put(key.toString(), value);
}
public Object getContext(Context key) {
return scenarioContext.get(key.toString());
}
// ... some other general shortcut methods
}
And in my Steps classes the aforementioned ScenarioContext is used as follows:
public class myStepsClass extends Base {
@Inject
private ScenarioContext scenarioContext;
public myStepsClass(ScenarioContext scenarioContext) throws Exception {
this.scenarioContext = scenarioContext;
}
// ... [gherkin steps getting/setting values to scenarioContext]
}
I have also added a System.out.println("New scenario Context") in the ScenarioContext constructor, and it seems to be printed once per thread; however, I still get different results when the tests run in parallel :(
I run my JUnit and Mockito tests in a big project. I use them for testing my server-side components that connect to a web service. All these connections take some time, and it is not necessary for them to be executed during the build.
I would like my tests to be ignored during the build.
I have about 10 classes with tests, so the obvious way is to annotate all the classes with @Ignore. However, I would have to do this every time I commit my code to the project and then re-annotate all tests afterwards. Not the best solution, I think.
So, is it possible to simply ignore a whole package (say, com.example.tests) of tests?
Or what might be a simple way to ignore tests during the build?
You can specify in your build.gradle which packages to exclude from tests:
test {
exclude '**/*IntegrationTest*'
}
The same can be done for Maven, but you must take the following into account:
By default, the Surefire Plugin will automatically include all test classes with the following wildcard patterns:
"**/Test*.java" - includes all of its subdirectories and all Java filenames that start with "Test".
"**/*Test.java" - includes all of its subdirectories and all Java filenames that end with "Test".
"**/*Tests.java" - includes all of its subdirectories and all Java filenames that end with "Tests".
"**/*TestCase.java" - includes all of its subdirectories and all Java filenames that end with "TestCase".
<project>
[...]
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.20</version>
<configuration>
<excludes>
<exclude>com/example/tests/**/*Test.java</exclude>
</excludes>
</configuration>
</plugin>
</plugins>
</build>
[...]
</project>
Another option is the old skipTests flag:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.19.1</version>
<configuration>
<skipTests>true</skipTests>
</configuration>
</plugin>
or even passing it when calling Maven:
mvn install -DskipTests
Using Categories seems to be an option that can come in handy.
This is how you can add them to your Gradle script:
test {
useJUnit {
includeCategories 'org.gradle.junit.CategoryA'
excludeCategories 'org.gradle.junit.CategoryB'
}
}
A sample can be found here; I'm adding it for quick reference.
public interface FastTests
{
/* category marker */
}
public interface SlowTests
{
/* category marker */
}
public class A
{
@Category(SlowTests.class)
@Test public void a()
{
}
}
@Category(FastTests.class)
public class B
{
@Test public void b()
{
}
}
@RunWith(Categories.class)
@IncludeCategory(SlowTests.class)
@ExcludeCategory(FastTests.class)
@SuiteClasses({ A.class, B.class })
public class SlowTestSuite
{
}
I have found the solution for my case.
To disable all the tests during the build, or in any other context you want, the Spring annotation @IfProfileValue can be used. All tests with this annotation will be executed only in the wanted context.
The example is this:
@IfProfileValue(name="enableTests", value="true")
public class DemoApplicationTests {
@Test
public void contextLoads() {
...
}
}
In my IDE I can edit the run configuration and set the variable with:
-DenableTests=true
This annotation can be used at the class level or at the level of an individual test.
All classes or tests annotated with @IfProfileValue will be executed only in my environment and will be ignored during the build.
This approach is the best for me because it is not convenient in my project to change the main pom.xml for my own test needs.
Addition: in Spring or Spring Boot you should also add a Runner.
For example, in Spring:
@RunWith(SpringJUnit4ClassRunner.class)
@IfProfileValue(name="enableTests", value="true")
@ContextConfiguration(classes = { YourClassConfig.class })
YourClassConfig might be empty:
@Configuration
public class YourClassConfig {
}
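Putting it all together, here is a minimal sketch with the annotation applied at the method level (the test class and method names are illustrative):
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.test.annotation.IfProfileValue;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = { YourClassConfig.class })
public class DemoApplicationTests {
// runs only when -DenableTests=true is passed
@IfProfileValue(name = "enableTests", value = "true")
@Test
public void contextLoads() {
// ...
}
// runs regardless of the enableTests property
@Test
public void alwaysRuns() {
}
}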
I've been trying for quite some time to implement my own custom Java rule(s) on SonarQube. However, it seems like no matter what I try, I can't get the new rule to show up on the SonarQube UI.
I only have one rule at the moment, a security rule that checks to see if text output is sanitized. The rule extends BaseTreeVisitor and implements JavaFileScanner. It overrides visitMethodInvocation to do some checks on String arguments for the relevant methods. Here is the rule definition annotation:
@Rule(key = "Sanitize_HTML",
name = "HTML Responses Should be Sanitized",
tags = {"security", "owasp-a3"},
priority = Priority.CRITICAL)
@ActivatedByDefault
@SqaleSubCharacteristic(RulesDefinition.SubCharacteristics.SECURITY_FEATURES)
@SqaleConstantRemediation("10min")
public class SanitizeHTMLCheck extends BaseTreeVisitor implements JavaFileScanner{...}
After writing the rule, I wanted to test it, but quickly realized I had to wrap it in a plugin in order to do so. I wrote three additional classes for this, based entirely on the provided example plugin. Here's the base class:
public class SecurityPlugin extends SonarPlugin{
public List getExtensions(){
return Arrays.asList(
JavaClasspath.class,
JavaTestClasspath.class,
Java.class,
SecurityRulesDefinition.class,
SonarComponents.class,
DefaultJavaResourceLocator.class);
}
}
The classes in the list are all irrelevant (added in desperation) except for SecurityRulesDefinition. It mirrors the structure of the MyJavaRulesDefinition class from the example:
public class SecurityRulesDefinition implements RulesDefinition{
public void define(Context context){
NewRepository repository = context
.createRepository(RulesList.REPOSITORY_KEY, Java.KEY)
.setName("Security Rules");
AnnotationBasedRulesDefinition.load(repository, Java.KEY, RulesList.getChecks());
for(NewRule rule : repository.rules()){
rule.setInternalKey(rule.key());
}
repository.done();
}
}
Finally, just like the example, here's RulesList, where all of my rule classes are supposed to go:
public class RulesList {
public static final String REPOSITORY_KEY = "security_java";
private RulesList(){}
public static List<Class> getChecks(){
return ImmutableList.<Class>builder().addAll(getJavaChecks()).addAll(getJavaTestChecks()).build();
}
//Add all checks to here...
public static List<Class<? extends JavaCheck>> getJavaChecks(){
return ImmutableList.<Class<? extends JavaCheck>>builder()
.add(SanitizeHTMLCheck.class)
.build();
}
//Put all test checks here
public static List<Class<? extends JavaCheck>> getJavaTestChecks(){
return ImmutableList.<Class<? extends JavaCheck>>builder()
.build();
}
}
Like I said, these are all pretty much ripped from the example plugin, so I have no idea what could be wrong with them.
I'm using Eclipse with M2E to build the plugin. As suggested by the documentation's Coding a Plugin page, I've added the following plugin tag to my pom.xml:
<build>
<plugins>
<plugin>
<groupId>org.codehaus.sonar</groupId>
<artifactId>sonar-packaging-maven-plugin</artifactId>
<version>1.13</version>
<extensions>true</extensions>
<configuration>
<pluginKey>securityrules</pluginKey>
<pluginClass>org.myOrg.sonar_analysis.security_rules_java.SecurityPlugin</pluginClass>
<pluginName>Sonar Java Custom Security Rules</pluginName>
<pluginDescription>Implements several checks against OWASP-Top-10 vulnerabilities.</pluginDescription>
</configuration>
</plugin>
</plugins>
</build>
Now, according to everything I've read, I should be able to build the project (right-click on the project > Run As > Maven Build, with goal "package"), drop the resulting .jar into SONAR_HOME/extensions/plugins, and when I restart the server, the rule (and repository) should be there. However, no matter what I try, it's never there. I've spent hours combing the internet and trying anything I find, but the rule never shows up in the UI.
Am I missing something? Have I done something wrong? Is my code incorrect or missing anything?
Thank you for reading this monster post. Any advice you have is valuable, as I'm out of ideas.
The structure of the code seems right to me (more or less).
In the SecurityPlugin class you return many classes (JavaClasspath.class, JavaTestClasspath.class and so on)... What are they for? What do they implement/extend?
In my experience you need to return:
- a RulesDefinition (so the rule shows up in SonarQube) and
- a CheckRegistrar (so the checks are actually used).
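For reference, a minimal sketch of such a registrar, assuming the CheckRegistrar extension point from the sonar-java plugin API (the SecurityCheckRegistrar name is illustrative; the repository key and check lists reuse the RulesList class from the question):
import org.sonar.plugins.java.api.CheckRegistrar;
public class SecurityCheckRegistrar implements CheckRegistrar {
@Override
public void register(RegistrarContext registrarContext) {
// tell the Java analyzer which check classes belong to which rule repository
registrarContext.registerClassesForRepository(
RulesList.REPOSITORY_KEY,
RulesList.getJavaChecks(),
RulesList.getJavaTestChecks());
}
}
Such a registrar class would also need to be returned from SecurityPlugin.getExtensions() so that SonarQube picks it up.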
Maybe my small rules project will give you some ideas (https://github.com/arxes-tolina/sonar-plugins ; one rule with two checks).
If you are still struggling with the rules, try setting the sonar.log.level property (./conf/sonar.properties) to DEBUG and watch the start-up of SonarQube.
Consider this:
@Nullable Object obj = null;
Optional<Object> optional = Optional.ofNullable(obj);
This fails because the Checker Framework assumes ofNullable cannot accept null values (after all, its parameter is not marked as @Nullable).
Is there a good way to tell checker-framework that this method (or other methods in legacy code that I cannot change), accepts #Nullable types everywhere without having to change code everywhere?
EDIT: this answer was based on @mernst's help in the comments and in the Checker Framework's issue tracker.
If you, like me, do not want to or cannot use the annotated JDK, you will run into this issue.
Note: in most Java shops I've worked at, we simply cannot switch which compiler we use or provide a "custom" JDK (that's really unthinkable). For that to be portable, I would have to add the custom JDK to my source repository, for starters, or distribute it to every machine where the code compiles, including CI servers, and make sure it is in the exact same path across different OSes. Just not cool.
The solution is to provide stub classes and pass them as an argument to the javac process.
This can be done quite easily with whatever tool you use to compile.
For example, with Maven (using the standard compiler plugin):
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
<annotationProcessors>
<annotationProcessor>org.checkerframework.checker.nullness.NullnessChecker</annotationProcessor>
</annotationProcessors>
<compilerArgs>
<arg>-Astubs=checkerframework/stubs</arg>
<arg>-AstubWarnIfNotFound</arg>
</compilerArgs>
</configuration>
</plugin>
You also need to add these dependencies to your project:
<dependency>
<groupId>org.checkerframework</groupId>
<artifactId>checker-qual</artifactId>
<version>1.9.2</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.checkerframework</groupId>
<artifactId>checker</artifactId>
<version>1.9.2</version>
<optional>true</optional>
</dependency>
Here, checkerframework/stubs is a directory (relative to the location of the pom), containing the stubs. For Optional, my stub looks like this (strangely, stubs must be named *.astub, so this file is called Optional.astub):
package java.util;
import org.checkerframework.checker.interning.qual.*;
import javax.annotation.Nullable;
class Optional<T> {
static <T> Optional<T> ofNullable(@Nullable T value);
@Nullable T orElse(@Nullable T other);
}
This approach is simple, requires little work, does not mess with the compiler I use or the Java libraries at all, and makes sure these definitions are only used with the Checker Framework (so I can, for example, add this to a Maven profile and enable it only when I want to by simply passing a Maven argument). It will work across machines and OSes without prior setup, in the true Java way of doing things.
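For example, a sketch of such a profile (the profile id is arbitrary; the plugins section holds the maven-compiler-plugin configuration shown above):
<profiles>
<profile>
<id>checker</id>
<build>
<plugins>
<!-- maven-compiler-plugin configuration from above, with the NullnessChecker annotation processor and the -Astubs compiler argument -->
</plugins>
</build>
</profile>
</profiles>
The checks then run only when the profile is activated, e.g. with mvn clean compile -Pchecker.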
I'm not sure why you say "its parameter is not marked as @Nullable".
When I look at file
checker-framework/checker/jdk/nullness/src/java/util/Optional.java,
I see the following annotated method:
public static <T> Optional<@NonNull T> ofNullable(@Nullable T value) {
return value == null ? empty() : of(value);
}
Furthermore, when I run the Checker Framework on the following code, it issues no warning.
// run like this:
// javacheck -g TestOptional.java -processor nullness
import java.util.*;
import org.checkerframework.checker.nullness.qual.Nullable;
import org.checkerframework.checker.nullness.qual.NonNull;
public class TestOptional {
void m() {
@Nullable Object obj = null;
Optional<Object> optional1 = Optional.ofNullable(obj);
}
}
I'm not sure what is going on in your case because you didn't provide a complete test case, you didn't say what command you ran, and you didn't give an actual error message. (You did provide a diagnosis, but I'm not sure it is accurate.)
Maybe providing more details would enable better understanding of your problem.
Environment: a simple stand-alone Java application, with the AspectJ jar included.
I have two projects. The first one, say 'A', contains a custom method-level annotation and an aspect that is in charge of doing some task when an annotated method is called.
The annotation:
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface AccessibleForRole {
String value() ;
}
The aspect:
public aspect AccessibleListener {
pointcut verifyRole():
(
call(@AccessibleForRole * *())
);
before() : verifyRole() {
// do something
}
}
I've built a jar file with those annotations/aspects.
After that, I created a model class in project A, annotated a method with my annotation, ran the program, and saw the aspect catching the call and doing its work.
public class Model {
@AccessibleForRole("admin")
public void addUserToApplication(){
System.out.println("in Model.addUserToApplication.");
}
}
It works fine... but...
If I create a second project, 'B', that uses jar 'A', and I create new classes with methods that I annotate (same as Model, for example), it seems that nothing special occurs. Nothing is caught.
Is it possible to create, in the future, something that has to be caught by the AspectJ aspects from project A's jar, without having to recompile?
Thanks in advance,
Problem resolved, thanks to the Maven plugin documented at http://mojo.codehaus.org/aspectj-maven-plugin/libraryJars.html
<groupId>org.codehaus.mojo</groupId>
<artifactId>aspectj-maven-plugin</artifactId>
In its configuration I put:
<aspectLibraries>
<aspectLibrary>
<groupId>myGroupId</groupId>
<artifactId>jarA</artifactId>
</aspectLibrary>
</aspectLibraries>
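For completeness, here is a sketch of how the whole plugin section might look in project B's pom; the version and execution goals are assumptions, so adapt them to your setup:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>aspectj-maven-plugin</artifactId>
<version>1.11</version>
<configuration>
<complianceLevel>1.8</complianceLevel>
<aspectLibraries>
<aspectLibrary>
<groupId>myGroupId</groupId>
<artifactId>jarA</artifactId>
</aspectLibrary>
</aspectLibraries>
</configuration>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>test-compile</goal>
</goals>
</execution>
</executions>
</plugin>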
When building or running, code from project B is now correctly intercepted by the aspect defined in jarA.
I hope this will help.