I have the following in my pom file:
pom.xml
<reporting>
    <plugins>
        <plugin>
            <groupId>org.pitest</groupId>
            <artifactId>pitest-maven</artifactId>
            <version>1.1.8</version>
            <configuration>
                <targetClasses>
                    <param>com.myService.utility.*</param>
                </targetClasses>
                <reportsDirectory>/my-service/target</reportsDirectory>
                <targetTests>
                    <param>com.myService.utility.util.*</param>
                </targetTests>
                <timeoutConstant>5000</timeoutConstant>
                <excludeClasses>
                    <param>com.myService.utility.EmailImpl.java</param>
                    <param>com.myService.utility.Email.java</param>
                    <param>com.myService.utility.ValidationUtil.java</param>
                </excludeClasses>
                <avoidCallsTo>
                    <avoidCallsTo>org.apache.log4j</avoidCallsTo>
                    <avoidCallsTo>org.slf4j</avoidCallsTo>
                    <avoidCallsTo>org.apache.commons.logging</avoidCallsTo>
                </avoidCallsTo>
            </configuration>
            <reportSets>
                <reportSet>
                    <reports>
                        <report>report</report>
                    </reports>
                </reportSet>
            </reportSets>
        </plugin>
    </plugins>
</reporting>
When I run the tests, the timeout doesn't seem to have changed from the default 3000, the classes in excludeClasses are still picked up, and it's still complaining about configuration for log4j (although it is log4j2, so that looks like my fault for not specifying it). I can't find many examples in the PITest documentation or anywhere else, apart from very simple ones using targetClasses and targetTests.
EDIT: I tried changing the reporting tags to build tags and removed the reportSets section. There is still no change; the utility source package contains 6 classes, of which the 3 I've outlined in the pom should be excluded, and there are 3 test files in the counterpart test package. The report is still pulling in the classes that should be excluded and showing them as 0% line and mutation coverage. It is also complaining about log4j configs despite the avoidCallsTo values.
The configuration needs to be provided under build/plugins, not reporting.
Unfortunately Maven doesn't throw any error when it can't map XML to a plugin.
Included/excluded classes accept globs matched against Java class and package names, not source files, so it should look something like this:
<excludeClasses>
    <param>com.myService.utility.EmailImpl</param>
    <param>com.myService.utility.Email</param>
    <param>com.myService.utility.ValidationUtil</param>
</excludeClasses>
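Putting that together, a sketch of the whole configuration moved under build/plugins might look like the following (values copied from the question with the .java suffixes dropped; the extra org.apache.logging.log4j entry is an assumption based on the question's note that the project actually uses log4j2):
<build>
    <plugins>
        <plugin>
            <groupId>org.pitest</groupId>
            <artifactId>pitest-maven</artifactId>
            <version>1.1.8</version>
            <configuration>
                <targetClasses>
                    <param>com.myService.utility.*</param>
                </targetClasses>
                <targetTests>
                    <param>com.myService.utility.util.*</param>
                </targetTests>
                <timeoutConstant>5000</timeoutConstant>
                <excludeClasses>
                    <param>com.myService.utility.EmailImpl</param>
                    <param>com.myService.utility.Email</param>
                    <param>com.myService.utility.ValidationUtil</param>
                </excludeClasses>
                <avoidCallsTo>
                    <avoidCallsTo>org.apache.log4j</avoidCallsTo>
                    <!-- assumption: also avoid the log4j2 packages, per the question's note -->
                    <avoidCallsTo>org.apache.logging.log4j</avoidCallsTo>
                    <avoidCallsTo>org.slf4j</avoidCallsTo>
                    <avoidCallsTo>org.apache.commons.logging</avoidCallsTo>
                </avoidCallsTo>
            </configuration>
        </plugin>
    </plugins>
</build>
With the plugin under build, the report can then be generated with mvn org.pitest:pitest-maven:mutationCoverage.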
I'm trying to generate 2 controller classes for 2 scopes in my project.
I can do that with 2 separate openapi.yaml files, and 2 maven executions.
I'm using swagger-codegen-maven-plugin to get it done, and I could only find code that uses swagger.yaml or openapi.yaml with a different plugin.
I can't find this combination, though I'm positive it's possible.
The question is: if I have 2 scopes such as 'DB' and 'Browse', and I want 2 interfaces created for them, such as DBApi.java and BrowseApi.java, how can it be done, and can it be done using a single openapi.yaml file?
I did see example projects where 1 openapi.yaml file resulted in PetApi.java and StoreApi.java, but I couldn't find how to configure this in my setup.
Thanks.
The relevant part of the Maven POM file is:
<plugin>
    <groupId>io.swagger.codegen.v3</groupId>
    <artifactId>swagger-codegen-maven-plugin</artifactId>
    <executions>
        <execution>
            <id>raptor-codegen</id>
            <configuration>
                <apiPackage>com.app.seo.graph.rest.v1.api</apiPackage>
                <modelPackage>com.app.seo.graph.rest.v1.model</modelPackage>
                <inputSpec>${project.basedir}/src/main/resources/api/openapi.yaml</inputSpec>
                <configOptions>
                    <dateLibrary>java8</dateLibrary>
                    <additional-properties>preAuthorize=hasAuthority,useJsonPropertyOrder=true,resourceMetaType=com.ebay.jaxrs.server.ResourceOperation</additional-properties>
                </configOptions>
            </configuration>
        </execution>
    </executions>
</plugin>
I've used <useTags>true</useTags> under <configOptions> in my Maven file and it works for me. YAML definition:
paths:
  '/operation/':
    get:
      tags:
        - Some-Service
generates "SomeServiceApi" class name with the SpringCodegen generator. Using "openapi-generator-maven-plugin" in the "6.2.0" version.
I've inherited a project where "identical" instance variables are used inconsistently. For example, in some classes they are stored as the primitive float:
class Primitive {
    float myPrimitiveFloat;
    ...
}
..and in other classes, when the "same" variable is passed into the constructor, the value is stored as the boxed type Float:
class Boxed {
    Float myBoxedFloat;
    ...
    Boxed(Float myBoxedFloat, ..) {
        this.myBoxedFloat = myBoxedFloat;
    }
}
..and then calling new Boxed(myPrimitiveFloat, ..) from a method in Primitive.
I'm using float/Float as examples here, but this inconsistency could apply to any of the other pairs too: byte/Byte, short/Short, int/Integer, long/Long, double/Double, boolean/Boolean and char/Character.
I think it would be more consistent if the "same" variable were either the primitive float or the boxed type Float everywhere, and I am looking for a way of examining the source code without having to visit each file individually.
For example, the things I'd like to find are places where a float is passed when a Float is required, or vice versa. That could be (the list is not exhaustive, there could be others):
A call to a constructor (new MyClass(..)) using a variable of a type that is opposite to the type in the constructor.
When the parameter passed in to a setter is the opposite of the method's parameter type, as in setMyVariable(..)
When a getter returns the opposite of the instance variable type it's getting, like float getMyValue() where the class stores myValue as a Float.
My local Java editor/toolkit is NetBeans and its internal type checker is SonarLint. I'm discouraged from using software that is deemed "not standard" by my company. This includes the Eclipse IDE.
The development environment is Java version 11.
Is there any way of configuring SonarLint or NetBeans to detect this sort of thing, or possibly to detect it while building with Maven/Gradle?
Generating a log file that could be used as input to an audit review would be useful as well.
One possible solution is to add a reporting section to the POM file. When building with mvn site (it can take a while), an HTML page (target/site/index.html) or an XML file (target/spotbugsXml.xml) describing the issues will be generated.
Anything tagged with the Bx: pattern indicates a dubious use of boxing/unboxing from the SpotBugs viewpoint.
The XML file could be used as input for an audit trail review.
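As an illustration, here is a hypothetical snippet showing the kind of mixing the question describes; depending on the active detectors, SpotBugs may flag the implicit auto(un)boxing or an explicit Number constructor under a Bx: pattern:
class Mixed {
    Float myBoxedFloat;                 // boxed field

    void setMyValue(float value) {
        // autoboxed on assignment; an explicit new Float(value) would also be
        // reported as an inefficient Number constructor
        this.myBoxedFloat = value;
    }

    float getMyValue() {
        return myBoxedFloat;            // auto-unboxed on return (NPE risk if the field is null)
    }
}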
The reporting section shown below also invokes some other static code analysis tools, which may flag additional issues that might need to be resolved.
<reporting>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-project-info-reports-plugin</artifactId>
            <version>${maven-project-info.version}</version>
            <reportSets>
                <reportSet>
                    <reports>
                        <report>index</report>
                        <report>ci-management</report>
                        <report>dependencies</report>
                        <report>dependency-convergence</report>
                        <report>dependency-info</report>
                        <report>dependency-management</report>
                        <report>distribution-management</report>
                        <report>issue-management</report>
                        <report>licenses</report>
                        <report>mailing-lists</report>
                        <report>modules</report>
                        <report>plugin-management</report>
                        <report>plugins</report>
                        <report>scm</report>
                        <report>summary</report>
                        <report>team</report>
                    </reports>
                </reportSet>
            </reportSets>
        </plugin>
        <plugin>
            <groupId>com.github.spotbugs</groupId>
            <artifactId>spotbugs-maven-plugin</artifactId>
            <version>4.2.0</version>
            <configuration>
                <effort>Max</effort>
                <threshold>low</threshold>
                <xmlOutput>true</xmlOutput>
            </configuration>
        </plugin>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-pmd-plugin</artifactId>
        </plugin>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-javadoc-plugin</artifactId>
            <reportSets>
                <reportSet>
                    <id>html</id>
                    <configuration>
                        <doctitle>My API for ${project.name} ${project.version}</doctitle>
                        <windowtitle>My API for ${project.name} ${project.version}</windowtitle>
                    </configuration>
                    <reports>
                        <report>javadoc</report>
                    </reports>
                </reportSet>
                <reportSet>
                    <id>test-html</id>
                    <configuration>
                        <testDoctitle>My Test API for ${project.name} ${project.version}</testDoctitle>
                        <testWindowtitle>My Test API for ${project.name} ${project.version}</testWindowtitle>
                    </configuration>
                    <reports>
                        <report>test-javadoc</report>
                    </reports>
                </reportSet>
            </reportSets>
        </plugin>
    </plugins>
</reporting>
An entry on the SpotBugs Maven plugin page suggests that this is only guaranteed to work with Java 8. This setup has been tested with Java versions 8 and 11 and no problems were encountered.
More information about SpotBugs can be found at the Spotbugs maven plugin page.
I know this question is not new, but it seems there is no definite answer. This answer from 2012 states that if generated sources are placed under target/generated-sources/<tool> they will be compiled. The ANTLR 4 Maven plugin follows this paradigm. Per the documentation, the default value of outputDirectory is: ${project.build.directory}/generated-sources/antlr4.
Now in my case I have a custom tool that generates sources. I've set its output directory to ${project.build.directory}/generated-sources/whatever and it didn't work. Regarding the whatever part, I've tried using the id of the goal that generates the sources and even tried to hijack the antlr4 name. No result though.
When I try this solution, which suggests using the Mojo build-helper-maven-plugin, it compiles as expected. But according to the Maven guide to generating sources it should work without any helper plugin, shouldn't it? Am I missing something?
Here is the POM (fragment) configuration that I use to generate the sources.
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>exec-maven-plugin</artifactId>
    <version>1.4.0</version>
    <executions>
        <execution>
            <id>generate-code</id>
            <phase>generate-sources</phase>
            <goals>
                <goal>java</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <includeProjectDependencies>false</includeProjectDependencies>
        <includePluginDependencies>true</includePluginDependencies>
        <executableDependency>
            <groupId>com.company.product</groupId>
            <artifactId>CodeGenerator</artifactId>
        </executableDependency>
        <arguments>
            <argument>${basedir}/</argument>
            <argument>${project.build.directory}/generated-sources/generate-code/</argument>
        </arguments>
        <mainClass>com.company.codegeneration.CodeGenerator</mainClass>
    </configuration>
    <dependencies>
        <dependency>
            <groupId>com.company.product</groupId>
            <artifactId>CodeGenerator</artifactId>
            <version>1.0-SNAPSHOT</version>
            <type>jar</type>
        </dependency>
    </dependencies>
</plugin>
Your understanding is just a bit incorrect.
Nothing is automatic: plugins that generate source code typically handle this by adding their output directory (something like target/generated-sources/<tool> by convention) as a source root to the project, so that it is included later during the compile phase.
Some less well implemented plugins don't do that for you, and you have to add the directory yourself, for example using the Build Helper Maven Plugin.
As the other answer noted, most plugins add the generated code as a new source path.
For example, see ANTLR 4's Antlr4Mojo.java class. Here, the plugin adds the generated sources to the project by calling the addSourceRoot method from its execute method.
// Omitted some code
void addSourceRoot(File outputDir) {
    if (generateTestSources) {
        project.addTestCompileSourceRoot(outputDir.getPath());
    }
    else {
        project.addCompileSourceRoot(outputDir.getPath());
    }
}

// Omitted some code
@Override
public void execute() throws MojoExecutionException, MojoFailureException {
    // Omitted code
    if (project != null) {
        // Tell Maven that there are some new source files underneath the output
        // directory.
        addSourceRoot(this.getOutputDirectory());
    }
}
// Omitted some code
So, you can either do this in your custom plugin or use the build-helper-maven-plugin.
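If you go the build-helper-maven-plugin route, a minimal sketch could look like this (the plugin version is an assumption; the source directory matches the generate-code output folder from the question):
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>build-helper-maven-plugin</artifactId>
    <version>3.0.0</version>
    <executions>
        <execution>
            <id>add-generated-sources</id>
            <phase>generate-sources</phase>
            <goals>
                <goal>add-source</goal>
            </goals>
            <configuration>
                <sources>
                    <!-- register the custom generator's output as a compile source root -->
                    <source>${project.build.directory}/generated-sources/generate-code</source>
                </sources>
            </configuration>
        </execution>
    </executions>
</plugin>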
I have a properties file:
property.a=$[value]
I am using maven-resources-plugin with filtering on this property file enabled in order to substitute build variables in there:
<plugin>
    <artifactId>maven-resources-plugin</artifactId>
    <configuration>
        <delimiters>
            <delimiter>$[*]</delimiter>
        </delimiters>
    </configuration>
</plugin>
Everything works flawlessly, until a $[*] token is nested inside a ${*} one, like below:
property.a=${VALUE:$[value]}
Assuming value=XXX in Maven properties, I expected to get:
property.a=${VALUE:XXX}
However, the Maven resources plugin doesn't substitute $[value] there, leaving the filtered contents as-is. I tried enabling supportMultiLineFiltering but it changed nothing. It feels like, despite the <delimiters> option being set explicitly, the plugin still treats ${*} as a valid delimiter and tries to filter it, without success.
How should I configure the Maven resources plugin so that it filters the property file contents as expected?
I just realized I missed a configuration option in the Maven resources plugin, designed specifically for controlling the default delimiters: useDefaultDelimiters, which is true by default. The configuration below solved the issue:
<plugin>
    <artifactId>maven-resources-plugin</artifactId>
    <configuration>
        <delimiters>
            <delimiter>$[*]</delimiter>
        </delimiters>
        <useDefaultDelimiters>false</useDefaultDelimiters>
    </configuration>
</plugin>
I am writing a simple annotation processor and trying to debug it using Eclipse. I created a new project for the annotation processor and configured javax.annotation.processing.Processor under META-INF as needed, and it processes annotations fine.
Then I added some more code and tried debugging, but could never make the execution stop at the breakpoints added in the annotation processor. I am compiling with Ant, using the following ANT options.
export ANT_OPTS="-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000"
After triggering the Ant build, I create a remote debug configuration and the debugger starts fine. The Ant build also starts successfully. But the execution never stops at any breakpoint added in the annotation processor.
This is a problem I just ran into, and the Eclipse plugin solution seems super cumbersome to me. I found a simpler solution using javax.tools.JavaCompiler to invoke the compilation process. Using the code below, you can just Right-Click > Debug As > JUnit Test in Eclipse and debug your annotation processor directly from there.
@Test
public void runAnnotationProcessor() throws Exception {
    String source = "my.project/src";
    Iterable<JavaFileObject> files = getSourceFiles(source);
    JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
    CompilationTask task = compiler.getTask(new PrintWriter(System.out), null, null, null, null, files);
    task.setProcessors(Arrays.asList(new MyAnnotationProcessorClass()));
    task.call();
}

private Iterable<JavaFileObject> getSourceFiles(String p_path) throws Exception {
    JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
    StandardJavaFileManager files = compiler.getStandardFileManager(null, null, null);
    files.setLocation(StandardLocation.SOURCE_PATH, Arrays.asList(new File(p_path)));
    Set<Kind> fileKinds = Collections.singleton(Kind.SOURCE);
    return files.list(StandardLocation.SOURCE_PATH, "", fileKinds, true);
}
This question was posted over 6 years ago; however, I ran into the same problem now and still couldn't find a good answer on the Internet.
I was finally able to work out a good setup that allows me to develop an Annotation Processor, use it in compilation of another project, and debug it as needed.
The setup is like this:
Annotation Processor developed in a project with GAV:
<groupId>infra</groupId>
<artifactId>annotation-processor</artifactId>
<version>1.0-SNAPSHOT</version>
In the annotation-processor POM file I specified the following:
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>${maven.compiler.plugin.version}</version>
            <configuration>
                <compilerArgument>-proc:none</compilerArgument>
                <source>${java.source.version}</source>
                <target>${java.source.version}</target>
                <encoding>UTF-8</encoding>
            </configuration>
        </plugin>
    </plugins>
</build>
Notice the <compilerArgument>-proc:none</compilerArgument> specification.
In the project where the annotation-processor is used, it runs during compilation of that project, i.e. the annotation-processor is invoked during the execution of the compiler, javac. I found that in order to debug the annotation-processor while running javac directly, I can use the following command line:
javac -J-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=1044 -d target/classes -proc:only -processor infra.annotation.CustomizationAnnotationProcessor -cp ../annotation-processor/target/annotation-processor-1.0-SNAPSHOT.jar src\main\java\org\digital\annotationtest\MyTestClass.java
Notice the suspend=y part in the command line of javac. This tells the JVM to suspend execution until the debugger attaches to it.
In this situation, I can start the Eclipse debugger with a Remote Java Application debug configuration. Configure it to use the annotation-processor project and attach to the process on localhost, port 1044. This allows you to debug the annotation processor code. If you set a breakpoint in the init or process methods, the debugger will break there.
In order to enable the same debug experience while compiling using Maven, I setup the POM file as follows:
Add a dependency to the POM where the annotation-processor is used:
<dependency>
    <groupId>infra</groupId>
    <artifactId>annotation-processor</artifactId>
    <version>1.0-SNAPSHOT</version>
</dependency>
In the same project using the annotation-processor define the following:
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>${maven.compiler.plugin.version}</version>
            <configuration>
                <source>1.8</source>
                <target>1.8</target>
                <fork>true</fork>
                <compilerArgs>
                    <compilerArg>-J-verbose</compilerArg>
                    <compilerArg>${enableDebugAnnotationCompilerArg}</compilerArg>
                </compilerArgs>
                <forceJavacCompilerUse>true</forceJavacCompilerUse>
                <annotationProcessorPaths>
                    <annotationProcessorPath>
                        <groupId>infra</groupId>
                        <artifactId>annotation-processor</artifactId>
                        <version>1.0-SNAPSHOT</version>
                    </annotationProcessorPath>
                </annotationProcessorPaths>
                <annotationProcessors>
                    <annotationProcessor>infra.annotation.CustomizationAnnotationProcessor</annotationProcessor>
                </annotationProcessors>
            </configuration>
        </plugin>
    </plugins>
</build>
<profiles>
    <profile>
        <id>debugAnnotation</id>
        <properties>
            <enableDebugAnnotationCompilerArg>-J-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=1044</enableDebugAnnotationCompilerArg>
        </properties>
    </profile>
</profiles>
Notice the use of <fork>true</fork> and <compilerArg>${enableDebugAnnotationCompilerArg}</compilerArg>. Also notice the definition of the debugAnnotation profile and of the <enableDebugAnnotationCompilerArg> property.
This allows us to start a debugging session of the annotation-processor by running mvn -P debugAnnotation package and attaching the Eclipse debugger to the compiler process the same way as described above.
The easiest way is to create an Eclipse plugin and then debug it directly from Eclipse.
It sounds a lot harder than it is - this: https://www.youtube.com/watch?v=PjUaHkUsgzo is a 7-minute guide on YouTube that can get you started.