Cannot remove Java annotation processor from Maven project

I'm about to refactor my dirty annotation processor. Therefore I wanted to create a new one to extract some responsibilities from the old one.
old: com.company.coma.shared.annotation.ComaToolAnnotationProcessor
new: com.company.coma.shared.annotation.ToolProcessor
Now I have removed the old one from the configuration in my pom.xml:
pom.xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.5.1</version>
  <configuration>
    <generatedSourcesDirectory>${project.build.directory}/generated-sources/</generatedSourcesDirectory>
    <annotationProcessors>
      <annotationProcessor>com.company.coma.shared.annotation.ToolProcessor</annotationProcessor>
      <!--<annotationProcessor>-->
      <!--com.company.coma.shared.annotation.ComaToolAnnotationProcessor-->
      <!--</annotationProcessor>-->
    </annotationProcessors>
    <source>1.8</source>
    <target>1.8</target>
  </configuration>
</plugin>
I also removed the ComaToolAnnotationProcessor.java file completely and rebuilt the whole project afterwards.
Still, this is what my clean install gives me:
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.5.1:compile (default-compile) on project module-foo: Compilation failure
[ERROR] Annotation processor 'com.company.coma.shared.annotation.ComaToolAnnotationProcessor' not found
What is going on here? How can it still be looking for this processor when I have removed every mention of it from the whole project?
EDIT#1: Deactivating the whole annotation processing plugin (maven-compiler) did not help either. I don't understand what is going on. It seems like I no longer have any influence on the dependencies or configuration.

You have probably (manually or not) added the processor to your META-INF/services file. The compiler will therefore try to run it and fail when it cannot find the specified class. I believe removing the reference might fix the problem :)
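For reference, processors registered for automatic discovery are listed in a file named META-INF/services/javax.annotation.processing.Processor inside the processor jar. A sketch of what a stale file could look like, where the second line would have to be removed along with the deleted class:

com.company.coma.shared.annotation.ToolProcessor
com.company.coma.shared.annotation.ComaToolAnnotationProcessor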

I found the problem. I renamed one of my parent modules recently, but the submodules that actually contained the files to be processed still referred to the old parent artifact. Because of that, none of the configuration in what I thought was the new parent had any effect. It was pretty confusing, since the old parent module had disappeared completely from my project structure, but of course it was still available in my local Maven repository.
I relied too much on the module-renaming refactoring feature of my IDE.
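In pom terms, the submodules were still declaring a parent section along the lines of the first block below instead of the renamed artifact; the coordinates here are illustrative, not the real ones:

<!-- what the submodules still referenced -->
<parent>
  <groupId>com.company.coma</groupId>
  <artifactId>coma-old-parent</artifactId>
  <version>1.0.0-SNAPSHOT</version>
</parent>

<!-- what they should reference after the rename -->
<parent>
  <groupId>com.company.coma</groupId>
  <artifactId>coma-new-parent</artifactId>
  <version>1.0.0-SNAPSHOT</version>
</parent>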

Related

How to aggregate maven subproject javadoc output without regenerating javadoc

I have a largish multimodule Maven build. I need to generate the javadoc for all of the modules and produce an "aggregated" javadoc result that I can deploy to a box for consumption by users.
I did have this working perfectly fine for quite a while, until I tried implementing a custom taglet with specific features and requirements, which makes this more complicated to produce.
All of the submodules inherit a parent pom that is not the aggregator pom. In that parent pom I define the maven-javadoc-plugin. This is what it looked like before I added the custom taglet:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-javadoc-plugin</artifactId>
  <version>2.10.4</version>
  <configuration>
    <additionalparam>-Xdoclint:none</additionalparam>
    <bottom>Unified Service Layer - bottom</bottom>
    <doctitle>Unified Service Layer - title</doctitle>
    <footer>Unified Service Layer - footer</footer>
    <groups></groups>
    <header>Unified Service Layer - header</header>
    <level>public</level>
    <packagesheader>Unified Service Layer - packagesheader</packagesheader>
    <top>Unified Server Layer - top</top>
    <windowtitle>Unified Service Layer - windowtitle</windowtitle>
  </configuration>
  <executions>
    <execution>
      <id>module-javadoc-jar</id>
      <phase>package</phase>
      <goals>
        <goal>jar</goal>
      </goals>
      <configuration>
        <show>protected</show>
        <detectLinks>false</detectLinks>
      </configuration>
    </execution>
    <execution>
      <id>aggregated-documentation</id>
      <phase>package</phase>
      <inherited>false</inherited>
      <goals>
        <goal>aggregate-jar</goal>
      </goals>
      <configuration>
        <show>protected</show>
        <detectLinks>false</detectLinks>
      </configuration>
    </execution>
  </executions>
</plugin>
With this, I could build all of the modules, each generating its own javadoc (which I now know is just a validation step, as aggregate-jar doesn't use this output). I have a separate step I call from Jenkins that runs "javadoc:aggregate-jar" in the root project, which produces the aggregated javadoc jar that I deploy.
Again, this has been working fine until now.
I implemented a custom javadoc taglet which requires getting access to the Class object associated with the source file it is contained within. I got this to work, at least in the individual module builds, by adding the following to the configuration above:
<taglets>
  <taglet>
    <tagletClass>com.att.det.taglet.ValidationConstraintsTaglet</tagletClass>
  </taglet>
  <taglet>
    <tagletClass>com.att.det.taglet.ValidationConstraintsCombinedTaglet</tagletClass>
  </taglet>
</taglets>
<tagletArtifacts>
  <tagletArtifact>
    <groupId>com.att.detsusl.taglets</groupId>
    <artifactId>validationJavadocTaglet</artifactId>
    <version>0.0.1-SNAPSHOT</version>
  </tagletArtifact>
</tagletArtifacts>
In order to have the taglet get access to the class file, I had to add a minimal plugin configuration to each subproject pom.xml, which looks like this:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-javadoc-plugin</artifactId>
  <configuration>
    <tagletArtifacts combine.children="append">
      <tagletArtifact>
        <groupId>com.att.detsusl</groupId>
        <artifactId>artifact-name</artifactId>
        <version>${current.pom.version}</version>
      </tagletArtifact>
    </tagletArtifacts>
  </configuration>
</plugin>
With these minimal changes, I could run the build in each module, generate the javadoc, and examine the generated javadoc output in each module, verifying that it all worked.
However, the problem is, when I run "javadoc:aggregate-jar" in the root project, all of that already built output is ignored. It reruns the javadoc generation for all of the subprojects, also ignoring the appended tagletArtifacts list in each subproject pom.xml file. As a result, I get ClassNotFound errors when it tries to get the class file.
I could "fix" this by putting all of the subproject GAVs into the top-level "tagletArtifacts" list, but I definitely do not want to do that. I liked the ability to specify this in the subproject pom.xml (with combine.children="append") to make it work.
What I need is an overall javadoc package for all of the subprojects, with the taglet able to get access to the class file, without forcing the parent pom to know about all of its subprojects. How can I do this?
I'm facing the same problem with all aggregate goals. I checked the source code of maven-javadoc-plugin, and it turns out that the aggregate goals work by traversing the submodules and collecting their source files, nothing more, thus completely ignoring any configuration specified in the submodules.
During execution, every submodule is completely ignored (source):
if ( isAggregator() && !project.isExecutionRoot() ) {
    return;
}
And during the collection of source files, the submodules are traversed (source):
if ( isAggregator() && project.isExecutionRoot() ) {
    for ( MavenProject subProject : reactorProjects ) {
        if ( subProject != project ) {
            List<String> sourceRoots = getProjectSourceRoots( subProject );
So at the moment, there is no way to do this.
This is not easy to fix either, since the whole plugin works by composing a single call to the actual javadoc tool. If you wanted to respect settings in the submodules as well, you would have to merge their configuration blocks. While that would work in your case with tagletArtifacts, it does not work for all the settings you can specify, e.g. any form of filter, and therefore cannot be done in a generic way.
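Given that, the only configuration the aggregate goals will honour is the one on the execution root, so the only working arrangement at the moment is the one the asker wanted to avoid: listing every needed artifact at the top level. A rough sketch, reusing the GAVs from the question plus a placeholder entry per module:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-javadoc-plugin</artifactId>
  <configuration>
    <taglets>
      <taglet>
        <tagletClass>com.att.det.taglet.ValidationConstraintsTaglet</tagletClass>
      </taglet>
      <taglet>
        <tagletClass>com.att.det.taglet.ValidationConstraintsCombinedTaglet</tagletClass>
      </taglet>
    </taglets>
    <tagletArtifacts>
      <!-- the taglet implementation itself -->
      <tagletArtifact>
        <groupId>com.att.detsusl.taglets</groupId>
        <artifactId>validationJavadocTaglet</artifactId>
        <version>0.0.1-SNAPSHOT</version>
      </tagletArtifact>
      <!-- plus one entry per module whose classes the taglets need (placeholder coordinates) -->
      <tagletArtifact>
        <groupId>com.att.detsusl</groupId>
        <artifactId>module-name</artifactId>
        <version>${project.version}</version>
      </tagletArtifact>
    </tagletArtifacts>
  </configuration>
</plugin>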

When I use the maven-release-plugin to release a branch, why does it try to create the branch from revision 0?

I'm using the maven-release-plugin. I'm trying to release a branch and it's failing when it tries to execute this command:
cmd.exe /X /C "svn --non-interactive copy --file C:\Users\USER~1\AppData\Local\Temp\maven-scm-711744598.commit --parents --revision 0 https://domain/svn/app/branches/2.4.8.x https://domain/svn/app/tags/App-2.4.8.1"
It gives this error:
svn: E195012: Unable to find repository location for 'https://domain/svn/app/branches/2.4.8.x' in revision 0
I think this is happening in the prepare goal because when it fails it says:
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-release-plugin:2.5:prepare
I asked an SVN expert about this, and he said:
wait, why is it trying to copy something from r0? By definition there is nothing in r0; r0 is always an empty repository, the first objects are added in r1. That's why it fails. The question is why Maven tried it. If you supply a revision argument to 'svn copy', then the branch/tag you create is based on the source from the revision you specify, so the source has to exist in that revision (if you don't specify, you get HEAD, i.e. the newest revision) ... and as for that, I know nothing about Maven or its plugins
So why is maven trying to copy from revision 0? This is the maven command I ran:
mvn --batch-mode release:prepare release:perform
And my root pom has the maven-release-plugin defined like this:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-release-plugin</artifactId>
  <version>2.5</version>
  <configuration>
    <autoVersionSubmodules>true</autoVersionSubmodules>
    <developmentVersion>2.4.8.2-SNAPSHOT</developmentVersion>
    <releaseVersion>2.4.8.1</releaseVersion>
    <branchBase>https://domain/svn/app/branches</branchBase>
    <tagBase>https://domain/svn/app/tags</tagBase>
  </configuration>
</plugin>
Also, my scm tag looks like this:
<scm>
  <connection>scm:svn:https://domain/svn/app/branches/2.4.8.x</connection>
</scm>
My svn version is 1.8.5 (r1542147)
Just wanted to add this late answer in case anyone has the same problem and the solution in the comment doesn't work.
We had the same problem in a multi-module application where only our parent POM had the SCM tag (which worked perfectly in our other applications). We got the same error but could solve it by adding the corresponding SCM tag to each child POM. We never found out why this was...
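In practice that meant giving each child POM its own scm section pointing at its location under the branch, rather than relying on inheritance from the parent; a sketch with an illustrative module name:

<scm>
  <connection>scm:svn:https://domain/svn/app/branches/2.4.8.x/child-module</connection>
</scm>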
As I said in a comment above:
I cleaned up EVERYTHING and ran just release:prepare by itself, and it succeeded without issue. Perhaps this is a bug where running release:prepare and release:perform together causes this.
I have not run into this issue since running these commands separately.
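In other words, instead of the combined invocation from the question, run the two goals as separate commands:

mvn --batch-mode release:prepare
mvn --batch-mode release:perform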
I also had this problem. In the affected project I had a custom search-and-replace of some files during the validate phase, and I wanted to check the changes in to SVN before tagging, so I added a custom check-in action like this:
<plugin>
  <artifactId>maven-release-plugin</artifactId>
  <configuration>
    <preparationGoals>clean verify scm:checkin -Dmessage="perform release"</preparationGoals>
  </configuration>
</plugin>
This had the consequence that when the release plugin tried to check in the changes to the pom file, there were no changes, since they had already been committed by the custom action, thus causing this error.
I added an "includes" file list to my custom scm:checkin, containing only the files I had been tampering with, and this fixed the problem for me.
The resulting configuration looked like this:
<plugin>
  <artifactId>maven-release-plugin</artifactId>
  <configuration>
    <preparationGoals>clean verify scm:checkin -Dmessage="perform release" -Dincludes="TwogWebUtilsGrailsPlugin.groovy,plugin.xml" -DconnectionType="connection"</preparationGoals>
  </configuration>
</plugin>
The reason for my custom replace action is that the project is a Grails plugin and I was following the guidelines in this blog post.
LATE EDIT: After upgrading to Maven 3.2, this solution seems to break. I am back to where I started.

maven-release-plugin tag creation

I just used maven-release-plugin to release a version, obviously :)
The scm configuration in my parent pom is as follows:
<scm>
  <developerConnection>scm:svn:http://localhost/svn/project/trunk/project/3. Implementation/02 Source code</developerConnection>
</scm>
As you can see, after trunk we have several more folders (RUP-style) before reaching the source code.
Running mvn release:prepare results in the following scm configuration:
<scm>
  <developerConnection>scm:svn:http://localhost/svn/project/tags/project-1.0.0/02 Source code</developerConnection>
</scm>
So, somehow, maven-release-plugin manages to replace trunk/project/3. Implementation/02 Source code with tags/project-1.0.0/02 Source code.
Why would this not be tags/project-1.0.0, as I would expect? If I were to run mvn release:perform, the plugin would check out the entire 3. Implementation directory.
For reference, my plugin definition is as follows:
<plugins>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-release-plugin</artifactId>
    <version>2.5</version>
    <configuration>
      <tagBase>http://localhost/svn/project/tags</tagBase>
    </configuration>
  </plugin>
</plugins>
Right, I figured out what causes this, looking at the maven-release-manager source code. When rewriting the developerConnection value, the RewritePomsForReleasePhase class calculates the number of subdirectories we need to remove from the developerConnectionUrl to get to the root of our project, based on the local project. Now there are two problems with this approach:
There is no guarantee the local depth will match the remote one, although matching it may be best practice.
The working dir (from where mvn is called) does not have to be in the root of the project.
Both apply to my situation. I checked out the project two directories less deep than remote. To clarify:
http://localhost/project/3. Implementation/02 Source code was checked out to
D:\workspace\project
Also, we have a project-parent dir containing our parent pom.
So now it determines we are 1 level deep by looking at the local structure (from the working dir, i.e. project-parent, to the project dir) and applies this to the developerConnection URL. Then it does a substring on the original developerConnection with that result and ends up with 02 Source code in my case.
So, long story less long: maven-release-plugin does not work as expected when the local directory structure does not match the remote one. Now I have to check out honoring the server path and also create a new pom, or move the parent pom to the project root, to get it to work...
EDIT: Moving the pom to the project base dir would probably fix the issue, leaving the developerConnection url unchanged. Will confirm this for the next release.
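For anyone hitting the same mismatch, the "check out honoring the server path" workaround simply means keeping the local working copy at the same depth as the repository instead of flattening it; roughly along these lines (paths are illustrative):

svn checkout "http://localhost/svn/project/trunk/project" D:\workspace\project

so that 3. Implementation\02 Source code sits at the same relative depth locally as it does on the server.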

PMD couldn't find ruleset

I'm creating a Maven-based Java project which uses the PMD Maven plugin. I use my own rule set XML and it works like a charm, except for two rule sets, emptycode and unnecessary: when I run the build, Maven says "can't find resource". The rule definitions look like:
<rule ref="rulesets/emptycode" />
and
<rule ref="rulesets/unnecessary" />
In every other case, this kind of definition works. What I found out is that there is a rule set named "unnecessary" under the ecmascript category, so maybe this definition needs some hint to use the Java version. I tried multiple things, like setting the language attribute on the ruleset XML node ("JAVA", based on the PMD JavaDoc) and various prefixes/postfixes in the ref, but it doesn't work and I found no working solution on the web. Does someone have an idea what I forgot to set, or what I'm doing wrong? Thanks for any help!
PMD seems to be a fiddly beastie to use from Maven. I've just figured this out with version 3.0 of the plugin - there are two solutions:
The quick-and-dirty solution: put rulesets in your project:
download the PMD jar (http://sourceforge.net/projects/pmd/files/latest/download)
extract lib/pmd-x.x.x.jar
extract from that PMD jar file the rulesets/<type>/<ruleset>.xml files you want to use
place them in a folder under your project - something like ${basedir}/pmd/...
reference them as follows:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-pmd-plugin</artifactId>
  <configuration>
    <rulesets>
      <ruleset>${basedir}/pmd/<ruleset>.xml</ruleset>
    </rulesets>
  </configuration>
</plugin>
The advantage is this is easy, the disadvantage is if you update the PMD version in future you'll need to remember to update these files.
The nice solution: reference rulesets in pmd-x.x.x.jar.
create a custom ruleset such as: ${basedir}/pmd/custom.xml (see http://pmd.sourceforge.net/pmd-5.0.2/howtomakearuleset.html)
reference the PMD rulesets in the following way: <rule ref="rulesets/java/imports.xml"/>
NB: the path is the path inside pmd-x.x.x.jar (see quick-and-dirty above) with no leading slash
reference your custom ruleset as follows:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-pmd-plugin</artifactId>
  <configuration>
    <rulesets>
      <ruleset>${basedir}/pmd/custom.xml</ruleset>
    </rulesets>
  </configuration>
</plugin>
The advantage is this will always reference the current PMD rulesets from the PMD jar, the disadvantage is it's a bit fiddly to get right.
To experiment with this until it was working (maven-pmd-plugin version 3.0), I kept running mvn pmd:pmd (with <linkXref>false</linkXref> in pom.xml) and tweaked the paths until I stopped getting errors.
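For completeness, a minimal custom.xml along the lines described above might look like the following; the rulesets/java/... file names assume a PMD 5.x jar, so check the jar contents (as in the quick-and-dirty approach) if they differ in your version:

<?xml version="1.0"?>
<ruleset name="custom"
         xmlns="http://pmd.sourceforge.net/ruleset/2.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://pmd.sourceforge.net/ruleset/2.0.0 http://pmd.sourceforge.net/ruleset_2_0_0.xsd">
  <description>Rules referenced from inside the PMD jar</description>
  <!-- Java counterparts of the "emptycode" and "unnecessary" rulesets from the question -->
  <rule ref="rulesets/java/empty.xml"/>
  <rule ref="rulesets/java/unnecessary.xml"/>
  <rule ref="rulesets/java/imports.xml"/>
</ruleset>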

AspectJ: How to get pointcuts to advise classes located in other projects

This should be simple.
Question
How do you get a pointcut in one project to advise the code/classes within another project?
Context
I'm working in Eclipse with two projects. For ease of explanation, let's call one the science project and the other the math project, and say the science project relies on the math project; I'm developing in both projects concurrently. The math project is a core product, in production, and life will be easier if I don't modify its code much.
Currently, I'm debugging the interaction between these two projects. To assist with that, I'm writing an Aspect (within the science project) to log key information as the math code (and science code) executes.
Example
I'm running a simple example aspect along the lines of:
package org.science.example;

public aspect ScientificLog {

    public pointcut testCut() : execution (public * *.*(..));

    before() : testCut() {
        //do stuff
    }
}
Problem
The problem is, no matter what pointcut I create, it only advises code from the science project. No classes from org.math.example are crosscut, AT ALL! I tried adding the math project to the inpath of the science project by going to project properties > AspectJ Build > Inpath, clicking add project, and choosing the math project. That didn't work, but it seems like I need to do something along those lines.
Thanks, in advance, for any suggestions...
-gMale
EDIT 1:
Since writing this, I've noticed the project is giving the following error:
Caused by: org.aspectj.weaver.BCException: Unable to continue, this version of AspectJ
supports classes built with weaver version 6.0 but the class
com.our.project.adapter.GenericMessagingAdapter is version 7.0
when batch building BuildConfig[null] #Files=52 AopXmls=#0
So maybe this is set up properly and the error is more subtle. BTW, the class mentioned is from the "science project," so to speak. This happens even after I clean the project. I'm currently googling this error...
EDIT 2:
I found the solution to the error above in comment #5 here.
The problem is that the aspectj-maven-plugin's pom file declares a dependency on aspectjtools version 1.6.7. So, when configuring the plugin, that transitive dependency has to be overridden. Here's the relevant snippet of the pom file that fixes the problem by specifying version 1.6.9 instead of 1.6.7:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>aspectj-maven-plugin</artifactId>
  <version>1.3</version>
  <dependencies>
    <dependency>
      <groupId>org.aspectj</groupId>
      <artifactId>aspectjtools</artifactId>
      <version>1.6.9</version>
    </dependency>
  </dependencies>
  <configuration>
    <source>1.6</source>
    <target>1.6</target>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
        <goal>test-compile</goal>
      </goals>
    </execution>
  </executions>
</plugin>
Your second problem is unrelated to the first. It is saying that com.our.project.adapter.GenericMessagingAdapter was originally compiled and woven with a newer version of AspectJ but is now being binary-woven with an older version of AspectJ.
This is essentially the same problem as when you try to run Java classes compiled under 1.6 on a 1.5 VM.
The version number was revved up for the release of AspectJ 1.6.8 (I think, or maybe it was 1.6.7).
The solution is to make sure you are using the latest version of AspectJ for all of your projects (e.g. 1.6.9, or dev builds of 1.6.10).
When you add the math project to the inpath of the science project, all of the math project's code is sent through the AspectJ weaver and properly woven. The results of that weave are written to the science project's output folder (not the math project's). So, if you were to look in the science project's bin folder, you should see the woven classes there.
If you want to keep the inpath files separate from the regular files, you can specify an inpath out folder. This folder should also be added to the classpath as a binary folder. Also, this folder should be placed above the project dependency on the math project in the "Order and Export" tab of the Java Build Path page for the science project.
Finally, if you run the main class from the science project, rather than from the math project, you will be executing the woven code.
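As a side note on the Maven side of this setup: the rough equivalent of the Eclipse inpath in aspectj-maven-plugin is its weaveDependencies configuration, which binary-weaves the classes of another artifact. A sketch with hypothetical coordinates for the math project (the artifact must also be declared as a normal dependency of the science project):

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>aspectj-maven-plugin</artifactId>
  <version>1.3</version>
  <configuration>
    <source>1.6</source>
    <target>1.6</target>
    <weaveDependencies>
      <!-- hypothetical coordinates for the math project -->
      <weaveDependency>
        <groupId>org.math.example</groupId>
        <artifactId>math-core</artifactId>
      </weaveDependency>
    </weaveDependencies>
  </configuration>
</plugin>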
