I'm using a plugin in my pom that looks like this:
<plugin>
<groupId>org.codehaus.enunciate</groupId>
<artifactId>maven-enunciate-plugin</artifactId>
<!-- check for the latest version -->
<version>1.27</version>
<executions>
<execution>
<goals>
<goal>docs</goal>
</goals>
<configuration>
<docsDir>${project.build.directory}/docs</docsDir>
</configuration>
</execution>
</executions>
</plugin>
And I'm wondering: if I run package, will this plugin run, or does it run during site or something else? Is there an easy way to tell by looking at this, or do I have to either
Read the plugin documentation
Experiment through trial and error
I'm hoping there's an easier way. I'm using IntelliJ IDEA, so if that provides a means, I'd be happy with that. Assuming I can't tell without one of these two methods, is it a best practice to always define the phase in the pom so that I can save myself and others time in the future?
You can have Maven print out information about the plugin using the Maven Help Plugin - for Enunciate, simply use the following command:
mvn help:describe -Dplugin=org.codehaus.enunciate:maven-enunciate-plugin -Ddetail
It actually has 6 goals bound to different phases - the docs goal is bound to the process-sources phase.
To extract only the goal you are interested in, you can use the following command instead:
mvn help:describe -Dmojo=docs -DgroupId=org.codehaus.enunciate -DartifactId=maven-enunciate-plugin -Ddetail
You could also omit the -Ddetail part, but then it won't give you any information about the phase the goal is bound to.
I have a largish multimodule Maven build. I need to generate the javadoc for all of the modules and produce an "aggregated" javadoc result that I can deploy to a box for consumption by users.
I did have this working perfectly fine for quite a while, until I tried implementing a custom taglet with specific features and requirements, which makes this more complicated to produce.
All of the submodules inherit a parent pom that is not the aggregator pom. In that parent pom I define the maven-javadoc-plugin. This is what it looked like before I added the custom taglet:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
<version>2.10.4</version>
<configuration>
<additionalparam>-Xdoclint:none</additionalparam>
<bottom>Unified Service Layer - bottom</bottom>
<doctitle>Unified Service Layer - title</doctitle>
<footer>Unified Service Layer - footer</footer>
<groups></groups>
<header>Unified Service Layer - header</header>
<level>public</level>
<packagesheader>Unified Service Layer - packagesheader</packagesheader>
<top>Unified Server Layer - top</top>
<windowtitle>Unified Service Layer - windowtitle</windowtitle>
</configuration>
<executions>
<execution>
<id>module-javadoc-jar</id>
<phase>package</phase>
<goals>
<goal>jar</goal>
</goals>
<configuration>
<show>protected</show>
<detectLinks>false</detectLinks>
</configuration>
</execution>
<execution>
<id>aggregated-documentation</id>
<phase>package</phase>
<inherited>false</inherited>
<goals>
<goal>aggregate-jar</goal>
</goals>
<configuration>
<show>protected</show>
<detectLinks>false</detectLinks>
</configuration>
</execution>
</executions>
</plugin>
With this, I could build all of the modules, which will generate their own javadoc (which I now know is just a validation step, as aggregate-jar doesn't use this output). I have a separate step I call from Jenkins that runs "javadoc:aggregate-jar" in the root project, which produces the aggregated javadoc jar that I deploy.
Again, this has been working fine until now.
I implemented a custom javadoc taglet which requires getting access to the Class object associated with the source file it is contained within. I got this to work, at least in the individual module builds, by adding the following to the configuration above:
<taglets>
<taglet>
<tagletClass>com.att.det.taglet.ValidationConstraintsTaglet</tagletClass>
</taglet>
<taglet>
<tagletClass>com.att.det.taglet.ValidationConstraintsCombinedTaglet</tagletClass>
</taglet>
</taglets>
<tagletArtifacts>
<tagletArtifact>
<groupId>com.att.detsusl.taglets</groupId>
<artifactId>validationJavadocTaglet</artifactId>
<version>0.0.1-SNAPSHOT</version>
</tagletArtifact>
</tagletArtifacts>
In order to have the taglet get access to the class file, I had to add a minimal plugin configuration to each subproject pom.xml, which looks like this:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
<configuration>
<tagletArtifacts combine.children="append">
<tagletArtifact>
<groupId>com.att.detsusl</groupId>
<artifactId>artifact-name</artifactId>
<version>${current.pom.version}</version>
</tagletArtifact>
</tagletArtifacts>
</configuration>
</plugin>
With these minimal changes, I could run the build in each module, generate the javadoc, and examine the generated javadoc output in each module, verifying that it all worked.
However, the problem is, when I run "javadoc:aggregate-jar" in the root project, all of that already built output is ignored. It reruns the javadoc generation for all of the subprojects, also ignoring the appended tagletArtifacts list in each subproject pom.xml file. As a result, I get ClassNotFound errors when it tries to get the class file.
I could "fix" this by putting all of the subproject GAVs into the top-level "tagletArtifacts" list, but I definitely do not want to do that. I liked the ability to specify this in the subproject pom.xml (with combine.children="append") to make it work.
What I need is an overall javadoc package for all of the subprojects, with the taglet able to get access to the class file, without forcing the parent pom to know about all of its subprojects. How can I do this?
I'm facing the same problem with all aggregate goals. I checked the source code of maven-javadoc-plugin, and it turns out that aggregation works by traversing the submodules and collecting their source files, nothing more, thus completely ignoring any form of configuration specified in the submodules.
During execution, every submodule is skipped entirely (source):
if ( isAggregator() && !project.isExecutionRoot() ) {
    return;
}
And during the collection of source files, the submodules are traversed (source):
if ( isAggregator() && project.isExecutionRoot() ) {
    for ( MavenProject subProject : reactorProjects ) {
        if ( subProject != project ) {
            List<String> sourceRoots = getProjectSourceRoots( subProject );
So at the moment, there is no way to do this.
This is not easy to fix either, since the whole plugin works by composing a single call to the actual javadoc tool. If you wanted to respect the settings in the submodules as well, you would have to merge their configuration blocks. While that would work in your case with tagletArtifacts, it does not work for all the settings you can specify, e.g. any form of filter, and therefore cannot be done in a generic way.
I have a simple question about the execution ID in a Maven plugin.
<groupId>org.codehaus.mojo</groupId>
<artifactId>gwt-maven-plugin</artifactId>
<version>2.7.0</version>
<executions>
<execution>
<id>gwt-process-resources</id>
<goals>
<goal>i18n</goal>
<goal>generateAsync</goal>
</goals>
</execution>
</executions>
Can someone explain to me what this execution ID does? How are the goals triggered? Can I call "gwt-process-resources" directly in order to execute both goals? If yes, how can I do that?
<id></id> exists only so that you can distinguish between different executions. This tag will be displayed when you run the actual build.
Your execution example will invoke the two goals you have specified: i18n and generateAsync.
If the plugin's goals aren't bound to a specific phase (process-resources, package, install, etc.), your execution will not be performed. The plugin's documentation should tell you if this is the case.
You can specify/override the default phase by using the <phase> tag:
...
<execution>
<id>gwt-process-resources</id>
<phase>process-resources</phase> <!-- If you need to override -->
<goals>
<goal>i18n</goal>
<goal>generateAsync</goal>
</goals>
</execution>
...
Goals are either triggered:
Automatically (implicitly by their default phase or explicitly as above)
By command line execution: mvn <plugin name>:<goal>
Here is a very simple explanation:
You cannot call execution ids directly:
mvn gwt-process-resources
will not work since gwt-process-resources is just an id.
If there is no <phase> declaration in the pom, then you need to look at the plugin's documentation to find the corresponding default phase. If you look at the documentation of the GWT plugin:
gwt:i18n Binds by default to generate-sources.
gwt:generateAsync Binds by default to the lifecycle phase: generate-sources.
How are goals triggered?
If you do
mvn compile
=> compile comes after generate-sources in the Maven lifecycle
=> so Maven executes gwt:i18n and then gwt:generateAsync
=> the goals are executed in the order they are declared in pom.xml, because both are bound to the same phase, generate-sources
Yes, since Maven 3.3.1 you can, but you need to explicitly execute each goal. There are a couple of ways.
This always works:
mvn <group-id>:<artifact-id>:(<version>):<goal>@<execution-id>
in your case:
mvn org.codehaus.mojo:gwt-maven-plugin:i18n@gwt-process-resources (you can skip the version)
The other (more convenient) way is to use the plugin's goal prefix (its short name), as found at the top of the plugin's documentation page:
mvn gwt:i18n@gwt-process-resources
Note that while execution IDs have to be unique among all executions of a single plugin within a POM, they don't have to be unique across an inheritance hierarchy of POMs. Executions of the same ID from different POMs are merged. The same applies to executions that are defined by profiles.
https://maven.apache.org/guides/mini/guide-configuring-plugins.html#Using_the_executions_Tag
Some plugins (e.g., compile plugin) will use the "id" in a temporary file name. Therefore, when changing the "id" ensure you don't use characters like ":" that could cause problems formatting a valid file name path.
Due to the way my build system is designed (RTC Build Engine), I would like to provide maven with property values via a properties file, instead of specifying -Dkey=value for every property.
I found a couple of questions on S.O. (How to set build properties from a file in Maven POM? and How to read an external properties file in Maven) that relate precisely to this question, but they are relatively old, and both require custom plugins to work (in alpha state).
I realize that passing parameters to Maven like this is probably not the best solution, but the other option is specifying everything on the command line via -D settings which is not ideal either.
Furthermore, given that this properties file is only really used by the build engine (and not by the individual developer), I don't truly believe it belongs in the pom. But I cannot find any other mechanism that would allow me to specify a plugin to use - settings.xml does not permit specifying plugins.
Is my only choice in this case to use a plugin and specify it in the project pom?
in the pom you can place...
<properties>
<core-version>1234</core-version>
<lib-version>1234</lib-version>
<build-version>9999</build-version>
<build-date>20150101</build-date>
</properties>
with all the properties you require.
Or you can use...
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>properties-maven-plugin</artifactId>
<version>1.0-alpha-2</version>
<executions>
<execution>
<phase>initialize</phase>
<goals>
<goal>read-project-properties</goal>
</goals>
<configuration>
<files>
<file>dev.properties</file>
</files>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
and the file dev.properties will contain the properties
core-version=1234
lib-version=1234
build-version=9999
build-date=20150101
...
Or... you can inject the properties using a settings.xml file as shown here
You may also find the Maven build number plugin useful... here
The best option in such cases is to upgrade to at least Maven 3.2.1, which supports defining such properties on the command line like the following:
mvn -Drevision=1234 -Dchangelist=WhatEver -Dsha1=XXXX clean package
But you can only use the above names.
Excerpt from release notes:
A simple change to prevent Maven from emitting warnings about versions
with property expressions. Allowed property expressions in versions
include ${revision}, ${changelist}, and ${sha1}. These properties can
be set externally, but eventually a mechanism will be created in Maven
where these properties can be injected in a standard way. For example
you may want to glean the current Git revision and inject that value
into ${sha1}. This is by no means a complete solution for continuous
delivery but is a step in the right direction.
We use Jenkins, which uses MD5 fingerprinting to identify artifacts and to tell whether an artifact has changed since the last build. Unfortunately, Maven builds always produce artifacts that differ at the binary level.
Therefore I am looking into making Maven generate the same jar artifact for the same set of input files regardless of where and when they were built, which amongst other things means that the entries in the jar file must be sorted - not only in the index, but in the order they are written to the jar file.
After examining maven-jar-plugin, which uses maven-assembly-plugin, my conclusion is that they do not collect all the files to be written in memory before writing them all at once, but write them one at a time. This means it may be better to post-process the generated jar instead of changing the current behavior, so that I can then sort the entries, zero the timestamps, etc.
I am unfamiliar with writing Maven plugins, so my question is: how do I write a plugin so that Maven can tell it where the in-progress artifact jar is located, and how do I hook it up in my pom.xml?
(At first I need this to work for jar files, but war files would be nice too).
As mentioned, this can be done based on something similar to maven-shade-plugin. I went ahead and wrote a simple plugin to add this capability -- see https://github.com/manouti/jar-timestamp-normalize-maven-plugin (available on the Central repo).
The behavior is based on the shade plugin. It consists of a single goal called normalize which can be bound to the package lifecycle phase and configured in the project's POM:
<plugins>
<plugin>
<groupId>com.github.manouti</groupId>
<artifactId>jar-timestamp-normalize-maven-plugin</artifactId>
<version>1.0-SNAPSHOT</version>
<executions>
<execution>
<id>jar-normalize</id>
<goals>
<goal>normalize</goal>
</goals>
<phase>package</phase>
</execution>
</executions>
</plugin>
</plugins>
A few notes about the plugin:
The artifact under build is accessed via project#getArtifact(), where project is an org.apache.maven.project.MavenProject.
Normalization consists mainly of three steps:
Setting the last-modified time of all jar entries to a specific timestamp (the default value is 1970-01-01 00:00:00 AM, but it can be changed via the -Dtimestamp system property).
Reordering (alphabetically) of attributes in the manifest except for Manifest-Version which always comes first.
Removing comments from the pom.properties file which contain a timestamp that causes the Jar to differ from one build to another.
Once invoked, the goal will generate the output file next to the original artifact (named artifactId-version-normalized.jar), i.e. in the project.build.directory directory.
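To give a feel for what such normalization involves, here is a rough sketch of the idea in plain Java - this is only an illustration, not the plugin's actual implementation, and the class name JarNormalizer is made up:
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;
import java.util.jar.JarOutputStream;

public class JarNormalizer {

    // Rewrites a jar with its entries sorted by name and all timestamps fixed,
    // so that two builds of identical content produce byte-identical archives.
    public static void normalize(File input, File output) throws IOException {
        try (JarFile jar = new JarFile(input)) {
            List<JarEntry> entries = Collections.list(jar.entries());
            entries.sort(Comparator.comparing(JarEntry::getName));

            try (JarOutputStream out = new JarOutputStream(new FileOutputStream(output))) {
                byte[] buffer = new byte[8192];
                for (JarEntry original : entries) {
                    JarEntry copy = new JarEntry(original.getName());
                    copy.setTime(0L); // fix every entry time to the epoch, mirroring the plugin's default
                    out.putNextEntry(copy);
                    if (!original.isDirectory()) {
                        try (InputStream in = jar.getInputStream(original)) {
                            int read;
                            while ((read = in.read(buffer)) != -1) {
                                out.write(buffer, 0, read);
                            }
                        }
                    }
                    out.closeEntry();
                }
            }
        }
    }
}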
To create a Maven plugin project:
mvn archetype:generate \
-DgroupId=sample.plugin \
-DartifactId=hello-maven-plugin \
-DarchetypeGroupId=org.apache.maven.archetypes \
-DarchetypeArtifactId=maven-archetype-plugin
Invoking this command will generate a skeleton project with a class called MyMojo.java.
Write your logic inside the execute() method, and install the plugin to your local repository with mvn clean install.
Then attach its execution to your project, in your project's pom.xml:
<build>
<plugins>
<plugin>
<groupId>sample.plugin</groupId>
<artifactId>hello-maven-plugin</artifactId>
<version>1.0-SNAPSHOT</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>sayhi</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
To access project properties inside your Mojo:
/**
 * The Maven project.
 *
 * @parameter expression="${project}"
 * @required
 * @readonly
 */
private MavenProject project;
and then
project.getBuild().getDirectory()
and read other build properties to determine where your jar file is packaged.
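For orientation, a minimal sketch of what the generated MyMojo.java could look like, assuming the goal is named sayhi as in the POM above and that the jar ends up at the conventional target/<finalName>.jar; this uses the old javadoc-style annotations shown above and is only an illustration:
import java.io.File;

import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.MojoExecutionException;
import org.apache.maven.project.MavenProject;

/**
 * Example goal that locates the jar produced during the package phase.
 *
 * @goal sayhi
 * @phase package
 */
public class MyMojo extends AbstractMojo {

    /**
     * The Maven project.
     *
     * @parameter expression="${project}"
     * @required
     * @readonly
     */
    private MavenProject project;

    public void execute() throws MojoExecutionException {
        // target/<finalName>.jar is where the jar plugin writes its output by default
        File jar = new File(project.getBuild().getDirectory(),
                            project.getBuild().getFinalName() + ".jar");
        getLog().info("Artifact to post-process: " + jar);
        // ... open the jar here and rewrite/normalize it as needed ...
    }
}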
See maven: guide-java-plugin-development
I agree that creating a custom Maven plugin seems like the better option. I don't know of an existing plugin that provides a solution for what you asked.
The md5 checksum (or sha-1, in my repository) is generated by the install plugin, so it seems like you need to extend it or write a new plugin that works after the install phase.
I have 2 suggestions about this plugin:
1) Thinking simply, this plugin should do the following (a rough sketch of the idea follows this list):
Read the generated jar.
Extract all entries.
Exclude some entries (e.g. MANIFEST.MF).
Sort the remaining entries.
Extract md5s for each entry in memory.
Generate a single md5 from all of those extracted.
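For illustration, here is a rough sketch of that idea in plain Java; the class and method names are made up and this is not an existing plugin:
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Collections;
import java.util.Map;
import java.util.TreeMap;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

public class JarContentFingerprint {

    // Computes one md5 over the per-entry md5s of a jar, with entries sorted by name
    // and the manifest excluded, so the result ignores entry order and metadata noise.
    public static String fingerprint(String jarPath) throws Exception {
        Map<String, String> entryHashes = new TreeMap<>(); // TreeMap keeps entries sorted by name
        try (JarFile jar = new JarFile(jarPath)) {
            for (JarEntry entry : Collections.list(jar.entries())) {
                if (entry.isDirectory() || "META-INF/MANIFEST.MF".equals(entry.getName())) {
                    continue; // excluded entries
                }
                try (InputStream in = jar.getInputStream(entry)) {
                    entryHashes.put(entry.getName(), hex(md5(in)));
                }
            }
        }
        // Combine the per-entry checksums into a single fingerprint.
        MessageDigest combined = MessageDigest.getInstance("MD5");
        for (Map.Entry<String, String> e : entryHashes.entrySet()) {
            combined.update((e.getKey() + "=" + e.getValue()).getBytes(StandardCharsets.UTF_8));
        }
        return hex(combined.digest());
    }

    private static byte[] md5(InputStream in) throws Exception {
        MessageDigest digest = MessageDigest.getInstance("MD5");
        byte[] buffer = new byte[8192];
        int read;
        while ((read = in.read(buffer)) != -1) {
            digest.update(buffer, 0, read);
        }
        return digest.digest();
    }

    private static String hex(byte[] bytes) {
        StringBuilder sb = new StringBuilder();
        for (byte b : bytes) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }
}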
However, when considering independence of where & when the build runs: according to the Java .class file format, minor and major version entries are stored in compiled class files. So if the compiler changes, the .class files will change as well. In that case we would need a check at the source-code level instead :( So this solution becomes useless if there is no guarantee on the compiler version.
2) As a very dirty but easy solution, this plugin could simply take the md5 of your module's pom.xml file. But then you must manually guarantee that each change in your jar is reflected in a minor version (or build number) bump.
Instead of writing your own plugin you can write a Groovy script that is executed by groovy-maven-plugin:
<plugin>
<groupId>org.codehaus.gmaven</groupId>
<artifactId>groovy-maven-plugin</artifactId>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>execute</goal>
</goals>
<configuration>
<source>
import java.util.jar.*
String fileName = '${project.build.directory}/${project.build.finalName}.jar'
println "Editing file ${fileName}"
JarFile file = new JarFile(fileName);
// do your edit
</source>
</configuration>
</execution>
</executions>
</plugin>
Is there something that can be used in Maven to automate this kind of check? I've looked at Checkstyle and PMD, but I'm not finding this feature.
Basically I'd like the build to fail if there's a class A and there's no ATestCase. I know it is not a strict check and can easily be bypassed by just creating the class, but at the moment that would be enough.
What you are looking for
As Jens Piegsa pointed out, what you are looking for is a tool that shows you the test coverage, in other words the percentage of code that is exercised by your tests.
It allows you to see how much of your code is actually tested, in a far more reliable way than checking that there is at least one test per class.
You can use Cobertura, which is well integrated with Maven: http://mojo.codehaus.org/cobertura-maven-plugin/
The way to achieve that
POM Configuration
Just put this code snippet in your pom.xml
<project>
...
<reporting>
<plugins>
...
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>cobertura-maven-plugin</artifactId>
<version>2.6</version>
</plugin>
</plugins>
</reporting>
</project>
Running coverage
And run
mvn cobertura:cobertura
Or run the reporting phase (bound to site generation):
mvn site:site
Adding quality thresholds
You can even add failing thresholds if you want to fail builds with low coverage:
<plugin>
[...]
<configuration>
<check>
<!-- Fail if code coverage does not respects the goals -->
<haltOnFailure>true</haltOnFailure>
<!-- Per-class thresholds -->
<lineRate>80</lineRate>
<!-- Per-branch thresholds (for an if, verify that both the if and else branches are covered) -->
<branchRate>80</branchRate>
<!-- Project-wide thresholds -->
<totalLineRate>90</totalLineRate>
<totalBranchRate>90</totalBranchRate>
</check>
</configuration>
</plugin>
Short answer: No.
Longer answer: I once wrote a unit test to assert that all VOs had a no-args constructor, and I would think that you could use the same approach here.
Basically, iterate through Package.getPackages() (you'll need to filter out JRE packages, but assuming you're using a sensible namespace, this should be no problem). For each package gather all classes not starting or ending with Test and assert that each one has a matching test.
It's not failsafe, but close enough perhaps?
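For illustration, a rough sketch of such a test; note that it scans the compiled output directories rather than using Package.getPackages(), and it assumes the standard target/classes and target/test-classes layout plus a FooTest/FooTestCase naming convention:
import static org.junit.Assert.assertTrue;

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Set;
import java.util.TreeSet;
import java.util.stream.Stream;

import org.junit.Test;

public class EveryClassHasATestCase {

    // Collects simple class names (e.g. "Foo" for Foo.class) under a compiled-output directory.
    private static Set<String> classNames(Path root) throws IOException {
        Set<String> names = new TreeSet<>();
        if (!Files.isDirectory(root)) {
            return names;
        }
        try (Stream<Path> paths = Files.walk(root)) {
            paths.map(p -> p.getFileName().toString())
                 .filter(n -> n.endsWith(".class") && !n.contains("$"))
                 .forEach(n -> names.add(n.substring(0, n.length() - ".class".length())));
        }
        return names;
    }

    @Test
    public void everyProductionClassHasAMatchingTestCase() throws IOException {
        Set<String> production = classNames(Paths.get("target", "classes"));
        Set<String> tests = classNames(Paths.get("target", "test-classes"));

        Set<String> missing = new TreeSet<>();
        for (String name : production) {
            if (name.startsWith("Test") || name.endsWith("Test")) {
                continue; // skip classes that are themselves tests
            }
            if (!tests.contains(name + "TestCase") && !tests.contains(name + "Test")) {
                missing.add(name);
            }
        }
        assertTrue("Classes without a matching test case: " + missing, missing.isEmpty());
    }
}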
Cheers,