Retrieve version from maven pom.xml in code - java

What is the simplest way to retrieve the version number from Maven's pom.xml in code, i.e., programmatically?

Assuming you're using Java, you can:
Create a .properties file in (most commonly) your src/main/resources directory (but in step 4 you could tell it to look elsewhere).
Set the value of some property in your .properties file using the standard Maven property for project version:
foo.bar=${project.version}
In your Java code, load the value from the properties file as a resource from the classpath (there are plenty of examples of how to do this; a minimal sketch follows at the end of this answer).
In Maven, enable resource filtering. This will cause Maven to copy that file into your output classes and translate the resource during that copy, interpreting the property. You can find some info here but you mostly just do this in your pom:
<build>
<resources>
<resource>
<directory>src/main/resources</directory>
<filtering>true</filtering>
</resource>
</resources>
</build>
You can also get to other standard properties like project.name, project.description, or even arbitrary properties you put in your pom <properties>, etc. Resource filtering, combined with Maven profiles, can give you variable build behavior at build time. When you specify a profile at runtime with -PmyProfile, that can enable properties that can then show up in your build.
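For step 3, here is a minimal sketch of loading the filtered value from the classpath; the file name version.properties is an assumption, and the key matches the foo.bar example above:
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public final class VersionUtil {
    // Reads the version that Maven resource filtering wrote into the
    // (hypothetical) version.properties file on the classpath.
    public static String readVersion() throws IOException {
        Properties props = new Properties();
        try (InputStream is = VersionUtil.class.getResourceAsStream("/version.properties")) {
            if (is == null) {
                throw new IOException("version.properties not found on the classpath");
            }
            props.load(is);
        }
        return props.getProperty("foo.bar");
    }
}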

The accepted answer may be the best and most stable way to get a version number into an application statically, but does not actually answer the original question: How to retrieve the artifact's version number from pom.xml? Thus, I want to offer an alternative showing how to do it dynamically during runtime:
You can use Maven itself. To be more exact, you can use a Maven library.
<dependency>
<groupId>org.apache.maven</groupId>
<artifactId>maven-model</artifactId>
<version>3.3.9</version>
</dependency>
And then do something like this in Java:
package de.scrum_master.app;
import org.apache.maven.model.Model;
import org.apache.maven.model.io.xpp3.MavenXpp3Reader;
import org.codehaus.plexus.util.xml.pull.XmlPullParserException;
import java.io.FileReader;
import java.io.IOException;
public class Application {
public static void main(String[] args) throws IOException, XmlPullParserException {
MavenXpp3Reader reader = new MavenXpp3Reader();
Model model = reader.read(new FileReader("pom.xml"));
System.out.println(model.getId());
System.out.println(model.getGroupId());
System.out.println(model.getArtifactId());
System.out.println(model.getVersion());
}
}
The console log is as follows:
de.scrum-master.stackoverflow:my-artifact:jar:1.0-SNAPSHOT
de.scrum-master.stackoverflow
my-artifact
1.0-SNAPSHOT
Update 2017-10-31: In order to answer Simon Sobisch's follow-up question I modified the example like this:
package de.scrum_master.app;
import org.apache.maven.model.Model;
import org.apache.maven.model.io.xpp3.MavenXpp3Reader;
import org.codehaus.plexus.util.xml.pull.XmlPullParserException;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.io.InputStreamReader;
public class Application {
public static void main(String[] args) throws IOException, XmlPullParserException {
MavenXpp3Reader reader = new MavenXpp3Reader();
Model model;
if ((new File("pom.xml")).exists())
model = reader.read(new FileReader("pom.xml"));
else
model = reader.read(
new InputStreamReader(
Application.class.getResourceAsStream(
"/META-INF/maven/de.scrum-master.stackoverflow/aspectj-introduce-method/pom.xml"
)
)
);
System.out.println(model.getId());
System.out.println(model.getGroupId());
System.out.println(model.getArtifactId());
System.out.println(model.getVersion());
}
}

Packaged artifacts contain a META-INF/maven/${groupId}/${artifactId}/pom.properties file whose content looks like this:
#Generated by Maven
#Sun Feb 21 23:38:24 GMT 2010
version=2.5
groupId=commons-lang
artifactId=commons-lang
Many applications use this file to read the application/jar version at runtime; zero setup is required.
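For example, a rough sketch of reading that file from the classpath (the class and method names here are only illustrative):
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public final class PomPropertiesVersion {
    // Reads the version from the pom.properties Maven packages into the JAR.
    // Returns null when not running from the packaged artifact (e.g. in an IDE).
    public static String read(String groupId, String artifactId) throws IOException {
        String path = "/META-INF/maven/" + groupId + "/" + artifactId + "/pom.properties";
        try (InputStream is = PomPropertiesVersion.class.getResourceAsStream(path)) {
            if (is == null) {
                return null;
            }
            Properties props = new Properties();
            props.load(is);
            return props.getProperty("version");
        }
    }
}
With the commons-lang example above, read("commons-lang", "commons-lang") would return 2.5.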
The only problem with the above approach is that this file is (currently) generated during the package phase and will thus not be present during tests, etc (there is a Jira issue to change this, see MJAR-76). If this is an issue for you, then the approach described by Alex is the way to go.

There is also the method described in Easy way to display your app's version number using Maven:
Add this to pom.xml
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<configuration>
<archive>
<manifest>
<mainClass>test.App</mainClass>
<addDefaultImplementationEntries>
true
</addDefaultImplementationEntries>
</manifest>
</archive>
</configuration>
</plugin>
</plugins>
</build>
Then use this:
App.class.getPackage().getImplementationVersion()
I have found this method to be simpler.

If you use mvn packaging such as jar or war, use:
getClass().getPackage().getImplementationVersion()
It reads the "Implementation-Version" property of the generated META-INF/MANIFEST.MF in the archive (which is set to the pom.xml's version).
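If getPackage() returns null (which can happen depending on how the class was loaded), the manifest can also be read directly. A rough sketch, with the caveat that /META-INF/MANIFEST.MF may resolve to another JAR's manifest on a flat classpath:
import java.io.IOException;
import java.io.InputStream;
import java.util.jar.Manifest;

public final class ManifestVersion {
    public static String implementationVersion() throws IOException {
        try (InputStream is = ManifestVersion.class.getResourceAsStream("/META-INF/MANIFEST.MF")) {
            if (is == null) {
                return null; // not running from a packaged archive
            }
            // Note: may pick up another JAR's manifest on a flat classpath.
            return new Manifest(is).getMainAttributes().getValue("Implementation-Version");
        }
    }
}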

To complement what @kieste has posted, which I think is the best way to have Maven build information available in your code if you're using Spring Boot: the documentation at http://docs.spring.io/spring-boot/docs/current/reference/htmlsingle/#production-ready-application-info is very useful.
You just need to activate actuators, and add the properties you need in your application.properties or application.yml
Automatic property expansion using Maven
You can automatically expand info properties from the Maven project using resource filtering. If you use the spring-boot-starter-parent you can then refer to your Maven ‘project properties’ via @..@ placeholders, e.g.
project.artifactId=myproject
project.name=Demo
project.version=X.X.X.X
project.description=Demo project for info endpoint
info.build.artifact=@project.artifactId@
info.build.name=@project.name@
info.build.description=@project.description@
info.build.version=@project.version@

When using spring boot, this link might be useful: https://docs.spring.io/spring-boot/docs/2.3.x/reference/html/howto.html#howto-properties-and-configuration
With spring-boot-starter-parent you just need to add the following to your application config file:
# get values from pom.xml
pom.version=@project.version@
After that the value is available like this:
#Value("${pom.version}")
private String pomVersion;

Sometimes the Maven command line is sufficient when scripting something related to the project version, e.g. for artifact retrieval via URL from a repository:
mvn help:evaluate -Dexpression=project.version -q -DforceStdout
Usage example:
VERSION=$( mvn help:evaluate -Dexpression=project.version -q -DforceStdout )
ARTIFACT_ID=$( mvn help:evaluate -Dexpression=project.artifactId -q -DforceStdout )
GROUP_ID_URL=$( mvn help:evaluate -Dexpression=project.groupId -q -DforceStdout | sed -e 's#\.#/#g' )
curl -f -S -O http://REPO-URL/mvn-repos/${GROUP_ID_URL}/${ARTIFACT_ID}/${VERSION}/${ARTIFACT_ID}-${VERSION}.jar

Use this library for a simple solution. Add to the manifest whatever you need and then query it by string.
System.out.println("JAR was created by " + Manifests.read("Created-By"));
http://manifests.jcabi.com/index.html

<build>
<finalName>${project.artifactId}-${project.version}</finalName>
<pluginManagement>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>3.2.2</version>
<configuration>
<failOnMissingWebXml>false</failOnMissingWebXml>
<archive>
<manifest>
<addDefaultImplementationEntries>true</addDefaultImplementationEntries>
<addDefaultSpecificationEntries>true</addDefaultSpecificationEntries>
</manifest>
</archive>
</configuration>
</plugin>
</plugins>
</pluginManagement>
</build>
Get Version using this.getClass().getPackage().getImplementationVersion()
PS Don't forget to add:
<manifest>
<addDefaultImplementationEntries>true</addDefaultImplementationEntries>
<addDefaultSpecificationEntries>true</addDefaultSpecificationEntries>
</manifest>

Step 1: If you are using Spring Boot, your pom.xml should already contain spring-boot-maven-plugin. You just need to add the following configuration.
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<executions>
<execution>
<id>build-info</id>
<goals>
<goal>build-info</goal>
</goals>
</execution>
</executions>
</plugin>
It instructs the plugin to also execute the build-info goal, which is not run by default. This generates build metadata about your application, including the artifact version, build time, and more.
Step 2: Access the build properties via the BuildProperties bean. In our case we create a REST resource to expose this build info in our webapp:
import java.util.HashMap;
import java.util.Map;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.info.BuildProperties;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/api")
public class BuildInfoResource {

    @Autowired
    private BuildProperties buildProperties;

    @GetMapping("/build-info")
    public ResponseEntity<Map<String, Object>> getBuildInfo() {
        Map<String, Object> buildInfo = new HashMap<>();
        buildInfo.put("appName", buildProperties.getName());
        buildInfo.put("appArtifactId", buildProperties.getArtifact());
        buildInfo.put("appVersion", buildProperties.getVersion());
        buildInfo.put("appBuildDateTime", buildProperties.getTime());
        return ResponseEntity.ok().body(buildInfo);
    }
}
I hope this will help

I had the same problem in my daytime job. Even though many of the answers will help to find the version for a specific artifact, we needed to get the version for modules/jars that are not a direct dependency of the application. The classpath is assembled from multiple modules when the application starts, the main application module has no knowledge of how many jars are added later.
That's why I came up with a different solution, which may be a little more elegant than having to read XML or properties from jar files.
The idea
Use a Java service loader approach so that any number of components/artifacts can be added later, each contributing its own version at runtime. Create a very lightweight library with just a few lines of code to read, find, filter and sort all of the artifact versions on the classpath.
Create a Maven source code generator plugin that generates the service implementation for each of the modules at compile time, packaging a very simple service in each of the jars.
The solution
Part one of the solution is the artifact-version-service library, which can be found on GitHub and Maven Central now. It covers the service definition and a few ways to get the artifact versions at runtime.
Part two is the artifact-version-maven-plugin, which can also be found on GitHub and Maven Central. It is used to have a hassle-free generator implementing the service definition for each of the artifacts.
Examples
Fetching all modules with coordinates
No more reading jar manifests, just a simple method call:
// iterate list of artifact dependencies
for (Artifact artifact : ArtifactVersionCollector.collectArtifacts()) {
// print simple artifact string example
System.out.println("artifact = " + artifact);
}
A sorted set of artifacts is returned. To modify the sorting order, provide a custom comparator:
new ArtifactVersionCollector(Comparator.comparing(Artifact::getVersion)).collect();
This way the list of artifacts is returned sorted by version numbers.
Find a specific artifact
ArtifactVersionCollector.findArtifact("de.westemeyer", "artifact-version-service");
Fetches the version details for a specific artifact.
Find artifacts with matching groupId(s)
Find all artifacts with groupId de.westemeyer (exact match):
ArtifactVersionCollector.findArtifactsByGroupId("de.westemeyer", true);
Find all artifacts where groupId starts with de.westemeyer:
ArtifactVersionCollector.findArtifactsByGroupId("de.westemeyer", false);
Sort result by version number:
new ArtifactVersionCollector(Comparator.comparing(Artifact::getVersion)).artifactsByGroupId("de.", false);
Implement custom actions on list of artifacts
By supplying a lambda, the very first example could be implemented like this:
ArtifactVersionCollector.iterateArtifacts(a -> {
System.out.println(a);
return false;
});
Installation
Add these two tags to all pom.xml files, or maybe to a company master pom somewhere:
<build>
<plugins>
<plugin>
<groupId>de.westemeyer</groupId>
<artifactId>artifact-version-maven-plugin</artifactId>
<version>1.1.0</version>
<executions>
<execution>
<goals>
<goal>generate-service</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>de.westemeyer</groupId>
<artifactId>artifact-version-service</artifactId>
<version>1.1.0</version>
</dependency>
</dependencies>
Feedback
It would be great if some people could give the solution a try. Getting feedback about whether you think the solution fits your needs would be even better. So please don't hesitate to add a new issue on either of the GitHub projects if you have any suggestions, feature requests, or problems whatsoever.
Licence
All of the source code is open source, free to use even for commercial products (MIT licence).

It's very easy and no configuration is needed if you use Spring with Maven.
According to the “Automatic Property Expansion Using Maven” official documentation you can automatically expand properties from the Maven project by using resource filtering. If you use the spring-boot-starter-parent, you can then refer to your Maven ‘project properties’ with @..@ placeholders, as shown in the following example:
project.version=@project.version@
project.artifactId=@project.artifactId@
And you can retrieve it with the @Value annotation in any class:
@Value("${project.artifactId}@${project.version}")
private String RELEASE;
I hope this helps!

With reference to ketankk's answer:
Unfortunately, adding this messed with how my application dealt with resources:
<build>
<resources>
<resource>
<directory>src/main/resources</directory>
<filtering>true</filtering>
</resource>
</resources>
</build>
But using this inside the maven-assembly-plugin's <manifest> tag did the trick:
<addDefaultImplementationEntries>true</addDefaultImplementationEntries>
<addDefaultSpecificationEntries>true</addDefaultSpecificationEntries>
So I was able to get version using
String version = getClass().getPackage().getImplementationVersion();

Preface: I remembered this often-referenced question after having answered it a few years ago with a dynamic approach that actually accesses Maven POM info at runtime (e.g. also during tests). Today I found a similar question which involved accessing module A's Maven info from another module B.
I thought about it for a moment and spontaneously had the idea to use a special annotation, applying it to a package declaration in package-info.java. I also created a multi-module example project on GitHub. I do not want to repeat the whole answer, so please see solution B in this answer. The Maven setup involves the Templating Maven Plugin, but the problem could also be solved in a more verbose way using a combination of resource filtering and adding a generated-sources directory to the build via the Build Helper Maven Plugin. I wanted to avoid that, so I simply used the Templating Maven Plugin.

The accepted answer worked for me once I changed ${project.version} to ${pom.version} in step 2.

Related

@Sql annotation by passing a SQL script present in another module

I have a project composed of more than one module, and an integration test (in the test folder) where I want to run this script using the @Sql annotation. By default the classpath resource is used.
The test is inside this folder:
mainFolder/module1/src/test/java/com/.../.../controllers/TestClass.java
while the script is present in this folder:
mainFolder/scripts/postgres/script.sql
Basically I'm not sure which string (relative path) I should put in the value parameter of the @Sql annotation.
I am afraid that if your scripts are not copied to the classpath there are not a lot of options.
Please, try something like:
@Sql("file:/path-to-mainFolder/mainFolder/scripts/postgres/script.sql")
As you can see in the documentation you can use any valid resource type.
Having said that, I think the best option is to make these resources available in the classpath. If you are using maven, you can use for instance the copy-resources goal of the maven resources plugin to copy your resources when running your tests:
<project>
...
<build>
<plugins>
<plugin>
<artifactId>maven-resources-plugin</artifactId>
<version>3.2.0</version>
<executions>
<execution>
<id>copy-resources</id>
<!-- here the phase you need: validate, test-compile... -->
<phase>validate</phase>
<goals>
<goal>copy-resources</goal>
</goals>
<configuration>
<outputDirectory>${basedir}/target/scripts</outputDirectory>
<resources>
<resource>
<!-- Depending on your project, try defining the scripts src location as you consider more appropriate -->
<directory>mainFolder/scripts/postgres</directory>
</resource>
</resources>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
...
</build>
...
</project>
With this setup, you can now define your script location as a classpath resource:
@Sql("/scripts/postgres/script.sql")
Perhaps, though I am not sure about that, you can use the plugin's testResources goal in a similar fashion as well.
If you want to use the scripts on your local environment you can also add the directory mainFolder/scripts/postgres as a dependency. In IntelliJ, you can go to Project Structure -> Modules -> your module -> Dependencies -> + -> Jars or directories -> your script directory -> Classes.
Then you will be able to use
#Sql("/yourScript.sql")
In a DevOps pipeline you would need to add the dedicated directory to your application server classpath.
Keep in mind that you will have to keep unique names for your SQL files, because if you use a structure with multiple directory levels, I think the JVM will load only the first entry. It is not a recommended approach, but it is fast if you only need to generate a report or test something on your local environment.
Another approach would be to specify -Xbootclasspath/a: at runtime. Then you would be able to use @Sql with relative paths as you initially wanted.
E.g.
-Xbootclasspath/a:path-to-mainFolder
then you would be able to use
#Sql("/scripts/postgres/script.sql")

How do I reference my lambda from code in AWS Cloud Development Kit?

import software.amazon.awscdk.services.lambda.Code;
import software.amazon.awscdk.services.lambda.Function;
import software.amazon.awscdk.services.lambda.FunctionProps;
import software.amazon.awscdk.services.lambda.Runtime;

Function helloLambda = new Function(helloStack, "hellocdkworld123", FunctionProps.builder()
.functionName("HelloLambda")
.code(Code.fromAsset("target/cdkhello-0.1.jar")) // <- x ?
.runtime(Runtime.JAVA_8)
.handler("com.myorg.functions.HelloLambda::sayHello") // <- y ?
.build());
There is also a possibility to reference it by S3 bucket. But when I run cdk bootstrap I get a generated bucket with a generated name for the jar file. How would I be able to reference that beforehand from code? Of course I could hard-code the exact bucket + file, but then the purpose of defining it from code is lost, right?
First of all, assuming that the method that you want to execute when the Lambda is invoked is sayHello, from the com.myorg.functions.HelloLambda class, then that part of your solution is correct. The more difficult part is actually accessing the JAR with your Lambda code in it.
NOTE: I've updated my original answer with what I think is a better way to accomplish this. In order to avoid confusion and making this answer too wordy, I've removed the original answer, though much of it is common with this one. I credit this answer for helping to improve this answer.
Pass the path to the dependent resource's JAR to CDK
TL;DR
Create a new property for the full path to your Lambda JAR.
Associate dependency and execution related goals into the package phase of the build.
Update cdk.json to point to the package phase.
Pass the full path via a system property to your CDK code.
Use the System property to pass to Code.asset(...).
Preparation
I've separated out the Lambda and the CDK infrastructure code into separate Maven modules. The intention is that once the Lambda code is compiled and packaged up into an uber JAR (its code plus all of its dependencies' code), the infrastructure module can refer to it as a dependency, passing the full path to the Lambda JAR to the App/Stack class so that it can use it as an asset.
Create a new property for the full path to your Lambda JAR.
In the properties section of your pom.xml, create a new property to refer to your Lambda JAR. Something like this:
<properties>
...
<lambda.jar>${GROUP_ID:ARTIFACT_ID:jar}</lambda.jar>
...
</properties>
Populate a property with the full path to your Lambda dependency's JAR, using the dependency plugin.
<build>
<plugins>
...
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>3.1.1</version>
<executions>
<execution>
<goals>
<goal>properties</goal>
</goals>
<phase>package</phase>
</execution>
</executions>
</plugin>
...
</plugins>
</build>
This associates the properties goal with the package phase. Whenever that phase of the build occurs, the property you've created previously will be populated with the full path to the JAR in your local repository.
Associate dependency and execution related goals into a single phase of the build.
When you create a new CDK Java project, it outputs a file called cdk.json, which points by default to the Maven exec:java goal. In order for your new lambda.jar property to be populated correctly, you need to associate the exec:java goal with the same phase as above.
<build>
<plugins>
...
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>1.6.0</version>
<executions>
<execution>
<goals>
<goal>java</goal>
</goals>
<phase>package</phase>
</execution>
</executions>
<configuration>
<mainClass>com.myorg.TestingApp</mainClass>
</configuration>
</plugin>
...
</plugins>
</build>
Pass the full path via a system property to your CDK code.
In order for your code to get access to the JAR file that you've generated, you need to create a system property (I couldn't get environment variables to work) for your App class. In the configuration section of the exec-maven-plugin (after mainClass), add a system property for your assets directory, something like this:
<systemProperties>
<systemProperty>
<key>lambda.jar</key>
<value>${lambda.jar}</value>
</systemProperty>
</systemProperties>
Update cdk.json to point to the common phase you've used.
The cdk.json of your CDK project should be changed to invoke the package phase. Once done it will look like this:
{
"app": "mvn package"
}
It will cause both the goals to be run in succession, and upon execution the path to your Lambda's JAR will be passed as a system property.
Access the property from your App/Stack code.
Finally, now that the system property is created, you can access it from your code by calling System.getProperty("lambda.jar"). Something like this:
final Code code = Code.fromAsset(System.getProperty("lambda.jar"));
You can then use the code reference wherever needed when defining your Lambda functions.

How to aggregate maven subproject javadoc output without regenerating javadoc

I have a largish multimodule Maven build. I need to generate the javadoc for all of the modules and produce an "aggregated" javadoc result that I can deploy to a box for consumption by users.
I did have this working perfectly fine for quite a while, until I tried implementing a custom taglet with specific features and requirements, which makes this more complicated to produce.
All of the submodules inherit a parent pom that is not the aggregator pom. In that parent pom I define the maven-javadoc-plugin. This is what it looked like before I added the custom taglet:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
<version>2.10.4</version>
<configuration>
<additionalparam>-Xdoclint:none</additionalparam>
<bottom>Unified Service Layer - bottom</bottom>
<doctitle>Unified Service Layer - title</doctitle>
<footer>Unified Service Layer - footer</footer>
<groups></groups>
<header>Unified Service Layer - header</header>
<level>public</level>
<packagesheader>Unified Service Layer - packagesheader</packagesheader>
<top>Unified Server Layer - top</top>
<windowtitle>Unified Service Layer - windowtitle</windowtitle>
</configuration>
<executions>
<execution>
<id>module-javadoc-jar</id>
<phase>package</phase>
<goals>
<goal>jar</goal>
</goals>
<configuration>
<show>protected</show>
<detectLinks>false</detectLinks>
</configuration>
</execution>
<execution>
<id>aggregated-documentation</id>
<phase>package</phase>
<inherited>false</inherited>
<goals>
<goal>aggregate-jar</goal>
</goals>
<configuration>
<show>protected</show>
<detectLinks>false</detectLinks>
</configuration>
</execution>
</executions>
</plugin>
With this, I could build all of the modules, which will generate their own javadoc (which I now know is just a validation step, as aggregate-jar doesn't use this output). I have a separate step I call from Jenkins that runs "javadoc:aggregate-jar" in the root project, which produces the aggregated javadoc jar that I deploy.
Again, this has been working fine until now.
I implemented a custom javadoc taglet which requires getting access to the Class object associated with the source file it is contained within. I got this to work, at least in the individual module builds by adding the following to the configuration above:
<taglets>
<taglet>
<tagletClass>com.att.det.taglet.ValidationConstraintsTaglet</tagletClass>
</taglet>
<taglet>
<tagletClass>com.att.det.taglet.ValidationConstraintsCombinedTaglet</tagletClass>
</taglet>
</taglets>
<tagletArtifacts>
<tagletArtifact>
<groupId>com.att.detsusl.taglets</groupId>
<artifactId>validationJavadocTaglet</artifactId>
<version>0.0.1-SNAPSHOT</version>
</tagletArtifact>
</tagletArtifacts>
In order to have the taglet get access to the class file, I had to add a minimal plugin configuration to each subproject pom.xml, which looks like this:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
<configuration>
<tagletArtifacts combine.children="append">
<tagletArtifact>
<groupId>com.att.detsusl</groupId>
<artifactId>artifact-name</artifactId>
<version>${current.pom.version}</version>
</tagletArtifact>
</tagletArtifacts>
</configuration>
</plugin>
With these minimal changes, I could run the build in each module, generating the javadoc, and examining the generated javadoc output in each module, verifying that it all worked.
However, the problem is, when I run "javadoc:aggregate-jar" in the root project, all of that already built output is ignored. It reruns the javadoc generation for all of the subprojects, also ignoring the appended tagletArtifacts list in each subproject pom.xml file. As a result, I get ClassNotFound errors when it tries to get the class file.
I could "fix" this by putting all of the subproject GAVs into the top-level "tagletArtifacts" list, but I definitely do not want to do that. I liked the ability to specify this in the subproject pom.xml (with combine.children="append") to make it work.
What I need is an overall javadoc package for all of the subprojects, with the taglet able to get access to the class file, without forcing the parent pom to know about all of its subprojects. How can I do this?
I'm facing the same problem with all aggregate goals. I checked the source code of maven-javadoc-plugin and it turns out that aggregation works by traversing the submodules and collecting source files and nothing more, thus completely ignoring any form of configuration specified in the submodules.
During execution every submodule is completely ignored:
source
if ( isAggregator() && !project.isExecutionRoot() ) {
return;
}
And during collection of source files submodules are traversed: source
if ( isAggregator() && project.isExecutionRoot() ) {
for ( MavenProject subProject : reactorProjects ) {
if ( subProject != project ) {
List<String> sourceRoots = getProjectSourceRoots( subProject );
So at the moment, there is no way to do this.
This is not easy to fix either, since the whole plugin works by composing a single call to the actual javadoc tool. If you would like to respect settings in the submodules as well, you'll have to merge their configuration blocks. While this would work in your case with tagletArtifacts, it does not work for all the settings you can specify, e.g. any form of filter, and can therefore not be done in a generic way.

Specify pom properties via a properties file?

Due to the way my build system is designed (RTC Build Engine), I would like to provide maven with property values via a properties file, instead of specifying -Dkey=value for every property.
I found a couple of questions on S.O. (How to set build properties from a file in Maven POM? and How to read an external properties file in Maven) that relate precisely to this question, but they are relatively old, and both require custom plugins to work (in alpha state).
I realize that passing parameters to Maven like this is probably not the best solution, but the other option is specifying everything on the command line via -D settings which is not ideal either.
Furthermore, given that this properties file is only really used by the build engine (and not by the individual developer), I don't truly believe it belongs in the pom. But I cannot find any other mechanism that would allow me to specify a plugin to use - settings.xml does not permit specifying plugins.
Is my only choice in this case to use a plugin and specify it in the project pom?
in the pom you can place...
<properties>
<core-version>1234</core-version>
<lib-version>1234</lib-version>
<build-version>9999</build-version>
<build-date>20150101</build-date>
</properties>
with all the properties you require.
Or you can use...
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>properties-maven-plugin</artifactId>
<version>1.0-alpha-2</version>
<executions>
<execution>
<phase>initialize</phase>
<goals>
<goal>read-project-properties</goal>
</goals>
<configuration>
<files>
<file>dev.properties</file>
</files>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
and the file dev.properties will contain the properties
core-version=1234
lib-version=1234
build-version=9999
build-date=20150101
...
Or... you can inject the properties using a settings.xml file as shown here
You may also find the Maven build number plugin useful... here
The best in such cases is to upgrade to at least Maven 3.2.1 which supports defining such properties on the command line like the following:
mvn -Drevision=1234 -Dchangelist=WhatEver -Dsha1=XXXX clean package
But you can only use the above names.
Excerpt from release notes:
A simple change to prevent Maven from emitting warnings about versions
with property expressions. Allowed property expressions in versions
include ${revision}, ${changelist}, and ${sha1}. These properties can
be set externally, but eventually a mechanism will be created in Maven
where these properties can be injected in a standard way. For example
you may want to glean the current Git revision and inject that value
into ${sha1}. This is by no means a complete solution for continuous
delivery but is a step in the right direction.

Post-process jar after assembly but before installation (to get idempotent builds)

We use Jenkins, which uses MD5 fingerprinting to identify artifacts and whether an artifact has changed since the last build. Unfortunately Maven builds always generate binary-different artifacts.
Therefore I am looking into making Maven generate the same jar artifact for the same set of input files regardless of where and when they were built, which amongst other things means that the entries in the jar file must be sorted - not only in the index, but in the order they are written to the jar file.
After examining maven-jar-plugin, which uses maven-assembly-plugin, my conclusion is that they do not collect all files to be written in memory before writing them all at once, but write one at a time. This means that it may be better to postprocess the generated jar instead of changing the current behavior, so that at that point I can sort the entries, zero the timestamps, etc.
I am unfamiliar with writing Maven plugins, so my question is: how should I write a plugin that Maven can tell where the in-progress artifact jar is located, and how do I hook it up in my pom.xml?
(At first I need this to work for jar files, but war files would be nice too).
As mentioned, this can be done based on something similar to maven-shade-plugin. I went ahead and wrote a simple plugin to add this capability -- see https://github.com/manouti/jar-timestamp-normalize-maven-plugin (available on the Central repo).
The behavior is based on the shade plugin. It consists of a single goal called normalize which can be bound to the package lifecycle phase and configured in the project's POM:
<plugins>
<plugin>
<groupId>com.github.manouti</groupId>
<artifactId>jar-timestamp-normalize-maven-plugin</artifactId>
<version>1.0-SNAPSHOT</version>
<executions>
<execution>
<id>jar-normalize</id>
<goals>
<goal>normalize</goal>
</goals>
<phase>package</phase>
</execution>
</executions>
</plugin>
</plugins>
A few notes about the plugin:
The artifact under build is accessed via project#getArtifact() where project is a org.apache.maven.project.MavenProject.
Normalization consists of mainly three steps:
Setting the last modified time of all Jar entries to a specific timestamp (default value is 1970-01-01 00:00:00AM but can be changed via -Dtimestamp system property).
Reordering (alphabetically) of attributes in the manifest except for Manifest-Version which always comes first.
Removing comments from the pom.properties file which contain a timestamp that causes the Jar to differ from one build to another.
Once invoked, the goal will generate the output file next to the original artifact (named artifactId-version-normalized.jar), i.e. in the project.build.directory directory.
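For illustration only (this is not the plugin's actual code), a rough sketch of the sort-and-fix-timestamps idea using plain java.util.zip (InputStream#transferTo needs Java 9+):
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.Comparator;
import java.util.Enumeration;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;
import java.util.zip.ZipOutputStream;

public class JarNormalizerSketch {
    // Rewrites a jar with entries sorted by name and all timestamps set to one
    // fixed value, so repeated builds of identical content produce identical bytes.
    public static void normalize(Path in, Path out) throws IOException {
        long fixedTime = 0L; // fixed timestamp (the plugin defaults to 1970-01-01 00:00:00)
        try (ZipFile zip = new ZipFile(in.toFile());
             ZipOutputStream zos = new ZipOutputStream(Files.newOutputStream(out))) {
            List<ZipEntry> entries = new ArrayList<>();
            Enumeration<? extends ZipEntry> en = zip.entries();
            while (en.hasMoreElements()) {
                entries.add(en.nextElement());
            }
            entries.sort(Comparator.comparing(ZipEntry::getName));
            for (ZipEntry entry : entries) {
                ZipEntry copy = new ZipEntry(entry.getName());
                copy.setTime(fixedTime);
                zos.putNextEntry(copy);
                if (!entry.isDirectory()) {
                    try (InputStream is = zip.getInputStream(entry)) {
                        is.transferTo(zos);
                    }
                }
                zos.closeEntry();
            }
        }
    }
}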
To create a Maven plugin project:
mvn archetype:generate \
-DgroupId=sample.plugin \
-DartifactId=hello-maven-plugin \
-DarchetypeGroupId=org.apache.maven.archetypes \
-DarchetypeArtifactId=maven-archetype-plugin
Invoking this command will generate a skeleton project with a class called MyMojo.java.
Write your stuff inside the execute() method (see the sketch below), and install the plugin to your repository with mvn clean install.
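For illustration, a minimal sketch of what such a Mojo might look like; the annotation-based plugin API and the goal name sayhi (matching the pom snippet below) are assumptions, and the archetype may instead generate the older javadoc-tag style shown further down:
package sample.plugin;

import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.MojoExecutionException;
import org.apache.maven.plugins.annotations.LifecyclePhase;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;
import org.apache.maven.project.MavenProject;

@Mojo(name = "sayhi", defaultPhase = LifecyclePhase.PACKAGE)
public class MyMojo extends AbstractMojo {

    // Injected Maven project; gives access to the build directory, artifact, etc.
    @Parameter(defaultValue = "${project}", readonly = true, required = true)
    private MavenProject project;

    @Override
    public void execute() throws MojoExecutionException {
        getLog().info("Hello from " + project.getArtifactId());
        // The packaged artifact file (may be null until packaging has happened):
        getLog().info("Artifact file: " + project.getArtifact().getFile());
    }
}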
Then attach its execution to your project, in your project's pom.xml:
<build>
<plugins>
<plugin>
<groupId>sample.plugin</groupId>
<artifactId>hello-maven-plugin</artifactId>
<version>1.0-SNAPSHOT</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>sayhi</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
To access project properties inside your Mojo:
/**
* The Maven project.
*
* @parameter expression="${project}"
* @required
* @readonly
*/
private MavenProject project;
and then
project.getBuild().getDirectory()
and read other properties to determine the packed jar file.
See
maven: guide-java-plugin-development
I agree that creating a custom Maven plugin seems like the better option. I don't know of an existing plugin that provides a solution for what you asked.
The md5 checksum (or sha-1 in my repository) is generated by the install plugin, so it seems like you need to extend this or write a new plugin which works after the install phase.
I have 2 suggestions about this plugin:
1) When thinking simple, this plugin should (see the sketch below):
Read the generated jar.
Extract all entries.
Exclude some entries (e.g. MANIFEST.MF).
Sort the remaining entries.
Compute an md5 for each in memory.
Generate a single md5 from all of those extracted.
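A rough sketch of that idea using plain java.util.zip and MessageDigest (this is not an existing plugin):
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.ArrayList;
import java.util.Comparator;
import java.util.Enumeration;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;

public class StableJarDigest {
    // Digests the sorted entry names and contents, skipping the manifest,
    // to get a content-only fingerprint of the jar.
    public static String md5Of(String jarPath) throws Exception {
        MessageDigest md = MessageDigest.getInstance("MD5");
        try (ZipFile zip = new ZipFile(jarPath)) {
            List<ZipEntry> entries = new ArrayList<>();
            Enumeration<? extends ZipEntry> en = zip.entries();
            while (en.hasMoreElements()) {
                entries.add(en.nextElement());
            }
            entries.sort(Comparator.comparing(ZipEntry::getName));
            for (ZipEntry entry : entries) {
                if (entry.getName().equals("META-INF/MANIFEST.MF")) {
                    continue; // excluded: contains build-time metadata
                }
                md.update(entry.getName().getBytes(StandardCharsets.UTF_8));
                if (!entry.isDirectory()) {
                    try (InputStream is = zip.getInputStream(entry)) {
                        byte[] buf = new byte[8192];
                        int n;
                        while ((n = is.read(buf)) != -1) {
                            md.update(buf, 0, n);
                        }
                    }
                }
            }
        }
        StringBuilder sb = new StringBuilder();
        for (byte b : md.digest()) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }
}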
However, when considering independence from where & when: according to the .class file structure (see the Java class file format), minor and major version entries are held in compiled class files. So if the compiler changes, the .class files will change. In that case we would need a check at the source code level :( So this solution becomes useless if there is no guarantee on the compiler version.
2) As a very dirty but easy solution, this plugin could simply take the md5 of your module's pom.xml file. But you must manually guarantee that each change in your jar is reflected by a minor version (or build number) bump.
Instead of writing your own plugin you can write a Groovy script that is executed by groovy-maven-plugin:
<plugin>
<groupId>org.codehaus.gmaven</groupId>
<artifactId>groovy-maven-plugin</artifactId>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>execute</goal>
</goals>
<configuration>
<source>
import java.util.jar.*
String fileName = '${project.build.directory}/${project.build.finalName}.jar'
println "Editing file ${fileName}"
JarFile file = new JarFile(fileName);
// do your edit
</source>
</configuration>
</execution>
</executions>
</plugin>
