I have a Maven Scala project that will be deployed on a container, and I therefore mark several of the dependencies with scope provided, meaning those dependencies are used for compiling but not taken into account for transitive resolution, as they are "provided at runtime". However, when I run the following command, it produces the intended jar with dependencies, but it also includes the dependencies that were marked as provided.
mvn clean install assembly:assembly -DdescriptorId=jar-with-dependencies -DskipTests
I tried existing answers to this problem, e.g. Excluding "provided" dependencies from Maven assembly, but for some reason they produce an incorrect selection of dependencies and even leave out the main code. In this question I'd like to find a cleaner, more up-to-date solution to this problem ... is there one?
You may be better off with a different Maven plugin. See Difference between maven plugins (assembly-plugins, jar-plugins, shaded-plugins). Shade would probably suit you best in this case, since by default it only bundles compile- and runtime-scoped dependencies, leaving provided ones out. What you are looking to create is referred to as an uber-jar.
Regarding Shade, from the Maven website:
This plugin provides the capability to package the artifact in an uber-jar, including its dependencies and to shade - i.e. rename - the packages of some of the dependencies.
The goals for the Shade Plugin are bound to the package phase in the build lifecycle.
Configuring Your Shade Plugin:
<project>
  ...
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>3.0.0</version>
        <configuration>
          <!-- put your configurations here -->
        </configuration>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
  ...
</project>
Note that the default implementation replaces your project's artifact with the shaded version. If you need both, look here: Attaching the Shaded Artifact.
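In short, keeping the original jar and attaching the shaded one alongside it comes down to two configuration options (the classifier name here is an arbitrary choice):

<configuration>
  <!-- attach the uber-jar as an additional artifact instead of replacing the main jar -->
  <shadedArtifactAttached>true</shadedArtifactAttached>
  <shadedClassifierName>jar-with-dependencies</shadedClassifierName>
</configuration>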
Merging several JARs at once is not necessarily simple, so Shade has the concept of Resource Transformers (the link also has more samples).
Aggregating classes/resources from several artifacts into one uber JAR is straightforward as long as there is no overlap. Otherwise, some kind of logic to merge resources from several JARs is required. This is where resource transformers kick in.
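For instance, overlapping META-INF/services files, which would otherwise silently overwrite each other, can be merged with the ServicesResourceTransformer:

<configuration>
  <transformers>
    <!-- merge META-INF/services/* entries from all JARs instead of keeping only one -->
    <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
  </transformers>
</configuration>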
The project site is actually quite good. There are lots of varied examples.
Related
I have a project with a parent directory containing the following in its pom.xml:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <version>2.20.1</version>
  <executions>
    <execution>
      <goals>
        <goal>integration-test</goal>
        <goal>verify</goal>
      </goals>
    </execution>
  </executions>
</plugin>
and
<modules>
  <module>submodule</module>
</modules>
The submodule subdirectory again contains a pom.xml with a reference to its parent artifact. The subdirectory also contains a number of integration tests, which run fine if I move the failsafe plugin into the submodule's pom.xml and then invoke mvn verify from the parent directory. However, this does not work with the current (preferred) setup: there are no errors, the tests are simply not executed.
I've tried adding the submodule artifact to dependenciesToScan in the failsafe plugin's configuration, but that did not solve the problem. Do I need to add the submodule as a dependency in the parent pom.xml? That results in a "dependency is referencing itself" error while processing the pom.xml.
Help would be appreciated.
EDIT: I have figured it out. Someone else working on the project had wrapped the build section in a profile section; I did not realise this at first because the whole file is rather large and unwieldy, and I had overlooked the corresponding git commit. By undoing that change and following the instructions in the link posted by Gerald Broser I managed to solve my problem (I suppose just executing the respective profile would have also done it, but that change was uncalled for anyway).
See Maven Failsafe Plugin / Usage / Usage in multi-module projects:
When you are defining a shared definition of the Failsafe Plugin in a parent pom, it is considered best practice to define an execution id in order to allow child projects to override the configuration.
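A minimal sketch of what that looks like, based on the plugin configuration from the question (the execution id integration-tests is an arbitrary choice):

<!-- in the parent pom.xml: shared Failsafe definition with an explicit execution id,
     so that child modules can override it -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <version>2.20.1</version>
  <executions>
    <execution>
      <id>integration-tests</id>
      <goals>
        <goal>integration-test</goal>
        <goal>verify</goal>
      </goals>
    </execution>
  </executions>
</plugin>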
Try calling:
mvn clean verify -P <module>
Working on one Maven project, I faced quite a curious kind of dependency cycle: there are two Maven modules, one of which instruments Java bytecode and at the same time uses assertions (for unit-testing purposes) defined in another module, which is in turn supposed to be instrumented by the first one.
So it's not just a cycle, it's a cycle spread between Maven phases. I failed to solve it by reorganizing the Maven modules, and I doubt that it is possible in such a case.
A hypothetical solution to this problem might be to reorganize the build lifecycle in the following way:
Compile the first module's sources
Compile the second module's sources
Instrument the second module using the 1st module's classes
Test first and second modules
Package them
Install/deploy them
I doubt that Maven was designed for such hacks. What about other tools? Can it be done with Gradle or Ivy? Or maybe it is possible in Maven through some plugin? Or perhaps the problem is typical and has a more straightforward solution?
PS: please do not suggest to outline common dependencies to a separate module. If it was so simple, I wouldn't be here.
In my opinion you should look at Gradle for this task, specifically at its multi-project builds. Gradle allows you to access tasks from any project of a multi-project build from any build script. Therefore, you can define the tasks you need in the subprojects and call them from the root project in any order you want.
The Gradle proposal was really good, but it was not applicable in my case due to internal obstacles.
For Maven, I just had to admit that separating the instrumentation code and the test assertions into different modules is not possible in my case: they are too coupled at build time. So, instead of trying to separate them at build time, I managed to separate them afterwards.
Note that this solution is not a nice way of doing things: you may get class loading exceptions at runtime if the classes you are trying to separate actually use each other. Also, there won't be any transitive resolution between the separated jars - Maven will treat them as completely independent.
So, I managed to get the instrumentation code and the test assertions separated into two jar artifacts by following this sequence of steps:
Let them both be in one Maven module.
Do compilation and instrumentation in the module's build phase and testing in the test phase, as usual. This is no longer a problem, since everything necessary for these phases is located right in the module.
At the package phase, configure the Maven JAR Plugin to collect additional artifacts, each with a limited set of class files:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jar-plugin</artifactId>
  <executions>
    <execution>
      <id>jar-api</id>
      <goals>
        <goal>jar</goal>
      </goals>
      <configuration>
        <classifier>api</classifier>
        <includes>
          <include>com/example/api/**</include>
        </includes>
      </configuration>
    </execution>
    <execution>
      <id>jar-codegen</id>
      <goals>
        <goal>jar</goal>
      </goals>
      <configuration>
        <classifier>codegen</classifier>
        <skipIfEmpty>false</skipIfEmpty>
        <includes>
          <include>com/example/codegen/**</include>
        </includes>
      </configuration>
    </execution>
    <execution>
      <id>jar-tests</id>
      <goals>
        <goal>jar</goal>
      </goals>
      <configuration>
        <skipIfEmpty>false</skipIfEmpty>
        <classifier>tests</classifier>
        <includes>
          <include>com/example/tests/**</include>
        </includes>
      </configuration>
    </execution>
  </executions>
</plugin>
At the install phase, these additional artifacts are installed into the local repository together with the assembled module.
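A dependent project can then reference one of the classified artifacts explicitly; for example (groupId, artifactId and version here are illustrative):

<dependency>
  <groupId>com.example</groupId>
  <artifactId>instrumented-module</artifactId>
  <version>1.0.0</version>
  <!-- pick one of the classified jars produced above -->
  <classifier>api</classifier>
</dependency>

Keep in mind the caveat above: Maven treats these separated jars as completely independent, so no transitive resolution happens between them.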
I have a multi-module project:
parent
|____ module1
|____ module2
|____ module3
I want to generate aggregated Javadoc for all the modules. This works by using something like this in the parent's pom.xml (which has pom packaging and defines the child modules):
<!-- ... -->
<modules>
  <module>module1</module>
  <module>module2</module>
  <module>module3</module>
</modules>
<!-- ... -->
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-javadoc-plugin</artifactId>
      <version>2.10.3</version>
      <executions>
        <execution>
          <id>aggregate</id>
          <goals>
            <goal>aggregate</goal>
          </goals>
          <phase>prepare-package</phase>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
The aggregated Javadoc is generated correctly; that works well!
But the problem is that I need to include the generated aggregated Javadoc in the module3 final .jar! In other words, I want the resulting module3.jar to contain a copy of that generated aggregated Javadoc of all the modules!
That's why I try to run the maven-javadoc-plugin at the prepare-package phase in the parent project: I'd like the Javadoc to be generated before the packaging of module3 is done, so I can include it (by copying it with a maven-antrun-plugin execution, for example).
But, and here's my problem, it seems that even when I use the prepare-package phase, the aggregated Javadoc has not been generated yet when the package phase runs for the module3 artifact! It's as if the parent's plugin runs after all the children's plugins, even though it is declared with a phase that is supposed to run before...
Any idea on how I could generate the aggregated Javadoc for all the modules before the package phase of module3, so I can include that Javadoc?
I wish someone finds a better solution, but here's the workaround I did, if it can help someone one day:
I do not let Maven generate the aggregated Javadoc by itself. I prevent that by wrapping the maven-javadoc-plugin in a <profile>. I gave mine the id "aggregatedJavadoc".
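In the parent pom.xml, that wrapping looks like this (simply the plugin configuration from the question, surrounded by a profile):

<profiles>
  <profile>
    <id>aggregatedJavadoc</id>
    <build>
      <plugins>
        <!-- only runs when the profile is explicitly activated -->
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-javadoc-plugin</artifactId>
          <version>2.10.3</version>
          <executions>
            <execution>
              <id>aggregate</id>
              <goals>
                <goal>aggregate</goal>
              </goals>
              <phase>prepare-package</phase>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>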
Then, in module3's pom.xml, I added an exec-maven-plugin execution that, ultimately, programmatically calls the aggregate goal, in the "aggregatedJavadoc" profile, on the parent module, at the prepare-package phase! Then I copy the resulting Javadoc to the build output folder of the module3 module, so it is included in the resulting .jar.
The script that is called by the exec-maven-plugin is custom in my case, but many solutions can be used to programmatically call the target Maven goal: Apache Maven Invoker, for example.
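For illustration, a minimal sketch of such an exec-maven-plugin setup in module3's pom.xml, calling plain mvn rather than a custom script; the relative path to the parent, the profile name, and the plugin version are assumptions:

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <version>1.6.0</version>
  <executions>
    <execution>
      <id>aggregate-javadoc</id>
      <phase>prepare-package</phase>
      <goals>
        <goal>exec</goal>
      </goals>
      <configuration>
        <executable>mvn</executable>
        <!-- assumes the parent pom lives one directory up -->
        <workingDirectory>${project.basedir}/..</workingDirectory>
        <arguments>
          <argument>-PaggregatedJavadoc</argument>
          <argument>javadoc:aggregate</argument>
        </arguments>
      </configuration>
    </execution>
  </executions>
</plugin>

The generated Javadoc then still has to be copied into module3's build output folder (e.g. with maven-antrun-plugin or maven-resources-plugin) before the jar is assembled.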
I have a number of Maven projects being built by my Jenkins server. These projects have dependencies on each other, e.g.
service-base -> java-base -> pom-base
In other words, the Maven project service-base depends on the Maven project java-base. Naturally, my POM files look like this:
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>my.com</groupId>
  <artifactId>service-base</artifactId>
  <dependencies>
    <dependency>
      <groupId>my.com</groupId>
      <artifactId>java-base</artifactId>
      <version>1.0.0</version>
    </dependency>
  </dependencies>
</project>
The issue is that none of my Maven projects have "releases" per se, since I'm using continuous integration to release my changes. Currently, I allow artifact overwriting in my Maven repo and keep all of my versions at 1.0.0, because I release my packages many times a day and changing the versions in all the POM files each time I submit a new package version would be impractical.
Ideally, what I would like is for Jenkins to generate a new version, e.g. 1.0.{BUILD_NUMBER}, and then update the dependencies all the way up the dependency tree.
Question: Is this possible? Or does anyone have any other solutions to versioning?
Here is how I achieved this, using Maven profiles, Maven classifiers and Jenkins parameterized builds.
You can define a jenkins profile (or whatever name you prefer) in the pom of the concerned projects. This profile will not be active by default, so your local builds will keep working as usual. However, this profile will be activated in the Jenkins builds (via the -Pjenkins option on the Maven execution).
Here is what this profile looks like in the project at the top of the hierarchy:
<profiles>
  <profile>
    <id>jenkins</id>
    <properties>
      <groupId>${project.groupId}</groupId>
      <artifactId>${project.artifactId}</artifactId>
      <version>${project.version}</version>
      <packaging>${project.packaging}</packaging>
    </properties>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-jar-plugin</artifactId>
          <version>2.5</version>
          <executions>
            <execution>
              <id>generate-default-version</id>
              <phase>package</phase>
              <goals>
                <goal>jar</goal>
              </goals>
              <configuration>
                <classifier>${BUILD_NUMBER}</classifier>
              </configuration>
            </execution>
          </executions>
        </plugin>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-install-plugin</artifactId>
          <version>2.4</version>
          <executions>
            <execution>
              <id>install-default-version</id>
              <phase>install</phase>
              <goals>
                <goal>install-file</goal>
              </goals>
              <configuration>
                <file>${project.build.directory}/${project.build.finalName}-${BUILD_NUMBER}.${project.packaging}</file>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
What is the profile doing?
We are using the Maven Jar Plugin to generate, at the package phase, yet another artefact for the same project, so the project will create the normal jar plus another jar that has the BUILD_NUMBER as its classifier (i.e. myproject-1.0.jar and myproject-1.0-4567.jar)
We are also using the Maven Install Plugin to install the additional artefact (the myproject-1.0-4567.jar) into the local Maven cache (so it will be visible to other dependent projects)
We need to define some properties for the Install Plugin, otherwise the install-file will not work
Hence, when your Jenkins build executes the following:
mvn clean install -Pjenkins -DBUILD_NUMBER=${BUILD_NUMBER}
Jenkins will actually pass its BUILD_NUMBER to Maven, which will use it as defined in the jenkins profile and create (and install) an additional artefact for us using it as classifier.
Fine, now we have a dynamically created artefact using the Jenkins build number and available for other projects/builds.
But how can other projects use it?
We define another profile in the dependent projects (again called jenkins, for consistency) and re-define the dependency we now need at runtime:
<profiles>
  <profile>
    <id>jenkins</id>
    <dependencies>
      <dependency>
        <groupId>com.sample</groupId>
        <artifactId>test</artifactId>
        <version>1.1.0</version>
        <classifier>${BUILD_NUMBER}</classifier>
      </dependency>
    </dependencies>
  </profile>
</profiles>
Note: we are actually overriding a dependency as part of the profile, saying we want that specific classifier for it. Which classifier? The BUILD_NUMBER classifier, which will be available in the local Maven cache of the Jenkins server because it was installed by the previous build.
But how can the dependent build know, dynamically, which build number and hence which classifier to use?
By using Jenkins parameterized builds and the Jenkins Parameterized Trigger plugin.
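For example, if the upstream job passes its build number to the downstream job as a parameter named UPSTREAM_BUILD_NUMBER (the parameter name is a free choice and hypothetical here), the downstream job's Maven step would be:
mvn clean install -Pjenkins -DBUILD_NUMBER=${UPSTREAM_BUILD_NUMBER}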
So, to summarize:
Provider project defines the profile to create additional classifier
Consumer project defines the profile to use as dependency a specific classifier
If a project is Provider for others and Consumer of others, it can then merge the two approaches above in the same profile
The first Jenkins build activates this specific profile and passes its build number to Maven
The downstream Jenkins builds are triggered by the first one, which passes them its build number via the Parameterized Trigger plugin
Each downstream build would then resolve the classifier specified by the parameter and, if required, also create yet another classifier for its own build (according to its profile)
Using this approach, your local builds will keep working as usual with no classifier involved, while the Jenkins builds will use an additional classifier shared across them.
We have an Eclipse plugin which we build using Maven and Tycho. Currently, however, we still provide all project dependencies through a bunch of manually added JAR files and not through Maven. This is due to the following reasons: (1) the dependencies are not available through a standard Eclipse update site (at least not in a current version), and (2) the dependencies are not available as bundles. The biggest part of these dependencies are the Selenium libraries (API, Remote, browser-specific libs and their transitive dependencies, such as Guava). I've wasted hours trying to pull those dependencies during our Maven build.
Following this SO question, I tried the p2-maven-plugin and created an update site with our dependencies, which I added to my Eclipse target platform. However, at runtime, classes which are referenced across different JARs could not be loaded (I assume, from my very limited OSGi knowledge, because some necessary information was missing in the MANIFEST.MF files). Here's an example of the issue, when trying to create a RemoteWebDriver, which uses the DesiredCapabilities class (both classes are in different bundles):
Exception in thread "Thread-8" java.lang.NoClassDefFoundError: org/openqa/selenium/remote/DesiredCapabilities
at org.openqa.selenium.remote.RemoteWebDriver.startSession(RemoteWebDriver.java:243)
at org.openqa.selenium.remote.RemoteWebDriver.<init>(RemoteWebDriver.java:126)
at org.openqa.selenium.remote.RemoteWebDriver.<init>(RemoteWebDriver.java:153)
…
Caused by: java.lang.ClassNotFoundException: org.openqa.selenium.remote.DesiredCapabilities cannot be found by org.seleniumhq.selenium.remote-driver_2.45.0
at org.eclipse.osgi.internal.loader.BundleLoader.findClassInternal(BundleLoader.java:439)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:352)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:344)
at org.eclipse.osgi.internal.loader.ModuleClassLoader.loadClass(ModuleClassLoader.java:160)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 7 more
Is there anything I still need to take care of when using the p2-maven-plugin? The relevant parts of the pom.xml looked like this:
<plugin>
  <groupId>org.reficio</groupId>
  <artifactId>p2-maven-plugin</artifactId>
  <version>1.1.1-SNAPSHOT</version>
  <executions>
    <execution>
      <id>default-cli</id>
      <configuration>
        <artifacts>
          <artifact>
            <id>org.seleniumhq.selenium:selenium-remote-driver:2.45.0</id>
          </artifact>
        </artifacts>
      </configuration>
    </execution>
  </executions>
</plugin>
I couldn't get it to work, so we're now using the maven-dependency-plugin with its copy-dependencies goal, which we execute during the Maven initialize phase to pull in all necessary dependencies (contrary to my initial feeling, this can be combined with a pom.xml using the eclipse-plugin packaging and the "manifest first" approach). The relevant part looks like this:
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-dependency-plugin</artifactId>
      <version>2.10</version>
      <executions>
        <execution>
          <id>copy-dependencies</id>
          <phase>initialize</phase>
          <goals>
            <goal>copy-dependencies</goal>
          </goals>
          <configuration>
            <includeScope>runtime</includeScope>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
The Maven dependencies are then copied to target/dependency.
One small remaining issue: the Bundle-ClassPath in the MANIFEST.MF needs to be manually updated whenever the name of a JAR file changes while updating Maven dependencies (e.g. commons-io-2.4.jar becomes commons-io-2.5.jar).
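For illustration, such a Bundle-ClassPath entry might look like this (the jar names here are hypothetical):

Bundle-ClassPath: .,
 target/dependency/commons-io-2.4.jar,
 target/dependency/guava-18.0.jar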
[edit] Revisiting this answer with regard to the issue above: the version numbers can be conveniently stripped through the following option: <stripVersion>true</stripVersion>. This means the above library will be renamed to commons-io.jar, and thus no paths need to be updated when a version number changes.
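In the copy-dependencies configuration shown above, that option sits next to includeScope:

<configuration>
  <includeScope>runtime</includeScope>
  <!-- copy commons-io-2.4.jar as commons-io.jar, etc. -->
  <stripVersion>true</stripVersion>
</configuration>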
Another possibility:
Some JAR files may be broken (if you're using Eclipse, this is commonplace; see hibernate-commons-annotations-4.0.1.Final.jar; invalid LOC header (bad signature)?). To check this possibility, try manually opening the JAR to see if it's okay.
I also build an Eclipse plugin with Maven and Tycho, and I have the same problem: the bundles org.eclipse.team.svn.core and org.eclipse.team.svn.ui are not available through a standard Eclipse update site.
Maybe you can try this to solve this kind of problem:
In Dependencies, find the box Automated Management of Dependencies.
Add the wanted plugin using Add...
Choose Analyze code and add dependencies to the MANIFEST.MF via: Import-Package.
Click on Add Dependencies so that the required packages appear in the nearby Imported Packages box.
Then you can run the Maven build.