Customized build lifecycle for a Java project

While working on a Maven project, I ran into quite a curious kind of dependency cycle: there are two Maven modules, one of which instruments Java bytecode and at the same time uses assertions (for unit-testing purposes) defined in the other module, which in turn is supposed to be instrumented by the first one.
So it's not just a cycle, it's a cycle spread across Maven phases. I failed to solve it by reorganizing the Maven modules, and I doubt that it is possible in such a case.
A hypothetical solution to this problem might be to reorganize the build lifecycle in the following way:
Compile the first module's sources
Compile the second module's sources
Instrument the second module using the first module's classes
Test the first and second modules
Package them
Install/deploy them
I doubt that Maven was designed for such hacks. What about other tools? Can it be done with Gradle or Ivy? Or maybe it is possible in Maven with some plugin? Or perhaps the problem is a typical one with a more straightforward solution?
PS: please do not suggest extracting the common dependencies into a separate module. If it were that simple, I wouldn't be here.

In my opinion you should look at Gradle for this task, specifically at its multi-project builds. Gradle allows any build script in a multi-project build to access tasks from any other project. Therefore, you can define the tasks you need in the subprojects and call them from the root project in any order you want.

The Gradle proposal was really good, but it was not applicable in my case due to internal obstacles.
For Maven, I just had to accept that separating the instrumentation code and the test assertions into different modules is not possible in my case: they are too tightly coupled at build time. So, instead of trying to separate them at build time, I managed to separate them afterwards.
Note that this is not a nice way of doing things: you may get class-loading exceptions at runtime if the classes you are trying to separate actually use each other. Also, there won't be any transitive resolution between the separated jars - Maven will treat them as completely independent.
So, I managed to get the instrumentation code and the test assertions separated into two jar artifacts with the following steps:
Keep them both in one Maven module.
Do compilation and instrumentation in the module's build phases and testing in the test phase as usual. This is no longer a problem, since everything necessary for these phases is located right in the module.
In the package phase, configure the maven-jar-plugin to produce additional artifacts, each containing a limited set of class files:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jar-plugin</artifactId>
  <executions>
    <execution>
      <id>jar-api</id>
      <goals>
        <goal>jar</goal>
      </goals>
      <configuration>
        <classifier>api</classifier>
        <includes>
          <include>com/example/api/**</include>
        </includes>
      </configuration>
    </execution>
    <execution>
      <id>jar-codegen</id>
      <goals>
        <goal>jar</goal>
      </goals>
      <configuration>
        <classifier>codegen</classifier>
        <skipIfEmpty>false</skipIfEmpty>
        <includes>
          <include>com/example/codegen/**</include>
        </includes>
      </configuration>
    </execution>
    <execution>
      <id>jar-tests</id>
      <goals>
        <goal>jar</goal>
      </goals>
      <configuration>
        <skipIfEmpty>false</skipIfEmpty>
        <classifier>tests</classifier>
        <includes>
          <include>com/example/tests/**</include>
        </includes>
      </configuration>
    </execution>
  </executions>
</plugin>
In the install phase, these additional artifacts are installed into the local repository together with the assembled module.
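Another module can then reference one of the classified artifacts through a regular dependency with a classifier; this is only a sketch, and the coordinates are placeholders:
<dependency>
  <groupId>com.example</groupId>                <!-- placeholder coordinates -->
  <artifactId>instrumented-module</artifactId>
  <version>1.0.0</version>
  <!-- selects the jar produced by the jar-tests execution above -->
  <classifier>tests</classifier>
</dependency>
Keep in mind the caveat above: Maven treats these jars as completely independent, so depending on one of them does not pull in the others.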

Related

ANT to Maven Migration. Maven alternative to build.xml

Background:
I am working on an open-source tool called draw.io, which is based on the Ant build system and uses Java servlets to handle requests. I am supposed to migrate it to Spring Boot while keeping the same front-end files. I put those files in the static folder and tried to build the project. I found that the front-end JS files were not being built (i.e. not converted to app.min.js, which is the main entry point for the front-end files), and none of the JS changes were reflected in that file.
I found that this process was described in build.xml as part of various steps, which is Ant-specific configuration. Now I have to achieve the same in Maven as part of the migration.
How do we convert build.xml to Maven, or what is the Maven alternative for achieving the tasks mentioned in build.xml as part of the build process?
This is the high-level view of build.xml:
I am also providing the link to the build.xml here...
Please provide some direction.
Before migrating to Maven, I hope you understand why you are moving from Ant to Maven.
You should try to find alternative Maven plugins for the relevant Ant tasks. The plugin below might do what you are trying to achieve in Ant:
<plugin>
  <groupId>com.github.blutorange</groupId>
  <artifactId>closure-compiler-maven-plugin</artifactId>
  <version>2.16.0</version>
  <configuration>
    <!-- Base configuration for all executions (bundles) -->
    <baseSourceDir>${project.basedir}/src/main/resources</baseSourceDir>
    <baseTargetDir>${project.build.directory}/generated-resources</baseTargetDir>
  </configuration>
  <executions>
    <!-- Process all files in the "includes" directory individually -->
    <execution>
      <id>default-minify</id>
      <configuration>
        <encoding>UTF-8</encoding>
        <sourceDir>includes</sourceDir>
        <targetDir>includes</targetDir>
        <includes>**/*.js</includes>
        <skipMerge>true</skipMerge>
        <closureLanguageOut>ECMASCRIPT5</closureLanguageOut>
      </configuration>
      <goals>
        <goal>minify</goal>
      </goals>
      <phase>generate-resources</phase>
    </execution>
  </executions>
</plugin>
More details about the plugin: closure-compiler-maven-plugin
In a few cases during my Ant-to-Maven migration, I came across custom tasks for which I was not able to find appropriate plugins.
In those cases I used the maven-antrun-plugin, which lets you keep existing Ant tasks in a Maven build.
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <version>1.8</version>
  <executions>
    ...
  </executions>
</plugin>
More details about how to use the maven-antrun-plugin: See this tutorial.
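For illustration, a filled-in execution could look roughly like the sketch below; the execution id, phase, and the echo task are placeholders, and in practice you would paste the relevant tasks from the existing build.xml into the target element:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <version>1.8</version>
  <executions>
    <execution>
      <id>run-legacy-ant-step</id>                <!-- hypothetical id -->
      <phase>generate-resources</phase>           <!-- pick the phase matching the original Ant step -->
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <!-- existing Ant tasks go here, for example: -->
          <echo message="running legacy Ant step"/>
        </target>
      </configuration>
    </execution>
  </executions>
</plugin>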
For Maven, you need a pom.xml, in which you define and configure the plugins you need. If you have a specific procedure written in Ant that you want to reuse, you can call it with the Maven AntRun Plugin.
Generally, Maven is very different from Ant. You don't write procedural code; instead, you configure plugins that run within a lifecycle.
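As a rough, minimal sketch (the coordinates are placeholders), such a pom.xml only needs the basic project coordinates plus whatever plugins you configure in the build section:
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>                 <!-- placeholder coordinates -->
  <artifactId>drawio-migrated</artifactId>
  <version>1.0.0-SNAPSHOT</version>
  <packaging>war</packaging>

  <build>
    <plugins>
      <!-- plugin configurations such as the ones shown above go here -->
    </plugins>
  </build>
</project>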

Maven: How to have jar-with-dependencies exclude "provided" dependencies?

I have a Maven Scala project that will be deployed in a container, so I mark several of the dependencies with scope provided, meaning those dependencies are used for compiling but are not taken into account for transitive resolution, as they are "provided at runtime". However, when I run the following command, it produces the intended jar with dependencies, but it also includes the dependencies that were marked as provided.
mvn clean install assembly:assembly -DdescriptorId=jar-with-dependencies -DskipTests
I tried existing answers to this problem, e.g. Excluding “provided” dependencies from Maven assembly, but for some reason they produce an incorrect choice of dependencies and even leave out the main code. In this question I'd like to find a cleaner, more up-to-date solution to this problem ... is there one?
You may be better off with a different Maven plugin. See Difference between maven plugins (assembly-plugin, jar-plugin, shade-plugin). Shade would probably suit you best in this case. What you are looking to create is referred to as an uber-jar.
Regarding Shade, from the Maven website:
This plugin provides the capability to package the artifact in an uber-jar, including its dependencies and to shade - i.e. rename - the packages of some of the dependencies.
The goals for the Shade Plugin are bound to the package phase in the build lifecycle.
Configuring Your Shade Plugin:
<project>
  ...
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>3.0.0</version>
        <configuration>
          <!-- put your configurations here -->
        </configuration>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
  ...
</project>
Note that the default behaviour replaces your project's main artifact with the shaded version. Need both? Look here: Attaching the Shaded Artifact
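If you need both, a small addition to the configuration keeps the original artifact and attaches the uber-jar under a classifier; this is a sketch, and the classifier name is just an example:
<configuration>
  <!-- keep the original jar and attach the uber-jar with a classifier -->
  <shadedArtifactAttached>true</shadedArtifactAttached>
  <shadedClassifierName>uber</shadedClassifierName>
</configuration>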
Merging several jars at once is not necessarily straightforward, so Shade has the concept of Resource Transformers (the link also has more samples).
Aggregating classes/resources from several artifacts into one uber JAR is straight forward as long as there is no overlap. Otherwise, some kind of logic to merge resources from several JARs is required. This is where resource transformers kick in.
The project site is actually quite good. There are lots of varied examples.
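For instance, a common setup (sketched here; the main class name is a placeholder) merges META-INF/services entries from all jars and sets the Main-Class of the resulting uber-jar:
<configuration>
  <transformers>
    <!-- merge META-INF/services entries from all jars -->
    <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
    <!-- set the entry point of the uber-jar; the class name is a placeholder -->
    <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
      <mainClass>com.example.Main</mainClass>
    </transformer>
  </transformers>
</configuration>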

Maven + Tycho, adding Maven dependencies

We have an Eclipse Plugin which we build using Maven and Tycho. Currently
however, we still provide all project dependencies through a bunch of manually
added JAR files and not by Maven. This is due to the following reasons: (1) The
dependencies are not available through a standard Eclipse update site (at least
not in a current version), (2) the dependencies are not available as bundles.
The biggest part of these dependencies are the Selenium libraries (API, Remote,
browser-specific libs and their transitive dependencies, such as Guava, etc.)
I've wasted hours trying to pull those dependencies during our Maven build.
Following this SO question, I tried the p2-maven-plugin, created an update
site with our dependencies which I added to my Eclipse target platform. However,
at runtime, classes that are referenced across different JARs could not be
loaded (I assume, from my very limited OSGi knowledge, because some
necessary information was missing in the MANIFEST.MF files). Here's an example
of the issue, when trying to create a RemoteWebDriver, which uses the
DesiredCapabilities class (both classes in different bundles):
Exception in thread "Thread-8" java.lang.NoClassDefFoundError: org/openqa/selenium/remote/DesiredCapabilities
at org.openqa.selenium.remote.RemoteWebDriver.startSession(RemoteWebDriver.java:243)
at org.openqa.selenium.remote.RemoteWebDriver.<init>(RemoteWebDriver.java:126)
at org.openqa.selenium.remote.RemoteWebDriver.<init>(RemoteWebDriver.java:153)
…
Caused by: java.lang.ClassNotFoundException: org.openqa.selenium.remote.DesiredCapabilities cannot be found by org.seleniumhq.selenium.remote-driver_2.45.0
at org.eclipse.osgi.internal.loader.BundleLoader.findClassInternal(BundleLoader.java:439)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:352)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:344)
at org.eclipse.osgi.internal.loader.ModuleClassLoader.loadClass(ModuleClassLoader.java:160)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 7 more
Is there anything I still need to take care of, when using the p2-maven-plugin? The relevant parts of the pom.xml looked like this:
<plugin>
  <groupId>org.reficio</groupId>
  <artifactId>p2-maven-plugin</artifactId>
  <version>1.1.1-SNAPSHOT</version>
  <executions>
    <execution>
      <id>default-cli</id>
      <configuration>
        <artifacts>
          <artifact>
            <id>org.seleniumhq.selenium:selenium-remote-driver:2.45.0</id>
          </artifact>
        </artifacts>
      </configuration>
    </execution>
  </executions>
</plugin>
I couldn't get it to work, so we're now using the maven-dependency-plugin with the copy-dependencies goal, which we execute during the Maven initialize phase to pull in all the necessary dependencies (contrary to my initial feeling, this can be combined with a pom.xml that uses the eclipse-plugin packaging and the "manifest first" approach). The relevant part looks like this:
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-dependency-plugin</artifactId>
      <version>2.10</version>
      <executions>
        <execution>
          <id>copy-dependencies</id>
          <phase>initialize</phase>
          <goals>
            <goal>copy-dependencies</goal>
          </goals>
          <configuration>
            <includeScope>runtime</includeScope>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
The Maven dependencies are then copied to target/dependency.
The only small issue: the Bundle-ClassPath in MANIFEST.MF needs to be updated manually whenever the name of a JAR file changes as Maven dependencies are updated (e.g. commons-io-2.4.jar becomes commons-io-2.5.jar).
[edit] Revisiting this answer with regard to the last sentence above: the version numbers can be conveniently stripped with the following option: <stripVersion>true</stripVersion>. This means the library above will be renamed to commons-io.jar, so no paths need to be updated when a version number changes.
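Applied to the configuration above, that would look roughly like this (sketch):
<configuration>
  <includeScope>runtime</includeScope>
  <!-- copy e.g. commons-io-2.5.jar as commons-io.jar -->
  <stripVersion>true</stripVersion>
</configuration>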
Another possibility:
Some jar files may be corrupt (if you're using Eclipse, this is commonplace; see hibernate-commons-annotations-4.0.1.Final.jar; invalid LOC header (bad signature)?). To check this possibility, try opening the jar manually to see if it's okay.
I also build an Eclipse plugin with Maven and Tycho, and I had the same problem: the bundles org.eclipse.team.svn.core and org.eclipse.team.svn.ui are not available through a standard Eclipse update site.
Maybe you can try this to solve this kind of problem:
In Dependencies, find the box Automated Management of Dependencies.
Add the wanted plugin using Add...
Choose Analyze code and add dependencies to the MANIFEST.MF via: Import-Package.
Click on Add Dependencies so that the required packages appear in the nearby Imported Packages box.
Then you can run the Maven build.

How to check access across project boundaries in Maven projects

I have a set of Maven projects and I'd like to define access rules.
For example, the projects Database and Cache may only be accessed by the project DataLayer, but not by the project UiLayer. I'm speaking in terms of Maven projects, but package-level access verification may also work, as long as it's easy to integrate into Maven projects.
I've looked at Macker, which has a nice set of features such as access control between Java packages, style checking, etc., but I've been having a hard time tying it into a set of Maven projects.
There's the macker-maven-plugin, which is still under development, and I've been able to make it work for me, but I'm afraid it's not going to serve me well.
This plugin runs verifications on all of a project's classes.
This means that I'd have to have a macker-rules.xml defining access rules in each and every Maven project from now on in order to make sure the rules are not broken. That looks like a maintenance nightmare.
So, did I miss something in the usage of the macker-maven-plugin? Perhaps I'm not using it correctly.
I have no experience with JDepend, but from a short read it looks like a thin version of Macker. There is a JDepend Maven plugin, but its functionality is merely generating reports about usage and statistics; what I really need is something else, an access check that fails the build when a rule is violated.
Can someone suggest a better alternative for project access checks or package access checks for maven projects?
Thanks
I think you are looking for the banned dependencies rule of the maven-enforcer-plugin:
<plugins>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-enforcer-plugin</artifactId>
    <version>1.0</version>
    <executions>
      <execution>
        <id>enforce-banned-dependencies</id>
        <goals>
          <goal>enforce</goal>
        </goals>
        <configuration>
          <rules>
            <bannedDependencies>
              <excludes>
                <exclude>org.apache.maven</exclude>
                <exclude>org.apache.maven:badArtifact</exclude>
                <exclude>*:badArtifact</exclude>
              </excludes>
              <includes>
                <!-- only 1.0 of badArtifact is allowed -->
                <include>org.apache.maven:badArtifact:1.0</include>
              </includes>
            </bannedDependencies>
          </rules>
          <fail>true</fail>
        </configuration>
      </execution>
    </executions>
  </plugin>
</plugins>
If you split your Maven project into subprojects and structure the APIs right, it might be possible to implement your access constraints as a side-effect of the subproject dependencies.
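As a sketch (module names from the question above; the group id and version are assumptions), UiLayer would declare a dependency only on DataLayer, while only DataLayer declares Database and Cache:
<!-- DataLayer/pom.xml: the only module that declares Database and Cache -->
<dependencies>
  <dependency>
    <groupId>com.example</groupId>   <!-- assumed group id -->
    <artifactId>Database</artifactId>
    <version>${project.version}</version>
  </dependency>
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>Cache</artifactId>
    <version>${project.version}</version>
  </dependency>
</dependencies>

<!-- UiLayer/pom.xml: declares only DataLayer -->
<dependencies>
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>DataLayer</artifactId>
    <version>${project.version}</version>
  </dependency>
</dependencies>
Keep in mind that compile-scope dependencies still propagate transitively, so Database classes would remain reachable from UiLayer's classpath; combining this layout with the banned-dependencies rule above closes that gap.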

In Maven2, is there a way to scope a dependency to "package only" and keep it off the test classpath?

Setup
I'm packaging a WAR with a number of legacy jar dependencies in it (for the sake of keeping this simple, they cannot possibly be altered or excluded from the deployed WAR).
Issue
Including either or both of two of these jars causes unavoidable errors at test time. If I exclude the dependencies altogether, the tests pass happily, but the WAR will lack real-world runtime classes it needs.
Hope
Maven2 offers compile, test, runtime, system, and provided scopes. Sadly, none of these puts a dependency into the assembly while keeping it off the test classpath. My hope is that I'm missing some obvious way to handle this case entirely within the dependency-management feature.
Fear
I'll have to use the assembly plugin to copy these problem jars into the target. I don't want to skirt the dependency-management system by copying jars directly into the target, as I don't want to manage these jars outside the internal repository.
Thoughts? Alternatives?
Of course it figures that I'd come across a potential solution to this moments after posting the question. It appears that the copy goal of the dependency plugin may handle this. Going to try this out now: http://maven.apache.org/plugins/maven-dependency-plugin/usage.html
Edit: Turns out that this worked fine for my needs, snippet below:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <phase>prepare-package</phase>
      <goals>
        <goal>copy</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <artifactItems>
      <artifactItem>
        <groupId>group</groupId>
        <artifactId>artifact</artifactId>
        <version>version</version>
        <type>jar</type>
        <outputDirectory>${project.build.directory}/${project.build.finalName}/WEB-INF/lib</outputDirectory>
      </artifactItem>
    </artifactItems>
  </configuration>
</plugin>
Break the test classes out of the jar file into a separate jar which is only for tests, and add an exclusion to the dependency on the jar with the legacy dependencies.
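As a sketch of the second part (all coordinates here are placeholders), the test-only module would then exclude the problematic jars from that dependency:
<dependency>
  <groupId>com.example</groupId>             <!-- placeholder coordinates -->
  <artifactId>legacy-webapp</artifactId>
  <version>1.0.0</version>
  <exclusions>
    <exclusion>
      <groupId>legacy.group</groupId>        <!-- one of the problematic legacy jars -->
      <artifactId>problem-jar</artifactId>
    </exclusion>
  </exclusions>
</dependency>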
