Can Maven install the POM when building an artifact with a classifier?

I have a Maven project, which uses JAR packaging. When I run the install phase, it will install both Project-1.0.jar and Project-1.0.pom files in my local repository.
Now I would like the JAR to be built with a classifier. This is easy enough: I just add the classifier to my jar plugin configuration:
<plugin>
  <artifactId>maven-jar-plugin</artifactId>
  <configuration>
    <classifier>whatever</classifier>
    [...]
  </configuration>
</plugin>
Now, this works in that it installs Project-1.0-whatever.jar in my repo, but no longer installs a POM.
In case it matters, I want to use this feature in conjunction with profiles, i.e. I want to build JARs with different classifiers with different profiles.
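For illustration, what I have in mind is roughly the following, where each profile sets a property that the jar plugin configuration reads; the property name jar.classifier is just an example of mine, not anything Maven-defined:
<profiles>
  <profile>
    <id>linux-i386</id>
    <properties>
      <!-- hypothetical property, referenced as ${jar.classifier} in the jar plugin configuration -->
      <jar.classifier>linux-i386</jar.classifier>
    </properties>
  </profile>
  <profile>
    <id>linux-amd64</id>
    <properties>
      <jar.classifier>linux-amd64</jar.classifier>
    </properties>
  </profile>
</profiles>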
The reason I want the POM is because I have other projects depending on this one. When I build one of these, it will try to find a POM for this dependency. If it can't, it will happily use the JAR, but that is not an acceptable solution for me for a couple of reasons:
It's bad enough that it will try to contact external repos and look for it, but even worse, we use a shared repo, so it will download the POM from the shared repo, which may not be what I want - for example if I just made changes to the POM and am trying to test them.
Is there a solution, or can anyone suggest a reasonable workaround?
EDIT: I just discovered that the issue affects Maven 2.2.1, but not Maven 3.0.5. This may therefore be a bug or a difference in features between versions. I would still be interested in solutions/workarounds for Maven 2, as migrating the project to Maven 3 is a complicated affair and not likely to happen.

The reason turned out to have nothing to do with the Maven version as such, and everything to do with the version of maven-install-plugin. It turns out versions prior to 2.3 have this bug.
Old installations of Maven are somewhat likely to suffer from this issue: Maven 2 will use whatever version of a plugin it already has unless a version has been explicitly specified in the POM, and since maven-install-plugin is bound by default, it's quite possible for a POM not to specify it at all (as was the case for me).
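If upgrading Maven itself isn't an option, a minimal workaround is to pin the install plugin explicitly in the POM (2.3.1 is just one released version at or above 2.3):
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-install-plugin</artifactId>
      <!-- any version from 2.3 onwards avoids the classifier/POM bug described above -->
      <version>2.3.1</version>
    </plugin>
  </plugins>
</build>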

Related

How is default Maven plugin version decided?

Suppose I do not specify a plugin version in some module's pom.xml, i.e. a plugin element like the following is commented out or absent:
<build>
  ...
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>3.8.0</version>
    </plugin>
  </plugins>
  ...
</build>
What is the default plugin version used when I run "mvn compile"?
I have tried it and saw that it actually uses maven-compiler-plugin version 3.1 with the above plugin element commented out; my Maven version is 3.6.3.
I have spent an hour googling through Maven's documentation and related posts, but could not find an exact answer. I would really like to know how that version is decided.
The magic is not happening in the super POM, but in the so-called bindings descriptor, available at https://github.com/apache/maven/blob/master/maven-core/src/main/resources/META-INF/plexus/default-bindings.xml.
However, these bindings are moving to the matching packaging plugin; for the maven-jar-plugin, for example, it is located at https://github.com/apache/maven-jar-plugin/blob/master/src/main/filtered-resources/META-INF/plexus/components.xml
These versions haven't been updated, because it would be weird if two users with different Maven versions got different results (e.g. one has a broken build, the other not). Hence it is better to specify the plugin versions in the pom and not rely on the defaults provided by Maven.
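A sketch of what that looks like in practice (the version number is just an example, taken from the question; use whatever you have verified):
<build>
  <pluginManagement>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <!-- pinning the version makes the build independent of the Maven distribution's defaults -->
        <version>3.8.0</version>
      </plugin>
    </plugins>
  </pluginManagement>
</build>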
In the end it is all described at https://maven.apache.org/guides/introduction/introduction-to-the-lifecycle.html
It is impossible for Maven to work without defining versions of the artifacts, so somewhere they must be mentioned. Let's dig in, part by part.
All pom.xml files logically inherit from the super POM. You can always see what your "real" pom.xml looks like by typing:
mvn help:effective-pom
The resulting pom.xml that is printed is a combination of the super POM, your pom.xml, and any parent POMs in the mix as well.
Note that from Maven 3 onwards the super POM does not contain any of the (default lifecycle) plugin versions, whereas up to Maven 2 it did.
The Maven 3 super POM is provided by the org.apache.maven.model.superpom.DefaultSuperPomProvider class https://github.com/apache/maven/blob/bce33aa2662a51d18cb00347cf2fb174dc195fb1/maven-model-builder/src/main/java/org/apache/maven/model/superpom/DefaultSuperPomProvider.java#L56-L85
The resource it loads can be found here: https://github.com/apache/maven/blob/bce33aa2662a51d18cb00347cf2fb174dc195fb1/maven-model-builder/src/main/resources/org/apache/maven/model/pom-4.0.0.xml#L23-L149
Edit:
As per Maven Coordinates
groupId:artifactId:version are all required fields (although, groupId and version need not be explicitly defined if they are inherited from a parent - more on inheritance later). The three fields act much like an address and timestamp in one. This marks a specific place in a repository, acting like a coordinate system for Maven projects:
version: This is the last piece of the naming puzzle. groupId:artifactId denotes a single project but they cannot delineate which incarnation of that project we are talking about. Do we want the junit:junit of 2018 (version 4.12), or of 2007 (version 3.8.2)? In short: code changes, those changes should be versioned, and this element keeps those versions in line. It is also used within an artifact's repository to separate versions from each other. my-project version 1.0 files live in the directory structure $M2_REPO/org/codehaus/mojo/my-project/1.0.
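To make that mapping concrete, a declaration like the following (junit is just a familiar example):
<dependency>
  <groupId>junit</groupId>
  <artifactId>junit</artifactId>
  <version>4.12</version>
</dependency>
resolves to ~/.m2/repository/junit/junit/4.12/junit-4.12.jar (plus the matching .pom) in the local repository.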

Using third-party libraries in Eclipse RCP Tycho app

I've created a boiler-plate project following vogella's extensive Tycho tutorial.
Facts:
There's no feature project and no separate plugin project. The only plugin is the RCP app itself, which is also the entry point.
Problem:
I have no idea in which pom.xml to include the 3rd-party dependencies.
I cannot include them in the RCP project, because the packaging of that pom is eclipse-plugin, and not jar. From what I've noticed, if I change the packaging to jar, then the "Maven Dependencies" library is added automatically. If I change back to eclipse-plugin, they get removed.
Questions:
Where do I add the dependencies? There's no pom with jar packaging in my project.
Should I create a separate project with the necessary JARs? How do I include that dependency to my entire project?
Is it really that much of a good practice to create a separate plugin and a feature for this RCP app?
Related solutions:
"Update projects" doesn't work, and neither do the n other solutions in the other SO questions.
There's also this question and that question, but I don't fully get the answers
I think that you have a fundamental misunderstanding.
Maven: Maven determines all of the project dependencies via the pom.xml and resolves transitive dependencies automatically (assuming that all of the pom files and artifacts exist in repositories that you've configured and correctly declare their dependencies).
Tycho: The problem is that Eclipse already has its own project model based on product files, feature.xml files, and plug-in MANIFEST.MF files. Tycho leverages the Maven machinery for Eclipse, but the idea is that the pom.xml files just configure the Maven plug-ins and declare the packaging type. That provides an entry point for Maven, but then Tycho takes over. While Maven would normally build the dependency chain from information in the pom.xml files, Tycho is building the dependency chain from information in the product, feature, and MANIFEST.MF files. You don't put any dependencies in the pom.xml files. Tycho also uses Eclipse p2 repositories (instead of normal Maven repositories) for finding dependent plug-ins that are not found in the local modules or target platform.
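To make the contrast concrete, a plug-in's dependencies live in its MANIFEST.MF rather than in the pom, along these lines (the bundle name here is made up):
Manifest-Version: 1.0
Bundle-SymbolicName: com.example.myplugin;singleton:=true
Bundle-Version: 1.0.0.qualifier
Require-Bundle: org.eclipse.core.runtime,
 org.eclipse.ui
Tycho reads exactly this information when it resolves the build.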
That's actually a benefit for many Eclipse developers since they've already set up everything properly in their Eclipse plug-ins, features, and products. They do not want to have to repeat all of the dependencies in the pom.xml.
Using Libraries in Eclipse plug-ins: In Eclipse, if you want to use a library that is not already packaged as an Eclipse plug-in, you have a few options. Your plug-in can include a set of JARs in a libs folder and then include that libs folder in the plug-in and runtime classpath (see the build.properties file). Another option is to create your own "library plug-in" that repackages a JAR library as an Eclipse plug-in. See also https://wiki.eclipse.org/FAQ_What_is_the_classpath_of_a_plug-in%3F. That's the answer that you're getting above.
The problem arises if you're trying to include a complex library with multiple JARs that is normally distributed and included in a standard Java project via Maven. We hit this problem with the Jersey JAX-RS implementation in my project. There's no p2 repository that includes all of the pieces of the libraries as plug-ins with correct dependency information.
Easy Solution: If you need a common library, check the Orbit project first to see whether the libraries have already been packaged as Eclipse plug-ins, http://www.eclipse.org/orbit/. In that case, you can download them and include them in your target platform, or you can pull them in dynamically at (Tycho) build time from their p2 repository. Your plug-ins would just include those plug-ins as dependencies (in the their MANIFEST.MF files).
Workaround / Solution: In our case, Jersey JAX-RS was not available as an Eclipse plug-in, and it had a bunch of transitive dependencies. The workaround was to create an Eclipse "library plug-in" like I mentioned above with two pom files. We initially created a skeleton plug-in with an empty libs folder. One pom file is just a standard Maven pom file with <packaging>jar</packaging> that declares the top-level dependencies required to pull in the Jersey JAX-RS implementation and all of its dependencies. The dependencies are declared with <scope>compile</scope>. We use the maven-dependency-plugin to copy all of those dependencies into the project's libs folder.
<plugin>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-dependencies</id>
      <phase>compile</phase>
      <goals>
        <goal>copy-dependencies</goal>
      </goals>
      <configuration>
        <outputDirectory>libs</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
We actually ended up running Maven with that pom by hand from time to time to update the libs, and then we just checked the plug-in with all of its dependent JARs into source control. Checking the build later, I see that we actually populate the libs folder on-the-fly with Maven with a separate build task just before we start the Maven/Tycho part of the build. Of course, the plug-in's MANIFEST.MF file's Bundle-ClassPath and Export-Package entries are coming straight from source control. We have to check those from time to time to ensure that they match the libraries and packages that we're getting from Maven. (That doesn't tend to change much unless we bump major library versions or add a new dependency at the Maven level.) The plug-in's build.properties has the libs/ folder as part of bin.includes.
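For reference, that wiring looks roughly like the following in the plug-in; the jar names are illustrative, not our actual file list.
In MANIFEST.MF:
Bundle-ClassPath: .,
 libs/jersey-client.jar,
 libs/jersey-common.jar
In build.properties:
bin.includes = META-INF/,\
               .,\
               libs/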
In the development environment, after we first check out the code, we just run mvn (with an External Tools launch config that's also checked in with the project) on the project's "copy dependencies" pom file. That populates the libs folder with all of the JAX-RS libraries and dependencies. We only have to run it again when we update something about the dependencies or when we're jumping between branches that have different versions of the JAX-RS dependencies. We set .gitignore to ensure that we don't commit the libs to Git.
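Concretely, that launch config boils down to a command along these lines, run from the plug-in project (assuming the dependency pom is named pom_deps.xml, the name used further down in this answer):
mvn -f pom_deps.xml compile
Because copy-dependencies is bound to the compile phase in the pom shown above, running the compile phase fills the libs/ folder.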
The other pom for this project is set up like a normal Tycho pom file with <packaging>eclipse-plugin</packaging>. During our automated build, we run one step early in the build process (just after check out) that calls mvn with the jar pom to populate the libs. Then we proceed with the main Maven/Tycho build using the eclipse-plugin pom. The eclipse-plugin pom has no dependency information (as I said above). It's just providing Tycho a way to recognize the Eclipse plug-in and build it based on its MANIFEST.MF and build.properties files. But the built plug-in includes and exposes all of those libs that were populated by the mvn call to the jar pom step.
So, it's a bit of a mess, but that's the best solution we found a couple of years ago when we hit this problem. I'm not sure whether Tycho is doing any work to permit some sort of hybrid Maven/Tycho build that could do this automatically as part of the build. I guess I should ask the developers. :)
Your questions:
Where do I add the dependencies? There's no pom with jar packaging in my project. Answer: The workaround above lets you do it with one project. You just have two pom files, like pom_deps.xml and pom.xml. You just have to invoke the pom_deps.xml separately to populate the libs folder (in the dev environment and with your automated builds).
Should I create a separate project with the necessary JARs? How do I include that dependency to my entire project? Answer: the workaround that I described above lets you do it with a single project. Another way to do it is to create a separate JAR project, but I don't think that your Eclipse RCP app can really include a <packaging>jar</packaging> module in a useful way. The only way I've found to do it is to use a similar workaround. You build the JAR module first, install it into the maven repository, and then have one of your plug-in projects bundle the JAR in its libs folder. (If you really want to do it that way, ask. We have a case where we have to do that, too, and I can provide the steps we do in development and the build to make it work. I think the single project workaround that I provided above makes more sense for your case.)
Is it really that much of a good practice to create a separate plugin and a feature for this RCP app? Answer: that's really a separate question. If you have a feature with multiple plug-ins, you have the same problem. Tycho can handle the product/feature/plug-ins, but it cannot jump across into Maven-based dependency resolution. You'll end up having to use the same workarounds.
Summary: The fundamental issue is that Eclipse plug-ins can't "see" a bare JAR library. The plug-in needs to have the library included in its local libs folder (with a matching Bundle-ClassPath entry in MANIFEST.MF), or it needs to depend on some other plug-in that exports the appropriate packages. Tycho just resolves dependencies via Eclipse plug-ins, and it cannot leverage normal Maven dependency resolution directly to pull in a bunch of JARs. If all of your dependencies are already plug-ins, you're fine. If not, you may have to use the workaround above to package a set of libraries for your plug-ins to use.
Just adding the plugin to pom dependencies and including the entry <pomDependencies>consider</pomDependencies> in the configuration of target-platform-configuration makes it work.
<plugins>
  <plugin>
    <groupId>org.eclipse.tycho</groupId>
    <artifactId>target-platform-configuration</artifactId>
    <version>${tycho.version}</version>
    <configuration>
      <!-- The configuration to make Tycho consider the Maven dependencies -->
      <pomDependencies>consider</pomDependencies>
      <!-- other configurations -->
    </configuration>
  </plugin>
  <!-- other plugins -->
</plugins>
<dependencies>
  <!-- An example third-party bundle (plugin) present in a Maven repository -->
  <dependency>
    <groupId>org.apache.felix</groupId>
    <artifactId>org.apache.felix.gogo.shell</artifactId>
    <version>1.1.0</version>
  </dependency>
</dependencies>
Reference link here.

Dependency in Maven

I am really new to Maven. I am a bit confused about the dependency feature. I know that I can add a dependency in the pom file like this:
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-api</artifactId>
  <version>1.6.1</version>
</dependency>
What does this actually mean? Does it mean that I don't need to import the slf4j jar files into my project? If so, how does my project get access to those libraries?
I have read about dependencies on the Maven site, but it didn't help me much.
Can someone explain it in a simpler way?
Thanks
Nutshell: It means your project has a dependency on slf4j, version 1.6.1.
Furthermore:
If you build your project with Maven (or your IDE is Maven-aware), you don't have to do anything else in order to use slf4j. (Aside from normal source-code considerations, like a reasonable import statement, etc.)
slf4j v. 1.6.1 will be retrieved from a default Maven repository to your local repository, meaning...
... ~/.m2/repository is your local repository. slf4j will be put in ~/.m2/repository/org/slf4j/${artifactId}/1.6.1 and will include (in general) a jar file, a pom file, and a hash file.
Slf4j's dependencies will be downloaded into your local repository as well.
Dependencies of those dependencies will be downloaded ad infinitum/ad nauseam. (The source of "first use of a library downloads the internet" jokes if there are a lot of dependencies; not the case for slf4j.) This is "transitive dependency management"--one of Maven's original purposes.
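If you want to see exactly what gets pulled in for your own project, Maven can print the resolved tree:
mvn dependency:tree
Each indented entry is a transitive dependency that Maven fetched on your behalf.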
If you were not using maven, you would manually download and use the dependencies that you needed for your project. You would probably place them in a lib folder and specify this location in your IDE as well as your build tool.
Maven manages these dependencies for you. You specify the dependencies your project needs in the prescribed format, and Maven downloads them for you from the internet and manages them. When building your project, Maven knows where it has placed these dependencies and uses them. Most IDEs also know where these dependencies are once they discover that it is a Maven project.
Why is this a big deal? Typically, most open source libraries release newer versions on a regular basis. If your project uses these, then each time a newer version is needed you would have to manually download and manage it. More importantly, each dependency may in turn have other dependencies (called transitive dependencies). If you do not use Maven, you would need to identify, download and manage these transitive dependencies as well.
The more such dependencies your project uses, the more complex this becomes. It is also possible that two dependencies end up using different versions of a dependency common to them.
When compiling your project, Maven will download the corresponding .jar file from a repository, usually the central repository (you can configure different repositories, either for mirroring or for your own libraries which aren't available on the central repositories).
If your IDE knows about Maven, it will parse the pom and either download the dependencies itself or ask Maven to do so. Then it will open the dependencies' jars, and this is how you get autocompletion: the IDE "imports" the jars for you behind the scenes.
The repository contains not only the ".jar" file for the dependency, but also a ".pom" file, which describes its dependencies. So, maven will recursively download its dependencies, and you will get all the jars you need to compile your software.
Then, when you will try to run your software, you will have to tell the JVM where to find these dependencies (ie, you have to put them on the class path).
What I usually do is copy the dependencies to a target/lib/ directory, so it is easy to deploy the software and to launch it. To do so, you can use the maven-dependency-plugin, which you specify in the <build>:
<build>
  <plugins>
    <plugin>
      <artifactId>maven-dependency-plugin</artifactId>
      <version>2.1</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>copy-dependencies</goal>
          </goals>
          <configuration>
            <outputDirectory>${project.build.directory}/lib</outputDirectory>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
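After mvn package you can then start the application with the copied jars on the class path, for example as follows (the artifact and main-class names are placeholders; use ; instead of : as the path separator on Windows):
java -cp "target/myapp-1.0.jar:target/lib/*" com.example.Main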
There are a variety of servers on the internet that host artifacts (jars) that you can download as part of a maven build. You can add dependencies like you show above to describe what jars you need in order to build your code. When maven goes to build, it will contact one of these servers and download the jar to your computer and place it in a local repository usually
${user_home}/.m2/repository
The servers that Maven contacts can be configured in your Maven project's pom file (the central repository is already available via the super POM), under a section like
<repositories>
  <repository>
    <id>central</id>
    <url>https://repo1.maven.org/maven2</url>
  </repository>
</repositories>
The prototypical server can be seen at repo1.maven.org
The nice thing about Maven is that if a jar you list is needed, it will pull not only that jar, but any jars that that jar needs. Obviously, since you are pulling the jars to your machine, it only downloads them when it can't find them on your machine, thus not slowing down your build every time (just the first time).

Using Hudson and Maven to release an application for multiple platforms

This question is not really about best practices or architecture, but about how to specifically configure Hudson and Maven to accomplish what I want. I'm a bit lost.
I have a Java application which uses SWT, and I need to build copies for different platforms. For now, all I need is Linux i386 and Linux amd64, but in the future, I need to add Windows x86/x64 as well, so I want to make sure I set it up "right" the first time around.
My application has all of the dependencies and other information listed in the Project pom.xml, including the different SWT jars to grab depending on OS, arch, and family.
My question is, how do I do builds for both linux i386 and linux amd64 with a minimal amount of configuration duplication? Right now I'm doing the following:
Project specifies all dependencies in pom.xml, and this project is set to build in Hudson and deploy the resulting .jar to Nexus
Builder-linux-i386 runs after Project and specifies any JNI files for i386 and uses the de.tarent maven-pkg-plugin to grab the project jar from Nexus and assemble it along with all dependencies into a single 'fat' jar file, and then into a .deb file for installation.
Builder-linux-amd64 does the same, but for amd64 files
I have been trying to specify which dependencies to use in the Builder projects by adding -P profilename to their Hudson projects, where profilename is a profile named in the Project pom. Maven doesn't seem to like this and prints that it is not activating that profile. It only uses the default profile from Project's pom.
What is the correct way to set this up? I want to have all of my dependencies specified in my Project pom, and have a Hudson project which compiles the jar for that project and deploy it to Nexus, and then independent projects which grab that jar and assemble it along with platform-specific files for release. I don't want to build the entire original project repeatedly, and I don't want to have a ton of duplicated configuration info or copy-pasted poms.
I have it working for unix-amd64 only because that's what the build machine is, so Maven targets that architecture. Also, I feel like the setup isn't as clean as it could be. Advice?
You have a syntax error. It needs to be -Pprofilename. It works for me this way.
Edit
Since the profile is read, there might be a syntax error in your profile configuration. I found a profile in one of the projects that I integrate into our CI environment. It defines some dependencies; it might help you.
<profile>
  <id>junit</id>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <configuration>
          <skip>false</skip>
          <testNGArtifactName>none:none</testNGArtifactName>
        </configuration>
      </plugin>
    </plugins>
  </build>
  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.4</version>
      <scope>test</scope>
    </dependency>
  </dependencies>
</profile>
Profiles should work in the way you described (you could post another question about this).
But (at least for web applications) there is another way: try to use classifiers instead of profiles to build for different environments. -- You can have a look at this blog: http://blog.jayway.com/2010/01/21/one-artifact-with-multiple-configurations-in-maven/
The purpose of this solution is that you are able to build (if you want, controlled by a profile) for all environments at once.
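A rough sketch of the classifier approach with the maven-jar-plugin, producing one extra artifact per environment in a single build; the execution ids and classifier names are examples, not taken from the question:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jar-plugin</artifactId>
  <executions>
    <execution>
      <id>jar-linux-i386</id>
      <phase>package</phase>
      <goals>
        <goal>jar</goal>
      </goals>
      <configuration>
        <classifier>linux-i386</classifier>
      </configuration>
    </execution>
    <execution>
      <id>jar-linux-amd64</id>
      <phase>package</phase>
      <goals>
        <goal>jar</goal>
      </goals>
      <configuration>
        <classifier>linux-amd64</classifier>
      </configuration>
    </execution>
  </executions>
</plugin>
Each execution would still need its own way of picking up the architecture-specific files (e.g. different includes), which is where the blog post above goes into more detail.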
The builder projects do not see the profiles from the main Project because it is not actually a parent. I cannot define it as a parent in the builder projects because my projects are not set up that way, and I'm building using variables like ${SVN_REVISION}, which Maven does not like.
I have given up and instead copy-pasted the profiles into the 'builder' projects. This isn't the prettiest but for now it works.

Managing Maven dependencies - New Versions and Non-Repo Libraries

Warning: I have just picked up Maven, so things mentioned might be wrong or not best practice.
I have a medium-sized open source project that I am migrating to Maven from the basic NetBeans project management. This is not a developer team sharing the same room; this is 1-5 people over the internet sharing an SVN repo. Reading over the how-tos on dependencies, it seems that the only way to get dependencies is to get them from an online repo or install them locally.
This is not what I was looking for. I want to keep all dependencies in SVN for many reasons, including portability (anybody can pass by, check out the repo, build, and use, all of that without manually adding anything to local repos and whatnot), getting newer versions (discussed below), and manual versioning.
The other issue I have with the maven repository is that they are quite behind in versions. Logback for example is 0.9.18 in mvnbrowser but 0.9.24 officially. PircBot is 1.4.6 in mvnbrowser but 1.5.0 officially. Why such old versions?
Issue 3 is that I have dependencies that don't even exist in the repos, like Easier Java Persistence.
So
How can I force all dependencies to come from /lib, for example?
On a related note, can mvn build from a library's SVN repo directly? Just curious.
Is there an automatic way to get the newest version directly from a dependency's site/SVN repo if they also use Maven? E.g. libraries like commons-lang or logback.
Is there a better way of managing dependencies? (IE Ivy or some weird POM option I'm missing)
FYI, this is a Java project with 3 modules, project global dependencies and module specific dependencies.
Bonus points if it can work with the bundled version of Maven that comes with NetBeans.
Not a duplicate of
Maven: add a dependency to a jar by relative path - Not wanting to install to local repository
maven compile fails because i have a non-maven jar - Don't think a System dependency is the right answer
maven look for new versions of dependencies - Still uses(?) repository, just the latest (old) version
This is not what I was looking for. I want to keep all dependencies in the SVN for many reasons (...)
I will come back on this but the solution I described in Maven: add a dependency to a jar by relative path (using a file-based repository) allows to implement such a solution.
The other issue I have with the maven repository is that they are quite behind in versions. Logback for example is 0.9.18 in mvnbrowser but 0.9.24 officially. PircBot is 1.4.6 in mvnbrowser but 1.5.0 officially. Why such old versions?
It looks like mvnbrowser indices are totally out of date (making it useless as repository search engine) because the maven central repository does have logback-core-0.9.24.jar (the logback project is doing what has to be done to make this happen) but only has an old pircbot-1.4.2.jar. Why? Ask the pircbot team. Anyway, you're right, the central repository might not always have ultimate versions.
Issue 3 is that I have dependencies that don't even exist in the repos, like Easier Java Persistence.
Yeah, this happens too.
How can I force all dependencies to come from /lib for example
As previously hinted, you should re-read carefully the solution suggested in Maven: add a dependency to a jar by relative path. This solution is not about installing libraries to the local repository but is about using a file-based repository (that could thus be stored in SVN). You might have missed the point, this matches your use case. And also check Brett's answer for a variation.
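For reference, such a file-based repository (laid out in the standard groupId/artifactId/version structure and committed to SVN) is declared roughly like this; the id and path are just examples:
<repositories>
  <repository>
    <id>project-local</id>
    <url>file://${project.basedir}/lib</url>
  </repository>
</repositories>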
On a related note, can mvn build from library's SVN repo directly? Just curious
Didn't get that one. Can you clarify?
Is there an automatic way to get the newest version directly from a dependencies site/svn repo if they also use Maven? IE libraries like commons-lang or logback
Maven supports version ranges, and there is a syntax for "any version greater than X". But I do NOT recommend using version ranges at all, for the sake of build reproducibility. You don't want the build to suddenly fail because of some automatic update that happened behind your back. Only upgrade if you need bug fixes or new features, but do it explicitly (if it ain't broke, don't fix it).
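For completeness, the range syntax looks like the following (shown only to illustrate, given the reproducibility caveat above; commons-lang 2.4 is just an example coordinate):
<dependency>
  <groupId>commons-lang</groupId>
  <artifactId>commons-lang</artifactId>
  <!-- any version from 2.4 upwards; Maven resolves the newest matching version available -->
  <version>[2.4,)</version>
</dependency>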
You might also find mentions of the LATEST and RELEASE version markers. I don't recommend them either, for the same reasons as above, and even less so since they're removed from Maven 3.x.
Is there a better way of managing dependencies? (IE Ivy or some weird POM option I'm missing)
Can't say for Ivy. But in the Maven land, if you can't host up a "corporate" repository for your project (Nexus, Archiva, Artifactory), then the file-based repository is IMO the best approach.
Set up your own Maven repository.
http://archiva.apache.org/
