I am facing a unique issue -
I have a plugin dependency which is present in multiple repositories. The version number is the same; only the snapshot qualifier (the timestamp) differs.
Is there a way I can force Maven/Tycho to prefer the snapshot from a particular repository?
EDIT: They are p2 plugin repositories created for Eclipse PDE Build.
You can specify a filter on the target platform to remove all but one version:
<plugin>
  <groupId>org.eclipse.tycho</groupId>
  <artifactId>target-platform-configuration</artifactId>
  <version>${tycho-version}</version>
  <configuration>
    <filters>
      <filter>
        <type>eclipse-plugin</type>
        <id>id.of.dependency</id>
        <restrictTo>
          <version>1.2.3.2014020241355</version>
        </restrictTo>
      </filter>
    </filters>
  </configuration>
</plugin>
Maven processes repository lists in declaration order (first in, first out). So, if you define your repository at the very top (before the other ones), Maven should end up resolving the artifact from there.
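For illustration, the ordering idea looks like this (the repository IDs and URLs here are hypothetical placeholders):

<repositories>
  <!-- Declared first, so it is consulted first when resolving the snapshot -->
  <repository>
    <id>preferred-repo</id>
    <url>https://repo.example.com/preferred</url>
    <snapshots>
      <enabled>true</enabled>
    </snapshots>
  </repository>
  <!-- Other repositories are declared after and consulted later -->
  <repository>
    <id>other-repo</id>
    <url>https://repo.example.com/other</url>
  </repository>
</repositories>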
If you're using an artifact repository manager, you could define routing rules.
I guess you could create a profile and add the repository you want as the only repository.
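As a rough sketch (the profile ID and repository coordinates are made up):

<profiles>
  <profile>
    <id>use-preferred-repo</id>
    <repositories>
      <!-- The repository you want Maven to resolve the snapshot from -->
      <repository>
        <id>preferred-repo</id>
        <url>https://repo.example.com/preferred</url>
        <snapshots>
          <enabled>true</enabled>
        </snapshots>
      </repository>
    </repositories>
  </profile>
</profiles>

Activate it with mvn -P use-preferred-repo. Note that a profile can only add repositories; for this one to truly be the only repository, the other repositories would themselves have to be moved into profiles that are left inactive.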
You could exclude all transitive snapshot deps and add the dependency explicitly:
<dependency>
  <groupId>com.sun.something</groupId>
  <artifactId>something</artifactId>
  <version>version</version>
  <exclusions>
    <exclusion>
      <artifactId>transitive</artifactId>
      <groupId>com.sun.somethingelse</groupId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>com.sun.somethingelse</groupId>
  <artifactId>transitive</artifactId>
  <version>version</version>
</dependency>
This is similar to Exclude dependency in child pom inherited from parent pom, except that it has to do with test vs compile scopes.
I have a parent POM that includes the org.slf4j:slf4j-api dependency so that all descendant projects will be using SLF4J for the logging API. Then, so that all projects can have some logging for unit tests (regardless of which SLF4J implementation they use in the main, that is non-test, part of the project), I include SLF4J Simple, but only in the test scope:
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-simple</artifactId>
  <scope>test</scope>
</dependency>
(I understand the view that parent POMs should not declare dependencies and only use dependency management. While I don't disagree in general, configuring tests is a different story. I don't want every single subproject to have to declare JUnit, Hamcrest, Hamcrest Optional, Mockito, Simple Logging, etc. The testing framework should be uniform across all our projects without a huge amount of ceremony just to set up a project.)
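For context, a minimal sketch of what such a parent might declare (the exact artifact list is an assumption, with versions presumed to be managed in a dependencyManagement section):

<!-- In the parent POM; versions managed in dependencyManagement -->
<dependencies>
  <dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
  </dependency>
  <dependency>
    <groupId>org.junit.jupiter</groupId>
    <artifactId>junit-jupiter</artifactId>
    <scope>test</scope>
  </dependency>
  <dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-simple</artifactId>
    <scope>test</scope>
  </dependency>
</dependencies>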
This works fine until one project Foo wants to use Logback as the SLF4J implementation.
<dependency>
  <groupId>ch.qos.logback</groupId>
  <artifactId>logback-classic</artifactId>
  <version>1.4.1</version>
</dependency>
That works fine for the Foo application itself, but now for the Foo tests, there are suddenly two competing SLF4J implementations: Logback and SLF4J simple. This presents a bindings conflict:
SLF4J: Class path contains multiple SLF4J providers.
SLF4J: Found provider [ch.qos.logback.classic.spi.LogbackServiceProvider@363ee3a2]
SLF4J: Found provider [org.slf4j.simple.SimpleServiceProvider@4690b489]
SLF4J: See https://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual provider is of type [ch.qos.logback.classic.spi.LogbackServiceProvider@363ee3a2]
I need to do one of the following:
In the POM where I bring in the ch.qos.logback:logback-classic dependency, I need to exclude the org.slf4j:slf4j-simple from the parent POM. (This is the preferred solution.)
In the POM where I bring in the ch.qos.logback:logback-classic dependency, I need to specify that ch.qos.logback:logback-classic is for all scopes except the test scope (so as not to conflict with org.slf4j:slf4j-simple).
I don't readily see how to do either of these. Any ideas?
One suggestion was to redeclare org.slf4j:slf4j-simple with <scope>provided</scope>. Thus pom.xml for project Foo would look like this:
…
<dependency>
  <groupId>ch.qos.logback</groupId>
  <artifactId>logback-classic</artifactId>
</dependency>
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-simple</artifactId>
  <scope>provided</scope>
</dependency>
…
Unfortunately that doesn't work. SLF4J still sees two SLF4J providers on the classpath and shows the message seen above. The provided scope simply keeps the dependency from being included transitively in other projects; it doesn't remove it from the test classpath of the current project.
It sounds like you are trying to build a cathedral with the wrong tools, and instead of a cathedral you are getting a pagan temple :)
Technically, it is possible to override classpath/module dependencies imposed by a parent POM by defining a system scope, something like:
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-simple</artifactId>
  <version>2.0.1</version>
  <scope>system</scope>
  <systemPath>${project.basedir}/../dummy.jar</systemPath>
</dependency>
However, I wouldn't recommend doing that.
Another option is to take advantage of the classpathDependencyExcludes configuration option of the Surefire plugin, something like:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <classpathDependencyExcludes>
      <classpathDependencyExclude>org.slf4j:slf4j-simple</classpathDependencyExclude>
    </classpathDependencyExcludes>
  </configuration>
</plugin>
If a particular parent does not suit the child's needs, the child may adopt another parent :) There is no strict requirement that the aggregator POM must be the parent POM.
The real problem is that, unlike modern build tools, Maven does not distinguish between test-compile and test-runtime scopes. However, it is possible to emulate such behaviour:
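For example, a child module can declare a different parent from the aggregator that lists it as a module (the coordinates here are hypothetical):

<!-- Child pom.xml: the parent need not be the aggregator POM -->
<parent>
  <groupId>com.example</groupId>
  <artifactId>alternative-parent</artifactId>
  <version>1.0.0</version>
</parent>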
<properties>
  <surefire.runtime>${project.build.directory}/surefire-runtime/slf4j-simple-2.0.1.jar</surefire.runtime>
</properties>
...
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-surefire-runtime</id>
      <goals>
        <goal>copy</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-simple</artifactId>
            <version>2.0.1</version>
            <type>jar</type>
            <overWrite>false</overWrite>
            <outputDirectory>${project.build.directory}/surefire-runtime/</outputDirectory>
          </artifactItem>
        </artifactItems>
      </configuration>
    </execution>
  </executions>
</plugin>
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <additionalClasspathElements>${surefire.runtime}</additionalClasspathElements>
  </configuration>
</plugin>
Yep, that is a lot of configuration, but in my opinion it is the only correct setup for test-runtime dependencies. Maybe it would be worth submitting a corresponding PR to the Surefire project - I believe it would take about 10 LoC to make the maven-dependency-plugin configuration unnecessary and allow configuring the test runtime in the following way:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <additionalClasspathElements>
      <additionalClasspathElement>org.slf4j:slf4j-api:2.0.1</additionalClasspathElement>
    </additionalClasspathElements>
  </configuration>
</plugin>
I have a Maven project with a webapp for which I need two versions, each one having its own set of dependencies. The intent is to support two different (and conflicting) versions of a storage client. The webapp code, configuration files and everything but certain libraries are the same in both cases. The right client is loaded at runtime: I just need to drop the right jar (and its dependencies) into the lib folder of the webapp.
If I deploy the dependencies manually, I lose the opportunity to check for version conflicts (which I get when I build a Maven project with all its dependencies correctly set).
I do not want to deploy the webapp(s) to the Maven repository since it is not a library and it only makes a big archive (mainly because of the embedded dependencies) that consumes space for nothing. Thus, to build the final wars, I cannot add a dependency on the webapp project.
I do not want to duplicate the common webapp class files and configuration files in two different modules. It would make future evolutions more difficult because of the necessary synchronization between the two modules each time one file is updated.
Any suggestion on how to solve this?
Note that the best solution should allow to build both wars at once.
Use Maven profiles.
http://maven.apache.org/guides/introduction/introduction-to-profiles.html
You can put certain dependencies into certain profiles and activate/deactivate them through the command line with the -P parameter.
I guess defining two profiles in your POM might do the trick:
<project [...]>
  [...]
  <profiles>
    <profile>
      <id>storage1</id>
      <dependencies>
        <dependency>
          <groupId>my.group.storage</groupId>
          <artifactId>thisOne</artifactId>
          <version>13</version>
        </dependency>
      </dependencies>
    </profile>
    <profile>
      <id>storage2</id>
      <dependencies>
        <dependency>
          <groupId>my.group.storage</groupId>
          <artifactId>thisOtherOne</artifactId>
          <version>37</version>
        </dependency>
      </dependencies>
    </profile>
  </profiles>
  [...]
</project>
Call one or the other with mvn -P storage1 or mvn -P storage2. You can also make one active by default, use activation triggers based on other properties, etc.
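For instance, making storage1 the default could look like this (a sketch only; the dependencies are elided as above):

<profile>
  <id>storage1</id>
  <activation>
    <!-- Active unless another profile is selected explicitly with -P -->
    <activeByDefault>true</activeByDefault>
  </activation>
  [...]
</profile>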
Here's their introduction article.
In the end, I did not use profiles. There was an issue building both webapp versions at once.
Instead, I used war overlays: https://maven.apache.org/plugins/maven-war-plugin/overlays.html.
First, I created a skinny war version of the webapp. The skinny war does not include libraries or META-INF files, only resources like configuration files. The webapp classes are packaged in a jar (using the attachClasses configuration option of the maven-war-plugin). I do not mind having this war deployed since it is very lightweight. Here is the configuration of the maven-war-plugin:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-war-plugin</artifactId>
  <configuration>
    <outputFileNameMapping>@{groupId}@.@{artifactId}@-@{version}@@{dashClassifier?}@.@{extension}@</outputFileNameMapping>
    <attachClasses>true</attachClasses>
    <archive>
      <addMavenDescriptor>false</addMavenDescriptor>
    </archive>
    <packagingExcludes>WEB-INF/classes/**/*,WEB-INF/lib/*</packagingExcludes>
  </configuration>
</plugin>
Then, I created two additional modules, one for each flavour of the webapp. In the dependencies, I set:
- the webapp as a dependency of type war
- the jar of the webapp classes
- the storage client library
That way, Maven checks for dependency conflicts across all the libraries. The webapp classes are imported through the classes dependency. The overlay war is used to build the final war. There is no duplicate code between the two flavours of the webapp; only the client dependency changes between the two POM files. Here is an excerpt of one of them:
<dependencies>
  <dependency>
    <groupId>com.storage</groupId>
    <artifactId>client</artifactId>
    <version>${project.version}</version>
  </dependency>
  <dependency>
    <groupId>com.group.id</groupId>
    <artifactId>webapp</artifactId>
    <version>${project.version}</version>
    <classifier>classes</classifier>
    <type>jar</type>
  </dependency>
  <dependency>
    <groupId>com.group.id</groupId>
    <artifactId>webapp</artifactId>
    <version>${project.version}</version>
    <type>war</type>
  </dependency>
</dependencies>
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-war-plugin</artifactId>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-install-plugin</artifactId>
      <configuration>
        <skip>true</skip>
      </configuration>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-deploy-plugin</artifactId>
      <configuration>
        <skip>true</skip>
      </configuration>
    </plugin>
  </plugins>
</build>
I'm developing a Java application using Apache Spark. I use this version:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.2.2</version>
</dependency>
In my code, there is a transitive dependency:
<dependency>
  <groupId>org.apache.httpcomponents</groupId>
  <artifactId>httpclient</artifactId>
  <version>4.5.2</version>
</dependency>
I package my application into a single JAR file. When deploying it on an EC2 instance using spark-submit, I get this error:
Caused by: java.lang.NoSuchFieldError: INSTANCE
at org.apache.http.conn.ssl.SSLConnectionSocketFactory.<clinit>(SSLConnectionSocketFactory.java:144)
at com.amazonaws.http.apache.client.impl.ApacheConnectionManagerFactory.getPreferredSocketFactory(ApacheConnectionManagerFactory.java:87)
at com.amazonaws.http.apache.client.impl.ApacheConnectionManagerFactory.create(ApacheConnectionManagerFactory.java:65)
at com.amazonaws.http.apache.client.impl.ApacheConnectionManagerFactory.create(ApacheConnectionManagerFactory.java:58)
at com.amazonaws.http.apache.client.impl.ApacheHttpClientFactory.create(ApacheHttpClientFactory.java:50)
at com.amazonaws.http.apache.client.impl.ApacheHttpClientFactory.create(ApacheHttpClientFactory.java:38)
This error clearly shows that spark-submit has loaded an older version of the same Apache HttpClient library, which is why this conflict happens.
What is a good way to solve this issue?
For some reason, I cannot upgrade Spark in my Java code. However, I could do that with the EC2 cluster easily. Is it possible to deploy my Java code on a cluster with a higher version, say 1.6.1?
As said in your post, Spark is loading an older version of HttpClient. The solution is to use the Maven Shade Plugin's relocation facility to produce a neat, conflict-free project.
Here's an example of how to use it in your pom.xml file:
<project>
  <!-- Your project definition here, with the groupId, artifactId, and its dependencies -->
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>2.4.3</version>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
            <configuration>
              <relocations>
                <relocation>
                  <pattern>org.apache.http.client</pattern>
                  <shadedPattern>shaded.org.apache.http.client</shadedPattern>
                </relocation>
              </relocations>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>
This will relocate all classes from org.apache.http.client to shaded.org.apache.http.client, resolving the conflict.
Original post:
If this is simply a matter of transitive dependencies, you could just add an exclusion to your spark-core dependency to exclude the HttpClient used by Spark:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.2.2</version>
  <scope>provided</scope>
  <exclusions>
    <exclusion>
      <groupId>org.apache.httpcomponents</groupId>
      <artifactId>httpclient</artifactId>
    </exclusion>
  </exclusions>
</dependency>
I also set the scope to provided in your dependency, as it will be provided by your cluster.
However, that might muck around with Spark's internal behaviour. If you still get an error after doing this, you could try using Maven's relocation facility that should produce a neat conflict-free project.
Regarding the fact that you can't upgrade Spark's version: did you use exactly this dependency declaration from mvnrepository?
Spark being backwards compatible, there shouldn't be any problem deploying your job on a cluster with a higher version.
Recently I added the banTransitiveDependencies enforcer rule to my pom.xml, as seen below:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <version>1.4.1</version>
  <executions>
    <execution>
      <id>enforce-banned-dependencies</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <banTransitiveDependencies>
            <excludes>
              <!-- the rule will not fail even if it detects ignoredArtifact
                   of group org.apache.maven, because it is excluded -->
            </excludes>
            <includes>
            </includes>
          </banTransitiveDependencies>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>
When I try building my application with Maven, I get the following error:
[WARNING] Rule 0: org.apache.maven.plugins.enforcer.BanTransitiveDependencies failed with message:
org.hamcrest:hamcrest-all:jar:1.2:test has transitive dependencies:
commons-lang:commons-lang:jar:2.6:test
I am not sure I understand what is happening here. Why is the banning of transitive dependencies failing?
By the way I have the following dependency in pom.xml:
<dependency>
  <groupId>org.apache.commons</groupId>
  <artifactId>commons-lang3</artifactId>
  <version>3.4</version>
</dependency>
So am I supposed to change the version of hamcrest-all? Or should I add commons-lang 2.6 to my pom.xml as well?
Could you please explain the right way to "ban transitive dependencies"?
The banTransitiveDependencies rule is used to verify that your project doesn't inherit unwanted transitive dependencies. You configure it with:
<excludes>: list of dependencies to ignore.
<includes>: list of dependencies to consider. Those are exceptions to the <excludes> configuration.
By default, it excludes nothing, meaning that all transitive dependencies are banned. There is a slight difference between excluding nothing by default and including everything: the point is that you define globally what you want to exclude, and within that subset, what you want to include back.
This is why, in your example, the build fails: nothing is excluded by default, and you have a transitive dependency on commons-lang:commons-lang:jar:2.6.
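If you simply wanted the rule to tolerate that dependency, one option (a sketch, not tested against your build) would be to exclude it in the rule configuration:

<banTransitiveDependencies>
  <excludes>
    <!-- allow commons-lang to appear as a transitive dependency -->
    <exclude>commons-lang:commons-lang</exclude>
  </excludes>
</banTransitiveDependencies>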
The example from the documentation explains that:
<excludes>
  <!-- the rule will not fail even if it detects ignoredArtifact
       of group org.apache.maven, because it is excluded -->
  <exclude>org.apache.maven:ignoredArtifact</exclude>
  <exclude>*:anotherIgnoredArtifact</exclude>
</excludes>
<includes>
  <!-- override "org.apache.maven:ignoredArtifact" to fail
       if exactly 1.0 version of ignoreArtifact is detected
       to be transitive dependency of the project -->
  <include>org.apache.maven:ignoredArtifact:[1.0]</include>
</includes>
In this configuration, they want to ban version 1.0 of org.apache.maven:ignoredArtifact as a transitive dependency.
So they redefine <excludes> so that all transitive dependencies matching org.apache.maven:ignoredArtifact are excluded, i.e. all dependencies having a group id of org.apache.maven and artifact id of ignoredArtifact (which means all versions with those ids). Then, they redefine <includes> so that only version 1.0 of org.apache.maven:ignoredArtifact is banned.
The BanTransitiveDependencies rule will trigger whenever any of your dependencies' dependencies (i.e. transitive dependencies) is included in the build.
In order to avoid this warning, you'd have to exclude commons-lang:commons-lang:jar:2.6 when declaring your dependency on org.hamcrest:hamcrest-all:1.2:
<dependency>
  <groupId>org.hamcrest</groupId>
  <artifactId>hamcrest-all</artifactId>
  <version>1.2</version>
  <scope>test</scope>
  <exclusions>
    <exclusion>
      <artifactId>commons-lang</artifactId>
      <groupId>commons-lang</groupId>
    </exclusion>
  </exclusions>
</dependency>
I am having trouble with plugin dependencies.
I want to use the proguard-maven-plugin, but by default this plugin uses ProGuard 4.3, and ProGuard 4.3 doesn't support JDK 7.
To fix my problem I just need to use ProGuard 4.6+. But the latest version in the central repo is 4.4. I can manually download ProGuard 4.6+ from the ProGuard repo, but how can I include it in the plugin?
I have my own Nexus repo, and I put ProGuard 4.8 there. How can I load the dependencies for the proguard-maven-plugin from my repo?
I did as written here: http://www.sonatype.com/people/2008/04/how-to-override-a-plugins-dependency-in-maven/, but Maven is still looking for ProGuard 4.8 in the central repo. How can I force Maven to search in my own repo?
Sorry for my terrible English; I hope you understand me.
You have to edit the file .m2/settings.xml in your home folder (and in the home folder of every user who runs the Maven job). There you have to add your Nexus as a mirror, as described here. Basically the config looks like this:
<settings>
  ...
  <mirrors>
    <mirror>
      <id>mynexus</id>
      <name>My Nexus</name>
      <url>http://mynexusurl</url>
      <mirrorOf>*</mirrorOf>
    </mirror>
  </mirrors>
  ...
</settings>
Additionally you have to configure your Nexus to mirror the central repo.
Maybe this will also help. I don't use it myself, but this is the form of the config for overriding a plugin's dependency:
<plugin>
  <artifactId>proguard-maven-plugin</artifactId>
  <version>2.0.4</version>
  <dependencies>
    <dependency>
      <groupId>net.sf.proguard</groupId>
      <artifactId>proguard</artifactId>
      <version>4.6</version>
    </dependency>
  </dependencies>
</plugin>
Here is also an antrun example:
<plugin>
  <artifactId>maven-antrun-plugin</artifactId>
  <version>1.7</version>
  <dependencies>
    <dependency>
      <groupId>ant-contrib</groupId>
      <artifactId>ant-contrib</artifactId>
      <version>1.0b2</version>
      <exclusions>
        <exclusion>
          <groupId>ant</groupId>
          <artifactId>ant</artifactId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>org.apache.ant</groupId>
      <artifactId>ant-nodeps</artifactId>
      <version>1.8.1</version>
    </dependency>
  </dependencies>
</plugin>
Note also: this affects Maven 2; the issue is resolved in Maven 3 (http://jira.codehaus.org/browse/MNG-1323).
For multi-module reactor builds, the dependencies of a plugin are resolved at the first use of the plugin. If your dependency isn't being downloaded in the reactor build but works fine in a single module, then you may need to include it in an earlier project - easiest done by adding it to the pluginManagement section of the shared parent, as in the sketch below.
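A rough sketch of that pluginManagement approach, reusing the proguard example from above (the version numbers are only illustrative):

<!-- In the shared parent POM -->
<build>
  <pluginManagement>
    <plugins>
      <plugin>
        <artifactId>proguard-maven-plugin</artifactId>
        <version>2.0.4</version>
        <dependencies>
          <!-- Overrides the plugin's ProGuard dependency for every module that uses the plugin -->
          <dependency>
            <groupId>net.sf.proguard</groupId>
            <artifactId>proguard</artifactId>
            <version>4.8</version>
          </dependency>
        </dependencies>
      </plugin>
    </plugins>
  </pluginManagement>
</build>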