What is the reason behind 'BanTransitiveDependencies failed'?

Recently I added the Ban Transitive Dependencies rule of the maven-enforcer-plugin to my pom.xml, as seen below:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <version>1.4.1</version>
  <executions>
    <execution>
      <id>enforce-banned-dependencies</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <banTransitiveDependencies>
            <excludes>
              <!-- the rule will not fail even if it detects ignoredArtifact
                   of group org.apache.maven, because it is excluded -->
            </excludes>
            <includes>
            </includes>
          </banTransitiveDependencies>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>
When I try building my application with Maven, I get the following error:
[WARNING] Rule 0: org.apache.maven.plugins.enforcer.BanTransitiveDependencies failed with message:
org.hamcrest:hamcrest-all:jar:1.2:test has transitive dependencies:
commons-lang:commons-lang:jar:2.6:test
I am not sure I understand what is happening here. Why is banning transitive dependencies failing?
By the way, I have the following dependency in my pom.xml:
<dependency>
  <groupId>org.apache.commons</groupId>
  <artifactId>commons-lang3</artifactId>
  <version>3.4</version>
</dependency>
So am I supposed to change the version of hamcrest-all, or should I add commons-lang 2.6 to my pom.xml as well?
Could you please explain the right way to "ban transitive dependencies"?

The banTransitiveDependencies rule is used to verify that your project doesn't pull in unwanted transitive dependencies. You configure it with:
<excludes>: a list of dependencies to ignore.
<includes>: a list of dependencies to consider. These are exceptions to the <excludes> configuration.
By default, it excludes nothing, meaning that all transitive dependencies are banned. There is a slight difference between excluding nothing by default and including everything: the point is that you define what you want to exclude in a global way and then, within that subset, define what you still want to include (i.e. keep banned).
This is why the build fails in your example: nothing is excluded, so the transitive dependency on commons-lang:commons-lang:jar:2.6 trips the rule.
The example from the documentation explains that:
<excludes>
  <!-- the rule will not fail even if it detects ignoredArtifact
       of group org.apache.maven, because it is excluded -->
  <exclude>org.apache.maven:ignoredArtifact</exclude>
  <exclude>*:anotherIgnoredArtifact</exclude>
</excludes>
<includes>
  <!-- override "org.apache.maven:ignoredArtifact" to fail
       if exactly the 1.0 version of ignoredArtifact is detected
       to be a transitive dependency of the project -->
  <include>org.apache.maven:ignoredArtifact:[1.0]</include>
</includes>
In this configuration, they want to ban version 1.0 of org.apache.maven:ignoredArtifact as a transitive dependency.
So they define <excludes> so that all transitive dependencies matching org.apache.maven:ignoredArtifact are excluded, i.e. all dependencies having a group id of org.apache.maven and an artifact id of ignoredArtifact (which means all versions with those ids). Then they define <includes> so that only version 1.0 of org.apache.maven:ignoredArtifact is banned.
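Applied to the question above, a minimal sketch (assuming you simply want to allow commons-lang to come in through hamcrest-all) would be to exclude it in the rule:
<banTransitiveDependencies>
  <excludes>
    <!-- allow commons-lang to come in transitively via hamcrest-all -->
    <exclude>commons-lang:commons-lang</exclude>
  </excludes>
</banTransitiveDependencies>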

The BanTransitiveDependencies rule triggers whenever any of your dependencies' dependencies (i.e. transitive dependencies) is included in the build.
To avoid this warning, you'd have to exclude commons-lang:commons-lang when declaring your dependency on org.hamcrest:hamcrest-all:1.2:
<dependency>
  <groupId>org.hamcrest</groupId>
  <artifactId>hamcrest-all</artifactId>
  <version>1.2</version>
  <scope>test</scope>
  <exclusions>
    <exclusion>
      <groupId>commons-lang</groupId>
      <artifactId>commons-lang</artifactId>
    </exclusion>
  </exclusions>
</dependency>

Related

Maven exclude/remove test dependency defined in parent POM

This is similar to Exclude dependency in child pom inherited from parent pom, except that it has to do with test vs compile scopes.
I have a parent POM that includes the org.slf4j:slf4j-api dependency so that all descendant projects will be using SLF4J for the logging API. Then, so that all projects can have some logging for unit tests (regardless of which SLF4J implementation they use in the main, that is non-test, part of the project), I include SLF4J Simple, but only in the test scope:
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-simple</artifactId>
  <scope>test</scope>
</dependency>
(I understand the view that parent POMs should not declare dependencies and only use dependency management. While I don't disagree in general, configuring tests is a different story. I don't want every single subproject to have to declare JUnit, Hamcrest, Hamcrest Optional, Mockito, Simple Logging, etc. The testing framework should be uniform across all our projects without a huge amount of ceremony just to set up a project.)
This works fine until one project Foo wants to use Logback as the SLF4J implementation.
<dependency>
  <groupId>ch.qos.logback</groupId>
  <artifactId>logback-classic</artifactId>
  <version>1.4.1</version>
</dependency>
That works fine for the Foo application itself, but now for the Foo tests, there are suddenly two competing SLF4J implementations: Logback and SLF4J simple. This presents a bindings conflict:
SLF4J: Class path contains multiple SLF4J providers.
SLF4J: Found provider [ch.qos.logback.classic.spi.LogbackServiceProvider@363ee3a2]
SLF4J: Found provider [org.slf4j.simple.SimpleServiceProvider@4690b489]
SLF4J: See https://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual provider is of type [ch.qos.logback.classic.spi.LogbackServiceProvider@363ee3a2]
I need to do one of the following:
In the POM where I bring in the ch.qos.logback:logback-classic dependency, I need to exclude the org.slf4j:slf4j-simple from the parent POM. (This is the preferred solution.)
In the POM where I bring in the ch.qos.logback:logback-classic dependency, I need to specify that ch.qos.logback:logback-classic is for all scopes except the test scope (so as not to conflict with org.slf4j:slf4j-simple).
I don't readily see how to do either of these. Any ideas?
One suggestion was to redeclare org.slf4j:slf4j-simple with <scope>provided</scope>. Thus pom.xml for project Foo would look like this:
…
<dependency>
  <groupId>ch.qos.logback</groupId>
  <artifactId>logback-classic</artifactId>
</dependency>
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-simple</artifactId>
  <scope>provided</scope>
</dependency>
…
Unfortunately that doesn't work. SLF4J still sees two SLF4J providers on the classpath, and is showing the message seen above. A scope of provided simply keeps the dependency from being included transitively in other projects; it doesn't seem to remove it from the classpath of the current project.
It sounds like you are trying to build the Cathedral using the wrong tools, and instead of a Cathedral you are getting a pagan temple :)
Technically, it is possible to override classpath/module dependencies imposed by a parent pom by defining system scope, something like:
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-simple</artifactId>
  <version>2.0.1</version>
  <scope>system</scope>
  <systemPath>${project.basedir}/../dummy.jar</systemPath>
</dependency>
However, I wouldn't recommend doing that.
Another option is to take advantage of the classpathDependencyExcludes config option of the Surefire plugin, something like:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <classpathDependencyExcludes>org.slf4j:slf4j-simple</classpathDependencyExcludes>
  </configuration>
</plugin>
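If you need to exclude more than one artifact, the documented list form of this parameter works too; a sketch:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <classpathDependencyExcludes>
      <classpathDependencyExclude>org.slf4j:slf4j-simple</classpathDependencyExclude>
    </classpathDependencyExcludes>
  </configuration>
</plugin>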
If a particular parent does not suit the child's needs, the child may adopt another parent :) There is no strict requirement that the aggregator POM must be the parent POM; a sketch of that idea follows below.
The real problem is that, unlike modern build tools, Maven does not distinguish test-compile and test-runtime scopes; however, it is possible to emulate such behaviour:
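A minimal sketch of that idea, with hypothetical coordinates - the module can stay in the aggregator's <modules> list while pointing at a different parent:
<parent>
  <!-- hypothetical parent POM that does not declare slf4j-simple in test scope -->
  <groupId>com.example</groupId>
  <artifactId>logback-friendly-parent</artifactId>
  <version>1.0.0</version>
</parent>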
<properties>
  <surefire.runtime>${project.build.directory}/surefire-runtime/slf4j-simple-2.0.1.jar</surefire.runtime>
</properties>
...
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-surefire-runtime</id>
      <goals>
        <goal>copy</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-simple</artifactId>
            <version>2.0.1</version>
            <type>jar</type>
            <overWrite>false</overWrite>
            <outputDirectory>${project.build.directory}/surefire-runtime/</outputDirectory>
          </artifactItem>
        </artifactItems>
      </configuration>
    </execution>
  </executions>
</plugin>
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <additionalClasspathElements>${surefire.runtime}</additionalClasspathElements>
  </configuration>
</plugin>
Yep, too many words there, but in my opinion that is the only correct configuration for test-runtime dependencies. Maybe it is worth submitting a corresponding PR to the Surefire project - I believe it would take about 10 LoC to avoid the maven-dependency-plugin configuration and allow configuring the test runtime in the following way:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <additionalClasspathElements>
      <additionalClasspathElement>org.slf4j:slf4j-api:2.0.1</additionalClasspathElement>
    </additionalClasspathElements>
  </configuration>
</plugin>

Conflict between httpclient version and Apache Spark

I'm developing a Java application using Apache Spark. I use this version:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.2.2</version>
</dependency>
In my code, there is a transitive dependency:
<dependency>
  <groupId>org.apache.httpcomponents</groupId>
  <artifactId>httpclient</artifactId>
  <version>4.5.2</version>
</dependency>
I package my application into a single JAR file. When deploying it on an EC2 instance using spark-submit, I get this error:
Caused by: java.lang.NoSuchFieldError: INSTANCE
at org.apache.http.conn.ssl.SSLConnectionSocketFactory.<clinit>(SSLConnectionSocketFactory.java:144)
at com.amazonaws.http.apache.client.impl.ApacheConnectionManagerFactory.getPreferredSocketFactory(ApacheConnectionManagerFactory.java:87)
at com.amazonaws.http.apache.client.impl.ApacheConnectionManagerFactory.create(ApacheConnectionManagerFactory.java:65)
at com.amazonaws.http.apache.client.impl.ApacheConnectionManagerFactory.create(ApacheConnectionManagerFactory.java:58)
at com.amazonaws.http.apache.client.impl.ApacheHttpClientFactory.create(ApacheHttpClientFactory.java:50)
at com.amazonaws.http.apache.client.impl.ApacheHttpClientFactory.create(ApacheHttpClientFactory.java:38)
This error clearly shows that SparkSubmit has loaded an older version of the same Apache httpclient library, and that is what causes the conflict.
What is a good way to solve this issue?
For some reason, I cannot upgrade Spark in my Java code. However, I could do that easily on the EC2 cluster. Is it possible to deploy my Java code on a cluster with a higher version, say 1.6.1?
As said in your post, Spark is loading an older version of httpclient. The solution is to use Maven's relocation facility to produce a neat conflict-free project.
Here's an example of how to use it in your pom.xml file:
<project>
  <!-- Your project definition here, with the groupId, artifactId, and its dependencies -->
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>2.4.3</version>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
            <configuration>
              <relocations>
                <relocation>
                  <pattern>org.apache.http.client</pattern>
                  <shadedPattern>shaded.org.apache.http.client</shadedPattern>
                </relocation>
              </relocations>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>
This will move all classes from org.apache.http.client to shaded.org.apache.http.client, resolving the conflict.
Original post:
If this is simply a matter of transitive dependencies, you could just add this to your spark-core dependency to exclude the HttpClient used by Spark:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.2.2</version>
  <scope>provided</scope>
  <exclusions>
    <exclusion>
      <groupId>org.apache.httpcomponents</groupId>
      <artifactId>httpclient</artifactId>
    </exclusion>
  </exclusions>
</dependency>
I also added the provided scope to your dependency, as it will be provided by your cluster.
However, that might muck around with Spark's internal behaviour. If you still get an error after doing this, you could try using Maven's relocation facility, which should produce a neat conflict-free project.
Regarding the fact you can't upgrade Spark's version: did you use exactly this dependency declaration from mvnrepository?
Spark being backwards compatible, there shouldn't be any problem deploying your job on a cluster with a higher version.

NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.directExecutor conflicts on Elasticsearch jar

While creating an Elasticsearch Client, I'm getting the exception java.lang.NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor;
After some lookup, it seems like Guava 18 is being overwritten by an older version at runtime, and Guava 18 only works during the compile task.
My Maven configuration is the following:
<build>
  <plugins>
    <plugin>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>3.0</version>
      <configuration>
        <source>1.7</source>
        <target>1.7</target>
      </configuration>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.4.1</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <transformers>
              <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
            </transformers>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
How can I force the Guava 18 version at execution time?
You should try to find where the "old" version of Guava is taken from and exclude it once and for all.
Find the dependency:
mvn dependency:tree | grep guava
Exclude it:
<dependency>
  <groupId>org.whatever</groupId>
  <artifactId>the_lib_that_includes_guava</artifactId>
  <version>0.97</version>
  <exclusions>
    <exclusion>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
    </exclusion>
  </exclusions>
</dependency>
See https://maven.apache.org/guides/introduction/introduction-to-optional-and-excludes-dependencies.html for more info on the dependency exclusion.
Adding the correct dependency for Elasticsearch resolved the problem:
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
  <version>18.0</version>
</dependency>
Adding a dependencyManagement block solves this problem:
<dependencyManagement>
  <!-- enforce dependency guava version 20.0 -->
  <dependencies>
    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <version>20.0</version>
    </dependency>
  </dependencies>
</dependencyManagement>
Reference:
http://techidiocy.com/maven-dependency-version-conflict-problem-and-resolution/
I was also seeing the error message mentioned by the OP when creating an Elasticsearch Client instance. In my case it was occurring in a Spring Boot app at application startup. Spring Boot was attempting to auto-configure the Elasticsearch Client via dependencies brought in by spring-boot-starter-data-elasticsearch. The underlying guava version being brought in was:
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
  <version>18.0</version>
</dependency>
This was all working fine until I introduced the following google-api-client dependency...
<dependency>
  <groupId>com.google.api-client</groupId>
  <artifactId>google-api-client</artifactId>
  <version>1.23.0</version>
</dependency>
...which depends on the following Guava dependency:
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava-jdk5</artifactId>
  <version>17.0</version>
</dependency>
This caused a classpath collision, and the fix was to exclude the older Guava version from google-api-client like so:
<dependency>
  <groupId>com.google.api-client</groupId>
  <artifactId>google-api-client</artifactId>
  <version>1.23.0</version>
  <exclusions>
    <exclusion>
      <groupId>com.google.guava</groupId>
      <artifactId>guava-jdk5</artifactId>
    </exclusion>
  </exclusions>
</dependency>
The best solution is to use the Shade plugin for Maven. Adding this to your pom.xml should fix it:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.google.common</pattern>
            <shadedPattern>shaded.com.google.common</shadedPattern>
          </relocation>
        </relocations>
        <artifactSet>
          <includes>
            <include>com.google.guava:guava</include>
          </includes>
        </artifactSet>
      </configuration>
    </execution>
  </executions>
</plugin>
This will create an uber jar with the same name, including only a shaded Guava inside.
I had a similar problem. I created a .jar file (Java source), then I wanted to load that file into the Spark shell. It turns out that the Spark shell loads jars from a directory similar to
spark-[version]-bin-hadoop[version]/jars/
That directory had an older version of Guava, which caused the error. I had the correct version in my pom.xml. I even added exclusions and tried all suggested responses. In conclusion, it was indeed a wrong version of Guava; I copied in the version that matches my pom.xml file. Hope this helps. Regards.
SOLVED: I updated the Guava dependency to the latest version and it solved the problem:
<!-- https://mvnrepository.com/artifact/com.google.guava/guava -->
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
  <version>latest</version>
</dependency>
I was struggling with this issue for the past 2 months and finally found the solution.
I had added too many external jars in my Project Structure, which created duplicate jars in the library root, resulting in conflicts whenever something was added in pom.xml.
So what needs to be done is to delete all external jar files from your project and keep only the ones which come from Maven (the entries like Maven: org...).
For an SBT solution, shade the library in build.sbt:
// Shading com.google.**
// We need com.google.guava above version 18, but Spark uses version 14,
// which doesn't have the directExecutor() method. As Spark gives preference
// to the libraries it ships with, our code was failing.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.**" -> "shadeio.@1").inAll
)

Specify both jar and test-jar types in Maven dependencies

I have a project called "commons" that contains common includes for both runtime and test.
In the main project I added a dependency for commons:
<dependency>
  <groupId>com.alexb</groupId>
  <artifactId>commons</artifactId>
  <version>1.0-SNAPSHOT</version>
</dependency>
However, the common test files are not included. So I added:
<dependency>
  <groupId>com.alexb</groupId>
  <artifactId>commons</artifactId>
  <version>1.0-SNAPSHOT</version>
  <type>test-jar</type>
</dependency>
However, when the type is test-jar, the runtime classes are not included.
Unfortunately, it seems I cannot include both:
<type>jar,test-jar</type>
What can I do to include both?
As @khmarbaise mentioned in the comments, you should separate the test-jar part into its own project.
I presume you have something like this in the commons pom.xml, which generates the commons test-jar:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jar-plugin</artifactId>
  <version>3.1.1</version>
  <executions>
    <execution>
      <goals>
        <goal>test-jar</goal>
      </goals>
    </execution>
  </executions>
</plugin>
The problem with this approach is that you don't get the transitive test-scoped dependencies automatically.
Check this link for more details:
https://maven.apache.org/plugins/maven-jar-plugin/examples/create-test-jar.html
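For reference, Maven does allow declaring the same artifact twice with different types, so a minimal sketch of consuming both (with the caveat above about transitive test-scoped dependencies) looks like this:
<dependency>
  <groupId>com.alexb</groupId>
  <artifactId>commons</artifactId>
  <version>1.0-SNAPSHOT</version>
</dependency>
<dependency>
  <groupId>com.alexb</groupId>
  <artifactId>commons</artifactId>
  <version>1.0-SNAPSHOT</version>
  <type>test-jar</type>
  <scope>test</scope>
</dependency>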

Snapshot dependency present in multiple repositories

I am facing a unique issue -
I have a plugin dependency which is present in multiple repositories. The version number is the same; just the snapshot qualifier (the time-stamp) is different.
Is there a way I can force Maven/Tycho to prefer the snapshot from a particular repository?
EDIT: They are p2 plugin repositories created for an Eclipse PDE build.
You can specify a filter on the target platform to remove all but one version:
<plugin>
  <groupId>org.eclipse.tycho</groupId>
  <artifactId>target-platform-configuration</artifactId>
  <version>${tycho-version}</version>
  <configuration>
    <filters>
      <filter>
        <type>eclipse-plugin</type>
        <id>id.of.dependency</id>
        <restrictTo>
          <version>1.2.3.2014020241355</version>
        </restrictTo>
      </filter>
    </filters>
  </configuration>
</plugin>
XML elements in lists are handled on a first-in, first-out basis by Maven. So, if you define your repository at the very top (before the other ones), Maven should end up resolving the artifact from there.
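A sketch of that ordering, with hypothetical repository ids and URLs:
<repositories>
  <!-- listed first, so Maven consults it first (hypothetical ids/URLs) -->
  <repository>
    <id>preferred-snapshots</id>
    <url>https://repo.example.com/preferred-snapshots</url>
  </repository>
  <repository>
    <id>other-snapshots</id>
    <url>https://repo.example.com/other-snapshots</url>
  </repository>
</repositories>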
If you're using an artifact repository manager, you could define routing rules.
I guess you could create a profile and add the repository you want as the only repository.
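A sketch of the profile idea, again with hypothetical ids and URLs:
<profiles>
  <profile>
    <id>pin-snapshot-repo</id>
    <repositories>
      <!-- the preferred repository, declared only in this profile -->
      <repository>
        <id>preferred-snapshots</id>
        <url>https://repo.example.com/preferred-snapshots</url>
      </repository>
    </repositories>
  </profile>
</profiles>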
You could exclude all transitive snapshot deps and add the dependency explicitly:
<dependency>
  <groupId>com.sun.something</groupId>
  <artifactId>something</artifactId>
  <version>version</version>
  <exclusions>
    <exclusion>
      <groupId>com.sun.somethingelse</groupId>
      <artifactId>transitive</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>com.sun.somethingelse</groupId>
  <artifactId>transitive</artifactId>
  <version>version</version>
</dependency>
