This is similar to Exclude dependency in child pom inherited from parent pom, except that it has to do with test vs compile scopes.
I have a parent POM that includes the org.slf4j:slf4j-api dependency so that all descendant projects will use SLF4J as the logging API. Then, so that all projects have some logging for unit tests (regardless of which SLF4J implementation they use in the main, non-test part of the project), I include SLF4J Simple, but only in the test scope:
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-simple</artifactId>
  <scope>test</scope>
</dependency>
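Putting the two together, the relevant part of the parent POM looks roughly like this (a sketch reconstructed from the description above; version numbers are managed elsewhere and omitted, as in the snippet):

<dependencies>
  <dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
  </dependency>
  <dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-simple</artifactId>
    <scope>test</scope>
  </dependency>
</dependencies>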
(I understand the view that parent POMs should not declare dependencies and should only use dependency management. While I don't disagree in general, configuring tests is a different story. I don't want every single subproject to have to declare JUnit, Hamcrest, Hamcrest Optional, Mockito, SLF4J Simple, etc. The testing framework should be uniform across all our projects, without a huge amount of ceremony just to set up a project.)
This works fine until one project, Foo, wants to use Logback as the SLF4J implementation.
<dependency>
  <groupId>ch.qos.logback</groupId>
  <artifactId>logback-classic</artifactId>
  <version>1.4.1</version>
</dependency>
That works fine for the Foo application itself, but now for the Foo tests there are suddenly two competing SLF4J implementations: Logback and SLF4J Simple. This produces a binding conflict:
SLF4J: Class path contains multiple SLF4J providers.
SLF4J: Found provider [ch.qos.logback.classic.spi.LogbackServiceProvider@363ee3a2]
SLF4J: Found provider [org.slf4j.simple.SimpleServiceProvider@4690b489]
SLF4J: See https://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual provider is of type [ch.qos.logback.classic.spi.LogbackServiceProvider@363ee3a2]
I need to do one of the following:
In the POM where I bring in the ch.qos.logback:logback-classic dependency, I need to exclude the org.slf4j:slf4j-simple dependency inherited from the parent POM. (This is the preferred solution.)
In the POM where I bring in the ch.qos.logback:logback-classic dependency, I need to specify that ch.qos.logback:logback-classic applies to all scopes except the test scope (so as not to conflict with org.slf4j:slf4j-simple).
I don't readily see how to do either of these. Any ideas?
One suggestion was to redeclare org.slf4j:slf4j-simple with <scope>provided</scope>. The pom.xml for project Foo would then look like this:
…
<dependency>
  <groupId>ch.qos.logback</groupId>
  <artifactId>logback-classic</artifactId>
</dependency>
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-simple</artifactId>
  <scope>provided</scope>
</dependency>
…
Unfortunately that doesn't work. SLF4J still sees two providers on the classpath and shows the message seen above. The provided scope merely keeps the dependency from being packaged and from propagating transitively to other projects; a provided dependency still sits on both the compile and test classpaths of the current project.
It sounds like you are trying to build a cathedral using the wrong tools, and instead of a cathedral you are getting a pagan temple :)
Technically, it is possible to override a classpath/module dependency imposed by a parent POM by redeclaring it with system scope, something like:
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-simple</artifactId>
  <version>2.0.1</version>
  <scope>system</scope>
  <systemPath>${project.basedir}/../dummy.jar</systemPath>
</dependency>
However, I wouldn't recommend doing that: the system scope is deprecated, and the dummy JAR has to actually exist at the given path.
Another option is to take advantage of the classpathDependencyExcludes configuration option of the Surefire plugin, something like:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <classpathDependencyExcludes>
      <classpathDependencyExclude>org.slf4j:slf4j-simple</classpathDependencyExclude>
    </classpathDependencyExcludes>
  </configuration>
</plugin>
If a particular parent does not suit the child's needs, the child may adopt another parent :) There is no strict requirement that the aggregator POM must also be the parent POM.
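A minimal sketch of that escape hatch, assuming a hypothetical alternative parent that declares everything except the test-logging dependency (the coordinates here are invented for illustration):

<!-- in Foo's pom.xml: inherit from a parent that does not impose slf4j-simple -->
<parent>
  <groupId>com.example</groupId>
  <artifactId>base-parent-no-test-logging</artifactId>
  <version>1.0.0</version>
</parent>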
The real problem is that, unlike modern build tools, Maven does not distinguish between test-compile and test-runtime scopes; however, it is possible to emulate such behaviour:
<properties>
  <surefire.runtime>${project.build.directory}/surefire-runtime/slf4j-simple-2.0.1.jar</surefire.runtime>
</properties>
...
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-surefire-runtime</id>
      <goals>
        <goal>copy</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-simple</artifactId>
            <version>2.0.1</version>
            <type>jar</type>
            <overWrite>false</overWrite>
            <outputDirectory>${project.build.directory}/surefire-runtime/</outputDirectory>
          </artifactItem>
        </artifactItems>
      </configuration>
    </execution>
  </executions>
</plugin>
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <additionalClasspathElements>
      <additionalClasspathElement>${surefire.runtime}</additionalClasspathElement>
    </additionalClasspathElements>
  </configuration>
</plugin>
Yep, too many words there, but in my opinion that is the only correct configuration for test-runtime dependencies. Maybe it is worth submitting a corresponding PR to the Surefire project; I believe about 10 LoC would make the maven-dependency-plugin configuration unnecessary and allow configuring the test runtime the following way:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <additionalClasspathElements>
      <additionalClasspathElement>org.slf4j:slf4j-simple:2.0.1</additionalClasspathElement>
    </additionalClasspathElements>
  </configuration>
</plugin>
I created a unit-test Maven project as a base project which other projects can extend and use, as described here. Here is the pom.xml:
<dependencies>
  <dependency>
    <groupId>org.powermock</groupId>
    <artifactId>powermock-mockito-release-full</artifactId>
    <version>1.6.4</version>
    <type>pom</type>
  </dependency>
</dependencies>
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-jar-plugin</artifactId>
      <version>2.6</version>
      <executions>
        <execution>
          <goals>
            <goal>test-jar</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
This generates two JARs in target:
unit-test-0.0.1-SNAPSHOT.jar
unit-test-0.0.1-SNAPSHOT-tests.jar
Now, I have some Spring Boot microservice projects, say service-a and service-b, which use another Maven project, say super-service, as a dependency. service-a and service-b declare super-service as follows in their respective pom.xml files:
<dependency>
  <groupId>com.super.service</groupId>
  <artifactId>super-service</artifactId>
  <version>0.0.1-SNAPSHOT</version>
</dependency>
I have written unit tests for the classes in super-service using the above unit-test Maven project, and that works fine. The pom.xml of super-service contains:
<dependency>
  <groupId>com.unit-test</groupId>
  <artifactId>unit-test</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <type>test-jar</type>
  <scope>test</scope>
</dependency>
But service-a and service-b do not work the same way. I thought the above dependency would get resolved through the base, i.e. super-service, but it does not, and the tests fail. I then tried repeating the same dependency declaration in each service's pom.xml, but the Maven tests still fail.
I also tried the other way described in that URL, which is to move the source files from src/test/java to src/main/java, but that too worked only for the super-service project, not for service-a and service-b.
You're doing the wrong thing. You should not run the unit tests of your dependencies; if you don't trust a dependency to pass its own tests, you shouldn't be using it.
Only use a dependency when you have evidence that it passed its tests. This is the job of your CI process, which should only put successfully tested binaries into your binary repository. For most people this means that Jenkins should run your tests, and only if they all pass should it put the binary into Nexus for you to depend on. As it releases to Nexus, the CI process, through Maven, should also update the version number from a snapshot to a release version so that you never depend on snapshots.
I'm developing a Java application using Apache Spark. I use this version:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.2.2</version>
</dependency>
In my code, there is a transitive dependency:
<dependency>
  <groupId>org.apache.httpcomponents</groupId>
  <artifactId>httpclient</artifactId>
  <version>4.5.2</version>
</dependency>
I package my application into a single JAR file. When deploying it on an EC2 instance using spark-submit, I get this error:
Caused by: java.lang.NoSuchFieldError: INSTANCE
at org.apache.http.conn.ssl.SSLConnectionSocketFactory.<clinit>(SSLConnectionSocketFactory.java:144)
at com.amazonaws.http.apache.client.impl.ApacheConnectionManagerFactory.getPreferredSocketFactory(ApacheConnectionManagerFactory.java:87)
at com.amazonaws.http.apache.client.impl.ApacheConnectionManagerFactory.create(ApacheConnectionManagerFactory.java:65)
at com.amazonaws.http.apache.client.impl.ApacheConnectionManagerFactory.create(ApacheConnectionManagerFactory.java:58)
at com.amazonaws.http.apache.client.impl.ApacheHttpClientFactory.create(ApacheHttpClientFactory.java:50)
at com.amazonaws.http.apache.client.impl.ApacheHttpClientFactory.create(ApacheHttpClientFactory.java:38)
This error clearly shows that SparkSubmit has loaded an older version of the same Apache HttpClient library, and that the conflict happens for this reason.
What is a good way to solve this issue?
For some reason, I cannot upgrade Spark in my Java code. However, I could easily do that on the EC2 cluster. Is it possible to deploy my Java code on a cluster with a higher version, say 1.6.1?
As said in your post, Spark is loading an older version of the HttpClient. The solution is to use the relocation facility of the Maven Shade plugin to produce a neat, conflict-free project.
Here's an example of how to use it in your pom.xml file:
<project>
  <!-- Your project definition here, with the groupId, artifactId, and its dependencies -->
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>2.4.3</version>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
            <configuration>
              <relocations>
                <relocation>
                  <pattern>org.apache.http.client</pattern>
                  <shadedPattern>shaded.org.apache.http.client</shadedPattern>
                </relocation>
              </relocations>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>
This will relocate all classes under org.apache.http.client to shaded.org.apache.http.client, rewriting the references in your own compiled classes to match, which resolves the conflict.
Original post:
If this is simply a matter of transitive dependencies, you could just add this to your spark-core dependency to exclude the HttpClient used by Spark:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.2.2</version>
  <scope>provided</scope>
  <exclusions>
    <exclusion>
      <groupId>org.apache.httpcomponents</groupId>
      <artifactId>httpclient</artifactId>
    </exclusion>
  </exclusions>
</dependency>
I also set the scope to provided on your dependency, as it will be provided by your cluster.
However, that might muck around with Spark's internal behaviour. If you still get an error after doing this, you could try the relocation approach above, which should produce a neat, conflict-free project.
Regarding the fact that you can't upgrade Spark's version: did you use exactly this dependency declaration from mvnrepository?
Spark being backwards compatible, there shouldn't be any problem deploying your job on a cluster with a higher version.
Recently I added the banTransitiveDependencies enforcer rule to my pom.xml, as seen below:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <version>1.4.1</version>
  <executions>
    <execution>
      <id>enforce-banned-dependencies</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <banTransitiveDependencies>
            <excludes>
              <!-- the rule will not fail even if it detects ignoredArtifact
                   of group org.apache.maven, because it is excluded -->
            </excludes>
            <includes>
            </includes>
          </banTransitiveDependencies>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>
When I try building my application with Maven, I get the following error:
[WARNING] Rule 0: org.apache.maven.plugins.enforcer.BanTransitiveDependencies failed with message:
org.hamcrest:hamcrest-all:jar:1.2:test has transitive dependencies:
commons-lang:commons-lang:jar:2.6:test
I am not sure I understand what is happening here. Why is the banning of transitive dependencies failing?
By the way, I have the following dependency in pom.xml:
<dependency>
  <groupId>org.apache.commons</groupId>
  <artifactId>commons-lang3</artifactId>
  <version>3.4</version>
</dependency>
So am I supposed to change the version of hamcrest-all, or should I add commons-lang 2.6 to my pom.xml as well?
Could you please explain the right way to "ban transitive dependencies"?
The banTransitiveDependencies rule is used to verify that your project doesn't inherit unwanted transitive dependencies. You configure it with:
<excludes>: a list of dependencies to ignore.
<includes>: a list of dependencies to consider. These are exceptions to the <excludes> configuration.
By default it excludes nothing, meaning that all transitive dependencies are banned. There is a slight difference between excluding nothing by default and including everything: the point is that you first define, globally, what you want to exclude from the ban, and then, within that subset, define what you want to include in it again.
This is why the build fails in your example: nothing is excluded (the default), and you have a transitive dependency on commons-lang:commons-lang:jar:2.6.
The example from the documentation illustrates this:
<excludes>
  <!-- the rule will not fail even if it detects ignoredArtifact
       of group org.apache.maven, because it is excluded -->
  <exclude>org.apache.maven:ignoredArtifact</exclude>
  <exclude>*:anotherIgnoredArtifact</exclude>
</excludes>
<includes>
  <!-- override "org.apache.maven:ignoredArtifact" to fail
       if exactly the 1.0 version of ignoredArtifact is detected
       to be a transitive dependency of the project -->
  <include>org.apache.maven:ignoredArtifact:[1.0]</include>
</includes>
In this configuration, they want to ban version 1.0 of org.apache.maven:ignoredArtifact as transitive.
So they redefine <excludes> so that all transitive dependencies matching org.apache.maven:ignoredArtifact are excluded, i.e. all dependencies with a group ID of org.apache.maven and an artifact ID of ignoredArtifact (meaning all versions with those IDs). Then they redefine <includes> so that only version 1.0 of org.apache.maven:ignoredArtifact is banned.
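Applied to the failing build above, a minimal sketch that tolerates commons-lang arriving transitively via hamcrest-all, rather than banning it, would therefore be:

<banTransitiveDependencies>
  <excludes>
    <!-- the rule will not fail on commons-lang coming in through hamcrest-all -->
    <exclude>commons-lang:commons-lang</exclude>
  </excludes>
</banTransitiveDependencies>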
The BanTransitiveDependencies rule triggers whenever one of your dependencies' own dependencies (i.e. transitive dependencies) is included in the build.
To avoid this warning, you'd have to exclude commons-lang:commons-lang:jar:2.6 when declaring your dependency on org.hamcrest:hamcrest-all:1.2:
<dependency>
  <groupId>org.hamcrest</groupId>
  <artifactId>hamcrest-all</artifactId>
  <version>1.2</version>
  <scope>test</scope>
  <exclusions>
    <exclusion>
      <groupId>commons-lang</groupId>
      <artifactId>commons-lang</artifactId>
    </exclusion>
  </exclusions>
</dependency>
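If your test code actually uses classes from commons-lang, you can then declare it directly; a direct dependency is not transitive, so the rule no longer objects. A sketch:

<dependency>
  <groupId>commons-lang</groupId>
  <artifactId>commons-lang</artifactId>
  <version>2.6</version>
  <scope>test</scope>
</dependency>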
I have an aspect written in AspectJ, as below:
public aspect TestAspect {
    pointcut classicPointcut(PersistenceManagerImpl object) :
        execution(manager.PersistenceManagerImpl.new(..)) && target(object);

    after(PersistenceManagerImpl object) : classicPointcut(object) {
        System.err.println(object.getClass().getSimpleName());
    }
}
This aspect is in module aspect, which is packaged as a JAR. PersistenceManagerImpl is in another module, but I need to use it in module aspect. For dependency management I use Maven, but here, of course, there is a problem with a cyclic reference. Is there some way I can resolve this problem?
----------EDIT----------
I get only this error:
java.lang.NoSuchMethodError:TestAspect.ajc$after$TestAspect$1$cc149106(Ljava/lang/Object;)V
When I move my aspect to the same module as PersistenceManagerImpl, I get the correct result (of course). But that is not what I wanted.
Could you post the error output from compiling the code? You could try declaring the other module as a regular dependency first, and then also listing it as a weaveDependency in the aspectj-maven-plugin configuration in pom.xml, as follows:
....
<dependency>
  <groupId>com.maventest</groupId>
  <artifactId>mytest</artifactId>
  <version>1.0-SNAPSHOT</version>
  <scope>compile</scope>
</dependency>
....
....
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>aspectj-maven-plugin</artifactId>
  <version>1.8</version>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <source>${maven.compiler.source}</source>
    <target>${maven.compiler.target}</target>
    <showWeaveInfo>true</showWeaveInfo>
    <complianceLevel>${maven.compiler.target}</complianceLevel>
    <encoding>${project.build.sourceEncoding}</encoding>
    <weaveDependencies>
      <weaveDependency>
        <groupId>com.maventest</groupId>
        <artifactId>mytest</artifactId>
      </weaveDependency>
    </weaveDependencies>
  </configuration>
  <dependencies>
    <dependency>
      <groupId>org.aspectj</groupId>
      <artifactId>aspectjrt</artifactId>
      <version>${aspectj.version}</version>
    </dependency>
    <dependency>
      <groupId>org.aspectj</groupId>
      <artifactId>aspectjtools</artifactId>
      <version>${aspectj.version}</version>
    </dependency>
  </dependencies>
</plugin>
PS: You can see my question asking the same thing here.
Your aspect seems to be specific to the module in which PersistenceManagerImpl is located, so that module should be a dependency of the aspect module. On the other hand, that module depends on the aspect module because it needs it as an <aspectLibrary> in the AspectJ Maven configuration. Indeed a circular dependency, but an unnecessary one. You have two options:
Move the aspect to the application module it is specific to, because IMO it belongs there if it explicitly uses specific classes from there. Only aspects which implement cross-cutting concerns in a way applicable to multiple modules should live in their own aspect library.
Following the previous thought, you could make your aspect more general, e.g. do something like this:
public aspect TestAspect {
    pointcut classicPointcut(Object object) :
        execution(*..PersistenceManagerImpl.new(..)) &&
        target(object);

    after(Object object) : classicPointcut(object) {
        System.err.println(object.getClass().getSimpleName());
    }
}
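With the aspect generalized like this, the dependency arrow points one way only: the application module depends on the aspect module and weaves it in as an aspect library. A sketch of the application module's pom.xml, with invented coordinates for the aspect module:

<dependency>
  <groupId>com.example</groupId>
  <artifactId>aspect</artifactId>
  <version>1.0-SNAPSHOT</version>
</dependency>
...
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>aspectj-maven-plugin</artifactId>
  <configuration>
    <aspectLibraries>
      <aspectLibrary>
        <groupId>com.example</groupId>
        <artifactId>aspect</artifactId>
      </aspectLibrary>
    </aspectLibraries>
  </configuration>
</plugin>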
I have a Maven project structure like this:
main (POM project)
|-ejb (POM project)
| |-data1 (EJB module)
| |-data2 (EJB module)
| |-ejb-jsf-converters (Java application)
|-web (POM project)
| |-... (A bunch of Web applications)
|-ear (POM project)
|-web1-ear (Java EE7 EAR project)
|-web2-ear (Java EE7 EAR project)
I can compile everything in the order data1, data2, ejb-jsf-converters, web, and finally ear. After that I can deploy web1-ear and web2-ear, and everything works fine. But having to compile everything in a specific order is annoying. I want to be able to just compile main.
But I can't.
The problem is that I have a lot of entities in data1, and these entities of course have a lot of meta-model classes. To generate them, I have these two snippets in data1's pom.xml:
...
<dependency>
  <groupId>org.eclipse.persistence</groupId>
  <artifactId>eclipselink</artifactId>
  <version>2.4.2</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.eclipse.persistence</groupId>
  <artifactId>javax.persistence</artifactId>
  <version>2.0.5</version>
  <scope>provided</scope>
</dependency>
...
<plugin>
  <groupId>org.bsc.maven</groupId>
  <artifactId>maven-processor-plugin</artifactId>
  <version>2.2.4</version>
  <executions>
    <execution>
      <id>process</id>
      <goals>
        <goal>process</goal>
      </goals>
      <phase>generate-sources</phase>
      <configuration>
        <compilerArguments>-Aeclipselink.persistencexml=src/main/resources/META-INF/persistence.xml</compilerArguments>
        <processors>
          <processor>org.eclipse.persistence.internal.jpa.modelgen.CanonicalModelProcessor</processor>
        </processors>
      </configuration>
    </execution>
  </executions>
</plugin>
...
When I compile main instead of data1, all of a sudden persistence.xml cannot be found, and no meta-model classes are generated or found. I can change
<compilerArguments>-Aeclipselink.persistencexml=src/main/resources/META-INF/persistence.xml</compilerArguments>
to
<compilerArguments>-Aeclipselink.persistencexml=ejb/data1/src/main/resources/META-INF/persistence.xml</compilerArguments>
and then I can compile the main project. But when I do that, I can no longer compile data1; then it's data1 that cannot find persistence.xml and generate the meta-model classes. Being able to compile main is great, but being forced to compile main each time I just want to compile data1 is a pain: compiling main takes 10-15 times longer than compiling data1.
I know I can make it work by specifying an absolute path instead of a relative one, but many different machines need to compile this, and the absolute path will not be the same on all of them.
I thought about using an environment variable, but I'm not sure it would work well in all cases, for example when the project is compiled by a Jenkins slave.
How can I make this work, so I can compile both main and data1 as I see fit?
I got it working. I changed my dependencies to
<dependency>
  <groupId>org.eclipse.persistence</groupId>
  <artifactId>eclipselink</artifactId>
  <version>2.6.0</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.eclipse.persistence</groupId>
  <artifactId>javax.persistence</artifactId>
  <version>2.1.0</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.eclipse.persistence</groupId>
  <artifactId>org.eclipse.persistence.jpa.modelgen.processor</artifactId>
  <version>2.5.2</version>
  <scope>provided</scope>
</dependency>
and removed
<compilerArguments>-Aeclipselink.persistencexml=src/main/resources/META-INF/persistence.xml</compilerArguments>
completely. It complains a lot about missing meta-model classes while compiling, but it works, so clearly the classes are generated anyhow. Presumably the processor, now on the module's own classpath, finds META-INF/persistence.xml there by itself, which makes the lookup independent of the directory the build is started from.
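For what it's worth, if the explicit compiler argument is ever needed again, anchoring the path with the ${project.basedir} property (which Maven resolves to each module's own directory) should make it machine-independent and work the same whether main or data1 is being built; a sketch:

<compilerArguments>-Aeclipselink.persistencexml=${project.basedir}/src/main/resources/META-INF/persistence.xml</compilerArguments>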