How can one safely manage such a problem?
In Maven we have two libraries (A at version 20, B at version 30) that depend on C (1.0 and 1.1 respectively). We get jar hell in the target libs:
*--A20--C1.0
\--B30--C1.1
Then I add a dependencyManagement section (sketched after the tree below) and force the C version to 1.1. It works as expected:
*--A20--C1.1 (not C1.0)
\--B30--C1.1
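For reference, a minimal sketch of that dependencyManagement section (the coordinates for C are placeholders, since the question only names the libraries A, B and C):
<dependencyManagement>
  <dependencies>
    <dependency>
      <!-- com.example is a placeholder group id for C -->
      <groupId>com.example</groupId>
      <artifactId>C</artifactId>
      <version>1.1</version>
    </dependency>
  </dependencies>
</dependencyManagement>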
After several months, we decide to upgrade A to version 50. It now depends on C version 2.0, but the project still uses 1.1 due to dependencyManagement. That is now a problem:
*--A50--C1.1 (not C2.0 as needed)
\--B30--C1.1
The question is: is it possible to override a transitive dependency such that the override applies only if the version of the transitive dependency matches the version we set? Otherwise the override would be ignored and we would see an error (with the help of the maven-enforcer plugin, for instance). Is there a plugin to check this case?
No.
I understand your use case, but I do not see anything in Maven to produce the result you want.
My program relies on the following code to get available system memory:
import oshi.SystemInfo;
import oshi.hardware.HardwareAbstractionLayer;
SystemInfo si = new SystemInfo();
HardwareAbstractionLayer hal = si.getHardware();
// Next line throws exception: NoClassDefFoundError -> com/sun/jna/platform/win32/Psapi
long availableBytes = hal.getMemory().getAvailable();
double availableMegabytes = ((double) availableBytes) / 1048576;
double availableGigabytes = availableMegabytes / 1024;
Update: after deleting every occurrence of oshi-core from every project in the workspace (to rule out a transitive dependency conflict; only 4.2.1 is left), the error I now get is java.lang.NoClassDefFoundError: com/sun/jna/platform/win32/VersionHelpers.
In pom.xml I've added the oshi-core dependency. I've tried almost every version from 3.4.0 up to the latest, 4.2.1, and they all result in the same error.
I realize oshi-core relies on jna and jna-platform. In the Dependency Hierarchy view I see that both have resolved (compiled) to version 5.5.0.
What is causing this error and how can it be solved?
Thanks!
P.S. I've seen some other threads with a similar error, but could not find any thread with this exact problem (a missing com/sun/jna/platform/win32/Psapi).
While you've pointed out in your comments that you think the latest version of JNA is being resolved, the errors indicate that your project does not have the most recent version of jna-platform (or possibly has multiple versions linked on the classpath). This is nearly always the cause of a NoClassDefFoundError; you're troubleshooting in the right direction, but the evidence indicates there's an old jna-platform version in your project somewhere.
The com.sun.jna.platform.win32.VersionHelpers class is in jna-platform version 5.3.0 and newer. The GetPerformanceInfo() method required by the call that gives you the error is in the com.sun.jna.platform.win32.Psapi class, which has been in jna-platform since version 4.3.0. If your classloader can't find these classes, then you don't have the correct jars linked to your project -- or you have incorrect jars linked alongside the correct ones.
Maven resolves dependencies level by level: first the dependencies you list in your POM (in order), then the transitive dependencies of those artifacts (in order), and so on; the nearest declaration wins. You can ensure the most recent version of JNA is used by either (or both) of the following:
Specify the oshi-core dependency earlier in your POM's list of dependencies, specifically before any artifact that depends on an earlier version of JNA.
Explicitly specify the jna and jna-platform versions (5.5.0) in your top-level POM (see the sketch after this list).
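A minimal sketch of those direct declarations (net.java.dev.jna is JNA's standard Maven group id; the version matches the 5.5.0 mentioned above):
<dependency>
  <groupId>net.java.dev.jna</groupId>
  <artifactId>jna</artifactId>
  <version>5.5.0</version>
</dependency>
<dependency>
  <groupId>net.java.dev.jna</groupId>
  <artifactId>jna-platform</artifactId>
  <version>5.5.0</version>
</dependency>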
Also, in Eclipse, be sure to go through the menus to Update Maven Project to ensure your dependencies are in sync after changes in the POM.
It's possible that your local repository has not downloaded the updated jar, in which case you can purge it (or just delete the JNA artifacts, or everything, from C:\Users\<username>\.m2\repository and let it rebuild), for example:
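One way to do that without touching the directory by hand is the standard maven-dependency-plugin goal (shown here without any extra filtering options):
mvn dependency:purge-local-repository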
Also check the classpath in Eclipse. If you have manually added dependencies (e.g., to JNA) before setting up your POM to get them from Maven, you could be using those.
If the above hints do not resolve your problem, please post the contents of the dependencies section of your pom.xml file so we can provide additional advice.
It seems oshi-core relies on internal, undocumented features of the Sun/Oracle JVM, and you're running on a different and/or newer JVM that no longer has that undocumented feature. That's the risk of using undocumented features.
Get a newer or different version of oshi-core that supports the JVM you're using, or switch to a JVM that oshi-core supports.
General Description:
I have two projects A and B.
Project A must use version v1 of the L library/API.
Project B must use version v2 of the L library/API.
Project A has a dependency on project B (in project A, I need to call a method contained in B).
Concrete description:
Project A is actually a machine learning project with a collection of algorithms that use an older version of spark-mllib.
I want to integrate the XGBOOST-spark algorithm in project A.
The problem is that the XGBOOST API, specifically the ml.dmlc.xgboost4j.scala.spark.XGBoost.train() method, expects an RDD<org.apache.spark.ml.feature.LabeledPoint>. But org.apache.spark.ml.feature.LabeledPoint is only available in the newer version of spark-mllib, and from project A (which uses the older version of spark-mllib) I only have access to an org.apache.spark.mllib.regression.LabeledPoint. So I cannot directly integrate XGBOOST into project A without upgrading project A's spark-mllib version.
Fortunately, the newer version of spark-mllib has a method of converting from the old LabeledPoint (org.apache.spark.mllib.regression.LabeledPoint) to the new LabeledPoint (org.apache.spark.ml.feature.LabeledPoint). The method is: org.apache.spark.mllib.regression.LabeledPoint.asML().
So, the question is: Is there any clever way of using that method .asML() which is available only in the newer version of spark, so that I can convert the LabeledPoint and pass it to the XGBOOST API?
I am not familiar with how dependencies are treated by Maven, but I thought of something like this:
Create a project B that uses the newer version of spark-mllib and the XGBOOST API, containing a class with a method that receives the parameters (from project A), converts the old LabeledPoint to the new LabeledPoint, calls the XGBoost.train() method to generate a model, and then passes the model back to project A. We import that class into project A (from project B), call its method, get the model, and continue with our business as usual.
Of course, I tried to do that, but it doesn't work. I think that's because we can only have one version of spark-mllib in the whole dependency tree. Since the class from project B throws java.lang.NoSuchMethodError: org.apache.spark.mllib.regression.LabeledPoint.asML()Lorg/apache/spark/ml/feature/LabeledPoint;, it seems that the whole dependency tree actually uses the older version of spark-mllib (because the older version is closer to the root of the dependency tree), even though project B uses the newer version of spark-mllib, which has the asML() method available.
So, the actual question is: is there any clever way of making this work without upgrading the spark-mllib version in project A? Upgrading is not a viable option: project A is big, and if I upgrade that version I break just about everything.
[Update]
I even tried to use a ClassLoader (URLClassLoader) to load the class directly from spark-mllib_2.11-2.3.0.jar and print all the available methods. Code here:
import java.lang.reflect.Method;
import java.net.URL;
import java.net.URLClassLoader;

// Load the LabeledPoint class straight from the jar and list its declared methods
URLClassLoader clsLoader = URLClassLoader.newInstance(new URL[] {
        new URL("file:///home/myhome/spark-mllib_2.11-2.3.0.jar")
});
Class<?> cls = clsLoader.loadClass("org.apache.spark.mllib.regression.LabeledPoint");
Method[] methods = cls.getDeclaredMethods();
for (Method m : methods)
    System.out.println(m.toString());
In the pom.xml of this project, if I add a dependency on:
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-mllib_2.11</artifactId>
<version>2.3.0</version>
</dependency>
then the method public org.apache.spark.ml.feature.LabeledPoint org.apache.spark.mllib.regression.LabeledPoint.asML() is present in the results when I use the 2.3.0 version.
But when I use version 1.6.2 of spark-mllib, it isn't there anymore, even though the asML() method is inside the jar the ClassLoader points at. Which is kind of weird.
You can achieve this by creating a shaded dependency of Project B and using it in Project A. Refer to this answer for an explanation of Maven shading and how to use it.
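As a rough sketch of what that shading could look like in project B's pom.xml (the plugin version and the relocated prefix are illustrative assumptions, not taken from your build):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <!-- move project B's newer spark-mllib classes under a prefix only B uses -->
            <pattern>org.apache.spark</pattern>
            <shadedPattern>shadedb.org.apache.spark</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
Project A then depends on the shaded artifact of B, so B's copy of spark-mllib no longer collides with the older one on A's classpath.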
In my project, jersey-core is pulled in by many dependencies; I don't know which ones. I believed it didn't matter, because I thought that if multiple dependencies pull in the same one, then Gradle would always take the higher version. I was wrong.
[ERROR] [main] [n/a] org.apache.catalina.core.ContainerBase.[Tomcat].[localhost].[/] - StandardWrapper.Throwable
java.lang.NoSuchMethodError: com.sun.jersey.core.reflection.ReflectionHelper.getContextClassLoaderPA()Ljava/security/PrivilegedAction;
at com.sun.jersey.spi.scanning.AnnotationScannerListener.<init>(AnnotationScannerListener.java:94) ~[jersey-server-1.19.jar:1.19]
AnnotationScannerListener is from 1.19, ReflectionHelper is from 1.1, and the method getContextClassLoaderPA() does not exist in ReflectionHelper 1.1.
How can I force Gradle to always take the higher version?
I use IntelliJ.
By default, Gradle should put the highest requested version of a dependency on the classpath.
You can force the version of a dependency to be a specific version like so:
configurations.all {
resolutionStrategy {
// force certain versions of dependencies (including transitive)
// *append new forced modules:
force 'asm:asm-all:3.3.1', 'commons-io:commons-io:1.4'
}
}
This example was lifted directly from https://docs.gradle.org/current/dsl/org.gradle.api.artifacts.ResolutionStrategy.html, which might be worth a read, along with https://docs.gradle.org/current/userguide/dependency_management.html
Another piece of advice, if you want to find out what is pulling in conflicting versions of jars, you can do the following:
gradle dependencyInsight --dependency $dependencyName --configuration $configurationName
where $dependencyName should be replaced with the name of your dependency (such as asm-all), and $configurationName with the name of the configuration you wish to check (such as compile). This will give you a graph of which versions are being pulled in by which dependencies.
Java 8 here.
Say there is an old version of the widget library, with Maven coordinates widgetmakers:widget:1.0.4, that has a class defined in it like so:
public class Widget {
private String meow;
// constructor, getters, setters, etc.
}
Years pass. The maintainers of this widget library decide that a Widget should never meow, rather, that it should in fact bark. And so a new release is made, with Maven coordinates widgetmakers:widget:2.0.0 and with Widget looking like:
public class Widget {
private Bark bark;
// constructor, getters, setters, etc.
}
So now I go to build my app, myapp. And, wanting to use the latest stable versions of all my dependencies, I declare my dependencies like so (inside of build.gradle):
dependencies {
    compile (
        'org.slf4j:slf4j-api:1.7.20',
        'org.slf4j:slf4j-simple:1.7.20',
        'bupo:fizzbuzz:3.7.14',
        'commons-cli:commons-cli:1.2',
        'widgetmakers:widget:2.0.0'
    )
}
Now let's say that this (fictional) fizzbuzz library has always depended on a 1.x version of the widget library, where Widget would meow.
So now, I'm specifying 2 versions of widget on my compile classpath:
widgetmakers:widget:1.0.4 which is pulled in by the fizzbuzz library, as a dependency of it; and
widgetmakers:widget:2.0.0 which I am referencing directly
So obviously, depending on which version of Widget gets classloaded first, we will either have a Widget#meow or a Widget#bark.
Does Gradle provide any facilities for helping me out here? Is there any way to pull in multiple versions of the same class, and configure fizzbuzz classes to use the old version of Widget, and my classes to use the new version? If not, the only solutions I can think of are:
I might be able to accomplish some kind of shading- and/or fat-jar-based solution, where perhaps I pull in all my dependencies as packages under myapp/bin and then give them different version prefixes. Admittedly I don't see a clear solution here, but I'm sure something is feasible (yet totally hacky/nasty). Or...
Carefully inspect my entire dependency graph and just make sure that all of my transitive dependencies don't conflict with each other. In this case for me, this means either submitting a pull-request to the fizzbuzz maintainers to upgrade it to the latest widget version, or, sadly, downgrading myapp to use the older widget version.
But Gradle (so far) has been magic for me. So I ask: is there any Gradle magic that can avail me here?
I don't know the specifics of Gradle, as I'm a Maven person, but this is more generic anyway. You basically have two options (and both are hacky):
ClassLoader magic. Somehow, you need to convince your build system to load two versions of the library (good luck with that), then at runtime load the classes that use the old version with a ClassLoader that has the old version. I have done this, but it's a pain. (Tools like OSGi may take away some of this pain.)
Package shading. Repackage the library A that uses the old version of library B, so that B is actually inside A, but with a B-specific package prefix. This is common practice; e.g. Spring ships its own version of asm. On the Maven side, the maven-shade-plugin does this; there is a Gradle equivalent (see the sketch after this list). Or you can use ProGuard, the 800-pound gorilla of jar manipulation.
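The commonly used Gradle equivalent is the Shadow plugin. A minimal sketch of a wrapper project that repackages the old widget classes (the plugin version and the com.widgetmakers package name are assumptions, since the question only gives the Maven coordinates):
plugins {
    id 'java'
    id 'com.github.johnrengelman.shadow' version '2.0.4'
}

dependencies {
    compile 'bupo:fizzbuzz:3.7.14'
    compile 'widgetmakers:widget:1.0.4'
}

shadowJar {
    // 'com.widgetmakers' is an assumed package name for the old widget classes;
    // relocating it rewrites fizzbuzz's references to point at the shaded copy
    relocate 'com.widgetmakers', 'shaded.com.widgetmakers'
}
The main build then depends on the output of shadowJar instead of on fizzbuzz directly.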
Gradle will only set up the classpath with your dependencies; it doesn't provide its own runtime to encapsulate a dependency and its transitive dependencies. The version active at runtime will be the one determined by the classloading rules, which I believe means the first jar in classpath order that contains the class. OSGi provides a runtime that can deal with situations like this, and so will the upcoming module system.
EDIT: Bjorn is right in that Gradle will try to resolve conflicts between different versions; it assembles the classpath based on its resolution strategies, so the order you put your dependencies in the file doesn't matter. However, you still only get one class per class name, so it won't resolve the OP's issue.
If you have different versions of a library with otherwise equal coordinates, Gradle's conflict resolution mechanism comes into play.
The default resolution strategy is to use the newest requested version of the library. You will not get multiple versions of the same library in your dependency graph.
If you really need different versions of the same library at runtime, you would have to either do some ClassLoader magic, which is definitely possible, or do some shading for one or both of the libraries.
Regarding conflict resolution, Gradle has a built-in newest strategy, which is the default, and a fail strategy that fails the build if different versions are in the dependency graph, so that you have to explicitly resolve version conflicts in your build files.
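A minimal sketch of switching to the fail strategy in build.gradle (failOnVersionConflict() is part of Gradle's ResolutionStrategy API):
configurations.all {
    resolutionStrategy {
        // fail the build instead of silently picking the newest version
        failOnVersionConflict()
    }
}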
A worse case is when the same class appears in multiple jars. This is more insidious: look at the metrics jars from Codahale and Dropwizard, with incompatible versions of the same class in the two jars.
The gradle classpath-hell plugin can detect this horror.
I've got a Maven project that contains two dependencies, A and B. Each of these depends transitively on C, but they depend on different versions of C. Let's say that A depends on C version 1, and B depends on C version 2.
Unfortunately, A is not bytecode-compatible with version 2, nor B with version 1. (As it happens, A is source-compatible with version 2, but I don't think that will help us here.)
This means that I need both versions of the transitive dependency in my project, and I need A to use version 1, and B to use version 2.
Is there a way of doing this?
I had assumed that I would need to use the shade plugin to relocate the package name of A and all its dependencies, but this doesn't seem to be possible. If I shade A, its dependencies don't get shaded, and it still picks up version 2, and fails to run.
Create another project that wraps A, named A-wrapper, and relocate C in A-wrapper.
Then, in your main project, depend on A-wrapper and B.
I've met a similar problem with pb2 and pb3 and resolved it this way.
https://stackoverflow.com/a/41394239/1395722
Assuming dependency A requires v1 of C and dependency B requires v2 of C, you can create an uber-jar of A containing v1 of C but with the packaging changed using the shade plugin.
For example, jar A ends up containing the contents of C under the new package "v1.c.something". Do the same for B, so jar B contains the contents of C under the new package "v2.c.something". You need to relocate only the conflicting dependencies, not all of them.
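As a rough sketch, the relocation part of the maven-shade-plugin configuration in the A wrapper would contain something like the following (C's real package name is a placeholder here; the v1.c.something prefix follows the naming used above):
<relocations>
  <relocation>
    <!-- replace c.something with C's actual base package -->
    <pattern>c.something</pattern>
    <shadedPattern>v1.c.something</shadedPattern>
  </relocation>
</relocations>
The B wrapper would do the same with a v2.c.something prefix.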