I am working on a use case where my application uses okhttp [version: 3.14.9], which is written in Java. I am also importing an external dependency that uses okhttp [version: 4.10.0], which is written in Kotlin. The external dependency calls a method of okhttp 4.10.0 that does not exist in okhttp 3.14.9. Since my application resolves the call against the 3.14.9 dependency at runtime, it cannot find the method and throws java.lang.NoSuchMethodError.
I know this is a common problem that almost every Java application faces at some point, so I want to know the right approach, the one most widely used by Java applications.
There is no proper solution for this problem: on a flat classpath you can only have one version.
A way around this is to use OSGi, but that would require a major redesign of your application.
This is a classic problem where you have more than one version of a dependency available at runtime. I would suggest you uplift the version of OkHttp in your application to match that of the external dependency. This would remove many headaches like classpath resolution issues.
If you're using a build tool like Maven, another option is to define 3.14.9 as the compile-time dependency and include 4.10.0 as the runtime dependency, so that only 4.10.0 is packaged and available at runtime.
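A related approach I have seen used more often with Maven is to pin a single version for the whole build in dependencyManagement, so that only one OkHttp ends up on the runtime classpath. A minimal sketch, assuming your own code compiles and passes its tests against the 4.x API (OkHttp 4.x is intended to be largely binary compatible with 3.14.x, but verify that for your usage):

    <dependencyManagement>
      <dependencies>
        <!-- Force every module and transitive path to resolve to 4.10.0 -->
        <dependency>
          <groupId>com.squareup.okhttp3</groupId>
          <artifactId>okhttp</artifactId>
          <version>4.10.0</version>
        </dependency>
      </dependencies>
    </dependencyManagement>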
I would try to upgrade my code from 3.14 to 4.10 to match the external dependency. Generally it is good to upgrade packages whenever you can, so this should be a good thing (unless there is a strong reason why you can't). The worst case is when an external dependency forces you to downgrade, which is frustrating but sometimes unavoidable.
I am a library maintainer, and I have reached a point where I discovered that one of my libraries can cause a dependency issue for the end user, because it provides a transitive dependency. The end user might be using a different version of that transitive dependency, which can result in unexpected behaviour. I don't want to force the end user onto a specific version by providing the transitive one, but I still want my library to be functional, so I don't know the best way to solve this. Should I use the default dependency scope, or should I switch to the provided scope?
I also want to provide some context to make this question clearer. I created sslcontext-kickstart, which is just a high-level library for configuring SSL for a server or client. The library has additional separate dependencies which the end user can use to make it easier for their use case. The core library only has a dependency on slf4j-api. However, there are separate libraries containing mappers, which I also created, for Apache 4, Apache 5, Netty and Jetty, and which rely on the core library. Apache 4, Apache 5, Netty and Jetty are currently compile-scoped dependencies, and therefore the end user will also get the version specified in my POM. Let's assume someone is using the Apache 4 version. Should the end user exclude the dependency manually when they use my library and don't want the specific transitive Apache 4 dependency? Or should I mark Apache 4 as provided scope? That way there would be no transitive dependency, but the end user would have to have the Apache 4 dependency present on their classpath, or they would get a runtime exception when they use my library.
What do you guys think regarding this topic? What are my options and which should I choose in your opinion?
Two options:
Mark the seldom-needed dependencies as optional (see e.g. https://stackoverflow.com/a/40398649/927493, and the sketch after this list).
Create different JARs for the different target platforms, like sslcontext-kickstart-for-apache4. Then the users of the library can choose whichever fits their needs best without getting the unwanted dependencies.
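For the first option, the relevant POM fragments would look roughly like this; the HttpClient coordinates and versions are only illustrative and not taken from the actual sslcontext-kickstart build:

    <!-- In the library's POM: the Apache HttpClient 4 integration is opt-in -->
    <dependency>
      <groupId>org.apache.httpcomponents</groupId>
      <artifactId>httpclient</artifactId>
      <version>4.5.13</version>
      <optional>true</optional>
    </dependency>

    <!-- In the end user's POM: they declare the version they actually want -->
    <dependency>
      <groupId>org.apache.httpcomponents</groupId>
      <artifactId>httpclient</artifactId>
      <version>4.5.14</version>
    </dependency>

Like provided, an optional dependency is not propagated to consumers, but it signals "only needed if you use this feature" rather than "will be supplied by the runtime environment".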
We have a project which depends on Aspose Words' com.aspose:aspose-words:16.10.0:jdk16.
The POM for aspose-words declares no dependencies, but this turns out to be a lie. It actually uses jai-core, the latest version of which is javax.media:jai-core:1.1.3.
The POM for jai-core, though, also lies - it declares no dependencies, but actually depends on jai-codec, which is at com.sun.media:jai-codec:1.1.3.
Getting these projects to fix things seems impractical. JAI is basically a dead project, and Maven Central has no idea who added that POM, so there is nobody responsible for fixing the metadata. Aspose refuse to fix things without a test reproducing the problem, even if you can show them their own code doing it wrong; and even if they did fix it, they would then add their own dependency on jai-core 1.1.3, which only fixes half the problem anyway.
If I look at our entire tree of dependencies, this is only one example of the problem. Others are lurking, masked out by other dependency chains coincidentally pulling in the missing dependency. In some cases, we have even reported POM issues to projects, only for them to say that the dependency "isn't real", despite their classes clearly referring to a class in the other library.
I can think of a few equally awkward options:
Create jai-core:1.1.3.1 and aspose-words:16.10.0.1 and fix their POMs to include the missing dependencies, but whoever updates them in the future will have to do the same thing. Plus, any other library I don't know about which happens to depend on jai-core would also have to be updated.
Add a dependency from our own project, even though it really isn't one.
Edit the POM for the versions which are there now to fix the problem directly, only caveat being that people might have cached the wrong one.
So I guess I have two related questions about this:
Is there no proper way to resolve this? It seems like any non-toy project would eventually hit this problem, so there not being an obviously correct way to deal with it is worrying.
Is there a way to stop incorrect dependency metadata getting into the artifact server in the first place? It's getting kind of out of hand, because other devs on the team are adding the dependencies without checking things properly, and then I'm left to clean up their error when something breaks a year later.
Tunaki has already given many good approaches. Let me add the following:
We had to deal with a lot of legacy JARs which are old or strange versions of JARs that already exist on Maven Central. We gave them a special kind of version number (like 1.2.3-companyname) and created a POM for them that fitted our purposes. This is, more or less, your first "awkward option", and it is what I would go for in your case; additionally, I would define the version in dependencyManagement, so that Maven dependency mediation will not set it to some other version.
If a new version of your JAR comes around, you can check whether it still has the same problems (if they did a correct Maven build, it should have all dependencies inside the POM). If so, you need to fix it again.
I wouldn't change POMs for already existing versions, because it confuses people and may lead to inconsistency problems: Maven will not fetch the new POM if an old version is already in the local repository. Adding the dependency to your own project is an option if you have very few projects to manage, so that you can still see what is going on (a proper comment on those dependencies in the POM would make it clearer).
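As a concrete illustration of that setup, with the -companyname suffix standing in for whatever convention your internal repository uses:

    <!-- Deployed to the internal repository as javax.media:jai-core:1.1.3-companyname,
         with the missing dependency declared explicitly in its POM -->
    <dependency>
      <groupId>com.sun.media</groupId>
      <artifactId>jai-codec</artifactId>
      <version>1.1.3</version>
    </dependency>

    <!-- In the consuming projects, pin the fixed version so dependency
         mediation cannot silently fall back to the broken 1.1.3 -->
    <dependencyManagement>
      <dependencies>
        <dependency>
          <groupId>javax.media</groupId>
          <artifactId>jai-core</artifactId>
          <version>1.1.3-companyname</version>
        </dependency>
      </dependencies>
    </dependencyManagement>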
JAI is optional for Aspose.Words for Java. Aspose.Words for Java uses the JAI image encoders and decoders only if they are available, and it will work fine without JAI.
The codecs complement the standard Java ImageIO encoders/decoders. The most notable addition is support for TIFF.
JAI (Java Advanced Imaging) is not a usual library. First of all, it is a native library, i.e. it has separate distributions for different platforms. It also has a "portable" pure-Java distribution, but if you want the full power of JAI you should stick with the native option.
Another thing: usually you have to run the installer of the JAI native distribution on the host system, i.e. it is installed like a desktop application, not like a usual Java library. Again, the JAI codec does not act like a usual library: if it is installed on the system, it plugs into ImageIO regardless of the classpath.
So I don't know a good way to install JAI using Maven; it is like using Maven to install Skype or any other desktop application. But that is just IMHO, I am not a great specialist in Maven :)
We are having a discussion at work and an interesting point came up:
Say you are developing a small library, call it somelib. Say that somelib needs to do some logging, but you don't want to reinvent the wheel, so you decide to use a 3rd party logging library.
Additionally, you want to make integration of somelib as painless as possible, so you distribute a single JAR file (somelib.jar), which has the other logging JAR, call it logger.jar, embedded inside of it. Much like what Maven's jar-with-dependencies assembly does.
Now comes the issue. Since your product is a library, what if your customer is using somelib and also happens to be using a different version of the same logging library on their own? Now we have a classpath problem.
This seems to me like it would be a common problem for people that write libraries, so what is the typical solution?
Do they avoid using JAR bundling methods altogether? Even if we do that, there is still an issue with a user's code expecting version X of the logging library, and somelib's code expecting version Y.
Do they somehow insert a dummy package prefix so that the logger classes in somelib won't conflict?
What about dynamic loading of the logger library? (though this still has versioning problems from 1.)
You may consider using OSGi or waiting for JDK 8 and its Jigsaw project.
I'm facing the following problem: one module in my webapp needs JAXB 1.x and another module needs JAXB 2.x. The first module doesn't work with the new version of JAXB, and vice versa. How can I use these two JARs in one project?
Thanks.
For a regular application, very different versions of a library usually use different package names. If this is the case, you can use them both at once without a problem. However, if the package names are the same, you can use jarjar to rename the package.
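The renaming doesn't have to be done with jarjar specifically; the Maven Shade plugin's relocation feature does the same job at build time. A minimal sketch, with a purely illustrative package name standing in for the classes you need to rename:

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.4.1</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals><goal>shade</goal></goals>
          <configuration>
            <relocations>
              <!-- Move the bundled copy under a private prefix so it cannot
                   clash with another version of the same classes on the classpath -->
              <relocation>
                <pattern>com.thirdparty.lib</pattern>
                <shadedPattern>myapp.shaded.com.thirdparty.lib</shadedPattern>
              </relocation>
            </relocations>
          </configuration>
        </execution>
      </executions>
    </plugin>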
However, since you are using a web container, each application should use the version you deploy with it and not the other one, i.e. the web container works it out for you.
OSGi is another container which manages versions much more explicitly and gives you more control over these issues (however, I don't believe you need it just for this).
You have got a JAR-hell issue. Generally speaking, in a normal Java environment it is impossible to solve this problem. You have to force modularization into your project by using OSGi. Starting point: http://www.osgi.org/About/HowOSGi
If you are using the JAXB reference implementation, then you can use your JAXB 1 models with the JAXB 2 runtime by including the jaxb1-impl.jar.
http://jaxb.java.net/faq/index.html#running1Apps
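In a Maven build that usually means adding the JAXB 1 compatibility runtime next to the JAXB 2 implementation. The coordinates below are my best recollection of the reference implementation artifacts on Maven Central, so check them against the FAQ above before relying on them:

    <dependency>
      <groupId>com.sun.xml.bind</groupId>
      <artifactId>jaxb-impl</artifactId>
      <version>2.2.11</version>
    </dependency>
    <!-- Compatibility runtime that lets code generated by JAXB 1 run on the JAXB 2 RI -->
    <dependency>
      <groupId>com.sun.xml.bind</groupId>
      <artifactId>jaxb1-impl</artifactId>
      <version>2.2.5.1</version>
    </dependency>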
As Shaman said, it is impossible to resolve this issue directly.
Let's see: the servlet container's JRE has only one classloader for your webapp, and this classloader can load a given class from one JAXB JAR or the other, but not from both, which will give you a NoClassDefFoundError or something similar.
You cannot solve this directly:
You can get the code (it is open source) and change the package of one version to another name so the classloader can use both. I do not recommend this solution; it is a bad one.
It is better to migrate the code to use the more modern API (JAXB 2).
I am trying to create a runnable JAR using Eclipse, but am running into problems. The Eclipse workspace contains two separate projects which depend on the same library. I can create the runnable JAR, but when I run it I receive a java.lang.NoSuchMethodError exception.
I believe I'm receiving the java.lang.NoSuchMethodError exception because the libraries are different versions. Is there a common solution to fix this problem? If not, what would you recommend I do?
If the major version number changes, it means that backwards compatibility may have been broken.
You could try the latest version and hope that they only added methods and kept the old behaviour working, but even if no NoSuchMethodError is thrown there is no guarantee (maybe with the new API you need to call different methods to get the same results).
I would contact the provider of the library and ask them whether compatibility is broken. If they do not answer or it is broken, and you have the source code, the only possibility would be refactoring one of the libraries (probably 1.0), e.g. putting all of it in a new package, v1. Then you would have to change the project that depends on it.
If none of the above works, the solution would be an OSGi container, or to set up project A and project B as two different executables, with project B running as a server that answers project A's messages. Messy.
The fix is to include only one version of the library, one which can satisfy both of the libraries that use it (the Maven sketch after this list shows one way to do that). If that's not possible, you'll have to find a different way of going about things so that you can eliminate the conflict. Options include:
Remove one or more of the uses from your code that are causing the NoSuchMethodError.
Modify the source of one or more of the libraries so they can happily coexist.
Use an OSGi container, which would allow you to have two versions of the same library in the same application.
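Coming back to the main fix of keeping exactly one version: if the two projects were backed by a Maven build instead of plain Eclipse classpaths, that usually boils down to excluding the transitive copy and declaring the surviving version once. A sketch with hypothetical coordinates:

    <dependency>
      <groupId>com.example</groupId>
      <artifactId>project-b</artifactId>
      <version>1.0.0</version>
      <exclusions>
        <!-- Drop the copy of the shared library that project B drags in -->
        <exclusion>
          <groupId>com.example</groupId>
          <artifactId>shared-lib</artifactId>
        </exclusion>
      </exclusions>
    </dependency>
    <!-- Declare the single version that satisfies both projects explicitly -->
    <dependency>
      <groupId>com.example</groupId>
      <artifactId>shared-lib</artifactId>
      <version>2.3.0</version>
    </dependency>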
As SJuan stated, you could use OSGi to set it up correctly. http://en.wikipedia.org/wiki/Java_Classloader#JAR_hell