In a Java application that uses plugins (such as a Minecraft server, an IDE, etc.) it is often necessary to use some library and shade (shadow) it into your plugin's .jar.
The problem arises when another plugin also uses that library and bundles it in its .jar, but in a different version. Both plugins will work fine until they meet under the same classloader; at that point the classloader will load only one version of the library, and the other plugin will be running against a version it was not designed for, leading to NoSuchMethodErrors and/or different runtime behavior if the library is not backward compatible.
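For illustration, here is a small diagnostic sketch (the library class name is hypothetical) that prints which jar a class was actually resolved from, which is usually the quickest way to confirm whose bundled copy "won":

    // Diagnostic sketch with a hypothetical library class name: ask the JVM which
    // jar a class was actually loaded from, to see which bundled copy "won".
    public class WhichJar {
        public static void main(String[] args) throws Exception {
            Class<?> c = Class.forName("com.example.somelib.SomeLibClass");
            // getCodeSource() can be null for classes from the bootstrap loader,
            // but for a plugin-bundled library it points at the winning jar.
            System.out.println(c.getName() + " loaded from "
                    + c.getProtectionDomain().getCodeSource().getLocation());
        }
    }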
Note: This question is not about how to compile with Maven when two dependencies share the same transitive dependency in different versions. The problem here is quite different, because you cannot resolve or even detect the dependency issue at compile time: the different versions of the library are not in the same .jar, nor in the same project; they only meet under the same classloader at runtime, brought in by two plugins probably written by different authors for different purposes.
What is the general strategy to avoid such conflicts?
If I'm making my own library and it cannot be backward compatible, what can I do to avoid this problem?
This is a classic Java problem, and there are some major redesigns of Java technologies underway to change it, though these are a while off (Java 9 or Java 10).
For now, the best thing you can do is explicit and regular dependency analysis of your projects.
To start, run this: mvn dependency:tree
You will be able to see multiple versions of the same dependency. Unfortunately when using external libraries, you may need to conform to the same dependency version as they use, or look for an updated build of the external dependency.
I have spent a lot of time on dependency analysis, and it mostly has to be done manually (one of the bugbears of working with Java...).
If you are making your own software, use explicit API versioning and make a significant change to the version number whenever you make a breaking change.
For example, use "version 1.1", "version 1.2" and so on. Let people know that versions 1.1.1 and 1.1.4 are backwards compatible with 1.1, but as soon as you introduce an API-breaking change (this happens), go to version 1.2 and force people to update.
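One way to make a breaking change survivable even when both generations end up under the same classloader is to put the major version into the package name. A minimal sketch with hypothetical names (nothing here is prescribed by any particular library):

    // Hypothetical sketch: the major version is part of the package, so v1 and v2
    // are different classes to the classloader and can coexist in one JVM.
    package com.example.mylib.v2;   // the old, incompatible API stays in com.example.mylib.v1

    import java.nio.charset.Charset;

    public final class TextCodec {

        // v2 broke the v1 contract (encode(String) became encode(String, Charset)),
        // so it is published under a new package instead of silently replacing v1.
        public byte[] encode(String text, Charset charset) {
            return text.getBytes(charset);
        }
    }

Callers then opt in to the new version by changing an import, and plugins built against v1 keep working untouched.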
Unfortunately this is an issue I have faced in my application, especially when you can't manage or maintain the third-party or external systems involved. To minimize such situations you should create parent POMs and declare external dependencies there. Also check regularly whether there are any mismatches or conflicts and whether they can be removed at the development stage. Sometimes you reach a deadlock where your application needs a specific dependency but the rest of the architecture is built on a different one. Rolling out changes then means custom build packs and large coordinated updates, and sometimes you simply can't help it. Prevention is better than cure in this case.
@vikingsteve's answer explains how to design your APIs to avoid the problem. Unfortunately, sometimes practical considerations get in the way; for example, when your code depends on two large libraries that in turn depend on different versions of a third library, and those versions are incompatible.
If you are in this situation, and you can't solve it by rewriting the libraries, bugging the library developers / vendors to fix it, etc., there are still a couple of options:
You could restructure your software so that the libraries with conflicting dependencies are in separate applications; e.g. split them into a "client" and a "server", or "peers" communicating over a message bus, or a parent and child process talking over a pipe.
You could restructure your codebase so that the two libraries are loaded in two separate child classloaders of your main classloader. This would allow the two versions of the common dependency to be loaded by the two child classloaders and co-exist within the JVM (see the sketch below).
Obviously, neither of these approaches is a magic bullet. And both will entail some rework of your application. But they may solve your "dependency hell" problems.
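A minimal sketch of the second option, with hypothetical jar paths and class names. Note that the conflicting dependency must not be on the parent classpath, otherwise parent-first delegation would serve it from there to both children:

    import java.net.URL;
    import java.net.URLClassLoader;

    // Sketch with hypothetical jars: each library plus its own copy of the common
    // dependency is loaded by a dedicated child classloader, so the two versions
    // never meet in the same loader.
    public class IsolatedLoaders {
        public static void main(String[] args) throws Exception {
            ClassLoader parent = IsolatedLoaders.class.getClassLoader();

            URLClassLoader loaderA = new URLClassLoader(
                    new URL[] { new URL("file:lib/library-a.jar"),
                                new URL("file:lib/common-dep-1.0.jar") }, parent);

            URLClassLoader loaderB = new URLClassLoader(
                    new URL[] { new URL("file:lib/library-b.jar"),
                                new URL("file:lib/common-dep-2.0.jar") }, parent);

            // Each entry point resolves the common dependency through its own loader.
            Class<?> a = Class.forName("com.example.liba.EntryPoint", true, loaderA);
            Class<?> b = Class.forName("com.example.libb.EntryPoint", true, loaderB);

            // Talk to both sides only via types owned by the parent classloader,
            // otherwise identical-looking classes will cause ClassCastExceptions.
            System.out.println(a.getClassLoader() + " / " + b.getClassLoader());
        }
    }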
My solution was to create another independently running jar application with the external dependencies I needed and perform the tasks there.
Related
When importing a Gradle project into IntelliJ IDEA, the resulting classpath contains different versions of multiple Java libraries.
How to avoid that?
If Gradle is doing that, you probably should not prevent it unless you fully understand why it is doing so.
There are several reasons why multiple versions may be necessary; for instance, your build process may have plugins that require different versions of a library.
One plugin could be developed by a vendor and another by a random team somewhere, and they may have different requirements. Sometimes even your own app can call for this, for instance when you are building components that may run in an environment with multiple classloaders; a classic example is OSGi.
Bottom line: if it is not interfering with your work, just leave them there; something requested that version. For your app itself, you can define your own dependencies and keep control of the required versions.
Depending on the authors of a lib, even a minor version number difference may signify huge API changes, and multiple versions may be the only way to make the pieces fit together, not necessarily only for your app but also for your build. That is without even mentioning the dependency inheritance that can add up.
Recently I ran into quite a bit of trouble with a dependency hell problem in Java.
Is there a proper set of procedures to avoid it in the future?
First, use a dependency manager like Maven or Gradle. No lib folder containing third-party jars, and obviously no copying classes or *.java files from another project into yours. Sorry if this is obvious, but I have seen a lot of projects built using such techniques.
The next phase, however, is optimization of dependencies. You may use libraries A and B that both use library C. That is fine if both use the same version of library C; the hell starts when they depend on different versions of it. In most cases this does not cause any problem and you may not even be aware of it, but sometimes it does. To avoid this, I'd recommend that you check the dependency tree of your project from time to time, find duplicates, and if they exist use the exclude instruction of your dependency manager to exclude the older version.
This, however, can fail because of incompatibility between those versions of the library. In that case there is no general way to solve the problem. Sometimes you have to downgrade one (or several) of your dependencies in order to make them work together, and wait for a newer release that uses the newer version of library C (in our example).
The luck of Java programmers in 2016 is that most tools are open source and many of them accept contributions (e.g. via GitHub), so sometimes you can contribute your fix and help yourself and other developers get a newer version faster.
I have seen many interesting (and duplicated) questions here about "sharing or using classes between projects".
I see this as quite practical but the proposed solutions I have read about definitely assume certain prerequisites such as:
shared Eclipse workspaces
projects that can be made as dependencies of one another
common servers such that classpaths can be added with local URLs
While likely acceptable solutions, I am looking for an alternative with perhaps greater flexibility and portability.
I am thinking of learning how to use Gradle (or maybe Maven; I haven't fully committed to one or the other yet). From what I understand, it may be possible to manage shared classes with one of these dedicated dependency management tools.
Is this theoretically possible? Can I set up a Gradle- or Maven-enabled Java project to handle and keep up to date personal classes on a local server, a folder on a portable drive, or a cloud mirror?
The way I understand dependency management at the moment (on a superficial level; I know the devil is in the details) is that for a properly configured project, Gradle/Maven will handle classpath additions and the actual version-specific comparison, retrieval and storage of JARs from external sources (and maybe even compilation, but I don't know about that).
Rather than go through the steps of setting up classpaths to jars that I have to keep current and compile myself, as proposed in many other answers, I am considering creating a dummy project on a server holding generic classes, which I could then point numerous individual Gradle/Maven-enabled Java projects at. (I think most people would keep them as standalone classes, but I might need to keep them in a dummy project so they can be developed and debugged in context from a main class. I am somewhat new to Java architecture, so if the only thing that would make this solution impossible is pointing to a "project" instead of a "library", I can definitely adjust from there, assuming I am even applying the concept of a "library" appropriately.)
Other info:
I would like this to simplify personal dependency management in both the NetBeans and Eclipse IDEs and to work cross-platform (though Linux and Windows are what I plan to test it on).
So you're looking for portability, you don't want to compile the Java classes you want to share between projects, and you don't mind a local deployment.
The first thing that comes to mind for me is Git - I'm not sure if Gradle/Maven deal in the gritty underworld of the uncompiled. Composer will pull in git repos for PHP, so that got me thinking.
If you're happy with one-way sharing of code among projects, Git has submodules that let you do that.
But searching around, apparently there's a git script that goes one step further - Git subtrees. I also found an intriguing tutorial that shows how to make changes to the common code from within any particular project that shares it - so obviously be careful - but check it out and see if the subtree script might suit your needs.
Actually, I don't see much sense in dependency management at the "class level". Typically you would bundle your classes in a jar file, which in turn can be considered a unit with a particular functional range. Such a jar is a suitable unit to put under dependency management.
If you are new to such tools, I'd recommend Maven. It is widely used in the Java world and well-integrated in common IDEs. If you stick to its conventions, it will take care of your whole build process from compiling, testing to packaging. There are a lot of plugins available that let you customize practically everything in a simple XML based configuration. You'll have your first project running in 30min and your current project migrated in another 30min.
To share your code with others, you still need a repository where you can upload your Maven-built artifacts to. Depending on your preference there are many possibilities. Shove it to Amazon S3, Maven Central or install your own Sonatype Nexus in your private network.
I have a lot of small-ish Java command-line apps that have to use a vendor set of jars to make their connection to the servers, and while writing these apps has been pretty easy, maintaining them is turning out not to be.
I built them using Maven and some templates that gave me a pretty quick turnaround on development time, but these are all standalone command-line apps so far.
The pain comes from the updates to the vendor servers. Whenever that happens, it forces me to recompile all of these applications with the new jars in play; otherwise I'd get serialVersionUID exceptions and borked apps.
What I'm considering
I was thinking that it would be possible to use Maven to generate the app and then throw it in an app server with the server providing a set of shared vendor .jars in whatever /shared classpath it provides. That way I could just update the .jars, restart the server, and everything will likely continue without error.
However, I'm not sure what is required to get that working. They aren't .wars, and most of these apps use other .jars besides my code that aren't usually shared.
Questions
What kinds of deployments work like this?
Are there other ways to do this that I'm missing?
Any tutorials, frameworks, etc., that would make this simpler?
I can just share from my experience. We have three ways of overcoming this at the moment. The first way was our old way, which we still use.
Create a base project which we link to with all our external JARs. When compiled/deployed, the user project checks whether there is a newer tag of the base project than the one it is currently using; if there is, it fails and forces the user to check for updates and recompile. The benefit is that at compile time you can check for new jars, get a list of what changed, and reload the jars. In our IDE we can quickly see if anything in the API changed - it has been known to happen - and then fix and recompile. Also, because we're aware of the changes, we can read the changelogs, see if any of the changes affect us, and retest the parts of the application that depend on these libraries.
We use Maven, but instead of the public repositories, we maintain our own repository, where we only move a library into our local repo after it has been tested - so no surprises for us.
This method is specific to a Tomcat deployment: we also load libraries via the Context using a classloader. It's not my favorite way to load external jars, but it works without a redeploy. Of course, since the compiler isn't there to help you, the new jar may not be 100% compatible with your code, and you can end up with runtime NoSuchMethodError or ClassNotFoundException exceptions.
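To take some of the sting out of that, here is a small defensive sketch (class and method names are hypothetical) that probes the hot-loaded jar for the API it needs before using it, so an incompatible version produces a clear log message instead of an error deep inside the application:

    // Sketch with hypothetical names: check that the expected API exists in the
    // jar loaded at runtime before relying on it.
    public class VendorApiProbe {

        public static boolean supportsBulkSend(ClassLoader vendorLoader) {
            try {
                Class<?> client = Class.forName("com.vendor.api.Client", false, vendorLoader);
                client.getMethod("sendBatch", java.util.List.class);
                return true;
            } catch (ClassNotFoundException | NoSuchMethodException e) {
                System.err.println("Vendor jar is missing the expected API: " + e);
                return false;
            }
        }
    }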
I might be misunderstanding your situation, but if the interface you use in the vendor libraries does not change between versions, couldn't you just keep the jar files on the classpath of your command-line applications and simply overwrite the vendor files when necessary? That way you would only need to recompile when the interfaces change in a way that breaks your code.
My project uses Java libraries that have their own dependencies (Hadoop and Jetty, for example). I end up with different versions of the same dependencies, like ant 1.4.5 and 1.4.6, while my project may want to use ant 1.4.7. This is a small example; it can get more complicated with larger dependencies like HTTP Commons.
How do I get all the libraries and dependencies to play nice? Is there a way to isolate each library (Hadoop, Jetty) so they only use their dependencies?
Maven will also generally handle this pretty well. If not completely, it will at least handle the bulk of it, then you can sort out the issues that are left over.
Of course that means that you have to change your build process, but it might be worth your while for not pulling your hair out over this.
JarJar to the rescue!
An Ant task that both 1) packs many jars into one, and 2) allows you to rename (relocate) dependency packages inside the class files, and thus load two versions of the same library!
You may choose to manage all of this with a dependency management framework like OSGi. Take a look at the Spring Framework dynamic modules: http://www.springsource.org/osgi
You can also take a look at Eclipse's implementation of OSGi. Take a look here: http://www.eclipse.org/osgi/
The short answer is just to go for the lowest common denominator. Remember that the "endorsed" directory is your friend when it comes to managing conflicting dependencies.
If you have the source code, compile everything together. If you don't, you will unfortunately have to target your source at the lowest common denominator with the target option of javac. You only have to do this if there are actual issues when running the application, which there rarely should be as long as the JVM is a current version (Java is very strict about binary compatibility with older versions).
J G mentioned OSGi, which was my first thought as well when reading this question.
Having multiple versions of the same library is a strong point in OSGi.
Since some third-party products have already been mentioned, I think it's fair to mention the specification behind them as well.
You can get it from the official OSGi site: http://osgi.org