This is a common problem. I'm using two libraries, A.jar and B.jar, which depend on different versions of the same jar.
Let's say that at runtime I need THIS.x.x.x.jar
MY.jar
-> A.jar -> THIS.1.0.0.jar
-> B.jar -> C.jar -> THIS.5.0.0.jar
I can compile each specific jar (A.jar/B.jar) against its own dependency, but at runtime I have to load only one version. Which one?
Loading only one dependency (the latest version) means that my code will probably throw runtime exceptions if the libraries are not backward compatible (are there backward compatible libraries out there?).
Anyway, I know that something like OSGi can fix this issue.
I'm wondering what the old way to fix this kind of problem is...
Thanks a lot
"Old way" you mentioned (and the one OSGI certainly uses under the hood) is to install your own ClassLoader for both branches of your dependencies. That's how, for instance, application servers are able to run both older and newer versions of the same application inside the same JVM.
Read about classloader hierarchy.
In your setup, the tricky part is the junction point, where classes from both branches meet. Neither branch can use classes loaded into the other. The way to make it work is to make sure only classes loaded by the boot classloader (JRE classes) or by the classloader of MY.jar are passed down to both branches.
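A minimal sketch of that idea, with hypothetical jar locations and a placeholder class name (com.example.thislib.SomeClass): each branch gets its own URLClassLoader whose parent is the application classloader, so JRE classes and MY.jar's classes are shared while each THIS-*.jar is resolved per branch. For this to work, THIS-*.jar must not be on the main classpath, or parent-first delegation would pick it up there.

import java.net.URL;
import java.net.URLClassLoader;

public class IsolatedLoaders {
    public static void main(String[] args) throws Exception {
        // Hypothetical locations of the two branches; adjust paths as needed.
        URL[] branchA = { new URL("file:lib/A.jar"), new URL("file:lib/THIS-1.0.0.jar") };
        URL[] branchB = { new URL("file:lib/B.jar"), new URL("file:lib/C.jar"),
                          new URL("file:lib/THIS-5.0.0.jar") };

        // Each branch gets its own loader. The parent is the application
        // classloader, so only JRE classes and MY.jar's classes are shared.
        ClassLoader parent = IsolatedLoaders.class.getClassLoader();
        try (URLClassLoader loaderA = new URLClassLoader(branchA, parent);
             URLClassLoader loaderB = new URLClassLoader(branchB, parent)) {

            // The same class name resolves to two distinct Class objects,
            // one per loader and therefore one per version.
            Class<?> fromA = loaderA.loadClass("com.example.thislib.SomeClass");
            Class<?> fromB = loaderB.loadClass("com.example.thislib.SomeClass");
            System.out.println(fromA == fromB); // false: different loaders
        }
    }
}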
OSGi can fix this problem. An OSGi bundle is nothing more than a jar with additional metadata detailing versions. A bundle has a version number, and will detail version numbers (or ranges) of dependent jars.
Take a look at this introductory Javaworld article for more information.
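For illustration, that version metadata lives in the bundle's MANIFEST.MF; a minimal sketch with placeholder names and ranges:

Bundle-SymbolicName: com.example.consumer
Bundle-Version: 1.0.0
Import-Package: com.example.thislib;version="[1.0.0,2.0.0)"

The Import-Package range says this bundle works with any 1.x version of that package, and the OSGi framework wires each bundle to a matching provider even when several versions are installed side by side.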
To solve this without OSGi means having to ensure manually that you compile and run with compatible jars. As you've discovered, that's not necessarily a trivial task. Since jars don't necessarily identify their versions, the only sure way to do this is to record/compare checksums or signatures.
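As a first step, a small diagnostic like the sketch below can at least tell you which jar a class was actually loaded from and which version that jar claims to be; it uses only standard JDK calls and takes the class name as a program argument:

import java.security.CodeSource;

public class WhichJar {
    public static void main(String[] args) throws ClassNotFoundException {
        // Pass the fully qualified name of any class from the suspect dependency.
        Class<?> clazz = Class.forName(args[0]);

        // Where was the class actually loaded from? (null for bootstrap classes)
        CodeSource source = clazz.getProtectionDomain().getCodeSource();
        System.out.println("Loaded from: "
                + (source == null ? "bootstrap" : source.getLocation()));

        // Implementation-Version from the jar's MANIFEST.MF, if declared.
        Package pkg = clazz.getPackage();
        System.out.println("Version: "
                + (pkg == null ? "unknown" : pkg.getImplementationVersion()));
    }
}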
Many libraries are backward compatible. But not all...
The old way is to try to depend on only one version.
It is probably safer to compile both with the same version (latest).
At least you get compile-time errors, instead of runtime errors.
If needed, you can slightly modify the library that works with the old dependency...
This would require access to the source...
Please note that compile-time compatibility does not guarantee correct runtime behavior either. It is only one step; beyond it, you can:
read the WhatsNew file for the new version of the jar
look on the Internet for users reporting compatibility problems
write JUnit tests (see the sketch after this list)
compare the code in both jars
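A minimal sketch of such a JUnit test, assuming hypothetical names (com.example.thirdparty.Util and doWork stand for whatever your code actually calls): it uses reflection to fail fast when the version on the runtime classpath no longer provides the signatures you rely on.

import org.junit.Test;

public class DependencyCompatibilityTest {

    @Test
    public void requiredMethodIsPresent() throws Exception {
        // Class.forName fails if the class is missing from the runtime
        // classpath; getMethod fails if the signature changed. Either
        // failure flags an incompatible dependency version.
        Class<?> clazz = Class.forName("com.example.thirdparty.Util");
        clazz.getMethod("doWork", String.class);
    }
}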
As mentioned by KLE, the default approach is to depend on the newer version. There is no guarantee, but most of the time this works. Probably the best way (while being a bloated one) is using OSGi to get over it.
For a basic "old way" implementation, check out https://github.com/atulsm/ElasticsearchClassLoader
It provides an approach to handling non-backward-compatible versions of the Elasticsearch client.
I am trying to understand the differences or the associations between these two principles/features for Java modular projects.
I am trying to minimize the JRE size by reducing the use of external libraries/modules.
In order to do that, I did some quick research and found these two "principles" that may help me.
Now, I don't understand if these two things are different or if they link to one another somehow. How should I use these to reach my goal?
Can you please specify what would be the best solution for me? In the module-info.java, do I have to specify manually what I need?
Is there a possibility to generate the module-info.java files? (I guess not, but I am just asking.) Can I use it with Amazon Corretto JDK 11?
You actually posted multiple questions.
I am trying to minimize the JRE size by reducing the use of external libraries/modules.
Having fewer dependencies is always a good goal, but Jigsaw/JPMS will not help you with that. On the contrary: you could end up with multiple versions of the same dependency, which wasn't possible before.
Hint: jlink and Jigsaw/JPMS will not help you with reducing your dependencies.
How should I use these to reach my goal?
If your goal is to have a stripped-down JVM shipped with your application, you should look into the jlink binary, which has been part of the JDK since Java 9.
If you are using Maven, you can invoke it via the maven-jlink-plugin. There are similar plugins for Gradle, and even JavaFX-specific ones.
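For orientation, a direct command-line invocation might look like the sketch below; com.example.app is a placeholder module name, and the mods directory stands for wherever your modular jars live:

# Build a trimmed runtime image containing only the modules you list.
jlink --module-path $JAVA_HOME/jmods:mods \
      --add-modules com.example.app \
      --strip-debug --no-header-files --no-man-pages \
      --output custom-runtime

# The resulting image ships its own launcher:
./custom-runtime/bin/java -m com.example.app/com.example.app.Main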
Can you please specify what would be the best solution for me?
That is something we cannot answer; YMMV. Maybe Quarkus is worth looking at as well, which creates native images (yes, OS- and architecture-dependent native binaries).
In the module-info.java, do I have to specify manually what I need?
Yes, for your own modules. To inject a module-info.class file into your dependencies, you can use ModiTect, at least if you are using Maven as a build system.
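For illustration, a hand-written module-info.java for a hypothetical application module could look like this (all names are placeholders):

// module-info.java at the root of the module's source tree
module com.example.app {
    requires java.logging;        // platform module the application uses
    requires com.example.util;    // another of your modules, or a modularized dependency
    exports com.example.app.api;  // packages made visible to other modules
}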
Is there a possibility to generate the module-info.java files? (I guess not, but I am just asking.)
Already answered in the comments, yes, by using jdeps.
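For example (mylib.jar and the generated output directory are placeholders):

# Generate a module-info.java skeleton for an existing jar (JDK 9+):
jdeps --generate-module-info generated mylib.jar

# Or just list the platform modules a jar actually depends on:
jdeps --list-deps mylib.jar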
Can I use it with Amazon Corretto JDK 11?
Yes, it ships both jlink and jdeps.
I have created a project in Eclipse which uses Java 1.7. But I need to run some specific modules in it using Java 1.8.
Is it possible? How?
You can set the Java compiler level only per project, not per module.
So in your case you should set the compiler level to 1.8, which will support both versions.
In Eclipse you can set the compiler level via the following option:
Right-click on Project -> Properties -> Java Compiler.
Simply put: you shouldn't do that.
If you are really talking about one project; then you have to make sure that all "components" within your project are on compatible levels.
In your case: if component A requires Java 1.8 and others are fine with 1.7, then you should go forward and use 1.8 (you can still use libraries that were compiled for older versions of Java; no need to update/recompile them). And well, if one part needs 1.8 and another only works with 1.7, then there is no easy solution to that.
The point is: if you deviate from this practice, you will have to use multiple JVMs later on to run your "single" project - and that is of course a contradiction in itself.
The alternative is to dissect the one project you have right now into smaller parts (nowadays you would call them microservices) and define an architecture that allows you to run different parts of your application using different technology. But as others have pointed out: that adds a whole new layer of complexity to your setup.
I am trying to run a map/reduce job and I am getting a java.lang.NoSuchMethodError. I did some research on this, and it appears when my code is executed (not compiled): the correct version of the class and its methods are there during compilation, but at runtime the correct method is not available. The jar file that is causing this is guava. I know this from the stack trace that is printed. It throws an error when trying to execute the following line of code:
ArrayDeque<Entry<String, String>> a = Queues.newArrayDeque();
This jar is part of the Hadoop classpath because it comes with the CDH version 5.3.0 that I am using. I have tried adding the correct version of guava to the classpath, but the error does not change. My questions are as follows:
I believe that I have correctly identified the issue. Does this seem reasonable to you? I have never come across this error before.
I believe that I need to remove the older version of guava from the classpath and add the new one. However, I really do not know where to begin with correcting this. The command that is issued to hadoop jar does not contain the older version of guava (in the -libjars parm). The jar is part of the Hadoop classpath when I issue the command "hadoop classpath". So I am assuming that there is some Hadoop config file I could edit to make this go away. Is that the correct way to go, or is there something else I need to do?
I am using Java 7, CDH 5.3.0, NetBeans 8.
TIA
At the time that I'm writing this, Hadoop has a dependency on Guava version 11.0.2. It uses the library pretty heavily in its internal implementation.
According to the Guava JavaDocs, the Queues#newArrayDeque method was added in version 12.0. If your code is compiling successfully, then that means that Guava version 12.0 or higher is available on your compilation classpath at build time, but since version 11.0.2 is supplied at runtime by Hadoop, the method doesn't exist, resulting in NoSuchMethodError.
Unfortunately, there is no reliable way to swap out a different Guava version in Hadoop. Specifically, I recommend that you do not attempt to replace the Guava 11.0.2 jar that ships in the Hadoop distro. Replacing this with a different Guava version is untested, and it would risk destabilizing the cluster.
The broader problem is that Hadoop's dependencies "leak" to its clients. HADOOP-11656 is an unimplemented feature request that would isolate Hadoop's internal dependencies away from clients, so that you could more easily use common libraries like Guava at your desired version. Meanwhile, until that feature is implemented, I think your only options are to stick to Guava 11.0.2 APIs, or possibly try inlining some of the Guava code that you really want into your own project directly. The code for Queues#newArrayDeque is visible on GitHub.
public static <E> ArrayDeque<E> newArrayDeque() {
    return new ArrayDeque<E>();
}
In this particular case, it looks like it will be easy to replace your code with a direct call to the java.util.ArrayDeque constructor. Thanks to the Java 7 diamond operator, it won't even be much more verbose.
ArrayDeque<Entry<String, String>> a = new java.util.ArrayDeque<>();
I am from the .NET world. I remember that .NET will immediately complain if you build with one DLL but supply a different DLL at run time.
I am now adding some Hadoop references to my project and found the following article.
http://answers.mapr.com/questions/364/maven-repository-for-mapr-jar-files
I just don't understand how this happens.
Java can build with one jar but run with a different jar?
Thanks
Yes. This is often the case with APIs (you compile against the API, but at runtime you may run with a newer version of the API, which may be included with the implementation). Everything will work out fine as long as the classes/method signatures referenced in your compiled code are unchanged from the jar you compiled against.
For a specific definition of compatibility, see binary compatibility (thanks to @MiserableVariable for the link).
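To make that concrete, here is a minimal sketch with hypothetical classes (each top-level class would live in its own source file):

// Greeter.java, version 1.0 of a hypothetical library (on the compile-time classpath):
public class Greeter {
    public String greet(String name) { return "Hello, " + name; }
    public String greetLoudly(String name) { return greet(name).toUpperCase(); }
}

// Client.java, compiled against Greeter 1.0:
public class Client {
    public static void main(String[] args) {
        // javac only records a symbolic reference here; it is resolved at runtime.
        System.out.println(new Greeter().greetLoudly("world"));
    }
}

// If an older Greeter jar without greetLoudly(..) ends up on the runtime
// classpath instead, Client still loads and starts, but this call site
// throws java.lang.NoSuchMethodError the first time it executes.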
I need to add some jars from the JRE 7 library to my Android project. But for example rt.jar is in conflict with android.jar from the Android 2.2 SDK, so I get this error:
Ill-advised or mistaken usage of a core class (java.* or javax.*)
when not building a core library.
This is often due to inadvertently including a core library file
in your application's project, when using an IDE (such as
Eclipse). If you are sure you're not intentionally defining a
core class, then this is the most likely explanation of what's
going on.
However, you might actually be trying to define a class in a core
namespace, the source of which you may have taken, for example,
from a non-Android virtual machine project. This will most
assuredly not work. At a minimum, it jeopardizes the
compatibility of your app with future versions of the platform.
It is also often of questionable legality.
If you really intend to build a core library -- which is only
appropriate as part of creating a full virtual machine
distribution, as opposed to compiling an application -- then use
the "--core-library" option to suppress this error message.
If you go ahead and use "--core-library" but are in fact
building an application, then be forewarned that your application
will still fail to build or run, at some point. Please be
prepared for angry customers who find, for example, that your
application ceases to function once they upgrade their operating
system. You will be to blame for this problem.
If you are legitimately using some code that happens to be in a
core package, then the easiest safe alternative you have is to
repackage that code. That is, move the classes in question into
your own package namespace. This means that they will never be in
conflict with core system classes. JarJar is a tool that may help
you in this endeavor. If you find that you cannot do this, then
that is an indication that the path you are on will ultimately
lead to pain, suffering, grief, and lamentation.
I know there have been several threads about this, and things like JarJar, OneJar or FatJar might be good for me. But I don't know how to make any of them work, and the documentation doesn't really make it clear (for me). I guess they use Ant commands, but I have always used Eclipse's built-in builder and now I have no idea how to use either Ant or any of the tools mentioned above.
So my question is: how can I repackage this rt.jar so I can compile it into my Android project?
Thank you!
EDIT:
OK, so what I want to achieve is to create a .jar which can be used while developing an Android application (it simplifies some functionality; the details don't really matter). But I would also like to be able to add the very same .jar to a standard Java project in order to use some functions there as well. It would look like this:
Whoever writes an application adds this .jar to his Java project -> it enables him to generate certain files (internet access is needed to do it) -> these generated files are then added to the Android project -> later on, when somebody uses this Android app, these files provide certain functionality without using the internet (offline).
It would be ill-advised to do this in any project at all, even if it were possible. You would be opening yourself up to a wealth of class incompatibility and loading problems. But in any case it doesn't even matter, because the core Java libraries are loaded way before your archives are even touched, making any such attempt at overriding them moot.
Not to mention that Android uses its own JVM implementation, which is not fully compatible with JDK 6 (forget JDK 7). Also note that it may be a copyright violation to package the core Java libraries with your code, and it could change your licensing options (IANAL).
You need to find another way to resolve whatever issue you are having (which you failed to mention in your question).
There are many JARs that work nicely on both Android and on classic Java. None involve having Android developers pirate rt.jar. Stick to java.* and javax.* classes that exist in both the Android SDK and in whatever level of Java you are supporting, and your JAR will work fine in both environments.
You should ideally refrain from using such .jar files, but if you must, you can add them to the build path. This, at times, results in a conflict like the one you are facing right now. To resolve this kind of conflict:
Add the jar to the build path.
Check "Referenced Libraries"; the jar file should appear there.
Once it appears under Referenced Libraries, check the "Android Dependencies" virtual directory. If you see an instance of the same jar file there as well, delete the "Android Dependencies" folder altogether. (Trust me, this does not affect your project in any way.)
Having done that, you should be able to compile your code without any further conflicts.
Happy coding.. :)