How can I make Maven download only Java 1.6 compatible libraries?
I have Tomcat 6 and Java 1.6.0_38. Alternatively, how can I find the incompatible libraries?
Here is my list of dependencies: http://pastebin.com/WnwZL2RX
If it's a one-time task you have to run over the list you provided, then I would propose the manual approach described here:
What version of javac built my jar?
As for an automated check, you would have to implement your own Maven plugin that uses a similar approach to the one described above, or drills into the JAR manifest to get the Java version.
A very interesting question; I hope to be able to check back for a more satisfying answer.
Maven is not able to perform such a task on its own, and as far as I know (I only checked with a five-minute web search) there is no plugin available for it.
In the end, Maven supports you with dependency management by resolving dependencies of dependencies, and it applies default rules when several versions of the same dependency meet ("the nearest definition wins"). Regardless of this, you remain in charge of compatibility between your own source and third-party source.
Especially when it comes to runtime incompatibility, finding out about such issues can be quite hard.
Related
I am learning from an Android project on YouTube, but the video was published two years ago. In order to use that code now, what changes do I have to make to the dependencies of libraries like Room, OkHttp, etc., and how can I easily update the project to the latest versions?
If you add the dependencies as listed, they should still work. Repositories keep the older versions, so a project built with a particular set of dependencies can pull those versions and build the same output no matter how old it is. Nothing forces you to update to a new version of a library, and many actively maintained projects choose to stay on an older version of a dependency for lots of reasons!
That said, if you're creating a new project right now, some of the current Android framework libraries, plugins etc. might require certain versions of other dependencies - they might force you to meet some minimum version requirements, just because they're incompatible with older versions. You could either downgrade all the stuff that complains about needing a newer version of X, or you could upgrade your old dependencies, like you're asking about.
An easy way to do it is to open the Project Structure window in the File menu, and go to the Dependencies section. It'll show you all the dependencies in your project, identify which ones are out of date, and you can select an available version you want, or just update to the most recent version.
But if there is a compatibility issue, it might take some time to work out which versions are necessary, or even which versions are being pulled in by other libraries (e.g. a particular version of a library might internally have a dependency on a certain version of a library you're already using, and the most recent one is what your project ends up using). You can get some info about this from the output in the Build window, or maybe running the dependencies Gradle task if you want to explore that.
It wouldn't hurt to look at the project pages for your libraries and see what they say about installation and dependencies. Also, their method for adding them to a project might have changed, e.g. something that used the (now closed) JCenter repository might be using JitPack now. Something to look at if you're having problems.
Also there's the issue that new versions of the libraries might have different APIs, or their behaviour might have changed. Even if fully updating everything goes smoothly, there might be stuff you need to fix in the code itself. Just a few things to keep in mind!
I am trying to understand the differences or the associations between these two principles / features for Java modular projects.
I am trying to minimize the JRE size by reducing the use of external libraries / modules.
In order to do that, I did some quick research and found these two "principles" (jlink and Jigsaw/JPMS) that may help me.
Now, I don't understand whether these two things are different or whether they are linked to one another somehow. How should I use them to reach my goal?
Can you please specify what would be the best solution for me? In the module-info.java, do I have to specify manually what I need?
Is there a possibility to generate the module-info.java files? (I guess not, but I am just asking.) Can I use it with Amazon Corretto JDK 11?
You actually posted multiple questions.
I am trying to minimize the JRE size by reducing the use of external libraries / modules.
Having fewer dependencies is always a good goal, but Jigsaw/JPMS will not help you with that. On the contrary: you could end up with multiple versions of the same dependency, which wasn't possible before.
Hint: jlink and Jigsaw/JPMS will not help you with reducing your dependencies.
How should I use these to reach my goal?
If your goal is to have a stripped-down JVM shipped with your application, you should look into the jlink binary, which has been part of the JDK since Java 9.
If you are using Maven, you can invoke it via the maven-jlink-plugin. There are similar plugins for Gradle, and even JavaFX-specific ones.
Can you please specify what would be the best solution for me?
That is something we cannot answer; YMMV. Maybe Quarkus is worth looking at as well, since it can create native images (yes, OS- and architecture-dependent native binaries).
In the module-info.java do I have to specify manually what do I need?
Yes, for your own modules. For your dependencies, you can use ModiTect to inject a module-info.class file into them, at least if you are using Maven as your build system.
Is there a possibility to generate the module-info.java files? (I guess not, but I am just asking.)
As already answered in the comments: yes, by using jdeps.
Can I use it with Amazon Corretto JDK 11?
Yes, Corretto also ships both jlink and jdeps.
We have a repository built using Java 8. There are multiple REST services within the repository. We want to migrate to Java 11 and are trying to figure out the best way of doing this. We are considering doing it module by module, for example changing one service over to Java 11 while the remaining ones are still on Java 8. We are unsure whether Maven supports this.
Disclaimer: This is not an answer but just a partial report of my recent experience. Feel free to flag this answer if you feel that it doesn't meet the SO standards.
Does Maven support this?
Yes: use maven-compiler-plugin 3.8.0/3.8.1.
However, this migration requires additional care.
Recently we did something like this by migrating from Oracle JDK 8 to OpenJDK 11. As we have hundreds of repositories with different missions, we faced all kinds of problems. Just to cite some that I have here in my e-mail box tagged as [jdk11_migration]:
It is quite obvious, but I'd like to highlight that in order to migrate from Java 8 to 11 we have to meet the requirements of Java 9 and 10 as well.
Some Maven plugins, like Cobertura, do not support Java 11, and I guess they never will; in some cases these plugins have reached the abandoned stage of their life cycle. The solution was to look for alternatives on a case-by-case basis. For example, we replaced Cobertura with JaCoCo.
rt.jar and tools.jar have been removed! Everything you have explicitly referenced from them will probably break.
Some classes which we shouldn't have been using even in Java 8 and earlier no longer exist in Java 11. I'm talking about classes in internal packages like sun.* and sun.misc. The solution is to look for a one-to-one replacement or to refactor the code to avoid the usage.
Reflection is usually the last bullet to use, and for these cases, in Java 9 and above, we get warning messages like:
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by ...
WARNING: Please consider reporting this to the maintainers of ...
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Although it is not exactly a solution, there is a flag to get rid of this warning: --illegal-access=permit. This is particularly important when using the Maven Surefire plugin.
Java 9 introduced the module system, so "now" we have the phenomenon of package clashes, e.g. messages like "The package org.w3c.dom is accessible from more than one module: , java.xml". The solution is to find the source of the redundant inclusion (notably duplicated Maven dependencies, or dependencies of dependencies) and remove it.
Although it wasn't a problem for us, I noticed that your repository consists mostly of REST components. You will probably face ClassNotFound issues with packages like javax.xml.bind, which were dropped from the Java Standard Edition. You can fix this by including them explicitly in your pom.xml.
Luckily, you can find good questions and answers for each issue you will hit in your migration, here on SO or elsewhere on the internet. There are some migration guides in the wild which are good starting points. Specific issues, like obfuscation and IDE integration, can take a little time, but, at least in my experience, this migration was less painful than I had expected.
My suggestion is to upgrade the entire project. Trying to have some Java 8 and some Java 11 modules can be very difficult. As you already know, the module system appeared in Java 9. The question is very generic, so it's difficult to give a detailed response. I assume:
Your project is managed with Maven
You have many artifacts (JARs)
You are using some library or framework (who said Spring?)
You are using a version control system (Git or Subversion, for example)
You have a multi-module project
The code you wrote works on both the Java 8 and Java 11 platforms
My suggested plan:
Create a branch in your version control system for the Java 11 version of the project and start working on it.
Create a parent module for common definitions
Upgrade the Maven plugins and all your libraries to the latest versions. For the Maven Compiler Plugin, set the Java 11 target (Java 11 is supported only by the latest versions of the Maven Compiler Plugin).
For each module, define the exported packages and specify which modules are required (see the sketch after this list).
If there are many modules, start with only a few of them and then include the remaining ones.
If it can help you, have a look at this simple project on GitHub (it targets Java 11 and is a multi-module Maven project).
Hope it helps.
I am trying to run a map/reduce job and I am getting a java.lang.NoSuchMethodError. I did some research on this, and it appears when my code is executed (not compiled): the correct version of the class and its methods are there during compilation, but at runtime the correct method is not available. The jar file that is causing this is guava; I know this from the stack trace that is printed. It throws the error when trying to execute the following line of code:
ArrayDeque<Entry<String, String>> a = Queues.newArrayDeque();
This jar is part of the Hadoop classpath because it comes with CDH version 5.3.0, which I am using. I have tried adding the correct version of guava to the classpath, but the error does not change. My questions are as follows:
I believe that I have correctly identified the issue. Does this seem reasonable to you? I have never come across this error before.
I believe that I need to remove the older version of guava from the classpath and add the new one. However, I really do not know where to begin with correcting this. The command that is issued to hadoop jar does not contain the older version of guava (in the -libjars param). The jar is part of the Hadoop classpath when I issue the command "hadoop classpath". So I am assuming that there is some Hadoop config file I could edit to make this go away. Is that the correct way to go, or is there something else I need to do?
I am using Java 7, CDH 5.3.0, NetBeans 8.
TIA
At the time that I'm writing this, Hadoop has a dependency on Guava version 11.0.2. It uses the library pretty heavily in its internal implementation.
According to the Guava JavaDocs, the Queues#newArrayDeque method was added in version 12.0. If your code is compiling successfully, then that means that Guava version 12.0 or higher is available on your compilation classpath at build time, but since version 11.0.2 is supplied at runtime by Hadoop, the method doesn't exist, resulting in NoSuchMethodError.
Unfortunately, there is no reliable way to swap out a different Guava version in Hadoop. Specifically, I recommend that you do not attempt to replace the Guava 11.0.2 jar that ships in the Hadoop distro. Replacing this with a different Guava version is untested, and it would risk destabilizing the cluster.
The broader problem is that Hadoop's dependencies "leak" to its clients. HADOOP-11656 is an unimplemented feature request that would isolate Hadoop's internal dependencies away from clients, so that you could more easily use common libraries like Guava at your desired version. Meanwhile, until that feature is implemented, I think your only options are to stick to Guava 11.0.2 APIs, or possibly try inlining some of the Guava code that you really want into your own project directly. The code for Queues#newArrayDeque is visible on GitHub.
public static <E> ArrayDeque<E> newArrayDeque() {
    return new ArrayDeque<E>();
}
In this particular case, it looks like it will be easy to replace your code with a direct call to the java.util.ArrayDeque constructor. Thanks to the Java 7 diamond operator, it won't even be much more verbose.
ArrayDeque<Entry<String, String>> a = new java.util.ArrayDeque<>();
This is a common problem. I'm using 2 libraries A.jar and B.jar and these depend on different versions of the same jar.
Let's say that at runtime I need THIS.x.x.x.jar
MY.jar
-> A.jar -> THIS.1.0.0.jar
-> B.jar -> C.jar -> THIS.5.0.0.jar
I can compile each specific jar (A.jar/B.jar) against its own dependency, but at runtime I have to load only one version. Which one?
Loading only one version (the latest) means that my code will probably throw runtime exceptions if the libraries are not backward compatible (are there backward-compatible libraries out there?).
Anyway, I know that something like OSGi can fix this issue.
I'm wondering what the old way to fix this kind of problem is...
Thanks a lot
"Old way" you mentioned (and the one OSGI certainly uses under the hood) is to install your own ClassLoader for both branches of your dependencies. That's how, for instance, application servers are able to run both older and newer versions of the same application inside the same JVM.
Read about classloader hierarchy.
In your setup, the tricky part is the junction point where classes from both branches meet. Neither branch can use classes loaded into the other one. The way to make it work is to ensure that only classes loaded by the boot classloader (JRE classes) or by the classloader of MY.jar are passed down to both branches.
OSGi can fix this problem. An OSGi bundle is nothing more than a jar with additional metadata detailing versions. A bundle has a version number, and will detail version numbers (or ranges) of dependent jars.
Take a look at this introductory Javaworld article for more information.
To solve this without OSGi means having to ensure manually that you compile and run with compatible jars. As you've discovered, that's not necessarily a trivial task. Since jars don't necessarily identify their versions, the only sure way to do this is to record and compare checksums or signatures.
Many libraries are backward compatible. But not all...
The old way is to try to depend on only one version.
It is probably safer to compile both with the same version (latest).
At least you get compile-time errors, instead of runtime errors.
If needed, you can slightly modify the library that works with the old dependency...
This would require access to the source...
Please note that compile-time compatibility does not guarantee correct runtime behavior either. It is one step; then you can:
read the release notes ("what's new") for the new version of the jar
look on the Internet for users reporting compatibility problems
write JUnits
compare the code in the two jars
As mentioned by KLE, the default approach is to depend on the newer version. There is no guarantee, but most of the time this works. Probably the best way (while being a bloated one) is to use OSGi to get over it.
For a basic "old way" implementation, check out https://github.com/atulsm/ElasticsearchClassLoader
It demonstrates an approach for handling non-backward-compatible versions of the Elasticsearch client.