We are using an external library that has been compile-time woven with AspectJ 1.9.6. We would like to use Java 17 and thus need at least AspectJ 1.9.8 to compile-time weave our own code. At runtime we have a shared classpath (WEB-INF/lib) for both our classes and the external library. Is it safe to provide an AspectJ runtime version 1.9.8 here? Can we go even further and use the latest version of the runtime (which is then also used to compile-time weave our code), i.e. 1.9.9.1?
Basically: is the compile-time woven code of our external dependency compatible with these higher runtimes?
(I am aware of the statements noted at: https://www.eclipse.org/aspectj/doc/released/devguide/compatibility.html)
Please just use the latest AspectJ version, as of today 1.9.9.1, see
https://github.com/eclipse/org.aspectj/blob/master/docs/dist/doc/JavaVersionCompatibility.md
Semi off topic: Actually, I could release AspectJ 1.9.19 (in the future, the minor-minor will mirror the latest supported Java version) anytime, if it was not for some unfixed Eclipse Java compiler bugs concerning Java 19 preview features, which I would like to see fixed upstream before releasing the next AspectJ version. But you do not need Java 19 language features, so you are fine with 1.9.9.1.
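For illustration, here is a minimal sketch of what this could look like if WEB-INF/lib is populated via Maven; the coordinates are the standard org.aspectj ones and the version is simply the latest release mentioned above:
<!-- one shared AspectJ runtime for our own woven classes and the pre-woven library -->
<dependency>
    <groupId>org.aspectj</groupId>
    <artifactId>aspectjrt</artifactId>
    <version>1.9.9.1</version>
</dependency>
The AspectJ runtime is backward compatible, so code woven with 1.9.6 should run against it as well.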
Related
My AspectJ version is still 1.6.8, running in a Java 6 project.
Suppose I migrate to Java 8.
Should I update the AspectJ version? Is it mandatory?
If yes, are there things I have to be aware of?
This question was flagged as a duplicate before, but I voted to re-open it because the other question was about AspectJ Maven Plugin compatibility with Java 8 and the answers there do not explain anything explicitly in order to answer this question.
AspectJ 1.6.x was published for Java 6, just as AspectJ 1.7.x was for Java 7, 1.8.x for Java 8, 1.9.x for Java 9-13 at the date of writing this.
Having said that, you should consider the following:
If you just used the Java 8 compiler but kept compiling your Java 6 code with target 1.6, in theory you could continue using AspectJ 1.6.8 (e.g. the runtime) too.
But as soon as you compile with target 1.8 and/or use Java 8 language features, you have to use at least AspectJ 1.8.x (I recommend the latest version) in order for the AspectJ compiler and weaver to understand those language features and the bytecode at all.
I would even recommend using the latest 1.9.x version; it is backward compatible and might have a few bug fixes missing in 1.8.x. But that is an optional choice. I always use the latest AspectJ version, even on Java 8.
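For illustration, a minimal sketch of such a setup, assuming compile-time weaving is done with the Mojohaus aspectj-maven-plugin (the plugin mentioned in the duplicate question); the plugin and AspectJ versions shown are placeholders, not a recommendation for specific releases:
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>aspectj-maven-plugin</artifactId>
    <version>1.11</version> <!-- illustrative; pick a release matching your Java level -->
    <configuration>
        <complianceLevel>1.8</complianceLevel>
        <source>1.8</source>
        <target>1.8</target>
    </configuration>
    <dependencies>
        <!-- override the AspectJ compiler/weaver bundled with the plugin -->
        <dependency>
            <groupId>org.aspectj</groupId>
            <artifactId>aspectjtools</artifactId>
            <version>1.9.6</version> <!-- any recent 1.9.x -->
        </dependency>
    </dependencies>
    <executions>
        <execution>
            <goals>
                <goal>compile</goal>
            </goals>
        </execution>
    </executions>
</plugin>
Your runtime dependency on org.aspectj:aspectjrt should then use the same 1.9.x version.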
We have a repository built using Java 8. There are multiple REST services within the repository. We want to migrate to Java 11 and are trying to figure out the best way of doing this. We are considering doing it module by module, for example changing one service over to Java 11 while the remaining ones are still on Java 8. We are unsure whether Maven supports this.
Disclaimer: This is not an answer but just a partial report of my recent experience. Feel free to flag this answer if you feel that it doesn't meet the SO standards.
Does Maven support this?
Yes, use the Maven compiler plugin 3.8.0/3.8.1.
However, this migration requires additional care.
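For example, a minimal maven-compiler-plugin sketch targeting Java 11 (the release option, available on JDK 9+, replaces source/target and also checks API usage against the JDK 11 platform):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <version>3.8.1</version>
    <configuration>
        <!-- compile for Java 11 -->
        <release>11</release>
    </configuration>
</plugin>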
Recently we did something like this by migrating from Oracle JDK 8 to OpenJDK 11. As we have hundreds of repositories with different missions, we faced all kinds of problems. Just to cite some that I have here in my e-mail box tagged as [jdk11_migration]:
It is quite obvious, but I'd like to highlight that in order to migrate from Java 8 to 11 we have to meet the requirements of Java 9 and 10 as well.
Some Maven plugins, like Cobertura, do not support Java 11, and I guess they never will. In some cases these plugins have simply been abandoned. The solution was to look for alternatives on a case-by-case basis. For example, we replaced Cobertura with JaCoCo.
rt.jar and tools.jar have been removed! Everything you have explicitly referenced from them will probably break.
Some classes which we shouldn't have been using in Java 9 or earlier no longer exist in Java 11. I'm talking about classes in packages like sun.*, sun.misc, etc. The solution is to look for a one-to-one replacement or to refactor the code to avoid the usage.
Reflection is usually the last resort, and for these cases, in Java 9 and above we get warning messages like:
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by ...
WARNING: Please consider reporting this to the maintainers of ...
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Although it is not exactly a solution, there is a flag to get rid of these warnings: --illegal-access=permit. This is particularly important when using the Maven Surefire plugin.
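As an illustration (the Surefire version shown is just an example), the flag can be passed to the forked test JVM via argLine:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.22.2</version>
    <configuration>
        <!-- permit illegal reflective access during tests on Java 9+ -->
        <argLine>--illegal-access=permit</argLine>
    </configuration>
</plugin>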
Java 9 introduced the module system, so "now" we have the phenomenon of clashing (split) packages. For example, messages like "The package org.w3c.dom is accessible from more than one module: <unnamed>, java.xml". The solution is to find the source of the redundant inclusion (notably duplicated Maven dependencies, or dependencies of dependencies) and remove it.
Although it wasn't a problem for us, I noted that your repository consists mostly of REST components. You will probably face ClassNotFoundException issues regarding packages like javax.xml.bind, which were basically dropped from the Java Standard Edition. You can fix this by including them explicitly in your pom.xml.
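For example, re-adding the JAXB API that was removed from the JDK (the version is illustrative, and depending on your code you may also need a JAXB runtime implementation such as org.glassfish.jaxb:jaxb-runtime):
<dependency>
    <groupId>javax.xml.bind</groupId>
    <artifactId>jaxb-api</artifactId>
    <version>2.3.1</version>
</dependency>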
Luckily, you can find good questions & answers for each issue you will encounter in your migration, here on SO or elsewhere on the internet. There are some migration guides in the wild which are good starting points. Specific issues, like obfuscation and IDE integration, can take a little bit of time, but, at least in my experience, this migration was less painful than I had supposed.
My suggestion is to upgrade the entire project. Trying to have some Java 8 and some Java 11 modules can be very difficult. As you already know, the module system appears starting from Java 9. The question is very generic, so it's difficult to give a detailed response. I assume:
Your project is managed with maven
You have many artefacts (jar)
You are using some library or framework (who said Spring?)
You are using a Source Version Control (Git or Subversion for example)
You have a multi-module project
The code you wrote works on both the Java 8 and the Java 11 platforms
My suggested plan:
Create a branch in your version control system for the Java 11 version of the project and start working on it.
Create a parent module for common definitions
Upgrade the Maven plugins and all your libraries to the latest versions. For the Maven compiler, set the Java 11 target (Java 11 is supported only by the latest versions of the Maven Compiler Plugin).
For each module define the exported packages (and specify which packages are required)
If there are many modules, start with only a few of them and then include the remaining ones.
If it can help you, have a look at this simple project on GitHub (it targets Java 11 and is a multi-module Maven project).
Hope it helps.
We're using Java 8 for most modules/projects, but for some of the modules we use Java 6 (customer requirements).
The developers have Java 8 installed, and we compile the Java 6 projects using these flags:
compileJava {
    sourceCompatibility = 1.6
    targetCompatibility = 1.6
}
We thought we were all good until we upgraded Guava from v20 to the latest, 28.1-jre.
To our surprise, the build was successful but failed at runtime.
We have a workaround for building for Java 6 using a specific javac found in JDK 6. See more info here. This workaround yields the error "class file has wrong version 52.0, should be 50.0" at compile time. The downside is that it requires developers to download, configure, and use JDK 6.
Is there a way to validate the dependencies' Java version at compile time when using a higher Java version (without installing the lower Java version)? Thanks.
Setting -source and -target values to 1.6 is insufficient to ensure that the resulting output is compatible with 1.6. The program itself must not have any library API dependencies on later versions, and the -source and -target options don't do that. (GhostCat said pretty much the same thing.)
For example, in Java 8, ConcurrentHashMap added a covariant override for the keySet method that returns a new type ConcurrentHashMap.KeySetView. This type didn't exist in earlier versions of Java. However, in the class binary, the return type is encoded at the call site. Thus, even if the source code is compiled with -source 1.6 -target 1.6, the resulting class file contains a dependency on the Java 8 class library API.
The only solution to this is to ensure that only Java 1.6 compatible libraries are in the classpath at compile time. This can be done using the -Xbootclasspath option to point to a JDK 1.6 class library, or it might be simpler just to use a JDK 1.6 installation in the first place.
This applies to external libraries in addition to the JDK, as you've discovered with Guava. The Animal Sniffer project provides plugins for Ant and Maven that check library dependencies for version problems. Offhand I don't know if there is something similar for Gradle. There might be a way to get Animal Sniffer to work with Gradle, but I have no experience with doing that.
Is there a way to validate the dependencies' Java version at compile time when using a higher Java version (without installing the lower Java version)?
You specify your dependencies. When you tell your build system to explicitly use some library X in version Y, you have made a very clear statement.
And you see, it is not only about the class file version number. What if some person doesn't pay attention, and compiles something with Java 8 ... with a Java 6 target, but forgets that the code base uses Java 8-only API calls?!
In other words: you are looking in the wrong place.
The person who updates the build description and changes a library version from Y to Y+8 needs to carefully assess that change, for example by reading the release notes.
I agree that a really clever build system could check whether the libraries you are using come with a matching class file version. But as said, that is only one aspect of the problem. So instead of looking for a technical solution, I think the real answer is: don't bump version numbers because you can, but because you have to. And that manual step of changing a version number is something that requires due diligence (on the side of the human doing it).
Thus: I think the sanest approach here is to compile the Java 6 deliverables within their own specific build setup, which you only touch after careful inspection of such details. And sure: convince your customer to move on and give up a long-dead version of Java.
Background:
We have a Maven-based Java project which targets JRE 1.7, but the source code uses lambdas, so we use Retrolambda to transform the Java 8 code to Java 7. We also use the streamsupport backport library when we need streams, function.*, Optional, etc.
Using Retrolambda involves configuring both the project's source and target language levels to 1.8.
Everything works fine if there are no dependencies on Java 8 classes or methods (like java.util.stream.*, java.util.Optional, or methods introduced in Java 8 like Collection.forEach). If there are such usages, the build passes but fails at runtime on the target Java 7 JVM.
Question:
My goal is to fail the build when such dependencies exist. Is there any way of detecting dependencies on new Java 8 classes/methods at build time?
I thought about 2 possible options, but I'm not sure whether either of them is doable:
Some kind of bytecode analyzer for detecting dependencies on predefined classes and methods. Are there such tools/Maven plugins?
Lint (lint4j) rules. I'm not sure whether it's possible to detect a dependency on a class/method using lint.
You can use the Animal Sniffer Maven Plugin for this. It allows you to check that your code only uses APIs from a specified baseline (called the "signature"). In your case you'd use the org.codehaus.mojo.signature:java17:1.0 signature.
As others pointed out, you could also set up the bootstrap classpath, but that a) requires a JDK 7 to be set up and b) makes the build a bit more complex, as you need to point to the JDK 7 install. Animal Sniffer is much easier to work with in my experience.
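A minimal sketch of that configuration, binding the plugin's check goal to the build (the plugin version is illustrative):
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>animal-sniffer-maven-plugin</artifactId>
    <version>1.18</version>
    <configuration>
        <signature>
            <!-- baseline: the public Java 7 API -->
            <groupId>org.codehaus.mojo.signature</groupId>
            <artifactId>java17</artifactId>
            <version>1.0</version>
        </signature>
    </configuration>
    <executions>
        <execution>
            <goals>
                <goal>check</goal>
            </goals>
        </execution>
    </executions>
</plugin>
The build will then fail if the compiled classes reference anything outside the Java 7 API, which is exactly the Java 8-dependency detection asked for above.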
I try to write my code to be compatible with older versions of Java, so it will work everywhere.
On the other hand, there are very powerful tools in the new version, Java 8, and I want to use them.
So I'm forced to choose between compatibility and richer code.
And I'm wondering if by any chance I can write some methods in Java 8 and somehow get the compiler of an older version to ignore those methods, so my class is "partially" compatible with the older version.
Thanks.
You can write two classes and use some tool like Ant, Maven, or Gradle to choose which file to use when compiling for a specific Java version.
You can set the Java compiler to compile against an older JDK (e.g. JDK 1.5) even if you use JDK 1.8; see the javac -source and -target options.
I think the short and easy answer is no.
See this thread: Can Java 8 code be compiled to run on Java 7 jvm?
You can use the Java reflection API to check whether methods exist in the JVM the code runs on. This allows you to make your code fail-safe even when a method or class is unavailable in the JVM. Doing this is very cumbersome, however, and I'm pretty sure it's not what you're looking for.