The question first, the story will follow:
Is it safe to mix different bytecode versions in a class hierarchy? What are the risks?
For example: class C extends class B, class B extends class A, and class A implements interface I.
My question involves the following example scenarios:
Class A is compiled to Java 1.6 bytecode and uses newer language features such as generics, while its heirs, B and C, are compiled to 1.4 bytecode.
Interface I is compiled to 1.6, while its implementor is compiled to 1.4.
Other exotic inheritance scenarios involving different bytecode versions.
I have tried as many scenarios as I could imagine and everything seems to run just fine. However, I still feel the urge to ask here, as I only know Java at the surface; I know how to code and tweak Java, but don't really know what happens under the hood.
Now, for those minds who can't help asking "why would you need to do that???":
I'm on a project to assess the migration of a legacy Java 1.4 Swing app, connected to EJB 2 via RMI, to Java 1.6 Swing connected to a newer version of the app server, itself also running on 1.6. The J2EE platform will still be 1.4 (EJB 2).
The migration will not be "recompile everything to 1.6"; it will be "code and compile new features to 1.6".
The way they do things is like this:
They have only one path in CVS, and everyone commits there. There are no tags or branches whatsoever to identify the production code.
Whenever a new feature needs to be added, they get the JARs from the production server, explode them, replace or add classes as needed, repackage the JARs, and put them back on the server.
Therefore, if they use Java 6 to compile and keep the above deployment method, there will be a lot of exotic mixes of 1.4 and 1.6 bytecode.
The JVM byte code is not significantly different between Java 1.0 and Java 6. In Java 7 they added one new instruction (invokedynamic). Woohoo.
So little has changed in how the byte code works that, for example:
The JVM doesn't support nested classes accessing private members of outer classes; this works through compiler-generated accessor code.
The JVM doesn't support runtime checks for generics, e.g. you cannot write new T() where T is a type parameter, because the type is erased.
Basically, they make the JVM smarter and faster but until recently changing the model of how the byte code works has been avoided at all costs.
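The generics point above can be sketched in code. Since type parameters are erased from the bytecode, the JVM cannot create an instance of T; the usual workaround is an explicit Class token (the Factory class and its names here are made up for illustration, not from the question):

```java
// Sketch only: Factory is a made-up example class.
// Because T is erased at runtime, "new T()" cannot exist in the bytecode;
// instead a Class token is passed in and instances are created reflectively.
public class Factory<T> {
    private final Class<T> type;

    public Factory(Class<T> type) { this.type = type; }

    public T create() throws Exception {
        // return new T();   // does not compile: T does not exist at runtime
        return type.getDeclaredConstructor().newInstance();
    }

    public static void main(String[] args) throws Exception {
        Factory<StringBuilder> f = new Factory<StringBuilder>(StringBuilder.class);
        System.out.println(f.create().getClass().getName()); // prints java.lang.StringBuilder
    }
}
```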
You can compile with Java 6 but target 1.4 with a compiler setting. We did this for a migration project once. If/when 1.4 disappears, you then change your compiler settings again and target 1.6.
Keeping the target version explicit also means that you can upgrade your SDK without fear of your JAR files becoming unusable to an older JVM.
I am maintaining an environment with a mix of 1.4 (old library JARs) and 1.5 (my fixes and stuff) classes on Tomcat using the Sun 1.5 JVM, and it runs fine.
However, for RMI you may be in trouble if the client and server have different class versions, because the server might check the class version (I ran into this problem).
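A common defence against that serialization version check is to pin serialVersionUID explicitly, so that the same class compiled by different javac versions still matches on both sides of the RMI link. A minimal sketch, using a hypothetical DTO rather than the poster's actual classes:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Hypothetical DTO: pinning serialVersionUID keeps the serialized stream's
// class descriptor stable even when each side compiled the class differently.
public class VersionSafeDto implements Serializable {
    private static final long serialVersionUID = 1L;

    private final String payload;

    public VersionSafeDto(String payload) { this.payload = payload; }

    public String getPayload() { return payload; }

    public static void main(String[] args) throws Exception {
        // Local serialization round-trip standing in for an RMI call.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        ObjectOutputStream oos = new ObjectOutputStream(bos);
        oos.writeObject(new VersionSafeDto("ok"));
        oos.flush();
        ObjectInputStream in =
                new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()));
        VersionSafeDto dto = (VersionSafeDto) in.readObject();
        System.out.println(dto.getPayload()); // prints ok
    }
}
```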
The best way to find out is to do a proof of concept type of project on small scale.
A friendly reminder though, you are digging a pretty big hole for yourself here :-)
These links seem relevant. They document the few edge cases that could break compatibility between 1.4 and 1.5 and between 1.5 and 1.6.
The biggest differences I can think of that could cause problems are that enum became a keyword (but that would only affect a 1.5+ JVM loading an older class file, which doesn't seem to be what you will be doing) and annotations. The above links seem to suggest everything would be fine, but I would be wary about what would happen if an older JVM loaded a class with runtime annotations.
Other than that, I don't think there have been any bytecode changes between the first version of Java and Java 6, meaning the only problems you should encounter are changes to functionality in the API, or deprecations (listed in the links above).
As long as you aren't using reflection, the only major problem you could have from differing bytecode versions is the ACC_SUPER flag.
In very early versions, invocation of superclass methods was not handled correctly. When they fixed it, they added a new flag to the classfile format, ACC_SUPER, to enable the fix, so that applications relying on the old, broken behavior were not affected. Naturally, using a class that doesn't set this flag could cause problems.
However, this is ancient history. Every class compiled in 1.4 and later will have the flag, so this isn't a problem. Bytecode wise, the only major differences between 1.4 and 1.6 are the addition of optional attributes used to store metadata about inner classes, generics, annotations, etc.
However, these don't directly affect bytecode execution. The only way they have an effect is if you access them through reflection. For instance, java.lang.Class.getDeclaredClasses() will return information from the optional InnerClasses attribute.
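A minimal sketch of that reflection path:

```java
// Sketch: getDeclaredClasses() is backed by the optional InnerClasses
// attribute; a class file lacking the attribute simply reports no members here.
public class Outer {
    class Inner {}
    static class Nested {}

    public static void main(String[] args) {
        System.out.println(Outer.class.getDeclaredClasses().length); // prints 2
    }
}
```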
I have an old application written using Java 7. It runs fine on a Java 8 JRE. I do not plan on rewriting any of the code to make use of Java 8 features. Is there any technical benefit to upgrading the compiled code to the latest Java 8 JDK?
To be clear, the code is currently compiled with Java 7 and already running with the latest Java 8 JRE. It should already benefit from the Java 8 runtime improvements. This question is whether any benefits would be gained by compiling with version 8 and running with Java 8 compiled byte code.
Also, I am not concerned with non-technical benefits such as developer productivity. I think those are important but not the point of this question. I am asking for the sake of production code that has NO development team. It is purely in maintenance mode.
If I understand the question correctly, you want to know if the bytecode produced by javac will be "better" in Java 8 than in Java 7.
The answer is probably not. They constantly fix bugs in the compiler, and that sometimes leads to more efficient bytecode, but as far as I can see you will not see any significant speedup from these fixes for Java 8; the changelog only lists two major changes between versions.
The Oracle website is terrible and I can't seem to get a list of bug fixes related to javac between versions, but here is a non-exhaustive one from OpenJDK. The majority of the ones I can find are fixing errors. So by updating to Java 8, there is a chance it won't compile any more due to javac following the JLS more correctly, and there will be very little to no "improvement" to the bytecode.
The main benefit is that Java 8 has the latest bug fixes, whereas Java 7 isn't being publicly updated.
Also, if you are going to run the code on a Java 8 JVM, you may as well have just one version of Java installed.
Java 8 might be faster, and it has better support for newer features such as the G1 garbage collector. However, it might be slower for your use case, so the only way to know is to test it.
Is there any technical benefit to upgrading the compiled code to the latest Java 8 JDK?
If you are asking whether there is any benefit in re-compiling Java 7 code with a Java 8 compiler, the answer is: almost none.
The only subtle difference is that there have been minor changes to the Java API, so the Java 8 compiler might find very subtle problems that the Java 7 compiler would not.
Other minor differences are the class-file version number at the start of the file and possibly the order of the constant pool. The byte code is basically the same; even invokedynamic, which is used for lambdas, already existed in Java 7 but just wasn't used that way.
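To see this for yourself (a sketch): compile the snippet below with javac 8 and inspect it with "javap -c LambdaDemo"; the lambda is compiled to a single invokedynamic call site rather than an anonymous inner class.

```java
import java.util.function.Supplier;

// Sketch: under javac 8, the lambda below becomes one invokedynamic
// instruction (visible via "javap -c") instead of a generated inner class.
public class LambdaDemo {
    public static void main(String[] args) {
        Supplier<String> s = () -> "hi";
        System.out.println(s.get()); // prints hi
    }
}
```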
It could help by creating awareness.
When you switch to Java 8, you might find additional warnings being emitted by javac. Example: type inference has been greatly improved with Java 8, and that could eliminate the need for @SuppressWarnings annotations in your current code base (and when such annotations are no longer required, the compiler warns about that).
So even if you don't intend to modify your code base today, switching to Java 8 could tell you about such things. Increasing your knowledge can help in making informed decisions.
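One concrete example of the improved inference (a sketch; the size method is made up): under javac 7, the call in main typically required an explicit type witness, size(Collections.<String>emptyList()), because emptyList() was inferred as List<Object>; javac 8's target typing infers the element type from the parameter.

```java
import java.util.Collections;
import java.util.List;

// Sketch with a made-up method: javac 7 usually rejected this call without
// an explicit witness; javac 8's improved target typing accepts it as-is.
public class InferenceDemo {
    public static int size(List<String> list) { return list.size(); }

    public static void main(String[] args) {
        System.out.println(size(Collections.emptyList())); // prints 0
    }
}
```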
On the other hand:
I saw some questions here about (rare) situations where Java 8 refused to compile Java 7 code. So switching to Java 8 also carries a (minimal) risk of running into that kind of problem.
And even if you don't intend to touch your code base today, there is a certain chance that you change your mind later on. Then, if you're not paying attention, you might use Java 8 features, which could complicate "field updates", as you would then have two versions of your source code to maintain!
Then, in case you have customers running the product on a Java 7 JRE, you have to be really careful about the binary fixes you give them. We have such a setup, and I have wasted time more than once because I accidentally put a single Java 8-compiled class onto a Java 7-driven test system. That simply can't happen when your dev and test/customer setups are all Java 7.
Long story short: there are a few subtle advantages, and certain risks (where the significance of the risks mainly depend on your overall setup).
I would do it, for at least these reasons:
1) HashMap internals (it is faster under JDK 8)
2) Lots of bug fixes that might be transparent to you (runtime optimizations), making your code faster and better without you actually doing anything.
3) G1 Garbage Collector
EDIT
From a technical point of view, this sounds more like something to do with ahead-of-time compilation, or something a compiler might improve by analyzing the code more. As far as I know, such things are not done in the Java 8 compiler.
From a developer point of view - there are plenty. Increased productivity is the most important one for me.
EDIT 2
I know of only two points that match your second query:
-parameters
to preserve the method parameter names.
-profile
the Compact Profile option, for a smaller footprint.
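The effect of -parameters can be observed through reflection (a sketch; greet is a made-up method). Only when the class is compiled with "javac -parameters" does reflection report the real names; otherwise synthetic names like arg0 and arg1 appear.

```java
import java.lang.reflect.Method;
import java.lang.reflect.Parameter;

// Sketch with a made-up method: the printed names depend on whether the
// class was compiled with "javac -parameters".
public class ParamDemo {
    public static void greet(String name, int times) { }

    public static void main(String[] args) throws Exception {
        Method m = ParamDemo.class.getMethod("greet", String.class, int.class);
        for (Parameter p : m.getParameters()) {
            System.out.println(p.getName() + " namePresent=" + p.isNamePresent());
        }
    }
}
```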
If you have no other reasons to recompile your application, then it probably does not make much difference, as stated in the accepted answer.
However, if you have to recompile it even only once, consider this:
Your application source code is compatible with Java 7, and most likely 8 too;
In the eventuality that the code does not compile with Java 8, it will probably not compile with a Java 8 compiler in Java 7 source-compatibility mode either (-source 7 with javac);
Your developers and CI will need to run unit and integration tests against a Java 8 runtime to be as close as possible to the production environment. Developers will also need to run the application on the same Java 8 runtime when running it locally;
It is more difficult to compile with a JDK 7 and run with a JRE 8 (in the same build process, or in the same IDE) than doing everything with the same version;
There is no benefit of using -source 7 instead of -source 8 if you compile with a JDK 8 and your target runtime is Java 8;
Using -source 8 guarantees that the developer is using Java 8 (or later) for both compilation and runtime (as it enforces -target 8).
In conclusion, don't recompile it if you don't need to. However, on the first occasion you have to recompile (due to code changes), switch to Java 8. Don't take the risk of having a bug due to environment mismatches, and don't restrict the developers without a good reason.
The compiler displays warnings if you use Sun's proprietary Java classes. I'm of the opinion that it's generally a bad idea to use these classes; I read this somewhere. However, aside from the warnings, are there any fundamental reasons why you should not use them?
Because they are internal APIs: they are subject to change in an undocumented or unsupported way, and they are bound to a specific JRE/JDK (Sun's, in your case), limiting the portability of your programs.
Try to avoid using such APIs; always prefer a public, documented and specified class.
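A concrete illustration of that advice, assuming Java 8 or later: sun.misc.BASE64Encoder was a widely used internal class that was eventually removed from the JDK; the public, portable replacement is java.util.Base64.

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Classic example: code written against sun.misc.BASE64Encoder broke when the
// internal class was removed; java.util.Base64 is the supported equivalent.
public class Base64Demo {
    public static void main(String[] args) {
        String encoded = Base64.getEncoder()
                .encodeToString("hello".getBytes(StandardCharsets.UTF_8));
        System.out.println(encoded); // prints aGVsbG8=
    }
}
```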
The JDK 6 Documentation includes a link titled Note About sun.* Packages. This is a document from the Java 1.2 docs, so references to sun.* should be treated as if they said com.sun.*
The most important points from it are:
The classes that Sun includes with the Java 2 SDK, Standard Edition, fall into package groups java.*, javax.*, org.* and sun.*. All but the sun.* packages are a standard part of the Java platform and will be supported into the future. In general, packages such as sun.*, that are outside of the Java platform, can be different across OS platforms (Solaris, Windows, Linux, Macintosh, etc.) and can change at any time without notice with SDK versions (1.2, 1.2.1, 1.2.3, etc). Programs that contain direct calls to the sun.* packages are not 100% Pure Java.
and
Each company that implements the Java platform will do so in their own private way. The classes in sun.* are present in the SDK to support the Sun implementation of the Java platform: the sun.* classes are what make the Java platform classes work "under the covers" for the Sun Java 2 SDK. These classes will not in general be present on another vendor's Java platform. If your Java program asks for a class "sun.package.Foo" by name, it may fail with ClassNotFoundError, and you will have lost a major advantage of developing in Java.
Try running your code with a non-Sun JVM and see what happens...
(Your code will fail with a ClassNotFoundError.)
Yes, because nobody guarantees that these classes or APIs will be the same in the next Java release, and I bet it's not guaranteed that those classes are even available in Java versions from other vendors.
So you couple your code to a specific Java version, and lose at least portability.
Sun's proprietary Java classes are part of their Java implementation, not part of the Java API; their use is undocumented and unsupported. Since they are internal, they can be changed at any time, for any reason the team working on the Sun JVM decides.
Also, Sun's Java implementation is not the only one out there! Your code would not be portable to JVMs from other vendors like Oracle/BEA and IBM.
Here is Oracle's answer: Why Developers Should Not Write Programs That Call 'sun' Packages
I recently had a case that showed a real-world problem you can hit when you use these classes: we had code that would not compile, because a method it used on a sun.* class simply did not exist in OpenJDK on Ubuntu. So I guess when you use these classes you can no longer say things like "this works with Java 5", because it will only work on a certain Java implementation.
Let's say I have a Java project that is coded against Java 1.5, and I'm using a later version of Java but with the target set to 1.5.
If the code compiles and tests OK with the later Java, am I then guaranteed that it will work the same on an actual Java 1.5 runtime?
Or will I need to install every JRE version that I depend on, to be sure?
What happens with bugs in the JRE? If there is a bug in 1.5, that is fixed in 1.6. If I use Java 1.6 with target set to 1.5, will that bug affect me?
In a realistic scenario, is this a concern that I need to have at all?
Assuming you set target and source to 1.5, you only need to be worried in three main cases I can think of:
You're using internal com.sun classes, which may have changed or disappeared (or you're relying on some other internal behaviour somehow).
You're relying on buggy behaviour which was fixed in a later version.
You run into a backwards-incompatible change (rare, but it has been known to happen).
What happens with bugs in the JRE? If there is a bug in 1.5, that is fixed in 1.6. If I use Java 1.6 with target set to 1.5, will that bug affect me?
If the bug is in the libraries, then no, it won't affect you. Target only really stipulates the version of the bytecode you compile to; you'll still be using the updated libraries. As said earlier, note however that this could potentially cause issues if you rely on that buggy behaviour.
If there is a deliberate backwards-incompatible change, then all cases I've seen of it present themselves as compile-time errors rather than runtime errors, so they'll be trivial to spot (and usually pretty easy to fix as well).
I'd still advocate testing on the new version of the JVM before you release, but in practice it's not usually a problem, in my experience anyway.
All new JRE implementations are made in a way that maintains compatibility, so the answer is yes.
However, I suggest that you test your app as there might be problems very specific to your project.
To sum up your question:
Is JRE backward compatible, and is JDK forward compatible?
The short answer is yes.
Explanation:
JDKs are not backward compatible, i.e.
JDK 5-compiled code can't run on a 1.4 JVM, nor JDK 6 code on a Java 5 JVM.
The JRE, however, is made backward compatible, because organizations often write once and execute many times.
Why:
As JREs become more and more sophisticated, with smarter heap management, garbage collection, thread handling, etc., customers are tempted to move to newer versions of the JVM.
Bugs
Real bugs present in a JVM will stop behaving that way if you use a later JVM with an earlier 'target', because target=prev_version doesn't actually invoke the previous JVM.
It only picks up the delta and treats the code differently. However, if the behaviour was introduced intentionally in the new JVM (say, 6), switching to target=1.5 will actually fall back to the behaviour of 1.5.
Hope that clarifies your doubt to some extent.
I've inherited a very old application that hasn't been upgraded because it depends on a third party library that is dependent on Java 4.
Getting rid of this third party library isn't going to happen in the near future as a critical part of the system is dependent on it.
I want to bring the Java version of the application up to date and am thinking of moving the dependent jar into its own VM and then having some kind of call between the Java 6/7 VM and the Java 4 VM.
First thoughts are to use RMI. Obvious first question is compatibility between VMs when using different Java versions. The third party lib produces byte streams so the returned data won't be affected by serialization. The data passed in can be manipulated into something that can be passed across if compatibility is an issue.
Is this the right way to go?
Are there better ways?
You can write a wrapper, as Phillip said in one of your comments; there are only some minor adjustments that you have to make in your old library.
You can use RMI; it's safe! I tested it and it's a good approach; you don't have to change old, messy code.
You can also use web services, but I prefer RMI.
If you have to modify something small and you have too many problems changing the JVM (because someone modified some JDK libraries [happened to me]), just leave it on Java 4... :)
The problem I had was that I needed a library from JDK 1.5 to use under JDK 1.4, and my solution was to decompile the JDK 1.5 one and compile it back with JDK 1.4, because the old JDK had something modified in it...
The problem I had with the decompiled code is that I couldn't find a very good decompiler that knows about casting, and I got some StackOverflowError crashes (but those are easy to fix).
I hope my answer helps you
I like generics a lot and use them wherever I can. Every now and then I need to use one of my classes in another project which has to run on an old JVM (before 5.0), needs to run on Java ME (where generics are not allowed either), or in Microsoft J# (which has VERY poor support for generics).
At the moment, I remove all generics manually, which means inserting many casts as well.
Since generics are said to be compile-time only, and every piece of generic code could possibly be converted to non-generic code automatically, I wonder if there is any tool which can do this for me.
If there is no such tool, how else could I solve the problem? Should I completely stop using generics?
There already are answers related to bytecode compatibility. What if I need source code compatibility for some reason?
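The manual conversion the question describes amounts to erasing the type parameters and writing out the casts that the compiler would otherwise insert; a minimal sketch:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the manual de-genericizing: erase type parameters to raw types
// and make the compiler's implicit casts explicit.
public class ErasureDemo {
    public static void main(String[] args) {
        // Java 5 source with generics:
        List<String> names = new ArrayList<String>();
        names.add("alice");
        String first = names.get(0);

        // Hand-converted equivalent for pre-5.0 source compatibility:
        List raw = new ArrayList();
        raw.add("alice");
        String firstRaw = (String) raw.get(0); // the formerly implicit cast

        System.out.println(first.equals(firstRaw)); // prints true
    }
}
```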
You need to use something like Retroweaver to achieve this sort of thing. The other answers to this question are slightly misleading: generics are sort-of bytecode compatible with previous versions, but not entirely (see java.lang.reflect.Type if you don't believe me). Also, there is the issue of the bytecode version attribute, which will prevent a class compiled against 1.5 from running on a previous version. Retroweaver works around both problems, while also enabling other Java 5 features like annotations and enums.
In NetBeans (I'm not sure about which IDE you are using) you can set the source-code compatibility to a given Java version; just set it to one that supports generics. As already posted, generics are bytecode compatible with old JVM/JRE versions, so it should hopefully work out of the box.
To the best of my knowledge, Java 5 is not byte-code compatible with Java 1.4. That is, you cannot use Java 5-compiled classes with an earlier VM.
You can check Retroweaver. It was mentioned a lot when generics were introduced. I personally have no experience with it.
Did you ask Google? My search turned up http://www.publicobject.com/glazedlists/documentation/Generics_and_Java_1.4_with_one_codebase.pdf, which seems a very interesting approach.
It's bytecode compatible; it should work out of the box with an old interpreter.