Updated
Take http://bugs.sun.com/view_bug.do?bug_id=6857057. We witnessed this bug in JRE 6u17. How do I know which version of the JRE contains the fix? The general consensus on the Internet seems to be 6u18, but how can I tell this from the bug database? It says fixed in 6u16-rev, but I don't know what '-rev' means. It also says it still affects version 6u18, so is there some time dilation going on?
Note
Originally I assumed I'd need to build against a newer version of the JDK, so I asked how I'd know which version of the JDK contained the fix. I've updated the question based on mindas explaining that it's actually the JRE which contains the fix.
You need to understand the difference between static and dynamic linking. In the Java world, all JVMs are obliged to load classes as they are referenced. In other words, code compiled against an older JDK version carries no information about how that older JDK implemented a particular method.
Unless the method signature has changed (from your bugs.sun.com link I assume it hasn't), you should not need to rebuild your classes. Even if it had, you would be getting a different error at run time (a NoSuchMethodError).
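Because linking is dynamic, the practical question becomes which JRE is actually running the code. A minimal sketch of a runtime guard, assuming (as the consensus suggests) that the fix shipped in 6u18; the class name and the warning text are illustrative, not from the bug report:

```java
// Hypothetical runtime guard: warn when running on a 1.6.0 JRE
// older than update 18, which is assumed here to contain the fix.
public class VersionGuard {

    // Extracts the update number from strings like "1.6.0_17"
    // or "1.6.0_18-ea"; returns 0 when there is no "_NN" suffix.
    static int updateOf(String version) {
        int idx = version.indexOf('_');
        if (idx < 0) return 0;
        String tail = version.substring(idx + 1).replaceAll("[^0-9].*", "");
        return Integer.parseInt(tail);
    }

    public static void main(String[] args) {
        String v = System.getProperty("java.version"); // e.g. "1.6.0_17"
        if (v.startsWith("1.6.0") && updateOf(v) < 18) {
            System.err.println("Warning: JRE " + v + " may still contain bug 6857057");
        } else {
            System.out.println("JRE " + v + " assumed OK");
        }
    }
}
```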
Related
Recently I was researching common vulnerabilities for a particular version of the JRE (1.8.0_151) we still have in use, and stumbled upon cvedetails.com. The result was pretty confusing, since there seems to be no known CVE for this particular version at all; at least, the page does not list it. However, the page lists results for all kinds of newer JRE versions. This could lead to the (probably false) assumption that version 8.0_151 is more secure than the newer JRE releases that followed it, and that there is no need to update.
List of all CVEs for JRE on cvedetails.com
Does anyone know why this particular version isn't listed, or whether it is perhaps counted together with version 152?
Additionally, what update strategies would you recommend for the JRE with respect to security? Are there any best practices? I am aware that testing compatibility with the application takes time and money, but apart from that, it would be great to know the best reasons to stay up to date with the JRE.
Many thanks!
You should assume that the vulnerability exists not just in the specified update, but also in all prior updates of the given JRE version. So if the alert calls out 1.8.0_151, you should assume that the issue exists in 1.8.0_anything-equal-to-or-less-than-151.
This isn't just because it's better to err on the side of caution. It's because that's almost always the actual reality of the situation.
There are a couple of reasons why that CVEDetails summary page is incomplete in the sense of not listing every affected update. The first is that Oracle changed the format of its CVE notices back in 2014. The earlier format was something like this:
Unspecified vulnerability in Oracle Java SE 7u40 and earlier, Java SE 6u60 and earlier, and Java SE Embedded 7u40 and earlier ...
which makes it clear that the vulnerability does not exist only in 1.7.0_40, 1.6.0_60 and Embedded 1.7.0_40. The fact that the vulnerability exists in earlier updates is true for practically every vulnerability, not just in Java but in any software. The only time that's not the case is when a vulnerability was introduced in an update, and thankfully that's pretty rare.
Oracle's newer format is something like this:
Unspecified vulnerability in Oracle Java SE 7u45 and Java SE Embedded 7u45, and OpenJDK 7 ...
which no longer makes any statement about the existence of the issue in earlier updates. Oracle would probably say that they do this because the earlier updates are no longer supported and therefore there's no point in even investigating whether they're vulnerable or not.
In fact Oracle's current format makes that position explicit:
Vulnerability in the Java SE, Java SE Embedded product of Oracle Java SE (component: Networking). Supported versions that are affected are Java SE: 7u241 and 8u231; Java SE Embedded: 8u231.
Only the currently-supported versions are specified as being affected. But the odds are overwhelming that the issue exists in earlier updates too.
The second issue is that even when the original alert format specified "update NNN and earlier", that "and earlier" part is not reflected in the CVEDetails summaries. For example, the CVEDetails summary for JRE 1.7.0 shows no vulnerabilities for 1.7.0_39, 1.7.0_38, 1.7.0_37, ... even though those were all affected by the "7u40 and earlier" issue in the example of the original alert format I showed above.
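Following the "equal or earlier" reading described above, here is a sketch of how one might flag an installed update against an advisory. The class and method names are made up for illustration, and version strings are assumed to follow the 1.8.0_NNN format:

```java
// Illustrative check: treat an installed JRE as affected by an
// advisory when it is in the same release line and its update
// number is equal to or lower than the one named in the alert.
public class Affected {

    // Update number from "1.8.0_151" -> 151; 0 if no "_NN" suffix.
    static int update(String v) {
        int i = v.indexOf('_');
        return i < 0 ? 0 : Integer.parseInt(v.substring(i + 1));
    }

    static boolean affected(String installed, String advisory) {
        String baseInstalled = installed.split("_")[0]; // e.g. "1.8.0"
        String baseAdvisory = advisory.split("_")[0];
        return baseInstalled.equals(baseAdvisory)
                && update(installed) <= update(advisory);
    }

    public static void main(String[] args) {
        System.out.println(affected("1.8.0_144", "1.8.0_151")); // earlier update: affected
        System.out.println(affected("1.8.0_161", "1.8.0_151")); // later update: not flagged
    }
}
```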
Additionally, what update strategies would you recommend for the JRE with respect to security? Are there any best practices?
Opinions vary, and opinions are off-topic for StackOverflow. But IMO, whenever a new update comes out (even if it has no security fixes) you should revalidate your app against the new JRE, so you know in advance whether there's going to be trouble when your customers apply that JRE update. If there are incompatibilities then you should resolve those ASAP.
If the vulnerability is severe and is exploitable in your app then you should let your customers know that they should apply the JRE update, perhaps after they've first installed a new update of your app if you found incompatibilities when you revalidated. If the vulnerability is mild and/or not exploitable in your app then you should let your customers know that, let them know if the JRE update requires an app update, and let them decide whether to move to the updated JRE.
Let's say I have a Java project that is coded against Java 1.5, and I'm compiling with a later version of Java with the target set to 1.5.
If the code compiles and tests OK with the later Java, am I then guaranteed that it will work the same on an actual Java 1.5 runtime?
Or will I need to install every JRE version that I depend on to be sure?
What happens with bugs in the JRE? If there is a bug in 1.5 that is fixed in 1.6, and I use Java 1.6 with the target set to 1.5, will that bug affect me?
In a realistic scenario, is this a concern that I need to have at all?
Assuming you set target and source to 1.5, you only need to worry in three main cases I can think of:
You're using internal com.sun classes, which may have changed or disappeared (or relying on some other internal behaviour somehow.)
You're relying on buggy behaviour which was fixed in a later version.
You run into a backwards incompatible change (rare but it has been known to happen.)
What happens with bugs in the JRE? If there is a bug in 1.5 that is fixed in 1.6, and I use Java 1.6 with the target set to 1.5, will that bug affect me?
If the bug is in the libraries, then no, it won't affect you. The target setting only really stipulates the version of the bytecode you compile to; at run time you'll still be using the updated libraries. As said earlier, note however that this could cause issues if you rely on the buggy behaviour.
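A quick way to see this split for yourself is to print what the running JVM reports. Nothing here is specific to the question; these are standard system properties, and java.class.version reports the class-file format the running JVM supports (50.0 for Java 6), independent of what any particular class was compiled to:

```java
// The bytecode level of a class is fixed at compile time (-target),
// while library behaviour comes from whichever JRE actually runs it.
// These two properties make the distinction visible.
public class RuntimeVsTarget {
    public static void main(String[] args) {
        System.out.println("Running JRE:            " + System.getProperty("java.version"));
        System.out.println("Supported class format: " + System.getProperty("java.class.version"));
    }
}
```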
If there is a deliberate backwards incompatible change, then all cases I've seen of this present themselves as compile time errors rather than runtime errors, so they'll be trivial to spot (and usually pretty easy to fix as well.)
I'd still advocate testing on the new version of the JVM before you release, but in practice it's not usually a problem, in my experience anyway.
All new JRE implementations are designed to maintain backwards compatibility, so the answer is yes.
However, I suggest that you test your app as there might be problems very specific to your project.
To sum up your question:
Is JRE backward compatible, and is JDK forward compatible?
The short answer is yes.
Explanation:
JDKs are not backward compatible, i.e.
JDK 5 code can't run on JVM 4, nor JDK 6 code on JVM 5.
However, the JRE is made backward compatible, because organizations often write once and execute many times.
Why:
As JREs become more and more sophisticated, with smarter heap management, garbage collection, thread handling, etc., customers are tempted to move to newer versions of the JVM.
Bugs
Real bugs present in a JVM will stop behaving the buggy way if you use a later JVM with an earlier 'target', because target=prev_version doesn't actually invoke the previous JVM.
It only changes how the bytecode is emitted. However, if the behaviour was a feature introduced intentionally in a new JVM (say 6), switching to target=1.5 will actually fall back to the 1.5 behaviour.
Hope that clarifies your doubt to certain extent.
I've inherited a very old application that hasn't been upgraded because it depends on a third party library that is dependent on Java 4.
Getting rid of this third party library isn't going to happen in the near future as a critical part of the system is dependent on it.
I want to bring the Java version of the application up to date and am thinking of moving the dependent jar into its own VM and then having some kind of call between the Java 6/7 VM and the Java 4 VM.
First thoughts are to use RMI. The obvious first question is compatibility between VMs running different Java versions. The third party lib produces byte streams, so the returned data won't be affected by serialization. The data passed in can be manipulated into something that can be passed across if compatibility is an issue.
Is this the right way to go?
Are there better ways?
You can write a wrapper, as Phillip said in one of your comments; there are only some minor adjustments you would have to make to your old library.
You can use RMI. It's safe; I tested it and it's a good approach, and you don't have to change the old, messy code.
You can also use web services, but I prefer RMI.
If you only have to modify something small and changing the JVM would cause too many problems (because someone modified some JDK libraries, which happened to me), just leave it on Java 4. :)
The problem I had was that I needed a library from JDK 1.5 to use on JDK 1.4, and my solution was to decompile the JDK 1.5 one and compile it back with JDK 1.4, because the old JDK had something modified in it.
The problem I had with the decompiled code was that I couldn't find a decompiler that handled casting well, and I hit some StackOverflowErrors (but those are easy to fix).
I hope my answer helps you.
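The RMI option can be sketched end-to-end in one file. This is only an illustrative sketch: the names LegacyService and LegacyServiceImpl are invented, the implementation just reverses bytes where the real one would call into the third-party library, and both ends run in one VM here, whereas in the real setup the implementation would live in the Java 4 VM and the lookup would happen from the Java 6/7 VM:

```java
import java.rmi.Remote;
import java.rmi.RemoteException;
import java.rmi.registry.LocateRegistry;
import java.rmi.registry.Registry;
import java.rmi.server.UnicastRemoteObject;

// Remote interface shared by both VMs. Passing byte[] sidesteps
// serialization-compatibility worries between JRE versions.
interface LegacyService extends Remote {
    byte[] process(byte[] input) throws RemoteException;
}

// Would run inside the old (Java 4) VM, delegating to the legacy jar;
// here it just reverses the bytes so the sketch is self-contained.
class LegacyServiceImpl extends UnicastRemoteObject implements LegacyService {
    protected LegacyServiceImpl() throws RemoteException { super(); }
    public byte[] process(byte[] input) throws RemoteException {
        byte[] out = new byte[input.length];
        for (int i = 0; i < input.length; i++) out[i] = input[input.length - 1 - i];
        return out;
    }
}

public class RmiBridgeDemo {
    // Binds the service, calls it through an RMI stub, and cleans up.
    static byte[] roundTrip(byte[] input) throws Exception {
        Registry registry = LocateRegistry.createRegistry(1099);
        LegacyServiceImpl impl = new LegacyServiceImpl();
        try {
            registry.rebind("legacy", impl);
            LegacyService svc = (LegacyService) registry.lookup("legacy");
            return svc.process(input);
        } finally {
            UnicastRemoteObject.unexportObject(impl, true);
            UnicastRemoteObject.unexportObject(registry, true);
        }
    }

    public static void main(String[] args) throws Exception {
        byte[] result = roundTrip(new byte[]{1, 2, 3});
        System.out.println(result[0] + " " + result[1] + " " + result[2]);
    }
}
```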
The question first, the story will follow:
Is it safe to mix different bytecode version in a class hierarchy? What are the risks?
For a case, Class C extends B, Class B extends Class A. Class A implements Interface I.
My question would involve following example scenarios:
Class A is compiled to Java 1.6 bytecode and uses 1.6 features such as generics, etc. Its heirs, B and C, were compiled to 1.4 bytecode.
Interface I is compiled to 1.6, while its implementor is compiled to 1.4.
Other exotic inheritance scenarios involving different bytecode versions.
I have tried as many scenarios as I could imagine and everything seems to run just fine. However, I still feel the urge to ask here, as I only know Java at the surface; I know how to code and tweak Java but don't really know what happens under the hood.
Now for those minds who can't help themselves to ask "why would you need to do that???".
I'm in a project to assess the migration of legacy Java 1.4 Swing app, connected to EJB 2 via RMI, to Java 1.6 Swing connected to newer version of App Server running on top of 1.6 also. The J2EE platform will still be 1.4 (EJB 2).
The migration will not be "recompile everything to 1.6", but it will be "code and compile new features to 1.6".
The way they do things is like this:
They only have one path in the CVS, everyone commits there. No tags/branches whatsoever to get the production code.
Whenever a new feature needs to be added, they get the JARs from the production server, explode them, replace or add new classes as needed, repackage the jars, and put them back on the server.
Therefore, if they will use Java 6 to compile and using the above method for deployment, there will be a lot of exotic mixes of 1.4 and 1.6 bytecodes.
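Given that deployment style, it may help to audit what is actually inside the repackaged jars. Here is a small sketch that reads a class file's version straight from its header; the class name is made up for illustration, and the major-version numbers come from the class file format (48 is 1.4, 50 is 1.6):

```java
import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

// Reads the class-file major version from the 8-byte header:
// magic (0xCAFEBABE), minor version, major version.
public class ClassVersion {

    public static int majorVersion(String classFile) throws IOException {
        DataInputStream in = new DataInputStream(new FileInputStream(classFile));
        try {
            if (in.readInt() != 0xCAFEBABE)
                throw new IOException("Not a class file: " + classFile);
            in.readUnsignedShort();        // minor version
            return in.readUnsignedShort(); // major version: 48 = 1.4, 50 = 1.6
        } finally {
            in.close();
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(args[0] + ": major version " + majorVersion(args[0]));
    }
}
```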
The JVM byte code is not significantly different between Java 1.0 and Java 6. In Java 7 they added one new instruction. Woohoo.
There are so few changes in how the byte code works that, for example:
The JVM doesn't support nested classes accessing private members of outer classes, this works through generated code.
The JVM doesn't support runtime checks for generics e.g you cannot new T() where T is a generic.
Basically, they make the JVM smarter and faster but until recently changing the model of how the byte code works has been avoided at all costs.
You can compile with Java 6 but target 1.4 with a compiler setting. We did this for a migration project once. If/when 1.4 disappears, you then change your compiler settings again and target 1.6.
Keeping the target version explicit also means that you can upgrade your SDK without fear of your JAR files becoming unusable to an older JVM.
I am maintaining an environment with mix of 1.4 (old library jars) and 1.5 (my fixes and stuff) classes on Tomcat using Sun JVM 1.5 and it runs fine.
However, for RMI you may be in trouble if the client and server have different class versions, because the server might check the class version (I ran into this problem).
The best way to find out is to do a proof of concept type of project on small scale.
A friendly reminder though, you are digging a pretty big hole for yourself here :-)
These links seem relevant. They document the few edge cases that could break compatibility between 1.4 and 1.5 and between 1.5 and 1.6.
The biggest difference I can think of that could cause problems is that enum became a keyword, but that would only affect a 1.5+ JVM loading an older class file (which doesn't seem to be what you will be doing). The other thing is annotations. The above links seem to suggest everything would be fine, but I would be wary about what would happen if an older JVM loaded a class with runtime annotations.
Other than that, I don't think there have been any bytecode changes between the first version of Java and Java 6, meaning the only problems you should encounter are changes to functionality in the API or deprecations (listed in the links above).
As long as you aren't using reflection, the only major problem you could have from differing bytecode versions is the ACC_SUPER flag.
In very early versions, invocation of superclass methods was not handled correctly. When they fixed it, they added a new flag to the classfile format, ACC_SUPER to enable it, so that applications relying on the old, broken, behavior were not affected. Naturally, using a class that doesn't contain this flag could cause problems.
However, this is ancient history. Every class compiled in 1.4 and later will have the flag, so this isn't a problem. Bytecode wise, the only major differences between 1.4 and 1.6 are the addition of optional attributes used to store metadata about inner classes, generics, annotations, etc.
However, these don't directly affect bytecode execution. The only way they have an effect is if you access them through reflection. For instance, java.lang.Class.getDeclaredClasses() will return information from the optional InnerClasses attribute.
In IE, I can use the classid "clsid:CAFEEFAC-0015-0000-0011-ABCDEFFEDCBA" to tell it to use java version 1.5.0_11. Is there an equivalent for Firefox and other browsers?
I can use the classid "clsid:CAFEEFAC-0015-0000-0011-ABCDEFFEDCBA" to tell it to use java version 1.5.0_11
Not any more, you can't, for good (security) reasons. See http://java.sun.com/javase/6/webnotes/deploy/deployment-policy.html
There is an IE-only clsid mechanism for asking for "5.0_(something)" in general: http://java.sun.com/javase/6/webnotes/family-clsid.html . This was introduced in 5.0u7, so if you have any JRE from u7 onwards installed you get this behaviour; otherwise you get the old, incompatible behaviour.
Sun did not deign to provide a similar mechanism for other browsers until 6.0u10, when they added a bunch more mechanisms for choosing versions and deprecated all the old ones including the 5.0u7 family chooser. See https://jdk6.dev.java.net/plugin2/version-selection/ for all the gory details.
So what behaviour you get depends not only on the browser and whether the version of the JVM you want is installed, but what other versions are installed as well. The new behaviour is at least consistent, but it is completely different to all that went before and not entirely compatible. By the time your apps' deployment HTML has been updated to cope with it, they'll probably work with 1.6 anyway.
So in summary, as usual with applets, the whole thing's a bloody mess. Yay.