Question from a book:
In the past (pre-Java 8), you were told that it’s bad form to add methods to an interface because it would break existing code. Now you are told that it’s okay to add new methods, provided you also supply a default implementation.
How safe is that? Describe a scenario where the new stream method of the Collection interface causes legacy code to fail compilation.
What about binary compatibility? Will legacy code from a JAR file still run?
My answers are as follows but I am not quite sure about them.
It's safe only if legacy code does not provide a method with the same name (stream) and the same signature (e.g. in a legacy class that implements Collection). Otherwise, that legacy code will fail to compile.
I think binary compatibility is preserved: legacy code from an old JAR file will still run. But I have no clear arguments for this.
Could anyone confirm or reject these answers, or just add some more arguments, references, or clarity to these answers?
The new stream() default method in Collection returns a Stream<E>, also a new type in Java 8. Legacy code will fail compilation if it contains a stream() method with the same signature, but returning something else, resulting in a clash of return types.
Legacy code will continue to run as long as it's not recompiled.
First, in 1.7, set up the following:
public interface MyCollection {
    public void foo();
}
public class Legacy implements MyCollection {
    @Override
    public void foo() {
        System.out.println("foo");
    }

    public void stream() {
        System.out.println("Legacy");
    }
}
public class Main {
    public static void main(String args[]) {
        Legacy l = new Legacy();
        l.foo();
        l.stream();
    }
}
With -source 1.7 -target 1.7, this compiles and runs:
$ javac -target 1.7 -source 1.7 Legacy.java MyCollection.java Main.java
$ java Main
foo
Legacy
Now in 1.8, we add the stream method to MyCollection.
import java.util.stream.Stream;

public interface MyCollection {
    public void foo();

    public default Stream<String> stream() {
        return null;
    }
}
We compile only MyCollection in 1.8.
$ javac MyCollection.java
$ java Main
foo
Legacy
Of course we can't recompile Legacy.java any more.
$ javac Legacy.java
Legacy.java:11: error: stream() in Legacy cannot implement stream() in MyCollection
public void stream()
^
return type void is not compatible with Stream<String>
1 error
According to the Java tutorial on Oracle, if a deprecated method marked with the @Deprecated annotation is used, the compiler should give a warning at compilation. But with the following code sample, I am not getting any warning in the console.
Java version used: 1.8.0_112
Please let me know what could be missing here.
Thanks.
public class PreDefinedAnnotationTypesTest {

    /**
     * This method is deprecated.
     * @deprecated
     */
    @Deprecated
    public void m1() {
    }

    public static void main(String[] args) {
        PreDefinedAnnotationTypesTest obj = new PreDefinedAnnotationTypesTest();
        obj.m1();
    }
}
From docs
The compiler suppresses deprecation warnings if a deprecated item is used within an entity which itself is deprecated or is used within the same outermost class or is used in an entity that is annotated to suppress the warning.
So your method is being used within the same class in which it is declared; simply try using it from some other class.
For example, a wontShowWarning() method called only from within its own class will not generate any warning, whereas a deprecated show() method called from another class will.
The API itself can have different rules because it is presumed that the outermost class will be updated along with the new design, so the warning is just an indication to other classes.
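For reference, here is a minimal sketch (with hypothetical class names DeprecatedApi and Caller, both placed in one file DeprecatedApi.java) showing that the warning does appear once the deprecated method is used from a different outermost class:
public class DeprecatedApi {

    /**
     * This method is deprecated.
     * @deprecated use a replacement instead
     */
    @Deprecated
    public void m1() {
    }
}

// A different outermost class: compiling now produces the deprecation note;
// pass -Xlint:deprecation to javac to see the full per-use warning.
class Caller {
    public static void main(String[] args) {
        new DeprecatedApi().m1();
    }
}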
For compiling your Java program from the command prompt with lint warnings enabled:
javac -Xlint className.java
With this jdk code in ../java/lang/Override.java,
package java.lang;
import java.lang.annotation.*;
@Target(ElementType.METHOD)
@Retention(RetentionPolicy.SOURCE)
public @interface Override {
}
which contains just the annotation declaration, the Java compiler is intelligent enough to detect this error at compile time:
The method toString123() of type Example must override or implement a supertype method
in the problem code below.
package annotationtype;
public class Example {

    @Override public String toString() {
        return "Override the toString() of the superclass";
    }

    @Override public String toString123() {
        return "Override the toString123() of the superclass";
    }

    public static void main(String[] args) {
    }
}
Annotation declaration for Override just gets compiled to,
interface java.lang.Override extends java.lang.annotation.Annotation{
}
which is nothing more than an interface.
So, how does the java.lang.Override interface help the Java compiler detect the above error at compile time?
The implementation that triggers the compile error doesn't lie in the annotation; it lies in the Java compiler.
If you want to write your own similar annotation processor, you would use the annotation processor API: http://docs.oracle.com/javase/7/docs/api/javax/annotation/processing/Processor.html
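For illustration, here is a minimal sketch of such a processor; the annotation name com.example.MyCheck is made up for this example, and the processor simply reports a compile-time error for every element annotated with it, much as javac fails the build when an @Override check does not hold:
import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.annotation.processing.SupportedSourceVersion;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.Element;
import javax.lang.model.element.TypeElement;
import javax.tools.Diagnostic;

// Registered via META-INF/services/javax.annotation.processing.Processor,
// this runs inside javac and can turn a violated rule into a compile error.
@SupportedAnnotationTypes("com.example.MyCheck")
@SupportedSourceVersion(SourceVersion.RELEASE_7)
public class MyCheckProcessor extends AbstractProcessor {

    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        for (TypeElement annotation : annotations) {
            for (Element element : roundEnv.getElementsAnnotatedWith(annotation)) {
                // Reporting Kind.ERROR makes the compilation fail, just like
                // the built-in @Override check does.
                processingEnv.getMessager().printMessage(
                        Diagnostic.Kind.ERROR, "constraint violated here", element);
            }
        }
        return true;
    }
}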
which is nothing more than an interface. So, how does the java.lang.Override interface help the Java compiler detect the above error at compile time?
That's right. Override is nothing more than an interface. The actual work is done by the Java compiler. How the compiler does this is not specified.
Here are some links that explain how to work with an AnnotationProcessor to implement something similar to #Override :
Processor Java doc
Java annotation processing tool
Code generation using AnnotationProcessor
Annotation Processor, generating a compiler error
Source code analysis using Java 6 API
Playing with Java annotation processing
I am reading 'Java Generics and Collections' section 8.4. The author defines the following code while trying to explain Binary Compatibility:
interface Name extends Comparable {
    public int compareTo(Object o);
}

class SimpleName implements Name {
    private String base;

    public SimpleName(String base) {
        this.base = base;
    }

    public int compareTo(Object o) {
        return base.compareTo(((SimpleName) o).base);
    }
}
class ExtendedName extends SimpleName {
    private String ext;

    public ExtendedName(String base, String ext) {
        super(base);
        this.ext = ext;
    }

    public int compareTo(Object o) {
        int c = super.compareTo(o);
        if (c == 0 && o instanceof ExtendedName)
            return ext.compareTo(((ExtendedName) o).ext);
        else
            return c;
    }
}
class Client {
    public static void main(String[] args) {
        Name m = new ExtendedName("a", "b");
        Name n = new ExtendedName("a", "c");
        assert m.compareTo(n) < 0;
    }
}
and then talks about making the Name interface and SimpleName class generic and leaving the ExtendedName as is. As a result the new code is:
interface Name extends Comparable<Name> {
    public int compareTo(Name o);
}

class SimpleName implements Name {
    private String base;

    public SimpleName(String base) {
        this.base = base;
    }

    public int compareTo(Name o) {
        return base.compareTo(((SimpleName) o).base);
    }
}
// use legacy class file for ExtendedName
class Test {
    public static void main(String[] args) {
        Name m = new ExtendedName("a", "b");
        Name n = new ExtendedName("a", "c");
        assert m.compareTo(n) == 0; // answer is now different!
    }
}
The author describes the result of such an action as follows:
Say that we generify Name and SimpleName so that they define compareTo(Name), but that we do not have the source for ExtendedName. Since it defines only compareTo(Object), client code that calls compareTo(Name) rather than compareTo(Object) will invoke the method on SimpleName (where it is defined) rather than ExtendedName (where it is not defined), so the base names will be compared but the extensions ignored.
However, when I make only Name and SimpleName generic, I get a compile-time error instead of what the author describes above. The error is:
name clash: compareTo(Object) in NameHalfMovedToGenerics.ExtendedName and compareTo(T) in Comparable have the same erasure, yet neither overrides the other
This is not the first time I have faced such an issue: earlier, while reading the Sun documentation on erasure, I ran into a similar situation where my code did not show the same result as described by the author.
Have I made a mistake in understanding what the author is trying to say?
Any help will be much appreciated.
Thanks in advance.
This is an example of a problem that can occur under separate compilation.
The main subtlety with separate compilation is that, when a caller class is compiled, certain information is copied from the callee into the caller's class file. If the caller is later run against a different version of the callee, the information copied from the old version of the callee might not match exactly the new version of the callee, and the results might be different. This is very hard to see by just looking at source code. This example shows how the behavior of a program can change in a surprising way when such a modification is made.
In the example, Name and SimpleName were modified and recompiled, but the old, compiled binary of ExtendedName is still used. That's really what is meant by "the source code for ExtendedName is not available." When a program is compiled against the modified class hierarchy, it records different information than it would have if it were compiled against the old hierarchy.
Let me run through the steps I performed to reproduce this example.
In an empty directory, I created two subdirectories v1 and v2. In v1 I put the classes from the first example code block into separate files Name.java, SimpleName.java, and ExtendedName.java.
Note that I'm not using the v1 and v2 directories as packages. All these files are in the unnamed package. Also, I'm using separate files, since if they're all nested classes it's hard to recompile some of them separately, which is necessary for the example to work.
In addition I renamed the main program to Test1.java and modified it as follows:
class Test1 {
    public static void main(String[] args) {
        Name m = new ExtendedName("a", "b");
        Name n = new ExtendedName("a", "c");
        System.out.println(m.compareTo(n));
    }
}
In v1 I compiled everything and ran Test1:
$ ls
ExtendedName.java Name.java SimpleName.java Test1.java
$ java -version
java version "1.7.0_45"
Java(TM) SE Runtime Environment (build 1.7.0_45-b18)
Java HotSpot(TM) 64-Bit Server VM (build 24.45-b08, mixed mode)
$ javac *.java
$ java Test1
-1
Now, in v2 I placed the Name.java and SimpleName.java files, modified using generics as shown in the second example code block. I also copied in v1/Test1.java to v2/Test2.java and renamed the class accordingly, but otherwise the code is the same.
$ ls
Name.java SimpleName.java Test2.java
$ javac -cp ../v1 *.java
$ java -cp .:../v1 Test2
0
This shows that the result of m.compareTo(n) is different after Name and SimpleName were modified, while using the old ExtendedName binary. What happened?
We can see the difference by looking at the disassembled output from the Test1 class (compiled against the old classes) and the Test2 class (compiled against the new classes) to see what bytecode is generated for the m.compareTo(n) call. Still in v2:
$ javap -c -cp ../v1 Test1
...
29: invokeinterface #8, 2 // InterfaceMethod Name.compareTo:(Ljava/lang/Object;)I
...
$ javap -c Test2
...
29: invokeinterface #8, 2 // InterfaceMethod Name.compareTo:(LName;)I
...
When compiling Test1, the information copied into the Test1.class file is a call to compareTo(Object) because that's the method the Name interface has at this point. With the modified classes, compiling Test2 results in bytecode that calls compareTo(Name) since that's what the modified Name interface now has. When Test2 runs, it looks for the compareTo(Name) method and thus bypasses the compareTo(Object) method in the ExtendedName class, calling SimpleName.compareTo(Name) instead. That's why the behavior differs.
Note that the behavior of the old Test1 binary does not change:
$ java -cp .:../v1 Test1
-1
But if Test1.java were recompiled against the new class hierarchy, its behavior would change. That's essentially what Test2.java is, but with a different name so that we can easily see the difference between running an old binary and a recompiled version.
I understand source compatibility and how to easily show an example that breaks it (changing a method's name, removing a method, etc.), but I am having a bit of a problem seeing how binary compatibility can be broken in practice. Does anyone have a simple example where source compatibility is preserved but binary compatibility breaks, i.e. no code changes are required but recompilation is necessary?
One example (and this is by no means the only one) would be if the signature of a method in a library changes in a source-compatible way. For example, consider:
// Library.java v1
public class Library {
    public static void print(String foo) {
        System.out.println(foo);
    }
}
// Client.java v1
public class Client {
    public static void main(String[] args) {
        Library.print("hello");
    }
}
Compile and run:
$ javac Client.java Library.java
$ java Client
hello
Now change Library.java - note the type of the foo parameter:
// Library.java v2
public class Library {
    public static void print(Object foo) {
        System.out.println(foo);
    }
}
Just recompile Library.java and try to rerun Client:
$ javac Library.java
$ java Client
Exception in thread "main" java.lang.NoSuchMethodError: Library.print(Ljava/lang/String;)V
at Client.main(Client.java:3)
First, we need to understand both kinds of compatibility.
Source compatibility - a program is source compatible with a new version of code (a library or API) if it can be compiled against that new version.
Binary compatibility - a program is binary compatible with a new version of code if it can be linked against that new version without recompilation.
The following link has more examples of "source compatible but binary incompatible" changes (a sketch of the first one appears below):
Specialising Return Types
Generalising Parameter Types
Primitive vs Wrapper Types
Read http://praitheesh.blogspot.com.au/2014/09/compatibility-and-api-evolution-in-java.html for more details.
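As a hedged sketch of the first item ("Specialising Return Types"), with made-up Box and Caller classes, assume the two Box versions come from separate library releases:
// Box.java v1
public class Box {
    public Object get() {
        return "v1";
    }
}

// Caller.java, compiled against v1; the call is recorded in its bytecode
// with the descriptor Box.get:()Ljava/lang/Object;
public class Caller {
    public static void main(String[] args) {
        Object value = new Box().get();
        System.out.println(value);
    }
}

// Box.java v2: the return type is specialised to String. Caller.java still
// compiles unchanged (source compatible), but the old Caller.class fails at
// run time with NoSuchMethodError, because no method with the descriptor
// get:()Ljava/lang/Object; exists any more (binary incompatible).
public class Box {
    public String get() {
        return "v2";
    }
}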
Another case is implementing or importing an interface that holds String constants (an anti-pattern in Java). The using class copies the constant values into its own constant pool and uses them directly, so no runtime dependency on the interface remains. When the string value of a constant in the interface is later changed, the compiler does not see that it needs to recompile the class, which keeps using the old value. The program still runs, but its behaviour is wrong - it uses the stale value.
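A minimal sketch of that pitfall, using made-up Config and App names:
// Config.java: an interface holding a compile-time constant (the anti-pattern).
interface Config {
    String VERSION = "1.0";
}

// App.java: the value "1.0" is copied into App.class at compile time, so if
// Config later changes VERSION to "2.0" and only Config.java is recompiled,
// running App still prints the stale "1.0" until App itself is recompiled.
public class App implements Config {
    public static void main(String[] args) {
        System.out.println(VERSION);
    }
}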
An example I encountered:
public class Class1 {
    public void doIt() {
        System.out.println("do!");
    }
}
Client part:
public class Class2 {
    public void callDo() {
        Class1 c = new Class1();
        c.doIt();
    }
}
Now you change the return type of the doIt method:
public class Class1 {
    public String doIt() {
        System.out.println("do!");
        return "done!";
    }
}
If you run the client code without recompiling it, you will get a NoSuchMethodError, because the method signature (and hence its binary descriptor) has changed.
I've run into an issue with Java generics in which the same code compiles and works fine in Java 6, but fails to compile in Java 5 because of a same-erasure name clash. I have a file TestErasure.java that has an overloaded method, called "method":
import java.util.ArrayList;
import java.util.List;
public class TestErasure {
    public static Object method(List<Object> list) {
        System.out.println("method(List<Object> list)");
        return null;
    }

    public static String method(List<String> list) {
        System.out.println("method(List<String> list)");
        return null;
    }

    public static void main(String[] args) {
        method(new ArrayList<Object>());
        method(new ArrayList<String>());
    }
}
In Java 5, I get the expected compilation error, stating that the erasure of "method" is the same:
$ javac -version
javac 1.5.0_19
$ javac TestErasure.java
TestErasure.java:10: name clash: method(java.util.List<java.lang.String>) and method(java.util.List<java.lang.Object>) have the same erasure
public static String method(List<String> list) {
^
TestErasure.java:17: method(java.util.List<java.lang.Object>) in TestErasure cannot be applied to (java.util.ArrayList<java.lang.String>)
method(new ArrayList<String>());
^
2 errors
However, Java 6 is able to compile and run this same code.
$ javac -version
javac 1.6.0_16
$ javac TestErasure.java
$ java TestErasure
method(List<Object> list)
method(List<String> list)
Based upon my current understanding of erasure (thanks to Jon Skeet and Angelika Langer), I actually expected the compilation error thrown by Java 5 (unless something changed in how Java handles generics, which I cannot find in the Java 6 release notes). In fact, if I modify the return type of one of the overloaded methods:
public static Object method(List<Object> list) ...
public static Object method(List<String> list) ...
Java 6 also fails to compile because of the same erasures:
$ javac TestErasure.java
TestErasure.java:5: name clash: method(java.util.List<java.lang.Object>) and method(java.util.List<java.lang.String>) have the same erasure
public static Object method(List<Object> list) {
^
TestErasure.java:10: name clash: method(java.util.List<java.lang.String>) and method(java.util.List<java.lang.Object>) have the same erasure
public static Object method(List<String> list) {
^
2 errors
It appears as if the return type in Java 6 somehow influences the selection of which overloaded method to use.
Can someone shed light on why the first example works in Java 6? It seems to go against the stated handling of overloaded generic methods.
More info:
Per David's suggestion, the original example, compiled with javac 1.6, will run under Java 1.5:
$ javac -target 1.5 TestErasure.java
$ java -version
java version "1.5.0_19"
$ java TestErasure
method(List<Object> list)
method(List<String> list)
Found these bugs on Sun's bug tracker, which I think are what you're describing:
http://bugs.sun.com/view_bug.do?bug_id=6182950
http://bugs.sun.com/view_bug.do?bug_id=6730568