Why do we need the bounded wildcard <? extends T> in the Collections.max() method - java

I've read the awesome "Effective Java" by Joshua Bloch, but one example in the book is left unclear to me. It's taken from the chapter about generics; the exact item is "Item 28: Use bounded wildcards to increase API flexibility".
This item shows how to write the most universal and bulletproof (from the type-system point of view) version of an algorithm that selects the maximum element from a collection, using bounded type parameters and bounded wildcard types.
The final signature of the static method written looks like this:
public static <T extends Comparable<? super T>> T max(List<? extends T> list)
And it's mostly the same as the signature of the Collections#max function from the standard library.
public static <T extends Object & Comparable<? super T>> T max(Collection<? extends T> coll)
I understand why we need the bounded wildcard in the T extends Comparable<? super T> type constraint, but is it really necessary in the type of the argument? It seems to me that it would be the same if we left just List<T> or Collection<T>, wouldn't it? I mean something like this:
public static <T extends Comparable<? super T>> T wrongMin(Collection<T> xs)
I've written the following silly example using both signatures and don't see any difference:
public class Algorithms {
public static class ColoredPoint extends Point {
public final Color color;
public ColoredPoint(int x, int y, Color color) {
super(x, y);
this.color = color;
}
@Override
public String toString() {
return String.format("ColoredPoint(x=%d, y=%d, color=%s)", x, y, color);
}
}
public static class Point implements Comparable<Point> {
public final int x, y;
public Point(int x, int y) {
this.x = x;
this.y = y;
}
@Override
public String toString() {
return String.format("Point(x=%d, y=%d)", x, y);
}
@Override
public int compareTo(Point p) {
return x != p.x ? x - p.x : y - p.y;
}
}
public static <T extends Comparable<? super T>> T min(Collection<? extends T> xs) {
Iterator<? extends T> iter = xs.iterator();
if (!iter.hasNext()) {
throw new IllegalArgumentException("Collection is empty");
}
T minElem = iter.next();
while (iter.hasNext()) {
T elem = iter.next();
if (elem.compareTo(minElem) < 0) {
minElem = elem;
}
}
return minElem;
}
public static <T extends Comparable<? super T>> T wrongMin(Collection<T> xs) {
return min(xs);
}
public static void main(String[] args) {
List<ColoredPoint> points = Arrays.asList(
new ColoredPoint(1, 2, Color.BLACK),
new ColoredPoint(0, 2, Color.BLUE),
new ColoredPoint(0, -1, Color.RED)
);
Point p1 = wrongMin(points);
Point p2 = min(points);
System.out.println("Minimum element is " + p1);
}
}
So can you suggest an example where such a simplified signature would be unacceptable?
P.S. And why the heck is there T extends Object in the official implementation?
Answer
Well, thanks to @Bohemian I've managed to figure out the difference between them.
Consider the following two auxiliary methods:
private static void expectsPointOrColoredPoint(Point p) {
System.out.println("Overloaded for Point");
}
private static void expectsPointOrColoredPoint(ColoredPoint p) {
System.out.println("Overloaded for ColoredPoint");
}
Sure, it's not very smart to overload a method both for a superclass and its subclass, but it lets us see which return type was actually inferred (points is the List<ColoredPoint> from before).
expectsPointOrColoredPoint(min(points)); // prints "Overloaded for ColoredPoint"
expectsPointOrColoredPoint(wrongMin(points)); // prints "Overloaded for ColoredPoint"
For both methods the inferred type was ColoredPoint.
Sometimes you want to be explicit about the type passed to an overloaded function. You can do that in a couple of ways:
You can cast:
expectsPointOrColoredPoint((Point) min(points)); // prints "Overloaded for Point"
expectsPointOrColoredPoint((Point) wrongMin(points)); // prints "Overloaded for Point"
Still no difference...
Or you can tell the compiler what type should be inferred, using the ClassName.<Type>method(...) syntax:
expectsPointOrColoredPoint(Algorithms.<Point>min(points)); // prints "Overloaded for Point"
expectsPointOrColoredPoint(Algorithms.<Point>wrongMin(points)); // will not compile
Aha! Here is the answer. A List<ColoredPoint> can't be passed to a method expecting a Collection<Point> because generics are not covariant (unlike arrays), but it can be passed to a method expecting a Collection<? extends Point>.
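A minimal sketch of that point, reusing Point and ColoredPoint from above (the helper names here are made up purely for illustration):
static Point firstInvariant(Collection<Point> xs) { return xs.iterator().next(); }
static Point firstCovariant(Collection<? extends Point> xs) { return xs.iterator().next(); }

List<ColoredPoint> points = Arrays.asList(new ColoredPoint(1, 2, Color.BLACK));
// firstInvariant(points);  // does not compile: List<ColoredPoint> is not a Collection<Point>
firstCovariant(points);     // compiles: the wildcard accepts a collection of any subtype of Point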
I'm not sure who might prefer to use an explicit type parameter in such a case, but at least it shows where wrongMin can be inappropriate.
And thanks to @erickson and @tom-hawtin-tackline for the answers about the purpose of the T extends Object constraint.

The difference is in the type returned, especially as influenced by inference: the inferred type may be a type hierarchically between the Comparable type and the List's element type. Let me give an example:
class Top {
}
class Middle extends Top implements Comparable<Top> {
@Override
public int compareTo(Top o) {
return 0; // the actual ordering is irrelevant for this example
}
}
class Bottom extends Middle {
}
Using the signature you've provided:
public static <T extends Comparable<? super T>> T max(List<? extends T> list)
we can code this without errors, warnings or (importantly) casts:
List<Bottom> list = new ArrayList<>();
Middle max = max(list); // T inferred to be Middle
And if you need a Middle result, without inference, you can explicitly type the call to Middle:
Comparable<Top> max = MyClass.<Middle>max(list); // No cast
or to pass to a method that accepts Middle (where inference won't work)
someGenericMethodThatExpectsGenericBoundedToMiddle(MyClass.<Middle>max(list));
I don't know if this helps, but to illustrate the types the compiler has allowed/inferred, the signature would look like this (not that this compiles, of course):
public static <Middle extends Comparable<Top>> Middle max(List<Bottom> list)

The difference between
T max(Collection<? extends T> coll)
and
T wrongMax(Collection<T> xs)
is that the return type of the second version is exactly the collection's element type T, while in the first version T can be a supertype of the element type.
The second question: the T extends Object part is there for binary compatibility. Erasure replaces a type variable with its leftmost bound, so the Object & ... bound keeps the erased return type of max as Object, exactly like the pre-generics Collections.max; without it the erased return type would be Comparable, which would break code compiled against the old signature.
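If you want to see the erased signature for yourself, here is a small reflection sketch (nothing below is specific to this question; it just prints what ends up in the class file):
import java.lang.reflect.Method;
import java.util.Collection;
import java.util.Collections;

public class ErasureDemo {
    public static void main(String[] args) throws Exception {
        Method m = Collections.class.getMethod("max", Collection.class);
        System.out.println(m.getReturnType());        // class java.lang.Object (erased return type)
        System.out.println(m.getGenericReturnType()); // T (the declared type variable)
    }
}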
Update: A slightly more "natural" demonstration of the difference: Suppose you define these two methods:
static void happy(ColoredPoint p, Point q) {}
static void happy(Point p, ColoredPoint q) {}
And call them like this:
happy(coloredPoint, min(points));
happy(coloredPoint, wrongMin(points));
A sufficiently smart type inference engine could deduce that in the first call the return type of min should be Point, and the code would compile. The second call would still fail to compile, since the call to happy is ambiguous.
Unfortunately the type inference engine isn't powerful enough, at least in Java 7, so in reality both calls fail to compile. The difference is that the first call can be fixed by specifying the type parameter, as in Algorithms.<Point>min, while fixing the second call requires an explicit cast.
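Spelled out, the two workarounds look like this (assuming a coloredPoint variable and the classes from the question):
// Wildcard version: pin T explicitly, no cast needed.
happy(coloredPoint, Algorithms.<Point>min(points));   // resolves to happy(ColoredPoint, Point)

// wrongMin version: T can only be ColoredPoint here, so an explicit cast is the only way out.
happy(coloredPoint, (Point) wrongMin(points));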

Not an easy one, but I'll try to be as specific as possible:
to T max(Collection<? extends T> coll)
you can pass an argument like List<Animal>, List<Cat> or List<Dog>,
whereas to T wrongMax(Collection<T> xs),
where T is Animal, you can't pass
List<Dog> or List<Cat>. Of course at runtime you could add Cat or Dog objects to a List<Animal>, but at compile time you wouldn't be able to pass a subclass of Animal as the element type of the list given to the wrongMax method; with the max method, on the other hand, you can (see the sketch below). Sorry for my English, I'm still learning it :). Regards.
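A short sketch of that point, with hypothetical Animal/Dog classes and max/wrongMax assumed to live in a class called Algorithms:
class Animal implements Comparable<Animal> {
    public int compareTo(Animal other) { return 0; } // the ordering itself is irrelevant here
}
class Dog extends Animal {}

List<Dog> dogs = new ArrayList<>();
Animal a1 = Algorithms.<Animal>max(dogs);         // OK: Collection<? extends Animal> accepts a List<Dog>
// Animal a2 = Algorithms.<Animal>wrongMax(dogs); // does not compile: wrongMax needs a Collection<Animal>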

Related

Is it possible to use the provided Java Collections methods, like max, min, sort, etc., on a Stack?

I was working on a problem involving stacks on HackerRank (see here). One part of the question asked for the max value within the stack. I thought an easy way to do this was to write an extended Stack class with a max() method (see below). That worked, but I thought an even easier way might be to take advantage of Java's Collections methods. So I built the craftyMax() method (also seen below).
class MyStack<T> extends Stack<T> {
public T craftyMax() {
return Collections.max(this);
}
public T max() {
Integer max = Integer.MIN_VALUE;
for (T item: this) {
max = Math.max((Integer)item, max);
}
return (T) max;
}
}
Of course this did not work as the compiler replied with:
Solution.java:6: error: no suitable method found for max(MyStack<T#1>)
return Collections.max(this);
^
method Collections.<T#2>max(Collection<? extends T#2>) is not applicable
(inferred type does not conform to upper bound(s)
inferred: T#1
upper bound(s): Comparable<? super T#1>,Object)
method Collections.<T#3>max(Collection<? extends T#3>,Comparator<? super T#3>) is not applicable
(cannot infer type-variable(s) T#3
(actual and formal argument lists differ in length))
where T#1,T#2,T#3 are type-variables:
T#1 extends Object declared in class MyStack
T#2 extends Object,Comparable<? super T#2> declared in method <T#2>max(Collection<? extends T#2>)
T#3 extends Object declared in method <T#3>max(Collection<? extends T#3>,Comparator<? super T#3>)
Note: Solution.java uses unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
1 error
Since then, I have tried a few different things and looked around, but I can't seem to find out whether what I am trying to do here is possible. So my question is:
Is it possible to use the provided Java Collections methods, like max, min, sort, etc., on / within a Stack? Or am I expecting a little too much?
public static <T extends Object & Comparable<? super T>> T max(Collection<? extends T> coll)
only works for Collections whose element type implements the Comparable interface.
Therefore, your code will work with the proper type bound:
class MyStack<T extends Comparable<T>> extends Stack<T> {
public T craftyMax() {
return Collections.max(this);
}
}
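With that bound in place, a quick usage check (Integer chosen just for illustration):
MyStack<Integer> stack = new MyStack<>();
stack.push(3);
stack.push(7);
stack.push(5);
System.out.println(stack.craftyMax()); // prints 7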
I'm not sure about your second method (max()), though. You are casting T to Integer. If you are certain T is an Integer, why not define MyStack as class MyStack extends Stack<Integer>?
You can only call Collections.max with a single argument on types that implement the Comparable interface. Change your type declaration to
class MyStack<T extends Comparable<T>> extends Stack<T> {
and it should work. Or, as an alternative, you can use the two-argument max that takes a Comparator as its second argument.
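For example (a sketch using Java 8's Comparator.naturalOrder(); any Comparator<Integer> would do):
Stack<Integer> stack = new Stack<>();
stack.push(3);
stack.push(7);
stack.push(5);
// The two-argument overload does not require the element type to be Comparable.
Integer max = Collections.max(stack, Comparator.naturalOrder());
System.out.println(max); // 7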
You can also pass your own Comparator to get the behaviour you want; for example, reversing the comparison and calling Collections.min on your stack instance (myStack below) returns the maximum:
if (!myStack.isEmpty()) {
Integer max = Collections.min(myStack, new Comparator<Integer>() {
@Override
public int compare(Integer o1, Integer o2) {
return o2.compareTo(o1);
}
});
}

Combining <? extends ClassE> and <? super ClassB>

I understand how ? extends .. and ? super .. work on their own and which generic types are possible, but I just can't understand how the following is possible with this hierarchy:
-> means extends
Classes are X (lowest), A to E (highest)
Interface is F
X -> A (implements F) -> B -> C -> E (implements F)
also D -> E
public class Node<T extends ClassE> {
private T info;
public T getInfo() {
return info;
}
public void setInfo(T info) {
this.info = info;
}
}
public static void main (String [] args){
Node<? super ClassB> n2 = new Node<ClassC>();
// this makes sense, since Node accepts below E and above B
InterfaceF i2 = n2.getInfo();
// how? Not only outside of <? extends E> but also getting value even though
// <? super B> is defined above, what's up with PECS?
n2.setInfo(new ClassX());
// also.. how? I'm setting a class that's out of the allowed range +
// seemingly violating the PECS for <? extends E>
}
As you can see, I'm totally confused when it comes to combining them and it's quite surprising for me having those declarations pass the compiler without problems.
I read somewhere that a combination of both bounds isn't possible in Java, but how does that work then?
The first line InterfaceF i2 = n2.getInfo(); compiles because the lower-bounded wildcard still retains the bound on the type variable itself. Since the type variable has an upper bound ClassE, getInfo() still returns a ClassE. Since ClassE implements InterfaceF, the assignment compiles.
In other words, we could imagine that when you did Node<? super ClassB> you implicitly actually did something like (made-up syntax) Node<? extends ClassE & super ClassB>. The type argument to the Node is both a supertype of ClassB and a subtype of ClassE.
This is similar to how Node<?> is implicitly the same as Node<? extends ClassE>.
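In code, with the same Node and hierarchy as in the question:
Node<?> any = new Node<ClassC>();
InterfaceF f = any.getInfo();  // compiles: the captured type argument is still bounded above by ClassE
// any.setInfo(new ClassX());  // does not compile: nothing (except null) can be passed through Node<?>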
The way this is actually specified is a little bit complicated, but it's in capture conversion. Capture conversion is the process whereby the compiler takes a type with wildcards and treats it as if it was a type without wildcards, for the purpose of determining subtyping.
Let G name a generic type declaration with n type parameters A1,...,An with corresponding bounds U1,...,Un.
There exists a capture conversion from a parameterized type G<T1,...,Tn> to a parameterized type G<S1,...,Sn>, where, for 1 ≤ i ≤ n:
[...]
If Ti is a wildcard type argument of the form ? super Bi, then Si is a fresh type variable whose upper bound is Ui[A1:=S1,...,An:=Sn] and whose lower bound is Bi.
In other words, Si (the type argument after capture conversion corresponding to ? super ClassB) gains its lower bound from the bound of the wildcard and its upper bound from the bound of the type variable declaration.
The second line n2.setInfo(new ClassX()); compiles because ClassX is a subclass of ClassB, so it's implicitly convertible to it. We could imagine that n2 was a Node<ClassB>, and then it might be more obvious why this line compiles:
Node<ClassB> n2 = ...;
n2.setInfo(new ClassX());
setInfo accepts ClassB as well as any subtype of ClassB.
Also, with respect to this:
I read somewhere that a combination of both bounds isn't possible in Java, but how does that work then?
The type system in the compiler does a lot of things that we can't explicitly do ourselves. Another good example (although unrelated) is type inference with anonymous classes:
int num = Objects.requireNonNull(new Object() {int num = 42;}).num;
System.out.println(num); // 42
It compiles because type inference is allowed to infer that the type argument to T of requireNonNull is the anonymous object type, even though we could never provide that type as an explicit type argument ourselves.
Disclaimer: I do not know the concrete type inference rules, but I'll explain to the best of my understanding.
Regarding InterfaceF i2 = n2.getInfo(): since Node<T extends ClassE>, it is assured that Node.getInfo() returns something that extends ClassE. According to your description, ClassE implements InterfaceF, so anything that extends ClassE also implements InterfaceF. Therefore n2.getInfo() gives you something that implements InterfaceF, and the assignment is fine.
Regarding n2.setInfo(new ClassX()): I do not have a formal explanation like the one above, but let's think about it. Node<? super ClassB> tells everyone to expect at most ClassB as the lowest type of the Node's content. However, since ClassX transitively inherits from ClassB, passing it is perfectly valid: it satisfies all the guarantees posed by ? super ClassB.
Hope this helps!
public class Test {
public interface F {}
public static class E implements F {}
public static class D extends E {}
public static class C extends E {}
public static class B extends C {}
public static class A extends B implements F {}
public static class X extends A {}
public static class Node<T extends E> {
private T info;
public T getInfo() {return info;}
public void setInfo(T info) {this.info = info;}
}
public static void main(String[] args) {
Node<? super B> n = new Node<C>(); // C is superclass of B = OK
F i = n.getInfo(); // the node's element type is B, C or E; all of these implement F (since E implements F) = OK
n.setInfo(new X()); // X has supertypes A, B, C, E, so it can be passed where <? super B> is expected = OK
}
}

Java generics - compilation error [duplicate]

This question already has an answer here:
Java generic collection, cannot add list to list
(1 answer)
Closed 5 years ago.
Why doesn't it work?
class W<A extends Number>
{
public void create(A value)
{
}
}
public void calculate(W<? extends Number> w)
{
Integer v = 5;
w.create(v); // Compilation error
}
Could somebody explain what is wrong with that code and how to fix it?
Compilation error : "create (capture) in W cannot be applied to (java.lang.Integer)"
You have a common misconception about wildcards in generics. You think that ? means you can pass any type (as long as it extends Number), but you cannot.
? extends Number does not mean that you can pass any type that extends Number. It means "a specific, but unknown type that extends Number". Since the exact type is unknown to the compiler, it cannot let you pass an Integer: at that point the compiler simply does not have enough information to check whether that would be type-safe. (It doesn't know whether the actual type argument of the W that you pass to the method is Integer or some other type that extends Number.)
This is exactly the same reason as why you cannot add anything to a List<?>, for example, which is a question that people often ask.
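For comparison, the List<?> case (a minimal sketch):
List<Integer> ints = new ArrayList<>();
ints.add(1);
List<?> unknown = ints;     // fine: a List<Integer> is a List<?>
Object o = unknown.get(0);  // reading as Object is always allowed
// unknown.add(2);          // does not compile: the element type is unknown to the compiler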
This follows straight from the core generics principle: when you use ? extends SomeType, you cannot pass/consume anything through that reference, as doing so might violate the type-safety guarantee that generics provide when retrieving those items. Otherwise one would be able to do this:
List<Long> longs = Arrays.asList(5L, 10L, 20L);
List<? extends Number> numbers = longs;
// Trouble!
numbers.add(10.5);
See one of my blog posts here for details on how subtyping works with Generics.
You can declare a bound on the create method itself:
public <T extends Number> void create(final T value) {
}
Example:
class W<A extends Number> {
public <T extends Number> void create(final T value) {
}
public void calculate(final W<? extends Number> w) {
final Integer v = 5;
w.create(v); // compiles now: the method's own T is inferred as Integer
}
}
You have to declare what kind of W you're using before you use it
public void calculate(W<? extends Number> w) {
Integer v = 5;
new W<Integer>().create(v);
}
or you can change the create method to accept any Number:
public void create(Number value) {
}
Both type arguments must extend Number, but that does NOT mean they cannot be siblings:
class ClassA extends Number {..}
class ClassB extends Number {..}
ClassA cannot be cast to ClassB, and this is what could happen in your code. The actual type argument might well be ClassA, but the compiler cannot be sure of that; it could just as well be ClassB or any other class extending Number, so it reports a compile error.
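Concretely, with the W class from the question (original create(A value) method), this is the situation the compiler has to guard against:
W<Double> doubles = new W<Double>();
W<? extends Number> w = doubles;  // fine: Double extends Number
Integer v = 5;
// w.create(v);                   // rejected: the actual type argument might be Double, as it is here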

Java Generics. Why does it compile?

abstract class Type<K extends Number> {
abstract <K> void use1(Type<K> k); // Compiler error (Type parameter K is not within its bounds)
abstract <K> void use2(Type<? extends K> k); // fine
abstract <K> void use3(Type<? super K> k); // fine
}
The method's generic type K shadows the class's generic type K, so <K> doesn't match <K extends Number> in use1(). The compiler doesn't know anything useful about the new generic type <K> in use2() and use3(), but those declarations are still legal to compile. Why do <? extends K> (and <? super K>) match <K extends Number>?
The problem you have is that there are two K types. It may be clearer if you rename one:
abstract class Type<N extends Number> {
abstract <K extends Number> void use1(Type<K> k); // fine
abstract <K> void use2(Type<? extends K> k); // fine
abstract <K> void use3(Type<? super K> k); // fine
}
There are cases where you have to provide duplicate information the compiler could infer, and other places where you don't. Java 7 added the <> diamond notation to tell the compiler to infer types it previously didn't.
To illustrate what I mean, here are different ways to create an instance of a generic class. Some require the type to be given twice, others only once; the compiler can infer the type.
In general, Java doesn't infer types in places where most other languages would.
class Type<N extends Number> {
private final Class<N> nClass;
Type(Class<N> nClass) {
this.nClass = nClass;
}
static <N extends Number> Type<N> create(Class<N> nClass) {
return new Type<N>(nClass);
}
static void main(String... args) {
// N type is required.
Type<Integer> t1 = new Type<Integer>(Integer.class);
// N type inferred in Java 7.
Type<Integer> t2 = new Type<>(Integer.class);
// type is optional
Type<Integer> t3 = Type.<Integer>create(Integer.class);
// type is inferred
Type<Integer> t4 = create(Integer.class);
}
}
When you define the method like this:
abstract <K> void use1(Type<K> k);
You're effectively hiding the type K in your class definition. You should be able to define the methods like this:
abstract void use1(Type<K> k);
First of all, let's rewrite it to avoid shadowing:
abstract class Type<N extends Number> {
abstract <K> void use1(Type<K> k);
abstract <K> void use2(Type<? extends K> k);
abstract <K> void use3(Type<? super K> k);
}
In the first method, K acts as a type argument of Type<N extends Number>, thus its value should comply with the bound on Type's N. However, the method declaration doesn't put any restriction on K, therefore it's not legal. It would become legal if you added the necessary restriction on K:
abstract <K extends Number> void use1(Type<K> k);
In the following methods, the actual type parameter of Type is unknown (?), and K only imposes an additional bound on it, so there is nothing illegal in these declarations.
Here is a more practical example with the similar declarations:
class MyList<N extends Number> extends ArrayList<N> {}
<K> void add1(MyList<K> a, K b) {
a.add(b); // Given the method declaration, this line is legal, but it
// violates type safety, since object of an arbitrary type K can be
// added to a list that expects Numbers
// Thus, declaration of this method is illegal
}
<K> void add2(MyList<? extends K> a, K b) {
// a.add(b) would be illegal inside this method, so that there is no way
// to violate type safety here, therefore declaration of this method is legal
}
<K> void add3(MyList<? super K> a, K b) {
a.add(b); // This line is legal, but it cannot violate type safety, since
// you cannot pass a list that doesn't expect K into this method
}
This is a gray area; javac 7 and 6 disagree; JLS3 is outdated; no idea where the new spec is.

Java generic method boundaries

What is the difference between the following 2 lines?
public static <T extends Comparable<? super T>> int methodX(List<T> data)
public static <T> int methodX(List<? extends Comparable<? super T>> data)
Your first option is a "stricter" parametrisation. Meaning, you're declaring the type parameter T with a bunch of restrictions and then using it later on with List. In your second method, the type parameter T is generic with no conditions, and the List's element type is defined in terms of T.
The second way is syntactically different as well, with a ? instead of the first option's T, because in the parameter definition you aren't declaring the type parameter T but rather using it, so the second method cannot be as specific.
The practical difference that comes out of this is one of inheritance. Your first method needs a type that is comparable to itself or to a superclass of itself, whereas in the second method the elements need only be comparable to some unrelated T:
public class Person implements Comparable<Number> {
@Override
public int compareTo(Number o) {
return 0;
}
public static <T extends Comparable<? super T>> int methodX(List<T> data) {
return 0;
}
public static <T> int methodY(List<? extends Comparable<? super T>> data) {
return 0;
}
public static void main(String[] args) {
methodX(new ArrayList<Person>()); // stricter ==> compilation error
Person.<Number>methodY(new ArrayList<Person>()); // compiles: Comparable<Number> matches Comparable<? super Number>
}
}
If you change the Comparable of Person so that it compares against Object or Person (something in the inheritance tree of the class itself), then methodX will also work.
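For example, with the usual self-comparison that the bound expects, methodX compiles as well (only the implements clause changes):
public class Person implements Comparable<Person> {
    @Override
    public int compareTo(Person o) {
        return 0;
    }
}

// methodX(new ArrayList<Person>());  // now compiles: Person is a Comparable<? super Person>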
To the callers, the 2nd version is roughly equivalent to
public static <T, X extends Comparable<? super T>> int methodX(List<X> data)
Suppose a caller calls it with an arg whose concrete type is List<Foo>. Type inference will conclude that X=Foo. Then we get a new constraint on T from X's bound
=>
Foo <: Comparable<? super T>
( A <: B means A is a subtype of B)
If Foo is Comparable at all, it almost certainly implements Comparable<Foo> [2]
=>
Comparable<Foo> <: Comparable<? super T>
=>
T <: Foo
Without further information, inference chooses T=Foo.
Therefore, from the caller's POV, the two versions are not really different.
Inside the method body, the 2nd version does not have access to the type parameter X, which is a synthetic one introduced by capture conversion during compilation [1]. This means you can only read from data. Things like
X x = data.get(0);
data.set(1, x);
are impossible in version#2; No such problem in version #1 with T.
However we can forward #2 to #1
<T1> void method1(List<T1> data){ data.set(...); }
<T2> void method2(List<? extends Comparable<? super T2>> data)
{
method1(data);
}
(they must have different method names; such overloading has not been allowed since Java 7)
This is because, for the compiler, the type of data is really List<X> (it knows the secret X), so there is no problem calling method1(data) after inferring that T1=X.
[1] JLS3, 5.1.10 Capture Conversion
[2] According to the javadoc of Comparable, "This interface imposes a total ordering on the objects of each class that implements it." That means if Foo implements Comparable<W>, W must be Foo or a supertype of Foo. It is quite improbable for a subclass implementation to define a total order among objects of a superclass, so W should almost certainly be Foo; otherwise funny things would happen. The notorious example is Timestamp: its javadoc (now) explains why it can't be compared with its supertype Date.
The first method expects a list of elements that can be compared against their own class or a supertype of it. Say, real numbers can be compared to any kind of number:
class Real extends Number implements Comparable<Number> {
public int compareTo(Number o) ...
}
A bit more restrictive, but still acceptable for your first method is the following:
class Real extends Number implements Comparable<Real> {
public int compareTo(Real o) ...
}
But the second method is actually not very different from this version:
public static int methodY(List<? extends Comparable<?>> data) ...
That is to say, you can replace T with an unnamed wildcard ? because it is used only once in the method signature. The second method does not rely on concepts like "the same class" or "an object's own class".
