I've got a ThingsProvider interface I'm trying to test with Mockito, defined (in simplified form) as follows:
interface ThingsProvider {
    Iterable<? extends Thing> getThings();
}
Now when I'm going to test it with Mockito, I'm doing the following (again, simplified for the question):
ThingsProvider thingsProvider = mock(ThingsProvider.class);
List<Thing> things = Arrays.asList(mock(Thing.class));
when(thingsProvider.getThings()).thenReturn(things); // PROBLEM IS HERE
Compile error message: The method thenReturn(Iterable<capture#11-of ? extends Thing>) in the type OngoingStubbing<Iterable<capture#11-of ? extends Thing>> is not applicable for the arguments (List<Thing>)
Now, purely for getting the test going, I'm changing the last line to
when(thingsProvider.getThings()).thenReturn((List)things); // HAHA take THAT generics!
... but this would clearly be bad to do in non-testing code.
My question(s):
1. Why is this an error? I'm clearly returning objects that extend Thing, which is what the interface expects.
2. Is there a better way of solving this? Perhaps defining my interface differently? I haven't run into this issue outside of testing so far...
On #2 - The main reason I'm not simply returning Iterable<Thing> is that there are several different extensions where the concrete types have things buried in them that return specific subtypes, and I end up with issues like Type mismatch: cannot convert from Iterable<MagicalThing> to Iterable<Thing> - maybe the solution is a better way to fix this issue?
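For concreteness, here is a minimal sketch of the kind of implementation I mean (MagicalThing and MagicalThingsProvider are made-up names, building on the types above):
class MagicalThingsProvider implements ThingsProvider {
    private final Iterable<MagicalThing> magicalThings = new ArrayList<MagicalThing>();

    @Override
    public Iterable<? extends Thing> getThings() {
        // Compiles because Iterable<MagicalThing> is an Iterable<? extends Thing>.
        // With a plain Iterable<Thing> return type, this is exactly where the
        // "cannot convert from Iterable<MagicalThing> to Iterable<Thing>" error appears.
        return magicalThings;
    }
}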
For those of you less familiar with Mockito, a more pure Java version of #1 is below:
public static void main(String...args) {
List<Integer> ints = Arrays.asList(1,2,3);
blah(ints);
Foo<Number> foo1 = new Foo<Number>();
foo1.bar(ints); // This works
Foo<? extends Number> foo2 = new Foo<Number>();
foo2.bar(ints); // NO COMPILEY!
}
private static void blah(List<? extends Number> numberList) {
// something
}
public static class Foo<T> {
public Object bar(List<? extends T> tList) {
return null;
}
}
A wildcard in the return type is very convenient for subclass implementations, as you've observed.
It doesn't make much difference to callers of the method, though; you may change it to Iterable<Thing> if you'd like. That is simpler in the javadoc, at the expense of subclass implementers; a subclass can do a brute cast if necessary, e.g. List<MyThing> => Iterable<Thing>, thanks to erasure.
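A minimal sketch of that brute cast, assuming the interface were changed to return a plain Iterable<Thing> (MyThing and MyThingsProvider are stand-in names):
class MyThingsProvider implements ThingsProvider {  // assumes ThingsProvider now declares Iterable<Thing> getThings()
    private final List<MyThing> myThings = new ArrayList<MyThing>();

    @Override
    @SuppressWarnings("unchecked")
    public Iterable<Thing> getThings() {
        // A direct cast is rejected, but casting through a wildcard is allowed.
        // It is unchecked, yet harmless here: Iterable exposes no way to add elements.
        return (Iterable<Thing>) (Iterable<?>) myThings;
    }
}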
The reason for your problem is wildcard capture: basically, every expression goes through wildcard capture before the enclosing expression is evaluated. In a method invocation
foo( arg )
the argument arg is always wildcard-captured first, before method applicability, overload resolution, and inference are done. In your case the more general type Iterable<? extends Thing> is lost; it becomes Iterable<CAP#>, a capture of the wildcard.
Usually wildcard capture does not pose any problem, but Mockito's semantics are anything but usual.
The first solution is to avoid type inference and explicitly supply the type argument instead:
Mockito.<Iterable<? extends Thing>>when(...
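Spelled out in full against the stubbing from the question, that would be:
Mockito.<Iterable<? extends Thing>>when(thingsProvider.getThings()).thenReturn(things);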
Or, as Delimanolis suggested, use a target type to restrict inference (Java 8+):
OngoingStubbing<Iterable<? extends Thing>> stub = when(...
It also appears that lambda inference may be helpful for this case
static <T> OngoingStubbing<T> whenX( Supplier<T> sup )
{
return Mockito.when( sup.get() );
}
whenX(thingsProvider::getThings).thenReturn(things);
// T = Iterable<? extends Thing>
And, if your interface is simple enough, just implement it directly instead of mocking it :)
List<Thing> things = ...;
ThingsProvider thingsProvider = ()->things;
Refactor into two steps
OngoingStubbing<Iterable<? extends Thing>> stub = when(thingsProvider.getThings());
stub.thenReturn(things);
or use what bayou.io suggested
Mockito.<Iterable<? extends Thing>>when(thingsProvider.getThings())
Related
The Collections.fill method has the following header:
public static <T> void fill(List<? super T> list, T obj)
Why is the wildcard necessary? The following header seems to work just as well:
public static <T> void fill(List<T> list, T obj)
I cannot see a reason why the wildcard is needed; code such as the following works with the second header as well as the first:
List<Number> nums = new ArrayList<>();
Integer i = 43;
fill(nums, i); //fill method written using second header
My question is: For what specific call of fill would the first header work but not the second? And if there is no such call, why include the wildcard? In this case, the wildcard does not make the method more concise nor add to readability (in my opinion).
This is a really good question and the simple answer was guessed already:
For the current version of fill(List<? super T> list, T obj) there is no input that would be rejected if the signature were changed to fill(List<T> list, T obj), so there is no practical benefit; the devs most likely just followed the PECS principle.
The above statement rests on the fact that if X is a supertype of T, then List<X> is a valid argument for a List<? super T> parameter, because of contravariance.
Since we can always find such an X (in the worst case it is Object), the compiler can infer a suitable type argument for either form of fill.
So, knowing that, we can override the compiler's inference and supply the type ourselves using a "type witness", and the code breaks (below, fillNew denotes the hypothetical fill(List<T> list, T obj) variant):
List<Object> target = new ArrayList<>();
//Compiles OK as we can represent List<Object> as List<? super Integer> and it fits
Collections.<Integer>fill(target, 1);
//Compilation error as List<Object> is invariant to List<Integer> and not a valid substitute
Collections.<Integer>fillNew(target, 1);
This is all of course purely theoretical and nobody in their right mind would use the type argument there.
HOWEVER
While answering the question "What is the benefit of using wildcards here?" we have so far considered only one side of the equation - us, the consumers of the method - but not the library developers.
Hence this question is somewhat similar to why Collections.enumeration(final Collection<T> c) is declared the way it is and not enumeration(Collection<T> c) as final seems superfluous for the end-user.
We can speculate here about the real intention, but I can give a few subjective reasons:
First: using List<? super T> (like final for enumeration) immediately disambiguates the code that tiny bit more. For <? super T> specifically, it is useful to show that only partial knowledge about the
type parameter is required and that the list cannot be used to produce values of T, only to consume them (see the small sketch after the quote).
Quote:
Wildcards are useful in situations where only partial knowledge about the type parameter is required.
JLS 4.5.1. Type Arguments of Parameterized Types
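A small sketch of that consumer-only view (peek is a made-up helper, not a JDK method):
static <T> void peek(List<? super T> list, T obj) {
    list.add(obj);              // fine: the list consumes values of T
    // T first = list.get(0);   // would not compile: reads only give us Object
    Object first = list.get(0); // all we can recover from the list is Object
}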
Second: it gives some freedom to the library owners to improve/update the method without breaking backward compatibility while conforming to the existing constraints.
Now let's try to make up some hypothetical "improvements" to see what I mean (fillNew, as above, being the form of fill that uses List<T>):
#1 The decision is to make the method return the obj value (used to fill up the list):
public static <T> void fill(List<? super T> list, T obj)
//becomes ↓↓↓
public static <T> T fill(List<? super T> list, T obj)
The updated method would work just fine with the fill signature, but for fillNew the inferred return type now isn't that obvious:
List<Number> target = new ArrayList<>();
Long val = fill(target, 1L); //<<Here Long is the most specific type that fits both arguments
//Compilation error
Long val = fillNew(target, 1L); //<<Here Number is, so it cannot be assigned back
//More exotic case:
Integer val = fill(asList(true), 0); //val is Integer as expected
Comparable<?> val = fillNew(asList(true), 0); //val is now Comparable<?> as the most specific type
#2 The decision is to add an overloaded version of fill that is 10x more performant when T is Comparable<T>:
/* Extremely performant 10x version */
public static <T extends Comparable<T>> void fill(List<? super T> list, T value)
/* Normal version */
public static <T> void fill(List<? super T> list, T value)
List<Number> target = new ArrayList<>();
fill(target, 1); //<<< Here the more performant version is used as T inferred to Integer and it implements Comparable<Integer>
fillNew(target, 1); //<< Still uses the slow version just because T is inferred to Number which is not Comparable
To sum up: the current signature of fill is, in my opinion, more flexible and descriptive for all parties (developers and library designers).
For your example, the reason it 'works' with your basic <T> signature is that an Integer is also a Number. The only T that works is T = Number, and then the whole thing just works out.
In this case, the expression you pass for the T obj parameter has a concrete type: you have an Integer. You could have a T instead. Perhaps you have this:
class AtomicReference<T> {
// The actual impl of j.u.concurrent.AtomicReference...
// but with this one additional method:
public void fillIntoList(List<? super T> list) {
T currentValue = get();
Collections.fill(list, currentValue);
}
}
I may perhaps want to write something like this:
AtomicReference<String> ref = new AtomicReference<String>("hello");
List<CharSequence> texts = new ArrayList<>();
...
ref.fillIntoList(texts);
If my hypothetical fillIntoList method simply had List<T> in its signature, that call wouldn't compile. Fortunately it has List<? super T>, so the code does compile. Had the Collections.fill method not done the <? super T> thing, the invocation of Collections.fill inside my fillIntoList method would have failed.
It's highly exotic for any of this to come up. But it can come up. List<? super T> is the strictly superior signature here - it can do everything List<T> does, and more, and it is also semantically correct: Of course I can fill a list-of-foos by writing into every slot a ref to something that I know for sure is a bar, if bar is a child of foo.
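A tiny concrete illustration of that last sentence, using the real Collections.fill:
List<CharSequence> slots = new ArrayList<>(Arrays.asList("a", "b", "c"));
String marker = "filled";
Collections.fill(slots, marker); // T = String; a List<CharSequence> is a List<? super String>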
That is because inheritance is useful in some cases.
For example, if you have the following class structure:
public class Parent {
//some code
}
public class Child extends Parent {
//some another code
}
You could use the first method like this:
List<Parent> parents = new ArrayList<>();
Child otherChildObject = new Child(); //after this line, set the values for the object
fill(parents, otherChildObject); //fill method using first signature; T = Child, and List<Parent> is accepted as a List<? super Child>
Suppose I have the following class:
public class FixExpr {
Expr<FixExpr> in;
}
Now I want to introduce a generic argument, abstracting over the use of Expr:
public class Fix<F> {
F<Fix<F>> in;
}
But Eclipse doesn't like this:
The type F is not generic; it cannot be parametrized with arguments <Fix<F>>
Is this possible at all or have I overlooked something that causes this specific instance to break?
Some background information: in Haskell this is a common way to write generic functions; I'm trying to port this to Java. The type argument F in the example above has kind * -> * instead of the usual kind *. In Haskell it looks like this:
newtype Fix f = In { out :: f (Fix f) }
I think what you're trying to do is simply not supported by Java generics. The simpler case of
public class Foo<T> {
public T<String> bar() { return null; }
}
also does not compile using javac.
Since Java does not know at compile-time what T is, it can't guarantee that T<String> is at all meaningful. For example if you created a Foo<BufferedImage>, bar would have the signature
public BufferedImage<String> bar()
which is nonsensical. Since there is no mechanism to force you to only instantiate Foos with generic Ts, it refuses to compile.
Maybe you can try Scala, a functional language running on the JVM that supports higher-kinded generics.
[ EDIT by Rahul G ]
Here's how your particular example roughly translates to Scala:
trait Expr[+A]
trait FixExpr {
val in: Expr[FixExpr]
}
trait Fix[F[_]] {
val in: F[Fix[F]]
}
In order to pass a type parameter, the type definition has to declare that it accepts one (it has to be generic). Apparently, your F is not a generic type.
UPDATE: The line
F<Fix<F>> in;
declares a variable of type F which accepts a type parameter, the value of which is Fix, which itself accepts a type parameter, the value of which is F. But F is only declared as a plain type parameter, so the compiler does not know it can itself accept type arguments. I think you may want
Fix<F> in;
That will give you a variable of type Fix (the type you did define in your example) to which you are passing a type parameter with value F. Since Fix is defined to accept a type parameter, this works.
UPDATE 2: Reread your title, and now I think you might be trying to do something similar to the approach presented in "Towards Equal Rights for Higher-Kinded Types" (PDF alert). If so, Java doesn't support that, but you might try Scala.
Still, there are ways to encode higher-kinded generics in Java. Please have a look at the higher-kinded-java project.
Using this as a library, you can modify your code like this:
public class Fix<F extends Type.Constructor> {
Type.App<F, Fix<F>> in;
}
You should probably add a @GenerateTypeConstructor annotation to your Expr class
@GenerateTypeConstructor
public class Expr<S> {
// ...
}
This annotation generates ExprTypeConstructor class.
Now you can process your Fix of Expr like this:
class Main {
void run() {
runWithTyConstr(ExprTypeConstructor.get);
}
<E extends Type.Constructor> void runWithTyConstr(ExprTypeConstructor.Is<E> tyConstrKnowledge) {
Expr<Fix<E>> one = Expr.lit(1);
Expr<Fix<E>> two = Expr.lit(2);
// convertToTypeApp method is generated by annotation processor
Type.App<E, Fix<E>> oneAsTyApp = tyConstrKnowledge.convertToTypeApp(one);
Type.App<E, Fix<E>> twoAsTyApp = tyConstrKnowledge.convertToTypeApp(two);
Fix<E> oneFix = new Fix<>(oneAsTyApp);
Fix<E> twoFix = new Fix<>(twoAsTyApp);
Expr<Fix<E>> addition = Expr.add(oneFix, twoFix);
process(addition, tyConstrKnowledge);
}
<E extends Type.Constructor> void process(
Fix<E> fixedPoint,
ExprTypeConstructor.Is<E> tyConstrKnowledge) {
Type.App<E, Fix<E>> inTyApp = fixedPoint.getIn();
// convertToExpr method is generated by annotation processor
Expr<Fix<E>> in = tyConstrKnowledge.convertToExpr(inTyApp);
for (Fix<E> subExpr: in.getSubExpressions()) {
process(subExpr, tyConstrKnowledge);
}
}
}
It looks as if you may want something like:
public class Fix<F extends Fix<F>> {
private F in;
}
(See the Enum class, and questions about its generics.)
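For comparison, this is the same self-referencing bound the JDK uses for Enum, which is declared roughly as Enum<E extends Enum<E>>. A made-up illustration (Node and TextNode are not real types):
abstract class Node<T extends Node<T>> {
    abstract T self(); // each concrete subclass returns itself with its own precise type
}
class TextNode extends Node<TextNode> {
    @Override
    TextNode self() { return this; }
}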
There is a roundabout way to encode higher-kinded types in Java, as pointed out by Victor. The gist of it is to introduce a type H<F, T> to encode F<T>. This can then be used to encode the fixed point of functors (i.e. Haskell's Fix type):
public interface Functor<F, T> {
<R> H<F, R> map(Function<T, R> f);
}
public static record Fix<F extends H<F, T> & Functor<F, T>, T>(F f) {
public Functor<F, Fix<F, T>> unfix() {
return (Functor<F, Fix<F, T>>) f;
}
}
From here you can go on and implement catamorphisms over initial algebras:
public interface Algebra<F, T> extends Function<H<F, T>, T> {}
public static <F extends H<F, T> & Functor<F, T>, T> Function<Fix<F, T>, T> cata(Algebra<F, T> alg) {
return fix -> alg.apply(fix.unfix().map(cata(alg)));
}
See my GitHub repo for working code, including some example algebras. (Note: IDEs like IntelliJ struggle with the code, although it compiles and runs just fine with Java 15.)
public class Foo<T extends Bar>{
private Class<T> _type;
public Foo( Class<T> _type ){
this._type = _type;
}
public Collection<T> hypothetical( List<T> items ){ //PROBLEMATIC
return dostuffWithItems( items );
}
}
Usage:
Foo<? extends ChildBar> foo = new Foo<ChildBar>( ChildBar.class );
List<ChildBar> items = ( List<ChildBar> ) foo.hypothetical( new ArrayList<ChildBar>() ); //COMPILER ERROR: The method hypothetical(List<capture#2-of ?>) in the type Foo<capture#2-of ?> is not applicable for the arguments (List<ChildBar>)
The compiler would accept either
casting the List<ChildBar> items argument to List<?>,
or changing the hypothetical( List<T> items ) signature to either
a) hypothetical( List<ChildBar> items ) or
b) hypothetical( List<? extends Bar> items ).
However, none of these alternatives guarantees that the element type of the List passed to hypothetical is the same type the Foo instance was parameterized with. I am currently using an extra method to verify the parametric types.
Is there a better way within Java generics constructs to achieve this automatically without the extra logic? Or better yet, why can I not declare foo as Foo<? extends Bar> and then fill in the actual type parameter at runtime?
I edited your code and added the missing stuff to make it compilable, and I can confirm that the only problematic parts are:
The missing dostuffWithItems method.
The typos with the hypothetical method name.
Assigning a Collection<ChildBar> to a List<ChildBar>.
The first two are easy to fix.
The last one requires you either to change the API method, or to change the code where you are calling it. Neither of these is (IMO) problematic.
It is worth noting that you would get all of these errors if the types were non-generic. You can't assign a Collection to a List without a typecast.
Here's my code for you to play with. (Copy and paste into appropriately named files ...)
public class Bar {
}
public class ChildBar extends Bar {
}
import java.util.*;
public class Foo<T extends Bar> {
private Class<T> _type;
public Foo( Class<T> _type ) {
this._type = _type;
}
public Collection<T> hypothetical( List<T> items ) {
return items; // dummy implementation ...
}
}
import java.util.*;
public class Main {
public static void main(String[] args) {
Foo<ChildBar> foo = new Foo<ChildBar>( ChildBar.class );
Collection<ChildBar> items =
foo.hypothetical( new ArrayList<ChildBar>() );
}
}
The accepted answer doesn't precisely explain why the snippet in the question (after edits) is rejected by the compiler.
We start from the observation that the snippet from #Stephen C's answer is accepted, while revision 8 of the question is rejected. The difference is: in the latter version the variable foo is declared with a wildcard-parameterized type Foo<? extends ChildBar>, while Stephen C had copied Foo<ChildBar> from an earlier revision (we all seem to agree that this is a suitable way to resolve the compile error).
To understand why this difference is crucial please see that with Foo<? extends ChildBar> foo this wildcard propagates as a capture into the signature for the invocation of foo.hypothetical, so this invocation is rendered as hypothetical(List<capture#2-of ?>), meaning that the parameter has an unknown (upper-bounded) type parameter. List<ChildBar> is not compatible to that type, hence the compile error.
Also note that all mentions of "runtime" in this thread are inappropriate; all of this is statically resolved at compile time. Perhaps you meant the invocation type, or the type of the actual argument, as opposed to the declared type (of the formal parameter). The actual runtime type is unknown to the compiler.
This seems to be currently impossible in Java.
Foo<? extends ChildBar> foo = new Foo<ChildBar>( ChildBar.class );
This leaves foo with an ambiguous parametric type. It may seem obvious that ChildBar becomes the true de facto parametric type, but the call to the foo.hypothetical() method with a List<ChildBar> shows this assumption to be untrue. Although foo.hypothetical only accepts a List argument containing elements of foo's parametric type, the compiler cannot recognize that the argument is a list of ChildBar objects.
For this use case, the parametric type must be specified in the foo declaration in order to make it part and parcel of foo's declared type.
Foo<ChildBar> foo = new Foo<ChildBar>( ChildBar.class );
All conforming List<ChildBar> arguments to the foo.hypothetical method will now correctly be accepted as carrying elements of foo's declared parametric type.
I have this problem:
A method cuts unwanted details from one class and returns a collection of objects containing only the wanted ones. The thing is, I want this method to be able to work with different classes (which all extend one abstract base class, though), so I use a generic type. The problem is that at one point I need to create an instance of T, which is impossible. I looked for some way out, but nothing seems to work for my case.
So, the code is the following:
private <T extends RestMandate> List<T> toRestMandate(List<CardMandate> mandates ) {
List<T> restMandates = new ArrayList<>(mandates == null ? 0
: mandates.size());
if (mandates != null) {
for (CardMandate mandate : mandates) {
restMandates.add(new T(mandate));
}
}
return restMandates;
}
RestMandate is the base class, CardMandate is where I take the info from. Any ideas?
Since generic type arguments are erased at runtime, there is no way you can refer to T like you are trying to do. The only way out is a type tag argument plus reflective instantiation.
A better choice is to redesign your solution to solve this without relying on generics and type tags. Leverage dynamic method dispatch instead: add a method to RestMandate which will return the object converted to the desired type.
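A rough sketch of what avoiding the type tag could look like; here the conversion is supplied through a small factory interface (RestMandateFactory and fromCard are made-up names, one possible shape of such a redesign):
interface RestMandateFactory<T extends RestMandate> {
    T fromCard(CardMandate mandate); // each concrete RestMandate type knows how to build itself
}

private <T extends RestMandate> List<T> toRestMandate(List<CardMandate> mandates,
                                                      RestMandateFactory<T> factory) {
    List<T> restMandates = new ArrayList<>(mandates == null ? 0 : mandates.size());
    if (mandates != null) {
        for (CardMandate mandate : mandates) {
            restMandates.add(factory.fromCard(mandate)); // no reflection, fully checked at compile time
        }
    }
    return restMandates;
}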
Because of type erasure, T is not available at runtime (it is erased to its bound, RestMandate). You don't know its real type anymore.
You can still instantiate the object by reflection if you have its class. In order to do that, you must give the class to your method:
private <T extends RestMandate> List<T> toRestMandate(List<CardMandate> mandates, Class<T> clazz ) {
...
for (CardMandate mandate : mandates) {
/*
* I get the constructor which needs one CardMandate and call it.
* Note : I do not recommend this solution (no check at compile-time!).
* Like Marko Topolnik, I advise to redesign the solution.
*/
restMandates.add(clazz.getConstructor(CardMandate.class).newInstance(mandate));
}
...
}
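Called, for example, like this, where SepaRestMandate is a hypothetical subclass with a public SepaRestMandate(CardMandate) constructor (note that getConstructor and newInstance also throw checked exceptions that must be caught or declared):
List<SepaRestMandate> converted = toRestMandate(mandates, SepaRestMandate.class);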
To create an instance you require the Class<T> object too:
private <T extends RestMandate> List<T> toRestMandate(List<CardMandate> mandates, Class<T> clazz) {
//....
T newInst = clazz.newInstance();
//....
}
I have a generic method that should work similarly to recursion, but calling a differently parameterized instance of the method on each call.
public <M extends A> void doSomething(Class<M> mClass, M mObject)
{
// ... Do something with mObject.
A object = getObject();
Class<? extends A> objectClass = object.getClass();
doSomething(objectClass, objectClass.cast(object)); // Does not compile.
}
private A getObject() {...}
The problem is the line with a comment does not compile, giving the following error:
The method doSomething(Class<M>, M) in the type MainTest is not applicable for the arguments (Class<capture#3-of ? extends A>, capture#3-of ? extends A)
I don't quite understand why the compiler rejects this, if it can call doSomething with M = "? extends A".
Why doesn't it compile?
OK, here is a crude explanation.
You've typed your method so that it will accept an M which is a subtype of A.
Now you are calling your method with objectClass, which is a Class of some subtype of A, BUT the compiler cannot prove that this unknown subtype is the same M for both arguments.
Hence the compiler is complaining...
If you can explain what you are trying to do a bit more, I can help with a solution.
The language does not track wildcards like that (it seems). What you need to do is to capture that wildcard, which can be done with a method call with type inference.
public <M extends A> void doSomething(Class<M> mClass, M mObject) {
// ... Do something with mObject.
A object = getObject();
Class<? extends A> objectClass = object.getClass();
privateSomething(objectClass, object);
}
private <T extends A> void privateSomething(Class<T> objectClass, A object) {
doSomething(objectClass, objectClass.cast(object)); // Should compile.
}
As always, whilst reflection has some uses, it's usually a sign of confusion.
When you are asking the compiler to perform a cast, the exact target type of the cast must be known. It is not sufficient to tell the compiler that you don't know the exact type except that it's a subclass of A.
Class<? extends A> tells the compiler that the type of the object is a subclass of A, but it doesn't tell the compiler the exact type to be used for the cast.
Your problem is that you are trying to replace polymorphism with generics. As you are learning the hard way, generics are not the new, modern way of doing polymorphism.
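As a rough sketch of what that polymorphic shape could look like (handle is a made-up method name; the real design depends on what "do something" actually is):
abstract class A {
    abstract void handle(); // each subclass supplies its own behaviour
}

class B extends A {
    @Override
    void handle() { System.out.println("B-specific work"); }
}

class MainTest {
    void doSomething(A object) {
        object.handle(); // virtual dispatch picks the right override - no Class token, no cast needed
    }
}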