I am using Flex 3 and make a call through a RemoteObject to a Java 1.6 method exposed with BlazeDS and Spring 2.5.5 integration over a SecureAMFChannel. The ActionScript is as follows (this code is an example of the real thing, which is on a separate dev network):
import com.adobe.cairngorm.business.ServiceLocator;
import mx.collections.ArrayCollection;
import mx.rpc.AsyncToken;
import mx.rpc.IResponder;
import mx.rpc.remoting.RemoteObject;
public class MyClass implements IResponder
{
private var service:RemoteObject = ServiceLocator.getInstance().getRemoteObject("myService");
[ArrayElementType("Number")]
private var myArray:ArrayCollection;
public function MyClass()
{
var id1:Number = 1;
var id2:Number = 2;
var id3:Number = 3;
myArray = new ArrayCollection([id1, id2, id3]);
getData(myArray);
}
public function getData(myArrayParam:ArrayCollection):void
{
var token:AsyncToken = service.getData(myArrayParam);
token.addResponder(this); // MyClass implements IResponder; assume the result/fault handlers exist and work
}
}
Once created, this will make a call to the Java service class that is exposed through BlazeDS (assume the mechanics work, because they do for all other calls not involving Collection parameters). My Java service class looks like this:
public class MyService {
    public Collection<DataObjectPOJO> getData(Collection<Long> myArrayParam) {
        // The loop below never runs: the implicit cast to Long in the
        // for-each header throws the exception described further down
        for (Long l : myArrayParam) {
            System.out.println(l);
        }
        return null; // actual lookup/return omitted for brevity
    }
}
The exception that is thrown is a ClassCastException saying that a java.lang.Integer cannot be cast to a java.lang.Long. I worked around this issue by looping through the collection as Object, checking whether each element is an Integer, casting it, calling .longValue() on it, and adding the result to a temporary ArrayList. Yuk.
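Roughly, that workaround looks like this (a sketch; the variable names are illustrative):
// Iterate through a wildcard view so the compiler inserts no cast to Long,
// then widen each Integer by hand
Collection<?> raw = myArrayParam;
List<Long> temp = new ArrayList<Long>();
for (Object o : raw) {
    if (o instanceof Integer) {
        temp.add(((Integer) o).longValue()); // autoboxed back to Long
    }
}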
The big problem is that my application is supposed to handle records in the billions from a DB, and the ids will overflow the 2,147,483,647 limit of an Integer. I would love to have BlazeDS, or the JavaAdapter in it, translate the ActionScript Number to a Long as specified in the method signature. I hate that even though I use the generic type, the underlying element type of the collection is Integer. If this were straight Java, it wouldn't compile.
Any ideas are appreciated. Solutions are even better! :)
Please read the following threads related to your issue; you can find some workarounds there.
https://bugs.adobe.com/jira/browse/BLZ-115
https://bugs.adobe.com/jira/browse/BLZ-305
You can also change the argument on the Java side to expect a Long[] rather than a Collection<Long>. Because the native Java array is strongly typed, it deserializes correctly.
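For illustration, the signature change might look like this (a sketch, reusing the class and method names from the example above):
public class MyService {
    // BlazeDS deserializes the AMF array into a strongly typed Long[],
    // so no Integer-to-Long conversion is needed
    public Collection<DataObjectPOJO> getData(Long[] myArrayParam) {
        for (Long l : myArrayParam) {
            System.out.println(l);
        }
        return null; // actual lookup/return omitted for brevity
    }
}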
Flex serializes an ArrayCollection of Numbers to an ArrayCollection<Integer> in Java.
Since Adobe's ArrayCollection extends ArrayList, you can run the Collection through the following function. This should produce a List of Long values.
public class TransformUtils {
public static final <T extends Number> List<Long> toLongList(Collection<T> values) {
List<Long> list = new ArrayList<Long>();
for (T value : values) {
list.add(value.longValue());
}
return list;
}
}
public class MyService {
    public Collection<DataObjectPOJO> getData(Collection<Long> myArrayParam) {
        myArrayParam = TransformUtils.toLongList(myArrayParam);
        for (Long l : myArrayParam) {
            System.out.println(l);
        }
        return null; // actual lookup/return omitted for brevity
    }
}
Guava :)
public static <T extends Number> List<Long> toLongList(Collection<T> values) {
    return Lists.newArrayList(Collections2.transform(values, new Function<T, Long>() {
        @Override
        public Long apply(T value) {
            return value.longValue();
        }
    }));
}
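On Java 8 or later, a stream gives the same result without Guava (a sketch; it assumes java.util.stream.Collectors is imported):
public static List<Long> toLongList(Collection<? extends Number> values) {
    return values.stream()
            .map(Number::longValue)   // widen each element, boxed back to Long
            .collect(Collectors.toList());
}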
I would like to create an array_agg UDF for Apache Drill to be able to aggregate all values of a group to a list of values.
This should work with any major types (required, optional) and minor types (varchar, dict, map, int, etc.)
However, I get the impression that Apache Drill's UDF API does not really make use of inheritance and generics. Each type has its own writer and handler, and they cannot be abstracted to handle any type. E.g., the ValueHolder interface seems to be purely cosmetic and cannot be used to have type-agnostic hooking of UDFs to any type.
My current implementation
I tried to solve this by using Java reflection so that I could use the ListHolder's write function independently of the holder of the original value.
However, I then ran into the limitations of the @FunctionTemplate annotation.
I cannot create a general UDF annotation for any value (I tried it with the interface ValueHolder: @Param ValueHolder input).
So it seems to me that the only way to support different types is to have separate classes for each type. But I can't even abstract much and work on any @Param input, because input is only visible in the class where it's defined (i.e. it is type specific).
I based my implementation on https://issues.apache.org/jira/browse/DRILL-6963
and created the following two classes for required and optional varchars (how can this be unified in the first place?)
@FunctionTemplate(
name = "array_agg",
scope = FunctionScope.POINT_AGGREGATE,
nulls = NullHandling.INTERNAL
)
public static class VarChar_Agg implements DrillAggFunc {
@Param org.apache.drill.exec.expr.holders.VarCharHolder input;
@Workspace ObjectHolder agg;
@Output org.apache.drill.exec.vector.complex.writer.BaseWriter.ComplexWriter out;
@Override
public void setup() {
agg = new ObjectHolder();
}
@Override
public void reset() {
agg = new ObjectHolder();
}
@Override public void add() {
if (agg.obj == null) {
// Initialise list object for output
agg.obj = out.rootAsList();
}
org.apache.drill.exec.vector.complex.writer.BaseWriter.ListWriter listWriter =
(org.apache.drill.exec.vector.complex.writer.BaseWriter.ListWriter)agg.obj;
listWriter.varChar().write(input);
}
@Override
public void output() {
((org.apache.drill.exec.vector.complex.writer.BaseWriter.ListWriter)agg.obj).endList();
}
}
@FunctionTemplate(
name = "array_agg",
scope = FunctionScope.POINT_AGGREGATE,
nulls = NullHandling.INTERNAL
)
public static class NullableVarChar_Agg implements DrillAggFunc {
@Param NullableVarCharHolder input;
@Workspace ObjectHolder agg;
@Output org.apache.drill.exec.vector.complex.writer.BaseWriter.ComplexWriter out;
@Override
public void setup() {
agg = new ObjectHolder();
}
@Override
public void reset() {
agg = new ObjectHolder();
}
@Override public void add() {
if (agg.obj == null) {
// Initialise list object for output
agg.obj = out.rootAsList();
}
if (input.isSet != 1) {
return;
}
org.apache.drill.exec.vector.complex.writer.BaseWriter.ListWriter listWriter =
(org.apache.drill.exec.vector.complex.writer.BaseWriter.ListWriter)agg.obj;
org.apache.drill.exec.expr.holders.VarCharHolder outHolder = new org.apache.drill.exec.expr.holders.VarCharHolder();
outHolder.start = input.start;
outHolder.end = input.end;
outHolder.buffer = input.buffer;
listWriter.varChar().write(outHolder);
}
@Override
public void output() {
((org.apache.drill.exec.vector.complex.writer.BaseWriter.ListWriter)agg.obj).endList();
}
}
Interestingly, I can't import org.apache.drill.exec.vector.complex.writer.BaseWriter to make the whole thing easier because then Apache Drill would not find it.
So I have to put the entire package path for everything in org.apache.drill.exec.vector.complex.writer in the code.
Furthermore, I'm using the deprecated ObjectHolder. Is there a better solution?
Anyway: These work so far, e.g. with this query:
SELECT
MIN(tbl.`timestamp`) AS start_view,
MAX(tbl.`timestamp`) AS end_view,
array_agg(tbl.eventLabel) AS label_agg
FROM `dfs.root`.`/path/to/avro/folder` AS tbl
WHERE tbl.data.slug IS NOT NULL
GROUP BY tbl.data.slug
however, when I use ORDER BY, I get this:
org.apache.drill.common.exceptions.UserRemoteException: SYSTEM ERROR: UnsupportedOperationException: NULL
Fragment 0:0
Additionally, I tried more complex types, namely maps/dicts.
Interestingly, when I call SELECT sqlTypeOf(tbl.data) FROM tbl, I get MAP.
But when I write UDFs, the query planner complains about having no UDF array_agg for type dict.
Anyway, I wrote a version for dicts:
@FunctionTemplate(
name = "array_agg",
scope = FunctionScope.POINT_AGGREGATE,
nulls = NullHandling.INTERNAL
)
public static class Map_Agg implements DrillAggFunc {
@Param MapHolder input;
@Workspace ObjectHolder agg;
@Output org.apache.drill.exec.vector.complex.writer.BaseWriter.ComplexWriter out;
@Override
public void setup() {
agg = new ObjectHolder();
}
@Override
public void reset() {
agg = new ObjectHolder();
}
@Override public void add() {
if (agg.obj == null) {
// Initialise list object for output
agg.obj = out.rootAsList();
}
org.apache.drill.exec.vector.complex.writer.BaseWriter.ListWriter listWriter =
(org.apache.drill.exec.vector.complex.writer.BaseWriter.ListWriter) agg.obj;
//listWriter.copyReader(input.reader);
input.reader.copyAsValue(listWriter);
}
@Override
public void output() {
((org.apache.drill.exec.vector.complex.writer.BaseWriter.ListWriter)agg.obj).endList();
}
}
@FunctionTemplate(
name = "array_agg",
scope = FunctionScope.POINT_AGGREGATE,
nulls = NullHandling.INTERNAL
)
public static class Dict_agg implements DrillAggFunc {
@Param DictHolder input;
@Workspace ObjectHolder agg;
@Output org.apache.drill.exec.vector.complex.writer.BaseWriter.ComplexWriter out;
@Override
public void setup() {
agg = new ObjectHolder();
}
@Override
public void reset() {
agg = new ObjectHolder();
}
@Override public void add() {
if (agg.obj == null) {
// Initialise list object for output
agg.obj = out.rootAsList();
}
org.apache.drill.exec.vector.complex.writer.BaseWriter.ListWriter listWriter =
(org.apache.drill.exec.vector.complex.writer.BaseWriter.ListWriter) agg.obj;
//listWriter.copyReader(input.reader);
input.reader.copyAsValue(listWriter);
}
@Override
public void output() {
((org.apache.drill.exec.vector.complex.writer.BaseWriter.ListWriter)agg.obj).endList();
}
}
But here, I get an empty list in the field data_agg for my query:
SELECT
MIN(tbl.`timestamp`) AS start_view,
MAX(tbl.`timestamp`) AS end_view,
array_agg(tbl.data) AS data_agg
FROM `dfs.root`.`/path/to/avro/folder` AS tbl
GROUP BY tbl.data.viewSlag
Summary of questions
Most importantly: How do I create an array_agg UDF for Apache Drill?
How can I make UDFs type-agnostic/general purpose? Do I really have to implement an entire class for each Nullable, Required and Repeated version of every type? That's a lot to do and quite tedious. Isn't there a way to handle values in a UDF agnostic of the underlying types?
I wish Apache Drill would just use what Java offers here: generic methods, specialised overloading and inheritance within its own type system. Am I missing something on how to do that?
How can I fix the NULL problem when I use ORDER BY on my varchar version of the aggregate?
How can I fix the problem where my aggregate of maps/dicts is an empty list?
Is there an alternative to using the deprecated ObjectHolder?
To answer your question, unfortunately you've run into one of the limits of the Drill aggregate UDF API, which is that it can only return simple data types. It would be a great improvement to Drill to fix this, but that is the current status. If you're interested in discussing that further, please start a thread on the Drill user group and/or Slack channel. I don't think it is impossible, but it would require some modification to the Drill internals. IMHO it would be well worth it, because there are a few other UDFs I'd like to implement that need this feature.
The second part of your question is how to make UDFs type agnostic and once again... you've found yet another bit of ugliness in the UDF API. :-) If you do some digging in the codebase, you'll see that most of the Math functions have versions that accept FLOAT, INT etc..
Regarding the aggregate of null or empty lists: I actually have some good news here... The current way of doing that is to provide two versions of the function, one which accepts regular holders and a second which accepts nullable holders and returns an empty list or map if the inputs are null. Yes, this sucks, but the additional good news is that I'm working on cleaning this up and hopefully will have a PR submitted soon that will eliminate the need to do this.
Regarding the ObjectHolder, I wrote a median function that uses a few Stacks to compute a streaming median and I used the ObjectHolder for that. I think it will be with us for some time as there is no alternative at the moment.
I hope this answers your questions.
I have the ViewValue class defined as follows:
class ViewValue {
private Long id;
private Integer value;
private String description;
private View view;
private Double defaultFeeRate;
// getters and setters for all properties
}
Somewhere in my code I need to convert a list of ViewValue instances to a list containing the values of the id fields of the corresponding ViewValues.
I do it using a foreach loop:
List<Long> toIdsList(List<ViewValue> viewValues) {
List<Long> ids = new ArrayList<Long>();
for (ViewValue viewValue : viewValues) {
ids.add(viewValue.getId());
}
return ids;
}
Is there a better approach to this problem?
We can do it in a single line of code using Java 8:
List<Long> ids = viewValues.stream().map(ViewValue::getId).collect(Collectors.toList());
For more info : Java 8 - Streams
You could do it in a one-liner using Commons BeanUtils and Collections:
(why write your own code when others have done it for you?)
import org.apache.commons.beanutils.BeanToPropertyValueTransformer;
import org.apache.commons.collections.CollectionUtils;
...
List<Long> ids = (List<Long>) CollectionUtils.collect(viewValues,
new BeanToPropertyValueTransformer("id"));
Use Google Collections. Example:
Function<ViewValue, Long> transform = new Function<ViewValue, Long>() {
@Override
public Long apply(ViewValue from) {
return from.getId();
}
};
List<ViewValue> list = Lists.newArrayList();
List<Long> idsList = Lists.transform(list, transform);
UPDATE:
On Java 8 you don't need Guava. You can:
import com.example.ViewValue;
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;
import java.util.stream.Collectors;
Function<ViewValue, Long> transform = ViewValue::getId;
List<ViewValue> source = new ArrayList<>();
List<Long> result = source.stream().map(transform).collect(Collectors.toList());
Or just:
List<ViewValue> source= new ArrayList<>();
List<Long> result = source.stream().map(ViewValue::getId).collect(Collectors.toList());
NEXT UPDATE (the last one, after the Javaslang-to-Vavr name change):
Currently it's worth mentioning the solution offered by the Vavr library (http://www.vavr.io/), formerly known as Javaslang (http://www.javaslang.io/). Let's assume that we have our list of plain objects:
List<ViewValue> source = newArrayList(new ViewValue(1), new ViewValue(2), new ViewValue(2));
We could make the transformation with the List class from the Vavr library (in the long run the collect is not convenient):
List<Long> result = io.vavr.collection.List.ofAll(source).map(ViewValue::getId).toJavaList();
But you will see the real power with the Vavr lists alone:
io.vavr.collection.List<ViewValue> source = io.vavr.collection.List.of(new ViewValue(1), new ViewValue(2), new ViewValue(3));
io.vavr.collection.List<Long> res = source.map(ViewValue::getId);
I encourage you to take a look at the available collections and new types in that library (I especially like the Try type). You will find the documentation at http://www.vavr.io/vavr-docs/.
PS. Due to Oracle and the word "Java" within the name, they had to change the library name from Javaslang to something else; they decided on Vavr.
EDIT: This answer is based on the idea that you'll need to do similar things for different entities and different properties elsewhere in your code. If you only need to convert the list of ViewValues to a list of Longs by ID, then stick with your original code. If you want a more reusable solution, however, read on...
I would declare an interface for the projection, e.g.
public interface Function<Arg,Result>
{
public Result apply(Arg arg);
}
Then you can write a single generic conversion method:
public <Source, Result> List<Result> convertAll(List<Source> source,
Function<Source, Result> projection)
{
ArrayList<Result> results = new ArrayList<Result>();
for (Source element : source)
{
results.add(projection.apply(element));
}
return results;
}
Then you can define simple projections like this:
private static final Function<ViewValue, Long> ID_PROJECTION =
new Function<ViewValue, Long>()
{
public Long apply(ViewValue x)
{
return x.getId();
}
};
And apply it just like this:
List<Long> ids = convertAll(values, ID_PROJECTION);
(Obviously using K&R bracing and longer lines makes the projection declaration a bit shorter :)
Frankly all of this would be a lot nicer with lambda expressions, but never mind...
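For comparison, this is roughly what the same projection looks like with Java 8 lambdas (a sketch that reuses the convertAll method and Function interface declared above):
// The projection no longer needs an anonymous class
List<Long> ids = convertAll(values, v -> v.getId());
// or, with a method reference
List<Long> idsAgain = convertAll(values, ViewValue::getId);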
I've implemented a small functional library for this use case. One of the methods has this signature:
<T> List<T> mapToProperty(List<?> objectList, String property, Class<T> returnType)
It takes the property name as a string and uses reflection to create a call to the property; it then returns a List backed by the objectList, whose get and iterator are implemented using this property call.
The mapToProperty function is implemented in terms of a general map function that takes a Function as a mapper, just as another post described. Very useful.
I suggest you read up on basic functional programming and, in particular, take a look at functors (objects implementing a map function).
Edit: Reflection really doesn't have to be expensive. The JVM has improved a lot in this area. Just make sure to compile the invocation once and reuse it.
Edit2: Sample code
public class MapExample {
public static interface Function<A,R>
{
public R apply(A b);
}
public static <A,R> Function<A,R> compilePropertyMapper(Class<A> objectType, String property, Class<R> propertyType)
{
try {
final Method m = objectType.getMethod("get" + property.substring(0,1).toUpperCase() + property.substring(1));
if(!propertyType.isAssignableFrom(m.getReturnType()))
throw new IllegalArgumentException(
"Property "+property+" on class "+objectType.getSimpleName()+" is not a "+propertyType.getSimpleName()
);
return new Function<A,R>()
{
@SuppressWarnings("unchecked")
public R apply(A b)
{
try {
return (R)m.invoke(b);
} catch (Exception e) {
throw new RuntimeException(e);
}
};
};
} catch (Exception e) {
throw new RuntimeException(e);
}
}
public static <T1,T2> List<T2> map(final List<T1> list, final Function<T1,T2> mapper)
{
return new AbstractList<T2>()
{
@Override
public T2 get(int index) {
return mapper.apply(list.get(index));
}
@Override
public int size() {
return list.size();
}
};
}
@SuppressWarnings("unchecked")
public static <T1,T2> List<T2> mapToProperty(List<T1> list, String property, Class<T2> propertyType)
{
if(list == null)
return null;
else if(list.isEmpty())
return Collections.emptyList();
return map(list,compilePropertyMapper((Class<T1>)list.get(0).getClass(), property, propertyType));
}
}
You could use a wrapper:
public class IdList implements List<Long>
{
private List<ViewValue> underlying;
public IdList(List<ViewValue> underlying)
{
this.underlying = underlying;
}
public Long get(int index)
{
return underlying.get(index).getId();
}
// other List methods
}
Though that's even more tedious work, it could improve performance.
You could also implement both your solution and mine generically using reflection, but that would be very bad for performance.
There's no short and easy generic solution in Java, I'm afraid. In Groovy, you would simply use collect(), but I believe that involves reflection as well.
That depends on what you then do with the List<Long> and the List<ViewValue>.
For example, you might get sufficient functionality from creating your own List implementation that wraps a List<ViewValue>, implementing iterator() with an Iterator that walks the ViewValues and returns each id, as sketched below.
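A sketch of that wrapper idea (illustrative names; extending AbstractCollection rather than implementing List directly keeps most of the boilerplate out of the way):
class IdView extends AbstractCollection<Long> {
    private final List<ViewValue> underlying;
    IdView(List<ViewValue> underlying) { this.underlying = underlying; }
    @Override
    public Iterator<Long> iterator() {
        final Iterator<ViewValue> it = underlying.iterator();
        return new Iterator<Long>() {
            public boolean hasNext() { return it.hasNext(); }
            public Long next() { return it.next().getId(); }
            public void remove() { it.remove(); }
        };
    }
    @Override
    public int size() { return underlying.size(); }
}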
You can populate a map from the properties of a list of objects (say, id as key and some property as value) like this:
Map<Long, Integer> idToValue = viewValues.stream().collect(Collectors.toMap(ViewValue::getId, ViewValue::getValue));
I have the following code in Java. How can I build a similar solution in C#?
I'm especially interested in how to implement the first and the last lines.
This code goes through the specified package (which contains forms for Android and iOS) and returns an Android or iOS form instance, depending on getTargetPlatform():
public static <T extends Helpers> T getPage(Class pageInterface) throws Exception {
Set<Class<?>> allClasses = new Reflections("forms", new SubTypesScanner(false)).getSubTypesOf(Object.class);
for (Class pageClass : allClasses) {
if (pageInterface.isAssignableFrom(pageClass) && pageClass.getName().contains(String.format(".%1$s.", getTargetPlatform()))) {
return (T) pageClass.newInstance();
}
}
return (T) pageInterface.newInstance();
}
Depending on your use case, you could scan assemblies for your type. Example:
public static T GetPage<T>(Type pageInterface) where T : Helpers
{
// maybe you need to scan different assemblies, depending on your usecase
var allTypes = Assembly.GetExecutingAssembly().GetTypes();
foreach (var pageType in allTypes)
{
if (pageInterface.IsAssignableFrom(pageType) && pageType.FullName.Contains(String.Format(".{0}.", GetTargetPlatform())))
{
return (T)Activator.CreateInstance(pageType);
}
}
return (T)Activator.CreateInstance(pageInterface);
}
The first line becomes:
public static T GetPage<T>(Type pageInterface) where T : Helpers
and the last line uses Activator.CreateInstance instead of newInstance():
return (T)Activator.CreateInstance(pageInterface);
So, I have some code that looks approximately like (truncated for brevity - ignore things like the public member variables):
public class GenericThingy<T> {
private T mValue;
public final T[] mCandidates;
public GenericThingy(T[] pCandidates, T pInitValue) {
mCandidates = pCandidates;
mValue = pInitValue;
}
public void setValue(T pNewValue) {
mValue = pNewValue;
}
}
public class GenericThingyWidget {
private final GenericThingy<?> mThingy;
private final JComboBox mBox;
public GenericThingyWidget (GenericThingy<?> pThingy) {
mThingy = pThingy;
mBox = new JComboBox(pThingy.mCandidates);
//do stuff here that makes the box show up
}
//this gets called by an external event
public void applySelectedValue () {
mThingy.setValue(mBox.getSelectedItem());
}
}
My problem is that the mThingy.setValue(mBox.getSelectedItem()); call generates the following error:
The method setValue(capture#4-of ?) in the type Generics.GenericThingy<capture#4-of ?> is not applicable for the arguments (Object)
I can get around this by removing the <?> from the declaration of mThingy and pThingy in GenericThingyWidget - which gives me a "GenericThingy is a raw type. References to GenericThingy should be parameterized" warning.
I also tried replacing the setValue call with
mThingy.setValue(mThingy.mCandidates[mBox.getSelectedIndex()]);
which I genuinely expected to work, but that produced a very similar error:
The method setValue(capture#4-of ?) in the type Generics.GenericThingy<capture#4-of ?> is not applicable for the arguments (capture#5-of ?)
Is there any way to do this without generating "raw type" warnings ("unchecked cast" warnings I'm OK with) and without making GenericThingyWidget into a generic type? I'd think I could cast the return of mBox.getSelectedItem() to something, but I can't figure out what that would be.
As a bonus question, why does the replacement call to mThingy.setValue not work?
You lack information in GenericThingyWidget.
The ? you put means: any class extending Object; not some particular one, but one that the compiler does not know. Java can't relate one ? to another; two wildcards cannot be assumed to be the same type in the class hierarchy. So
mThingy.setValue(mThingy.mCandidates[mBox.getSelectedIndex()]);
this tries to pass an object of some unknown type to setValue, which expects another unknown type, and the wildcards cannot tell Java that those two unknowns are the same class.
Without parameterizing GenericThingyWidget, I don't see any way to work around it.
What I would do: parameterize GenericThingyWidget, and create a static parameterized factory method:
public static <T> GenericThingyWidget<T> make(T someObject){
...
}
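For illustration, a plausible shape for that factory (hypothetical; it takes the GenericThingy<T> rather than a bare T so the compiler can infer the widget's type parameter from the argument):
public static <T> GenericThingyWidget<T> make(GenericThingy<T> pThingy) {
    // T is inferred at the call site, so callers never have to name it
    return new GenericThingyWidget<T>(pThingy);
}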
I see two possibilities.
With a private addition to GenericThingyWidget, using Goetz's capture helper pattern:
public void applySelectedValue() {
helper(mThingy, mBox.getSelectedIndex());
}
private static <T> void helper(GenericThingy<T> pThingy, int pIndex) {
pThingy.setValue(pThingy.mCandidates[pIndex]);
}
Or, quick-and-dirty, with a modification to the API of GenericThingy:
public void setValue(int value) {
mValue = mCandidates[value];
}
As a bonus question, why does the replacement call to mThingy.setValue not work?
The article by Brian Goetz probably explains this better than I will, but I'll give it a try.
mThingy.setValue(mThingy.mCandidates[mBox.getSelectedIndex()]);
The compiler knows that mThingy has some type parameter, but it doesn't know what that type is, because it is a wildcard. It creates a placeholder for this type: "capture#4-of ?". The compiler also knows that mCandidates has some type, but it doesn't know what it is either, so it creates a brand new capture type: "capture#5-of ?". While you and I can reason that these should be the same type, the compiler (at least for now) can't jump to that conclusion. Thus, you get the error message.
The capture helper gets around that. Although the compiler doesn't know what the type is, it knows it has a type, so it allows you to pass it to the helper method. Once inside the helper method, there are no wildcards, and the compiler doesn't have to do any reasoning about whether the wildcards really refer to the same type.
Update
OK, try this:
public class GenericThingy<T> {
private Class<T> mClazz;
private T mValue;
public final T[] mCandidates;
public GenericThingy(Class<T> clazz, T[] pCandidates, T pInitValue) {
mClazz = clazz;
mCandidates = pCandidates;
mValue = pInitValue;
}
public void setValue(Object newValue) throws ClassCastException {
mValue = mClazz.cast(newValue);
}
}
What you need to to is parameterize GenericThingyWidget like so:
public class GenericThingyWidget<T> {
private final GenericThingy<? super T> mThingy;
private final JComboBox mBox;
public GenericThingyWidget (GenericThingy<? super T> pThingy) {
mThingy = pThingy;
mBox = new JComboBox(pThingy.mCandidates);
//do stuff here that makes the box show up
}
//this gets called by an external event
public void applySelectedValue () {
mThingy.setValue((T) mBox.getSelectedItem());
}
}
Technically, you don't need the ? super T for your example, and would be fine with just a T, and perhaps it would be better in real code if you ever want to get from the GenericThingy instead of just inserting into it.
As KLE said, you can just de-parameterize GenericThingy (replace all the T's with Object). In fact, I think you have to, unless you plan to pass the class of T to the constructor of GenericThingyWidget and then dynamically cast from your mBox.getSelectedItem(), since as far as I can tell getSelectedItem() only returns Object.
Why is the compiler unable to infer the correct type for the result from Collections.emptySet() in the following example?
import java.util.*;
import java.io.*;
public class Test {
public interface Option<A> {
public <B> B option(B b, F<A,B> f);
}
public interface F<A,B> {
public B f(A a);
}
public Collection<String> getColl() {
Option<Integer> iopt = null;
return iopt.option(Collections.emptySet(), new F<Integer, Collection<String>>() {
public Collection<String> f(Integer i) {
return Collections.singleton(i.toString());
}
});
}
}
Here's the compiler error message:
knuttycombe#knuttycombe-ubuntu:~/tmp/java$ javac Test.java
Test.java:16: <B>option(B,Test.F<java.lang.Integer,B>) in
Test.Option<java.lang.Integer> cannot be applied to (java.util.Set<java.lang.Object>,
<anonymous Test.F<java.lang.Integer,java.util.Collection<java.lang.String>>>)
return iopt.option(Collections.emptySet(), new F<Integer, Collection<String>>() {
^
1 error
Now, the following implementation of getColl() works, of course:
public Collection<String> getColl() {
Option<Integer> iopt = null;
Collection<String> empty = Collections.emptySet();
return iopt.option(empty, new F<Integer, Collection<String>>() {
public Collection<String> f(Integer i) {
return Collections.singleton(i.toString());
}
});
}
and the whole intent of the typesafe methods on Collections is to avoid this sort of issue with the singleton collections (as opposed to using the static variables.) So is the compiler simply unable to perform inference across multiple levels of generics? What's going on?
Java needs a lot of hand holding with its inference. The type system could infer better in a lot of cases but in your case the following will work:
print("Collections.<String>emptySet();");
First you can narrow down your problem to this code:
public class Test {
public void option(Collection<String> b) {
}
public void getColl() {
option(Collections.emptySet());
}
}
This does not work, you need a temporary variable or else the compiler cannot infer the type. Here is a good explanation of this problem: Why do temporary variables matter in case of invocation of generic methods?
Collections.emptySet() is not a Collection<String> unless Java knows that it needs a Collection<String> there. In this case, it appears that the compiler is being somewhat silly about the order in which it tries to determine types: it tries to determine the return type of Collections.emptySet() before working out that the intended type argument for it is actually String.
The solution is to explicitly state that you need Collections.<String>emptySet(), as mentioned by GaryF.
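With that explicit type witness in place, the original getColl() compiles without the temporary variable (a sketch, assuming the rest of the example is unchanged):
public Collection<String> getColl() {
    Option<Integer> iopt = null;
    // The witness fixes emptySet()'s type argument to String up front,
    // so the compiler can then unify B with Collection<String>
    return iopt.option(Collections.<String>emptySet(), new F<Integer, Collection<String>>() {
        public Collection<String> f(Integer i) {
            return Collections.singleton(i.toString());
        }
    });
}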
It looks like a typecasting issue - i.e., that it's being required to cast Object (in Set<Object>, which would be the type of the empty set) to String. Downcasts are not, in the general case, safe.