I recently bumped into the ResultConverter interface in Neo4j when examining the following method on the RestAPIFacade class...
org.neo4j.rest.graphdb.RestAPIFacade.query(String statement, Map<String, Object> params, ResultConverter resultConverter)
Thinking it might be involved in converting a query result to a specified Java class, as the code below suggests:
public interface ResultConverter<T, R> {
R convert(T value, Class<R> type);
ResultConverter NO_OP_RESULT_CONVERTER = new ResultConverter() {
@Override
public Object convert(Object value, Class type) {
return null;
}
};
}
I started digging around for documentation on the usage of the interface and what the types T and R mean (I suspect that R might be the class to convert to), but I've come up short so far. Can anyone give me a heads-up on what this is supposed to do in the context of the query method?
Examples would really help.
Thanks.
If you look at the result type of the query method in RestAPIFacade, you'll see that it returns a QueryResult<T>
On QueryResult you have a bunch of methods to convert your result to other types, and these methods then use the result converter to do the conversion.
public interface QueryResult<T> extends Iterable<T> {
<R> ConvertedResult<R> to(Class<R> type);
<R> ConvertedResult<R> to(Class<R> type, ResultConverter<T, R> resultConverter);
void handle(Handler<T> handler);
}
Where the to methods return a ConvertedResult which then is either an Iterable of type R or has methods to access a single value of type R.
public interface ConvertedResult<R> extends Iterable<R> {
R single();
R singleOrNull();
void handle(Handler<R> handler);
}
So in this case T is Map<String, Object> and R would be your target type. The default Converter supports conversion to node and path objects and vice versa. See the implementation here.
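A self-contained sketch of how the two type parameters line up (the ResultConverter interface is copied from the question; the row map and convertRow helper are illustrative stand-ins, not actual Neo4j code):

```java
import java.util.HashMap;
import java.util.Map;

public class ResultConverterDemo {

    // Same shape as the Neo4j interface quoted above.
    public interface ResultConverter<T, R> {
        R convert(T value, Class<R> type);
    }

    // T is the raw row type the REST API hands back (Map<String, Object>),
    // R is the caller's target type.
    public static <R> R convertRow(Map<String, Object> row, Class<R> type,
            ResultConverter<Map<String, Object>, R> converter) {
        return converter.convert(row, type);
    }

    public static void main(String[] args) {
        // A converter that pulls the "name" column out of a row.
        ResultConverter<Map<String, Object>, String> nameConverter =
                (row, type) -> type.cast(row.get("name"));

        Map<String, Object> row = new HashMap<>();
        row.put("name", "Alice");
        System.out.println(convertRow(row, String.class, nameConverter)); // prints Alice
    }
}
```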
Not sure how to put it in words, so I'll just start with the code I currently have:
// "root type" for all resources
// fixed
public class ResourceClassA
{ }
// "base type" for all resources for a specific domain
// fixed
public class ResourceClassB extends ResourceClassA
{ }
// specific resource type
// can be derived from further but don't think that matters here
// not fixed but heavily constrained in other ways
public class ResourceClassC extends ResourceClassB
{ }
// only needed for a negative example below, irrelevant otherwise
public class ResourceClassD extends ResourceClassB
{ }
// fixed
public class Remote
{
public <T extends ResourceClassA> Set<T> read(Class<T> baseType, Class<? extends T> subType);
}
// semi-fixed: read() signature can be modified
public interface AbstractRemoteAccess<T extends ResourceClassA>
{
Set<T> read(Class<? extends T> clazz);
}
// semi-fixed: read() signature can be modified
public class SpecificRemoteAccess<T extends ResourceClassA> implements AbstractRemoteAccess<T>
{
private Class<T> _baseType;
private Remote _remote;
public Set<T> read(Class<? extends T> clazz)
{
return _remote.read(_baseType, clazz);
}
}
// not fixed
public class ConsumerClass
{
public void doSomething(AbstractRemoteAccess<ResourceClassB> remoteAccess)
{
Set<ResourceClassB> rawObjects = remoteAccess.read(ResourceClassC.class);
Set<ResourceClassC> castedObjects = rawObjects.stream()
.map(c -> (ResourceClassC) c)
.collect(Collectors.toSet());
}
}
All classes marked fixed cannot be changed; they are provided as-is. Vice versa for those marked not fixed. Class SpecificRemoteAccess is the one I'm looking to change: I would like the read() method to
not return its result as Set but as a Set<> of generic type matching clazz
so that the caller does not have to cast the method's result, see ConsumerClass.doSomething()
and all of this without losing type-safety
The easiest way I saw was to do
Set<V> read(Class<V extends T> clazz)
but that produces this error:
Incorrect number of arguments for type Class<T>; it cannot be parameterized with arguments <V, T>
which, if I'm interpreting it correctly, means the compiler is treating V & T as separate type arguments for Class which doesn't match its definition.
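For reference, Java expects a generic method's type parameter and its bound to be declared in a list before the return type; the bound cannot appear inside an argument's parameterization. A minimal stand-alone sketch (Animal/Dog are hypothetical placeholder classes, not part of the question's API):

```java
import java.util.HashSet;
import java.util.Set;

public class GenericMethodDemo {
    public static class Animal {}
    public static class Dog extends Animal {}

    // The type parameter V and its bound are declared before the
    // return type; in the parameter list itself, plain Class<V> is used.
    public static <V extends Animal> Set<V> wrap(Class<V> clazz, V value) {
        Set<V> set = new HashSet<>();
        set.add(clazz.cast(value)); // cast is a no-op here, shown for symmetry
        return set;
    }

    public static void main(String[] args) {
        Set<Dog> dogs = wrap(Dog.class, new Dog()); // V is inferred as Dog
        System.out.println(dogs.size()); // prints 1
    }
}
```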
Next I tried adding a second generic type V and using it as generic type for the return type of read(). I started with
<V extends T> Set<V> read(Class<? extends T> clazz)
which doesn't constrain V to clazz at all, meaning both of these will be accepted by the compiler
Set<ResourceClassC> correct = remoteAccess.read(ResourceClassC.class);
Set<ResourceClassD> incorrect = remoteAccess.read(ResourceClassC.class);
The type declaration for incorrect is semantically wrong but syntactically fine. So I tried to constrain V based on clazz but the only solution I could think of is
<V extends T> Set<V> read(Class<? extends T> clazz, Class<V> classV)
which does, somewhat, fix the problem from above:
// compiles
Set<ResourceClassC> correct = remoteAccess.read(ResourceClassC.class,
ResourceClassC.class);
// error: Type mismatch: cannot convert from Set<ResourceClassC> to Set<ResourceClassD>
Set<ResourceClassD> incorrect = remoteAccess.read(ResourceClassC.class,
ResourceClassC.class);
but not only does it make the read() call cumbersome (users will be wondering why they have to pass the same info twice) but also error prone:
// compiles
Set<ResourceClassC> correct = remoteAccess.read(ResourceClassC.class,
ResourceClassC.class);
// type error
Set<ResourceClassD> incorrect = remoteAccess.read(ResourceClassC.class,
ResourceClassC.class);
// compiles but will cause run-time cast failures
Set<ResourceClassD> incorrect2 = remoteAccess.read(ResourceClassC.class,
ResourceClassD.class);
Given consumer-side developers are faced with hundreds of resource classes like ResourceClassC, making read() error prone simply is no option.
Would appreciate if someone could point out my mistake.
Closed. This question needs debugging details. It is not currently accepting answers.
Closed 5 years ago.
I have an interface with a method parse(String value), and it might have different implementations which return a Map<String, Integer>, a Map<String, String>, or anything else. How can I make this generic enough that I can extend it for different return types?
Currently, I do :
public interface Parser <K,V> {
Map<K,V> parse(String document);
}
But this makes it generic for maps alone. Can someone tell me whether there is a way to make it generic for different return types?
If you want to make your interface generic in the return type, I would suggest an extension of JoeC's comment.
Since Java 8, there is the java.util.function package, providing interfaces for basic transformations. In particular, the interface Function can be used for your purpose. I would suggest an implementation like this:
// file: Parser.java
import java.util.function.Function;
public abstract class Parser<R> implements Function<String, R> {
@Override
public final R apply(String document) {
return (this.parse(document));
}
abstract public R parse(String document);
}
An instantiation for the above example would look like this:
String document = ...;
Parser<Map<K, V>> mapParser = ...; // instantiate a fitting Parser
Map<K, V> result = mapParser.parse(document);
(Given that K and V are generic parameters known in this code block).
You could further specify the interface to obtain a somewhat simpler syntax:
// file: MapParser.java
import java.util.Map;
public abstract class MapParser<K, V> extends Parser<Map<K, V>> {}
With this (empty) class, you can re-write the above code as:
String document = ...;
MapParser<K, V> mapParser = ...; // instantiate a fitting MapParser
Map<K, V> result = mapParser.parse(document);
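To make the "instantiate a fitting MapParser" step concrete, here is a toy word-count implementation; the Parser/MapParser classes mirror the definitions above, and the splitting logic is purely illustrative:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class MapParserDemo {

    public static abstract class Parser<R> implements Function<String, R> {
        @Override
        public final R apply(String document) {
            return this.parse(document);
        }
        public abstract R parse(String document);
    }

    public static abstract class MapParser<K, V> extends Parser<Map<K, V>> {}

    // A toy MapParser that counts word occurrences in the document.
    public static class WordCountParser extends MapParser<String, Integer> {
        @Override
        public Map<String, Integer> parse(String document) {
            Map<String, Integer> counts = new HashMap<>();
            for (String word : document.split("\\s+")) {
                counts.merge(word, 1, Integer::sum);
            }
            return counts;
        }
    }

    public static void main(String[] args) {
        Map<String, Integer> result = new WordCountParser().parse("a b a");
        System.out.println(result.get("a")); // prints 2
    }
}
```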
As mentioned by @matoni, it is possible to write interfaces IParser and IMapParser and set abstract classes Parser and MapParser on top of them:
// file: IParser.java:
import java.util.function.Function;
public interface IParser<R> extends Function<String,R> {
@Override
default public R apply(String document) {
return (this.parse(document));
}
public R parse(String document);
}
// file: IMapParser.java:
import java.util.Map;
public interface IMapParser<K, V> extends IParser<Map<K, V>> {}
// file: Parser.java:
public abstract class Parser<R> implements IParser<R> {
@Override
public final R apply(String document) {
return (this.parse(document));
}
}
// file: MapParser.java:
import java.util.Map;
public abstract class MapParser<K, V> extends Parser<Map<K, V>>
implements IMapParser<K, V> {}
The interfaces provide more flexibility for the user, since one class can implement multiple interfaces but extend only one other class. On the downside, however, the developer of interfaces IParser and IMapParser cannot enforce that the method apply(...) is never overridden. Thus, in theory, an implementer of IParser could implement apply(...) and parse(...) differently, which could lead to unexpected behaviour. When using the abstract classes Parser and MapParser, the developer does enforce that apply(...) calls parse(...), and thus consistent behaviour.
If you want it to return any type, just define it with one generic type like T:
public interface Parser <T> {
T parse(String document);
}
This is possible, but I'm afraid that you will run into new challenges later. Java currently has no way to instantiate a class from a generic type, so you must also pass the class as a parameter:
public interface Parser <T> {
T parse(Class<T> clazz, String document);
}
You can do this, but I think you should reconsider your architecture. If the return type from a document can be anything, in most cases this is a smell of weak design and will lead to spaghetti code.
The comments already gave you a very good hint, but I guess you need an example.
// imports elided
interface Parser<T> {
    T parse(String document);

    static Parser<Map<String, Integer>> mapParser() {
        // replace with actual parsing code
        return document -> {
            Map<String, Integer> result = new HashMap<>();
            result.put(document, document.length());
            return result;
        };
    }

    static Parser<List<String>> listParser() {
        return document -> Collections.singletonList(document);
    }
}
Notice that the implementations are just placeholders - they are just meant to illustrate the Parser types you can create. I also used a lambda which is more concise given that your interface only has one instance method parse(String document), which qualifies it as a FunctionalInterface, allowing you to substitute an anonymous lambda expression when implementing the named interface method.
The caller can then call via:
String document = "abc";
Map<String, Integer> lookup = Parser.mapParser().parse(document);
List<String> list = Parser.listParser().parse(document);
I am trying to create a generic converter interface, which will convert T objects to U objects by using the same method name, i.e. convert:
public interface GenericConverter<T, U> {
T convert(U fromObject);
U convert(T fromObject);
}
Of course, type erasure transforms both methods into the following during compilation:
convert(Object fromObject);
So both methods have the same erasure, which results in an error during compilation.
In my example it is logical that I will always use different object types for T and U. Is there a way to keep the same method name (convert), be able to encapsulate the fact that T and U are different types and ensure that the proper method will be called in each case?
Unless the two types T and U are based in two separate type hierarchies (i.e. each one will always have some distinct superclass), there's no way of having the two methods with the same name. It doesn't even make sense semantically in that case: what should be the semantic difference between the two methods if you cannot distinguish the two types in any reasonable manner?
Apart of the suggested renaming of the methods, consider also only having one such method in the interface and instead using a GenericConverter<T, U> and GenericConverter<U, T> wherever you need to transform both ways.
It's not directly possible due to type erasure. Several options have already been listed in the other answers. One of them implicitly aimed at separating the conversions. So instead of having a single converter, you could have two distinct ones:
public interface GenericConverter<T, U> {
U convert(T fromObject);
}
GenericConverter<Integer, String> forward = Converters.integerString();
GenericConverter<String, Integer> backward = Converters.stringInteger();
But note that the GenericConverter interface in this cases is structurally equal to the Function interface - so there is probably no reason to create a new one.
Instead, if you want to have this "forward and backward converter" as some sort of a named entity (with both conversion functions inseparably linked together), you could define an interface for that:
public interface GenericConverter<T, U> {
Function<T, U> forward();
Function<U, T> backward();
}
This could be used as follows:
GenericConverter<Integer, String> converter = Converters.integerString();
String string = converter.forward().apply(someInteger);
Integer integer = converter.backward().apply(someString);
Whether or not this is the "best" solution here depends on the intended usage patterns. One advantage could be that, with a generic (!) utility function like this...
private static <T, U> GenericConverter<T, U> create(
        Function<T, U> forward, Function<U, T> backward) {
    return new GenericConverter<T, U>() {
        @Override
        public Function<T, U> forward() {
            return forward;
        }
        @Override
        public Function<U, T> backward() {
            return backward;
        }
    };
}
creating a new converter would be easy as pie:
public static GenericConverter<Integer, String> integerString() {
return create(
integer -> String.valueOf(integer),
string -> Integer.parseInt(string)
);
}
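Putting the pieces together, a self-contained version of this answer's design (the Converters holder class is collapsed into one demo class; integerString is the hypothetical factory from above):

```java
import java.util.function.Function;

public class ConverterPairDemo {

    public interface GenericConverter<T, U> {
        Function<T, U> forward();
        Function<U, T> backward();
    }

    // Generic utility: builds a converter from two functions.
    static <T, U> GenericConverter<T, U> create(
            Function<T, U> forward, Function<U, T> backward) {
        return new GenericConverter<T, U>() {
            @Override
            public Function<T, U> forward() { return forward; }
            @Override
            public Function<U, T> backward() { return backward; }
        };
    }

    public static GenericConverter<Integer, String> integerString() {
        return create(String::valueOf, Integer::parseInt);
    }

    public static void main(String[] args) {
        GenericConverter<Integer, String> converter = integerString();
        String string = converter.forward().apply(42);     // "42"
        Integer back = converter.backward().apply(string); // 42
        System.out.println(string + " " + back);
    }
}
```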
Problem
When you say,
it is logical that I will always use different object types for T and U
the compiler does not know that. Types can be forced to be the same, but not to be different (without constraints).
Approach 1
class ConversionSource {}
class ConversionTarget {}
interface GenericConverter<T extends ConversionSource, U extends ConversionTarget> {
T convert(U obj);
U convert(T obj);
}
Now the erasures are different. You get the behavior you want, but usage is severely restricted because of the constraints.
Approach 2
interface InvertibleConverter<T, U> {
U convert(T obj);
InvertibleConverter<U, T> inverse();
}
class Tokenizer implements InvertibleConverter<String, Stream<String>> {
    @Override
    public Stream<String> convert(String obj) {
        return Arrays.stream(obj.split(" "));
    }
    @Override
    public InvertibleConverter<Stream<String>, String> inverse() {
        return new InvertibleConverter<Stream<String>, String>() {
            @Override
            public String convert(Stream<String> obj) {
                return obj.collect(Collectors.joining(" "));
            }
            @Override
            public InvertibleConverter<String, Stream<String>> inverse() {
                return Tokenizer.this;
            }
        };
    }
}
Usage can be as follows
InvertibleConverter<String, Stream<String>> splitter = new Tokenizer();
String sentence = "This is a sentence";
Stream<String> words = splitter.convert(sentence);
String sameSentence = splitter.inverse().convert(words);
This approach works even when T and U are identical.
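To illustrate that last point, a sketch where both type arguments are String (upper/lower-casing serves as a toy, not-quite-exact inverse):

```java
import java.util.Locale;

public class SameTypeInverseDemo {

    public interface InvertibleConverter<T, U> {
        U convert(T obj);
        InvertibleConverter<U, T> inverse();
    }

    // T and U are both String; there is still only one convert
    // method, so erasure causes no clash.
    public static InvertibleConverter<String, String> upperCaser() {
        return new InvertibleConverter<String, String>() {
            @Override
            public String convert(String obj) {
                return obj.toUpperCase(Locale.ROOT);
            }
            @Override
            public InvertibleConverter<String, String> inverse() {
                InvertibleConverter<String, String> outer = this;
                return new InvertibleConverter<String, String>() {
                    @Override
                    public String convert(String obj) {
                        return obj.toLowerCase(Locale.ROOT);
                    }
                    @Override
                    public InvertibleConverter<String, String> inverse() {
                        return outer;
                    }
                };
            }
        };
    }

    public static void main(String[] args) {
        InvertibleConverter<String, String> c = upperCaser();
        System.out.println(c.convert("abc"));           // prints ABC
        System.out.println(c.inverse().convert("ABC")); // prints abc
    }
}
```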
Hope this helps.
Good luck
I am trying to program a kind of registry for objects of different classes.
I have the following:
public interface DbObject{
void setId(long id);
Long getId();
}
A prototypic class implementing this interface would be the following:
public class BasicDbObject implements DbObject {
    private Long id = null;
    public void setId(long id) {
        this.id = id;
    }
    public Long getId() {
        return id;
    }
}
I want to build various different Implementations of this Interface.
And I want to be able to have a Map object, that maps from each implementing class to a Map of instances.
Something like this:
Map <Class<C implements DbObject> , Map<Long, C>> registry = new TreeMap/HashMap/SomeOtherKindOfMap (...)
I know I could do something like
Map <String,Map<Long,DbObjects>> registry = new ...
But this way I would have to write some more code for determining names, comparing classes and so on. Is there an easier way to accomplish this?
So what I want to know: is it possible to have class objects as keys in a tree map?
What would be the syntax to declare a map object, that maps from implementing classes C to a map objects each mapping from a long object (the id) to instances of C?
I want to be able to do requests like the following:
BasicDbObject bo = registry.get(BasicDbObject.class).get(42);
assuming I did
BasicDbObject bo = new BasicDbObject(...);
innerMap = new SomeMap<Long, BasicDbObject>();
innerMap.put(42, bo);
registry.put(BasicDbObject.class, innerMap);
before.
Please tell me if this is still not clear; I have difficulty explaining, since English is not my mother tongue.
Thank you in advance.
Edit:
It turns out, i can do something very close to what I want, when defining a generic class around the map:
public class ObjectRegistry <T extends DbObject>{
private HashMap<Class<T>, TreeMap<Long,T>> registry=null;
ObjectRegistry(){
registry=new HashMap<Class<T>, TreeMap<Long,T>>();
}
public void register(T dbObject){
TreeMap<Long, T> map = registry.get(dbObject.getClass());
if (map==null){
map=new TreeMap<Long,T>();
registry.put((Class<T>) dbObject.getClass(),map);
}
map.put(dbObject.getId(),dbObject);
}
public <T extends DbObject>T get(Class<T> objectClass,long id){
TreeMap<Long, T> map = (TreeMap<Long, T>) registry.get(objectClass);
if (map != null){
return map.get(id);
}
return null;
}
public TreeMap<Long,T> getAll(Class<T> dbObjectClass) {
return registry.get(dbObjectClass);
}
}
I use a TreeMap for the inner mappings since I want to easily return Class instances sorted by id.
So the refined question is:
Is there a way to do this, without the <T extends DbObject> clause in the Class head?
Edit 2:
Thinking through it again, it turns out that John's answer is exactly the solution to this.
Here is my final code:
HashMap<Class<? extends DbObject>, TreeMap<Long, ? extends DbObject>> registry = null;
public <T extends DbObject> T get(Class<T> clazz, long id) {
TreeMap<Long, T> map = (TreeMap<Long, T>) registry.get(clazz);
if (map != null) {
return map.get(id);
}
return null;
}
public <T extends DbObject> void register(T dbObject) {
TreeMap<Long, T> map = (TreeMap<Long, T>) registry.get(dbObject.getClass());
if (map == null) {
map = new TreeMap<Long, T>();
registry.put((Class<T>) dbObject.getClass(), map);
}
map.put(dbObject.getId(), dbObject);
}
public <T extends DbObject> TreeMap<Long, T> getAll(Class<T> dbObjectClass) {
return (TreeMap<Long, T>) registry.get(dbObjectClass);
}
It does not need the <T extends DbObject> clause in the Class head.
So what I want to know: is it possible to have class objects as keys in a tree map?
TreeMap depends on there being a total order over the key space, as established by the key type having a natural order (by implementing Comparable) or by a separate Comparator object that you provide. Classes do not have a natural order. It is conceivable that you could write a suitable Comparator, but that seems very contrived to me.
But why do you specifically need a TreeMap? You didn't describe any requirement that would not be at least as well addressed by any other kind of Map. In particular, I almost always find HashMap to be a better choice, and I don't see any reason why it would be unsuitable in this one. It can certainly have objects of type Class as keys.
Moreover, if indeed you don't need any particular implementation, then you are best off declaring the type simply as a Map. That way you can actually provide any Map implementation, and even change which one you do provide if you ever discover a reason to do so.
What would be the syntax to declare a map object, that maps from implementing classes C to a map objects each mapping from a long object (the id) to instances of C?
You ask that the constraints on the type of each value be dependent on the type of the associated key, but there is no way to declare a type that enforces such a relationship. Whether a particular key or a particular value is appropriate for the Map is a function of the type of the map alone, not of each others' type.
You can write generic methods around access to your map that provide the appearance of what you want, but the data retrieval methods will need to cast. For example:
Map<Class<? extends DbObject>, Map<Long, ? extends DbObject>> registry = /*...*/;
<T extends DbObject> Map<Long, T> getRegistryMap(Class<T> clazz) {
return (Map<Long, T>) registry.get(clazz);
}
<T extends DbObject> T get(Class<T> clazz, Long id) {
Map<Long, T> map = getRegistryMap(clazz);
return (map == null) ? null : map.get(id);
}
<T extends DbObject> T put(Class<T> clazz, Long id, T obj) {
Map<Long, T> map = getRegistryMap(clazz);
if (map == null) {
map = new HashMap<>();
registry.put(clazz, map);
}
return map.put(id, obj);
}
Updated to add:
So the refined question is: Is there a way to do this, without the <T extends DbObject> clause in the Class head?
Yes, what I already wrote. Just slap a plain class declaration around it. You do not need a generic class to have generic methods. In fact, the two are orthogonal. Regular methods of a generic class can use that class's type parameters. That does not make them generic methods. A method is generic if it declares its own type parameter(s), as mine above do. Your get() method also does that, and it is important to understand that the type parameter <T> you declare explicitly in the method signature shadows the class's type parameter of the same name: it is a different T.
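For completeness, a compilable, self-contained version of the pattern described above (BasicDbObject is the class from the question, given public accessors; the unchecked casts stay confined to the registry methods):

```java
import java.util.HashMap;
import java.util.Map;

public class RegistryDemo {

    public interface DbObject {
        void setId(long id);
        Long getId();
    }

    public static class BasicDbObject implements DbObject {
        private Long id = null;
        public void setId(long id) { this.id = id; }
        public Long getId() { return id; }
    }

    private final Map<Class<? extends DbObject>, Map<Long, ? extends DbObject>> registry =
            new HashMap<>();

    @SuppressWarnings("unchecked")
    public <T extends DbObject> T get(Class<T> clazz, Long id) {
        Map<Long, T> map = (Map<Long, T>) registry.get(clazz);
        return (map == null) ? null : map.get(id);
    }

    @SuppressWarnings("unchecked")
    public <T extends DbObject> void register(T dbObject) {
        Map<Long, T> map = (Map<Long, T>) registry.get(dbObject.getClass());
        if (map == null) {
            map = new HashMap<>();
            registry.put((Class<T>) dbObject.getClass(), map);
        }
        map.put(dbObject.getId(), dbObject);
    }

    public static void main(String[] args) {
        RegistryDemo reg = new RegistryDemo();
        BasicDbObject bo = new BasicDbObject();
        bo.setId(42L);
        reg.register(bo);
        System.out.println(reg.get(BasicDbObject.class, 42L) == bo); // prints true
    }
}
```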
I want to specify a method in an interface. The signature of the method in an implementation (JPA 2 based) is:
List<T> getByStringValue(SingularAttribute<? super T, String> attribute, String value)
I want to specify this method in an interface (Object<? super T, String> is not possible) that abstracts from JPA. The implementing method could have a different signature, but I want it to accept SingularAttribute<? super T, String> and I want to use it in a type-safe query without casting around.
In the end I want to specify all entity interaction in a "Repository" interface and provide one JPA-based implementation that covers most of it for all entities (to minimize redundant code). I have finished it for CRUD ops and getAll. Now I wish I could provide a generic approach for getByCriteria (one criterion is enough at the moment).
Java doesn't allow the non-generic-parameter part to itself be some kind of wildcard, e.g. any class that has certain generic parameters:
<T> void method(*<T, String> o) // can't do this
But you can define an abstract type and have all classes you want to use like this implement it, something like:
interface StringGetter<T> {
T getByStringValue(String value);
}
then:
public class WidgetStringGetter implements StringGetter<Widget> {
public Widget getByStringValue(String value) {
// whatever
}
}
and:
public static <T> T factory(StringGetter<T> getter, String value) {
return getter.getByStringValue(value);
}
finally:
Widget w = factory(new WidgetStringGetter(), "foo");
Try the following.
<R, T extends R> List<T> getByStringValue(SingularAttribute<R, String> attribute, String value)
The <R, T extends R> defines generic parameters for use in the method signature.
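A stand-alone illustration of how <R, T extends R> behaves (Attribute is a toy stand-in for JPA's SingularAttribute, and the method body is a placeholder; only the signature matters here):

```java
import java.util.ArrayList;
import java.util.List;

public class SuperBoundDemo {

    // Toy stand-in for SingularAttribute<X, Y>: an attribute of entity
    // type X whose value has type Y. NOT the real JPA class.
    public static class Attribute<X, Y> {
        public final String name;
        public Attribute(String name) { this.name = name; }
    }

    public static class Animal {}
    public static class Dog extends Animal {}

    // T extends R lets a query for Dog reuse an Attribute declared on
    // Animal, mirroring the effect of SingularAttribute<? super T, String>.
    public static <R, T extends R> List<T> getByStringValue(
            Attribute<R, String> attribute, String value) {
        return new ArrayList<>(); // a real implementation would query here
    }

    public static void main(String[] args) {
        Attribute<Animal, String> name = new Attribute<>("name");
        // R = Animal, T = Dog compiles because Dog extends Animal.
        List<Dog> dogs = SuperBoundDemo.<Animal, Dog>getByStringValue(name, "Rex");
        System.out.println(dogs.size()); // prints 0
    }
}
```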