Subclasses with Java 8 lambdas and Optional

I don't understand why the following code doesn't compile:
private ResponseEntity<JSendResponse> buildResponse(RequestModel requestModel,
                                                    RequestModelParamConverter paramConverter,
                                                    Supplier<String> xsdSupplier,
                                                    Supplier<String> xmlTemplateSupplier) {
    return Optional.ofNullable(new RequestErrorHandler<>().validate(validator, requestModel))
            .map(validationErrors -> new ResponseEntity<>(validationErrors, HttpStatus.BAD_REQUEST))
            .orElse(this.buildResponseForValidRequest(requestModel, paramConverter, xsdSupplier, xmlTemplateSupplier));
}
Compile error:
orElse(org.springframework.http.ResponseEntity<com.company.util.response.JSendFailResponse>) in Optional cannot be applied to (org.springframework.http.ResponseEntity<com.company.util.response.JSendResponse>)
While this code (which I think is logically the same code) does compile:
private ResponseEntity<JSendResponse> buildResponse(RequestModel requestModel,
                                                    RequestModelParamConverter paramConverter,
                                                    Supplier<String> xsdSupplier,
                                                    Supplier<String> xmlTemplateSupplier) {
    JSendResponse validationErrors = new RequestErrorHandler<>().validate(validator, requestModel);
    if (validationErrors == null) {
        return this.buildResponseForValidRequest(requestModel, paramConverter, xsdSupplier, xmlTemplateSupplier);
    } else {
        return new ResponseEntity<>(validationErrors, HttpStatus.BAD_REQUEST);
    }
}
The problem seems to be that the RequestErrorHandler<>().validate method returns a class called JSendFailResponse if the validation fails. JSendFailResponse is a subclass of JSendResponse, which is what is ultimately returned. It seems like the lambda code is not able to understand that JSendFailResponse is a JSendResponse.
I can get it to compile if I cast validationErrors to a JSendResponse in the map lambda, but then I lose some of the fields on the JSendFailResponse.
EDIT: This code also fails to compile:
private ResponseEntity<? extends JSendResponse> buildResponse(RequestModel requestModel,
                                                              RequestModelParamConverter paramConverter,
                                                              Supplier<String> xsdSupplier,
                                                              Supplier<String> xmlTemplateSupplier) {
    return Optional.ofNullable(new RequestErrorHandler<>().validate(validator, requestModel))
            .map(validationErrors -> new ResponseEntity<>(validationErrors, HttpStatus.BAD_REQUEST))
            .orElse(this.buildResponseForValidRequest(requestModel, paramConverter, xsdSupplier, xmlTemplateSupplier));
}
EDIT2: Here is a simplified example you can copy/paste into your IDE to see for yourself.
import java.util.*;

public class GemLamDemo {

    public static void main(String... args) {
        GemLamDemo gld = new GemLamDemo();
        gld.getList(null);
    }

    public List<? extends TypeA> getList(TypeA ta) {
        return Optional.ofNullable(ta)
                .map(a -> new ArrayList<TypeA>(Arrays.asList(a)))
                .orElse(new ArrayList<TypeB>(Arrays.asList(new TypeB())));
    }

    public class TypeA {
    }

    public class TypeB extends TypeA {
    }
}
EDIT3: I was thinking I understood this issue based on the help I've received so far, but the following code compiles and works.
Optional.ofNullable(val1)
        .map(a -> new TypeA())
        .orElse(new TypeB());
So the issue does not seem to be that the map and the orElse must return the same type; it seems to be related to parameterization. map can emit TypeA and orElse can emit TypeB if it's a subclass of TypeA, but they cannot emit differing parameterized types of List. List<TypeA> and List<TypeB> don't seem to be considered subtypes of each other, and now that I think about it, they aren't.
ResponseEntity<JSendResponse> is a different type than ResponseEntity<JSendFailResponse>. If I were returning plain JSendResponse and JSendFailResponse from the lambdas, that would probably work. But I'm not; I'm emitting different versions of ResponseEntity, which are not really related by hierarchy. So I guess it comes down to how Optional supports (or doesn't support) wildcard generics. I can't set the type of the Optional to ResponseEntity<? extends JSendResponse>, so I am limited to strict type hierarchies.
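For illustration, a quick sketch of that invariance using the TypeA/TypeB classes from EDIT2 (inside GemLamDemo, since they are inner classes):
List<TypeB> bs = new ArrayList<>(Arrays.asList(new TypeB()));
// List<TypeA> as = bs;            // does not compile: List<TypeB> is not a List<TypeA>
List<? extends TypeA> ok = bs;     // compiles: the wildcard accepts any List of a TypeA subtype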
EDIT4:
The EDIT3 example is incorrect because the types are switched from the original case. I think I get it now, thanks everybody.

The type emitted from map() is inferred as ResponseEntity<JSendFailResponse>, but you're offering a ResponseEntity<JSendResponse> in orElse(), and both types must agree.
Explicitly type the call to map() with the common type:
.<ResponseEntity<JSendResponse>>map(validationErrors -> new ResponseEntity<>(validationErrors, HttpStatus.BAD_REQUEST))
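Applied to the simplified GemLamDemo example from EDIT2, the same idea might look like this (a sketch; the explicit type argument makes both branches agree on List<? extends TypeA>):
public List<? extends TypeA> getList(TypeA ta) {
    return Optional.ofNullable(ta)
            .<List<? extends TypeA>>map(a -> new ArrayList<>(Arrays.asList(a)))
            .orElse(new ArrayList<>(Arrays.asList(new TypeB())));
}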

If you check the Oracle documentation (https://docs.oracle.com/javase/8/docs/api/java/util/Optional.html):
The signature of the ofNullable() method is: public static <T> Optional<T> ofNullable(T value)
and orElse(): public T orElse(T other)
So the parameter type of both methods is T. Since orElse() takes a plain T rather than ? extends T, both branches have to produce exactly the same type, which is why your method doesn't compile, I guess.
You could try something like this (using your simplified example):
public List<TypeA> getList(TypeA ta) {
    return Optional.ofNullable(ta)
            .map(a -> new ArrayList<TypeA>(Arrays.asList(a)))
            .orElse(new ArrayList<TypeA>(Arrays.asList(new TypeB())));
}

public class TypeA {
}

public class TypeB extends TypeA {
}
Hope this helps

Related

How to fix "Unsafe interpretation of method return type" in Eclipse null annotation analysis

Using Eclipse 2019-12.
I'm defining a method like this:
public static <T> T execute(@NonNull Supplier<T> action) {
    return action.get();
}
Now, if I do this:
public @NonNull String foo() { return ""; }

public String doSomething() {
    return MyClass.execute(() -> foo());
}
Eclipse shows an info/warning message on the doSomething body saying:
Unsafe interpretation of method return type as '@NonNull' based on substitution 'T=@NonNull String'. Declaring type 'MyClass' doesn't seem to be designed with null type annotations in mind
So, I can understand that having MyClass.execute(Supplier) return T and passing a Supplier<T> where T is @NonNull doesn't necessarily mean that the return type is also @NonNull, because in general MyClass.execute(Supplier) could be implemented like this:
public static <T> T execute(Supplier<T> action) {
    if (action.getClass().getSimpleName().startsWith("a"))
        return null;
    return action.get();
}
in which case, even if the specified Supplier's T type variable is @NonNull, MyClass.execute(Supplier) still returns null in some cases (Map.get(Object) is a concrete example: it can return null if the specified key is missing, even if the Map is declared with a @NonNull value type).
But since MyClass.execute(Supplier) is under my control, I would like to understand whether there's a way to write it so that it expresses, without ambiguity, that its return type is non-null if the specified Supplier has a @NonNull return value, and nullable if the specified Supplier has a @Nullable return value.
Also, it's not clear to me why this does NOT generate the above info/warning:
public String doSomething() {
    return MyClass.execute(() -> "");
}
that is, I simply inlined the foo() method call in the above example.
The message was shown incorrectly in Eclipse 2019-12 (4.14) and before.
In Eclipse 2020-03 (4.15) annotation-based null analysis has been improved. A new check (Unsafe conversion of annotated parameterized type to less-annotated type) has been added, and several fixes have been made that also affect your example.
There are no messages shown in the following code anymore:
public static <T> T execute(@NonNull Supplier<T> action) {
    return action.get();
}

public @NonNull String foo() {
    return "";
}

public @NonNull String doSomething() {
    return MyClass.execute(() -> foo());
}
But if you change foo() to return a @Nullable value, then in the line return MyClass.execute(() -> foo()); the following problem is correctly shown:
Null type safety (type annotations): The expression of type 'String' needs unchecked conversion to conform to '#NonNull String'
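For reference, the @Nullable variant described above might look like this (a sketch, reusing the MyClass.execute from before and the org.eclipse.jdt.annotation annotations):
public @Nullable String foo() {
    return null;
}

public @NonNull String doSomething() {
    // Eclipse 4.15+ reports the unchecked-conversion problem quoted above on this line
    return MyClass.execute(() -> foo());
}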
For the new Unsafe conversion of annotated parameterized type to less-annotated type option see:
Eclipse 4.15 - New and Noteworthy
Blog post Interfacing null-safe code with legacy code by Stephan Herrmann
My video showing this in action

Design generic interface for data object used throughout a service

Been migrating over some legacy code and I came across this.
@Getter
@Setter
public class CollectedData
{
    SkillResponse skills;
    TrendingResponse storyMatch;
    Map<String, String> slotData;
    Map<String, String> slotDataSecondSource;
    Boolean hasSlots;
    Boolean hasSlotsSecondSource;
    KnowledgeRequest request;
}
Since I've been using Java 8 and am accustomed to streams, I started to restructure this response class as:
@Getter
@Setter
public class CollectedData
{
    List<DataSupplierResponse> dataSupplierResponses;
    Metadata metadata;
}
where DataSupplierResponse was to be an interface defined like so:
public interface DataSupplierResponse<T>
{
    DataSupplierType getDataSupplierType();
    T getSupplierResponse();
}
Implementation Example:
public class DataSupplierResponseImpl implements DataSupplierResponse<TrendingResponse>
{
    private TrendingResponse mTrendingResponse;

    public DataSupplierResponseImpl(TrendingResponse trendingResponse)
    {
        mTrendingResponse = trendingResponse;
    }

    @Override
    public DataSupplierType getDataSupplierType()
    {
        return DataSupplierType.TRENDING_STORY;
    }

    @Override
    public TrendingResponse getSupplierResponse()
    {
        return mTrendingResponse;
    }
}
The goal is to run certain predicates depending on the CollectedData.
Optional<DataSupplierResponse> first = data.getDataSupplierResponses().stream()
        .filter(res -> res.getDataSupplierType().equals(DataSupplierType.TRENDING_STORY))
        .findFirst();
This would need a cast in order to get the right object, because with the raw type getSupplierResponse() returns Object:
TrendingResponse match = first.get().getSupplierResponse();
Thus, when I started refactoring, I tried to solve this data-availability issue by creating a generic interface that returns different data. To make this code work I would have to cast the return value of getSupplierResponse, which defeats the purpose of using generics. I need to make this data-carrier object as clean and beautiful as possible for my own sake. Any ideas how I should structure these classes, and/or how to use generics to solve this problem?
EDIT: I know the StackOverflow community likes to enforce objective, concrete answers, but where else can I go for design questions?
You have to specify the List in CollectedData with generics as well, e.g.:
List<DataSupplierResponse> dataSupplierResponse;
should actually be:
List<DataSupplierResponse<YourType>> dataSupplierResponse;
where YourType corresponds to the type of the response. That is because when using a raw type (a generic class without actually specifying a type argument), all generic information for that class is eliminated. That's why it is returning Object and you have to cast manually.
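A small sketch of the difference (the method and parameter names here are made up just for illustration):
// Raw element type: the generic information is erased, so the result is Object.
void illustrate(List<DataSupplierResponse> raw,
                List<DataSupplierResponse<TrendingResponse>> typed) {
    Object o = raw.get(0).getSupplierResponse();              // raw type: cast required downstream
    TrendingResponse t = typed.get(0).getSupplierResponse();  // parameterized: no cast needed
}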
Unless used in other places, I'd get rid of the enumeration type DataSupplierType as the classes TrendingResponse (and others) already provide a discrimination criteria.
(Also keep in mind that enums are full classes)
The perfect response to this would have you implement a basic type for your response, e.g:
interface Response {
    int getScore(); // implement in subclasses
    String getName(); // implement in subclasses
}

class TrendingResponse implements Response {}
class SkillResponse implements Response {}
class OtherResponse implements Response {}
but this is not strictly necessary.
At this point just a generic wrapper class would be enough (no need to have a base interface and extend it for each type of response):
class DataSupplierResponse<T extends Response> {
    private T t;

    public DataSupplierResponse(final T t) {
        this.t = t;
    }

    public T getSupplierResponse() {
        return this.t;
    }
}
This would allow you to call:
Optional<DataSupplierResponse<?>> first = data.responses
        .stream()
        .filter(response -> TrendingResponse.class.isInstance(response.getSupplierResponse()))
        .findFirst();
first.ifPresent(o -> o.getSupplierResponse().getScore());
or simply
Optional<?> first = data.responses
        .stream()
        .filter(response -> TrendingResponse.class.isInstance(response.getSupplierResponse()))
        .map(r -> r.getSupplierResponse())
        .findFirst();
Even without the base interface Response (which is useful only for defining common behavior in your responses), your class DataSupplierResponse<T> wouldn't need the enumeration type.
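As a further option (not required by the above): if lookups are always by response type, a small helper keyed on a Class token can confine the only cast to one place. A sketch, assuming the wrapper class above and that CollectedData exposes its responses as a List of DataSupplierResponse<? extends Response>; the helper name findResponse is made up:
// Hypothetical helper: finds the first wrapped response of the requested type.
static <R extends Response> Optional<R> findResponse(
        List<? extends DataSupplierResponse<? extends Response>> responses, Class<R> type) {
    return responses.stream()
            .map(r -> r.getSupplierResponse())
            .filter(type::isInstance)
            .map(type::cast)   // the single, checked cast
            .findFirst();
}

// Usage:
// Optional<TrendingResponse> trending = findResponse(data.getDataSupplierResponses(), TrendingResponse.class);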

Confusion in choosing a data structure

I want to create a Map<Long, Enum<? extends SomeInterface>[]>. Which is the best option for me?
I tried this one
private Map<Long, Enum<? extends SomeInterface>[]> questionIdToanswersMapping = Collections.unmodifiableMap(Stream.of(
        new SimpleEntry<>(QuestionEnum1.getQuestionId(), AnswerEnum1.values()),
        new SimpleEntry<>(QuestionEnum2.getQuestionId(), AnswerEnum2.values()),
        new SimpleEntry<>(QuestionEnum3.getQuestionId(), AnswerEnum3.values()),
        new SimpleEntry<>(QuestionEnum4.getQuestionId(), AnswerEnum4.values()),
        new SimpleEntry<>(QuestionEnum5.getQuestionId(), AnswerEnum5.values()))
        .collect(Collectors.toMap((e) -> e.getKey(), (e) -> e.getValue())));
But it gives the error "cannot convert from Map<Object,Object> to Map<Long,Enum<? extends SomeEnum>[]>". I am new to this. Please help!
I need an unmodifiable map from question id to the corresponding possible answer values. The possible answers are enums.
The possible answers are wrapped like this:
public class RecognizedAnswers {
    public enum AnswerEnum1 implements SomeInterface;
    public enum Answer2 implements SomeInterface;
}
I think there is a small problem with naming:
You cannot extend one enum with another in Java; use an interface with the desired method instead, as below.
The following code works fine:
@Test
public void test() {
    Map<Long, Enum<? extends SomeEnum>[]> questionIdToanswersMapping = Collections.unmodifiableMap(Stream.of(
            new AbstractMap.SimpleEntry<>(QuestionEnum1.A.getQuestionId(), AnswerEnum1.values()),
            new AbstractMap.SimpleEntry<>(QuestionEnum1.B.getQuestionId(), AnswerEnum1.values()),
            new AbstractMap.SimpleEntry<>(QuestionEnum1.C.getQuestionId(), AnswerEnum2.values()),
            new AbstractMap.SimpleEntry<>(QuestionEnum1.D.getQuestionId(), AnswerEnum2.values()))
            .collect(Collectors.toMap((e) -> e.getKey(), (e) -> e.getValue())));
    System.out.print(questionIdToanswersMapping.size());
}

enum QuestionEnum1 {
    A, B, C, D;

    Long getQuestionId() {
        return (long) name().hashCode(); // my mocked values
    }
}

interface SomeEnum {
}

enum AnswerEnum1 implements SomeEnum {
}

enum AnswerEnum2 implements SomeEnum {
}
I tried to replicate your example (since you obfuscated the enum types, I made up my own) and it appears to compile just fine:
enum SomeEnum { FOO, BAR }
private Map<Long, Enum<? extends SomeEnum>[]> exampleMap =
Collections.unmodifiableMap(Stream.of(
new SimpleEntry<>(1L, SomeEnum.values()))
.collect(Collectors.toMap(SimpleEntry::getKey, SimpleEntry::getValue)));
My guess is that you have either a missing parenthesis, or your QuestionEnum1.getQuestionId() returns an int rather than a long, and those things are confusing the compiler enough that it can't give a clear error message.
I'll note that the Stream API really isn't a clean way to construct a constant map. Simply building such a map "normally" with Map.put() will likely be simpler and easier to read, even if it requires a static {} block or a helper function. You can do even better with Guava's immutable collections, which could be used like so:
private final ImmutableMap<Long, Enum<? extends SomeEnum>[]> questionIdToanswersMapping =
        ImmutableMap.<Long, Enum<? extends SomeEnum>[]>builder()
                .put(QuestionEnum1.getQuestionId(), AnswerEnum1.values())
                .put(QuestionEnum2.getQuestionId(), AnswerEnum2.values())
                .put(QuestionEnum3.getQuestionId(), AnswerEnum3.values())
                .put(QuestionEnum4.getQuestionId(), AnswerEnum4.values())
                .put(QuestionEnum5.getQuestionId(), AnswerEnum5.values())
                .build();
Much clearer and easier to read (and write).
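For completeness, the plain Map.put() construction mentioned above might look like this (a sketch reusing the question's QuestionEnum/AnswerEnum names and the SomeEnum interface from the snippet above, assuming the answer enums implement it and getQuestionId() returns a Long):
private static final Map<Long, Enum<? extends SomeEnum>[]> QUESTION_ID_TO_ANSWERS;

static {
    Map<Long, Enum<? extends SomeEnum>[]> answers = new HashMap<>();
    answers.put(QuestionEnum1.getQuestionId(), AnswerEnum1.values());
    answers.put(QuestionEnum2.getQuestionId(), AnswerEnum2.values());
    // ... and so on for the remaining questions
    QUESTION_ID_TO_ANSWERS = Collections.unmodifiableMap(answers);
}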

Mockito and Hamcrest: how to verify invocation of Collection argument?

I'm running into a generics problem with Mockito and Hamcrest.
Please assume the following interface:
public interface Service {
    void perform(Collection<String> elements);
}
And the following test snippet:
Service service = mock(Service.class);
// ... perform business logic
verify(service).perform(Matchers.argThat(contains("a", "b")));
So I want to verify that my business logic actually called the service with a collection that contains "a" and "b" in that order.
However, the return type of contains(...) is Matcher<Iterable<? extends E>>, so Matchers.argThat(...) returns Iterable<String> in my case, which naturally does not apply to the required Collection<String>.
I know that I could use an argument captor as proposed in Hamcrest hasItem and Mockito verify inconsistency, but I would very much like not to.
Any suggestions?
Thanks!
You can just write
verify(service).perform((Collection<String>) Matchers.argThat(contains("a", "b")));
From the compiler's point of view, this is casting an Iterable<String> to a Collection<String>, which is fine because the latter is a subtype of the former. At run time, argThat returns null, so that null can be passed to perform without a ClassCastException. The important point is that argThat registers the matcher in Mockito's internal structure of arguments for verification, which is what argThat is for.
As an alternative one could change the approach to ArgumentCaptor:
#SuppressWarnings("unchecked") // needed because of `List<String>.class` is not a thing
// suppression can be worked around by using #Captor on a field
ArgumentCaptor<List<String>> captor = ArgumentCaptor.forClass(List.class);
verify(service).perform(captor.capture());
assertThat(captor.getValue(), contains("a", "b"));
Notice, that as a side effect this decouples the verification from the Hamcrest library, and allows you to use any other library (e.g. Truth):
assertThat(captor.getValue()).containsExactly("a", "b");
If you get stuck in situations like these, remember that you can write a very small reusable adapter.
verify(service).perform(argThat(isACollectionThat(contains("foo", "bar"))));

private static <T> Matcher<Collection<T>> isACollectionThat(
        final Matcher<Iterable<? extends T>> matcher) {
    return new BaseMatcher<Collection<T>>() {
        @Override public boolean matches(Object item) {
            return matcher.matches(item);
        }

        @Override public void describeTo(Description description) {
            matcher.describeTo(description);
        }
    };
}
Note that David's solution above, with casting, is the shortest right answer.
You can put your own lambda as an ArgumentMatcher:
when(myClass.myMethod(argThat(arg -> arg.containsAll(asList(1, 2)))))
        .thenReturn(...);
Why not just verify with the expected arguments, assuming the list only contains the two items, e.g.:
final List<String> expected = Lists.newArrayList("a", "b");
verify(service).perform(expected);
Whilst I agree with Eugen in principle, I think that relying on equals for String comparison is acceptable... besides, the contains matcher uses equals for comparison anyway.
Similar to another answer here you can do the following:
verify(yourmock, times(1)).yourmethod(argThat(arg -> arg.containsAll(asList("a", "b"))));
You could have your own java.util.Collection implementation and override the equals method like below.
public interface Service {
    void perform(Collection<String> elements);
}

@Test
public void testName() throws Exception {
    Service service = mock(Service.class);
    service.perform(new HashSet<String>(Arrays.asList("a", "b")));
    Mockito.verify(service).perform(Matchers.eq(new CollectionVerifier<String>(Arrays.asList("a", "b"))));
}

public class CollectionVerifier<E> extends ArrayList<E> {

    public CollectionVerifier() {
    }

    public CollectionVerifier(final Collection<? extends E> c) {
        super(c);
    }

    @Override
    public boolean equals(final Object o) {
        if (o instanceof Collection<?>) {
            Collection<?> other = (Collection<?>) o;
            return this.size() == other.size() && this.containsAll(other);
        }
        return false;
    }
}

Why can't the Java compiler figure this out?

Why is the compiler unable to infer the correct type for the result from Collections.emptySet() in the following example?
import java.util.*;
import java.io.*;

public class Test {
    public interface Option<A> {
        public <B> B option(B b, F<A, B> f);
    }

    public interface F<A, B> {
        public B f(A a);
    }

    public Collection<String> getColl() {
        Option<Integer> iopt = null;
        return iopt.option(Collections.emptySet(), new F<Integer, Collection<String>>() {
            public Collection<String> f(Integer i) {
                return Collections.singleton(i.toString());
            }
        });
    }
}
Here's the compiler error message:
knuttycombe#knuttycombe-ubuntu:~/tmp/java$ javac Test.java
Test.java:16: <B>option(B,Test.F<java.lang.Integer,B>) in
Test.Option<java.lang.Integer> cannot be applied to (java.util.Set<java.lang.Object>,
<anonymous Test.F<java.lang.Integer,java.util.Collection<java.lang.String>>>)
return iopt.option(Collections.emptySet(), new F<Integer, Collection<String>>() {
^
1 error
Now, the following implementation of getColl() works, of course:
public Collection<String> getColl() {
    Option<Integer> iopt = null;
    Collection<String> empty = Collections.emptySet();
    return iopt.option(empty, new F<Integer, Collection<String>>() {
        public Collection<String> f(Integer i) {
            return Collections.singleton(i.toString());
        }
    });
}
and the whole intent of the typesafe methods on Collections is to avoid this sort of issue with the singleton collections (as opposed to using the static variables.) So is the compiler simply unable to perform inference across multiple levels of generics? What's going on?
Java needs a lot of hand-holding with its inference. The type system could infer better in a lot of cases, but in your case the following will work:
Collections.<String>emptySet();
First you can narrow down your problem to this code:
public class Test {
    public void option(Collection<String> b) {
    }

    public void getColl() {
        option(Collections.emptySet());
    }
}
This does not work, you need a temporary variable or else the compiler cannot infer the type. Here is a good explanation of this problem: Why do temporary variables matter in case of invocation of generic methods?
Collections.emptySet() is not a Collection<String> unless Java knows that it needs a Collection<String> there. In this case, the compiler is being somewhat silly about the order in which it determines types: it tries to determine the return type of Collections.emptySet() before working out that the intended type parameter is actually String.
The solution is to explicitly state that you need Collections.<String>emptySet(), as mentioned by GaryF.
It looks like a typecasting issue - i.e., that it's being required to cast Object (in Set<Object>, which would be the type of the empty set) to String. Downcasts are not, in the general case, safe.
