I have a property in my JSF managed bean:
private List<Long> selectedDataSets;
I initialize the list like this within another method:
ArrayList<Long> longList = new ArrayList<>();
What happens is that I get java.lang.ClassCastException: java.lang.String cannot be cast to java.lang.Long as soon as execution reaches this foreach:
for (Long CRC : selectedDataSets) { ... }
Which is very odd. Debugging shows that selectedDataSets is full of String values, but I thought that wasn't even possible. Please explain what exactly happened here.
Apparently you bound the property to a UISelectMany component like <h:selectManyCheckbox> or <h:selectManyListbox> without explicitly specifying a Converter. In Java, generic type information is erased at runtime, so JSF (more specifically, EL) does not know anything about the generic list type at all and defaults to String unless told otherwise by a Converter. It's String because that's simply the value type of HttpServletRequest#getParameterMap(). EL fills the list with the submitted values by reflection and does not take any generic types into account.
So, for example, this should do it for you, with the help of the builtin LongConverter:
<h:selectManyCheckbox value="#{bean.selectedDataSets}" converter="javax.faces.Long">
See also:
Use enum in h:selectManyCheckbox
Note that this has nothing to do with Java 7's diamond operator. You would have had exactly the same problem if you had experimented with new ArrayList<Long>().
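To make that visible outside JSF, here's a minimal plain-Java sketch (no JSF or EL involved; the class and values are made up) of what effectively happens: the list gets filled through a raw, untyped view, and the hidden cast only runs when you read the elements back:
import java.util.ArrayList;
import java.util.List;

public class ErasureDemo {

    @SuppressWarnings({"rawtypes", "unchecked"})
    public static void main(String[] args) {
        List<Long> selectedDataSets = new ArrayList<>();

        // EL only sees a raw List at runtime, so it happily adds the
        // submitted request parameter values, which are Strings.
        List rawView = selectedDataSets;
        rawView.add("42"); // compiles (unchecked) and succeeds at runtime

        // The compiler inserts a cast to Long for the loop variable;
        // that hidden cast is what throws the ClassCastException.
        for (Long crc : selectedDataSets) {
            System.out.println(crc);
        }
    }
}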
Related
This method is defined in a JpaRepository and runs a native PostgreSQL query.
List<Long> distributorIds = distributorRepository
        .findDistributorIdsWithChildren(distributorId);
It runs with no exception, and at runtime I see BigInteger values in the returned distributorIds list instead of Long values.
It's the same as in this question: Bug in Spring Data JPA: Spring Data returns List<BigInteger> instead of List<Long>
So how can this bug occur? I mean, how does Java allow this? If Java doesn't check this kind of type error, isn't that a problem with generics in Java?
Note: I also checked the type hierarchy for Long and BigInteger and there is no sub/super class relation.
Generic type checks are a compile time feature. At runtime all type information is lost. See "type erasure". The behavior you see can easily happen if, for example, a legacy API that uses non-generic collections is mapped to an API that uses generics and requires a cast of the collection. If that collection happens to contain objects of an unexpected type you will, sadly, only find out at runtime.
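As a plain-Java illustration of that (the repository method name is reused here only for show; Spring and Hibernate are not involved): the unchecked conversion compiles, the list silently carries BigInteger values, and the failure only appears once an element is assigned to a Long:
import java.math.BigInteger;
import java.util.ArrayList;
import java.util.List;

public class HeapPollutionDemo {

    // Stand-in for what the persistence provider does with a native query:
    // it builds an untyped list of whatever the JDBC driver hands back.
    @SuppressWarnings({"rawtypes", "unchecked"})
    static List<Long> findDistributorIdsWithChildren() {
        List untyped = new ArrayList();
        untyped.add(BigInteger.valueOf(7)); // bigint columns come back as BigInteger
        return untyped; // unchecked conversion: nothing verifies the element type
    }

    public static void main(String[] args) {
        List<Long> ids = findDistributorIdsWithChildren(); // runs without any exception
        System.out.println(ids); // [7], the BigInteger sits happily in the "List<Long>"

        Long first = ids.get(0); // the hidden cast throws ClassCastException only here
        System.out.println(first);
    }
}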
I am trying to get a list of objects using Spring RestTemplate. I am confused as to why I should choose the ParameterizedTypeReference approach to get a list of objects with RestTemplate instead of just using Object[].class.
I have checked multiple answers suggesting the use of ParameterizedTypeReference. But why can't I just use Object[].class? What limitations do I have?
I have checked this link (https://stackoverflow.com/a/49752261/6001027), where it says I can use Object[] only for simple cases and have to go for ParameterizedTypeReference when handling complex JSON structures. Can someone explain in which cases I cannot use the Object[] approach?
ParameterizedTypeReference approach:
ResponseEntity<List<Rating>> responseEntity = restTemplate.exchange(
        "http://localhost:8084/ratingsdata/user/" + userId,
        HttpMethod.GET,
        null,
        new ParameterizedTypeReference<List<Rating>>() {});
List<Rating> ratings = responseEntity.getBody();
Object[] approach:
List<Rating> ratings = Arrays.asList(
        restTemplate.getForObject("http://localhost:8084/ratingsdata/user/" + userId, Rating[].class));
The answer is simple: to keep type information at runtime.
A list can act as an array, but an array can't act like a list. The reason what you are doing works is that you are converting an array into a list, and, as stated, a list can act as an array, so you are safe, this time.
But casting, in general, is bad practice.
When you cast something you are taking a risk: you are basically forcing the compiler to "trust you". So instead of the compiler telling you what is right or wrong, you are telling the compiler what is right and wrong, and that is a risk of breaking things, because if you are wrong, the program crashes at runtime, hard and uncontrollably.
There are many programmers with many opinions, but there is only one compiler with one opinion and we should trust it.
So back to the question: what does ParameterizedTypeReference<T> actually do?
If you look at its constructor you can see that it acts as a vessel for type information. By writing new ParameterizedTypeReference<List<Foo>>() {} you are instantiating an anonymous subclass while passing in a type. In the constructor it extracts the type information from that type parameter and stores it internally. At a later moment we can call getType() to get the type information back, so we can do a typesafe "cast" at runtime.
final ParameterizedTypeReference<List<String>> typeRef = new ParameterizedTypeReference<>() {};
final Type type = typeRef.getType();
final String typeName = type.getTypeName();
System.out.println(typeName);
// will print "java.util.List<java.lang.String>"
This pattern is called "super type tokens", and you can read more about it in Neal Gafter's blog post, Super Type Tokens.
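To make the "simple cases only" limitation concrete, here is a hedged sketch; Rating, ApiResponse and the wrapped payload are hypothetical, not taken from the question. Rating[].class is fine for a bare JSON array because the array class carries its element type, but as soon as the payload is a parameterized wrapper there is no class literal for it, and only ParameterizedTypeReference can describe the nested generics:
import java.util.List;

import org.springframework.core.ParameterizedTypeReference;
import org.springframework.http.HttpMethod;
import org.springframework.http.ResponseEntity;
import org.springframework.web.client.RestTemplate;

public class RatingsClient {

    // hypothetical payload types, only for illustration
    public static class Rating { public Long ratingId; public int stars; }
    public static class ApiResponse<T> { public T data; public String message; }

    public List<Rating> fetchWrapped(RestTemplate restTemplate, String url) {
        // Rating[].class is enough when the response body is a bare JSON array,
        // because the array class itself carries the element type.

        // For a wrapped response like {"data": [...], "message": "ok"} there is no
        // ApiResponse<List<Rating>>.class literal; passing ApiResponse.class would
        // leave "data" as a List of LinkedHashMap. The type token solves that:
        ResponseEntity<ApiResponse<List<Rating>>> response = restTemplate.exchange(
                url,
                HttpMethod.GET,
                null,
                new ParameterizedTypeReference<ApiResponse<List<Rating>>>() {});

        return response.getBody().data; // fully typed Rating objects
    }
}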
While testing new rules I noticed one bug. My rule inspects the method parameters and the return type and checks whether the owner of those types has a certain annotation.
Previously I had a problem getting an array among the method parameters and getting the element type of that array, but then I found a solution:
if (parameterType.isArray()) {
Type.ArrayType arrayType = (Type.ArrayType) parameterType;
Type arrayElementType = arrayType.elementType();
...
But currently I have another issue: my rule found a List as the return value, and I tried to find something similar to Type.ArrayType, with no success.
Is there a way to get the List element type?
Short answer: no, there is not. Parameterized types are not provided as part of the semantic API.
We already have it in mind to provide this at some point (https://jira.sonarsource.com/browse/SONARJAVA-1871), but there is no clear plan defined as of today.
I need Morphia to support serialization of Java 8 Optional. Morphia clearly doesn't special-case Optional, and, by default, Morphia seems to serialize an Optional with a value to {value: BLAH} and to drop an empty Optional (as I have dropEmpty, or whatever it is called, configured).
When I attempt to rehydrate an Optional containing an enum though, Morphia fails with a class cast exception in the bowels of the mapping logic:
Caused by: java.lang.ClassCastException: java.lang.String cannot be cast to com.mongodb.DBObject
at org.mongodb.morphia.mapping.EmbeddedMapper.fromDBObject(EmbeddedMapper.java:160)
Indeed, Morphia seems to be losing type information; when I implemented my own TypeConverter, the MappedField contained no subClass information, which is where I'd normally look for that information. Instead, I had to store class information about the inner value in a separate field, so that the result ends up looking like:
{"valueClassName" : "full.class.name" "value" : BLAH}.
Is there a more elegant way of handling this? This pretty much seems like a special case of IterableConverter (although that clearly depends on the subClass value being present within MappedField as well).
For what it's worth, 'upgrading Morphia' isn't much of an option, because of the myriad bugs that erupt whenever we try to do so. This was failing with org.mongodb.morphia version 0.108.
I have a form with (at the moment) two fields and submit the following:
capture.id = 213
capture.description = DescriptionText
The target object 'capture' is immutable and I would like to provide a type converter to take both values and call the constructor. What I cannot seem to do is get my TypeConverter to be invoked.
If the input is simply:
capture = foo
Then the type converter is called, but obviously this isn't much use. Is there a way to make OGNL delegate the rest of the type conversion to me, perhaps passing in a Map of the parameters?
Any ideas? Is this even possible in Struts 2?
Versions: Struts 2.0.14 & OGNL 2.6.11
EDIT: I've done a bit of reading on this and my next attempt seemed to me to be a good plan. My theory was that using the Map syntax would make OGNL convert the values to a map and then call my converter with that map to convert it to my value.
capture[id] = 213
capture[description] = DescriptionText
Nope, that doesn't seem to make any difference at all.
The way I did this was to have the following in the JSP:
<s:textfield name="capture" value="capture.id" />
<s:textfield name="capture" value="capture.description" />
In the type converter, the String[] values parameter of the convertFromString method will contain both values needed to construct a new immutable capture. Provided that you are consistent with the text field ordering (or better yet, encapsulate it in a tag file), you can use the indexes of the values array to reliably get the appropriate field of the capture object.
The one weird part about this approach is that the convertToString method doesn't really do anything for you. You can return either id or description (or concatenate them together), but since you are using the values attribute in the JSP, it doesn't matter.
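A rough sketch of such a converter, assuming a hypothetical immutable Capture class with a (long id, String description) constructor; the ordering of the two text fields is what ties values[0] and values[1] to the right constructor arguments:
import java.util.Map;

import org.apache.struts2.util.StrutsTypeConverter;

public class CaptureConverter extends StrutsTypeConverter {

    @Override
    public Object convertFromString(Map context, String[] values, Class toClass) {
        // index 0 = id field, index 1 = description field, matching the JSP order
        return new Capture(Long.parseLong(values[0]), values[1]);
    }

    @Override
    public String convertToString(Map context, Object o) {
        // not really used with this JSP approach; return something sensible anyway
        return String.valueOf(((Capture) o).getId());
    }
}

// hypothetical immutable target object
final class Capture {
    private final long id;
    private final String description;

    Capture(long id, String description) {
        this.id = id;
        this.description = description;
    }

    long getId() { return id; }
    String getDescription() { return description; }
}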
It seems that the answer is no, you can't do that with Struts 2.
I've posted this question on the Struts 2 mailing list and it seems that it just isn't possible to have multiple fields presented to a TypeConverter.
The alternative solution suggested is to have a mutable object with setters and then some form of 'petrify' method to prevent any future changes.
For my project I've actually implemented another Struts interceptor to provide my custom parameter-binding behaviour.
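For reference, a hedged sketch of what such an interceptor might look like on Struts 2.0.x / XWork 2 (CaptureAware is a hypothetical interface on the action, and Capture is the same hypothetical immutable class as in the converter sketch above): it reads the raw request parameters and hands a fully constructed object to the action before execution:
import java.util.Map;

import com.opensymphony.xwork2.ActionInvocation;
import com.opensymphony.xwork2.interceptor.AbstractInterceptor;

public class CaptureBindingInterceptor extends AbstractInterceptor {

    @Override
    public String intercept(ActionInvocation invocation) throws Exception {
        // in Struts 2.0.x the request parameter map values are String[] arrays
        Map params = invocation.getInvocationContext().getParameters();

        String id = firstValue(params.get("capture.id"));
        String description = firstValue(params.get("capture.description"));

        Object action = invocation.getAction();
        if (id != null && description != null && action instanceof CaptureAware) {
            // hand the fully constructed immutable object to the action
            ((CaptureAware) action).setCapture(new Capture(Long.parseLong(id), description));
        }
        return invocation.invoke();
    }

    private String firstValue(Object value) {
        if (value instanceof String[] && ((String[]) value).length > 0) {
            return ((String[]) value)[0];
        }
        return value != null ? value.toString() : null;
    }
}

// hypothetical callback interface implemented by the action
interface CaptureAware {
    void setCapture(Capture capture);
}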