It seems that GSON silently ignores JSON fields whose names don't match anything in the target POJO class. One solution outlined here suggests using annotations to mark "required" fields, so that GSON fails when de-serializing strings that don't contain them.
But we defined that our POJOs must be "exact" matches: when we allow an incoming value to be null, it must be declared as an Optional field in the POJO (and we have a special type adapter that turns nulls into Optional.empty() instances). Therefore all fields in the POJO are mandatory, and null isn't a valid value.
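To make that concrete, here is a rough sketch in the spirit of that adapter (not our exact code; the class name is made up): a TypeAdapterFactory that turns explicit JSON nulls into Optional.empty(). Note that if a field is missing from the JSON entirely, Gson never calls the adapter and the field simply stays null - which is exactly the failure mode described further below.

import com.google.gson.Gson;
import com.google.gson.TypeAdapter;
import com.google.gson.TypeAdapterFactory;
import com.google.gson.reflect.TypeToken;
import com.google.gson.stream.JsonReader;
import com.google.gson.stream.JsonToken;
import com.google.gson.stream.JsonWriter;

import java.io.IOException;
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;
import java.util.Optional;

// Register via: new GsonBuilder().registerTypeAdapterFactory(new OptionalTypeAdapterFactory()).create()
public final class OptionalTypeAdapterFactory implements TypeAdapterFactory {

    @Override
    @SuppressWarnings("unchecked")
    public <T> TypeAdapter<T> create(Gson gson, TypeToken<T> typeToken) {
        if (typeToken.getRawType() != Optional.class || !(typeToken.getType() instanceof ParameterizedType)) {
            return null; // let Gson handle everything that is not a parameterized Optional
        }
        Type innerType = ((ParameterizedType) typeToken.getType()).getActualTypeArguments()[0];
        TypeAdapter<Object> inner = (TypeAdapter<Object>) gson.getAdapter(TypeToken.get(innerType));

        TypeAdapter<Optional<Object>> adapter = new TypeAdapter<Optional<Object>>() {
            @Override
            public void write(JsonWriter out, Optional<Object> value) throws IOException {
                if (value == null || !value.isPresent()) {
                    out.nullValue();
                } else {
                    inner.write(out, value.get());
                }
            }

            @Override
            public Optional<Object> read(JsonReader in) throws IOException {
                if (in.peek() == JsonToken.NULL) {
                    in.nextNull();
                    return Optional.empty(); // explicit JSON null becomes Optional.empty()
                }
                return Optional.ofNullable(inner.read(in));
            }
        };
        // Caveat: if the field is absent from the JSON, Gson never calls this adapter
        // and the Optional field simply stays null - the failure mode described below.
        return (TypeAdapter<T>) adapter;
    }
}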
Following the guidance in the question I linked to, it seems that the only way of having GSON fail while parsing is to do a full "deep reflection" scan of the object created by the de-serialization process and check whether any of the Optional fields are null.
Or maybe I am missing something and there is an easier way to have GSON tell me when our JSON strings contain bad field names?
(Background: we just ran into a problem because of a wrong field name deep down in a nested structure, leading to null objects where we didn't expect them.)
Turns out: this "deficiency" is really a core design point of Gson: it is a JSON parser. Validation is not within the scope of Gson.
Therefore the "correct" answer is to use Java Bean Validation annotations and to put some implementation framework (for example Hibernate Validator or Apache BVal) in place.
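As a minimal sketch of that route (assuming the javax.validation API with an implementation such as Hibernate Validator on the classpath; the POJO and field names are invented):

import java.util.Optional;
import java.util.Set;

import javax.validation.ConstraintViolation;
import javax.validation.Validation;
import javax.validation.Validator;
import javax.validation.constraints.NotNull;

import com.google.gson.Gson;

public class ValidatedParsing {

    static class Payload {
        @NotNull private String name;             // mandatory, must never be null
        @NotNull private Optional<Integer> count; // the field itself must always be present
    }

    public static void main(String[] args) {
        // A misspelled field name in the JSON leaves the target field at null,
        // which the validator then reports after parsing.
        Payload payload = new Gson().fromJson("{\"nmae\":\"typo\"}", Payload.class);

        Validator validator = Validation.buildDefaultValidatorFactory().getValidator();
        Set<ConstraintViolation<Payload>> violations = validator.validate(payload);
        if (!violations.isEmpty()) {
            throw new IllegalArgumentException("Invalid payload: " + violations);
        }
    }
}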
Alternatively, it is possible to register a special type adapter when creating the Gson instance. This type adapter uses reflection to override an internal map with a bit of checking code, allowing for a relatively "clean" solution in which Gson throws an exception when it runs into "unknown" fields. (Thanks to Andy Turner for pointing to the corresponding GitHub issue tracker entry; the code can be found there.)
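Since the actual code lives in that issue, the following is only a rough sketch of the idea. It leans on Gson internals (the private boundFields map inside the reflective adapter), so it is tied to the Gson version in use and may need adjusting:

import java.lang.reflect.Field;
import java.util.LinkedHashMap;
import java.util.Map;

import com.google.gson.Gson;
import com.google.gson.JsonParseException;
import com.google.gson.TypeAdapter;
import com.google.gson.TypeAdapterFactory;
import com.google.gson.reflect.TypeToken;

// Register via: new GsonBuilder().registerTypeAdapterFactory(new StrictFieldsAdapterFactory()).create()
public final class StrictFieldsAdapterFactory implements TypeAdapterFactory {

    @Override
    public <T> TypeAdapter<T> create(Gson gson, TypeToken<T> type) {
        // The delegate for plain POJOs is Gson's internal ReflectiveTypeAdapterFactory.Adapter,
        // which keeps a private Map<String, BoundField> named "boundFields".
        TypeAdapter<T> delegate = gson.getDelegateAdapter(this, type);
        try {
            Field boundFieldsField = delegate.getClass().getDeclaredField("boundFields");
            boundFieldsField.setAccessible(true);
            @SuppressWarnings("unchecked")
            Map<String, Object> boundFields = (Map<String, Object>) boundFieldsField.get(delegate);
            // Swap in a map whose lookup fails loudly for JSON names without a matching POJO field.
            boundFieldsField.set(delegate, new LinkedHashMap<String, Object>(boundFields) {
                @Override
                public Object get(Object key) {
                    Object boundField = super.get(key);
                    if (boundField == null) {
                        throw new JsonParseException("Unknown field in JSON: " + key);
                    }
                    return boundField;
                }
            });
        } catch (ReflectiveOperationException e) {
            // Not the reflective adapter (primitives, collections, ...) or the internals changed:
            // fall back to Gson's default behaviour for this type.
        }
        return delegate;
    }
}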
Related
I'm using GSON to convert JSON data I get to a Java object. It works pretty well in all my tests.
The problem is that our real objects have some properties named like is_online. GSON only maps them if the names match exactly; it would be nice to have GSON convert the names to Java camel case (isOnline).
It seems this is possible when creating JSON data: camel case is converted to underscore-separated words. But I can't find a way to specify this the other way round.
I have found that the following setting works perfectly when reading JSON with underscored attributes and using camel casing in my models.
Gson gson = new GsonBuilder()
        .setFieldNamingPolicy(FieldNamingPolicy.LOWER_CASE_WITH_UNDERSCORES)
        .create();
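For example (the POJO here is hypothetical), the builder above makes the JSON key is_online bind to a camel-cased Java field:

import com.google.gson.FieldNamingPolicy;
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;

public class NamingDemo {

    // The policy translates the Java field name isOnline to the JSON key is_online
    // in both directions, so no per-field annotation is needed.
    static class Status {
        boolean isOnline;
    }

    public static void main(String[] args) {
        Gson gson = new GsonBuilder()
                .setFieldNamingPolicy(FieldNamingPolicy.LOWER_CASE_WITH_UNDERSCORES)
                .create();
        Status status = gson.fromJson("{\"is_online\": true}", Status.class);
        System.out.println(status.isOnline); // prints true
    }
}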
You can use the @SerializedName annotation:
@SerializedName("field_name_in_json")
private final String fieldNameInJava;
Note: when you have already set a FieldNamingPolicy, @SerializedName overrides it for that specific field (quite handy for special cases).
Bear in mind that your example is an edge case. If you have a property 'foo', its getter should be named 'getFoo', and if you have a property named 'foo_bar', its getter should be named 'getFooBar'. However, in your example you're mapping a boolean, and booleans have special-case naming conventions in Java: a primitive boolean property named 'online' should have a getter named 'isOnline', NOT 'getOnline' or, even worse, 'getIsOnline'. A boolean wrapper object (i.e. Boolean) should not follow this special case, and a property named 'online' should have a getter named 'getOnline'.
Hence, having boolean properties with 'is' in the name is an edge case where you'll want to strip out this particular prefix during the conversion. In the reverse direction, your code may want to inspect the JSON object for both the raw property name and an 'is_XXX' version.
I think what you want is here. Using annotations, you can tell GSON that mySuperCoolField is actually called this_field_is_fun in the JSON, and it will unpack it correctly. At least I think it works for deserialization too.
If that doesn't work, you can use custom JsonSerializer/JsonDeserializer implementations. They work great, but you have to update them whenever your class changes (for example, when you add a field), so you lose the auto-magic.
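To illustrate what that looks like (class and field names are invented), a minimal hand-written deserializer and its registration:

import java.lang.reflect.Type;

import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import com.google.gson.JsonDeserializationContext;
import com.google.gson.JsonDeserializer;
import com.google.gson.JsonElement;
import com.google.gson.JsonObject;
import com.google.gson.JsonParseException;

public class ManualMappingDemo {

    static class Thing {
        String mySuperCoolField;
    }

    static class ThingDeserializer implements JsonDeserializer<Thing> {
        @Override
        public Thing deserialize(JsonElement json, Type typeOfT, JsonDeserializationContext context)
                throws JsonParseException {
            JsonObject obj = json.getAsJsonObject();
            Thing thing = new Thing();
            // Every mapping is spelled out by hand, which is why the deserializer
            // has to be updated whenever Thing gains or renames a field.
            thing.mySuperCoolField = obj.get("this_field_is_fun").getAsString();
            return thing;
        }
    }

    public static void main(String[] args) {
        Gson gson = new GsonBuilder()
                .registerTypeAdapter(Thing.class, new ThingDeserializer())
                .create();
        Thing thing = gson.fromJson("{\"this_field_is_fun\":\"hello\"}", Thing.class);
        System.out.println(thing.mySuperCoolField); // prints hello
    }
}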
The easiest thing to do (which would be ugly, but simple, if the first suggestion doesn't work) would be to name the field in a way that makes GSON happy and add extra accessor methods with the names you like, e.g.
public boolean isXXX() {return this.is_XXX;}
I have a class with two fields: val1 and val2. I send val1 to the register (create) API, and val2 is filled in automatically by the API itself. I do not want to send val2 when calling the create API, and that API is not designed to handle unwanted values.
In short, I want to omit val2 when I call the create API but keep it when I call the get API.
The code I have right now creates JSON that includes both values, assigning null to val2. This causes the API to throw an exception.
Is there any easy way of doing this (Java/Groovy)?
Not 100% sure I'm understanding your need. I believe it depends on which JSON de/serializer you are using. For example, using Jackson we do:
@JsonIgnoreProperties(ignoreUnknown = true)
@JsonTypeName("account")
public class Account {
    // ...
}
I believe this allows us to load JSON with a ton of extra fields into objects that don't have corresponding Java fields. To quote from the Javadocs:
Property that defines whether it is ok to just ignore any unrecognized properties during deserialization. If true, all properties that are unrecognized -- that is, there are no setters or creators that accept them -- are ignored without warnings (although handlers for unknown properties, if any, will still be called) without exception.
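As a small, self-contained illustration (the field names are invented), the extra key is simply skipped instead of causing an error:

import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.databind.ObjectMapper;

public class IgnoreUnknownDemo {

    @JsonIgnoreProperties(ignoreUnknown = true)
    static class Account {
        public String val1;
    }

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        // "val2" has no matching Java field, but deserialization still succeeds.
        Account account = mapper.readValue("{\"val1\":\"abc\",\"val2\":\"ignored\"}", Account.class);
        System.out.println(account.val1); // prints abc
    }
}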
I need Morphia to support serialization of Java 8 Optional. Morphia clearly doesn't special-case Optional; by default, it seems to serialize an Optional with a value to {value: BLAH} and to drop an empty Optional (as I have dropEmpty or whatever configured).
When I attempt to rehydrate an Optional containing an enum though, Morphia fails with a class cast exception in the bowels of the mapping logic:
Caused by: java.lang.ClassCastException: java.lang.String cannot be cast to com.mongodb.DBObject
at org.mongodb.morphia.mapping.EmbeddedMapper.fromDBObject(EmbeddedMapper.java:160)
Indeed, Morphia seems to be losing type information: when I implemented my own TypeConverter, the MappedField contained no subClass information, which is where I'd normally look for it. Instead, I had to store class information about the inner value in a separate field, so the result ends up looking like:
{"valueClassName" : "full.class.name" "value" : BLAH}.
Is there a more elegant way of handling this? This pretty much seems like a special case of IterableConverter (although that clearly depends on the subClass value being present within MappedField as well).
For what it's worth, "upgrading Morphia" isn't much of an option, because of the myriad bugs that erupt whenever we try to do so. This was failing with org.mongodb.morphia version 0.108.
I have an object containing cyclic references. According to the XStream JSON documentation, cyclic references are NOT supported, and one should therefore use the NO_REFERENCES XStream mode when marshalling an object to JSON:
What limitations has XStream's JSON support?
JSON represents a very simple data model for easy data transfer. Especially it has no equivalent for XML attributes. Those are written with a leading "@" character, but this is not always possible without violating the syntax (e.g. for array types). Those may be silently dropped (and makes it therefore difficult to implement deserialization). References are another issue in the serialized object graph, since JSON has no possibility to express such a construct. You should therefore always set the NO_REFERENCES mode of XStream. Additionally you cannot use implicit collections, since the properties in a JSON object must have unique names.
But I tried setting the mode to ID_REFERENCES, and it appears as though the object is marshalled with references and can be unmarshalled properly. Is the XStream documentation simply outdated, or have I inadvertently created the object graph in such a way that I haven't hit any of the limitations?
Sorry, but I can't post my exact graph as an example as it contains application/domain-specific code and it might take some time to construct a 'clean' alternative.
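A reduced sketch of the kind of round trip I mean (the graph here is just a placeholder for the real domain objects; the Jettison-backed driver is assumed because it can read JSON back in):

import com.thoughtworks.xstream.XStream;
import com.thoughtworks.xstream.io.json.JettisonMappedXmlDriver;

public class XStreamJsonRoundTrip {

    public static void main(String[] args) {
        // The Jettison-backed driver is used because it can read JSON back in;
        // the write-only JsonHierarchicalStreamDriver cannot.
        XStream xstream = new XStream(new JettisonMappedXmlDriver());
        xstream.setMode(XStream.ID_REFERENCES); // despite the FAQ recommending NO_REFERENCES
        // Recent XStream versions may also require allowing the domain types, e.g.
        // xstream.allowTypesByWildcard(new String[]{"com.example.**"});

        Object graph = buildCyclicGraph();
        String json = xstream.toXML(graph);  // marshals, emitting reference ids for the cycles
        Object back = xstream.fromXML(json); // appears to unmarshal the cycles correctly
        System.out.println(back != null);
    }

    private static Object buildCyclicGraph() {
        // Placeholder for the application/domain-specific cyclic graph mentioned above.
        return new Object();
    }
}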
I am using JSON as a save format.
If I change the field names of my object GSON will silently discard the original fields upon loading the older version, because they no longer match the new names.
I would like to be able to get some notification if I do this accidentally through refactoring, i.e. "Warning: variableName not found in ObjectType during deserialization."
There is a @Version annotation but it isn't exactly what I'm looking for.
Has anyone written a custom deserializer or custom type converter that will throw an error when a field in the JSON does not exist in the type? Is there another serialization library that does this?
Edit: I would still be interested in a GSON deserializer that does this as well, if anyone has one.
Jackson can be used to fail on unexpected JSON elements. It can also be configured to gather (and log) all unbound JSON elements, as described at http://www.cowtowncoder.com/blog/archives/2010/09/entry_414.html (search for "any setter").
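A short sketch of both options with the current Jackson 2.x API (the linked posts predate it and use the older class names; the POJOs here are invented):

import java.util.LinkedHashMap;
import java.util.Map;

import com.fasterxml.jackson.annotation.JsonAnySetter;
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;

public class JacksonUnknownFieldsDemo {

    // Option 1: no special handling; unknown JSON elements make Jackson throw.
    static class StrictSave {
        public String variableName;
    }

    // Option 2: gather unbound JSON elements via an "any setter" so they can be logged.
    static class LenientSave {
        public String variableName;
        public final Map<String, Object> unbound = new LinkedHashMap<>();

        @JsonAnySetter
        public void collectUnbound(String name, Object value) {
            unbound.put(name, value);
        }
    }

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper()
                .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, true);

        // Throws UnrecognizedPropertyException, because "oldName" no longer exists on StrictSave:
        // mapper.readValue("{\"oldName\":42}", StrictSave.class);

        // With the any setter, the unknown element is collected instead, ready to be logged:
        LenientSave save = mapper.readValue("{\"oldName\":42}", LenientSave.class);
        System.out.println("Unbound fields: " + save.unbound); // {oldName=42}
    }
}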
If you're already familiar with Gson, I documented how to use Jackson to do the same things covered in the Gson User Guide at http://programmerbruce.blogspot.com/2011/07/gson-v-jackson-part-6.html.