How can I log discarded fields when deserializing with GSON?

I am using JSON as a save format.
If I change the field names of my object, GSON will silently discard the original fields upon loading the older version, because they no longer match the new names.
I would like to be able to get some notification if I do this accidentally through refactoring, i.e. "Warning: variableName not found in ObjectType during deserialization."
There is a @Version annotation but it isn't exactly what I'm looking for.
Has anyone written a custom deserializer or custom type converter that will throw an error when a field in the JSON does not exist in the type? Is there another serialization library that does this?
Edit: I would still be interested in a GSON deserializer that does this as well, if anyone has one.
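For what it's worth, here is a rough sketch of what such a Gson deserializer could look like (StrictDeserializer and SaveGame are made-up names; it assumes a non-generic target type and, for brevity, ignores inherited fields and @SerializedName):
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import com.google.gson.JsonDeserializationContext;
import com.google.gson.JsonDeserializer;
import com.google.gson.JsonElement;
import com.google.gson.JsonParseException;
import java.lang.reflect.Field;
import java.lang.reflect.Type;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

class StrictDeserializer<T> implements JsonDeserializer<T> {
    // A plain Gson does the actual binding; calling the context instead
    // would recurse right back into this deserializer.
    private final Gson delegate = new Gson();

    @Override
    public T deserialize(JsonElement json, Type typeOfT, JsonDeserializationContext context)
            throws JsonParseException {
        Class<?> raw = (Class<?>) typeOfT; // assumes typeOfT is a plain class
        Set<String> declared = new HashSet<>();
        for (Field field : raw.getDeclaredFields()) {
            declared.add(field.getName());
        }
        for (Map.Entry<String, JsonElement> entry : json.getAsJsonObject().entrySet()) {
            if (!declared.contains(entry.getKey())) {
                System.err.println("Warning: " + entry.getKey() + " not found in "
                        + raw.getSimpleName() + " during deserialization.");
            }
        }
        return delegate.fromJson(json, typeOfT);
    }
}

// Registered per save-file type:
Gson gson = new GsonBuilder()
        .registerTypeAdapter(SaveGame.class, new StrictDeserializer<SaveGame>())
        .create();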

Jackson can be configured to fail on unexpected JSON elements. It can also be configured to gather (and log) all unbound JSON elements, as described at http://www.cowtowncoder.com/blog/archives/2010/09/entry_414.html (search for "any setter").
If you're already familiar with Gson, I documented how to use Jackson to do the same things covered in the Gson User Guide at http://programmerbruce.blogspot.com/2011/07/gson-v-jackson-part-6.html.
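For reference, a minimal sketch of both options in Jackson 2.x (the SaveData class and its field names are hypothetical):
import com.fasterxml.jackson.annotation.JsonAnySetter;
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.LinkedHashMap;
import java.util.Map;

// Option 1: reject unknown properties outright. This is actually Jackson's
// default behaviour; enabling it explicitly guards against it being turned off.
ObjectMapper strict = new ObjectMapper()
        .enable(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES);

// Option 2: bind what matches and log everything that doesn't.
class SaveData {
    public String playerName; // a known field, bound normally

    private final Map<String, Object> unbound = new LinkedHashMap<>();

    @JsonAnySetter
    public void handleUnbound(String name, Object value) {
        unbound.put(name, value); // the "any setter" catches every unmatched element
        System.err.println("Warning: " + name + " not found in SaveData during deserialization.");
    }
}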

Related

Is there a way for GSON to be "not lenient" at all?

It seems that GSON silently ignores field names in a JSON string that don't match the target POJO class. One solution outlined here suggests using annotations to mark "required" fields so that GSON fails when deserializing strings that don't contain those fields.
But we have defined that our POJOs must be "exact" matches (where we allow incoming values to be null, they must be declared as Optional fields in the POJO, and we have a special type adapter that turns nulls into Optional.empty() instances). Therefore all fields in the POJO are mandatory, and null isn't a valid value.
Following the guidance in the question I linked to, it seems that the only way to have gson fail while parsing is to do a full "deep reflection" scan of the object created by the deserialization process and check whether any of the Optional fields are null.
Or maybe I am missing something, and there is an easier way to have gson tell me when our JSON strings contain bad field names?
(Background: we just ran into a problem because of a wrong field name deep down in a nested structure, leading to null objects where we didn't expect them.)
Turns out: this "deficiency" is really a core design point of gson: it is a JSON parser. Validation is not within the scope of gson.
Therefore the "correct" answer is to use java bean validation annotations and to put some implementation framework (for example the hibernate validator or apache bval) in place.
Alternatively, it is possible to register a special type adapter when creating the gson instance. This type adapter uses reflection to override an internal map with a bit of checking code, allowing for a relatively "clean" solution in which gson throws an exception when it runs into "unknown" fields. (Thanks to Andy Turner for pointing to the corresponding GitHub issue tracker entry; the code can be found there.)
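For illustration, a minimal sketch of the bean-validation route (assuming javax.validation annotations with an implementation such as hibernate-validator on the classpath; Payload is a made-up class):
import com.google.gson.Gson;
import java.util.Set;
import javax.validation.ConstraintViolation;
import javax.validation.Validation;
import javax.validation.Validator;
import javax.validation.constraints.NotNull;

class Payload {
    @NotNull String name; // a wrong or missing field name leaves this null
    @NotNull Integer id;
}

Gson gson = new Gson();
Validator validator = Validation.buildDefaultValidatorFactory().getValidator();

// gson parses the misspelled field without complaint...
Payload payload = gson.fromJson("{\"nmae\":\"oops\",\"id\":1}", Payload.class);

// ...and the validation step is what catches it.
Set<ConstraintViolation<Payload>> violations = validator.validate(payload);
if (!violations.isEmpty()) {
    throw new IllegalArgumentException(violations.toString());
}
Note that nested structures need @Valid on the containing field for the scan to cascade.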

Java ObjectMapper readValue reading wrong string

I need to convert a certain JSON string to a Java object. I am using the Jackson ObjectMapper for reading the JSON. The JSON string is something like this:
{"emailId":"gmail@rajnikant.com","accessToken":"accTok"}4
When I use objectMapper.readValue() to read the JSON string into a specific destination class, it should throw an exception because of the stray 4 appended to the JSON string. What should I do so that only valid JSON can be read, and an exception is thrown in all other cases?
To Jackson, GSON and others, a JSON string with some characters appended after the last } is acceptable, as long as what is contained between the braces is valid JSON.
As stated by a member of the FasterXML (Jackson) team:
Yes. This is by design. If you want to catch such problems, you need to construct JsonParser, advance it manually. Existence of multiple root-level values is not considered a validity problem.
Reference: https://github.com/FasterXML/jackson-databind/issues/726
So if you need to enforce "clean" JSON, you'll have to extend the default parser with your own functionality. However, IMO, if it's OK to the default parser it should be OK for you too (unless we're dealing with some inter-language incompatibility scenario here).
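If you do need to reject trailing content, here is a sketch of the manual-parser check described in the quote (Credentials stands in for your destination class); newer Jackson versions (2.9+) also expose the same check as a configuration flag:
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.IOException;

ObjectMapper mapper = new ObjectMapper();
String json = "{\"emailId\":\"gmail@rajnikant.com\",\"accessToken\":\"accTok\"}4";

// Manual approach: read one value, then verify that nothing follows it.
try (JsonParser parser = mapper.getFactory().createParser(json)) {
    Credentials value = mapper.readValue(parser, Credentials.class);
    if (parser.nextToken() != null) {
        throw new IOException("Trailing content after the JSON value");
    }
}

// Jackson 2.9+: the same check as a one-liner.
ObjectMapper strictMapper = new ObjectMapper()
        .enable(DeserializationFeature.FAIL_ON_TRAILING_TOKENS);
strictMapper.readValue(json, Credentials.class); // throws on the stray 4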

Parse JSON array with quotes around it

I'm calling 3rd party API and receiving as a response next string:
"[{\"name\":\"name\",\"id\":1}]"
As far as I can see it's not valid JSON, because it has quotes around it. Is it possible to map it to a Java object with the Jackson or Gson libraries?
Or should I write a custom converter/deserializer?
You don't need a custom converter or deserializer. You could write one of course, but I wouldn't encourage you to do so. Be explicit about what is happening here, especially when you are working in a team. It's the other side that is at fault here: they are not outputting valid JSON.
With Jackson, deserialize their output this way:
ObjectMapper mapper = new ObjectMapper();
// Strip the surrounding quotes, then parse what remains.
String json = theirOutput.substring(1, theirOutput.length() - 1);
MyObject myObject = mapper.readValue(json, MyObject.class);
Put some documentation above it explaining why you do it this way, so everybody understands what's happening. In my opinion this is a much cleaner solution than writing a custom converter or deserializer.
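One caveat with the substring approach: if the quotes are literally part of the payload, the inner \" escapes are still there after stripping them, and parsing will fail. A variant that sidesteps this (a sketch; MyObject again stands in for your target type) is to let Jackson decode the outer JSON string first:
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.List;

ObjectMapper mapper = new ObjectMapper();
// First pass: decode the outer JSON string, which also resolves the \" escapes.
String inner = mapper.readValue(theirOutput, String.class);
// Second pass: parse the decoded payload as the actual JSON array.
List<MyObject> result = mapper.readValue(inner, new TypeReference<List<MyObject>>() {});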

Does XStream support cyclic JSON graphs?

I have an object containing cyclic references. According to the XStream JSON documentation, cyclic references are NOT supported, and one should therefore use the NO_REFERENCES XStream mode when marshalling an object to JSON:
What limitations has XStream's JSON support?
JSON represents a very simple data model for easy data transfer. In particular, it has no equivalent for XML attributes. Those are written with a leading "@" character, but this is not always possible without violating the syntax (e.g. for array types). Those may be silently dropped (which therefore makes it difficult to implement deserialization). References are another issue in the serialized object graph, since JSON has no way to express such a construct. You should therefore always set the NO_REFERENCES mode of XStream. Additionally, you cannot use implicit collections, since the properties in a JSON object must have unique names.
But I tried setting the mode to ID_REFERENCES, and it appears that the object is marshalled with references and can be unmarshalled properly. Is the XStream documentation simply outdated, or have I inadvertently created the object graph in such a way that I haven't hit any of the limitations?
Sorry, but I can't post my exact graph as an example as it contains application/domain-specific code and it might take some time to construct a 'clean' alternative.
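For what it's worth, the experiment described above amounts to something like this minimal sketch (Node is a stand-in for my domain classes):
import com.thoughtworks.xstream.XStream;
import com.thoughtworks.xstream.io.json.JettisonMappedXmlDriver;

class Node {
    String name;
    Node next;
    Node(String name) { this.name = name; }
}

XStream xstream = new XStream(new JettisonMappedXmlDriver());
xstream.setMode(XStream.ID_REFERENCES); // instead of the documented NO_REFERENCES

Node a = new Node("a");
Node b = new Node("b");
a.next = b;
b.next = a; // the cycle

String json = xstream.toXML(a);                   // marshals with reference ids
Node roundTripped = (Node) xstream.fromXML(json); // appears to unmarshal correctly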

Jackson JSON provider LinkedHashSet deserialization

We are using Spring's RestTemplate and the Jackson JSON provider to serialize/deserialize JSON. From my services I send back a LinkedHashSet, which gets converted to a HashSet on the client side when I receive it. Because of this I lose the insertion order of the elements.
Is this the default behaviour of the Jackson JSON provider for Set? Is there any other way, so that it deserializes to the proper implementation? I suspect it's going to be tricky, but your input will be highly appreciated.
Thanks
You can specify the concrete class for Jackson to use with the @JsonDeserialize annotation. Just put:
@JsonDeserialize(as=LinkedHashSet.class)
on the property's setter.
It all depends on what you ask the result type to be: if you ask for the data to be mapped to a LinkedHashSet, then the JSON Array gets mapped to one. If you use a vague type like java.lang.Object (or java.util.Collection), you will get an ArrayList for JSON Arrays.
Keep in mind that JSON is data, not objects (by default), so metadata regarding the Java types you used is not passed along by default. There are ways to do that if you need it, but usually you will simply need to provide the expected type.
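For example, a minimal sketch of the annotation approach (Wrapper is a made-up class; the annotation works on a public field as well as on a setter):
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.annotation.JsonDeserialize;
import java.util.LinkedHashSet;
import java.util.Set;

public class Wrapper {
    @JsonDeserialize(as = LinkedHashSet.class)
    public Set<String> items; // iteration order now matches the JSON array
}

Wrapper w = new ObjectMapper()
        .readValue("{\"items\":[\"c\",\"a\",\"b\"]}", Wrapper.class);
// w.items is a LinkedHashSet and iterates c, a, b
And if the payload is a bare JSON array, asking for the concrete type directly also works: mapper.readValue(json, new TypeReference<LinkedHashSet<String>>() {}).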
