JSON duplicated properties validation via Jackson - java

I use Jackson and want to check that an input JSON string doesn't contain duplicated properties like:
{"a" : 1, "a" : 0}
The following Jackson fragment processes the input string without any errors and even returns a value:
JsonNode jsonSelect = mapper.readTree("{\"A\" : 1, \"A\" : 0}");
System.out.println(jsonSelect.get("A")); // prints 0
Is there a way to validate duplicates via Jackson?
P.S. Does the JSON format support duplicated properties at all? I didn't find any restrictions about it in the specification. Also, org.json.JSONObject throws an exception for duplicates, but that doesn't answer my question: is {"a" : 1, "a" : 0} well-formed according to the standard?

The JSON specification indicates that duplicates are not considered valid, but parsers are not required to do anything about them. From a practical perspective, keeping track of all seen properties adds overhead, which may not make sense at the streaming-parser level.
As to Jackson, it used to have duplicate detection at the data-binding level, but I think that is not enabled at this point. It could be added fairly easily when dealing with Maps.
If this is something you would want, filing a feature request or asking on the user list might make sense (especially to see whether others would want this feature too, making it more likely to be added soon).
If all you want to do is validation, you could create a Map subclass and make it throw an exception on duplicates. Or just set a flag in the subclass that you can check, if you prefer.
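A minimal sketch of that Map-subclass idea, assuming Jackson 2.x and binding into a Map rather than a POJO (the class name is an illustrative assumption). Jackson populates Map targets via put(), so the override fires on the second occurrence of a key; note it only covers the top-level object, since nested objects are bound into ordinary maps:

import java.util.HashMap;

public class NoDuplicatesMap extends HashMap<String, Object> {
    @Override
    public Object put(String key, Object value) {
        if (containsKey(key)) {
            // turn the duplicate into a hard failure instead of a silent overwrite
            throw new IllegalArgumentException("Duplicate property: " + key);
        }
        return super.put(key, value);
    }
}

// usage: fails while reading {"a" : 1, "a" : 0}
// NoDuplicatesMap checked = new ObjectMapper().readValue(json, NoDuplicatesMap.class);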

JSON does not support duplicated properties, so if your input is guaranteed to be valid JSON you don't have to check for them.


Is there a way of GSON to be "not lenient" at all?

It seems that GSON silently ignores it when a JSON string contains field names that don't match the target POJO class. One solution outlined here suggests using annotations to mark "required" fields so that GSON fails when deserializing strings that don't contain them.
But we defined that our POJOs must be "exact" matches (when we allow incoming values to be null, they must be declared as Optional fields in the POJO, and we have a special type adapter that turns nulls into Optional.empty() instances). Therefore all fields in the POJO are mandatory, and null isn't a valid value.
Following the guidance in the question I linked to, it seems that the only way to have gson fail while parsing is to do a full "deep reflection" scan of the object created by the deserialization process and check whether any of the Optional fields are null.
Or maybe I am missing something and there is an easier way to have gson tell me when our JSON strings contain bad field names?
(Background: we just ran into a problem because of a wrong field name deep down in a nested structure, leading to null objects where we didn't expect them.)
It turns out that this "deficiency" is really a core design point of gson: it is a JSON parser, and validation is not within its scope.
Therefore the "correct" answer is to use Java Bean Validation annotations and to put some implementing framework (for example Hibernate Validator or Apache BVal) in place.
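A minimal sketch of that Bean Validation route, assuming the javax.validation API plus a provider such as Hibernate Validator on the classpath; the Person class, its fields, and the misspelled input are illustrative, not taken from the question:

import com.google.gson.Gson;
import javax.validation.ConstraintViolation;
import javax.validation.Validation;
import javax.validation.Validator;
import javax.validation.constraints.NotNull;
import java.util.Set;

public class ValidatedParse {
    static class Person {
        @NotNull String name;  // mandatory field
        @NotNull Integer age;  // mandatory field
    }

    public static void main(String[] args) {
        // "nmae" is a misspelled field name, so Gson silently leaves 'name' as null
        Person p = new Gson().fromJson("{\"nmae\":\"Bob\",\"age\":42}", Person.class);

        Validator validator = Validation.buildDefaultValidatorFactory().getValidator();
        Set<ConstraintViolation<Person>> violations = validator.validate(p);
        if (!violations.isEmpty()) {
            // the @NotNull constraint on 'name' is violated, so the bad payload is rejected
            throw new IllegalStateException("Invalid payload: " + violations);
        }
    }
}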
Alternatively, it is possible to register a special type adapter when creating the gson instance. This type adapter uses reflection to override an internal map with a bit of checking code, allowing for a relatively "clean" solution that makes gson throw an exception when it runs into "unknown" fields. (Thanks to Andy Turner for pointing to the corresponding GitHub issue tracker entry; the code can be found there.)

Custom serialize and deserialize to create JSON

I have a class with 2 values: val1 and val2. I am sending val1 to the register (create) API, and val2 is auto-filled by the API itself. I do not want to send val2 when calling the create API, and that API is not designed to handle unwanted values.
In short, I want to ignore val2 when I call the create API, but I want it when I call the get API.
The code that I have right now creates JSON including both values, assigning null to val2. This causes that API to throw an exception.
Is there any easy way of doing this (Java/Groovy)?
Not 100% sure I'm understanding your need. I believe it depends on what JSON de/serializer you are using. For example, using Jackson we do:
@JsonIgnoreProperties(ignoreUnknown = true)
@JsonTypeName("account")
public class Account {
    // ... fields elided ...
}
I believe this allows us to load objects with a ton of extra JSON fields into objects without corresponding Java fields. To quote from the Javadocs:
Property that defines whether it is ok to just ignore any unrecognized properties during deserialization. If true, all properties that are unrecognized -- that is, there are no setters or creators that accept them -- are ignored without warnings (although handlers for unknown properties, if any, will still be called) without exception.
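A hedged sketch of what that buys you at deserialization time (the Account field and the sample JSON are illustrative): the unknown "val2" property is simply dropped instead of triggering an exception.

import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.databind.ObjectMapper;

@JsonIgnoreProperties(ignoreUnknown = true)
class Account {
    public String val1;
}

public class IgnoreUnknownDemo {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        // "val2" has no matching field on Account, but no exception is thrown
        Account a = mapper.readValue("{\"val1\":\"x\",\"val2\":\"auto-filled\"}", Account.class);
        System.out.println(a.val1); // prints x
    }
}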

What is the best practice to return dynamic type from a REST API in SpringMVC

I have implemented some REST APIs with Spring MVC + Jackson + Hibernate.
All I needed to do was retrieve objects from the database and return them as a list; the conversion to JSON is implicit.
But there is one problem: what if I want to add some more information to those objects before the return/response? For example, I am returning a list of "store" objects, but I want to add the name of the person who is attending right now.
Java does not have dynamic types (which is how I would solve this problem in C#). So how do we solve this problem in Java?
I thought about this and have come up with a few not-so-elegant solutions:
1. Use the factory pattern: define another class which contains the name of that person.
2. Convert the store objects to JSON objects (ObjectNode from Jackson), put a new attribute into the JSON objects, and return the JSON objects.
3. Use reflection to inject a new property into the store objects and return them; maybe the SpringMVC conversion will generate the JSON correctly?
Option 1 looks bad and will end up with a lot of boilerplate classes that aren't really useful. Option 2 looks OK, but is this the best we can do with SpringMVC?
option 1
Actually your JSON domain is different from your core domain. I would decouple them and create a separate domain for your JSON objects, as this is a separate concern and you don't want to mix it. This however might require a lot of 1-to-1 mapping. This is your option 1, with boilerplate. There are frameworks that help you with the boilerplate (such as Dozer or MapStruct), but you will always have a performance penalty with frameworks that use generic reflection.
option 2, 3
If you really insist on hacking it in because it's only a few exceptions and not a common pattern, I would certainly not alter the JSON nodes or use reflection (your options 2 and 3). This is certainly not the way to do it in Java.
option 4 [hack]
What you could do is extend your core domain with new types that contain the extra information and, in a post-processing step, replace the old objects with the new domain objects:
UnaryOperator<Store> toJsonStores = domainStore -> toJsonStore(domainStore);
list.replaceAll(toJsonStores);
where JSONStore extends the domain Store and toJsonStore maps the domain Store to the JSONStore object by adding the person's name.
That way you preserve type safety and keep the codebase comprehensible. But if you have to do it in more than a few exceptional cases, you should change strategy.
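A hedged sketch of that post-processing step; the Store fields, the hard-coded attendant name, and the class holding the helper methods are illustrative assumptions:

import java.util.List;
import java.util.function.UnaryOperator;

class Store {
    public long id;
    public String address;
}

class JSONStore extends Store {
    public String attendantName;  // the extra, presentation-only field

    JSONStore(Store s, String attendantName) {
        this.id = s.id;
        this.address = s.address;
        this.attendantName = attendantName;
    }
}

class StorePostProcessing {
    static Store toJsonStore(Store domainStore) {
        // look up whoever is attending right now; hard-coded here for brevity
        return new JSONStore(domainStore, "Alice");
    }

    static void enrich(List<Store> list) {
        UnaryOperator<Store> toJsonStores = StorePostProcessing::toJsonStore;
        list.replaceAll(toJsonStores);  // Jackson will serialize the subtype's extra field
    }
}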
Are you looking for a REST service that returns a list of objects containing not just one type, but many types of objects? If so, have you tried making the return type of that service method List<Object>?
I recommend creating an abstract class BaseRestResponse that is extended by all the items in the list you want returned by your REST service method.
Then make the return type List<BaseRestResponse>.
BaseRestResponse should have all the common properties, and the customized objects can carry the extra property (the name), as you said.
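A hedged sketch of that layout; the property names and subclasses are illustrative:

abstract class BaseRestResponse {
    public long id;               // properties common to every item
}

class StoreResponse extends BaseRestResponse {
    public String attendantName;  // customized, per-type property
}

class OwnerResponse extends BaseRestResponse {
    public String ownerName;
}

// and the Spring MVC handler method returns the mixed list:
// @RequestMapping("/stores")
// public List<BaseRestResponse> stores() { ... }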

Deserializing selected property names only (Jackson)

Let's say we have the following JSON:
{
    "id": "imgsId1",
    "type": "Fruits",
    "name": "Tropical",
    "image": {
        "url": "images/img1.jpg",
        "width": 300,
        "height": 300
    },
    "thumbnail": {
        "url": "images/thumbnails/img11.jpg",
        "width": 50,
        "height": 50
    }
}
And in the Java class, we have all fields matching the above JSON.
Each time, the list of fields to be deserialized depends on the customer who sends the information.
For example, for customer 1 we only want to read back the following values (and skip other properties even if they are provided in the JSON):
String[] propertiesToFilter1 = {"type","image.url"};
For example, for customer 2 we want to read back the following values (and skip other properties even if they are provided in the JSON):
String[] propertiesToFilter2 = {"type","image.url", "image.width"};
When deserializing JSON using Jackson, is it possible to provide the above array, which specifies which fields need to be deserialized?
ImageInfo obj1 = objectMapper.readValue(jsonStr, ImageInfo.class);
Update:
While researching on the net, I saw that one of the options could be to use
FilterProvider filterProvider = new SimpleFilterProvider().addFilter("filterName1",
        SimpleBeanPropertyFilter.serializeAllExcept(propertiesToFilter1));
objectMapper.setFilters(filterProvider);
But I think this is only good if we want to keep reusing the same "filterName1" for multiple customers.
In this scenario it's a little bit different, because we customize the list of fields each customer can update, so each customer has a different list of JSON fields they can update in different classes.
If we start defining different filter names for each customer, it will be a long list, and the lookup will have a performance impact.
So I was looking for a solution where I can check the list of fields allowed to be processed at runtime, when constructing the object back using the objectMapper.readValue() method.
Update 2 (Apr 25 2016):
Going through other Jackson questions, I saw a similar question here:
Jackson Dynamic filtering of properties during deserialization
Using the approach listed below of creating a custom "static ObjectMapper", the issue is that we run through the Reflection API multiple times:
1. The first time, the Jackson parser populates all fields using the Reflection API when deserializing JSON into a Java object.
2. The second time, since we can't take all the fields that were populated by the Jackson parser, we again need to run through the Reflection API to populate the other object.
This could result in a lot of overhead.
Using the approach defined in the link provided above, I think using "BeanDeserializerModifier" seems to be the best approach. Now the question is: since we are also using a factory-based approach to initialize the ObjectMapper, we don't want to hard-code all the arrays for different customers.
I wanted to check whether it's possible to provide the String[] array with the list of properties to be considered to "BeanDeserializerModifier" at runtime.
Something similar to:
String[] propertiesToFilter2 = {"type", "image.url", "image.width"};
BeanDeserializerModifier curBeanDeserializerModifier = getBeanDeserializerModifierInstance();
curBeanDeserializerModifier.setPropertiesToConsider(propertiesToFilter2);
Thanks
Use @JsonIgnoreProperties with its configuration parameters:
http://www.programcreek.com/java-api-examples/index.php?api=com.fasterxml.jackson.annotation.JsonIgnoreProperties
I am not sure if there is a possibility to configure the deserialization dynamically with annotations.
I would suggest creating a class with a static ObjectMapper. In this class you can create different implementations of the deserialization. The business logic of your application should then decide which implementation is used for which customer. Inside your different implementations you are able to configure the ObjectMapper the way you would with annotations.
A second solution would be to deserialize the full JSON for every customer and let the business logic decide which fields/objects of the POJO are used. This also needs an implementation in your application.
The benefit of implementing the configuration in the business logic is that you will have cleaner code and one place where the configuration is done for every customer.
static ObjectMapper information
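As a hedged sketch of the first suggestion, one way such a static-ObjectMapper helper could accept the per-customer String[] at runtime is to read the JSON into a tree, keep only the whitelisted paths, and then convert the pruned tree into the target class. The helper name, the dot-path convention, and the pruning step are assumptions, not an established Jackson feature:

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;

public class CustomerAwareReader {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    public static <T> T readAllowed(String json, String[] allowed, Class<T> type) throws Exception {
        ObjectNode root = (ObjectNode) MAPPER.readTree(json);
        ObjectNode pruned = MAPPER.createObjectNode();
        for (String path : allowed) {                  // e.g. "type" or "image.url"
            copyPath(root, pruned, path.split("\\."), 0);
        }
        return MAPPER.treeToValue(pruned, type);
    }

    private static void copyPath(ObjectNode from, ObjectNode to, String[] parts, int i) {
        JsonNode child = from.get(parts[i]);
        if (child == null) {
            return;                                    // property absent in the input
        }
        if (i == parts.length - 1) {
            to.set(parts[i], child);                   // keep this leaf
        } else if (child.isObject()) {
            ObjectNode nested = to.with(parts[i]);     // create/reuse the nested object
            copyPath((ObjectNode) child, nested, parts, i + 1);
        }
    }
}

// usage for customer 2:
// ImageInfo obj = CustomerAwareReader.readAllowed(jsonStr, propertiesToFilter2, ImageInfo.class);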

Spring/Jackson Mapping Inner JSON Objects

I have a RESTful web service that provides JSON that I am consuming. I am using Spring 3.2 and Spring's MappingJacksonHttpMessageConverter. My JSON looks like this:
{
    "Daives": {
        "Daive": {},
        "Daive": {},
        "Daive": {},
        "Daive": {}
    }
}
Now everything I have read seems to indicate that this JSON should be refactored into an array of JSON Daives. However, this is valid JSON, so I want to make sure that I am thinking correctly before going back to the service provider to ask for changes. In the format above, I would have to know ahead of time how many Daives there are going to be so that my DTO accounts for them. The handy dandy Jackson mapper isn't going to work with this kind of JSON setup. If the JSON were altered to provide an array of JSON Daives, I could use a List to dynamically map them using Spring/Jackson.
Am I correct? Thanks :)
According to this thread, the JSON spec itself does not forbid multiple fields with the same name (in your case, multiple fields named "Daive" in the object "Daives").
However, most parsers will either return an error or ignore any value but the last one. As you said, putting these values into an array seems much more sensible; and indeed, you'll be able to map this array to a List with Jackson.
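A hedged sketch of the array-based shape and its Jackson mapping; the lower-case property name and the Daive field are illustrative assumptions:

import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.List;

class Daive {
    public String name;               // illustrative field
}

class DaivesWrapper {
    public List<Daive> daives;        // Jackson binds the JSON array onto this list
}

public class MapDaives {
    public static void main(String[] args) throws Exception {
        // refactored JSON: the repeated "Daive" properties become array elements
        String json = "{\"daives\":[{\"name\":\"a\"},{\"name\":\"b\"}]}";
        DaivesWrapper w = new ObjectMapper().readValue(json, DaivesWrapper.class);
        System.out.println(w.daives.size());  // prints 2
    }
}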
