I am trying to deserialize this JSON object using Jackson 2.8 as part of a Retrofit response. Here is the JSON response I get from the server.
{
  "id": "8938209912",
  "version": "1.1",
  "cars": {
    "mercedes": [
      {
        "property": "color"
      },
      {
        "property": "price"
      },
      {
        "property": "location"
      }
    ],
    "tesla": [
      {
        "property": "environment"
      }
    ]
  }
}
Depending on the query, the cars object above may contain one or more models. I cannot create a separate class for each model, as models get created and removed arbitrarily. For each model (say, tesla), there may be one or more property key-value pairs.
I am new to Jackson. I have been looking at several examples, and it looks like a custom @JsonDeserialize is the way to go. So I created the Root and Cars classes like this:
// In file Root.java
public class Root {
    @JsonProperty("id")
    private String id = null;

    @JsonProperty("version")
    private String version = null;

    @JsonProperty("cars")
    private Cars cars = null;
}
// In file Cars.java
public class Cars {
    public Cars() {}

    @JsonDeserialize(using = CarDeserializer.class)
    private Map<String, List<Property>> properties;

    public Map<String, List<Property>> getProperties() {
        return properties;
    }

    public void setProperties(Map<String, List<Property>> properties) {
        this.properties = properties;
    }
}
// Property.java
public class Property {
    @JsonProperty("property")
    private String property;
}
My deserializer is below. However, even though the empty constructor gets called, the deserialize() method itself is never called!
// In file CarDeserializer.java
public class CarDeserializer extends StdDeserializer<Map<String, List<Property>>> {

    protected CarDeserializer() {
        super(Class.class);
    }

    @Override
    public Map<String, List<Property>> deserialize(JsonParser parser, DeserializationContext ctx)
            throws IOException, JsonProcessingException {
        // This method never gets invoked.
    }
}
My questions:
Is this the right approach in the first place?
Why does the execution never reach deserialize()? (I checked; the cars object is present in the JSON.)
Are there better approaches to parse this JSON using Jackson?
The "properties" deserializer is never called because nothing in that JSON matches it. The field name in the JSON is "property", and it does not match Map<String, List<Property>>; it looks like it would be closer to List<Property>.
Do you control the incoming JSON? It would be better for the car name/type to be in its own field rather than being the name of the object; then you could use a generic object. What you have now is going to break any time they add a new name/type and you do not have a matching object for it.
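For what it's worth, with the JSON shape shown above, Jackson can bind the "cars" object straight into a Map<String, List<Property>> on Root itself, with no Cars wrapper or custom deserializer. A minimal sketch (getters/setters omitted):
public class Root {
    @JsonProperty("id")
    private String id;

    @JsonProperty("version")
    private String version;

    // The model names ("mercedes", "tesla", ...) become the map keys,
    // so new models need no extra classes.
    @JsonProperty("cars")
    private Map<String, List<Property>> cars;
}

// Usage sketch (assuming a getCars() getter):
// Root root = new ObjectMapper().readValue(json, Root.class);
// List<Property> teslaProps = root.getCars().get("tesla");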
Related
I have a JSON string which I would like to translate into POJO using ObjectMapper.readValue method.
The thing is that the input Json string contains keys which I would like to filter out before the deserialization.
I came across the DelegatingDeserializer class which, as I understand it, allows you to extend it and override one of its deserialize methods to reconstruct the JSON input and then pass it on down the chain.
The thing is that I try to enable this custom delegating deserializer by adding
@JsonDeserialize(using = CustomDeserializer.class) on top of my POJO - is that the right way to register it?
Here is a snippet of my custom delegator:
public static class CustomDeserializer extends DelegatingDeserializer {

    public CustomDeserializer() {
        super(null);
    }

    public CustomDeserializer(JsonDeserializer<?> defaultDeserializer) {
        super(defaultDeserializer);
    }

    @Override
    protected JsonDeserializer<?> newDelegatingInstance(JsonDeserializer<?> newDelegatee) {
        return new CustomDeserializer(newDelegatee);
    }

    @Override
    public Object deserialize(JsonParser p, DeserializationContext ctxt) throws IOException {
        return super.deserialize(restructure(p), ctxt);
    }

    private JsonParser restructure(JsonParser jp) throws IOException {
        ...
        return newJsonParser;
    }
}
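For illustration, a minimal sketch of what restructure() might look like, assuming the keys to drop are known up front ("internalKey" is just a placeholder name):
private JsonParser restructure(JsonParser jp) throws IOException {
    ObjectNode tree = jp.readValueAsTree();                       // read the whole subtree
    tree.remove("internalKey");                                   // drop the unwanted key(s)
    JsonParser newJsonParser = jp.getCodec().treeAsTokens(tree);  // re-expose the filtered tree as a parser
    newJsonParser.nextToken();                                    // position on the first token for the delegate
    return newJsonParser;
}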
Am I on the right path, or is there a more fitting solution?
Thank you!
EDIT 1
Another approach is to have a CustomJsonDeserializer extends JsonDeserializer<T>, override its deserialize method, reconstruct the node, and propagate it by returning codec.treeToValue(jsonNode, Pojo.class); this makes sense, BUT it gets me into an infinite loop! Any idea why?
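For clarity, the shape of that second approach looks roughly like this (a sketch; Pojo and the removed key are placeholders):
public class CustomJsonDeserializer extends JsonDeserializer<Pojo> {
    @Override
    public Pojo deserialize(JsonParser p, DeserializationContext ctxt) throws IOException {
        ObjectCodec codec = p.getCodec();
        JsonNode jsonNode = codec.readTree(p);
        ((ObjectNode) jsonNode).remove("unwantedKey");   // filter the keys here
        // treeToValue() resolves the deserializer for Pojo again; if this class is
        // registered for Pojo via @JsonDeserialize, that re-entry is the likely
        // source of the infinite loop.
        return codec.treeToValue(jsonNode, Pojo.class);
    }
}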
Assuming your POJO doesn't have a property that you would like to ignore, you can use the annotation @JsonIgnoreProperties(ignoreUnknown = true) on your class. That tells Jackson to ignore properties that are not present in your POJO. Read more on how to ignore properties here: Jackson Unmarshalling JSON with Unknown Properties
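For example, a minimal sketch (the Pojo class and its field are placeholders):
@JsonIgnoreProperties(ignoreUnknown = true)
public class Pojo {
    public String name;   // keys present in the JSON but not declared here are silently skipped
}

// Usage sketch:
// Pojo pojo = new ObjectMapper().readValue(json, Pojo.class);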
I use Spring MVC to drive the API of an application I am currently working on. The serialization of the API response is done via Jackson's ObjectMapper. I am faced with the following situation: we are extending a number of our objects to support UserDefinedFields (UDFs), as shown below in the abstract UserDefinedResponse. Being a SaaS solution, different clients have different configuration for their custom fields, stored in the database.
The goal of this question is to be able to respond to each client with their UDF data. This would require:
Dynamically renaming the fields customString1, customString2, ... to their corresponding UDF labels
Removing undefined UDF fields (e.g. a client uses only 2 of the 4 fields)
Example of the abstract response
public abstract class UserDefinedResponse {
    public String customString1;
    public String customString2;
    public String customString3;
    public String customString4;
}
And response for a product that extends the UserDefinedResponse object
public class Product extends UserDefinedResponse {
    public long id;
    public String name;
    public float price;
}
And finally, assuming a client sets
customString1 = "supplier"
customString2 = "warehouse"
Serializing Product for this customer should result in something similar to this:
{
"id" : 1234,
"name" : "MacBook Air",
"price" : 1299,
"supplier" : "Apple",
"warehouse" : "New York warehouse"
}
I think you could do what you need with the help of a few Jackson annotations:
public abstract class UserDefinedResponse {

    @JsonIgnore
    public String customString1;

    @JsonIgnore
    public String customString2;

    @JsonIgnore
    public String customString3;

    @JsonIgnore
    public String customString4;

    @JsonIgnore // Remove if clientId must be serialized
    public String clientId;

    private Map<String, Object> dynamicProperties = new HashMap<>();

    @JsonAnyGetter
    public Map<String, Object> getDynamicProperties() {
        Mapper.fillDynamicProperties(this, this.dynamicProperties);
        return this.dynamicProperties;
    }

    @JsonAnySetter
    public void setDynamicProperty(String name, Object value) {
        this.dynamicProperties.put(name, value);
        Mapper.setDynamicProperty(this.dynamicProperties, name, this);
    }
}
First, annotate all the properties of your base class with @JsonIgnore, as these won't be part of the response. Then, make use of the @JsonAnyGetter annotation to flatten the dynamicProperties map, which will hold the dynamic properties. Finally, the @JsonAnySetter annotation is meant to be used by Jackson on deserialization.
The missing part is the Mapper utility class:
public abstract class Mapper<T extends UserDefinedResponse> {

    // One mapper per (entity class, client id) pair
    private static final Map<Class<?>, Map<String, Mapper<?>>> MAPPERS = new HashMap<>();

    static {
        // Mappers for Products
        Map<String, Mapper<?>> productMappers = new HashMap<>();
        productMappers.put("CLIENT_1", new ProductMapperClient1());
        productMappers.put("CLIENT_2", new ProductMapperClient2());
        // etc. for the rest of the clients
        MAPPERS.put(Product.class, productMappers);

        // Mappers for Providers
        Map<String, Mapper<?>> providerMappers = new HashMap<>();
        providerMappers.put("CLIENT_1", new ProviderMapperClient1());
        providerMappers.put("CLIENT_2", new ProviderMapperClient2());
        // etc. for the rest of the clients
        MAPPERS.put(Provider.class, providerMappers);

        // etc. for the rest of the entities
        // (each entity needs to register specific mappers for every client)
    }

    protected Mapper() {
    }

    @SuppressWarnings("unchecked")
    public static <T extends UserDefinedResponse> void fillDynamicProperties(T response, Map<String, Object> dynamicProperties) {
        // Get the mapper for this entity and client
        Mapper<T> mapper = (Mapper<T>) MAPPERS.get(response.getClass()).get(response.clientId);
        // Perform entity -> map mapping
        mapper.mapFromEntity(response, dynamicProperties);
    }

    @SuppressWarnings("unchecked")
    public static <T extends UserDefinedResponse> void setDynamicProperty(Map<String, Object> dynamicProperties, String name, T response) {
        // Get the mapper for this entity and client
        Mapper<T> mapper = (Mapper<T>) MAPPERS.get(response.getClass()).get(response.clientId);
        // Perform map -> entity mapping
        mapper.mapToEntity(dynamicProperties, name, response);
    }

    protected abstract void mapFromEntity(T response, Map<String, Object> dynamicProperties);

    protected abstract void mapToEntity(Map<String, Object> dynamicProperties, String name, T response);
}
And for Product entity and client CLIENT_1:
public class ProductMapperClient1 extends Mapper<Product> {

    @Override
    protected void mapFromEntity(Product response, Map<String, Object> dynamicProperties) {
        // Actual mapping from Product and CLIENT_1 to the map
        dynamicProperties.put("supplier", response.customString1);
        dynamicProperties.put("warehouse", response.customString2);
    }

    @Override
    protected void mapToEntity(Map<String, Object> dynamicProperties, String name, Product response) {
        // Actual mapping from the map and CLIENT_1 to Product
        String property = (String) dynamicProperties.get(name);
        if ("supplier".equals(name)) {
            response.customString1 = property;
        } else if ("warehouse".equals(name)) {
            response.customString2 = property;
        }
    }
}
The idea is that there's a specific mapper for each (entity, client) pair. If you have many entities and/or clients, then you might consider filling the map of mappers dynamically, maybe reading from some config file and using reflection to read the properties of the entity.
Have you considered returning a Map<> as the response? Or as part of the response, like response.getUDF().get("customStringX")? This should save you some possible trouble in the future, e.g. 10 million concurrent users could mean 10 million classes in your VM.
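If it helps, a minimal sketch of that shape (the udf field name is a placeholder):
public class Product {
    public long id;
    public String name;
    public float price;

    // UDF label -> value, filled per client before the response is returned,
    // so only the labels that client actually defined show up
    public Map<String, Object> udf = new HashMap<>();
}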
I'm using Amazon's DynamoDBMapper Java class to save data to a DynamoDB table. This code needs to work for data structured in multiple different ways, so I would like to stay away from writing structure-specific code. For this reason, I store the data as JSON objects in Java -- which are basically glorified HashMaps.
I would like to store these JSON objects into DynamoDB as Dynamo's relatively new JSON Document type.
The way the DynamoDBMapper API works is essentially that you write a Java class (typically a POJO), then add some annotations, then pass your objects of that class into DynamoDBMapper so that it can then put items into the database with the structure of the Java class. This works well for many aspects of what I'm doing, but not with the fact that I want these classes to contain arbitrarily-structured JSON documents. This is the way you're meant to store JSON documents using DynamoDBMapper, and as you can see, it doesn't allow for the structure of the documents to be arbitrary.
I realize I could use Dynamo's putItem() to pass the JSON as strings into Item objects -- I just wanted to see if what I want to do is possible with DynamoDBMapper before I shift my approach.
You can try using the DynamoDB Java document SDK instead of the object mapper. This allows you to serialize and deserialize JSON strings using the fromJSON and toJSON methods in the Item class. Check out http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/JavaDocumentAPIItemCRUD.html.
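A minimal sketch of that approach (the table name, key, and jsonString variable are placeholders; assumes the v1 Java SDK document API):
AmazonDynamoDB client = AmazonDynamoDBClientBuilder.defaultClient();
DynamoDB dynamoDB = new DynamoDB(client);
Table table = dynamoDB.getTable("MyTable");

// Build an Item straight from an arbitrary JSON string and save it
Item item = Item.fromJSON(jsonString).withPrimaryKey("id", "1234");
table.putItem(item);

// Read it back and turn it into JSON again
Item stored = table.getItem("id", "1234");
String json = stored.toJSON();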
Here's how I ended up storing arbitrary Map objects in DynamoDB. This is extremely useful for archiving REST API responses that have been unmarshaled into foreign objects. I'm personally using this to archive REST responses from the PayPal Payment API. I don't care what variables they use in their REST API or how their POJOs/beans are structured; I just want to make sure I save everything.
@DynamoDBTable(tableName = "PaymentResponse")
@JsonIgnoreProperties(ignoreUnknown = true)
@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, include = JsonTypeInfo.As.PROPERTY)
@JsonSubTypes({
    @JsonSubTypes.Type(value = PayPalPaymentResponse.class, name = "PayPalPaymentResponse"),
    @JsonSubTypes.Type(value = BatchPayPalPaymentResponse.class, name = "BatchPayPalPaymentResponse")}
)
public abstract class PaymentResponse {

    // store any arbitrary REST response data in map form so we don't have to worry about the
    // structure of the actual response itself
    protected Map<String, String> paymentResponseData = Maps.newHashMap();

    protected PaymentResponseType paymentResponseType;

    public PaymentResponse(PaymentResponseType paymentResponseType) {
        this.paymentResponseType = paymentResponseType;
    }

    // constructor used by the concrete subclasses shown below
    public PaymentResponse(Map<String, String> paymentResponseData) {
        this.paymentResponseData = paymentResponseData;
    }

    public Map<String, String> getPaymentResponseData() { return paymentResponseData; }

    public void setPaymentResponseData(Map<String, String> paymentResponseData) { this.paymentResponseData = paymentResponseData; }

    @Override
    public String toString() {
        return Arrays.toString(paymentResponseData.entrySet().toArray());
    }
}
public class ConverterUtils {

    // shared mapper used for the object -> Map conversions below
    private static final ObjectMapper objectMapper = new ObjectMapper();

    public static BatchPayPalPaymentResponse getBatchPayPalPaymentResponse(PayoutBatch payoutBatch) throws IOException {
        // read in the PayoutBatch response data and convert it first to a JSON string and then convert the
        // JSON string into a Map<String, String>
        Map<String, String> responseData = objectMapper.readValue(objectMapper.writeValueAsString(payoutBatch), new TypeReference<Map<String, String>>() {});
        BatchPayPalPaymentResponse batchPayPalPaymentResponse = new BatchPayPalPaymentResponse(responseData);
        return batchPayPalPaymentResponse;
    }

    public static PayPalPaymentResponse getSinglePayPalPaymentResponse(PayoutItemDetails payoutItemDetails) throws IOException {
        // read in the PayPal PayoutItemDetails response data and convert it first to a JSON string and then convert the
        // JSON string into a Map<String, String>
        Map<String, String> responseData = objectMapper.readValue(objectMapper.writeValueAsString(payoutItemDetails), new TypeReference<Map<String, String>>() {});
        PayPalPaymentResponse payPalPaymentResponse = new PayPalPaymentResponse(responseData);
        return payPalPaymentResponse;
    }
}
public class BatchPayPalPaymentResponse extends PaymentResponse {
    public BatchPayPalPaymentResponse(Map<String, String> responseData) {
        super(responseData);
    }
    ....
}
public class PayPalPaymentResponse extends PaymentResponse {
    public PayPalPaymentResponse(Map<String, String> responseData) {
        super(responseData);
    }
    ....
}
Now you can just call mapper.save(instanceOfPaymentResponse). Note that my code also includes how to use a Jackson parser to pick and choose which subclass of PaymentResponse to unmarshal to. That's because I use a DynamoDBTypeConverter to marshal my class to a string before putting it into the database.
Finally, I'll throw in my converter for completeness so it all hopefully makes sense.
public class PaymentResponseConverter implements DynamoDBTypeConverter<String, PaymentResponse> {

    private static final ObjectMapper objectMapper = new ObjectMapper();

    static {
        objectMapper.setSerializationInclusion(JsonInclude.Include.NON_NULL);
    }

    @Override
    public String convert(PaymentResponse object) {
        try {
            return objectMapper.writeValueAsString(object);
        } catch (JsonProcessingException e) {
            throw new IllegalArgumentException(String.format("Received invalid instance of PaymentResponse and cannot marshal it to a string (%s)", e.getMessage()));
        }
    }

    @Override
    public PaymentResponse unconvert(String object) {
        try {
            return objectMapper.readValue(object, PaymentResponse.class);
        } catch (IOException e) {
            throw new IllegalArgumentException(String.format("Unable to convert JSON to instance of PaymentResponse. This is a fatal error. (%s)", e.getMessage()));
        }
    }
}
I had the same problem and went the route of serializing and deserializing objects to JSON strings myself and then just storing them as strings. The whole Document concept of DynamoDB is, IMHO, just a glorified object serializer. Only if you need to access attributes inside your object in DynamoDB operations (e.g. scans, projections) does it make sense to use the JSON document type. If your data is opaque to DynamoDB, you are better off with strings.
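A minimal sketch of that string-based approach with DynamoDBMapper (table, key, and field names here are placeholders):
@DynamoDBTable(tableName = "Documents")
public class DocumentRecord {
    private String id;
    private String payloadJson;   // the opaque JSON blob

    @DynamoDBHashKey(attributeName = "id")
    public String getId() { return id; }
    public void setId(String id) { this.id = id; }

    @DynamoDBAttribute(attributeName = "payload")
    public String getPayloadJson() { return payloadJson; }
    public void setPayloadJson(String payloadJson) { this.payloadJson = payloadJson; }
}

// Usage sketch:
// DocumentRecord record = new DocumentRecord();
// record.setId("1234");
// record.setPayloadJson(new ObjectMapper().writeValueAsString(arbitraryMap));
// dynamoDBMapper.save(record);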
I am trying to be able to define the following code:
public class MyObject {
    private String name;
    ... // Other attributes
}

@Path(...)
@Stateless
public class MyRestResource {
    @POST
    @Consumes(MediaType.APPLICATION_JSON)
    public Response create(List<MyObject> myObjects) {
        // Do some stuff there
    }
}
I know that I need to use:
DeserializationConfig.Feature.ACCEPT_SINGLE_VALUE_AS_ARRAY, true
to set up my object mapper correctly so that it accepts a single value as an array on my REST resources. I succeeded in setting up that part.
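For reference, a minimal sketch of how that feature is enabled on the mapper (this is the Jackson 1.x constant named above; in Jackson 2 the equivalent is DeserializationFeature.ACCEPT_SINGLE_VALUE_AS_ARRAY):
ObjectMapper mapper = new ObjectMapper();
// Allow a lone JSON object to be bound where a List<MyObject> is expected
mapper.configure(DeserializationConfig.Feature.ACCEPT_SINGLE_VALUE_AS_ARRAY, true);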
My problem with this approach is that the following content is not differentiable:
{
"name": "a name",
... // other attributes
}
and
[{
"name": "a name",
... // other attributes
}]
will result in a List of size one. Then, in the create(List<MyObject> myObjects) method, I will not be able to tell the difference between a list and a single object sent to the REST resource.
My question, then, is how to do something like that. The idea is to have only one @POST that accepts both arrays and single values.
Ideally, I would get rid of the ObjectMapper configuration to avoid allowing single objects at other levels of the JSON document. For example, I do not want to allow this:
{
...
"attributes": {
...
}
}
where normally this format should be mandatory:
{
...
"attributes": [{
...
}]
}
Based on that, I tried to put in place an object wrapper around my list to see whether I can tell the difference between the list and the object, with something like this:
public class ObjectWrapper<T> {
    private List<T> list;
    private T object;

    public boolean isObject() {
        return list == null;
    }
}
with the resource becoming:
@Path(...)
@Stateless
public class MyRestResource {
    @POST
    @Consumes(MediaType.APPLICATION_JSON)
    public Response create(ObjectWrapper myObjects) {
        // Do some stuff there
    }
}
and trying to put in place the deserialization of my content through the JAX-RS/Jersey/Jackson mechanisms. If I leave the solution as it is now, the deserialization fails because the expected JSON format is the following:
{
"list": [{
"name": "a name",
... // other attributes
}]
}
Then I tried to write a custom deserializer, but I am a bit lost in this task. I have something like this:
public class ObjectWrapperDeserializer<T> extends JsonDeserializer<T> {
    @Override
    public T deserialize(JsonParser jp, DeserializationContext ctxt) throws IOException, JsonProcessingException {
        ... // What to put there to deserialize an Array or an Object?
    }
}
I just want to deserialize the root level and put the deserialized content into the object wrapper. I also want to keep the feature configured in the class annotated with @ApplicationPath, where the configuration of the different @Provider classes is done.
I hope that all the info will give a sufficient picture of what I want to do and what I already tested.
I am waiting for suggestions on how to write a resource that accepts arrays or objects on the same path.
Thanks a lot in advance.
OK, I finally succeeded in putting in place a mechanism that does exactly what I am looking for. However, I am not sure whether there are negative consequences, for performance or otherwise.
First, I defined a class that can hold either a list or a single object:
public class RootWrapper<T> {
    private List<T> list;
    private T object;
}
Then I need a custom deserializer that knows which concrete type T to deserialize and handles both the collection and the single-object case.
public class RootWrapperDeserializer extends JsonDeserializer<RootWrapper<?>> {

    private Class contentType;

    public RootWrapperDeserializer(Class contentType) {
        this.contentType = contentType;
    }

    @Override
    public RootWrapper deserialize(JsonParser jp, DeserializationContext ctxt)
            throws IOException, JsonProcessingException {
        // Retrieve the object mapper and read the tree.
        ObjectMapper mapper = (ObjectMapper) jp.getCodec();
        JsonNode root = mapper.readTree(jp);

        RootWrapper wrapper = new RootWrapper();

        // Check if the root received is an array.
        if (root.isArray()) {
            List list = new LinkedList();
            // Deserialize each node of the array using the expected type.
            Iterator<JsonNode> rootIterator = root.getElements();
            while (rootIterator.hasNext()) {
                list.add(mapper.readValue(rootIterator.next(), contentType));
            }
            wrapper.setList(list);
        }
        // Deserialize the single object.
        else {
            wrapper.setObject(mapper.readValue(root, contentType));
        }

        return wrapper;
    }
}
The idea is to deserialize only the root level manually and then let Jackson take charge of the rest. I only have to know which real type I expect to be present in the wrapper.
At this stage, I need a way to tell Jersey/Jackson which deserializer to use. One way I found is to create a sort of deserializer registry that stores, for each type to deserialize, the right deserializer. I extended the Deserializers.Base class for that.
public class CustomDeserializers extends Deserializers.Base {

    // Deserializer caching
    private Map<Class, RootWrapperDeserializer> deserializers = new HashMap<>();

    @Override
    public JsonDeserializer<?> findBeanDeserializer(JavaType type,
            DeserializationConfig config, DeserializerProvider provider,
            BeanDescription beanDesc, BeanProperty property) throws JsonMappingException {
        // Check if we have to provide a deserializer
        if (type.getRawClass() == RootWrapper.class) {
            // Check the deserializer cache
            if (deserializers.containsKey(type.getRawClass())) {
                return deserializers.get(type.getRawClass());
            }
            else {
                // Create the new deserializer and cache it.
                RootWrapperDeserializer deserializer =
                        new RootWrapperDeserializer(type.containedType(0).getRawClass());
                deserializers.put(type.getRawClass(), deserializer);
                return deserializer;
            }
        }
        return null;
    }
}
OK, so I have my deserializer registry, which creates new deserializers only on demand and keeps them once created. What I am not sure about with that approach is whether there is any concurrency issue. I know that Jackson does a lot of caching and does not call findBeanDeserializer every time once it has been called a first time for a specific deserialization context.
Now that I have created my different classes, I need to do some plumbing to combine everything. In the provider where I create the ObjectMapper, I can register the deserializer registry on the created object mapper like below:
@Provider
@Produces(MediaType.APPLICATION_JSON)
public class JsonObjectMapper implements ContextResolver<ObjectMapper> {

    private ObjectMapper jacksonObjectMapper;

    public JsonObjectMapper() {
        jacksonObjectMapper = new ObjectMapper();
        // Do some custom configuration...

        // Configure a new deserializer registry
        jacksonObjectMapper.setDeserializerProvider(
            jacksonObjectMapper.getDeserializerProvider().withAdditionalDeserializers(
                new CustomDeserializers()
            )
        );
    }

    @Override
    public ObjectMapper getContext(Class<?> arg0) {
        return jacksonObjectMapper;
    }
}
Then, I can also define my @ApplicationPath REST application like the following:
public abstract class AbstractRestApplication extends Application {

    private Set<Class<?>> classes = new HashSet<>();

    public AbstractRestApplication() {
        classes.add(JacksonFeature.class);
        classes.add(JsonObjectMapper.class);
        addResources(classes);
    }

    @Override
    public Set<Class<?>> getClasses() {
        return classes;
    }

    @Override
    public Set<Object> getSingletons() {
        final Set<Object> singletons = new HashSet<>(1);
        singletons.add(new JacksonJsonProvider());
        return singletons;
    }

    private void addResources(Set<Class<?>> classes) {
        classes.add(SomeRestResource.class);
        // ...
    }
}
Now everything is in place, and I can write a REST resource method like this:
@POST
@Path("somePath")
@Consumes(MediaType.APPLICATION_JSON)
@Produces(MediaType.APPLICATION_JSON)
public Response create(RootWrapper<SpecificClass> wrapper) {
    if (wrapper.isObject()) {
        // Do something for one single object
        SpecificClass sc = wrapper.getObject();
        // ...
        return Response.ok(resultSingleObject).build();
    }
    else {
        // Do something for a list of objects
        for (SpecificClass sc : wrapper.getList()) {
            // ...
        }
        return Response.ok(resultList).build();
    }
}
That's all. Do not hesitate to comment on the solution; feedback is really welcome, especially about the deserialization process, where I am really not sure it is safe in terms of performance and concurrency.
I have the following class, which contains a String field and a Map field. I want to use Jackson to serialize it to JSON.
public class Mapping {

    private String mAttribute;

    @JsonIgnore
    private Map<String, String> mMap;

    @JsonAnyGetter
    public Map<String, String> getMap() {
        // some logic to populate the map
    }

    @JsonAnySetter
    public void put(/* some params */) {
        // some more logic
    }

    @JsonProperty(value = "attribute")
    public String getAttribute() {
        return mAttribute;
    }

    public void setAttribute(String aAttribute) {
        mAttribute = aAttribute;
    }
}
I instantiate a Mapping object and then use ObjectMapper to write it to a file.
ObjectMapper om = new ObjectMapper();
om.writeValue(destFile, myMappingObject);
For some reason, it's writing the Mapping instance myMappingObject twice. I'm assuming I've not set some visibility option somewhere but I don't know where.
The JSON looks like this, only it comes up twice in the file.
{
"attribute" : "someValue",
"map-key1" : "map-value1",
"map-key2" : "map-value2"
}
There's this, but apparently it was fixed in a previous version of Jackson. I also tried changing the name of the method to random(), and it still gets called twice (the number of times it should).
The problem had nothing to do with the above class. I was using another class that had a list of Mappings. Before:
public class MappingsList {
    @JsonProperty
    private List<Mapping> mappings;

    public List<Mapping> getMappings() { return mappings; }
}
After:
public class MappingsList {
    private List<Mapping> mappings;

    @JsonProperty
    public List<Mapping> getMappings() { return mappings; }
}
And it worked. The cause was that the ObjectMapper was seeing two properties in the MappingsList class and therefore serializing both: first it would create JSON for the mappings field, and then again for the getMappings() method.