I'm using Amazon's DynamoDBMapper Java class to save data to a DynamoDB table. This code needs to work for data structured in multiple different ways, so I would like to stay away from writing structure-specific code. For this reason, I store the data as JSON objects in Java, which are basically glorified HashMaps.
I would like to store these JSON objects into DynamoDB as Dynamo's relatively new JSON Document type.
The way the DynamoDBMapper API works is essentially that you write a Java class (typically a POJO), add some annotations, and then pass objects of that class into DynamoDBMapper so that it can put items into the database with the structure of the Java class. This works well for many aspects of what I'm doing, but not for the fact that I want these classes to contain arbitrarily-structured JSON documents. The documented way of storing JSON documents with DynamoDBMapper follows this same pattern, and as you can see, it doesn't allow the structure of the documents to be arbitrary.
I realize I could use Dynamo's putItem() to pass the JSON as Strings into Item objects; I just wanted to see if what I want to do is possible with DynamoDBMapper before I shift my approach.
You can try using the DynamoDB Java document SDK instead of the object mapper. This allows you to serialize and deserialize JSON strings using the fromJSON and toJSON methods in the Item class. Check out http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/JavaDocumentAPIItemCRUD.html.
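For example, here is a minimal sketch of that approach with the document API (com.amazonaws.services.dynamodbv2.document); the table name "MyTable", the hash key "id" and the sample JSON are illustrative, and client construction may differ depending on your SDK version:
AmazonDynamoDB client = AmazonDynamoDBClientBuilder.defaultClient();
DynamoDB dynamoDB = new DynamoDB(client);
Table table = dynamoDB.getTable("MyTable");

// Any arbitrarily nested JSON string becomes a full DynamoDB document,
// as long as it contains the table's key attributes.
String json = "{\"id\":\"123\",\"payload\":{\"tags\":[\"a\",\"b\"],\"count\":2}}";
table.putItem(Item.fromJSON(json));

// And back to a JSON string on the way out.
Item item = table.getItem("id", "123");
String roundTripped = item.toJSON();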
Here's the answer I came up with for storing arbitrary Map objects in DynamoDB. This is extremely useful for archiving REST API responses that have been unmarshaled to foreign objects. I'm personally using this to archive REST responses from the PayPal Payment API. I don't care what variables they use in their REST API or the structure of their POJOs / beans. I just want to make sure I save everything.
@DynamoDBTable(tableName = "PaymentResponse")
@JsonIgnoreProperties(ignoreUnknown = true)
@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, include = JsonTypeInfo.As.PROPERTY)
@JsonSubTypes({
        @JsonSubTypes.Type(value = PayPalPaymentResponse.class, name = "PayPalPaymentResponse"),
        @JsonSubTypes.Type(value = BatchPayPalPaymentResponse.class, name = "BatchPayPalPaymentResponse")
})
public abstract class PaymentResponse {

    // store any arbitrary REST response data in map form so we don't have to worry about the
    // structure or the actual response itself
    protected Map<String, String> paymentResponseData = Maps.newHashMap();

    protected PaymentResponseType paymentResponseType; // declared here so the snippet compiles

    protected PaymentResponse(PaymentResponseType paymentResponseType) {
        this.paymentResponseType = paymentResponseType;
    }

    // matches the super(responseData) calls in the concrete subclasses below
    protected PaymentResponse(Map<String, String> paymentResponseData) {
        this.paymentResponseData = paymentResponseData;
    }

    public Map<String, String> getPaymentResponseData() { return paymentResponseData; }

    public void setPaymentResponseData(Map<String, String> paymentResponseData) { this.paymentResponseData = paymentResponseData; }

    @Override
    public String toString() {
        return Arrays.toString(paymentResponseData.entrySet().toArray());
    }
}
public class ConverterUtils {

    // shared Jackson mapper (declared here so the snippet compiles)
    private static final ObjectMapper objectMapper = new ObjectMapper();

    public static BatchPayPalPaymentResponse getBatchPayPalPaymentResponse(PayoutBatch payoutBatch) throws IOException {
        // read in the PayoutBatch response data and convert it first to a JSON string and then convert the
        // JSON string into a Map<String, String>
        Map<String, String> responseData = objectMapper.readValue(objectMapper.writeValueAsString(payoutBatch),
                new TypeReference<Map<String, String>>() {});
        BatchPayPalPaymentResponse batchPayPalPaymentResponse = new BatchPayPalPaymentResponse(responseData);
        return batchPayPalPaymentResponse;
    }

    public static PayPalPaymentResponse getSinglePayPalPaymentResponse(PayoutItemDetails payoutItemDetails) throws IOException {
        // read in the PayPal PayoutItemDetails response data and convert it first to a JSON string and then convert the
        // JSON string into a Map<String, String>
        Map<String, String> responseData = objectMapper.readValue(objectMapper.writeValueAsString(payoutItemDetails),
                new TypeReference<Map<String, String>>() {});
        PayPalPaymentResponse payPalPaymentResponse = new PayPalPaymentResponse(responseData);
        return payPalPaymentResponse;
    }
}
public class BatchPayPalPaymentResponse extends PaymentResponse {

    public BatchPayPalPaymentResponse(Map<String, String> responseData) {
        super(responseData);
    }

    ....
}

public class PayPalPaymentResponse extends PaymentResponse {

    public PayPalPaymentResponse(Map<String, String> responseData) {
        super(responseData);
    }

    ....
}
Now you can just call mapper.save(instanceOfPaymentResponse). Note that my code also includes how to use a Jackson parser to pick and choose which sub-class of PaymentResponse to unmarshal to. That's because I use a DynamoDBTypeConverter to marshal my class to a string before putting it into the database.
Finally, I'll throw in my converter for completeness so it all hopefully makes sense.
public class PaymentResponseConverter implements DynamoDBTypeConverter<String, PaymentResponse> {

    private static final ObjectMapper objectMapper = new ObjectMapper();

    static {
        objectMapper.setSerializationInclusion(JsonInclude.Include.NON_NULL);
    }

    @Override
    public String convert(PaymentResponse object) {
        try {
            return objectMapper.writeValueAsString(object);
        } catch (JsonProcessingException e) {
            throw new IllegalArgumentException(String.format("Received invalid instance of PaymentResponse and cannot marshal it to a string (%s)", e.getMessage()));
        }
    }

    @Override
    public PaymentResponse unconvert(String object) {
        try {
            return objectMapper.readValue(object, PaymentResponse.class);
        } catch (IOException e) {
            throw new IllegalArgumentException(String.format("Unable to convert JSON to instance of PaymentResponse. This is a fatal error. (%s)", e.getMessage()));
        }
    }
}
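For completeness, here is a hedged sketch of how such a converter is typically attached to a mapped attribute with the @DynamoDBTypeConverted annotation; the PaymentResponseRecord wrapper class and its field names are purely illustrative and not part of the code above:
@DynamoDBTable(tableName = "PaymentResponse")
public class PaymentResponseRecord {

    private String id;
    private PaymentResponse response;

    @DynamoDBHashKey(attributeName = "id")
    public String getId() { return id; }
    public void setId(String id) { this.id = id; }

    // The whole PaymentResponse is marshaled to a single string attribute by the converter above.
    @DynamoDBTypeConverted(converter = PaymentResponseConverter.class)
    public PaymentResponse getResponse() { return response; }
    public void setResponse(PaymentResponse response) { this.response = response; }
}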
I had the same problem and went the route of serializing and deserializing objects to JSON strings myself and then just storing them as strings. The whole Document concept of DynamoDB is, IMHO, just a glorified object serializer. It only makes sense to use the JSON document type if you need to access attributes inside your object in DynamoDB operations (e.g. scans, projections). If your data is opaque to DynamoDB, you are better off with strings.
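As a rough sketch of what that looks like with Jackson and the document API (the table name "Responses", the key "id", the attribute "payload" and the local variables are illustrative):
ObjectMapper jackson = new ObjectMapper();
String json = jackson.writeValueAsString(myArbitraryMap); // myArbitraryMap is whatever opaque data you hold

Table table = new DynamoDB(AmazonDynamoDBClientBuilder.defaultClient()).getTable("Responses");
table.putItem(new Item().withPrimaryKey("id", someId).withString("payload", json));

// Reading it back is the reverse:
Item item = table.getItem("id", someId);
Map<String, Object> restored = jackson.readValue(item.getString("payload"),
        new TypeReference<Map<String, Object>>() {});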
I am creating an API (written in Java) which I am deploying through serverless to an AWS Lambda function. All aspects of the API work great except for the fact that the responses which are returned include the '\' character in front of all quotes.
To put this into perspective, I have a Person class which contains instance variables for name (String) and mood (String). I then have a class which uses this one to get and create a Person object, and Jackson is then used to parse it into JSON format. This is what is returned to the handler function (for Lambda) and is set as the "object body".
public class Person {
    String name;
    String mood;
    //getters and setters and constructor
}
Then, later on there will be something in a different class like
Person person = new Person("bob", "good");
Which would be passed into my method which is supposed to convert things to JSON:
private String convStrToJson(Person person) throws JsonProcessingException {
    ObjectMapper mapper = new ObjectMapper();
    String json = mapper.writeValueAsString(person);
    return json;
}
If I were to print this in the output, I'd get something like:
{"name":"bob","mood":"good"}
Which is what I want and expect. However, when deployed and called via GET request, the result is:
"{\"name\":\"bob\",\"mood\":\"good\"}"
I've tried several strategies, including additions to the parsing method such as:
json = json.replace("\"", "");
Which removes the quotes fully from both outputs, or:
json = json.replace("\\","");
Which has no effect at all. I also tried both of these as replaceAll methods, and that just messed things up even more. I'm not sure what else I can do to get rid of these '\' characters; I understand why they're there, but I don't know how to stop it. Any assistance is appreciated.
Okay so I figured it out. Turns out serverless not only includes Jackson, but actually in the layout it creates for handling responses, the "setObjectBody" section will accept any kind of object and use Jackson to parse it to JSON. This is where I messed up. I assumed it would only accept Strings, which is where the double encoding was occurring. Now, if I pass in the Person object, serverless/Jackson handles it appropriately for me and the expected output is returned. I'll include code snippets below to better demonstrate this solution. Serverless creates a 'handler' class which has a template including a method called handleRequest. Once filled in, this class now looks like this:
public class GetStatusHandler implements RequestHandler<Map<String, Object>, ApiGatewayResponse> {

    private static final Logger LOG = Logger.getLogger(GetStatusHandler.class);

    @SuppressWarnings("unchecked")
    public ApiGatewayResponse handleRequest(Map<String, Object> input, Context context) {
        BasicConfigurator.configure();
        LOG.info("received: " + input);
        try {
            Map<String, String> pathParameters = (Map<String, String>) input.get("queryStringParameters");
            if (pathParameters == null) {
                LOG.info("Getting details for all persons");
                PersonControl control = new PersonControl();
                Person[] result = control.myGetHandler(context);
                return ApiGatewayResponse.builder()
                        .setStatusCode(200)
                        .setObjectBody(result)
                        .setHeaders(Collections.singletonMap("X-Powered-By", "AWS Lambda & serverless"))
                        .build();
            } else {
                String name = pathParameters.get("name");
                LOG.info("Getting details for " + name);
                PersonControl control = new PersonControl();
                Person result = control.myGetHandler(name, context);
                return ApiGatewayResponse.builder()
                        .setStatusCode(200)
                        .setObjectBody(result)
                        .setHeaders(Collections.singletonMap("X-Powered-By", "AWS Lambda & serverless"))
                        .build();
            }
        } catch (Exception e) {
            LOG.error(e, e);
            Response responseBody = new Response("Failure getting person", null);
            return ApiGatewayResponse.builder()
                    .setStatusCode(500)
                    .setObjectBody(responseBody)
                    .setHeaders(Collections.singletonMap("X-Powered-By", "AWS Lambda & serverless"))
                    .build();
        }
    }
}
Note that when returning the ApiGatewayResponse (via its builder), an object ('result') is simply passed into the .setObjectBody method, which serverless/Jackson automatically converts to JSON for us. That's it! No parsing to JSON necessary in the code.
The response can be a user defined object as below
class Handler implements RequestHandler<SQSEvent, CustomObject> {
    public CustomObject handleRequest(SQSEvent event, Context context) {
        return new CustomObject();
    }
}
Sample code can be found here.
Just use the Google Gson Java library, which can convert Java objects into their JSON representation:
Gson gson = new Gson();
gson.toJson(person);
I was trying to filter out certain fields from serialization via SimpleBeanPropertyFilter using the following (simplified) code:
public static void main(String[] args) {
ObjectMapper mapper = new ObjectMapper();
SimpleFilterProvider filterProvider = new SimpleFilterProvider().addFilter("test",
SimpleBeanPropertyFilter.filterOutAllExcept("data1"));
try {
String json = mapper.writer(filterProvider).writeValueAsString(new Data());
System.out.println(json); // output: {"data1":"value1","data2":"value2"}
} catch (JsonProcessingException e) {
e.printStackTrace();
}
}
private static class Data {
public String data1 = "value1";
public String data2 = "value2";
}
As I use SimpleBeanPropertyFilter.filterOutAllExcept("data1"), I was expecting the serialized JSON string to contain only {"data1":"value1"}; however, I get {"data1":"value1","data2":"value2"}.
How can I create a temporary writer that respects the specified filter? (The ObjectMapper cannot be re-configured in my case.)
Note: Because of the usage scenario in my application I can only accept answers that do not use Jackson annotations.
If for some reason mix-ins do not suit you, you can try this approach:
ObjectMapper objectMapper = new ObjectMapper();
objectMapper.setAnnotationIntrospector(new JacksonAnnotationIntrospector() {
    @Override
    public boolean hasIgnoreMarker(final AnnotatedMember m) {
        List<String> exclusions = Arrays.asList("field1", "field2");
        return exclusions.contains(m.getName()) || super.hasIgnoreMarker(m);
    }
});
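With the introspector registered, serialization simply skips the listed fields; for the Data class from the question you would list "data2" instead of the placeholder names above, giving:
String json = objectMapper.writeValueAsString(new Data()); // {"data1":"value1"}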
You would normally annotate your Data class to have the filter applied:
@JsonFilter("test")
class Data {
You have specified that you can't use annotations on the class. You could use mix-ins to avoid annotating Data class.
@JsonFilter("test")
class DataMixIn {}
Mix-ins have to be registered on an ObjectMapper, and you have specified that you don't want to reconfigure yours. In such a case, you can always copy the ObjectMapper with its configuration and then modify the configuration of the copy. That will not affect the original ObjectMapper used elsewhere in your code. E.g.
ObjectMapper myMapper = mapper.copy();
myMapper.addMixIn(Data.class, DataMixIn.class);
And then write with the new ObjectMapper
String json = myMapper.writer(filterProvider).writeValueAsString(new Data());
System.out.println(json); // output: {"data1":"value1"}
An example of excluding properties by name:
public class User {
    private String name = "abc";
    private Integer age = 1;
    //getters
}

@JsonFilter("dynamicFilter")
public class DynamicMixIn {
}

User user = new User();
String[] propertiesToExclude = {"name"};
ObjectMapper mapper = new ObjectMapper()
        .addMixIn(Object.class, DynamicMixIn.class);
FilterProvider filterProvider = new SimpleFilterProvider()
        .addFilter("dynamicFilter", SimpleBeanPropertyFilter.filterOutAllExcept(propertiesToExclude));
mapper.setFilterProvider(filterProvider);
mapper.writeValueAsString(user); // {"name":"abc"}
Instead of DynamicMixIn, you can create MixInByPropName:
@JsonIgnoreProperties(value = {"age"})
public class MixInByPropName {
}

ObjectMapper mapper = new ObjectMapper()
        .addMixIn(Object.class, MixInByPropName.class);
mapper.writeValueAsString(user); // {"name":"abc"}
Note: If you want to exclude the property only for User, you can change the Object.class parameter of the addMixIn method to User.class.
To exclude properties by type, you can create MixInByType:
@JsonIgnoreType
public class MixInByType {
}

ObjectMapper mapper = new ObjectMapper()
        .addMixIn(Integer.class, MixInByType.class);
mapper.writeValueAsString(user); // {"name":"abc"}
It seems you have to add an annotation to the bean class indicating which filter to use during serialization if you want the filter to work:
@JsonFilter("test")
public class Data {
    public String data1 = "value1";
    public String data2 = "value2";
}
EDIT
The OP has just added a note that they will only accept answers that do not use bean annotations. If the fields you want to export are few, you can just retrieve that data and build a Map or List yourself; there seems to be no other way to do that.
Map<String, Object> map = new HashMap<String, Object>();
map.put("data1", obj.getData1());
...
// do the serialization on the map object just created
String json = mapper.writeValueAsString(map);
If you want to exclude specific fields and keep most of them, maybe you could do that with reflection. The following is a method I wrote to transform a bean into a map; you can change the code to meet your own needs:
protected Map<String, Object> transBean2Map(Object beanObj){
if(beanObj == null){
return null;
}
Map<String, Object> map = new HashMap<String, Object>();
try {
BeanInfo beanInfo = Introspector.getBeanInfo(beanObj.getClass());
PropertyDescriptor[] propertyDescriptors = beanInfo.getPropertyDescriptors();
for (PropertyDescriptor property : propertyDescriptors) {
String key = property.getName();
if (!key.equals("class")
&& !key.endsWith("Entity")
&& !key.endsWith("Entities")
&& !key.endsWith("LazyInitializer")
&& !key.equals("handler")) {
Method getter = property.getReadMethod();
if(key.endsWith("List")){
Annotation[] annotations = getter.getAnnotations();
for(Annotation annotation : annotations){
if(annotation instanceof javax.persistence.OneToMany){
if(((javax.persistence.OneToMany)annotation).fetch().equals(FetchType.EAGER)){
List entityList = (List) getter.invoke(beanObj);
List<Map<String, Object>> dataList = new ArrayList<>();
for(Object childEntity: entityList){
dataList.add(transBean2Map(childEntity));
}
map.put(key,dataList);
}
}
}
continue;
}
Object value = getter.invoke(beanObj);
map.put(key, value);
}
}
} catch (Exception e) {
Logger.getAnonymousLogger().log(Level.SEVERE,"transBean2Map Error " + e);
}
return map;
}
But I recommend you use Google Gson as the JSON serializer/deserializer. The main reason is that I hate dealing with the exception handling; it just messes up the coding style.
And it's pretty easy to satisfy your need by taking advantage of the versioning annotation on the bean class, like this:
@Since(GifMiaoMacro.GSON_SENSITIVE) // mark the field as sensitive data that will not be exported to JSON
private boolean firstFrameStored;   // won't export this field to JSON
You can define the macro values that decide whether to export or hide a field like this:
public static final double GSON_SENSITIVE = 2.0f;
public static final double GSON_INSENSITIVE = 1.0f;
By default, Gson will export every field that is not annotated with @Since, so you don't have to do anything for fields you don't care about; they will just be exported.
And if there are fields you do not want to export to JSON, i.e. sensitive info, just add the annotation to the field. Then generate the JSON string with this:
private static Gson gsonInsensitive = new GsonBuilder()
.registerTypeAdapter(ObjectId.class,new ObjectIdSerializer()) // you can omit this line and the following line if you are not using mongodb
.registerTypeAdapter(ObjectId.class, new ObjectIdDeserializer()) //you can omit this
.setVersion(GifMiaoMacro.GSON_INSENSITIVE)
.disableHtmlEscaping()
.create();
public static String toInsensitiveJson(Object o){
return gsonInsensitive.toJson(o);
}
Then just use this:
String jsonStr = StringUtils.toInsensitiveJson(yourObj);
Since Gson is stateless, it's fine to use a static method to do the job. I have tried a lot of JSON serialization/deserialization frameworks with Java, but found Gson to be the sharpest one in both performance and ease of use.
I am trying to de-serialize this JSON object using Jackson 2.8 as part of Retrofit response. Here is the JSON response I get from the server.
{
"id":"8938209912"
"version":"1.1"
"cars":{
"mercedes":[
{
"property":"color"
},
{
"property":"price"
},
{
"property":"location"
}
],
"tesla":[
{
"property":"environment"
}
]
}
}
Based on the query, the cars above may have one or more models returned. I cannot create a class for each model, as these get created/removed arbitrarily. For each model of the car (say tesla), there may be one or more property key-value pairs.
I am new to Jackson. I have been looking at several examples, and it looks like a custom @JsonDeserialize is the best way to go. So I created the Root class and Cars class like this:
// In file Root.java
public class Root {
    @JsonProperty("id")
    private String id = null;

    @JsonProperty("version")
    private String version = null;

    @JsonProperty("cars")
    private Cars cars = null;
}
// In file Cars.java
public class Cars {
    public Cars() {}

    @JsonDeserialize(using = CarDeserializer.class)
    private Map<String, List<Property>> properties;

    public Map<String, List<Property>> getProperties() {
        return properties;
    }

    public void setProperties(Map<String, List<Property>> properties) {
        this.properties = properties;
    }
}
// Property.java
public class Property {
    @JsonProperty("property")
    private String property;
}
My deserializer is below. However, even though the empty constructor gets called, the deserialize method itself is never called at all!
// CarDeserializer.java
public class CarDeserializer extends StdDeserializer<Map<String, List<Property>>> {

    protected CarDeserializer() {
        super(Class.class);
    }

    @Override
    public Map<String, List<Property>> deserialize(JsonParser parser, DeserializationContext ctx)
            throws IOException, JsonProcessingException
    {
        // This method never gets invoked.
    }
}
My questions:
Is this the right approach in the first place?
Why do you think the execution never gets to deserialize()? (I checked, and the cars object is present in the JSON.)
Are there better approaches to parse this JSON using Jackson?
The "properties" deserializer is never called because that does not match anything in that JSON. The field name in the JSON is "property" and it does not match Map<String, List<Property>>. It looks like it would be closer to List<Property>
Do you control the incoming JSON? It would be better for the car name/type to be in its own field rather than being the name of the object; then you could use a generic object. What you have now is going to break any time they add a new name/type that you do not have a matching class for.
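For what it's worth, a hedged sketch of that generic-object idea using the types from the question: if the model names really must stay as object keys, Jackson can bind the "cars" node into a Map<String, List<Property>> directly, without any custom deserializer, by declaring the map on Root itself:
public class Root {

    @JsonProperty("id")
    private String id;

    @JsonProperty("version")
    private String version;

    // Arbitrary model names ("mercedes", "tesla", ...) become map keys;
    // each value is that model's list of properties.
    @JsonProperty("cars")
    private Map<String, List<Property>> cars;

    public Map<String, List<Property>> getCars() { return cars; }
}

Root root = new ObjectMapper().readValue(json, Root.class);
List<Property> teslaProps = root.getCars().get("tesla");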
I use Spring MVC to drive the API of an application I am currently working with. The serialization of the API response is done via Jackson's ObjectMapper. I am faced with the following situation, we are extending a number of our objects to support UserDefinedFields (UDF) which is shown below in the abstract UserDefinedResponse. Being a SaaS solution, multiple clients have different configuration that is stored in the database for their custom fields.
The goal of this question is to be able to respond to each client with their UDF data. This would require
Dynamically rename the fields customString1, customString2, ... to their corresponding UDF labels
Remove undefined UDF fields (the example client uses only 2 out of the 4 fields).
Example of the abstract response
public abstract class UserDefinedResponse {
public String customString1;
public String customString2;
public String customString3;
public String customString4;
}
And response for a product that extends the UserDefinedResponse object
public class Product extends UserDefinedResponse {
public long id;
public String name;
public float price;
}
And finally, assuming a client sets
customString1 = "supplier"
customString2 = "warehouse"
Serializing Product for this customer should result in something similar to this:
{
"id" : 1234,
"name" : "MacBook Air",
"price" : 1299,
"supplier" : "Apple",
"warehouse" : "New York warehouse"
}
I think you could do what you need with the help of a few Jackson annotations:
public abstract class UserDefinedResponse {

    @JsonIgnore
    public String customString1;
    @JsonIgnore
    public String customString2;
    @JsonIgnore
    public String customString3;
    @JsonIgnore
    public String customString4;

    @JsonIgnore // Remove if clientId must be serialized
    public String clientId;

    private Map<String, Object> dynamicProperties = new HashMap<>();

    @JsonAnyGetter
    public Map<String, Object> getDynamicProperties() {
        Mapper.fillDynamicProperties(this, this.dynamicProperties);
        return this.dynamicProperties;
    }

    @JsonAnySetter
    public void setDynamicProperty(String name, Object value) {
        this.dynamicProperties.put(name, value);
        Mapper.setDynamicProperty(this.dynamicProperties, name, this);
    }
}
First, annotate all the properties of your base class with @JsonIgnore, as these won't be part of the response. Then, make use of the @JsonAnyGetter annotation to flatten the dynamicProperties map, which will hold the dynamic properties. Finally, the @JsonAnySetter annotation is meant to be used by Jackson on deserialization.
The missing part is the Mapper utility class:
public abstract class Mapper<T extends UserDefinedResponse> {

    // Wildcards are used here because a static field cannot reference the class type parameter T.
    private static final Map<Class<?>, Map<String, Mapper<?>>> MAPPERS = new HashMap<>();

    static {
        // Mappers for Products
        Map<String, Mapper<?>> productMappers = new HashMap<>();
        productMappers.put("CLIENT_1", new ProductMapperClient1());
        productMappers.put("CLIENT_2", new ProductMapperClient2());
        // etc. for the rest of the clients
        MAPPERS.put(Product.class, productMappers);

        // Mappers for Providers
        Map<String, Mapper<?>> providerMappers = new HashMap<>();
        providerMappers.put("CLIENT_1", new ProviderMapperClient1());
        providerMappers.put("CLIENT_2", new ProviderMapperClient2());
        // etc. for the rest of the clients
        MAPPERS.put(Provider.class, providerMappers);

        // etc. for the rest of the entities
        // (each entity needs to add specific mappers for every client)
    }

    protected Mapper() {
    }

    @SuppressWarnings("unchecked")
    public static <T extends UserDefinedResponse> void fillDynamicProperties(T response, Map<String, Object> dynamicProperties) {
        // Get the mapper for this entity and client
        Mapper<T> mapper = (Mapper<T>) MAPPERS.get(response.getClass()).get(response.clientId);
        // Perform entity -> map mapping
        mapper.mapFromEntity(response, dynamicProperties);
    }

    @SuppressWarnings("unchecked")
    public static <T extends UserDefinedResponse> void setDynamicProperty(Map<String, Object> dynamicProperties, String name, T response) {
        // Get the mapper for this entity and client
        Mapper<T> mapper = (Mapper<T>) MAPPERS.get(response.getClass()).get(response.clientId);
        // Perform map -> entity mapping
        mapper.mapToEntity(dynamicProperties, name, response);
    }

    protected abstract void mapFromEntity(T response, Map<String, Object> dynamicProperties);

    protected abstract void mapToEntity(Map<String, Object> dynamicProperties, String name, T response);
}
And for Product entity and client CLIENT_1:
public class ProductMapperClient1 extends Mapper<Product> {

    @Override
    protected void mapFromEntity(Product response, Map<String, Object> dynamicProperties) {
        // Actual mapping from Product and CLIENT_1 to map
        dynamicProperties.put("supplier", response.customString1);
        dynamicProperties.put("warehouse", response.customString2);
    }

    @Override
    protected void mapToEntity(Map<String, Object> dynamicProperties, String name, Product response) {
        // Actual mapping from map and CLIENT_1 to Product
        String property = (String) dynamicProperties.get(name);
        if ("supplier".equals(name)) {
            response.customString1 = property;
        } else if ("warehouse".equals(name)) {
            response.customString2 = property;
        }
    }
}
The idea is that there's a specific mapper for each (entity, client) pair. If you have many entities and/or clients, then you might consider filling the map of mappers dynamically, maybe reading from some config file and using reflection to read the properties of the entity.
Have you considered returning a Map<> as the response? Or as part of the response, like response.getUDF().get("customStringX")? This should save you some possible trouble in the future, e.g. 10 million concurrent users means 10 million classes in your VM.
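A minimal sketch of that suggestion, assuming the client's UDF configuration has been resolved elsewhere into a label map (udfLabels, the product variable and the Spring controller wiring are illustrative):
// udfLabels e.g. {"customString1" -> "supplier", "customString2" -> "warehouse"}
Map<String, Object> body = new LinkedHashMap<>();
body.put("id", product.id);
body.put("name", product.name);
body.put("price", product.price);
if (udfLabels.containsKey("customString1")) {
    body.put(udfLabels.get("customString1"), product.customString1);
}
if (udfLabels.containsKey("customString2")) {
    body.put(udfLabels.get("customString2"), product.customString2);
}
return ResponseEntity.ok(body); // Jackson serializes the map keys as-is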
I am trying to be able to define the following code:
public class MyObject {
private String name;
... // Other attributes
}
@Path(...)
@Stateless
public class MyRestResource {

    @POST
    @Consumes(MediaType.APPLICATION_JSON)
    public Response create(List<MyObject> myObjects) {
        // Do some stuff there
    }
}
I know that I need to use:
DeserializationConfig.Feature.ACCEPT_SINGLE_VALUE_AS_ARRAY, true
to set up my object mapper correctly so that it accepts single values as arrays on my REST resources. I succeeded in setting up that part.
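In code, that configuration is typically applied like this on a Jackson 1.x ObjectMapper (on Jackson 2.x the equivalent feature is DeserializationFeature.ACCEPT_SINGLE_VALUE_AS_ARRAY):
ObjectMapper mapper = new ObjectMapper();
mapper.configure(DeserializationConfig.Feature.ACCEPT_SINGLE_VALUE_AS_ARRAY, true);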
My problem with this approach is that the following content is not differentiable:
{
"name": "a name",
... // other attributes
}
and
[{
"name": "a name",
... // other attributes
}]
will result in a List of size one. Then, in the create(List myObjects) method, I will not be able to tell the difference between a List and a single object sent to the REST resource.
So my question is how to do something like that: the idea is to have only one @POST that accepts both arrays and single values.
Ideally, I would get rid of the ObjectMapper configuration to avoid allowing single objects at deeper levels of the JSON document. For example, I do not want to allow this:
{
...
"attributes": {
...
}
}
where normally this format should be mandatory:
{
...
"attributes": [{
...
}]
}
Based on that, I tried to put in place an object wrapper around my List to see if I could tell the difference between a list and a single object, with something like this:
public class ObjectWrapper<T> {
private List<T> list;
private T object;
public boolean isObject() {
return list == null;
}
}
with the resource that becomes:
@Path(...)
@Stateless
public class MyRestResource {

    @POST
    @Consumes(MediaType.APPLICATION_JSON)
    public Response create(ObjectWrapper myObjects) {
        // Do some stuff there
    }
}
and trying to put in place the deserialization of my content through the JAX-RS/Jersey/Jackson mechanisms. If I leave the solution as it is now, the deserialization fails because the expected JSON format is the following:
{
"list": [{
"name": "a name",
... // other attributes
}]
}
Then I tried to write a custom deserializer but I am a bit lost in this task. I have something like that:
public class ObjectWrapperDeserializer<T> extends JsonDeserializer<T> {

    @Override
    public T deserialize(JsonParser jp, DeserializationContext ctxt) throws IOException, JsonProcessingException {
        ... // What to put there to deserialize an Array or an Object?
    }
}
I just want to deserialize the root level and put the deserialized content into the object wrapper. I also want to keep the feature configured in the class annotated with @ApplicationPath where the configuration of the different @Provider classes is done.
I hope all this info gives a sufficient picture of what I want to do and what I have already tested.
I am waiting for suggestions on how to build a resource that accepts arrays or objects on the same path.
Thanks a lot in advance.
OK, I finally managed to put in place a mechanism that does exactly what I am looking for. However, I am not sure whether there are negative consequences, for example for performance.
First, I defined a class that can accept both List or Single Object:
public class RootWrapper<T> {
private List<T> list;
private T object;
}
Then I need a custom deserializer that knows which concrete type T to deserialize and can handle either a collection or a single object.
public class RootWrapperDeserializer extends JsonDeserializer<RootWrapper<?>> {

    private Class contentType;

    public RootWrapperDeserializer(Class contentType) {
        this.contentType = contentType;
    }

    @Override
    public RootWrapper deserialize(JsonParser jp, DeserializationContext ctxt)
            throws IOException, JsonProcessingException {
        // Retrieve the object mapper and read the tree.
        ObjectMapper mapper = (ObjectMapper) jp.getCodec();
        JsonNode root = mapper.readTree(jp);

        RootWrapper wrapper = new RootWrapper();

        // Check if the root received is an array.
        if (root.isArray()) {
            List list = new LinkedList();
            // Deserialize each node of the array using the type expected.
            Iterator<JsonNode> rootIterator = root.getElements();
            while (rootIterator.hasNext()) {
                list.add(mapper.readValue(rootIterator.next(), contentType));
            }
            wrapper.setList(list);
        }
        // Deserialize the single object.
        else {
            wrapper.setObject(mapper.readValue(root, contentType));
        }

        return wrapper;
    }
}
As far as I know, I only deserialize the root level manually and then let Jackson take charge of the rest. I just have to know which real type I expect to be present in the wrapper.
At this stage, I need a way to tell Jersey/Jackson which deserializer to use. One way I found is to create a sort of deserializer registry where each type is stored with the right deserializer. I extended the Deserializers.Base class for that.
public class CustomDeserializers extends Deserializers.Base {

    // Deserializer caching
    private Map<Class, RootWrapperDeserializer> deserializers = new HashMap<>();

    @Override
    public JsonDeserializer<?> findBeanDeserializer(JavaType type,
            DeserializationConfig config, DeserializerProvider provider,
            BeanDescription beanDesc, BeanProperty property) throws JsonMappingException {
        // Check if we have to provide a deserializer
        if (type.getRawClass() == RootWrapper.class) {
            // Check the deserializer cache
            if (deserializers.containsKey(type.getRawClass())) {
                return deserializers.get(type.getRawClass());
            }
            else {
                // Create the new deserializer and cache it.
                RootWrapperDeserializer deserializer =
                        new RootWrapperDeserializer(type.containedType(0).getRawClass());
                deserializers.put(type.getRawClass(), deserializer);
                return deserializer;
            }
        }
        return null;
    }
}
OK, so now I have my deserializer registry, which creates new deserializers only on demand and keeps them once created. What I am not sure about with this approach is whether there are any concurrency issues. I know that Jackson does a lot of caching and does not call findBeanDeserializer every time once it has been called a first time for a specific deserialization context.
Now that I have created the different classes, I need some plumbing to combine everything. In the provider where I create the ObjectMapper, I can register the deserializer registry on the created object mapper as below:
@Provider
@Produces(MediaType.APPLICATION_JSON)
public class JsonObjectMapper implements ContextResolver<ObjectMapper> {

    private ObjectMapper jacksonObjectMapper;

    public JsonObjectMapper() {
        jacksonObjectMapper = new ObjectMapper();
        // Do some custom configuration...

        // Configure the deserializer registry
        jacksonObjectMapper.setDeserializerProvider(
                jacksonObjectMapper.getDeserializerProvider().withAdditionalDeserializers(
                        new CustomDeserializers()
                )
        );
    }

    @Override
    public ObjectMapper getContext(Class<?> arg0) {
        return jacksonObjectMapper;
    }
}
Then I can also define my @ApplicationPath, that is, my REST application, like the following:
public abstract class AbstractRestApplication extends Application {

    private Set<Class<?>> classes = new HashSet<>();

    public AbstractRestApplication() {
        classes.add(JacksonFeature.class);
        classes.add(JsonObjectMapper.class);
        addResources(classes);
    }

    @Override
    public Set<Class<?>> getClasses() {
        return classes;
    }

    @Override
    public Set<Object> getSingletons() {
        final Set<Object> singletons = new HashSet<>(1);
        singletons.add(new JacksonJsonProvider());
        return singletons;
    }

    private void addResources(Set<Class<?>> classes) {
        classes.add(SomeRestResource.class);
        // ...
    }
}
Now, everything is in place and I can write a REST resource method like that:
@POST
@Path("somePath")
@Consumes(MediaType.APPLICATION_JSON)
@Produces(MediaType.APPLICATION_JSON)
public Response create(RootWrapper<SpecificClass> wrapper) {
    if (wrapper.isObject()) {
        // Do something for one single object
        SpecificClass sc = wrapper.getObject();
        // ...
        return Response.ok(resultSingleObject).build();
    }
    else {
        // Do something for a list of objects
        for (SpecificClass sc : wrapper.getList()) {
            // ...
        }
        return Response.ok(resultList).build();
    }
}
That's all. Do not hesitate to comment on the solution; feedback is really welcome, especially around the deserialization process, where I am really not sure that it is safe in terms of performance and concurrency.