I am currently working on a small android app that needs to be able to save/restore a range of settings via serialization/deserialization using the Jackson lib.
For most parts it works really well and I can serialize my objects and restore them again.
Now I need to serialize/deserialize a private member of the type:
HashMap<Model, List<Integer>>
Model being one of my own objects containing a bunch of simple values + getter/setter methods.
And this is where I run into problems. It starts throwing the following error at me:
DataAccess: Cannot find a (Map) Key deserializer for type [simple type, class com.schulz.toolie.Models.Model] at [Source: (String)"{"; line: 1, column: 1]
I have tried setting annotations like @JsonAnyGetter and @JsonAnySetter on the getter/setter methods, along with @JsonProperty("subscribe") on the getter, the setter and the variable.
Is there any way to get around this? Preferably without writing custom serialization/deserialization methods, as I will end up with quite a few of these.
Your problem is that Jackson has a standard way of converting a Map to JSON: the keys of the map are used as the property names in the resulting JSON.
Map<String, Double> groceryPrices = new HashMap<>();
groceryPrices.put("apple", 0.25);
groceryPrices.put("orange", 0.30);
This naturally translates to a JSON object:
{
"apple": 0.25,
"orange": 0.30
}
The problem is you are using a complex object to represent a key. There is no simple method for serializing and deserializing your complex object to/from a String.
If you don't want to write custom serialization, I suggest you change your data structure. Your current structure ties together a model with its Integers. You could fold the list of Integers into the Model object itself:
Map<String, Model> models; // This could map modelId -> Model which now contains the integers
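A minimal sketch of that restructuring (the field names below, such as subscriptions, are made up for illustration): the List<Integer> lives inside Model, and the outer map is keyed by a plain String id that Jackson can handle out of the box.
```
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class Model {
    private String id;                                        // simple key used in the outer map
    private List<Integer> subscriptions = new ArrayList<>();  // the integers folded into the model

    public String getId() { return id; }
    public void setId(String id) { this.id = id; }

    public List<Integer> getSubscriptions() { return subscriptions; }
    public void setSubscriptions(List<Integer> subscriptions) { this.subscriptions = subscriptions; }
}

// The settings object then holds:
// Map<String, Model> models = new HashMap<>();  // modelId -> Model, no key deserializer needed
```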
Related
I have a Java map Map<String, String> map = Collections.singletonMap("dummyKey", "dummyValue").
When I use ObjectMapper to serialize it, I get {"dummyKey":"dummyValue"}.
Is there any way that I can insert JSON keys in the process of serialization so my JSON string looks like {"key":"dummyKey", "value":"dummyValue"}?
I'd like to be able to do this without having to create a separate class, if possible. I'm looking for something like the allow explicit property renaming feature.
I looked at the other mapper features provided, but can't find anything useful. I don't want to create a different class because I want to retain the ability to be able to use it as a map (look up by key) when I deserialize it later. I'm looking for something that'll only change the way serialization and deserialization happen.
I am using Flink and have a stream of JSON strings arriving in my system with dynamically changing fields and nested fields, so I can't model this incoming JSON as a static POJO and have to rely on a Map instead.
My first transformation converts the JSON string stream into a Map object stream using Gson parsing, and then I wrap the map in a DTO called Data.
(inside the first map transformation)
LinkedTreeMap map = gson.fromJson(input, LinkedTreeMap.class);
Data data = new Data(map); // Data has getters, setters for the map and implements Serializable
The problem arises right after this transformation, when I attempt to feed the resultant stream into my custom Flink sink: the invoke function never gets called in the sink. The sink works, however, if I change from this Map-containing DTO to a primitive or a regular DTO with no Map.
My DTO looks like this:
public class FakeDTO {
    private String id;
    private LinkedTreeMap map; // com.google.gson.internal
    // getters and setters
    // constructors, empty and with fields
}
I have tried the two following solutions:
env.getConfig().addDefaultKryoSerializer(LinkedTreeMap.class, MapSerializer.class);
env.getConfig().disableGenericTypes();
Is there any expert advice I could use in this situation?
I was able to resolve this issue. In my Flink logs I saw that a Kryo class called ReflectionSerializerFactory could not be found. I updated the Kryo version in Maven and used a plain Map type for my map, which the Flink documentation says Flink supports.
Just make sure to have generic types specified in your code and add getters and setters inside your POJOs for Maps.
I also use the .returns(xyz.class) type declaration to avoid the effects of type erasure.
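A rough sketch of how those pieces fit together (the Data class shape and the stream wiring below are assumptions based on the description, not the actual code): the DTO exposes a plain java.util.Map through getters and setters so Flink can treat it as a POJO, and the transformation declares its return type explicitly.
```
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;

// DTO built on java.util.Map (instead of Gson's LinkedTreeMap) with getters/setters,
// so Flink does not have to fall back to generic Kryo handling.
public class Data implements Serializable {
    private String id;
    private Map<String, Object> map = new HashMap<>();

    public Data() { }                                    // no-arg constructor for POJO serialization
    public Data(Map<String, Object> map) { this.map = map; }

    public String getId() { return id; }
    public void setId(String id) { this.id = id; }

    public Map<String, Object> getMap() { return map; }
    public void setMap(Map<String, Object> map) { this.map = map; }
}

// Inside the Flink job, the transformation then states its return type to counter type erasure:
// DataStream<Data> parsed = jsonStream
//         .map(json -> new Data(gson.fromJson(json, new TypeToken<Map<String, Object>>() {}.getType())))
//         .returns(Data.class);
```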
I am trying to send a collection of different objects to a server which accepts JSON objects, and I'm trying to figure out the optimal way to do this.
Plan A:
Serialize a collection of objects, like this:
ArrayList<Object> objects = new ArrayList<>();
objects.add(new Human("Johhny"));
objects.add(new Cannon(13));
objects.add(new Hamburger(1.3));
String json = Utils.getObjectMapper().writeValueAsString(objects);
Serialization works fine, and when I deserialize, I receive an array of Objects too (LinkedHashMaps, to be precise). The problem is, I don't know which of these are Human, Cannon or Hamburger.
Is there any way to put the object's class name into the JSON, so that when it's deserialized, the object mapper knows which class to use?
Plan B:
Parse each LinkedHashMap, manually determine its true class based on its properties, and manually deserialize it into an object.
Plan C:
Put each type of object into a different collection with a specific type.
Serialize the three collections and combine them into one string with a specific splitter.
Deserialize back in reverse order.
The solution is:
Simply add the mapper setting before writing to string: Utils.getObjectMapper().enableDefaultTyping(ObjectMapper.DefaultTyping.JAVA_LANG_OBJECT).writeValueAsString(objects);
Thanks to @dmitry-zvorygin!
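For completeness, a minimal sketch of the round trip with that default-typing setting (assuming Human, Cannon and Hamburger are themselves Jackson-deserializable, e.g. via no-arg constructors or creators): the mapper embeds each element's class name, so reading back into a List<Object> restores the concrete types.
```
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.List;

ObjectMapper mapper = new ObjectMapper()
        .enableDefaultTyping(ObjectMapper.DefaultTyping.JAVA_LANG_OBJECT);

// write: each Object-typed element is wrapped with its class name
String json = mapper.writeValueAsString(objects);

// read: the embedded class names tell Jackson which concrete classes to instantiate
List<Object> restored = mapper.readValue(json, new TypeReference<List<Object>>() { });
// restored now contains Human, Cannon and Hamburger instances again
```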
Polymorphic (de)serialization is all you need -
https://github.com/FasterXML/jackson-docs/wiki/JacksonPolymorphicDeserialization
Just make sure you have a base class for all the entities.
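A hedged sketch of that annotation-based approach (the Item base class and the type names below are placeholders, not part of the original question): a common base class carries the type information, and the collection is declared against it.
```
import com.fasterxml.jackson.annotation.JsonSubTypes;
import com.fasterxml.jackson.annotation.JsonTypeInfo;

// Base class for everything that goes into the collection; Jackson writes a "type"
// property for each element and uses it again on deserialization.
@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, include = JsonTypeInfo.As.PROPERTY, property = "type")
@JsonSubTypes({
        @JsonSubTypes.Type(value = Human.class, name = "human"),
        @JsonSubTypes.Type(value = Cannon.class, name = "cannon"),
        @JsonSubTypes.Type(value = Hamburger.class, name = "hamburger")
})
public abstract class Item {
    // common fields, if any, go here
}

// Human, Cannon and Hamburger extend Item, and the collection is declared as List<Item>,
// so no splitter or manual inspection of LinkedHashMaps is needed.
```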
My Java application makes use of complex object graphs that are Jackson-annotated and serialized to JSON in their entirety for client-side use. Recently I had to change one of the objects in the domain model so that instead of having two children of type X it now contains a Set<X>. This changed object is referenced by several types of objects in the model.
The problem now is that I have a large quantity of test data in JSON form for running my unit tests, and I need to convert it to this new object model. My first thought for updating the JSON files was to deserialize the JSON data with the old-version object model, create new objects using the new-version object model, hydrate the new objects from the old objects, and finally serialize the new objects back to JSON. I realized, though, that programmatically creating matching object graphs and hydrating them could be just as tedious as fixing the JSON by hand, since the object graphs are relatively deep and it's not a simple clone.
I'm wondering how I can get around fixing these json files entirely by hand? I'm open to any suggestions even non-java based json transformation or parsing tools.
One possibility, if the objects in question are structurally close enough, is to just read using one data-binding configuration and write using another.
For example, with Jackson you could implement asymmetric set and get methods, so that setters exist for the individual child properties but only a single getter exists for the combined value. Something like:
```
public class POJO {
    private X a, b;

    public void setA(X value) { a = value; }
    public void setB(X value) { b = value; }

    public X[] getValues() {
        return new X[] { a, b };
    }
}
```
would, just as an example, read a structure where POJO has two Object-valued properties, "a" and "b", but write a structure that has a single property, "values", holding a JSON array of two Objects.
This is just an example of the basic idea that reading in (deserialization) and writing out (serialization) need not be symmetric or identical.
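Applied to the migration question, the conversion pass would then be a small one-off program, roughly like the sketch below (file names are illustrative): read each old-format file through the setters, then write it back out through the new-style getter.
```
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.File;

ObjectMapper mapper = new ObjectMapper();

// read the old-format test data, which still uses the "a"/"b" properties
POJO pojo = mapper.readValue(new File("testdata/old-format.json"), POJO.class);

// write it back out; only getValues() is exposed, so the output uses the "values" array
mapper.writeValue(new File("testdata/new-format.json"), pojo);
```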
I have a Jackson bean which uses an @JsonAnySetter method to store all unknown parameters in a map.
@JsonAnySetter
public void handleUnknowns(String key, Object value)
{
    myMap.put(key, value);
}
I use this as the "base bean" for all my data types, so that if unknown parameters show up, they are captured and the data is not lost, rather than Jackson crashing.
However, I want the serialized form of these unknowns to NOT be nested; that is, I want the serialized parameters to be at the top level of the object when it is serialized. Additionally, I want the custom fields to also be serialized:
// I want this to be serialized/deserialized as: {"collarWidth":10, "name":"fido"}
class Dog extends JSonBean
{
    int collarWidth = 0;
    int getCollarWidth() { return collarWidth; }
    void setCollarWidth(int x) { collarWidth = x; }
}
Note that in the above case, since the base bean stores the unknowns in a Map, Jackson's default Map serialization will take place and unknownParameters will show up as a "field" in my JSON.
Thus the expected JSON serialization would be
{"collarWidth":10 "unknownParameters":{"name":"fido"}}
rather than
{"collarWidth":10 "name":"fido"}
So, what is the simplest way to "merge" the unknown parameters with the known ones, so that the Java bean serializer retains the same nesting as the input string?
The obvious solution is to merge the parameters from the "myMap" object into the serialized map, but that seems like overkill, and I assume this problem has a more elegant solution.
Have you checked out the @JsonAnyGetter annotation? The Map that method returns will be unwrapped, which makes it work together with @JsonAnySetter. This blog entry explains its usage.
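A minimal sketch of how the two annotations pair up (the JSonBean field and method names are guesses at the poster's base class, not confirmed code): the same map that collects unknowns on read is written back as top-level properties on write.
```
import com.fasterxml.jackson.annotation.JsonAnyGetter;
import com.fasterxml.jackson.annotation.JsonAnySetter;
import java.util.HashMap;
import java.util.Map;

public class JSonBean {
    private final Map<String, Object> myMap = new HashMap<>();

    @JsonAnySetter                 // catches unknown properties during deserialization
    public void handleUnknowns(String key, Object value) {
        myMap.put(key, value);
    }

    @JsonAnyGetter                 // writes the map entries back as top-level properties
    public Map<String, Object> getUnknowns() {
        return myMap;
    }
}
```
With Dog extends JSonBean, the input {"collarWidth":10, "name":"fido"} then round-trips to the same flat shape instead of nesting an "unknownParameters" object.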