My Java application makes use of complex object graphs that are Jackson-annotated and serialized to JSON in their entirety for client-side use. Recently I had to change one of the objects in the domain model so that instead of having two children of type X it contains a Set<X>. This changed object is referenced by several types of objects in the model.
The problem now is that I have a large quantity of test data in JSON form for running my unit tests that I need to convert to this new object model. My first thought for updating the JSON files was to use the old version of the Java object model to deserialize the JSON data, create new objects using the new version of the object model, hydrate the new objects from the old objects, and then finally serialize the new objects back to JSON. I realized, though, that programmatically creating matching object graphs and then hydrating them could be just as tedious as fixing the JSON by hand, since the object graphs are relatively deep and it's not a simple clone.
I'm wondering how I can avoid fixing these JSON files entirely by hand. I'm open to any suggestions, even non-Java JSON transformation or parsing tools.
One possibility, if the objects in question are structurally close enough, is to simply read using one data-binding configuration and write using another.
For example, with Jackson you could implement asymmetric getters and setters, so that setters exist for the individual child properties but only a single getter exists for the combined value. Something like:
```
public class POJO {
    private X a, b;

    public void setA(X value) { a = value; }
    public void setB(X value) { b = value; }

    public X[] getValues() {
        return new X[] { a, b };
    }
}
```
would, just as an example, read a structure where POJO has two Object-valued properties, "a" and "b", but write a structure that has one property, "values", containing a JSON array of two Objects.
This is just an example of the basic idea that reading in (deserialization) and writing out (serialization) need not be symmetric or identical.
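As a runnable sketch of that idea (the `Child` and `MigratingPojo` names, and the `name` field, are illustrative stand-ins for the question's X type, not its actual model):

```java
import com.fasterxml.jackson.databind.ObjectMapper;

// Hypothetical child type standing in for X from the question.
class Child {
    public String name;
}

// Reads the legacy "a" and "b" properties via setters, but exposes only a
// combined getter, so serialization writes a single "values" array.
class MigratingPojo {
    private Child a, b;
    public void setA(Child value) { a = value; }
    public void setB(Child value) { b = value; }
    public Child[] getValues() { return new Child[] { a, b }; }
}

public class Migration {
    // Deserialize old-shape JSON, re-serialize in the new shape.
    public static String migrate(String oldJson) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        MigratingPojo pojo = mapper.readValue(oldJson, MigratingPojo.class);
        return mapper.writeValueAsString(pojo);
    }

    public static void main(String[] args) throws Exception {
        String oldJson = "{\"a\":{\"name\":\"first\"},\"b\":{\"name\":\"second\"}}";
        System.out.println(migrate(oldJson));
    }
}
```

Running this over each test-data file would rewrite the old two-property shape into the new array shape without hand-editing.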
I am working on a project in Java where the user must be able to create a Kanban board (basically Trello, if you have used it before). At the current time, we have reached a point where we can create a Kanban board in the program. The next task is to be able to save the board to a file and load it back. I have heard that JSON could be a good solution for this.
I wanted to know whether anyone agrees with this or has other solutions. The Kanban board object will have an ID and an array of columns, and each column will have an array of cards. I think that saving should be fairly straightforward using JSON, but I wanted to know how you would go about loading the board back into the program. Of course, there are methods in place that add cards, columns, etc.
JSON would be a good idea, because you can store complex objects in it. You could use Google's Gson library for Java to convert Java objects to JSON for storage, and to do the opposite: read the JSON from a file and convert it back to Java objects.
Another option would be to use an ObjectOutputStream to write and read the objects to and from a file.
Still, I would prefer JSON, because I think it is easier to use and gives you a better overview: you can still read the JSON, whereas ObjectOutputStream writes a binary encoding (if I remember correctly).
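The Gson round trip described above might look like this (the `Board`/`Column`/`Card` classes are simplified stand-ins for your model, not your actual code):

```java
import com.google.gson.Gson;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

// Simplified stand-ins for the board model described in the question.
class Card {
    public String title;
    public Card() {}
    public Card(String title) { this.title = title; }
}

class Column {
    public String name;
    public List<Card> cards = new ArrayList<>();
}

class Board {
    public String id;
    public List<Column> columns = new ArrayList<>();
}

public class BoardStorage {
    private static final Gson GSON = new Gson();

    // Serialize the whole board graph (columns and cards included) to a file.
    public static void save(Board board, Path file) throws IOException {
        Files.write(file, GSON.toJson(board).getBytes(StandardCharsets.UTF_8));
    }

    // Rebuild the full object graph from the file in one call; there is no
    // need to replay the add-card/add-column methods when loading.
    public static Board load(Path file) throws IOException {
        String json = new String(Files.readAllBytes(file), StandardCharsets.UTF_8);
        return GSON.fromJson(json, Board.class);
    }
}
```

The key point for loading: Gson reconstructs the nested lists for you, so the board comes back fully populated rather than being rebuilt card by card.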
You can check here for examples:
https://www.mkyong.com/java/how-to-write-an-object-to-file-in-java/
If there are any more questions, let us know.
XML might even be better, with JAXB and some annotations on the DOM classes.
Very simple; here is a short example.
public static Definitions loadModel(Path path) throws JAXBException {
    JAXBContext context = JAXBContext.newInstance(Definitions.class);
    Unmarshaller um = context.createUnmarshaller();
    Definitions definitions = (Definitions) um.unmarshal(path.toFile());
    return definitions;
}
@XmlRootElement
public class Definitions {

    @XmlAttribute(name = "subject")
    public String subject;

    @XmlElement(name = "field")
    public List<Field> items;
}
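For completeness, the marshalling counterpart to the loader above might look like this. The `Note` class below is a self-contained stand-in for your annotated model, and note that on Java 11+ the `javax.xml.bind` API is no longer bundled with the JDK and must be added as a dependency:

```java
import javax.xml.bind.JAXBContext;
import javax.xml.bind.JAXBException;
import javax.xml.bind.Marshaller;
import javax.xml.bind.annotation.XmlRootElement;
import java.io.StringWriter;

// Minimal stand-in for an annotated model class like Definitions.
@XmlRootElement
class Note {
    public String subject;
}

public class ModelStorage {
    // Counterpart to loadModel: marshal the object graph back out as XML.
    public static String saveModel(Note note) throws JAXBException {
        JAXBContext context = JAXBContext.newInstance(Note.class);
        Marshaller m = context.createMarshaller();
        m.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, Boolean.TRUE); // pretty-print
        StringWriter out = new StringWriter();
        m.marshal(note, out);
        return out.toString();
    }
}
```

To write to a file instead of a string, pass `path.toFile()` to `m.marshal(...)`, mirroring the unmarshal call in the loader.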
I am using Flink and have a stream of JSON strings arriving in my system with dynamically changing fields and nested fields. I can't model this incoming JSON as a static POJO, so I have to rely on a Map instead.
My first transformation converts the JSON string stream into a stream of Map objects using Gson parsing, and then I wrap the map in a DTO called Data.
// inside the first map transformation
LinkedTreeMap map = gson.fromJson(input, LinkedTreeMap.class);
Data data = new Data(map); // Data has getters and setters for the map and implements Serializable
The problem arises when, right after this transformation, I attempt to feed the resulting stream into my custom Flink sink: the sink's invoke function never gets called. The sink works, however, if I switch from this Map-containing DTO to a primitive or a regular DTO with no Map.
My DTO looks like this:
public class FakeDTO {
    private String id;
    private LinkedTreeMap map; // com.google.gson.internal

    // getters and setters
    // constructors, empty and with fields
}
I have tried the following two solutions:
env.getConfig().addDefaultKryoSerializer(LinkedTreeMap.class, MapSerializer.class);
env.getConfig().disableGenericTypes();
Is there any expert advice I could use in this situation?
I was able to resolve this issue. In my Flink logs I saw that a Kryo class called ReflectionSerializerFactory was not being found. I updated the Kryo version in Maven and used a plain Map type for my map, which the Flink documentation says Flink supports.
Just make sure to specify generic types in your code and to add getters and setters for the Maps inside your POJOs.
I also use the .returns(xyz.class) type declaration to avoid the effects of type erasure.
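The `.returns(...)` hint is needed because of type erasure. Here is a stdlib-only illustration (no Flink required) of why a framework cannot recover a collection's element type from the object at runtime:

```java
import java.util.ArrayList;
import java.util.List;

public class ErasureDemo {
    public static void main(String[] args) {
        List<String> strings = new ArrayList<>();
        List<Integer> numbers = new ArrayList<>();
        // At runtime both lists have the exact same class: the type
        // parameter is erased during compilation, so a framework like
        // Flink cannot infer the element type from the object alone.
        // That is what the explicit .returns(...) hint compensates for.
        System.out.println(strings.getClass() == numbers.getClass()); // true
    }
}
```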
I am trying to send a collection of different objects to a server which accepts JSON objects, and I'm trying to figure out the optimal way to do this.
Plan A:
Serialize a collection of objects, like this:
ArrayList<Object> objects = new ArrayList<>();
objects.add(new Human("Johhny"));
objects.add(new Cannon(13));
objects.add(new Hamburger(1.3));
String json = Utils.getObjectMapper().writeValueAsString(objects);
Serialization works fine, but when I deserialize, I receive an array of Objects (LinkedHashMaps, to be precise). The problem is, I don't know which of these are Human, Cannon, or Hamburger.
Is there any way to put the object's class name into the JSON, so that when it's deserialized, the object mapper knows which class to use?
Plan B:
Parse each LinkedHashMap, manually determine its true class based on its properties, and manually deserialize it into an object.
Plan C:
Put each type of object into a different collection with a specific type.
Serialize the three collections and combine them into one string with a specific separator.
Deserialize back in the reverse order.
The solution is:
Simply add a mapper setting before writing to the string: Utils.getObjectMapper().enableDefaultTyping(ObjectMapper.DefaultTyping.JAVA_LANG_OBJECT).writeValueAsString(objects);
Thanks to @dmitry-zvorygin!
Polymorphic (de)serialization is all you need:
https://github.com/FasterXML/jackson-docs/wiki/JacksonPolymorphicDeserialization
Just make sure you have a base class for all the entities.
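A sketch of the annotation-based variant of polymorphic (de)serialization, using a shared base class as suggested above (the `Item`/`Human`/`Cannon` hierarchy and the "type" property name are illustrative):

```java
import com.fasterxml.jackson.annotation.JsonSubTypes;
import com.fasterxml.jackson.annotation.JsonTypeInfo;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.Arrays;
import java.util.List;

// Shared base type: Jackson writes a "type" id property into the JSON
// for every subclass instance, and uses it again when reading back.
@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, property = "type")
@JsonSubTypes({
        @JsonSubTypes.Type(value = Human.class, name = "human"),
        @JsonSubTypes.Type(value = Cannon.class, name = "cannon")
})
abstract class Item {}

class Human extends Item { public String name; }
class Cannon extends Item { public int caliber; }

public class PolymorphicDemo {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Serialize and deserialize; the embedded type ids let Jackson
    // restore the concrete classes instead of LinkedHashMaps.
    public static List<Item> roundTrip(List<Item> items) throws Exception {
        String json = MAPPER.writeValueAsString(items);
        return MAPPER.readValue(json, new TypeReference<List<Item>>() {});
    }

    public static void main(String[] args) throws Exception {
        Human h = new Human(); h.name = "Johnny";
        Cannon c = new Cannon(); c.caliber = 13;
        List<Item> back = roundTrip(Arrays.asList(h, c));
        System.out.println(back.get(1).getClass().getSimpleName()); // Cannon
    }
}
```

Compared with enableDefaultTyping, the `@JsonTypeInfo`/`@JsonSubTypes` route gives you control over the type-id names and avoids embedding fully qualified class names in the JSON.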
I have a field of type "text" in MySQL and I am storing JSON data in it.
E.g.: ["android_app","iphone_app","windows_app"]
I am interacting with MySQL using Hibernate, and while reading this field I deserialize it into an ArrayList in Java.
My question is: is this the best and fastest way to handle such cases, or are there better ways of doing it?
If you're able to take advantage of some JPA 2.1 features, you could use an AttributeConverter to handle this for you automatically without having to deal with it in your business code.
public class YourEntity {
    // other stuff

    @Convert(converter = StringArrayToJsonConverter.class)
    List<String> textValues;
}
Then you just define the converter as follows:
public class StringArrayToJsonConverter
        implements AttributeConverter<List<String>, String> {

    // using Jackson's ObjectMapper here as one option for the JSON conversion
    private static final ObjectMapper MAPPER = new ObjectMapper();

    @Override
    public String convertToDatabaseColumn(List<String> list) {
        try {
            return MAPPER.writeValueAsString(list); // List -> JSON string
        } catch (JsonProcessingException e) {
            throw new IllegalStateException(e);
        }
    }

    @Override
    public List<String> convertToEntityAttribute(String dbValue) {
        try {
            return MAPPER.readValue(dbValue, new TypeReference<List<String>>() {}); // JSON -> List
        } catch (IOException e) {
            throw new IllegalStateException(e);
        }
    }
}
The best part is that this becomes a reusable component: you can place it anywhere you need to represent a JSON array as a List<> in your Java classes while storing it as JSON in a single text field in the database.
Another alternative would be to avoid storing the data as JSON and instead use a real table, which would allow you to actually query the values. To do this, you'd rewrite that mapping using JPA's @ElementCollection:
@ElementCollection
private List<String> textValues;
Internally, Hibernate creates a secondary table where it stores the string values for the collection, with a reference to the owning entity's primary key; the entity primary key and the string value together form the PK of this secondary table.
You would then handle serializing the List<> as a JSON array in your controller/business code, to avoid mixing persistence with that concern, particularly given that most databases have not introduced a real JSON data type yet :).
class Result {
    private List orders; // holds objects of class Object only
    private Number a;
    private Number b;
}

class Order {
    private Number oid;
    private Number status;
}

public class TestClass {
    public static void main(String... args) {
        String testString = "{\"orders\":[{\"oid\":\"347\",\"status\":\"1\"},{\"oid\":\"348\",\"status\":\"1\"}],\"a\":14.15,\"b\":0}";
        Gson gson = new GsonBuilder().create();
        Result res = gson.fromJson(testString, Result.class);
    }
}
Class Result cannot be modified. The JSON does not carry type information for the list, so Gson cannot parse the list elements into Orders.
Assuming we cannot change these classes, how can we solve this problem? Consider the case where there can be multiple such lists of different types.
What if the list contains objects of multiple types?
When parsing JSON you expect it to have a certain structure. For example, you expect that Result objects contain Orders and not strings or whatever. So the parser must be intelligent enough to figure out what kind of objects the list can contain. There is no way to know this unless you parametrize your collections. It is commonly believed that all Java generics information is erased at compile time, but this is not entirely true: Gson, for example, uses the generic type information to figure out how to parse a specific list value. So if you are using the Gson parser with List<Order>, there is no problem with different list types.
Dealing with different list item classes is another matter. If you can change the JSON data, you could encode the class name along with the fields of each list item. If you absolutely can't change your data classes, you would have to fall back to interpreting the JSON as a collection of hierarchical key-value pairs and do the deserialization yourself.
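The parameterized-list case mentioned above can be sketched with Gson's TypeToken mechanism (the `Order` class mirrors the question's shape but with public fields, and the sample JSON is illustrative):

```java
import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;
import java.lang.reflect.Type;
import java.util.List;

// Same shape as the question's Order, but paired with a parameterized list.
class Order {
    public Number oid;
    public Number status;
}

public class TypeTokenDemo {
    // The anonymous TypeToken subclass captures List<Order> at compile time,
    // so Gson can recover the element type despite erasure.
    public static List<Order> parse(String json) {
        Type listType = new TypeToken<List<Order>>() {}.getType();
        return new Gson().fromJson(json, listType);
    }

    public static void main(String[] args) {
        List<Order> orders = parse("[{\"oid\":347,\"status\":1},{\"oid\":348,\"status\":1}]");
        System.out.println(orders.size()); // 2
    }
}
```

When the containing class can't be changed (as with the raw `List orders` in Result), this only helps if you can parse the list out separately; otherwise you are back to the manual key-value approach described above.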
Probably this thread will be of help.