We have a Jackson ObjectReader that overwrites existing data whenever a field is missing from the JSON update request.
This is what the data structure looks like:
data class Model(
val fieldTypeA: FieldTypeA? = null,
)
data class FieldTypeA(
val valueA: String? = null,
val valueB: String? = null,
)
We then read an existing value from a database so that fieldTypeA.valueA = "Test"
val existingModel = repository.findById(id).getOrNull()
Then this line reads the existing data into an ObjectReader:
val readerForUpdating: ObjectReader = CustomMapper.readerForUpdating(existingModel)
This is where the problem occurs: readValue overwrites fieldTypeA.valueA once the following line is executed with the jsonRequest shown below:
val updatedRequest: Model = readerForUpdating.readValue(jsonRequest)
jsonRequest:
{"fieldTypeA":{"valueB":"I am value B"}}
The existingModel object now only contains fieldTypeA.valueB; fieldTypeA.valueA has been overwritten with null.
Is there a way to tell Jackson not to overwrite when a value is missing from the JSON?
I've fixed my own problem in the following way, in case it helps someone in the future.
Jackson 2.9 introduced a feature that allows deep merging. To use it, the relevant property needs to be annotated with @JsonMerge.
So in my example question above, the Model object would be changed like this:
import com.fasterxml.jackson.annotation.JsonMerge
data class Model(
@JsonMerge
val fieldTypeA: FieldTypeA? = null,
)
This means that if the original object contains a field with a value, it does not get overwritten to null when the new JSON comes in.
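A minimal sketch of the merge in action, reusing the Model and FieldTypeA classes above (this assumes jackson-module-kotlin is registered, e.g. via jacksonObjectMapper(), so the data classes bind through their constructors; the expected output mirrors the behaviour described above):
import com.fasterxml.jackson.databind.ObjectReader
import com.fasterxml.jackson.module.kotlin.jacksonObjectMapper
fun main() {
    val mapper = jacksonObjectMapper()
    val existing = Model(fieldTypeA = FieldTypeA(valueA = "Test"))
    // read the partial update on top of the existing object
    val reader: ObjectReader = mapper.readerForUpdating(existing)
    val updated: Model = reader.readValue("""{"fieldTypeA":{"valueB":"I am value B"}}""")
    // with @JsonMerge on fieldTypeA, valueA should survive the partial update:
    // Model(fieldTypeA=FieldTypeA(valueA=Test, valueB=I am value B))
    println(updated)
}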
readerForUpdating(obj) only promises to use the root object you pass in. The Javadoc says: "Deserialization occurs normally except that the root-level value in the JSON is not used for instantiating a new object; instead the given updateable object is used as root." fieldTypeA.valueA, however, lives in a nested object.
It works on a shallow object:
val existingModel = FieldTypeA(valueA = "A")
println(existingModel)
val readerForUpdating: ObjectReader = MAPPER.readerForUpdating(existingModel)
val jsonRequest = "{\"valueB\":\"I am value B\"}"
val updatedRequest: FieldTypeA = readerForUpdating.readValue(jsonRequest)
println(updatedRequest)
produces
FieldTypeA(valueA=A, valueB=null)
FieldTypeA(valueA=A, valueB=I am value B)
It's hard to see how a feature like readerForUpdating() could know which fields to adjust in a deep object graph. If a field of the incoming nested object has a value, should that indicate an overwrite, but not if it arrives as null?
If I had to do this myself, I might approach it this way:
read the incoming JSON into a traditional JSONObject, then walk the keys and set the properties on the target object (sketched below).
For setting the properties you could use the old-school Apache Commons BeanUtils library, but I am not sure it is still maintained and it has had some vulnerability issues, so you could also look at Spring's BeanWrapperImpl.
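A rough sketch of that idea, using Jackson to read the incoming JSON into a tree and Spring's BeanWrapperImpl to set the properties. This is hypothetical code, not a drop-in solution: applyJsonPatch is a made-up helper, it assumes the target exposes mutable (var) properties so setters exist, and it ignores arrays for brevity.
import com.fasterxml.jackson.databind.JsonNode
import com.fasterxml.jackson.databind.ObjectMapper
import org.springframework.beans.BeanWrapperImpl
// Walk the incoming JSON and copy only the keys that are actually present
// onto the target object, leaving every other property untouched.
fun applyJsonPatch(target: Any, json: String, mapper: ObjectMapper = ObjectMapper()) {
    val wrapper = BeanWrapperImpl(target)
    wrapper.setAutoGrowNestedPaths(true) // create intermediate objects such as fieldTypeA when they are null
    fun walk(node: JsonNode, path: String) {
        if (node.isObject) {
            node.fields().forEach { (name, child) ->
                walk(child, if (path.isEmpty()) name else "$path.$name")
            }
        } else {
            wrapper.setPropertyValue(path, mapper.treeToValue(node, Any::class.java))
        }
    }
    walk(mapper.readTree(json), "")
}
With a mutable variant of the Model from the question, applyJsonPatch(existingModel, jsonRequest) would set only fieldTypeA.valueB and leave fieldTypeA.valueA alone.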
I am taking a JSON file as input for a class and parsing the values using Gson through the respective data classes.
I want to call a function that takes a String value as an argument.
The allowed string values are determined by the values parsed from the JSON file. Can I somehow check the string value passed to the function at compile time and raise an error at compile time?
Or can I allow only certain values as the function's argument, based on the values from the JSON?
Detailed explanation of the use case:
I am building an SDK in which the person using the SDK inputs a JSON string. The JSON is standardised and is parsed in my code.
{
"name": "Test",
"objects": [
{
"name": "object1",
"type": "object1"
}
]
}
Here the name values and other values may vary based on the input from the developer using it, but the keys remain the same. We need to call a function using the value of the name parameter inside objects.
fun testMethod(objectName:String)
So the developer calls the method as testMethod("object1").
I need to validate the object1 parameter based on the JSON, but is there any way to restrict the testMethod parameter to object1 only and give a compile-time error if the developer calls testMethod("obj1")?
Right now I parse the JSON and have checks inside testMethod().
Sure, it's possible, but in a somewhat different way than you described. First of all, as you already mentioned, the runtime behaviour is easy to achieve: for that purpose there are Objects.requireNonNull() and Guava's Preconditions. In the same way you can define your own checks, but this only works at runtime.
To do it at compile time, you need to create an annotation processor, the same way various libraries do. One of them is Lombok, with its NonNull and Nullable annotations. Android annotations only provide a marker tied to an IDE warning, but in Lombok's case a NonNull check that throws an exception is generated for every annotated usage at compile time.
It's not an easy way, but it's what you are looking for.
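For the runtime variant mentioned above, a minimal sketch; allowedNames is a hypothetical set that would be built while parsing the "objects" array of the input JSON:
// Built from the parsed JSON in real code; hardcoded here only for the sketch.
val allowedNames: Set<String> = setOf("object1")
fun testMethod(objectName: String) {
    // runtime check only: this fails when the method is called, not at compile time
    require(objectName in allowedNames) {
        "Unknown object name '$objectName'; allowed: $allowedNames"
    }
    // ... actual work ...
}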
No, it's impossible to check this at compile time. It's string handling, just like a numeric calculation: the values only exist at runtime.
In my app, I convert a string to JSON and JSON back to a string, passing the class descriptor. My aim is to record the JSON string in a text file to load into an SQLite database. I've run this code on my desktop computer, not on Android.
data class CalcDescr (
...
)
val calc = CalcDescr(...)
// toJson: internal Kotlin data to JSON
val content = Gson().toJson(calc)
//==================
// Testing validity
// ================
// fromJson: JSON to internal Kotlin data.
// It needs to be passed the class descriptor. This uses a *Java* token, but it's *Kotlin*
var testModel = Gson().fromJson(content, CalcDescr::class.java)
// toJson: internal Kotlin data to JSON again
var contentAgain = Gson().toJson(testModel)
// should be equal!
if (content == contentAgain) println("***ok***")
Then I write the variable content to a file.
I have JavaScript that parses a JSON object (the object contains an array) and returns the value from the ZONE field.
var obj = JSON.parse(json_text);
parsed_val = obj.features[0].attributes.ZONE
I would like to convert the JavaScript code to Jython.
This is what I've tried:
from com.ibm.json.java import JSONObject
obj = JSONObject.parse(json_text)
parsed_val = obj.get('features.attributes.ZONE');
The Jython code compiles, but it doesn't return a valid value (it returns None). I think this is because I haven't referenced the array properly.
How can I parse the JSON object/array using Jython to get the ZONE value?
(The Jython version is 2.7.0. However, I can't seem to use Python's json library, which is normally included with Jython.)
I needed to use get() at each level of the object, as well as specify the array's index position after the first level: [0].
from com.ibm.json.java import JSONObject
obj = JSONObject.parse(json_text)
parsed_val = obj.get("features")[0].get("attributes").get("ZONE")
Credit goes to @vikarjramun for pointing me in the right direction. Thanks.
I'm trying to update data in Elasticsearch from my Java program with the TransportClient class. I know I can do an UPDATE this way:
XContentBuilder builder = XContentFactory.jsonBuilder().startObject()
.field("source.ivarId", source.ivarId)
.field("source.channel", source.channel)
.field("source.bacId", source.bacId).endObject();
UpdateResponse response = client.prepareUpdate(_index, _type, _id).setDoc(builder.string()).get();
where source is my user-defined class containing 3 fields: ivarId, channel and bacId.
But I want to know: is there any method that does the same thing in a more efficient and easier way, so that I don't need to assign each field of the class individually? For example, can I do it like this?
XContentBuilder builder = XContentFactory.jsonBuilder().startObject()
.field("source", source).endObject();
UpdateResponse response = client.prepareUpdate(_index, _type, _id).setDoc(builder.string()).get();
I tried the latter method, and I got this exception:
MapperParsingException[object mapping for [source] tried to parse field [source] as object, but found a concrete value]
I'm using Java 1.8.0_121, and both Elasticsearch and the TransportClient are version 5.1. Thanks!
The answer is much easier than I thought.
Gson gson = new Gson();
String sourceJsonString = gson.toJson(updateContent);
UpdateResponse response = client
.prepareUpdate(_index, "logs", id).setDoc(sourceJsonString).get();
updateContent is the object that contains the new data; just transform it to a JSON string, then use that string for the update. Done.
Since my very first days of Java + JSON, I have tried to extract just certain parts of a JSON document.
But no matter which of the libraries I used:
Gson
json-simple
javax.json
it was never quick and comfortable, even for easy tasks or prototyping. It has already cost me many hours across different approaches.
Going through the hierarchy of a JSON document
Object jsonObject = gson.fromJson(output, Object.class);
JsonElement jsonTree = gson.toJsonTree(jsonObject);
JsonArray commitList = jsonTree.getAsJsonArray();
JsonElement firstElement = commitList.get(0);
JsonObject firstElementObj = firstElement.getAsJsonObject();
System.out.println(firstElementObj.get("sha"));
JsonElement fileList = firstElementObj.get("files");
This is dirty code for a reason: it shows what many early approaches look like, and how many people cannot manage to do better early on.
Deserializing JSON to a Java Object
You have to analyse the complete JSON to create a complete Java object representation just to get access to a few single members of it. This is not something I ever wanted to do for prototyping.
JSON is an easy format, but using libraries like these is quite difficult and often a problem for beginners. I've found several different answers via Google and even Stack Overflow, but most were quite heavyweight and required creating an own specific class for the whole JSON object.
What is the best approach to make it more beginner-friendly?
or
What is the best beginner-friendly approach?
Using Jackson (which you tagged), you can use JsonPointer expressions to navigate through a tree object:
ObjectMapper mapper = new ObjectMapper();
JsonNode tree = mapper
.readTree("[ { \"sha\": \"foo\", \"files\": [ { \"sha\": \"bar\" }, { \"sha\": \"quux\" } ] } ]");
System.out.println(tree.at("/0/sha").asText());
for (JsonNode file : tree.at("/0/files")) {
System.out.println(file.get("sha").asText());
}
You could also use the ObjectMapper to convert just parts of a tree to your model objects, if you want to start using that:
for (JsonNode fileNode : tree.at("/0/files")) {
FileInfo fileInfo = mapper.convertValue(fileNode, FileInfo.class);
System.out.println(fileInfo.sha);
}
If your target class (FileInfo) is set to ignore unknown properties (annotate the target class with @JsonIgnoreProperties(ignoreUnknown = true) or disable DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES on the mapper), then you can simply declare only the properties you are interested in.
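For reference, FileInfo can be as small as this (a hypothetical sketch, shown as a small Kotlin class; a plain Java POJO with the same annotation behaves the same way):
import com.fasterxml.jackson.annotation.JsonIgnoreProperties
// Unknown JSON properties are skipped instead of causing an error,
// so only the fields you declare need to exist.
@JsonIgnoreProperties(ignoreUnknown = true)
class FileInfo {
    var sha: String? = null
}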
"Best" is whatever works to get you going.
Generate Plain Old Java Objects from JSON or JSON-Schema
One little helper I found during my research was an online tool:
http://www.jsonschema2pojo.org/
This helps a little once you know about it, but the downside I mentioned in point 2 is still there.
You can use JsonSurfer to selectively extract values or objects from big JSON documents with a streaming JsonPath processor.
JsonSurfer jsonSurfer = JsonSurfer.gson();
System.out.println(jsonSurfer.collectOne(json, "$[0].sha"));
System.out.println(jsonSurfer.collectOne(json, "$[0].files"));
I'm developing a Java application using MongoDB and the Java driver.
I need to convert a BasicDBObject into one of my own objects, and I don't know if there is a way to do this automatically.
Is it possible to convert a BasicDBObject to a JSON string? Then I could parse the JSON string into my own object, for example with the Gson library. Something like:
BasicDBObject object;
String myJSONString = object.toString();
Gson gson = new Gson();
MyOwnObject myObject = gson.fromJson(myJSONString, MyOwnObject.class);
However, I don't want to add complexity to my code, and I also don't want to add more external libraries. I don't want to add the Gson library or any other.
Any ideas? Is it possible to do this without external libraries?
Thanks!
You could have taken a look at the API: just call object.toString() (http://api.mongodb.org/java/2.0/com/mongodb/BasicDBObject.html#toString()).
You could use Groovy with the gmongo library for that; it has lots of handy tools for this kind of conversion.
If a language change is not an option for you, write your own reflection-based mapper. If your POJO is simple enough, the mapper will be pretty simple (see the sketch below).
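A rough idea of what such a reflection-based mapper could look like, with no libraries beyond the MongoDB driver you already have. fromDbObject is a hypothetical helper (sketched in Kotlin; the Java equivalent is analogous); it assumes a no-arg constructor, field names that match the document keys, and directly assignable types:
import com.mongodb.BasicDBObject
// Copy every matching top-level key of the document into the target's fields.
fun <T : Any> fromDbObject(doc: BasicDBObject, clazz: Class<T>): T {
    val instance = clazz.getDeclaredConstructor().newInstance()
    for (field in clazz.declaredFields) {
        if (doc.containsField(field.name)) {
            field.isAccessible = true
            field.set(instance, doc.get(field.name))
        }
    }
    return instance
}
You would call it as fromDbObject(basicDbObject, MyOwnObject::class.java); nested documents and type conversions would need extra handling.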
This is the correct answer to my question
From http://docs.mongodb.org/ecosystem/tutorial/use-java-dbobject-to-perform-saves/
For example, suppose one had a class called Tweet that they wanted to save:
public class Tweet implements DBObject {
/* ... */
}
Then you can say:
Tweet myTweet = new Tweet();
myTweet.put("user", userId);
myTweet.put("message", msg);
myTweet.put("date", new Date());
collection.insert(myTweet);
When a document is retrieved from the database, it is automatically converted to a DBObject. To convert it to an instance of your class, use DBCollection.setObjectClass():
collection.setObjectClass(Tweet.class);
Tweet myTweet = (Tweet)collection.findOne();
If for some reason you wanted to change the message, you can simply take that tweet and save it back after updating the field.
Tweet myTweet = (Tweet)collection.findOne();
myTweet.put("message", newMsg);
collection.save(myTweet);