Double-formatting in com.fasterxml.jackson.databind.ObjectMapper - java

I have an application which loads a data row from a SQL Server database and transmits that content, JSON-encoded, to a web service. I use ObjectMapper for the JSON conversion. There is some infrastructure behind this project: I get the data from the database as a HashMap<String,Object> whose keys are the column names of the table.
Is there a way to tell com.fasterxml.jackson.databind.ObjectMapper how to format double values upon serialization when you cannot use annotations?
I did a simple
ObjectMapper o = new ObjectMapper();
String jsonStr = o.writeValueAsString(m); // m is my hashmap-instance
While in theory this works quite nicely, I have a problem with the serialization of doubles. The database has a column of type float (which corresponds to double in Java; a Java float would be real in SQL Server) with the value 402.4818. When ObjectMapper serializes it, I get 402.48179999999996 in the JSON.
How can I customize the double-formatting / precision in ObjectMapper? I have already used setDateFormat(new SimpleDateFormat("yyyy-MM-dd")) to get my dates right, is there any way to do something similar for double?
Most examples on the net serialize POJOs, where you can use annotations to specify how specific properties should be serialized, but I have a HashMap here.
Please don't tell me that I have to go through the hassle of creating a POJO, filling it with the DB data, and serializing it for every row I want to send via this API. (This is a DB row with more than 30 columns, and the project is nearly done apart from this little double-formatting problem.)
So what do I need to do to tell ObjectMapper to round to the fourth decimal place before serializing, or to use a custom DecimalFormat? Documentation on this is quite sparse.
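One way to do this without annotations, sketched below, is to register a SimpleModule carrying a custom JsonSerializer<Double>; the mapper then applies it to every Double it writes, including values inside a HashMap. The class and method names here (DoubleFormatExample, roundingMapper) are my own, and rounding to four decimal places via BigDecimal is one choice among several (a DecimalFormat would work similarly):

```java
import java.math.BigDecimal;
import java.math.RoundingMode;
import java.util.HashMap;
import java.util.Map;

import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializerProvider;
import com.fasterxml.jackson.databind.module.SimpleModule;

public class DoubleFormatExample {

    // Builds an ObjectMapper whose Double serializer rounds every value
    // to four decimal places before writing it out.
    public static ObjectMapper roundingMapper() {
        SimpleModule module = new SimpleModule("DoubleRounding");
        module.addSerializer(Double.class, new JsonSerializer<Double>() {
            @Override
            public void serialize(Double value, JsonGenerator gen, SerializerProvider sp)
                    throws java.io.IOException {
                // BigDecimal keeps the rounded value exact in the output
                gen.writeNumber(BigDecimal.valueOf(value).setScale(4, RoundingMode.HALF_UP));
            }
        });
        ObjectMapper mapper = new ObjectMapper();
        mapper.registerModule(module);
        return mapper;
    }

    public static void main(String[] args) throws Exception {
        Map<String, Object> m = new HashMap<>();
        m.put("price", 402.48179999999996);
        // prints {"price":402.4818}
        System.out.println(roundingMapper().writeValueAsString(m));
    }
}
```

Note this rounds all doubles the mapper touches; if only some columns need it, a per-column post-processing of the map may be the safer route.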

Related

Easier way of converting Json to Java (Jackson)

I am receiving massive JSON objects from a service, and so far I've been creating POJOs to match the JSON that comes in.
However, this is getting far too tedious: with every different service I hit, I have to build 15-20 new model classes to represent it.
In short, what I'm looking for is a way to get a value I need from a nested object in the JSON, as below (sorry for the format):
random1 {
    random2 {
        arrayOfRandoms
    }
    random3 {
        random4 {
            random5 {
                someValueIWant
            }
        }
    }
}
So in this case I want random5's someValueIWant object. I want to get it without creating the models for random1/3/4/5, as I've been doing this whole time.
I should mention that I use Jackson's ObjectMapper to turn the JSON into Java objects.
Hope this makes sense.
You could experiment with this online POJO generator:
http://www.jsonschema2pojo.org/
It will generate Java classes from plain JSON (or a JSON schema) and even add Jackson annotations.
Make sure you check "Allow additional properties".
It requires valid JSON as input, so don't forget the double quotes around field names and values.
If you find yourself doing this often, there are even scriptable versions and Maven plugins.
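Since the asker already uses Jackson's ObjectMapper, a sketch of an alternative that needs no generated classes at all is the tree model: readTree plus path() walks straight to the nested value. The field names below are taken from the sketch in the question:

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class TreeModelExample {

    // Pulls someValueIWant out of the nested structure without any POJOs.
    public static String readSomeValue(String json) throws Exception {
        JsonNode root = new ObjectMapper().readTree(json);
        // path() never returns null: absent fields yield a "missing node",
        // so a broken path degrades to "" instead of a NullPointerException
        return root.path("random1").path("random3").path("random4")
                   .path("random5").path("someValueIWant").asText();
    }
}
```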

Elegant mapping from POJOs to vertx.io's JsonObject?

I am currently working on a vertx.io application and wanted to use the provided Mongo API for data storage. I currently have a rather clunky abstraction on top of the stock JsonObject classes, where all get and set methods are replaced with things like:
this.backingObject.get(KEY_FOR_THIS_PROPERTY);
This is all well and good for now, but it won't scale particularly well. It also seems dirty, specifically when using nested arrays or objects. For example, if I want to fill fields only when actual data is known, I have to check whether the array exists, and if it doesn't, create it and store it in the object; only then can I add an element to the list. For example:
if (this.backingObject.getJsonArray(KEY_LIST) == null) {
    this.backingObject.put(KEY_LIST, new JsonArray());
}
this.backingObject.getJsonArray(KEY_LIST).add(p.getBackingObject());
I have thought about potential solutions but don't particularly like any of them. Namely, I could use Gson or a similar library with annotation support to load the object, manipulate the data in my code, and then use the serialize/deserialize functions of both Gson and Vert.x to convert between the formats (Vert.x to load the data -> JSON string -> Gson to parse the JSON into POJOs -> make changes -> serialize to a JSON string -> parse with Vert.x and save), but that's a really gross and inefficient workflow. I could also probably come up with some sort of abstract wrapper that extends/implements the Vert.x JSON library but passes all the functionality through to Gson, but that also seems like a lot of work.
Is there any good way to achieve more friendly and maintainable serialization using vertx?
I just submitted a patch to Vert.x that defines two new convenience functions for converting between JsonObject and Java object instances without the inefficiency of going through an intermediate JSON string representation. This will be in version 3.4.
// Create a JsonObject from the fields of a Java object.
// Faster than calling `new JsonObject(Json.encode(obj))`.
public static JsonObject mapFrom(Object obj)
// Instantiate a Java object from a JsonObject.
// Faster than calling `Json.decodeValue(Json.encode(jsonObject), type)`.
public <T> T mapTo(Class<T> type)
Internally this uses ObjectMapper#convertValue(...), see Tim Putnam's answer for caveats of this approach. The code is here.
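A minimal usage sketch of those two methods (Vert.x 3.4+); MyPojo is a made-up class with public fields, used only for illustration:

```java
import io.vertx.core.json.JsonObject;

public class MapFromDemo {

    // Hypothetical POJO with public fields, for illustration only
    public static class MyPojo {
        public String name;
        public int count;
    }

    public static void main(String[] args) {
        MyPojo pojo = new MyPojo();
        pojo.name = "widget";
        pojo.count = 3;

        JsonObject json = JsonObject.mapFrom(pojo);  // POJO -> JsonObject, no String step
        MyPojo back = json.mapTo(MyPojo.class);      // JsonObject -> POJO

        System.out.println(json.encode());
        System.out.println(back.name + " " + back.count);
    }
}
```

Note that mapFrom is static while mapTo is an instance method, mirroring the declarations above.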
I believe Jackson's ObjectMapper.convertValue(..) functions don't convert via String, and Vert.x uses Jackson to manage JsonObject anyway.
JsonObject just has an underlying map representing the values, accessible via JsonObject.getMap(), and a Jackson serializer/deserializer on the public ObjectMapper instance in io.vertx.core.json.Json.
To switch between JsonObject and a data model expressed in Pojos serializable with Jackson, you can do:
JsonObject myVertxMsg = ...
MyPojo pojo = Json.mapper.convertValue ( myVertxMsg.getMap(), MyPojo.class );
I would guess this is more efficient than going via a String (but it's just a guess), and I hate the idea of altering the data class just to suit the environment, so it depends on the context: form vs. performance.
To convert from Pojo to JsonObject, convert to a map with Jackson and then use the constructor on JsonObject:
JsonObject myobj = new JsonObject ( Json.mapper.convertValue ( pojo, Map.class ));
If you have implied nested JsonObjects or JsonArray objects in your definition, they will get instantiated as Maps and Lists by default. JsonObject will internally re-wrap these when you access fields specifying those types (e.g. with getJsonArray(..)).
Because JsonObject is freeform and you're converting to a static type, you may get some unwanted UnrecognizedPropertyExceptions to deal with. It may be useful to create your own ObjectMapper, add the Vert.x JsonObjectSerializer and JsonArraySerializer, and then make configuration changes to suit (such as disabling DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES in Jackson).
Not sure if I've understood you correctly, but it sounds like you're trying to find a simple way of converting POJOs to JsonObject?
So, we have lots of pojos that we send over the EventBus as JsonObjects
I've found the easiest way is to use the Vert.x Json class, which has loads of helper methods to convert to/from JSON strings:
JsonObject jsonObject = new JsonObject(Json.encode(myPojo));
Sometimes you need to add some custom (de)serializers, but we always stick with Jackson - that is what Vert.x is using so they work out of the box.
What we actually do, is provide an interface like the following:
public interface JsonObjectSerializable {
    JsonObject toJson();
}
And all our pojos that need to be sent over the EventBus have to implement this interface.
Then our EventBus sending code looks something like (simplified):
public <T extends JsonObjectSerializable> Response<T> dispatch(T eventPayload);
Also, as we generally don't unit test Pojos, adding this interface encourages the developers to unit test their conversion.
Hope this helps,
Will
Try this:
io.vertx.core.json.Json.mapper.convertValue(json.getMap(), cls)
I think that using Gson as you described is the best possible solution at the current time.
While I agree that if a protocol layer was included in Vert.x it would indeed be first prize, using Gson keeps your server internals pretty organised and is unlikely to be the performance bottleneck.
When and only when this strategy becomes the performance bottleneck have you reached the point to engineer a better solution. Anything before that is premature optimisation.
My two cents.
You can try:
JsonObject.mapFrom(object)
(mapFrom is static, so no intermediate JsonObject instance is needed.)

Validate bad formatted date in JSON with Jackson

I am currently using Jackson's ObjectMapper to deserialise values from a JSON into a POJO. I need to validate those values, so I use validation annotations such as @Regex, @Max and others.
So what happens for now is that I call the ObjectMapper method to read the JSON:
publicEnquiry = objectMapper.readValue(jsonNode, Enquiry.class);
Then I retrieve all the validation messages in a list and return it to the user.
payload = publicEnquiry.getPayload();
Set<ConstraintViolation<Enquiry<payload>>> constraintViolations =
publicEnquiry.getConstraintViolations();
Everything works fine if I send a too-big integer, a badly formatted string, or anything that does not break deserialisation. But if I send a date in an unexpected format, or one that is simply malformed, like "2010-02" instead of "2010-02-03", then I get a JsonMappingException. That is of course expected, because the mapper can't make sense of a badly formatted date.
However, I need to handle those exceptions and add a validation message each time one occurs, in a way that is transparent to the user. I need a message like "Validation failed: the expected format is yyyy-MM-dd", and I still need to perform the normal validation on the other properties of the POJO, as in the normal case.
Unfortunately, Jackson's ObjectMapper doesn't offer a method that would skip the exception-generating fields and give back a list of those troublesome fields, or anything like that. It simply fails when there is a problem.
Does someone have a solution, or at least a suggestion on how to proceed?
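No answer appears in this thread; one possible sketch (my own, not from the thread) is to catch the JsonMappingException and turn its reference path into a validation message. The Enquiry class below is a made-up minimal stand-in for the real POJO, and note the caveat that this aborts at the first bad field, so validating the remaining properties would need a second pass (e.g. mapping the date as a plain String and validating it with a pattern):

```java
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;

import com.fasterxml.jackson.databind.JsonMappingException;
import com.fasterxml.jackson.databind.ObjectMapper;

public class LenientDateRead {

    // Hypothetical minimal stand-in for the real Enquiry POJO
    public static class Enquiry {
        public String name;
        public Date created;
    }

    // Collects a validation message instead of letting the exception escape.
    public static List<String> readWithMessages(String json) {
        List<String> messages = new ArrayList<>();
        ObjectMapper mapper = new ObjectMapper();
        mapper.setDateFormat(new SimpleDateFormat("yyyy-MM-dd"));
        try {
            mapper.readValue(json, Enquiry.class);
        } catch (JsonMappingException e) {
            // getPath() names the offending property, e.g. "created"
            String field = e.getPath().isEmpty() ? "?" : e.getPath().get(0).getFieldName();
            messages.add("Validation failed on '" + field + "': the expected format is yyyy-MM-dd");
        } catch (java.io.IOException e) {
            messages.add(e.getMessage());
        }
        return messages;
    }
}
```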

Parsing a json object which has many fields

I want to parse JSON from a server and place it into a class. I use json4s for this. The issue is that the JSON object contains very many fields, about 40-50 of them, and some of them have long names.
I wonder what a sensible way to store all of them would be. Will I have to create 40-50 fields in a class? Remember, some of them will have long names, as I said earlier.
I use Scala, but a Java approach would be similar, so I added the Java tag as well.
I don't know json4s, but in Jersey with Jackson, for example, you can use a Map to hold the JSON data, or you can use a POJO with all those names.
Sometimes it's better to have the names; it makes the code much easier to understand.
Sometimes it's better to use a Map, for example if the field names change from time to time.
If I recall it correctly, using pure Jackson you do something like this:
String jsonString = ....; // This is the string of JSON stuff
JsonFactory factory = new JsonFactory();
ObjectMapper mapper = new ObjectMapper(factory); // A Jackson class
Map<String,Object> data = mapper.readValue(jsonString, HashMap.class);
You can use a TypeReference to make it a little cleaner as regards the generics. Jackson docs tell more about it. There is also more here: StackOverflow: JSON to Map
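The TypeReference variant mentioned above looks like this; it keeps the generic type intact so no raw HashMap.class or unchecked cast is needed (the wrapper class name is mine):

```java
import java.util.Map;

import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;

public class TypeRefExample {

    public static Map<String, Object> toMap(String jsonString) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        // The anonymous subclass captures Map<String, Object> at runtime
        return mapper.readValue(jsonString, new TypeReference<Map<String, Object>>() {});
    }
}
```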
There are generally two ways of parsing JSON into objects:
1) Parse the JSON into an object representation (a class per type).
2) The other, which might suit you since your object has so many fields, is a map/hashtable; or you could just keep it as a JObject and get the fields when you need them.

Converter from Gson to MongoDB objects

Is anyone aware of a converter to transform Gson objects into DBObjects for MongoDB, similar to https://code.google.com/p/mongo2gson/ but in the other direction (i.e. gson2mongo)?
My aim is to convert a string (which is a valid JSONArray) into a DBObject, so that I can insert it into a Mongo database. There seems to be a standard technique for converting JSON objects into DBObjects, i.e.:
DBObject dbObject = (DBObject) JSON.parse("some json object string");
However, this approach does not work for JSONArrays, and there doesn't seem to be a simple alternative. I've seen a few hacks that work for very simple JSONArrays, but nothing that could be used on a more complex structure. The Gson library has some really useful stuff, and in the link above this problem has been solved in one direction (it allows you to convert from DBObjects to JsonArrays), but not the other way. Hopefully that's a little clearer!
I would suggest using Jongo for interacting with MongoDB, since Gson is only a JSON toolkit.
You can save, query and update POJOs with Jongo; that covers pretty much everything you need to do with MongoDB.
Gson can be used to marshal JSON to POJOs and vice versa, but when it comes to interacting with MongoDB, you can use Jongo with confidence.
They can be mixed too, for example converting a REST response to a POJO with the help of Gson and then writing that information to MongoDB with Jongo.
