I'm saving an object with a java.util.Date field into a MongoDB 3.2 instance.
ObjectMapper mapper = new ObjectMapper();
String json = mapper.writeValueAsString(myObject);
collection.insertOne(Document.parse(json));
The resulting JSON string contains:
"captured": 1454549266735
Then I read it back from the MongoDB instance:
final Document document = collection.find(eq("key", value)).first();
final String json = document.toJson();
ObjectMapper mapper = new ObjectMapper();
mapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
final MyClass myObject = mapper.readValue(json, MyClass.class);
The deserialization fails:
java.lang.RuntimeException: com.fasterxml.jackson.databind.JsonMappingException:
Can not deserialize instance of java.util.Date out of START_OBJECT token
I see that the JSON string created by document.toJson() contains:
"captured": {
"$numberLong": "1454550216318"
}
instead of what was there originally ("captured": 1454549266735).
The MongoDB docs say they started using "MongoDB Extended JSON". I tried both Jackson 1 and 2 to parse it - no luck.
What is the easiest way to convert the Document objects provided by MongoDB 3 to Java POJOs? Maybe I can skip the toJson() step altogether?
I tried mongojack, but it does not support MongoDB 3.
I also looked at a couple of other POJO mappers listed on the MongoDB docs page - they all require adding their custom annotations to the Java classes.
You should define and use custom JsonWriterSettings to fine-tune JSON generation:
JsonWriterSettings settings = JsonWriterSettings.builder()
.int64Converter((value, writer) -> writer.writeNumber(value.toString()))
.build();
String json = new Document("a", 12).append("b", 14L).toJson(settings);
This will produce:
{ "a" : 12, "b" : 14 }
If you do not use custom settings, toJson() will produce extended JSON:
{ "a" : 12, "b" : { "$numberLong" : "14" } }
This looks like a MongoDB Java driver bug, where Document.toJson produces non-standard JSON even if JsonMode.STRICT is used. The problem is described in https://jira.mongodb.org/browse/JAVA-2173, which I encourage you to vote for.
A workaround is to use com.mongodb.util.JSON.serialize(document).
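For example, a sketch using the legacy com.mongodb.util.JSON helper together with the Jackson mapper from the question:
// JSON.serialize keeps int64 values as plain numbers, so Jackson can bind "captured" to a Date.
final Document document = collection.find(eq("key", value)).first();
String json = com.mongodb.util.JSON.serialize(document);
MyClass myObject = mapper.readValue(json, MyClass.class);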
I save a tag with my Mongo document that specifies the original type of the stored object, and then use Gson to parse it with the name of that type. First, to create the stored Document:
private static Gson gson = new Gson();
public static Document ConvertToDocument(Object rd) {
if (rd instanceof Document)
return (Document)rd;
String json = gson.toJson(rd);
Document doc = Document.parse(json);
doc.append(TYPE_FIELD, rd.getClass().getName());
return doc;
}
Then, to read the document back into Java:
public static Object ConvertFromDocument(Document doc) throws CAAException {
String clazzName = doc.getString(TYPE_FIELD);
if (clazzName == null)
throw new RuntimeException("Document was not stored in the DB or got stored without becing created by itemToStoredDocument()");
Class<?> clazz;
try {
clazz = (Class<?>) Class.forName(clazzName);
} catch (ClassNotFoundException e) {
throw new CAAException("Could not load class " + clazzName, e);
}
final String json = com.mongodb.util.JSON.serialize(doc);
return gson.fromJson(json, clazz);
}
Thanks to Aleksey for pointing out JSON.serialize().
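A usage sketch of the two helpers, assuming a MongoCollection<Document> named collection as in the question:
// Store the object with its type tag, then restore it later via the tag.
collection.insertOne(ConvertToDocument(myObject));
Document stored = collection.find(eq("key", value)).first();
MyClass restored = (MyClass) ConvertFromDocument(stored);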
It looks like you are using a Date object inside "myObject". In that case, you should use a DateSerializer that implements JsonSerializer<LocalDate> and JsonDeserializer<LocalDate>, and then register it with GsonBuilder. Sample code follows:
public class My_DateSerializer implements JsonSerializer<LocalDate>,
JsonDeserializer<LocalDate> {
@Override
public LocalDate deserialize(JsonElement json, Type typeOfT,
JsonDeserializationContext context) throws JsonParseException {
final String dateAsString = json.getAsString();
final DateTimeFormatter dtf = DateTimeFormat.forPattern(DATE_FORMAT);
if (dateAsString.length() == 0)
{
return null;
}
else
{
return dtf.parseLocalDate(dateAsString);
}
}
@Override
public JsonElement serialize(LocalDate src, Type typeOfSrc,
JsonSerializationContext context) {
String retVal;
final DateTimeFormatter dtf = DateTimeFormat.forPattern(DATE_FORMAT);
if (src == null)
{
retVal = "";
}
else
{
retVal = dtf.print(src);
}
return new JsonPrimitive(retVal);
}
}
Now register it with GsonBuilder:
final GsonBuilder builder = new GsonBuilder()
.registerTypeAdapter(LocalDate.class, new My_DateSerializer());
final Gson gson = builder.create();
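A short usage sketch, assuming your class has LocalDate fields and DATE_FORMAT is the pattern used above:
// LocalDate fields are written using DATE_FORMAT and parsed back symmetrically.
String json = gson.toJson(myObject);
MyClass restored = gson.fromJson(json, MyClass.class);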
Related
I am trying to read the events from a large JSON file one by one using the Jackson JsonParser. I would like to store each event temporarily in an object, something like a JsonObject or any other object, which I later want to use for some further processing.
I was previously reading the JSON events one by one and storing them into my own custom context: Old Post for JACKSON JsonParser Context, which is working fine. However, rather than the context, I would like to store them in a JsonObject or some other object one by one.
Following is my sample JSON file:
{
"#context":"https://context.org/context.jsonld",
"isA":"SchoolManagement",
"format":"application/ld+json",
"schemaVersion":"2.0",
"creationDate":"2021-04-21T10:10:09+00:00",
"body":{
"members":[
{
"isA":"student",
"name":"ABCS",
"class":10,
"coaching":[
"XSJSJ",
"IIIRIRI"
],
"dob":"1995-04-21T10:10:09+00:00"
},
{
"isA":"teacher",
"name":"ABCS",
"department":"computer science",
"school":{
"name":"ABCD School"
},
"dob":"1995-04-21T10:10:09+00:00"
},
{
"isA":"boardMember",
"name":"ABCS",
"board":"schoolboard",
"dob":"1995-04-21T10:10:09+00:00"
}
]
}
}
At any given time I would like to store only one member, such as a student or a teacher, in my JsonObject.
What's the best way to store each event in an object which I can later use for some processing, and then clear that object and reuse it for the next event?
Following is the code I have so far:
public class Main {
private JSONObject eventInfo;
private final String[] eventTypes = new String[] { "student", "teacher", "boardMember" };
public static void main(String[] args) throws JsonParseException, JsonMappingException, IOException, JAXBException, URISyntaxException {
// Get the JSON Factory and parser Object
JsonFactory jsonFactory = new JsonFactory();
JsonParser jsonParser = jsonFactory.createParser(new File(Main.class.getClassLoader().getResource("inputJson.json").toURI()));
JsonToken current = jsonParser.nextToken();
// Check the first element is Object
if (current != JsonToken.START_OBJECT) {
throw new IllegalStateException("Expected content to be an array");
}
// Loop until the start of the EPCIS EventList array
while (jsonParser.nextToken() != JsonToken.START_ARRAY) {
System.out.println(jsonParser.getCurrentToken() + " --- " + jsonParser.getCurrentName());
}
// Goto the next token
jsonParser.nextToken();
// Call the method to loop until the end of the events file
eventTraverser(jsonParser);
}
// Method which will traverse through the eventList and read event one-by-one
private static void eventTraverser(JsonParser jsonParser) throws IOException {
// Loop until the end of the EPCIS events file
while (jsonParser.nextToken() != JsonToken.END_OBJECT) {
//Is there a possibility to store the complete object directly in an JSON Object or I need to again go through every token to see if is array and handle it accordingly as mentioned in my previous POST.
}
}
}
After trying some things I was able to get it working. I am posting the whole code as it can be useful to someone in the future, because I know how frustrating it is to find a proper working code sample:
public class Main
{
// Collects the generated XML for each event (referenced in xmlCreator below)
private final List<String> eventsList = new ArrayList<> ();
public void xmlConverter (InputStream jsonStream) throws IOException, JAXBException, XMLStreamException
{
// jsonStream is the input JSOn which is normally passed by reading the JSON file
// Get the JSON Factory and parser Object
final JsonFactory jsonFactory = new JsonFactory ();
final JsonParser jsonParser = jsonFactory.createParser (jsonStream);
final ObjectMapper objectMapper = new ObjectMapper ();
//To read the duplicate keys if there are any key duplicate json
final SimpleModule module = new SimpleModule ();
module.addDeserializer (JsonNode.class, new JsonNodeDupeFieldHandlingDeserializer ());
objectMapper.registerModule (module);
jsonParser.setCodec (objectMapper);
// Check the first element is Object if not then invalid JSON throw error
if (jsonParser.nextToken () != JsonToken.START_OBJECT)
{
throw new IllegalStateException ("Expected content to be an array");
}
while (jsonParser.nextToken () != null && !"members".equals (jsonParser.getText ()))
{
//Skipping the elements till the members key
// if you want you can do some process here
// I am skipping for now
}
// Goto the next token
jsonParser.nextToken ();
while (jsonParser.nextToken () != JsonToken.END_ARRAY)
{
final JsonNode jsonNode = jsonParser.readValueAsTree ();
//Check if the JsonNode is valid if not then exit the process
if (jsonNode == null || jsonNode.isNull ())
{
System.out.println ("End Of File");
break;
}
// Get the eventType
final String eventType = jsonNode.get ("isA").asText ();
// Based on eventType call different type of class
switch (eventType)
{
case "student":
final Student studentInfo =
objectMapper.treeToValue (jsonNode, Student.class);
//I was calling the JAXB Method as I was doing the JSON to XML Conversion
xmlCreator (studentInfo, Student.class);
break;
case "teacher":
final Teacher teacherInfo =
objectMapper.treeToValue (jsonNode, Teacher.class);
xmlCreator (teacherInfo, Teacher.class);
break;
}
}
}
//Method to create the XML using the JAXB
private void xmlCreator (Object eventInfo,
Class eventType) throws JAXBException
{
final StringWriter sw = new StringWriter ();
// Create JAXB Context object
JAXBContext context = JAXBContext.newInstance (eventType);
// Create Marshaller object from JAXBContext
Marshaller marshaller = context.createMarshaller ();
// Print formatted XML
marshaller.setProperty (Marshaller.JAXB_FORMATTED_OUTPUT, Boolean.TRUE);
// Do not add the <xml> version tag
marshaller.setProperty (Marshaller.JAXB_FRAGMENT, Boolean.TRUE);
// XmlSupportExtension is an interface that every class such as Student Teacher implements
// xmlSupport is a method in XmlSupportExtension which has been implemented in all classes
// Create the XML based on type of incoming event type and store in SW
marshaller.marshal (((XmlSupportExtension) eventInfo).xmlSupport (),
sw);
// Add each event within the List
eventsList.add (sw.toString ());
// Clear the StringWritter for next event
sw.getBuffer ().setLength (0);
}
}
This is the class that overrides Jackson's JsonNodeDeserializer.
It can be used if your JSON has duplicate keys. Follow this post for the complete explanation if you need it; if you don't, skip this part and remove the module-registration code from the class above:
Jackson @JsonAnySetter ignores values of duplicate key when used with Jackson ObjectMapper treeToValue method
@JsonDeserialize(using = JsonNodeDupeFieldHandlingDeserializer.class)
public class JsonNodeDupeFieldHandlingDeserializer extends JsonNodeDeserializer {
@Override
protected void _handleDuplicateField(JsonParser p, DeserializationContext ctxt, JsonNodeFactory nodeFactory, String fieldName,
ObjectNode objectNode, JsonNode oldValue, JsonNode newValue) {
ArrayNode asArrayValue = null;
if (oldValue.isArray()) {
asArrayValue = (ArrayNode) oldValue;
} else {
asArrayValue = nodeFactory.arrayNode();
asArrayValue.add(oldValue);
}
asArrayValue.add(newValue);
objectNode.set(fieldName, asArrayValue);
}
}
I would like to make sure o is a serializable top-level JSON object, that is [] or {}, and otherwise throw an exception. I have tried the following code using "" and null as input, but they do not trigger an exception.
static void checkIsjsonSerializable(Object o, String message)
throws MissingRequiredValueException {
try{
Gson gson = new Gson();
gson.toJson(o);
} catch (Exception e) {
throw new MissingRequiredValueException(message);
}
}
What would need to change to get the check I want?
Update:
After the comments it is clear that my understanding was wrong. My question has changed to:
How can I assert that only [] and {} are valid in the following function?
As others have mentioned, modern definitions of JSON do allow primitives (strings, numbers, booleans, null) as top-level elements. But if you really need to do this check with GSON, here's one option:
private static final Gson gson = new Gson();
static void checkIsjsonSerializable(Object o, String message)
throws MissingRequiredValueException {
JsonElement rootElement = gson.toJsonTree(o);
if (!rootElement.isJsonArray() && !rootElement.isJsonObject()) {
throw new MissingRequiredValueException(message);
}
}
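A quick usage sketch of that check:
// Objects and arrays pass; anything else (primitives, null) throws.
checkIsjsonSerializable(Collections.singletonMap("a", 1), "expected {} or []");   // passes
checkIsjsonSerializable(Arrays.asList(1, 2, 3), "expected {} or []");             // passes
checkIsjsonSerializable("just a string", "expected {} or []");                    // throws
checkIsjsonSerializable(null, "expected {} or []");                               // throws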
I have the MongoDB code below in Java. I am trying to insert a TechnologyDetails object into MongoDB,
and in the mapper method I set all the values and persist them to the DB. The problem is that collection.insertOne() only takes a Document as an argument, so after I convert the TechnologyDetails POJO to a Document, the "createdAt" Date field gets inserted into MongoDB as a String. Could anyone help with how to keep the same data type even after converting the POJO to a Document, so that I insert a Date as a Date into MongoDB? Thanks.
final FindIterable<Document> iterable = technologiesCollection
.find(and(eq(APPLICATION, techKey.getApplication()), eq(VERSION, techKey.getVersion()),
eq(TECHNOLOGY, techKey.getTechnology())));
final Document document = iterable.first();
final ObjectMapper mapper = new ObjectMapper();
final TechnologyDetails technology = mapper.convertValue(document, TechnologyDetails.class);
if (technology == null) {
//mapper method to set the technology fields
Document tech = mapper(techKey, hosts);
try {
technologiesCollection.insertOne(tech);
} catch (Exception e) {
LOGGER.error("error", e);
}
}
private Document mapper(final TechnologyKey techKey, final Set<ApplicationHost> hosts) {
final TechnologyDetails technology = new TechnologyDetails();
final TransactionDetail txnDetail = new TransactionDetail();
final UserDetail userDetail = new UserDetail();
technology.setApplication(techKey.getApplication());
technology.setVersion(techKey.getVersion());
technology.setTechnology(techKey.getTechnology());
if (hosts != null) {
technology.setApplicationHosts(hosts);
}
userDetail.setDsid("123");
userDetail.setName("APP");
txnDetail.setCreatedBy(userDetail);
final Date date = new Date(System.currentTimeMillis());
txnDetail.setCreatedAt(date);
technology.setTxnDetails(txnDetail);
Document document = Document.parse(new JSONObject(technology).toString());
return document;
}
I would recommend using an ODM such as Morphia:
https://www.baeldung.com/mongodb-morphia
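A minimal sketch with Morphia 1.x (an assumption on my part: the entity and package names below are hypothetical, mirror the question loosely, and the exact API differs between Morphia versions):
// Morphia maps the POJO directly, so java.util.Date fields are stored as BSON dates.
@Entity("technologies")
public class TechnologyDetails {
    @Id
    private ObjectId id;
    private String application;
    private String version;
    private String technology;
    private Date createdAt;   // persisted as a Date, not a String
    // getters/setters omitted
}
// Saving without any manual Document/JSON conversion (mongoClient is an existing MongoClient):
Morphia morphia = new Morphia();
morphia.mapPackage("com.example.model");                        // hypothetical package
Datastore datastore = morphia.createDatastore(mongoClient, "techdb");
datastore.save(technologyDetails);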
I have a JSON file which contains a list of a given type, for example:
[{
"createdBy":"SYSTEM_INIT",
"status":401,
"message":"Unauthorized",
"description":"",
"lang":"en"
},{
"createdBy":"SYSTEM_INIT",
"status":413,
"message":"Payload Too Large",
"description":"The request is larger than the server is willing or able to process",
"lang":"en"
}{....
I have a Java class (in this case call it MyClass) which represents this one entity.
When I want to deserialize the JSON into a collection I only have a Class object, which is MyClass.class. How can I read it into a Collection<MyClass>?
Actually it's a Spring application and this is my current code:
public boolean importFromJsonFile(String table, Class clazz) {
Collection<Object> objectCollection = new ArrayList<>();
ObjectMapper mapper = new ObjectMapper();
try {
ClassLoader classLoader = getClass().getClassLoader();
File file = new File(classLoader.getResource("init/" + table + ".import.json").getFile());
objectCollection = mapper.readValue(file, Collection.class);
}catch (Exception e){
log.error("error: ", e);
return false;
}
if(objectCollection.isEmpty()){
log.error("There is no data in the file");
return false;
}
try {
for(Object obj : objectCollection){
save(obj, clazz);
}
}catch (Exception e){
log.error("fatal error: ", e);
return false;
}
return true;
}
So my problem is that after reading the file, the result is a Collection of LinkedHashMap instead of a Collection of MyClass, which in this particular sample is represented by the variable clazz. Is there any way to do this?
There's a way:
List<MyClass> myObjects = Arrays.asList(mapper.readValue(file, MyClass[].class));
You can find different approaches here; however, this one is faster.
Based on the comments (as it is mentioned that the class variable and the file are present at runtime):
public <T> List<T> getObjects(Class<T> aClass, File file) throws ClassNotFoundException, IOException {
ObjectMapper mapper = new ObjectMapper();
//Fetches the arrayed class, e.g. "[Lcom.example.MyClass;"
@SuppressWarnings("unchecked")
Class<T[]> namedClass = (Class<T[]>) Class.forName("[L" + aClass.getName() + ";");
List<T> myObjects = Arrays.asList(mapper.readValue(file, namedClass));
return myObjects;
}
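An alternative sketch that avoids building the array class name by hand, using Jackson's TypeFactory to construct the collection type from the runtime Class (assumes Jackson 2.x; the helper name is mine):
public <T> List<T> readListOf(Class<T> elementClass, File file) throws IOException {
ObjectMapper mapper = new ObjectMapper();
// Build the List<T> JavaType from the runtime element class
CollectionType listType = mapper.getTypeFactory().constructCollectionType(List.class, elementClass);
return mapper.readValue(file, listType);
}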
To solve my type mismatch problem discussed in this thread, I created custom deserializers and added them to the ObjectMapper. However, the performance deteriorates significantly with this.
With the default deserializer I get 1-2 garbage collection calls in logcat, while with the custom deserializer there are at least 7-8 GC calls, and hence the processing time also increases significantly.
My deserializer:
public class Deserializer<T> {
public JsonDeserializer<T> getDeserializer(final Class<T> cls) {
return new JsonDeserializer<T> (){
@Override
public T deserialize(JsonParser jp, DeserializationContext arg1) throws IOException, JsonProcessingException {
JsonNode node = jp.readValueAsTree();
if (node.isObject()) {
return new ObjectMapper().convertValue(node, cls);
}
return null;
}
};
}
}
And I am using this to attach it to the mapper:
public class DeserializerAttachedMapper<T> {
public ObjectMapper getMapperAttachedWith(final Class<T> cls , JsonDeserializer<T> deserializer) {
ObjectMapper mapper = new ObjectMapper();
SimpleModule module = new SimpleModule(deserializer.toString(), new Version(1, 0, 0, null, null, null));
module.addDeserializer(cls, deserializer);
mapper.registerModule(module);
return mapper;
}
}
EDIT: Added extra data
My JSON is of considerable size but not huge:
I have pasted it here
Now, for parsing the same JSON, if I use this code:
String response = ConnectionManager.doGet(mAuthType, url, authToken);
FLog.d("location object response" + response);
// SimpleModule module = new SimpleModule("UserModule", new Version(1, 0, 0, null, null, null));
// JsonDeserializer<User> userDeserializer = new Deserializer<User>().getDeserializer(User.class);
// module.addDeserializer(User.class, userDeserializer);
ObjectMapper mapper = new ObjectMapper();
// mapper.registerModule(module);
JsonNode tree = mapper.readTree(response);
Integer code = Integer.parseInt(tree.get("code").asText().trim());
if(Constants.API_RESPONSE_SUCCESS_CODE == code) {
ExploreLocationObject locationObject = mapper.convertValue(tree.path("response").get("locationObject"), ExploreLocationObject.class);
FLog.d("locationObject" + locationObject);
FLog.d("locationObject events" + locationObject.getEvents().size());
return locationObject;
}
return null;
Then my logcat is like this
But if I use this code for the same JSON:
String response = ConnectionManager.doGet(mAuthType, url, authToken);
FLog.d("location object response" + response);
SimpleModule module = new SimpleModule("UserModule", new Version(1, 0, 0, null, null, null));
JsonDeserializer<User> userDeserializer = new Deserializer<User>().getDeserializer(User.class);
module.addDeserializer(User.class, userDeserializer);
ObjectMapper mapper = new ObjectMapper();
mapper.registerModule(module);
JsonNode tree = mapper.readTree(response);
Integer code = Integer.parseInt(tree.get("code").asText().trim());
if(Constants.API_RESPONSE_SUCCESS_CODE == code) {
ExploreLocationObject locationObject = mapper.convertValue(tree.path("response").get("locationObject"), ExploreLocationObject.class);
FLog.d("locationObject" + locationObject);
FLog.d("locationObject events" + locationObject.getEvents().size());
return locationObject;
}
return null;
Then my logcat is like this
How big is the object? Your code basically builds a tree model (a sort of DOM tree), and that will take something like 3x-5x as much memory as the original document. So I assume your input is a huge JSON document.
You can definitely write a more efficient version using the Streaming API. Something like:
JsonParser jp = mapper.getJsonFactory().createJsonParser(input);
JsonToken t = jp.nextToken();
if (t == JsonToken.START_OBJECT) {
return mapper.readValue(jp, classToBindTo);
}
return null;
It is also possible to implement this with data binding (as a JsonDeserializer), but it gets a bit complicated just because you want to delegate to the "default" deserializer.
To do this, you would need to implement BeanDeserializerModifier and replace the standard deserializer when "modifyDeserializer" is called: your own code can retain a reference to the original deserializer and delegate to it, instead of using the intermediate tree model; a sketch of that idea follows below.
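A minimal sketch of that approach (assumes Jackson 2.x; User stands in for whichever class needs special handling, and the class names are mine):
// Keep a reference to the default deserializer and delegate to it,
// so no intermediate tree model is built for User instances.
public class UserDeserializerModifier extends BeanDeserializerModifier {
    @Override
    public JsonDeserializer<?> modifyDeserializer(DeserializationConfig config,
            BeanDescription beanDesc, JsonDeserializer<?> deserializer) {
        if (beanDesc.getBeanClass() == User.class) {
            return new DelegatingUserDeserializer(deserializer);
        }
        return deserializer;
    }
}

public class DelegatingUserDeserializer extends DelegatingDeserializer {
    public DelegatingUserDeserializer(JsonDeserializer<?> delegate) {
        super(delegate);
    }

    @Override
    protected JsonDeserializer<?> newDelegatingInstance(JsonDeserializer<?> newDelegate) {
        return new DelegatingUserDeserializer(newDelegate);
    }

    @Override
    public Object deserialize(JsonParser jp, DeserializationContext ctxt) throws IOException {
        // Custom pre/post processing goes here; the actual binding is left
        // to the default deserializer, avoiding the tree model entirely.
        return super.deserialize(jp, ctxt);
    }
}

// Registration with the mapper:
SimpleModule module = new SimpleModule("UserModule", Version.unknownVersion());
module.setDeserializerModifier(new UserDeserializerModifier());
mapper.registerModule(module);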
If you are not tied to Jackson, you could also try Genson http://code.google.com/p/genson/.
In your case there are two main advantages: you will not lose any performance, and it should be easier to implement. If the property event does not start with an upper-case letter, annotate it with @JsonProperty("Event") (same for the other properties starting with an upper-case letter).
With the following code you should be done:
Genson genson = new Genson.Builder()
.withDeserializerFactory(new EventDeserializerFactory()).create();
YourRootClass[] bean = genson.deserialize(json, YourRootClass[].class);
class EventDeserializerFactory implements Factory<Deserializer<Event>> {
public Deserializer<Event> create(Type type, Genson genson) {
return new EventDeserializer(genson.getBeanDescriptorFactory().provide(Event.class,
genson));
}
}
class EventDeserializer implements Deserializer<Event> {
private final Deserializer<Event> standardEventDeserializer;
public EventDeserializer(Deserializer<Event> standardEventDeserializer) {
this.standardEventDeserializer = standardEventDeserializer;
}
public Event deserialize(ObjectReader reader, Context ctx) throws TransformationException,
IOException {
if (ValueType.ARRAY == reader.getValueType()) {
reader.beginArray().endArray();
return null;
}
return standardEventDeserializer.deserialize(reader, ctx);
}
}