There's a chat whose content I want to parse. I have the URL that returns it as JSON, which looks like this:
{"messages":[
{"id":"111111","uid":"22222","name":"User","message":"Message","date":"2013-02-15 17:21:54","chatId":"111"},
{"id":"111111","uid":"22222","name":"User","message":"Message","date":"2013-02-15 17:21:54","chatId":"111"},
{"id":"111111","uid":"22222","name":"User","message":"Message","date":"2013-02-15 17:21:54","chatId":"111"}
]}
But this JSON is limited to, I think, approximately 20-30 records. New records are added at the beginning, so later it looks like:
{"messages":[
{"id":"222222","uid":"33333","name":"User","message":"Message","date":"2013-02-15 18:21:59","chatId":"111"},
{"id":"111111","uid":"22222","name":"User","message":"Message","date":"2013-02-15 17:21:54","chatId":"111"},
{"id":"111111","uid":"22222","name":"User","message":"Message","date":"2013-02-15 17:21:55","chatId":"111"}
]}
.......
{"messages":[
{"id":"333333","uid":"44444","name":"User","message":"Message","date":"2013-02-15 19:13:34","chatId":"111"},
{"id":"222222","uid":"33333","name":"User","message":"Message","date":"2013-02-15 18:21:59","chatId":"111"},
{"id":"111111","uid":"22222","name":"User","message":"Message","date":"2013-02-15 17:21:54","chatId":"111"}
]}
I'm going to read this JSON via GSON or JSON-java and write it to some output; which one doesn't matter :)
But are there any best practices for parsing new records in a dynamically updated JSON feed? I don't have any way to be notified that it was updated, and simply reading it every second and writing the results to the output would, I think, produce duplicates.
You will need to slice this data somehow... To keep your GSON parsing from slowing down as the feed grows, I think you would need to preprocess the data and slice off the part you have already seen. I would do something like this:
First execution: parse the file entirely and store the first id, since it is the newest record;
Second execution (and later ones): read the file into a StringBuilder. Using the String obtained, slice off the disposable data, since you already have the id stored from the previous run; that id shows you where the slicing has to start. Then parse the remaining data with GSON and store the first id again.
You may use code like the following to perform the slicing and adapt it to the idea described above:
String data = "{\"messages\":[" +
"{\"id\":\"333333\",\"uid\":\"44444\",\"name\":\"User\",\"message\":\"Message\",\"date\":\"2013-02-15 19:13:34\",\"chatId\":\"111\"}," +
"{\"id\":\"222222\",\"uid\":\"33333\",\"name\":\"User\",\"message\":\"Message\",\"date\":\"2013-02-15 18:21:59\",\"chatId\":\"111\"}," +
"{\"id\":\"111111\",\"uid\":\"22222\",\"name\":\"User\",\"message\":\"Message\",\"date\":\"2013-02-15 17:21:54\",\"chatId\":\"111\"}" +
"]}";
String lastId = "111111";
int sliceUntil = data.indexOf( "{\"id\":\"" + lastId + "\"" );
// since your disposable data is in the "tail" of your file,
// you just need to get the valid data (the data until the "last id")
// and add the chars "]" and "}" to close your JSON
String newData = data.substring( 0, sliceUntil ) + "]}";
System.out.println( newData );
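Continuing that idea, a small sketch of the follow-up step (parsing the trimmed JSON with GSON and remembering the newest id for the next execution; JsonParser, JsonObject and JsonArray are GSON's own classes, and newData/lastId are the variables from the snippet above):
import com.google.gson.JsonArray;
import com.google.gson.JsonObject;
import com.google.gson.JsonParser;

// parse the trimmed JSON and keep the newest id for the next run
JsonObject root = JsonParser.parseString(newData).getAsJsonObject();   // GSON 2.8.6+
JsonArray messages = root.getAsJsonArray("messages");
String newestId = messages.size() > 0
        ? messages.get(0).getAsJsonObject().get("id").getAsString()
        : lastId;   // nothing new this round, so keep the previous id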
You have a unique "date" for every message, so you can just create a Map<Date, Message> and update it with all the elements it doesn't already contain. Or you can just use a Map<String, Message> keyed by the date string if you don't need time sorting.
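A minimal sketch of that map-based de-duplication, assuming GSON and simple POJOs mirroring the JSON above (all class and field names here are just illustrative):
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import com.google.gson.Gson;

// Illustrative POJOs mirroring the feed shown above
class Message {
    String id;
    String uid;
    String name;
    String message;
    String date;
    String chatId;
}

class ChatPage {
    List<Message> messages;
}

class ChatPoller {
    private final Gson gson = new Gson();
    // keyed by the "date" string; insertion order is kept
    private final Map<String, Message> seen = new LinkedHashMap<String, Message>();

    // Merge one fetched page, silently skipping messages that were already seen.
    void merge(String json) {
        ChatPage page = gson.fromJson(json, ChatPage.class);
        for (Message m : page.messages) {
            if (!seen.containsKey(m.date)) {
                seen.put(m.date, m);
            }
        }
    }
}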
I have a slight problem with the data returned from a Mongo find() vs. a findOneAndReplace().
First, for context: I'm working on an API that queries Mongo and returns data in JSON format.
The problem I'm having is that if I do a findOneAndReplace() to update and return the modified document, much like so:
// javaDocument() is just an org.bson.Document
var modDoc = jobsCollection.findOneAndReplace( javaDocument({"_id": jobid}), javaDocument(jobData), foarOptions );
if(isDefined("modDoc")) {
sReturn.DATA = deserializeJSON(modDoc.toJSON());
}
the dates set in my document come back wrapped in subkeys named "$date", which I do not want:
It really should read:
"created_at": "2022-02-03T10:15:01.634Z"
Doing a simple jobsCollection.find(), however, seems to return the date appropriately, like so:
var data= [];
jobsCollection.find( javaDocument({}) ).into(data);
sReturn.DATA = data;
What am I missing here? I could simply copy the "$date" key and fix the structure, but I don't always know where in the structure I'll have dates... Is there a way I could have this properly returned with findOneAndReplace()? I'm thinking it might have something to do with modDoc.toJSON(), but I haven't quite found my answer yet...
Thanks for your time. Cheers! Pat
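For what it's worth, the "$date" wrapper is MongoDB extended JSON, which is what the Java driver's Document.toJson() emits by default; toJson() also accepts a JsonWriterSettings that lets you render dates as plain ISO-8601 strings instead. A rough Java sketch of that (driver 3.6+, and it assumes the object behind modDoc is an ordinary org.bson.Document; the class name is just for illustration):
import java.time.Instant;
import java.util.Date;
import org.bson.Document;
import org.bson.json.JsonWriterSettings;

public class DateOutputSketch {
    public static void main(String[] args) {
        Document doc = new Document("created_at", new Date());

        // render BSON dates as ISO-8601 strings instead of {"$date": ...}
        JsonWriterSettings settings = JsonWriterSettings.builder()
                .dateTimeConverter((millis, writer) ->
                        writer.writeString(Instant.ofEpochMilli(millis).toString()))
                .build();

        System.out.println(doc.toJson(settings));
        // e.g. {"created_at": "2022-02-03T10:15:01.634Z"}
    }
}
Whether that carries over cleanly to the CFML layer depends on how deserializeJSON handles the result, but it at least points at where the "$date" wrapping comes from.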
Is there any way to convert the string below to a list?
This string is retrieved after scanning a QR code.
CashRequest{
orderid='0',
user_id='nvHt2U5RnqUwXB4ZK37Zn1DXPV82',
userName='username',
userEmail='whateveremailthisis#email.blabla',
fullName='full name',
phoneNumber=0,
totalCash='$304.00',
totalRV='$34.00',
foods=[
Order{
userID='nvHt2U5RnqUwXB4ZK37Zn1DXPV82',
ProductID='-LMDiT7klgoXU8bQEM-4',
ProductName='Coke',
Quantity='4',
Price='1',
RedemptionPrice='10',
RedemptionValue='1'},
Order{
userID='nvHt2U5RnqUwXB4ZK37Zn1DXPV82',
ProductID='1000',
ProductName='Kunau Ring Ring Pradu',
Quantity='3',
Price='100',
RedemptionPrice='10',
RedemptionValue='10'
}
]
}
The desired output is to store it in the Firebase Realtime Database as below:
Well, you have a few options. Since the values are separated by newlines, you could read line by line, check whether each line starts with the reserved word you are looking for and substring from there, but that can get messy and produce a lot of bloat code.
The simplest way would be to do the known replacements first.
Make a method that replaces all the bad JSON keys with quote-surrounded JSON keys, like:
val myJsonCorrected = yourStringAbove.replace("Order", "\"Order\"")
Repeat for all known keys (you will also need to turn the = signs into colons) until you have made it into valid JSON. Single quotes are fine for the values, but the keys need double quotes as well.
Then simply create an object that matches the json format.
class CashRequestModel {
    @SerializedName("orderid")
    var orderID: Int? = null
    // etc.....
    @SerializedName("foods")
    var myFoods: ArrayList<OrderModel>? = null
}
class OrderModel {
    @SerializedName("userID")
    var userID: String? = null
    @SerializedName("ProductID")
    var productID: String? = null
    // etc..
}
Then simply convert it from JSON
val cashRequest = getGson().fromJson(cleanedUpJson, CashRequestModel::class.java)
and you're done. Now just use the list. Of course it would be better if you could get valid JSON without having to clean it up first, but it looks like the keys are known, so you can easily code string replacements to fix the bad JSON before mapping it onto an object that matches the structure.
Hope that helps.
In Java, using the Jackson ObjectMapper, I'm trying to deserialize a DynamoDB object being read from a DynamoDB stream.
I first call:
record.getDynamodb().getNewImage().get("primaryKey").getS().toString()
to get the primaryKey value of "1_12345" back from the stream.
I then use it in the ObjectMapper to create a new instance of the Metrics object with the primaryKey member set: objectMapper.readValue("1_12345", Metrics.class);
The problem is I get an exception on that call:
Unexpected character ('_' (code 95)): Expected space separating root-level values
Metrics.class is a simple POJO with no constructor. I'm wondering if I need any special annotations or escape characters in my readValue call. I can't seem to find any clear indications on what the solution is in the case of this error.
(Side note: the reason I can't parse it straight from the JSON is that the JSON's structure, as it comes off the stream, isn't straightforward; a value looks like this, with S indicating String, N number, etc.:
{primaryKey={S: 1_12345,}, rangeKey={N: xxx}... etc. })
Thank you, that was the problem: the readValue() call expects a String containing JSON. The solution was to convert the streamed DynamoDB image into lists & maps (using the dynamodbv2 document libs) until it was in the correct format, as below:
// wrap the streamed attribute-value map so the dynamodbv2 document API can convert it
Map<String, AttributeValue> newImage = record.getDynamodb().getNewImage();
List<Map<String, AttributeValue>> listOfMaps = new ArrayList<Map<String, AttributeValue>>();
listOfMaps.add(newImage);
List<Item> itemList = InternalUtils.toItemList(listOfMaps);
for (Item item : itemList) {
    String json = item.toJSON();   // plain JSON, safe to hand to Jackson
    Metrics metric = objectMapper.readValue(json, Metrics.class);
}
I'm building a logging application that does the following:
gets JSON strings from many loggers continuously and saves them to a db
serves the collected data as a per logger bulk
My intention is to use a document-based NoSQL store so that the bulk structure is available right away. After some research I decided to go for MongoDB because of the following features:
- comprehensive functions to insert data into existing structures ($push, (capped) collection)
- automatic sharding with a key I choose (so I can shard on a per logger basis and therefore serve bulk data in no time - all data already being on the same db server)
The JSON I get from the loggers looks like this:
[
{"bdy":{
"cat":{"id":"36494h89h","toc":55,"boc":99},
"dataT":"2013-08-12T13:44:03Z","had":0,
"rng":23,"Iss":[{"id":10,"par":"dim, 10, dak"}]
},"hdr":{
"v":"0.2.7","N":2,"Id":"KBZD348940"}
}
]
The logger can send more than one element in the same array; in this example it is just one.
I started coding in Java with the Mongo driver, and the first problem I discovered was that I have to parse my (no doubt valid) JSON to be able to save it in MongoDB. I learned that this is because BSON is the native format of MongoDB. I would have liked to forward the JSON string to the db directly and save that extra execution time.
so what I do in a first Java test to save just this JSON string is:
String loggerMessage = "...the above JSON string...";
DBCollection coll = db.getCollection("logData");
DBObject message = (DBObject) JSON.parse(loggerMessage);
coll.insert(message);
the last line of this code causes the following exception:
Exception in thread "main" java.lang.IllegalArgumentException: BasicBSONList can only work with numeric keys, not: [_id]
at org.bson.types.BasicBSONList._getInt(BasicBSONList.java:161)
at org.bson.types.BasicBSONList._getInt(BasicBSONList.java:152)
at org.bson.types.BasicBSONList.get(BasicBSONList.java:104)
at com.mongodb.DBCollection.apply(DBCollection.java:767)
at com.mongodb.DBCollection.apply(DBCollection.java:756)
at com.mongodb.DBApiLayer$MyCollection.insert(DBApiLayer.java:220)
at com.mongodb.DBApiLayer$MyCollection.insert(DBApiLayer.java:204)
at com.mongodb.DBCollection.insert(DBCollection.java:76)
at com.mongodb.DBCollection.insert(DBCollection.java:60)
at com.mongodb.DBCollection.insert(DBCollection.java:105)
at mongomockup.MongoMockup.main(MongoMockup.java:65)
I tried to save this JSON via the mongo shell and it works perfectly.
How can I get this done in Java?
How could I maybe save the extra parsing?
What structure would you choose to save the data? Array of messages in the same document, collection of messages in single documents, ....
It didn't work because of the array. You need a BasicDBList to be able to save multiple messages. Here is my new solution that works perfectly:
// JSON.parse returns a BasicDBList for a top-level JSON array; insert each element as its own document
BasicDBList data = (BasicDBList) JSON.parse(loggerMessage);
for (int i = 0; i < data.size(); i++) {
    coll.insert((DBObject) data.get(i));
}
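A small variation on the same idea, sketched against the same legacy driver API: collect the elements into a List<DBObject> and hand the whole batch to insert() in one call.
import java.util.ArrayList;
import java.util.List;
import com.mongodb.BasicDBList;
import com.mongodb.DBObject;
import com.mongodb.util.JSON;

// insert all messages from the parsed array in a single batch call
BasicDBList data = (BasicDBList) JSON.parse(loggerMessage);
List<DBObject> docs = new ArrayList<DBObject>();
for (Object element : data) {
    docs.add((DBObject) element);
}
coll.insert(docs);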
I have a JSONObject which in turn contains two JSONObjects (the keys are rows_map and columns_map):
{
"rows_map":{
"3":["Test","Test","Test","Test","Test",null,null,null,null,null,"2011-10-07 15:47:56.0",null,null],
"2":["test","","","","",null,null,null,"123456789","123456789.user","2011-10-07 12:49:49.0",null,null]
},
"columns_map":{
columns_map":"fld1","fld2","fld3","fld4","fld5","Latitude","Longitude","Altitude","Mobile Number","Name","Time","Message","Advertisment"]
}
}
In rows_map, 3 and 2 are record numbers.
Each record's values correspond to the columns in columns_map.
I want to get a list of records based on the mobile number column,
e.g. the list of records whose mobile number equals 123456789.
How can I do this?
If you're trying to read JSON using Java, you should parse it into objects with a JSON library rather than picking it apart by hand. Take a look at:
https://stackoverflow.com/questions/338586/a-better-java-json-library
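A minimal sketch of that object-based approach using org.json, assuming the feed is shaped like the (reconstructed) JSON above and that cell values are compared as strings; the class and method names are just illustrative:
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import org.json.JSONArray;
import org.json.JSONObject;

public class RowsByMobileNumber {

    // Returns every row (a JSONArray of cell values) whose "Mobile Number" cell matches.
    static List<JSONArray> findByMobileNumber(JSONObject root, String mobileNumber) {
        // column names, in the same order as the cells of each row
        JSONArray columns = root.getJSONObject("columns_map").getJSONArray("columns_map");
        int mobileIdx = -1;
        for (int i = 0; i < columns.length(); i++) {
            if ("Mobile Number".equals(columns.getString(i))) {
                mobileIdx = i;
                break;
            }
        }

        List<JSONArray> matches = new ArrayList<JSONArray>();
        if (mobileIdx < 0) {
            return matches; // no such column
        }
        JSONObject rows = root.getJSONObject("rows_map");
        Iterator<String> keys = rows.keys();
        while (keys.hasNext()) {
            JSONArray row = rows.getJSONArray(keys.next());
            if (mobileNumber.equals(row.optString(mobileIdx))) {
                matches.add(row);
            }
        }
        return matches;
    }
}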