I have two JSON strings, each essentially an array of JSONObjects. The two JSON strings have the following structure and keys:
JSON-1:
[
{
"title": "ABC",
"edition": 7,
"year": 2011
},
{
"title": "DEF",
"edition": 2,
"year": 2012
},
{
"title": "XYZ",
"edition": 3,
"year": 2013
}
]
And, JSON-2:
[
{
"title": "ABC",
"price": "20"
},
{
"title": "DEF",
"price": "20"
},
{
"title": "XYZ",
"price": "20"
}
]
Both these JSONs have a common key "title" based on which I want to merge these two JSONs, either merging JSON-2 values into JSON-1 or creating a JSON object with the merged result.
The merged result should look like below:
[
{
"title": "ABC",
"edition": 7,
"year": 2011,
"price": "20"
},
{
"title": "DEF",
"edition": 2,
"year": 2012
"price": "20"
},
{
"title": "XYZ",
"edition": 3,
"year": 2013
"price": "20"
}
]
How can I achieve this with minimal looping and minimal object creation? I also cannot use entity/model classes; the idea is to do it without creating any model classes.
Note: I cannot use Gson because I don't have approval to use it.
I tried List<JSONObject> listObj = objectMapper.readValue(jsonOneString, new TypeReference<List<JSONObject>>(){}); but I am getting an unknown property exception.
I also tried JsonNode node = objectMapper.readTree(jsonOneString); but I cannot proceed much further with that approach.
I know what I am doing here is highly inefficient, so I am looking for an approach that uses no entity classes, minimal new object creation and minimal loops. Kindly advise.
UPDATE: I updated the code below with a slight modification, adding null checks to the title comparison:
if (json1elem.get("title")!=null
&& json2elem.get("title")!=null
&& json1elem.get("title").equals(json2elem.get("title"))) {
//
}
JsonNode json1 = objectMapper.readTree(jsonOneString);
JsonNode json2 = objectMapper.readTree(jsonTwoString);
for (JsonNode json1elem : json1) {
for (JsonNode json2elem : json2) {
if (json1elem.get("title").equals(json2elem.get("title"))) {
((ObjectNode) json1elem).setAll((ObjectNode) json2elem);
break;
}
}
}
System.out.println(json1.toPrettyString());
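For what it's worth, the inner loop can be avoided with one extra HashMap: index JSON-2 by title once, then merge in a single pass over JSON-1. A rough sketch using the same Jackson tree model and the same variable names as above:

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.node.ObjectNode;
import java.util.HashMap;
import java.util.Map;

// ...

JsonNode json1 = objectMapper.readTree(jsonOneString);
JsonNode json2 = objectMapper.readTree(jsonTwoString);

// Index JSON-2 elements by their "title" (one pass over JSON-2).
Map<String, JsonNode> byTitle = new HashMap<>();
for (JsonNode json2elem : json2) {
    JsonNode title = json2elem.get("title");
    if (title != null) {
        byTitle.put(title.asText(), json2elem);
    }
}

// Merge into JSON-1 in a single pass (no nested loop).
for (JsonNode json1elem : json1) {
    JsonNode title = json1elem.get("title");
    JsonNode match = (title == null) ? null : byTitle.get(title.asText());
    if (match != null) {
        ((ObjectNode) json1elem).setAll((ObjectNode) match);
    }
}

System.out.println(json1.toPrettyString());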
Related
I am trying to get the id of the OutterObject whose InnerObject has a specific id and the most recent date across all InnerObjects of all OutterObjects.
I'm trying to achieve that with streams.
Searching for id "ab", it should return "def".
Here is a JSON example of the structure.
{
"OutterObject": [
{
"id": "abc",
"InnerObject": [
{
"id": "ab",
"date": "1"
},
{
"id": "de",
"date": "2"
},
{
"id": "ab",
"date": "3"
}
]
},
{
"id": "def",
"InnerObject": [
{
"id": "ab",
"date": "9"
},
{
"id": "de",
"date": "3"
},
{
"id": "ab",
"date": "1"
}
]
}
]
}
Use flatMap to gather all objects in the same array:
OutterObject.stream().flatMap(outer -> outer.getInnerObject().stream());
Use max() and a Comparator to get the highest value.
Combine both to get something like:
Optional<InnerObject> maxInner = OutterObject.stream()
        .flatMap(outer -> outer.getInnerObject().stream())
        .max(Comparator.comparing(InnerObject::getDate));
This is solved with:
Optional<SimpleImmutableEntry<String, InnerObject>> mostRecentOutterObjectId = MasterObject.getOutter().stream()
        .flatMap(outter -> outter.getInner().stream()
                .map(inner -> new SimpleImmutableEntry<String, InnerObject>(outter.getId(), inner)))
        .filter(outterMap -> StringUtils.equals(outterMap.getValue().getId(), "ab"))
        .max((innerA, innerB) -> innerA.getValue().getDate().compareTo(innerB.getValue().getDate()));
I am open to improvements if you see any.
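For reference, the outer id the question asks for ("def" in the example) is then just the key of that entry:

// Reads the OutterObject id out of the Optional produced above (null if nothing matched "ab").
String outerId = mostRecentOutterObjectId
        .map(SimpleImmutableEntry::getKey)
        .orElse(null);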
I need to collect entity records into from/to date ranges, based on the value of one attribute. If the value of the type attribute stays the same across consecutive dates, those dates should be grouped into one range, so the dates have to be processed in order. Even if only one record's type value differs on a given date, the rest of that date's records should still fall under the same range. See the examples below. I've tried this:
Map<LocalDate, List<Entity>> collection = entities.stream()
        .collect(Collectors.groupingBy(Entity::getDate))
        .entrySet().stream()
        .sorted(Map.Entry.comparingByKey())
        .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue,
                (oldValue, newValue) -> oldValue, LinkedHashMap::new));
In my implementation I am only able to collect by date, but I want to collect by DateRange. I want to achieve something like this:
Map<DateRange, List<Entity>> collection = entities.stream()...// implementation
Entity
[
{
"id": 1,
"date": "2020-01-01",
"type": 5
},
{
"id": 2,
"date": "2020-01-01",
"type": 5
},
{
"id": 1,
"date": "2020-01-02",
"type": 5
},
{
"id": 2,
"date": "2020-01-02",
"type": 5
},
.
.
.
]
Example
The date range changes based on the value of the type attribute. For example, if type=5 for all dates, then all records should be in one range. Let's say there are only records for one year and there are only two unique id values (id=1, id=2); then the collection should look like this:
[{
"From: 2020-01-01, To: 2020-12-31": [{
"record1":
{
"id": 1,
"type": "5"
},
"record2":
{
"id": 2,
"type": "5"
}
}]
}]
Another example
If type=5 for all dates except '2020-02-01', where type=6 for id=1, then the ranges should look like the below. I'm still assuming there are records for only one year and only two unique id values (id=1, id=2).
[
{
"From: 2020-01-01, To: 2020-01-31": [{
"record1":
{
"id": 1,
"type": "5"
},
"record2":
{
"id": 2,
"type": "5"
}
}]
},
{
"From: 2020-02-01, To: 2020-02-01": [{
"record1":
{
"id": 1,
"type": "6"
},
"record2":
{
"id": 2,
"type": "5"
}
}]
},
{
"From: 2020-02-02, To: 2020-12-31": [{
"record1":
{
"id": 1,
"type": "5"
},
"record2":
{
"id": 2,
"type": "5"
}
}]
}
]
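One way this could be sketched (assuming a hypothetical DateRange value class and int-valued getId()/getType() accessors on Entity, none of which are shown in the question) is to group the records by date first and then walk the dates in order, starting a new range whenever a date's id-to-type signature differs from the previous date's:

import java.time.LocalDate;
import java.util.*;
import java.util.stream.Collectors;

// Hypothetical key type for the ranges (not part of the original question).
record DateRange(LocalDate from, LocalDate to) { }

Map<DateRange, List<Entity>> collectByRange(List<Entity> entities) {
    // 1) Group by date, keeping the dates sorted.
    TreeMap<LocalDate, List<Entity>> byDate = entities.stream()
            .collect(Collectors.groupingBy(Entity::getDate, TreeMap::new, Collectors.toList()));

    // 2) Walk the dates in order; close the current range whenever the
    //    id -> type signature of a date differs from the previous date's.
    Map<DateRange, List<Entity>> result = new LinkedHashMap<>();
    LocalDate from = null;
    LocalDate prev = null;
    Map<Integer, Integer> prevSignature = null;
    List<Entity> bucket = new ArrayList<>();

    for (Map.Entry<LocalDate, List<Entity>> e : byDate.entrySet()) {
        // Assumes at most one record per id per date, as in the examples.
        Map<Integer, Integer> signature = e.getValue().stream()
                .collect(Collectors.toMap(Entity::getId, Entity::getType));
        if (prevSignature != null && !signature.equals(prevSignature)) {
            result.put(new DateRange(from, prev), bucket);
            bucket = new ArrayList<>();
            from = null;
        }
        if (from == null) {
            from = e.getKey();
        }
        bucket.addAll(e.getValue());
        prevSignature = signature;
        prev = e.getKey();
    }
    if (from != null) {
        result.put(new DateRange(from, prev), bucket);
    }
    return result;
}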
I'm trying to aggregate JSON objects into a JSON list, dynamically creating struct objects with varying numbers of fields. Each time, I create an aggregate using the snippet below:
batched = dataset.select(col(asteriskChar), row_number()
            .over(Window.orderBy(order)).alias(rowNumAlias))
        .withColumn(batchAlias, functions.ceil(col(rowNumAlias).divide(batchSize)))
        .groupBy(col(batchAlias))
        .agg(functions.collect_list(struct(structCol)).alias(batchedColAlias));
I would like to have object batches like below:
[
{
"id": 1,
"first": "John",
"last": "Thomas",
"score": 88
},
{
"id": 2,
"first": "Anne",
"last": "Jacobs",
"score": 32
}
]
But instead I got this:
[
{
"col1": {
"id": 1,
"first": "John",
"last": "Thomas",
"score": 88
}
},
{
"col1": {
"id": 2,
"first": "Anne",
"last": "Jacobs",
"score": 32
}
}
]
How can I get rid of the "col1" fields and make each of those JSONs a single object within the array? Thank you in advance.
Most probably you don't need the struct there:
.groupBy(col(batchAlias))
.agg(functions.collect_list(structCol).alias(batchedColAlias));
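Applied to the snippet from the question, the whole aggregation would then look roughly like this (same variables as above, just without the struct(...) wrapper that introduced the extra "col1" level):

batched = dataset.select(col(asteriskChar), row_number()
            .over(Window.orderBy(order)).alias(rowNumAlias))
        .withColumn(batchAlias, functions.ceil(col(rowNumAlias).divide(batchSize)))
        .groupBy(col(batchAlias))
        // collect the struct-typed column directly instead of re-wrapping it
        .agg(functions.collect_list(structCol).alias(batchedColAlias));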
I have a JSONArray within a JSONArray. I want to apply a JSONPath expression to it in such a way that I get a JSONObject or JSONArray as a result when a condition is satisfied on the inner JSONArray.
E.g.:
{
"A": [
{
"B": [
{
"id": 1
},
{
"id": 2
},
{
"id": 3
}
],
"C": {
"id": 10,
"name": "PQR"
},
"id": 25,
"name": "XYZ"
},
{
"B": [
{
"id": 4
},
{
"id": 5
},
{
"id": 6
}
],
"C": {
"id": 15,
"name": "PQR"
},
"id": 20,
"name": "XYZ"
}
]
}
If I want all elements of A where C.id == 10, I would use: $.A[?(@.C.id == 10)]
Now, what predicate should be used to obtain all the objects within A where B.id == 1? Note: B is an array of JSON objects.
I had success with $.A[?(@.B[?(@.id == 1)])], but only when using the Scala/Gatling implementation on http://jsonpath.herokuapp.com/.
The Jayway implementation seems to ignore the inner filter entirely, and according to an issue on their GitHub, that's a known bug.
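For reference, the simple filter from the question can be evaluated with the Jayway library roughly like this (a sketch; json is assumed to hold the document shown above as a String):

import com.jayway.jsonpath.JsonPath;
import java.util.List;

// Returns the elements of A whose C.id is 10, as a list of generic maps.
List<Object> matches = JsonPath.read(json, "$.A[?(@.C.id == 10)]");
System.out.println(matches);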
Let's say I have two arrays of JSONObjects in memory, and each object has a key that is shared between both arrays:
Array 1
[
{
"name": "Big Melons Co.",
"location": "Inner City Dubai"
"id": "1A"
},
{
"name": "Pear Flavored Juices Ltd",
"location": "Seychelles"
"id": "2A"
},
{
"name": "Squeeze My Lemons LLC",
"location": "UK"
"id": "3A"
}, {other JSON Objects...} ]
Array 2
[
{
"acceptsCard": "true"
"id": "1A"
},
{
"acceptsCard": "false"
"id": "2A"
},
{
"acceptsCard": "false"
"id": "3A"
}, {other JSON Objects...} ]
Now, I want to merge the two arrays together based on the primary key of "id" so they become one on my server side and then send the results back to my frontend - the resulting arraylist of objects should look like this:
MERGED ARRAY (Result)
[
{
"name": "Great Juice Co.",
"location": "Inner City Dubai"
"acceptsCard": "true"
"id": "1A"
},
{
"name": "Pear Flavored Juices Ltd",
"location": "Seychelles"
"acceptsCard": "false"
"id": "2A"
},
{
"name": "Squeeze My Lemons LLC",
"location": "UK"
"acceptsCard": "false"
"id": "3A"
}, {other JSON Objects...} ]
How can I do this efficiently?
I can think of one highly inefficient way to do this (which I'm dreading implementing): I would loop through each item in either array 1 or 2 and use the equals() method on the string in the "id" field to see whether the two match. If they match, I would create a new JSONObject containing the fields from both array 1 and array 2.
My Java is a little rusty but I would use a map.
List<JSONObject> objectsA = ... ;
List<JSONObject> objectsB = ... ;

// Index merged objects by their "id" so each input object is visited exactly once.
Map<String, JSONObject> entries = new HashMap<>();

List<JSONObject> allObjects = new ArrayList<>();
allObjects.addAll(objectsA);
allObjects.addAll(objectsB);

for (JSONObject obj : allObjects) {
    String key = obj.getString("id");
    JSONObject existing = entries.get(key);
    if (existing == null) {
        existing = new JSONObject();
        entries.put(key, existing);
    }
    // Copy every field of the current object into the merged object for this id.
    for (Iterator<String> it = obj.keys(); it.hasNext(); ) {
        String subKey = it.next();
        existing.put(subKey, obj.get(subKey));
    }
}

List<JSONObject> merged = new ArrayList<>(entries.values());
This is more efficient than two nested loops and there's still room for improvement.
EDIT: References to external documentation and related answers.
http://docs.oracle.com/javase/7/docs/api/java/util/Map.html
http://www.json.org/javadoc/org/json/JSONObject.html
https://stackoverflow.com/a/2403427/937006