I have been working on a problem where we receive a huge JSON response. When we parsed it with the conventional Gson approach, it threw an OutOfMemoryError because that approach loads the whole payload into memory before processing it. As a solution I switched to streaming the JSON response so that everything does not sit in memory at once. That worked fine up to roughly 1.6 million records, but beyond that even the streaming version broke. This is the exception we are getting:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
This is the entire code I'm using for this:
// Getting the response into an InputStream and wrapping it in a JsonReader for parsing
InputStream liInStream = luURLConn.getInputStream();
lCycleTimeReader = new JsonReader(new InputStreamReader(liInStream, "UTF-8"));
Our JSON looks like this:
{
"Report_Entry": [
{
"key1": "value",
"key2": "value",
"key3": "value",
"key4": "value",
"key5": "value"
},
{
"key1": "value",
"key2": "value",
"key3": "value",
"key4": "value",
"key5": "value"
}
]}
This reader is then passed into our parsing method:
public HashMap<String, HashMap<String, String>> getcycleTimeMap(JsonReader poJSONReaderObj,
CycleTimeConstant cycleTimeConstant, int processId) {
Integer counter = 0;
HashMap<String, HashMap<String, String>> cycleTimeMap = new HashMap<String, HashMap<String, String>>();
HashMap<String, HashMap<String, String>> finalcycleTimeMap = new HashMap<String, HashMap<String, String>>();
try {
CycleTime cycleTime = new CycleTime();
poJSONReaderObj.beginObject();
while (poJSONReaderObj.hasNext()) {
String name = poJSONReaderObj.nextName();
if (name.equals("Report_Entry")) {
poJSONReaderObj.beginArray();
while (poJSONReaderObj.hasNext()) {
JsonToken nextToken2 = poJSONReaderObj.peek();
if (JsonToken.BEGIN_OBJECT.equals(nextToken2)) {
poJSONReaderObj.beginObject();
} else if (JsonToken.END_OBJECT.equals(nextToken2)) {
poJSONReaderObj.endObject();
} else {
String nextString = "";
if (JsonToken.STRING.equals(nextToken2)) {
nextString = poJSONReaderObj.nextString();
} else if (JsonToken.NAME.equals(nextToken2)) {
nextString = poJSONReaderObj.nextName();
}
switch (nextString) {
case "key1":
cycleTime.setKey1(poJSONReaderObj.nextString());
break;
case "key2":
cycleTime.setKey2(poJSONReaderObj.nextString());
break;
case "key3":
cycleTime.setKey3(poJSONReaderObj.nextString());
break;
case "key4":
cycleTime.setKey4(poJSONReaderObj.nextString());
break;
case "key5":
cycleTime.setKey5(poJSONReaderObj.nextString());
break;
}
}
poJSONReaderObj.endObject();
System.out
.println("Value of Map is : " + new Gson().toJson(cycleTime) + "counter : " + counter);
counter++;
System.out.println("Counter : " + counter);
cycleTimeMap = (HashMap<String, HashMap<String, String>>) cycleTimeBpProcessIterator(
cycleTime, cycleTimeConstant, counter, processId);
}
finalcycleTimeMap.putAll(cycleTimeMap);
}
}
JsonToken nextToken = poJSONReaderObj.peek();
if (JsonToken.END_OBJECT.equals(nextToken)) {
poJSONReaderObj.endObject();
} else if (JsonToken.END_ARRAY.equals(nextToken)) {
poJSONReaderObj.endArray();
}
} catch (IOException ioException) {
ioException.printStackTrace();
}
System.out.println("FINAL MAP TO BE LOADED : " + new Gson().toJson(finalcycleTimeMap));
return finalcycleTimeMap;
}
POJO class for handling response:
public class CycleTime {
private String key1 = "";
private String key2 = "";
private String key3 = "";
private String key4 = "";
private String key5 = "";
public String getKey1() {
return key1;
}
public void setKey1(String key1) {
this.key1 = key1;
}
public String getKey2() {
return key2;
}
public void setKey2(String key2) {
this.key2 = key2;
}
public String getKey3() {
return key3;
}
public void setKey3(String key3) {
this.key3 = key3;
}
public String getKey4() {
return key4;
}
public void setKey4(String key4) {
this.key4 = key4;
}
public String getKey5() {
return key5;
}
public void setKey5(String key5) {
this.key5 = key5;
}
}
I'm not sure what the culprit is here, but it keeps giving the same error. What should the next approach be to avoid this OutOfMemoryError?
Merely switching to streamed reading does not help if you still collect the entire document into a single object.
Moreover, Gson already uses streaming under the hood; the streaming API is just an alternative way of reading and writing.
Your current approach, however, has several problems:
Gson things:
The main thing is: use Gson properly and let it do its job. I couldn't run your code against the JSON document you provided: it works neither for the root JSON object nor for its only top-level entry (your deserializer is broken due to improper use of hasNext and the beginObject/endObject pair).
Common Java things:
don't catch exceptions in the middle of the method and then return a partially composed object (is that really what you want?);
don't use Throwable.printStackTrace (use proper logging facilities);
if you don't want to use loggers, at least print to System.err (that is the proper standard stream for such purposes);
using Integer as a counter is a bad idea because it creates many boxed values, especially for huge documents (plain int is just fine);
enum values can (and should) be compared with == (it is safe since they are singletons);
you can also use switch on enums (shorter and more compile-time safe);
don't create Gson instances in a loop, especially one with that many iterations (Gson instances are immutable and thread-safe, but not that cheap to construct);
don't use maps where you can have statically typed plain objects;
what's the purpose of returning a map that always contains a single key/value pair? (just return the value);
Common design things:
use the most general types possible for declarations: not HashMap, but Map (what if someday you need a map with ordered keys? or no map at all?);
invert dependencies (what if you don't need a CycleTime with five keys?);
Streaming things:
if it runs into an OOM error, then what's the point of collecting a huge map that obviously cannot fit in your application's RAM? Process one element at a time instead: callbacks or promises (pushing approach), iterators or streams (pulling approach), reactive streams, whatever;
collect a result only if it has a small memory footprint, or use aggregation (otherwise you risk an OOM again).
This is how you can reduce the memory footprint by using a pushing approach via callbacks:
@UtilityClass
public final class StreamSupport {
public static void acceptArrayElements(@WillNotClose final JsonReader jsonReader, final Consumer<? super JsonReader> acceptElement)
throws IOException {
jsonReader.beginArray();
while ( jsonReader.hasNext() ) {
acceptElement.accept(jsonReader);
}
jsonReader.endArray();
}
}
@UtilityClass
public final class CycleDeserializer {
public static void readCycles(final JsonReader jsonReader, final Consumer<? super JsonReader> acceptJsonReader)
throws IOException {
jsonReader.beginObject();
while ( jsonReader.hasNext() ) {
switch ( jsonReader.nextName() ) {
case "Report_Entry":
StreamSupport.acceptArrayElements(jsonReader, acceptJsonReader);
break;
default:
jsonReader.skipValue();
break;
}
}
jsonReader.endObject();
}
}
private static final Gson gson = new GsonBuilder()
.disableHtmlEscaping()
.disableInnerClassSerialization()
.create();
@Test
public void test()
throws IOException {
try ( final JsonReader jsonReader = openTheHugeDocument() ) {
CycleDeserializer.readCycles(jsonReader, jr -> {
final CycleTime cycleTime = gson.fromJson(jr, CycleTime.class);
System.out.println(cycleTime);
});
}
// do the simplest aggregation operation: `COUNT`
try ( final JsonReader jsonReader = openTheHugeDocument() ) {
final AtomicInteger count = new AtomicInteger();
CycleDeserializer.readCycles(jsonReader, jr -> {
try {
jr.skipValue();
count.incrementAndGet();
} catch ( final IOException ex ) {
throw new RuntimeException(ex);
}
});
System.out.println("Count = " + count);
}
// this will probably fail when the document is huge because it is collected into a single collection
// (you need to let your JVM use as much RAM as possible if it is a must for you)
try ( final JsonReader jsonReader = openTheHugeDocument() ) {
final Collection<CycleTime> cycleTimes = new ArrayList<>();
CycleDeserializer.readCycles(jsonReader, jr -> {
final CycleTime cycleTime = gson.fromJson(jr, CycleTime.class);
cycleTimes.add(cycleTime);
});
System.out.println("Count in list = " + cycleTimes.size());
}
}
As you can see, in the runner above you can choose how you prefer to process your entries: simple logging, a simple count, or a collect-into-a-list operation.
For the pull approach via a Stream, please see: https://stackoverflow.com/a/69282822/12232870
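For reference, here is a rough sketch of what such a pulling approach could look like (the streamCycles name and the skipping of other top-level keys are my own assumptions; the JsonReader must stay open while the stream is consumed, and java.util.stream.StreamSupport is fully qualified to avoid clashing with the StreamSupport utility class above):
import java.io.IOException;
import java.io.UncheckedIOException;
import java.util.Iterator;
import java.util.Spliterator;
import java.util.Spliterators;
import java.util.stream.Stream;

import com.google.gson.Gson;
import com.google.gson.stream.JsonReader;

public final class CycleStreams {

    // Positions the reader inside the "Report_Entry" array and exposes its elements lazily.
    public static Stream<CycleTime> streamCycles(final JsonReader jsonReader, final Gson gson)
            throws IOException {
        jsonReader.beginObject();
        while ( jsonReader.hasNext() ) {
            if ( "Report_Entry".equals(jsonReader.nextName()) ) {
                jsonReader.beginArray();
                break;
            }
            jsonReader.skipValue();
        }
        final Iterator<CycleTime> iterator = new Iterator<CycleTime>() {
            @Override
            public boolean hasNext() {
                try {
                    return jsonReader.hasNext();
                } catch ( final IOException ex ) {
                    throw new UncheckedIOException(ex);
                }
            }

            @Override
            public CycleTime next() {
                // Gson deserializes exactly one array element and advances the reader.
                return gson.fromJson(jsonReader, CycleTime.class);
            }
        };
        return java.util.stream.StreamSupport.stream(
                Spliterators.spliteratorUnknownSize(iterator, Spliterator.ORDERED | Spliterator.NONNULL),
                false);
    }
}
Usage would then be something like streamCycles(jsonReader, gson).forEach(System.out::println), with the reader opened and closed by the caller exactly as in the test above.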
I can easily query the Alfresco audit log in REST using this query:
http://localhost:8080/alfresco/service/api/audit/query/audit-custom?verbose=true
But how to perform the same request in Java within Alfresco module?
It must be synchronous.
A lazy solution would be to call the REST URL in Java, but it would probably be inefficient, and more importantly it would require me to store an admin's password somewhere.
I noticed AuditService has an auditQuery method, so I am trying to call it. Unfortunately, it seems to be meant for asynchronous operations? I don't need callbacks, as I need to wait until the queried data is ready before going on to the next step.
Here is my implementation, mostly copied from the source code of the REST API:
int maxResults = 10000;
if (!auditService.isAuditEnabled(AUDIT_APPLICATION, ("/" + AUDIT_APPLICATION))) {
throw new WebScriptException(
"Auditing for " + AUDIT_APPLICATION + " is disabled!");
}
final List<Map<String, Object>> entries =
new ArrayList<Map<String, Object>>(maxResults);
AuditQueryCallback callback = new AuditQueryCallback() {
@Override
public boolean valuesRequired() {
return true; // true = verbose
}
@Override
public boolean handleAuditEntryError(
Long entryId, String errorMsg, Throwable error) {
return true;
}
@Override
public boolean handleAuditEntry(
Long entryId,
String applicationName,
String user,
long time,
Map<String, Serializable> values) {
// One result entry per audit record
Map<String, Object> entry = new HashMap<String, Object>();
if (values != null) {
// Convert values to Strings
Map<String, String> valueStrings =
new HashMap<String, String>(values.size() * 2);
for (Map.Entry<String, Serializable> mapEntry : values.entrySet()) {
String key = mapEntry.getKey();
Serializable value = mapEntry.getValue();
try {
String valueString = DefaultTypeConverter.INSTANCE.convert(
String.class, value);
valueStrings.put(key, valueString);
}
catch (TypeConversionException e) {
// Use the toString()
valueStrings.put(key, value.toString());
}
}
entry.put(JSON_KEY_ENTRY_VALUES, valueStrings);
}
entries.add(entry);
return true;
}
};
AuditQueryParameters params = new AuditQueryParameters();
params.setApplicationName(AUDIT_APPLICATION);
params.setForward(true);
auditService.auditQuery(callback, params, maxResults);
Though the callback might make it look asynchronous, it is not.
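To illustrate (reusing the fields from the code above): auditQuery drives the callback on the calling thread, so the results are available as soon as the call returns.
// auditQuery() invokes handleAuditEntry for each matching record before returning,
// so 'entries' is fully populated right after this line.
auditService.auditQuery(callback, params, maxResults);
System.out.println("Fetched " + entries.size() + " audit entries");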
I want to compare two JSON strings that form a huge hierarchy and find out where their values differ. But some values are generated at runtime and are dynamic, and I want to ignore those particular nodes in my comparison.
I am currently using JSONAssert from org.skyscreamer to do the comparison. It gives me nice console output but does not ignore any attributes.
For example:
java.lang.AssertionError messageHeader.sentTime
expected:null
got:09082016 18:49:41.123
This value is dynamic and should be ignored. Something like:
JSONAssert.assertEquals(expectedJSONString, actualJSONString,JSONCompareMode, *list of attributes to be ignored*)
It would be great if someone suggests a solution in JSONAssert. However other ways are also welcome.
You can use Customization for this. For example, if you need to ignore a top-level attribute named "timestamp" use:
JSONAssert.assertEquals(expectedResponseBody, responseBody,
new CustomComparator(JSONCompareMode.LENIENT,
new Customization("timestamp", (o1, o2) -> true)));
It's also possible to use path expressions like "entry.id". In your Customization you can use whatever method you like to compare the two values. The example above always returns true, no matter what the expected value and the actual value are. You could do more complicated stuff there if you need to.
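For instance, applied to the dynamic field from the question, a path-based Customization could look like this (a sketch; the field name is taken from the error message above):
JSONAssert.assertEquals(expectedJSONString, actualJSONString,
        new CustomComparator(JSONCompareMode.LENIENT,
                new Customization("messageHeader.sentTime", (o1, o2) -> true)));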
It is perfectly fine to ignore the values of multiple attributes, for example:
@Test
public void ignoringMultipleAttributesWorks() throws JSONException {
String expected = "{\"timestamp\":1234567, \"a\":5, \"b\":3 }";
String actual = "{\"timestamp\":987654, \"a\":1, \"b\":3 }";
JSONAssert.assertEquals(expected, actual,
new CustomComparator(JSONCompareMode.LENIENT,
new Customization("timestamp", (o1, o2) -> true),
new Customization("a", (o1, o2) -> true)
));
}
There is one caveat when using Customizations: The attribute whose value is to be compared in a custom way has to be present in the actual JSON. If you want the comparison to succeed even if the attribute is not present at all you would have to override CustomComparator for example like this:
@Test
public void extendingCustomComparatorToAllowToCompletelyIgnoreCertainAttributes() throws JSONException {
// AttributeIgnoringComparator completely ignores some of the expected attributes
class AttributeIgnoringComparator extends CustomComparator{
private final Set<String> attributesToIgnore;
private AttributeIgnoringComparator(JSONCompareMode mode, Set<String> attributesToIgnore, Customization... customizations) {
super(mode, customizations);
this.attributesToIgnore = attributesToIgnore;
}
protected void checkJsonObjectKeysExpectedInActual(String prefix, JSONObject expected, JSONObject actual, JSONCompareResult result) throws JSONException {
Set<String> expectedKeys = getKeys(expected);
expectedKeys.removeAll(attributesToIgnore);
for (String key : expectedKeys) {
Object expectedValue = expected.get(key);
if (actual.has(key)) {
Object actualValue = actual.get(key);
compareValues(qualify(prefix, key), expectedValue, actualValue, result);
} else {
result.missing(prefix, key);
}
}
}
}
String expected = "{\"timestamp\":1234567, \"a\":5}";
String actual = "{\"a\":5}";
JSONAssert.assertEquals(expected, actual,
new AttributeIgnoringComparator(JSONCompareMode.LENIENT,
new HashSet<>(Arrays.asList("timestamp")))
);
}
(With this approach you still could use Customizations to compare other attributes' values in the way you want.)
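For instance, the two mechanisms can be combined (a sketch building on the classes above):
JSONAssert.assertEquals(expected, actual,
        new AttributeIgnoringComparator(JSONCompareMode.LENIENT,
                new HashSet<>(Arrays.asList("timestamp")),
                new Customization("a", (o1, o2) -> true)));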
You can use JsonUnit. It has the functionality you are looking for: you can ignore fields, paths, values that are null, etc. Check it out for more info. As an example, you can ignore a path like this:
assertJsonEquals(
"{\"root\":{\"test\":1, \"ignored\": 2}}",
"{\"root\":{\"test\":1, \"ignored\": 1}}",
whenIgnoringPaths("root.ignored")
);
Sometimes you need to ignore certain values when comparing. It is possible to use the ${json-unit.ignore} placeholder like this:
assertJsonEquals("{\"test\":\"${json-unit.ignore}\"}",
"{\n\"test\": {\"object\" : {\"another\" : 1}}}");
First of all, there is an open issue for it.
In my tests I compare the JSON from the controller with the actual object, with the help of a JsonUtil class for serialization/deserialization:
public class JsonUtil {
public static <T> List<T> readValues(String json, Class<T> clazz) {
ObjectReader reader = getMapper().readerFor(clazz);
try {
return reader.<T>readValues(json).readAll();
} catch (IOException e) {
throw new IllegalArgumentException("Invalid read array from JSON:\n'" + json + "'", e);
}
}
public static <T> T readValue(String json, Class<T> clazz) {
try {
return getMapper().readValue(json, clazz);
} catch (IOException e) {
throw new IllegalArgumentException("Invalid read from JSON:\n'" + json + "'", e);
}
}
public static <T> String writeValue(T obj) {
try {
return getMapper().writeValueAsString(obj);
} catch (JsonProcessingException e) {
throw new IllegalStateException("Invalid write to JSON:\n'" + obj + "'", e);
}
}
To ignore specific object fields I've added a new method:
public static <T> String writeIgnoreProps(T obj, String... ignoreProps) {
try {
Map<String, Object> map = getMapper().convertValue(obj, new TypeReference<Map<String, Object>>() {});
for (String prop : ignoreProps) {
map.remove(prop);
}
return getMapper().writeValueAsString(map);
} catch (JsonProcessingException e) {
throw new IllegalStateException("Invalid write to JSON:\n'" + obj + "'", e);
}
}
and my assertion in the test now looks like this:
mockMvc.perform(get(REST_URL))
.andExpect(status().isOk())
.andExpect(content().contentTypeCompatibleWith(MediaType.APPLICATION_JSON))
.andExpect(content().json(JsonUtil.writeIgnoreProps(USER, "registered")));
Thank you @dknaus for the detailed answer. However, this solution will not work in STRICT mode; the checkJsonObjectKeysExpectedInActual method body needs to be replaced by the following code (as suggested by @tk-gospodinov):
for (String attribute : attributesToIgnore) {
expected.remove(attribute);
}
super.checkJsonObjectKeysExpectedInActual(prefix, expected, actual, result);
I'm working on a project that has hosts and clients, and where hosts can send commands to clients (via sockets).
I've determined that using JSON for communication works best.
For example:
{
"method" : "toasty",
"params" : ["hello world", true]
}
In this example, when this JSON string is sent to the client, it will be processed and a suitable method within the client will be run as such:
public abstract class ClientProcessor {
public abstract void toasty(String s, boolean bool);
public abstract void shutdown(int timer);
private Method[] methods = getClass().getDeclaredMethods();
public void process(String data) {
try {
JSONObject json = new JSONObject(data);
String methodName = (String) json.get("method");
if (methodName.equals("process"))
return;
for (int i = 0; i < methods.length; i++)
if (methods[i].getName().equals(methodName)) {
JSONArray arr = json.getJSONArray("params");
int length = arr.length();
Object[] args = new Object[length];
for (int i2 = 0; i2 < length; i2++)
args[i2] = arr.get(i2);
methods[i].invoke(this, args);
return;
}
} catch (Exception e) {}
}
}
And using the ClientProcessor:
public class Client extends ClientProcessor {
@Override
public void toasty(String s, boolean bool) {
//make toast here
}
@Override
public void shutdown(int timer) {
//shutdown system within timer
}
public void processJSON(String json) {
process(json);
}
}
The JSON is sent by the server to the client, but the server could be modified to send different JSONs.
My questions are:
Is this a safe way of running methods by processing JSON?
Is there a better way to do this? I'm thinking that using reflection is terribly slow.
There are a hundred and one ways you can process a JSON message so that some processing occurs, but they all boil down to:
parse message
map message to method
invoke method
send response
While you could use a reflective call (performance-wise it would be fine for most cases) to invoke a method, that, imho, would be a little too open - a malicious client could for example crash your system by issuing wait calls.
Reflection also opens you up to having to correctly map the parameters, which is more complicated than the code you've shown in your question.
So don't use Reflection.
What you could do is define a simple interface, implementations of which understand how to process the parameters, and have your processor (more commonly referred to as a Controller) invoke that, something like this:
public interface ServiceCall
{
public JsonObject invoke(JsonArray params) throws ServiceCallException;
}
public class ServiceProcessor
{
private static final Map<String, ServiceCall> SERVICE_CALLS = new HashMap<>();
static
{
SERVICE_CALLS.put("toasty", new ToastCall());
}
public String process(String messageStr)
{
try
{
JsonObject message = Json.createReader(new StringReader(messageStr)).readObject();
if (message.containsKey("method"))
{
String method = message.getString("method");
ServiceCall serviceCall = SERVICE_CALLS.get(method);
if (serviceCall != null)
{
return serviceCall.invoke(message.getJsonArray("params")).toString();
}
else
{
return fail("Unknown method: " + method);
}
}
else
{
return fail("Invalid message: no method specified");
}
}
catch (Exception e)
{
return fail(e.getMessage());
}
}
private String fail(String message)
{
return Json.createObjectBuilder()
.add("status", "failed")
.add("message", message)
.build()
.toString();
}
private static class ToastCall implements ServiceCall
{
public JsonObject invoke(JsonArray params) throws ServiceCallException
{
//make toast here, then report success
return Json.createObjectBuilder()
.add("status", "ok")
.build();
}
}
}
Map method names to int constants and just switch on these constants to invoke the appropriate method, for example:
"toasty" : 1
"shutdown": 2
switch()
case 1: toasty()
case 2: shutdown()
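A rough sketch of what that dispatch could look like (the class and method names here are made up for illustration; the casts assume the parameter types from the question):
import java.util.HashMap;
import java.util.Map;

public class CommandDispatcher {
    private static final Map<String, Integer> METHOD_IDS = new HashMap<>();
    static {
        METHOD_IDS.put("toasty", 1);
        METHOD_IDS.put("shutdown", 2);
    }

    public void dispatch(String methodName, Object[] args) {
        // Unknown names map to -1 and fall through to the default branch.
        switch (METHOD_IDS.getOrDefault(methodName, -1)) {
            case 1:
                toasty((String) args[0], (Boolean) args[1]);
                break;
            case 2:
                shutdown(((Number) args[0]).intValue());
                break;
            default:
                throw new IllegalArgumentException("Unknown method: " + methodName);
        }
    }

    private void toasty(String s, boolean bool) { /* make toast here */ }
    private void shutdown(int timer) { /* shut down within timer */ }
}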
I believe you are trying to convert JSON string to Java object and vice versa... if that is the requirement then this would not be the right approach...
Try any open source API like Gson...
It is the API by Google for conversion of Java to JSON and vice versa.
Please check ...
https://google-gson.googlecode.com/svn/trunk/gson/docs/javadocs/com/google/gson/Gson.html
Let me know if you have any further questions...
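For example, a minimal sketch of such a conversion for the message from the question (the Command class here is hypothetical, introduced only for illustration):
import com.google.gson.Gson;

public class GsonExample {
    static class Command {
        String method;
        Object[] params;
    }

    public static void main(String[] args) {
        String json = "{ \"method\" : \"toasty\", \"params\" : [\"hello world\", true] }";
        Gson gson = new Gson();
        Command command = gson.fromJson(json, Command.class); // JSON -> Java
        System.out.println(command.method);                   // prints: toasty
        String roundTrip = gson.toJson(command);              // Java -> JSON
        System.out.println(roundTrip);
    }
}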
Consider the following function:
public void execute4() {
File filePath = new File(filePathData);
File[] files = filePath.listFiles((File filePathData) -> filePathData.getName().endsWith("CDR"));
List<CDR> cdrs = new ArrayList<CDR>();
Arrays.asList(files).parallelStream().forEach(file -> readCDRP(cdrs, file));
cdrs.sort(cdrsorter);
}
which reads a list of files containing CDRs and executes readCDRP(), which is this:
private void readCDRP(List<CDR> cdrs, File file) {
final CDR cdr = new CDR(file.getName());
try (BufferedReader bfr = new BufferedReader(new FileReader(file))) {
List<String> lines = bfr.lines().collect(Collectors.toList());
lines.parallelStream().forEach(e -> {
String[] data = e.split(",", -1);
CDREntry entry = new CDREntry(file.getName());
for (int i = 0; i < data.length; i++) {
entry.setField(i, data[i]);
}
cdr.addEntry(entry);
});
if (cdr != null) {
cdrs.add(cdr);
}
} catch (IOException e) {
e.printStackTrace();
}
}
What I observe is that occasionally, and NOT all the time, I either get an ArrayIndexOutOfBoundsException in the readCDRP function at this line (which is awkward, as the list inside the CDR is an ArrayList):
cdr.addEntry(entry);
or one at the last line in execute4(), where I apply the sorting.
I think the issue is that the first parallelStream from execute4 is not in a separate memory space from the second parallelStream executed inside readCDRP(), and they also seem to share data incorrectly. I say "seem" because I can't confirm it; it's just a hunch.
The questions are:
Is my code buggy to the bone from a JDK 8 perspective?
Is there a workaround using the same flow, something like using CountDownLatch for example?
Is it a limitation of the ForkJoinPool?
Thanks for any response.
EDIT(1):
The addEntry method is part of the CDR class itself:
class CDR {
public final String fileName;
private final List<CDREntry> entries = new ArrayList<CDREntry>();
public CDR(String fileName) {
super();
this.fileName = fileName;
}
public List<CDREntry> getEntries() {
return entries;
}
public List<CDREntry> addEntry(CDREntry e) {
entries.add(e);
return entries;
}
public String getFileName() {
return this.fileName;
}
}
Your code is broken from a thread-safety point of view. In readCDRP you add elements to the cdrs list, which is an ArrayList and does not support concurrent writes. That is why it breaks.
A better approach would be to have readCDRP return a CDR object (a sketch of the adapted method follows below) and do something like:
List<CDR> cdrs = Arrays.stream(files)
.parallel()
.map(this::readCDR)
.collect(Collectors.toList());
Also, using parallel streams for IO-related operations is generally a bad idea, but that is another discussion.
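For completeness, the adapted readCDRP could look roughly like this (a sketch only; it keeps the CDR/CDREntry API from the question and drops the inner parallel stream, since concurrently calling cdr.addEntry on an ArrayList has the same thread-safety problem):
private CDR readCDRP(File file) {
    CDR cdr = new CDR(file.getName());
    try (BufferedReader bfr = new BufferedReader(new FileReader(file))) {
        bfr.lines().forEach(line -> {
            String[] data = line.split(",", -1);
            CDREntry entry = new CDREntry(file.getName());
            for (int i = 0; i < data.length; i++) {
                entry.setField(i, data[i]);
            }
            cdr.addEntry(entry); // safe here: only this thread touches this CDR
        });
        return cdr;
    } catch (IOException e) {
        throw new UncheckedIOException(e);
    }
}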
When you start programming in a functional style you should prefer immutable objects that can be fully created via their constructor (or perhaps via a builder pattern or a factory method). So your CDREntry class may look like this:
class CDREntry {
private final String[] fields;
private final String name;
public CDREntry(String name, String[] fields) {
this.name = name;
this.fields = fields;
}
// Add getters and whatever
}
And your CDR class may look like this:
class CDR {
private final String fileName;
private final List<CDREntry> entries;
public CDR(String fileName, List<CDREntry> entries) {
this.fileName = fileName;
this.entries = entries;
}
public List<CDREntry> getEntries() {
return entries;
}
public String getFileName() {
return this.fileName;
}
}
With such classes, things become easier. The rest of the code can be rewritten like this:
public void execute4() {
File filePath = new File(filePathData);
File[] files = filePath.listFiles((File dir, String name) ->
name.endsWith("CDR")); // fixed this line: it had a compilation error
List<CDR> cdrs = Arrays.stream(files).parallel()
.map(this::readCDRP).sorted(cdrsorter)
.collect(Collectors.toList());
}
private CDR readCDRP(File file) {
try (BufferedReader bfr = new BufferedReader(new FileReader(file))) {
// I'm not sure that collecting lines into list
// before main processing was actually necessary
return bfr.lines().parallelStream()
.map(e -> new CDREntry(file.getName(), e.split(",", -1)))
.collect(Collectors.collectingAndThen(
Collectors.toList(), list -> new CDR(file.getName(), list)));
} catch (IOException e) {
throw new UncheckedIOException(e);
}
}
In general, remember that forEach is usually not the cleanest way to solve such tasks. It may be helpful when you integrate streams into legacy code, but otherwise it should be avoided.
You are using a parallel stream with a lambda that has side effects (the lambda updates the ArrayList cdrs).
Try to use a Collector or a reduction operation instead.
In short: I am trying to find some API that can change a value by taking a JSON string as the first parameter, a JSONPath as the second parameter, and the new value as the third. But all I have found is this:
https://code.google.com/p/json-path/
This API allows me to find any value in a JSON string, but I am not finding an easy way to update the value of a key. For example, here is book.json:
{
"store":{
"book":[
{
"category":"reference",
"author":"Nigel Rees",
"title":"Sayings of the Century",
"price":8.95
},
{
"category":"fiction",
"author":"Evelyn Waugh",
"title":"Sword of Honour",
"price":12.99,
"isbn":"0-553-21311-3"
}
],
"bicycle":{
"color":"red",
"price":19.95
}
}
}
I can access the color of the bicycle by doing this:
String bicycleColor = JsonPath.read(json, "$.store.bicycle.color");
But I am looking for a method in JsonPath or another API, something like this:
JsonPath.changeNodeValue(json, "$.store.bicycle.color", "green");
String bicycleColor = JsonPath.read(json, "$.store.bicycle.color");
System.out.println(bicycleColor); // This should print "green" now.
I am excluding these options:
Create a new JSON String.
Create a JSON Object to deal with changing value and convert it back to jsonstring
Reason: I have about 500 different requests for different types of services which return different JSON structures, so I do not want to manually create a new JSON string every time, because IDs in the JSON structures are dynamic.
Any idea or direction is much appreciated.
Updating this question with the following answer.
Copy MutableJson.java.
Copy this little snippet and modify it as per your need:
private static void updateJsonValue() {
JSONParser parser = new JSONParser();
JSONObject jsonObject = new JSONObject();
FileReader reader = null;
try {
File jsonFile = new File("path to book.json");
reader = new FileReader(jsonFile);
jsonObject = (JSONObject) parser.parse(reader);
} catch (Exception ex) {
System.out.println(ex.getLocalizedMessage());
}
Map<String, Object> userData = null;
try {
userData = new ObjectMapper().readValue(jsonObject.toJSONString(), Map.class);
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
MutableJson json = new MutableJson(userData);
System.out.println("Before:\t" + json.map());
json.update("$.store.book[0].author", "jigish");
json.update("$.store.book[1].category", "action");
System.out.println("After:\t" + json.map().toString());
}
Use these libraries.
import org.json.simple.JSONObject;
import org.json.simple.parser.JSONParser;
import org.codehaus.jackson.map.ObjectMapper;
The thing is that the functionality you want is already an undocumented feature of JsonPath. Example using your json structure:
String json = "{ \"store\":{ \"book\":[ { \"category\":\"reference\", \"author\":\"Nigel Rees\", \"title\":\"Sayings of the Century\", \"price\":8.95 }, { \"category\":\"fiction\", \"author\":\"Evelyn Waugh\", \"title\":\"Sword of Honour\", \"price\":12.99, \"isbn\":\"0-553-21311-3\" } ], \"bicycle\":{ \"color\":\"red\", \"price\":19.95 } } }";
DocumentContext doc = JsonPath.parse(json).
set("$.store.bicycle.color", "green").
set("$.store.book[0].price", 9.5);
String newJson = new Gson().toJson(doc.read("$"));
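As a side note (depending on the JsonPath version in use), the updated document can also be serialized straight from the context instead of going through Gson:
String newJson = doc.jsonString();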
Assuming that parsed JSON can be represented in memory as a Map, you can build an API similar to JsonPath that looks like:
void update(Map<String, Object> json, String path, Object newValue);
I've quickly put together a gist with a rough implementation for simple, specific paths (no support for conditions and wildcards) that can traverse a JSON tree, e.g. $.store.name, $.store.books[0].isbn. Here it is: MutableJson.java. It definitely needs improvement, but it can give you a good start.
Usage example:
import java.util.*;
public class MutableJson {
public static void main(String[] args) {
MutableJson json = new MutableJson(
new HashMap<String, Object>() {{
put("store", new HashMap<String, Object>() {{
put("name", "Some Store");
put("books", Arrays.asList(
new HashMap<String, Object>() {{
put("isbn", "111");
}},
new HashMap<String, Object>() {{
put("isbn", "222");
}}
));
}});
}}
);
System.out.println("Before:\t" + json.map());
json.update("$.store.name", "Book Store");
json.update("$.store.books[0].isbn", "444");
json.update("$.store.books[1].isbn", "555");
System.out.println("After:\t" + json.map());
}
private final Map<String, Object> json;
public MutableJson(Map<String, Object> json) {
this.json = json;
}
public Map<String, Object> map() {
return json;
}
public void update(String path, Object newValue) {
updateJson(this.json, Path.parse(path), newValue);
}
private void updateJson(Map<String, Object> data, Iterator<Token> path, Object newValue) {
Token token = path.next();
for (Map.Entry<String, Object> entry : data.entrySet()) {
if (!token.accept(entry.getKey(), entry.getValue())) {
continue;
}
if (path.hasNext()) {
Object value = token.value(entry.getValue());
if (value instanceof Map) {
updateJson((Map<String, Object>) value, path, newValue);
}
} else {
token.update(entry, newValue);
}
}
}
}
class Path {
public static Iterator<Token> parse(String path) {
if (path.isEmpty()) {
return Collections.<Token>emptyList().iterator();
}
if (path.startsWith("$.")) {
path = path.substring(2);
}
List<Token> tokens = new ArrayList<>();
for (String part : path.split("\\.")) {
if (part.matches("\\w+\\[\\d+\\]")) {
String fieldName = part.substring(0, part.indexOf('['));
int index = Integer.parseInt(part.substring(part.indexOf('[')+1, part.indexOf(']')));
tokens.add(new ArrayToken(fieldName, index));
} else {
tokens.add(new FieldToken(part));
}
};
return tokens.iterator();
}
}
abstract class Token {
protected final String fieldName;
Token(String fieldName) {
this.fieldName = fieldName;
}
public abstract Object value(Object value);
public abstract boolean accept(String key, Object value);
public abstract void update(Map.Entry<String, Object> entry, Object newValue);
}
class FieldToken extends Token {
FieldToken(String fieldName) {
super(fieldName);
}
@Override
public Object value(Object value) {
return value;
}
@Override
public boolean accept(String key, Object value) {
return fieldName.equals(key);
}
@Override
public void update(Map.Entry<String, Object> entry, Object newValue) {
entry.setValue(newValue);
}
}
class ArrayToken extends Token {
private final int index;
ArrayToken(String fieldName, int index) {
super(fieldName);
this.index = index;
}
@Override
public Object value(Object value) {
return ((List) value).get(index);
}
@Override
public boolean accept(String key, Object value) {
return fieldName.equals(key) && value instanceof List && ((List) value).size() > index;
}
@Override
public void update(Map.Entry<String, Object> entry, Object newValue) {
List list = (List) entry.getValue();
list.set(index, newValue);
}
}
A JSON string can be easily parsed into a Map using Jackson:
Map<String,Object> userData = new ObjectMapper().readValue("{ \"store\": ... }", Map.class);
Just answering for folks landing on this page in future for reference.
You could consider using a Java implementation of JSON Patch. The RFC can be found here.
JSON Patch is a format for describing changes to a JSON document. It can be used to avoid sending a whole document when only a part has changed. When used in combination with the HTTP PATCH method it allows partial updates for HTTP APIs in a standards compliant way.
You can specify the operation to be performed (replace, add, ...), the JSON path at which it has to be performed, and the value that should be used.
Again, taking an example from the RFC:
[
{ "op": "test", "path": "/a/b/c", "value": "foo" },
{ "op": "remove", "path": "/a/b/c" },
{ "op": "add", "path": "/a/b/c", "value": [ "foo", "bar" ] },
{ "op": "replace", "path": "/a/b/c", "value": 42 },
{ "op": "move", "from": "/a/b/c", "path": "/a/b/d" },
{ "op": "copy", "from": "/a/b/d", "path": "/a/b/e" }
]
As for a Java implementation, I have not used it myself, but you can give https://github.com/fge/json-patch a try.
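Not having used it myself, here is only a rough sketch of what applying a patch with that library might look like (assuming the com.github.fge:json-patch artifact and Jackson are on the classpath; the exact API may differ between versions):
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.github.fge.jsonpatch.JsonPatch;

public class JsonPatchSketch {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        JsonNode document = mapper.readTree(
                "{\"store\":{\"bicycle\":{\"color\":\"red\",\"price\":19.95}}}");
        // RFC 6902 patch document: replace the bicycle color
        JsonNode patchNode = mapper.readTree(
                "[{\"op\":\"replace\",\"path\":\"/store/bicycle/color\",\"value\":\"green\"}]");
        JsonPatch patch = JsonPatch.fromJson(patchNode);
        JsonNode patched = patch.apply(document);
        System.out.println(patched); // {"store":{"bicycle":{"color":"green","price":19.95}}}
    }
}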
So in order to change a value within a JSON string, there are two steps:
Parse the JSON
Modify the appropriate field
You are trying to optimize step 2, but understand that you are not going to be able to avoid step 1. Looking at the Json-path source code (which, really, is just a wrapper around Jackson), note that it does a full parse of the JSON string before being able to spit out the read value. It does this parse every time you call read(); i.e., it is not cached.
I think this task is specific enough that you're going to have to write it yourself. Here is what I would do:
Create an object that represents the data in the parsed JSON string.
Make sure this object has, as part of its fields, the JSON string pieces that you do not expect to change often.
Create a custom Deserializer in the JSON framework of your choice that will populate the fields correctly.
Create a custom Serializer that uses the cached String pieces, plus the data that you expect to change.
I think the exact scope of your problem is unusual enough that it is unlikely a library already exists for this. When a program receives a JSON string, most of the time what it wants is the fully deserialized object; it is unusual for it to need to FORWARD this object on somewhere else.