I have a stable Spring Boot project that runs. I want to add an endpoint that reads a JSON file from the classpath and passes it through to the response without having to create any model objects (pass-through).
I have no issues reading the JSON file into a JsonNode or ObjectNode; I'm struggling with where to go next to set the data in my response object.
Added this caveat later: I do need to update the JSON with data from a database.
OK, I paired up with a colleague at work and we tried two things. Returning a string was no good (the REST output escapes the strings and returns one big String). What worked was setting the response object's data to a Map by calling mapper.readValue(jsonFeed, Map.class); that returned the JSON in proper object notation.
@Value("${metadata.json.file}") // defined in application.context
private Resource metaJsonFileName;

public String getJsonFromFile(List<UnitUiItem> uiitems) throws IOException {
    ObjectMapper mapper = new ObjectMapper();
    JsonNode root;
    try (InputStream stream = metaJsonFileName.getInputStream()) {
        root = mapper.readTree(stream);
    }
    ObjectNode optionDataNode = (ObjectNode) root.get("data").get("storelocation");
    ArrayNode units = optionDataNode.putArray("units");
    for (UnitUiItem item : uiitems) {
        ObjectNode unitNode = units.addObject();
        unitNode.put("code", item.getCode());
        unitNode.put("displayName", item.getDisplayName());
    }
    LOGGER.info("buildMetaJson exit");
    return root.toString();
}
// calling method
String jsonFeed = getJsonFromFile(uiitems);
ObjectMapper mapper = new ObjectMapper();
response.setData(mapper.readValue(jsonFeed, Map.class));
I have some code cleanup to do... are there any cleaner ways of doing this?
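One cleaner route is to skip the String round-trip and the Map conversion entirely: have the builder return the JsonNode root (instead of root.toString()) and return it straight from the controller, letting Spring's Jackson message converter serialize it. A minimal sketch, with hypothetical controller and service names:

import java.io.IOException;
import com.fasterxml.jackson.databind.JsonNode;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class MetaJsonController {

    private final MetaJsonService metaJsonService; // hypothetical wrapper around getJsonFromFile()

    public MetaJsonController(MetaJsonService metaJsonService) {
        this.metaJsonService = metaJsonService;
    }

    // Returning JsonNode lets Jackson write the tree to the response body as
    // real JSON, with no Map conversion and no model classes for the payload.
    @GetMapping(value = "/metadata", produces = "application/json")
    public JsonNode getMetadata() throws IOException {
        return metaJsonService.buildMetaJson(); // assumed to return the JsonNode root
    }
}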
I am working with an existing API that expects a "Metadata" field as part of its JSON payload. That "Metadata" field is a JSON object that is completely free-form. Currently, I need to read this data provided from another source, do some enrichment, then pass it on. I am struggling with how to define this "Metadata" object so that it can be any valid JSON object, or, if that field was not provided, an empty JSON object.
I attempted to use org.json.JSONObject like so:
//meta is the json string read from the db
JSONObject jsonobject = new JSONObject(meta);
message.Metadata = jsonobject;
However, Jackson, not unexpectedly, threw a serialization error:
com.fasterxml.jackson.databind.JsonMappingException: No serializer found for class org.json.JSONObject and no properties discovered...
This is a critical requirement that I'm guessing I am missing some relatively obvious solution to. Any help would be greatly appreciated.
UPDATED FIX
As suggested by @shmosel, I just switched the JSON object to a com.fasterxml.jackson.databind.JsonNode and all works beautifully.
// working code (rough of course)
ObjectMapper mapper = new ObjectMapper();
JsonNode rootNode = null;
try {
    rootNode = mapper.readTree(meta);
} catch (IOException e) {
    e.printStackTrace();
}
message.Metadata = rootNode;
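To also cover the "field not provided" case so it serializes as an empty JSON object, the property can default to an empty ObjectNode. A minimal sketch (the Message class and its public Metadata field are assumed from the snippets above):

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.node.JsonNodeFactory;

public class Message {
    // Free-form JSON payload; serializes as {} when no metadata is supplied.
    public JsonNode Metadata = JsonNodeFactory.instance.objectNode();
}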
I have got stuck trying to insert a JSONArray into a Jackson ObjectNode. This is what I am trying to do:
public void myMethod(JSONArray jsonArray) {
    ObjectNode payload = objectMapper.createObjectNode();
    payload.put("array", /* jsonArray */);
    payload.put(/* some other things */);
    ...
}
It feels like something really silly but what is actually the best way to do it?!
EDIT: I am sorry because I did not mention an important point: I have to serialize the ObjectNode once I have finished building it, so using putPOJO() is not a possibility.
I like aribeiro's approach more. You can use the putPOJO() method to do this. For example:
// Incoming org.json.JSONArray.
JSONArray incomingArray = new JSONArray("[\"Value1\",\"Value2\"]");
ObjectMapper objectMapper = new ObjectMapper();
ObjectNode payload = objectMapper.createObjectNode();
// Adds the JSONArray node to the payload as POJO (plain old Java object).
payload.putPOJO("array", incomingArray);
System.out.println(objectMapper.writeValueAsString(payload));
The Javadoc for putPOJO() can be found in the ObjectNode API documentation.
Note: here's a previous implementation that I submitted using readTree():
// Incoming org.json.JSONArray.
JSONArray incomingArray = new JSONArray("[\"Value1\",\"Value2\"]");
ObjectMapper objectMapper = new ObjectMapper();
ObjectNode payload = objectMapper.createObjectNode();
// Reads the JSON array into a Jackson JsonNode.
JsonNode jsonNode = objectMapper.readTree(incomingArray.toString());
// Sets the Jackson node on the payload.
payload.set("array", jsonNode);
System.out.println(objectMapper.writeValueAsString(payload));
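Note the trade-off between the two: putPOJO() wraps the JSONArray in a POJONode and defers its serialization until the ObjectNode is written out, while the readTree() variant converts it into a native Jackson node up front. The latter is the safer route if the node must serialize cleanly later, which is the concern raised in the EDIT above.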
I am using the Jackson streaming API to read in a JSON file like so:
// Go through json model and grab needed resources.
JsonFactory jsonfactory = new JsonFactory();
JsonParser jp = jsonfactory.createParser(fis);
JsonToken current = jp.nextToken();
ObjectMapper mapper = new ObjectMapper();
if (current != JsonToken.START_OBJECT) {
    System.out.println("Error: root should be object: quitting.");
    return null;
}
while (jp.nextToken() != JsonToken.END_OBJECT) {
    String fieldName = jp.getCurrentName();
    if ("Field1".equals(fieldName)) {
        // move from field name to field value
        jp.nextToken();
        JsonNode json = mapper.readTree(jp);
        // Manipulate JsonNode
        /* Want to write back into json file in place of
           old object with manipulated node */
    } else {
        jp.skipChildren();
    }
}
From the code above, I am basically parsing the JSON file until I find the desired field, then reading it into a JsonNode and manipulating some of the data associated with it. My question is: is there a way to delete that node from the JSON file and write a newly created POJO into the file under the same field name, in place of the old one? Everything I can find online involves reading the whole JSON file into a JsonNode, which I would like to avoid as this file can be quite large.
In-place editing of a file like that is usually pretty complicated; a simpler approach is to create a new temporary file and, for the most part, just copy events through unchanged until you hit the conditions for the part you want to modify, writing the replacement to the new file instead.
Then at the end you can delete the original file and rename the temporary one to "replace" it. Unless disk space is an issue, though, I personally like keeping the original source around (especially in automated systems) for troubleshooting.
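A minimal sketch of that copy-while-rewriting approach using Jackson's streaming API (the file handling and the "Field1" manipulation are placeholders, not the asker's actual code):

import com.fasterxml.jackson.core.JsonEncoding;
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.StandardCopyOption;

void rewriteField1(File source) throws IOException {
    ObjectMapper mapper = new ObjectMapper();
    JsonFactory factory = mapper.getFactory();
    File temp = new File(source.getPath() + ".tmp");
    try (JsonParser jp = factory.createParser(source);
         JsonGenerator gen = factory.createGenerator(temp, JsonEncoding.UTF8)) {
        while (jp.nextToken() != null) {
            if (jp.getCurrentToken() == JsonToken.FIELD_NAME
                    && "Field1".equals(jp.getCurrentName())) {
                gen.writeFieldName("Field1");
                jp.nextToken();                      // move to the field's value
                JsonNode node = mapper.readTree(jp); // read just this subtree
                // ... manipulate node here ...
                mapper.writeTree(gen, node);         // write the replacement
            } else {
                gen.copyCurrentEvent(jp);            // copy everything else through unchanged
            }
        }
    }
    // Swap the rewritten file into place (or keep the original, as noted above).
    Files.move(temp.toPath(), source.toPath(), StandardCopyOption.REPLACE_EXISTING);
}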
I have written a program which bulk-inserts into Elasticsearch in batches of around 3000. The problem is that I need to convert these objects to JSON before executing the bulk insert request, but the JSON conversion has a major downside: it is becoming a bottleneck of my whole computation.
Can anyone suggest a super fast way to convert objects to JSON in Java? My code looks like this:
private String getESValueAsString(ElasticSearchValue elasticSearchValue) throws JsonProcessingException {
    ElasticSearchValue prevValue = null;
    if (stateType == StateType.OPAQUE) {
        prevValue = (ElasticSearchValue) elasticSearchValue.getPrevious();
    }
    elasticSearchValue.setPrevious(null);
    ObjectMapper om = new ObjectMapper();
    Map<String, Object> props = om.convertValue(elasticSearchValue, Map.class);
    if (stateType == StateType.OPAQUE) {
        props.put("previous", prevValue);
    }
    return om.writeValueAsString(props);
}
Just found the issue: I was creating a new ObjectMapper for each serialization, and that was making my whole processing slow.
This is a very good guide and it improved my performance 100x:
http://wiki.fasterxml.com/JacksonBestPracticesPerformance
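A minimal sketch of that fix, reusing a single shared ObjectMapper instead of constructing one per call (field and type names carried over from the question's snippet):

// ObjectMapper is thread-safe once configured, so one shared instance is enough.
private static final ObjectMapper MAPPER = new ObjectMapper();

private String getESValueAsString(ElasticSearchValue elasticSearchValue) throws JsonProcessingException {
    ElasticSearchValue prevValue = null;
    if (stateType == StateType.OPAQUE) {
        prevValue = (ElasticSearchValue) elasticSearchValue.getPrevious();
    }
    elasticSearchValue.setPrevious(null);
    Map<String, Object> props = MAPPER.convertValue(elasticSearchValue, Map.class);
    if (stateType == StateType.OPAQUE) {
        props.put("previous", prevValue);
    }
    return MAPPER.writeValueAsString(props);
}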
Why not just insert JSON records into the BulkRequestBuilder in the first place? Something like this:
Client client = new TransportClient().addTransportAddress(new InetSocketTransportAddress("localhost", 9300));
BulkRequestBuilder bulk = client.prepareBulk();
.....
bulk.add(client.prepareIndex(<your index>, <your type>)
        .setSource(<your object>.toJson()));
....
and in the <your object> class, create a Gson instance like this:
Gson gson = new GsonBuilder().excludeFieldsWithoutExposeAnnotation().create();
and a method:
public String toJson() {
    return gson.toJson(this, <your class>.class);
}
Let's say I have JSON that looks like this:
{"body":"abcdef","field":"fgh"}
Now suppose the value of the 'body' element is huge (~100 MB or more). I would like to stream out the value of the body element instead of storing it in a String.
How can I do this? Is there any Java library I could use for this?
This is the line of code that fails with an OutOfMemoryError when a large JSON value comes in:
String inputStreamString = (String) JsonPath.read(textValue.toString(), "$.body");
'textValue' here is a hadoop.io.Text object.
I'm assuming the OutOfMemoryError occurs because of method calls like toString() (which creates a new object) and JsonPath.read(), all of which operate in memory. I need to know if there is an approach I can take to handle large textValue objects.
Please let me know if you need additional info.
JsonSurfer is good for processing very large JSON data with selective extraction.
An example of how to surf through JSON data, collecting matched values in listeners:
BufferedReader reader = new BufferedReader(new FileReader(jsonFile));
JsonSurfer surfer = new JsonSurfer(GsonParser.INSTANCE, GsonProvider.INSTANCE);
SurfingConfiguration config = surfer.configBuilder().bind("$.store.book[*]", new JsonPathListener() {
    @Override
    public void onValue(Object value, ParsingContext context) throws Exception {
        JsonObject book = (JsonObject) value;
    }
}).build();
surfer.surf(reader, config);
Jackson offers a streaming API for generating and processing JSON data.
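For the specific case of one huge string value, a sketch of how that could look with Jackson's JsonParser, copying the 'body' value to a Writer in chunks rather than materializing it as a single String (getText(Writer) requires Jackson 2.8+; file names here are hypothetical):

import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.io.Writer;

void streamBodyValue(File jsonFile, File target) throws IOException {
    JsonFactory factory = new JsonFactory();
    try (JsonParser jp = factory.createParser(jsonFile);
         Writer out = new BufferedWriter(new FileWriter(target))) {
        while (jp.nextToken() != null) {
            if (jp.getCurrentToken() == JsonToken.FIELD_NAME && "body".equals(jp.getCurrentName())) {
                jp.nextToken();   // advance to the string value
                jp.getText(out);  // stream the value to the Writer in chunks
                break;
            }
        }
    }
}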