I have a User class defined as:
User.java
package model;

import java.util.List;
import java.util.Map;

public class User {

    private final Map<String, List<String>> accountTransactionsMap;

    public User(final Map<String, List<String>> accountTransactionsMap) {
        this.accountTransactionsMap = accountTransactionsMap;
    }

    public Map<String, List<String>> getAccountTransactionsMap() {
        return accountTransactionsMap;
    }
}
I am calling a REST API that returns the following response:
{
  "username1": {
    "456": [],
    "123": [],
    "789": []
  },
  "username2": {
    "123": [],
    "456": [],
    "789": []
  },
  "username3": {
    "789": [],
    "123": [],
    "456": [
      "transaction10",
      "transaction6",
      "transaction9",
      "transaction3"
    ]
  }
}
I would like to be able to parse through the response and store it in a User object.
I have tried the following:
Test.java
import java.lang.reflect.Type;
import java.util.Map;

import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;

import model.User;

public class Test {
    public static void main(final String[] args) {
        final String response = "{\"username1\":{\"456\":[],\"123\":[],\"789\":[]},\"username2\":{\"123\":[],\"456\":[],\"789\":[]},\"username3\":{\"789\":[],\"123\":[],\"456\":[\"transaction10\",\"transaction6\",\"transaction9\",\"transaction3\"]}}";
        final Gson gson = new Gson();
        final Type map = new TypeToken<Map<String, User>>() {}.getType();
        final Map<String, User> result = gson.fromJson(response, map);
        System.out.println(result);
        if (result != null) {
            for (final Map.Entry<String, User> entry : result.entrySet()) {
                System.out.println("username: " + entry.getKey());
                final User user = entry.getValue();
                System.out.println("transactions: " + user.getAccountTransactionsMap());
            }
        }
    }
}
This yields output:
{username1=model.User@80ec1f8, username2=model.User@1445d7f, username3=model.User@6a396c1e}
username: username1
transactions: null
username: username2
transactions: null
username: username3
transactions: null
I expect output:
{username1=model.User@80ec1f8, username2=model.User@1445d7f, username3=model.User@6a396c1e}
username: username1
transactions: {123=[],456=[],789=[]}
username: username2
transactions: {123=[],456=[],789=[]}
username: username3
transactions: {123=[],456=["transaction10", "transaction6", "transaction9", "transaction3"],789=[]}
How can I parse each accountId and its list of transactionIds into a map stored as a field in my User class?
Edit: I suppose the question really becomes, how can I create a custom TypeToken for my User class?
Instead of the User class, you need to use Map<String, Map<String, List<String>>>:
import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;

import java.io.File;
import java.io.FileReader;
import java.lang.reflect.Type;
import java.util.List;
import java.util.Map;

public class GsonApp {

    public static void main(String[] args) throws Exception {
        File jsonFile = new File("./resource/test.json").getAbsoluteFile();

        final Gson gson = new Gson();
        final Type map = new TypeToken<Map<String, Map<String, List<String>>>>() {}.getType();
        final Map<String, Map<String, List<String>>> result = gson.fromJson(new FileReader(jsonFile), map);
        System.out.println(result);
        if (result != null) {
            for (final Map.Entry<String, Map<String, List<String>>> entry : result.entrySet()) {
                System.out.println("username: " + entry.getKey());
                final Map<String, List<String>> user = entry.getValue();
                System.out.println("transactions: " + user);
            }
        }
    }
}
The above code prints:
{username1={456=[], 123=[], 789=[]}, username2={123=[], 456=[], 789=[]}, username3={789=[], 123=[], 456=[transaction10, transaction6, transaction9, transaction3]}}
username: username1
transactions: {456=[], 123=[], 789=[]}
username: username2
transactions: {123=[], 456=[], 789=[]}
username: username3
transactions: {789=[], 123=[], 456=[transaction10, transaction6, transaction9, transaction3]}
If you really need to, you can create the User objects after parsing.
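For example, a minimal sketch of that last step (assuming the User class from the question and an extra java.util.LinkedHashMap import) could look like this:
// Wrap each parsed inner map in a User instance, keyed by username.
final Map<String, User> users = new LinkedHashMap<>();
for (final Map.Entry<String, Map<String, List<String>>> entry : result.entrySet()) {
    users.put(entry.getKey(), new User(entry.getValue()));
}
System.out.println(users.get("username3").getAccountTransactionsMap());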
Related
I want to return a map to my Java code from a stored procedure. The results from PL/SQL would be in a Map-like format, e.g. Map<key, List>. Is it possible to return data like that?
I have returned all the data in a single string, and in the Java class I used split logic to break the data apart and save it into the proper fields. For example:
String[] lines = resultString.split(System.lineSeparator());
List<MessageVO> resultPerList = new ArrayList<>();
for (String line : lines) {
    MessageVO message = new MessageVO();
    String[] fieldArray = line.split(":", 0);
    String field = fieldArray[0];
    message.setField(field);
    resultPerList.add(message);
}
If you like PL/SQL and the resulting lists are small, your solution (structuring the data in SQL) is good. In PL/SQL you can take advantage of the LISTAGG function.
Here is a functional sample:
import java.util.HashMap;
import java.util.LinkedList;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class sasafd {

    public static void main(String[] args) {
        String resultString = "key1:valuea:valueb:valuec"
                + System.lineSeparator()
                + "key2:valuea:valueb:valuec";
        String[] lines = resultString.split(System.lineSeparator());
        Map<String, List<String>> resultMap = new HashMap<String, List<String>>();
        for (String line : lines) {
            String[] fieldArray = line.split(":", 0);
            // The first token of each line is the key: it fails the containsKey check once,
            // which creates the empty list; every following token is then appended as a value.
            for (String value : fieldArray) {
                if (resultMap.containsKey(fieldArray[0])) {
                    resultMap.get(fieldArray[0]).add(value);
                } else {
                    resultMap.put(fieldArray[0], new LinkedList<String>());
                }
            }
        }
        System.out.println(convertWithStream(resultMap));
    }

    public static String convertWithStream(Map<String, List<String>> map) {
        String mapAsString = map.keySet().stream().map(key -> key + "=" + map.get(key))
                .collect(Collectors.joining(", ", "{", "}"));
        return mapAsString;
    }
}
Output:
{key1=[valuea, valueb, valuec], key2=[valuea, valueb, valuec]}
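If it helps, here is a rough sketch of fetching that colon-separated string from the database through JDBC; the function name get_grouped_data, the connection details, and the java.sql imports are assumptions for illustration, not part of the question:
// Hypothetical PL/SQL function returning the LISTAGG-built string.
try (Connection conn = DriverManager.getConnection(jdbcUrl, dbUser, dbPassword);
     CallableStatement cs = conn.prepareCall("{ ? = call get_grouped_data() }")) {
    cs.registerOutParameter(1, Types.VARCHAR);
    cs.execute();
    // e.g. "key1:valuea:valueb:valuec" + newline + "key2:valuea:valueb:valuec"
    String resultString = cs.getString(1);
    // resultString can then be split exactly as in the sample above.
}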
If you prefer more tabular data from the DB, you can reuse ideas from this code:
import java.util.HashMap;
import java.util.LinkedList;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class sasafd {

    public static void main(String[] args) {
        String[][] resultSet =
                {{"key1", "valuea"}, {"key1", "valueb"}, {"key1", "valuec"},
                 {"key2", "valuea"}, {"key2", "valueb"}, {"key2", "valuec"}};
        Map<String, List<String>> resultMap = new HashMap<String, List<String>>();
        for (String[] line : resultSet) {
            if (resultMap.containsKey(line[0])) {
                resultMap.get(line[0]).add(line[1]);
            } else {
                resultMap.put(line[0], new LinkedList<String>());
                resultMap.get(line[0]).add(line[1]);
            }
        }
        System.out.println(convertWithStream(resultMap));
    }

    public static String convertWithStream(Map<String, List<String>> map) {
        String mapAsString = map.keySet().stream().map(key -> key + "=" + map.get(key))
                .collect(Collectors.joining(", ", "{", "}"));
        return mapAsString;
    }
}
You can also try an ORM. If you dislike colon-separated text from PL/SQL, you can also use XML, JSON, ...
JDBC doesn't support Map results.
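Since JDBC hands you rows rather than maps, a small sketch of grouping a two-column result set into a Map<String, List<String>> (the SQL text, column layout, and connection variables are made up for illustration) could look like this:
// Hypothetical query; adjust the statement and column positions to your schema.
Map<String, List<String>> resultMap = new HashMap<>();
try (Connection conn = DriverManager.getConnection(jdbcUrl, dbUser, dbPassword);
     PreparedStatement ps = conn.prepareStatement("SELECT key_col, value_col FROM some_table");
     ResultSet rs = ps.executeQuery()) {
    while (rs.next()) {
        // computeIfAbsent creates the list the first time a key appears, then appends.
        resultMap.computeIfAbsent(rs.getString(1), k -> new LinkedList<>())
                 .add(rs.getString(2));
    }
}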
I receive a String with JSON from an abstract source and I want to reorder the fields in that JSON so that certain fields come first in a specified order and other, unspecified fields can come in any order. Is there a library method that would allow me to do something like this:
// String reorderFields(String json, String[] orderedFields);
String res = reorderFields("{\"b\":2, \"c\": 3, \"a\":1}", new String[] {"a", "b", "c"});
Assertions.assertEquals("{\"a\":1,\"b\":2,\"c\":3}", res);
Below is a solution that should do the trick. It works for first-level fields only. It uses Jackson for JSON handling. Since it operates on basic types only (no custom serializers/deserializers), it should be safe to use for all possible JSON types.
import java.util.Arrays;
import java.util.HashSet;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Map.Entry;
import java.util.Set;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Test;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
public class ReorderFieldsTest {

    public static String reorderFieldsInJson(String json, String[] fields) {
        ObjectMapper objectMapper = new ObjectMapper();
        Map<String, Object> jsonMap;
        try {
            jsonMap = objectMapper.readValue(json, new TypeReference<Map<String, Object>>() {
            });
        } catch (Exception e) {
            throw new RuntimeException("Error converting json to map", e);
        }

        Map<String, Object> resMap = new LinkedHashMap<>();
        Set<String> orderedFields = new HashSet<>(Arrays.asList(fields));

        for (String fieldName : fields) {
            Object val = jsonMap.get(fieldName);
            if (val != null) {
                resMap.put(fieldName, val);
            }
        }
        for (Entry<String, Object> entry : jsonMap.entrySet()) {
            if (orderedFields.contains(entry.getKey())) {
                continue;
            }
            resMap.put(entry.getKey(), entry.getValue());
        }

        try {
            return objectMapper.writeValueAsString(resMap);
        } catch (JsonProcessingException e) {
            throw new RuntimeException("Error converting map to json", e);
        }
    }

    @Test
    public void jsonReorder_validInput_reorderedOutput() {
        String src = "{\"b\":2, \"c\": 3, \"a\":1}";
        String res = reorderFieldsInJson(src, new String[] {"a", "b"});
        Assertions.assertEquals("{\"a\":1,\"b\":2,\"c\":3}", res);
    }
}
Result on Java 11 and Jackson 2.11.3:
org:
{"b":2, "c": 3, "a":1}
ordered:
{"a":1,"b":2,"c":3}
I want to bulk upload a CSV file into Elasticsearch using the Java API (without using Logstash).
Elasticsearch version - 6.6
I have tried the program below, using Jackson's CSV format to get a source Map for each IndexRequest, because I can't predefine the POJO variables; so I used a dynamic Map built from the CSV file.
import java.io.FileInputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.UUID;
import java.util.logging.Logger;

import org.apache.http.HttpHost;
import org.apache.http.client.config.RequestConfig;
import org.codehaus.jettison.json.JSONArray;
import org.codehaus.jettison.json.JSONException;
import org.elasticsearch.action.admin.indices.create.CreateIndexRequest;
import org.elasticsearch.action.admin.indices.create.CreateIndexResponse;
import org.elasticsearch.action.bulk.BulkRequest;
import org.elasticsearch.action.bulk.BulkResponse;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestClientBuilder;
import org.elasticsearch.client.RestHighLevelClient;

import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.MappingIterator;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.dataformat.csv.CsvMapper;
import com.fasterxml.jackson.dataformat.csv.CsvSchema;

import com.zoho.dedupe.connection.DukeESConnection;
public class BulkImport {

    private static Logger logger = Logger.getLogger(BulkImport.class.getName());

    public static void main(String args[]) {
        long starttime = System.currentTimeMillis();
        logger.info("ElasticSearchServiceImpl => bulkInsert Service Started");
        FileInputStream fis = null;
        BulkRequest request;
        RestHighLevelClient client;
        // elastic Search Index Name
        String esIndex = "post";
        try {
            boolean isHeaderSet = false;
            Set<String> header = new HashSet<String>();
            fis = new FileInputStream("/Users/test/Documents/Test.csv");
            request = new BulkRequest();
            MappingIterator<Map<String, Object>> data = parse(fis);
            while (data.hasNext()) {
                Map<?, ?> value = data.next();
                if (!isHeaderSet) {
                    header.add("id");
                    header = (Set<String>) value.keySet();
                    isHeaderSet = true;
                }
                System.out.println(value);
                request.add(getIndexRequest(value, esIndex));
            }
            fis.close();
            if (request.numberOfActions() > 0) {
                String hostsInString = "localhost";
                List<HttpHost> httpHosts = new ArrayList<HttpHost>();
                String[] hosts = hostsInString.split(",");
                for (String host : hosts) {
                    HttpHost httpHost = new HttpHost(host, 9200, "http");
                    httpHosts.add(httpHost);
                }
                client = new RestHighLevelClient(RestClient.builder(
                        httpHosts.toArray(new HttpHost[]{})).setMaxRetryTimeoutMillis(10 * 60000).setRequestConfigCallback(
                        new RestClientBuilder.RequestConfigCallback() {
                            @Override
                            public RequestConfig.Builder customizeRequestConfig(
                                    RequestConfig.Builder requestConfigBuilder) {
                                return requestConfigBuilder
                                        .setConnectTimeout(60000)
                                        .setSocketTimeout(10 * 60000);
                            }
                        }));
                CreateIndexRequest crrequest = new CreateIndexRequest(esIndex);
                Map<String, Object> jsonMap = new HashMap<>();
                Map<String, Object> message = new HashMap<>();
                message.put("type", "text");
                Map<String, Object> keyword = new HashMap<>();
                Map<String, Object> type = new HashMap<>();
                type.put("type", "keyword");
                type.put("ignore_above", 256);
                keyword.put("keyword", type);
                message.put("fields", keyword);
                Map<String, Object> properties = new HashMap<>();
                for (Object hdr : header) {
                    properties.put(hdr.toString(), message);
                }
                Map<String, Object> mapping = new HashMap<>();
                mapping.put("properties", properties);
                jsonMap.put("_doc", mapping);
                crrequest.mapping("_doc", jsonMap);
                CreateIndexResponse createIndexResponse = client.indices().create(crrequest, RequestOptions.DEFAULT);
                boolean acknowledged = createIndexResponse.isAcknowledged();
                System.out.println(acknowledged);
                BulkResponse bulkResponse = client.bulk(request, RequestOptions.DEFAULT);
                if (bulkResponse.hasFailures()) {
                    logger.info("ElasticSearchServiceImpl => bulkInsert : Some of the record has failed.Please reinitiate the process");
                } else {
                    logger.info("ElasticSearchServiceImpl => bulkInsert : Success");
                }
            } else {
                logger.info("ElasticSearchServiceImpl => bulkInsert : No request for BulkInsert =" + request.numberOfActions());
            }
        } catch (Exception e) {
            logger.info("ElasticSearchServiceImpl => bulkInsert : Exception =" + e.getMessage());
            e.printStackTrace();
        }
        long endTime = System.currentTimeMillis();
        logger.info("ElasticSearchServiceImpl => bulkInsert End " + (endTime - starttime));
    }
    public static MappingIterator<Map<String, Object>> parse(FileInputStream input) throws Exception {
        MappingIterator<Map<String, Object>> map = readObjectsFromCsv(input);
        return map;
        // writeAsJson(data);
    }

    public static MappingIterator<Map<String, Object>> readObjectsFromCsv(FileInputStream file) throws IOException {
        CsvSchema bootstrap = CsvSchema.emptySchema().withHeader().withColumnSeparator(',');
        CsvMapper csvMapper = new CsvMapper();
        MappingIterator<Map<String, Object>> mappingIterator = csvMapper.disable(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES).reader(Map.class).with(bootstrap).readValues(file);
        // System.out.println("Column names: " + mappingIterator.next().keySet());
        return mappingIterator;
    }

    public static void writeAsJson(List<Map<?, ?>> data) throws IOException, JSONException {
        ObjectMapper mapper = new ObjectMapper();
        String value = mapper.writeValueAsString(data);
        JSONArray json = new JSONArray(value);
        System.out.println(json);
    }

    public static IndexRequest getIndexRequest(Map data, String index) throws Exception {
        IndexRequest indexRequest = null;
        indexRequest = new IndexRequest(index).id(UUID.randomUUID().toString()).source(data);
        System.out.println(indexRequest.toString());
        return indexRequest;
    }
}
I got the below exception while running the program
{Document Name=dhjajga, Title=sdas, Name=asd, DOB=14-43-22}
index {[post][null][c2148857-87e0-4407-b5f5-b4f5f52c40d2], source[{"Document Name":"dhjajga","Title":"sdas","Name":"asd","DOB":"14-43-22"}]}
Jun 11, 2020 4:06:18 PM com.zoho.dedupe.connection.DukeESConnection connect
INFO: Client org.elasticsearch.client.RestHighLevelClient@7c51f34b
true
Jun 11, 2020 4:06:18 PM BulkImport main
INFO: ElasticSearchServiceImpl => bulkInsert : Exception =Validation Failed: 1: type is missing;2: type is missing;3: type is missing;
org.elasticsearch.action.ActionRequestValidationException: Validation Failed: 1: type is missing;2: type is missing;3: type is missing;
at org.elasticsearch.action.bulk.BulkRequest.validate(BulkRequest.java:612)
at org.elasticsearch.client.RestHighLevelClient.performRequest(RestHighLevelClient.java:1728)
at org.elasticsearch.client.RestHighLevelClient.performRequestAndParseEntity(RestHighLevelClient.java:1694)
at org.elasticsearch.client.RestHighLevelClient.bulk(RestHighLevelClient.java:470)
at BulkImport.main(BulkImport.java:85)
Jun 11, 2020 4:06:18 PM BulkImport main
INFO: ElasticSearchServiceImpl => bulkInsert End 1432
When I try to insert the same index request as above using curl, it works fine:
curl -X POST "localhost:9200/post/_doc/?pretty" -H 'Content-Type: application/json' -d'
{
"Document Name":"dhjajga","Title":"sdas","Name":"asd","DOB":"14-43-22"
}
'
{
"_index" : "post",
"_type" : "_doc",
"_id" : "jBPronIB0Wb3XTTasBjG",
"_version" : 1,
"result" : "created",
"_shards" : {
"total" : 2,
"successful" : 1,
"failed" : 0
},
"_seq_no" : 0,
"_primary_term" : 1
}
Please help me fix the issue in the Java program. Thanks in advance.
Before Elasticsearch version 7 you have to specify a type with your IndexRequest. It is recommended to use the type "_doc".
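As a sketch (assuming the 6.x high-level REST client shown in the question), the getIndexRequest helper could pass the type next to the index name:
public static IndexRequest getIndexRequest(Map data, String index) throws Exception {
    // On Elasticsearch 6.x the type ("_doc") has to be set explicitly;
    // from 7.x onwards types are deprecated.
    IndexRequest indexRequest = new IndexRequest(index, "_doc")
            .id(UUID.randomUUID().toString())
            .source(data);
    return indexRequest;
}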
I'm trying a simple test where the code appends a few JSON entries; however, the file gets overwritten each time (the JSON file will only have one entry in it after running). I know I need to somehow create an array in JSON using '[]', but how would I go about doing that? Also, is there a better way to be doing this? I've been searching around and every library seems clunky, with lots of user-written code. Thanks.
import java.io.File;
import java.io.IOException;
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

import com.fasterxml.jackson.databind.ObjectMapper;

public class REEEE {

    private static Staff createStaff() {
        Staff staff = new Staff();
        staff.setName("mkyong");
        staff.setAge(38);
        staff.setPosition(new String[] { "Founder", "CTO", "Writer" });
        Map<String, Double> salary = new HashMap() {
            {
                put("2010", 10000.69);
            }
        };
        staff.setSalary(salary);
        staff.setSkills(Arrays.asList("java", "python", "node", "kotlin"));
        return staff;
    }

    public static void main(String[] args) throws IOException {
        ObjectMapper mapper = new ObjectMapper();
        File file = new File("src//j.json");

        for (int i = 0; i < 4; i++) {
            Staff staff = createStaff();
            try {
                // Java objects to JSON file
                mapper.writeValue(file, staff);

                // Java objects to JSON string - compact-print
                String jsonString = mapper.writeValueAsString(staff);
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}
You can add each Staff to a List and then write the whole list to the file, as below:
List<Staff> staffList = new LinkedList<>();
for (int i = 0; i < 4; i++) {
    Staff staff = createStaff();
    staffList.add(staff);
}
mapper.writeValue(file, staffList);
Hope it helps.
Jackson was implemented to parse and generate JSON payloads. All the extra logic related to adding a new element to an array and writing it back to the file you need to implement yourself. It should not be hard to do:
class JsonFileAppender {

    private final ObjectMapper jsonMapper;

    public JsonFileAppender() {
        this.jsonMapper = JsonMapper.builder().build();
    }

    public void appendToArray(File jsonFile, Object value) throws IOException {
        Objects.requireNonNull(jsonFile);
        Objects.requireNonNull(value);
        if (jsonFile.isDirectory()) {
            throw new IllegalArgumentException("File can not be a directory!");
        }
        JsonNode node = readArrayOrCreateNew(jsonFile);
        if (node.isArray()) {
            ArrayNode array = (ArrayNode) node;
            array.addPOJO(value);
        } else {
            ArrayNode rootArray = jsonMapper.createArrayNode();
            rootArray.add(node);
            rootArray.addPOJO(value);
            node = rootArray;
        }
        jsonMapper.writeValue(jsonFile, node);
    }

    private JsonNode readArrayOrCreateNew(File jsonFile) throws IOException {
        if (jsonFile.exists() && jsonFile.length() > 0) {
            return jsonMapper.readTree(jsonFile);
        }
        return jsonMapper.createArrayNode();
    }
}
Example usage with some use cases:
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.json.JsonMapper;
import com.fasterxml.jackson.databind.node.ArrayNode;

import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Objects;

public class JsonPathApp {

    public static void main(String[] args) throws Exception {
        Path jsonTmpFile = Files.createTempFile("json", "array");

        JsonFileAppender jfa = new JsonFileAppender();

        // Add POJO
        jfa.appendToArray(jsonTmpFile.toFile(), createStaff());
        printContent(jsonTmpFile); //1

        // Add primitive
        jfa.appendToArray(jsonTmpFile.toFile(), "Jackson");
        printContent(jsonTmpFile); //2

        // Add another array
        jfa.appendToArray(jsonTmpFile.toFile(), Arrays.asList("Version: ", 2, 10, 0));
        printContent(jsonTmpFile); //3

        // Add another object
        jfa.appendToArray(jsonTmpFile.toFile(), Collections.singletonMap("simple", "object"));
        printContent(jsonTmpFile); //4
    }

    private static Staff createStaff() {
        Staff staff = new Staff();
        staff.setName("mkyong");
        staff.setAge(38);
        staff.setPosition(new String[]{"Founder", "CTO", "Writer"});
        Map<String, Double> salary = new HashMap<>();
        salary.put("2010", 10000.69);
        staff.setSalary(salary);
        staff.setSkills(Arrays.asList("java", "python", "node", "kotlin"));
        return staff;
    }

    private static void printContent(Path path) throws IOException {
        List<String> lines = Files.readAllLines(path);
        System.out.println(String.join("", lines));
    }
}
The above code prints the array content after each of the four appends (numbered 1-4 below):
1
[{"name":"mkyong","age":38,"position":["Founder","CTO","Writer"],"salary":{"2010":10000.69},"skills":["java","python","node","kotlin"]}]
2
[{"name":"mkyong","age":38,"position":["Founder","CTO","Writer"],"salary":{"2010":10000.69},"skills":["java","python","node","kotlin"]},"Jackson"]
3
[{"name":"mkyong","age":38,"position":["Founder","CTO","Writer"],"salary":{"2010":10000.69},"skills":["java","python","node","kotlin"]},"Jackson",["Version: ",2,10,0]]
4
[{"name":"mkyong","age":38,"position":["Founder","CTO","Writer"],"salary":{"2010":10000.69},"skills":["java","python","node","kotlin"]},"Jackson",["Version: ",2,10,0],{"simple":"object"}]
Below is the JSON file I am trying to parse. I want to print all keys and their corresponding values.
{
  "A": {
    "name": "Ram",
    "gender": "male",
    "designation": "engineer"
  },
  "B": {
    "name": "Shyam",
    "gender": "male",
    "designation": "student"
  },
  "C": {
    "name": "Mohan",
    "gender": "male",
    "designation": "manager"
  }
}
I have tried the following code:
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.util.Iterator;

import org.json.JSONArray;
import org.json.JSONObject;
import org.json.simple.parser.JSONParser;

public class FetchJsonNested {

    public static void main(String args[]) throws FileNotFoundException {
        try {
            JSONParser jp = new JSONParser();
            Object obj = jp.parse(new FileReader("C:\\Users\\DELL\\Documents\\NetBeansProjects\\WaterNetwork\\web\\kusharray.json"));
            JSONObject job = (JSONObject) obj;
            Iterator<?> keys = job.keys();
            while (keys.hasNext()) {
                String key = (String) keys.next();
                System.out.println(key);
                if (job.get(key) instanceof JSONObject) {
                    System.out.println(job.get(key));
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
I have read material from many sites but nothing works the way I want. I want to print all keys and their corresponding values.
Using org.json, as you did in your example:
String jsonStr = "{\"A\":{\"name\":\"Ram\",\"gender\":\"male\",\"designation\":\"engineer\"},\"B\":{\"name\":\"Shyam\",\"gender\":\"male\",\"designation\":\"student\"},\"C\":{\"name\":\"Mohan\",\"gender\":\"male\",\"designation\":\"manager\"}}";
JSONObject json = new JSONObject(jsonStr);
for (Object key : json.keySet().toArray()) {
    JSONObject data = json.getJSONObject(key.toString());
    System.out.println("json :" + data.toString());
    System.out.println("name :" + data.getString("name"));
    System.out.println("gender :" + data.getString("gender"));
    System.out.println("designation :" + data.getString("designation"));
}
Now you can replace my first line "String jsonStr = ..." with your file reader.
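For instance, a minimal sketch of that substitution (the path is just a placeholder; it assumes java.nio.file.Files, java.nio.file.Paths and java.nio.charset.StandardCharsets are imported) could be:
// Read the whole JSON file into the jsonStr variable used above.
String jsonStr = new String(
        Files.readAllBytes(Paths.get("C:\\path\\to\\your.json")),
        StandardCharsets.UTF_8);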
Below follows not the most awesome and elegant solution, but it can lead you to what you need.
import com.google.gson.JsonElement;
import com.google.gson.JsonObject;
import com.google.gson.JsonParser;

import java.io.FileNotFoundException;
import java.io.FileReader;
import java.util.Map;
import java.util.Scanner;
import java.util.Set;

public class Main {

    public static void main(String[] args) throws Exception {
        String jsonString = loadJSONFile();
        JsonElement jsonElement = new JsonParser().parse(jsonString);
        JsonObject jsonObject = jsonElement.getAsJsonObject();
        print(jsonObject);
    }

    private static String loadJSONFile() throws FileNotFoundException {
        Scanner scanner = new Scanner(new FileReader("path/to/the/json/file.ext"));
        StringBuilder stringBuilder = new StringBuilder();
        while (scanner.hasNext()) {
            stringBuilder.append(scanner.next());
        }
        scanner.close();
        return stringBuilder.toString();
    }

    private static void print(JsonObject jsonObject) {
        Set<Map.Entry<String, JsonElement>> entries = jsonObject.entrySet();
        for (Map.Entry<String, JsonElement> entry : entries) {
            System.out.println(entry.getKey() + ": " + entry.getValue());
            try {
                JsonElement jsonElement = new JsonParser().parse(String.valueOf(entry.getValue()));
                JsonObject innerJsonObject = jsonElement.getAsJsonObject();
                print(innerJsonObject);
            } catch (Exception e) {
                // is not a JSON
            }
        }
    }
}
Output example:
A: {"name":"Ram","gender":"male","designation":"engineer"}
name: "Ram"
gender: "male"
designation: "engineer"
B: {"name":"Shyam","gender":"male","designation":"student"}
name: "Shyam"
gender: "male"
designation: "student"
C: {"name":"Mohan","gender":"male","designation":"manager"}
name: "Mohan"
gender: "male"
designation: "manager"