How to get the ID of JSON objects in Java

I have previously used GSON, which automatically transfers the data as long as my custom object has a variable with the same name. However, this time I'm also interested in the name, or ID, of the object. The object only contains a single long. Example of how it looks:
{"1":123,
"2":124,
"4":125,
"5":126,
"6":127}
As you can see, the list doesn't necessarily contain all sequential IDs, so I cannot just create a list. How would you solve this problem?

Instead of deserializing to a specific custom object, just deserialize to Map<String, Integer>:
Type type = new TypeToken<Map<String, Integer>>(){}.getType();
Map<String, Integer> result = gson.fromJson(jsonString, type);
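A more complete, runnable sketch of that approach (the class name and sample string below are illustrative, not from the original post):
import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;

import java.lang.reflect.Type;
import java.util.Map;

public class IdExample {
    public static void main(String[] args) {
        String jsonString = "{\"1\":123,\"2\":124,\"4\":125,\"5\":126,\"6\":127}";
        Gson gson = new Gson();
        // Deserialize the whole object into a map keyed by the member names (the IDs)
        Type type = new TypeToken<Map<String, Integer>>(){}.getType();
        Map<String, Integer> result = gson.fromJson(jsonString, type);
        // Each entry now pairs an ID with its value, e.g. "1" -> 123
        for (Map.Entry<String, Integer> entry : result.entrySet()) {
            System.out.println("id=" + entry.getKey() + " value=" + entry.getValue());
        }
    }
}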

Use Jython:
import json
json_data = json.loads(your_string_above)
ids = json_data.keys()
# ids now contains [u'1', u'2', u'4', u'5', u'6']
Hope that helps.
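Since the question is about Java, a pure-Gson equivalent of that key extraction is also possible. A sketch, assuming a reasonably recent Gson version (JsonParser.parseString and JsonObject.keySet() are not available in very old releases):
import com.google.gson.JsonObject;
import com.google.gson.JsonParser;

import java.util.Set;

public class KeyExample {
    public static void main(String[] args) {
        String jsonString = "{\"1\":123,\"2\":124,\"4\":125,\"5\":126,\"6\":127}";
        // Parse the document as a generic JSON object and read its member names
        JsonObject obj = JsonParser.parseString(jsonString).getAsJsonObject();
        Set<String> ids = obj.keySet();
        // ids now contains ["1", "2", "4", "5", "6"]
        System.out.println(ids);
    }
}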

Related

how to obtain Explain object from solrj?

I'm switching from Lucene to Solr and I'm querying it with SolrJ.
After a query I'd like to obtain the explain information, so I executed this code:
QueryResponse queryResult = solrClient.query(collection, query, POST);
List<SolrDocument> recommended = queryResult.getResults();
Map<String, String> explainMap = queryResult.getExplainMap();
Map<String, Object> debugMap = queryResult.getDebugMap();
The explainMap contains the explanation of my query as a String.
Is it possible to obtain an object of explanation to avoid parsing the string?
For example, I'd like to extract all the fields that matched and their values (without the score and its composition).
My documents have only one textual field and many categorical or numeric fields.

How to convert an Item (dynamodbv2.document.Item) object into JSON in Java?

I'm implementing a Lambda function in AWS. I use DynamoDB to store data, and the application is written in Java. The function gets an item from DynamoDB and returns it as the response. I want to return its values as JSON for the response. I use the following code, but it returns {"empty":false} in the AWS Lambda test. When I return it as a String it prints the values, but I need it as JSON.
Table table = dynamoDb.getTable(DYNAMODB_TABLE_NAME);
Item searchedItem = table.getItem("name", input.getName());
String name = searchedItem.getString("name");
int count = searchedItem.getInt("count");
Map<String, Object> jsonMap = new HashMap<>();
jsonMap.put("name", name);
jsonMap.put("count", count);
JSONObject json = new JSONObject(searchedItem.toJSONPretty());
for (String key : jsonMap.keySet()) {
    json.put(key, jsonMap.get(key));
}
return json;
I expect the result to contain the values from DynamoDB, but it returns {"empty":false}.
Finally I fixed the problem. AWS Lambda automatically converts between Java model objects and JSON, so there is no need to do it manually. Just return the result as a corresponding Java POJO and you'll get the JSON output in AWS, as sketched below. Thanks everyone for commenting. Cheers.
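A minimal sketch of that POJO approach, assuming a RequestHandler-based Lambda (the class and field names here are hypothetical, not from the original code):
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

public class GetItemHandler implements RequestHandler<Request, ItemResponse> {

    @Override
    public ItemResponse handleRequest(Request input, Context context) {
        // Look up the item as before, then copy the values into a plain POJO;
        // the Lambda runtime serializes the returned object to JSON automatically.
        ItemResponse response = new ItemResponse();
        response.setName("exampleName"); // e.g. searchedItem.getString("name")
        response.setCount(42);           // e.g. searchedItem.getInt("count")
        return response;
    }
}

// Plain POJO with getters/setters so the Lambda runtime can serialize it
class ItemResponse {
    private String name;
    private int count;
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public int getCount() { return count; }
    public void setCount(int count) { this.count = count; }
}

class Request {
    private String name;
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}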

Deserializing Primary Key Value with Underscore: Unexpected character Expected space separating root-level values

In Java, using the Jackson ObjectMapper, I'm trying to deserialize a dynamo db object being read from a dynamo db stream.
I first call:
record.getDynamodb().getNewImage().get("primaryKey").getS().toString()
to get the primaryKey value of "1_12345" back from the stream.
I then use it in the ObjectMapper to create a new instance of the Metrics object with the primaryKey member set:
objectMapper.readValue("1_12345", Metrics.class);
The problem is I get an exception on that call:
Unexpected character ('_' (code 95)): Expected space separating root-level values
Metrics.class is a simple POJO with no constructor. I'm wondering if I need any special annotations or escape characters in my readValue call. I can't seem to find any clear indications on what the solution is in the case of this error.
(Side note: the reason I can't parse it straight from the JSON is that the JSON's structure, when parsed from the stream, isn't straightforward; a value looks like this, with S indicating String, N for number, etc.:
{primaryKey={S: 1_12345,}, rangeKey={N: xxx}... etc. })
Thank you, that was the problem: the readValue() call takes a String in JSON format. The solution was to convert the streamed DynamoDB image into lists and maps (using the dynamodbv2 libs) until it was in the correct format, as below:
import com.amazonaws.services.dynamodbv2.document.Item;
import com.amazonaws.services.dynamodbv2.document.internal.InternalUtils;
import com.amazonaws.services.dynamodbv2.model.AttributeValue;
import java.util.*;

// Wrap the streamed new image in a list so it can be converted to Item objects
Map<String, AttributeValue> newImage = record.getDynamodb().getNewImage();
List<Map<String, AttributeValue>> listOfMaps = new ArrayList<Map<String, AttributeValue>>();
listOfMaps.add(newImage);
List<Item> itemList = InternalUtils.toItemList(listOfMaps);
for (Item item : itemList) {
    // Item.toJSON() yields plain JSON that Jackson can map onto the POJO
    String json = item.toJSON();
    Metrics metric = objectMapper.readValue(json, Metrics.class);
}

Storing Map with Java Cloudant API

I am using Java Cloudant API to store data in IBM Cloudant.
I am trying to store a null value for a key in a HashMap in a document.
Following is the code to insert a HashMap in the DB.
Map<String, Object> map = new HashMap<String, Object>();
map.put("field1", "value1");
map.put("field2", null);
db.save(map);
The document is stored successfully, but the document does not contain the field2 key at all.
(i.e.) When the value of a key in a hashmap is null, the corresponding key is not stored in Cloudant.
Any ideas on this? Where am I going wrong?
How does Cloudant store Java objects internally?
Does it convert Java objects to String internally?
Firstly, Cloudant doesn't store Java objects; it is a JSON document store. The library uses GSON to serialise Java objects into a JSON document, so the data it sends to Cloudant has the form:
{
  "_id": "aDocId",
  "field1": "value1"
}
Looking at the code for the library, it doesn't include the serializeNulls() method when creating the GSON builder. You can work around this by providing your own GsonBuilder instance to the ClientBuilder before calling the build method to create the client.
Example:
CloudantClient client = ClientBuilder.account("example")
        .username("example")
        .password("password")
        .gsonBuilder(new GsonBuilder().serializeNulls())
        .build();
That being said, it may cause errors and unexpected behaviour, because serialising nulls has not been tested.

Parse JSON record to extract key and value and put into Map in java

I have one column in my table which stores data in string format. The sample data is:
{"pre-date":{"enable":true,"days":"3","interval":"1","mail-template":"582"},"on-date":{"enabled":false},"post-date":{"enabled":false}}
The string contains JSON-like data, but when I send this record from the controller to the view, it should be in this format:
enable : true
days : 3
interval : 1
so that I can set the values on the respective form elements. How can I do this in Java? Any help is appreciated.
Read the complete JSON string from the database, then parse it using a JSON parser, and extract the information you're interested into from the data structure/object returned from the parsing.
There are lots of JSON parsers available. Look at this page, which lists a number of them in the Java section (you have to scroll a little bit down).
Jackson provides the best support for simple conversion of any JSON object into a Java Map comprised of only Java SE components.
Following is an example using the JSON from the original question.
// {"pre-date":{"enable":true,"days":"3","interval":"1","mail-template":"582"},"on-date":{"enabled":false},"post-date":{"enabled":false}}
String json = "{\"pre-date\":{\"enable\":true,\"days\":\"3\",\"interval\":\"1\",\"mail-template\":\"582\"},\"on-date\":{\"enabled\":false},\"post-date\":{\"enabled\":false}}";
ObjectMapper mapper = new ObjectMapper();
// To put all of the JSON in a Map<String, Object>
Map<String, Object> map = mapper.readValue(json, Map.class);
// Accessing the three target data elements
Map<String, Object> preDateMap = (Map) map.get("pre-date");
System.out.println(preDateMap.get("enable"));
System.out.println(preDateMap.get("days"));
System.out.println(preDateMap.get("interval"));
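If you'd rather avoid the raw Map cast, Jackson can also deserialize straight into a typed nested map via a TypeReference. A small variation on the code above, not part of the original answer:
import com.fasterxml.jackson.core.type.TypeReference;

// Deserialize into a nested map type so no raw cast is needed
Map<String, Map<String, Object>> typed =
        mapper.readValue(json, new TypeReference<Map<String, Map<String, Object>>>() {});
System.out.println(typed.get("pre-date").get("enable"));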
