I have a test Elasticsearch 6.0 index populated with millions of records, likely to be in the billions in production. I need to search for a subset of these records and save that subset into a secondary index for later searching. I have proven this out by querying ES through Kibana; the challenge is to find the appropriate APIs in Java 8 using my Jest client (searchbox.io, version 5.3.3) to do the same. The Elasticsearch cluster is on AWS, so using a transport client is out.
POST _reindex?slices=10&wait_for_completion=false
{
  "conflicts": "proceed",
  "source": {
    "index": "my_source_idx",
    "size": 5000,
    "query": {
      "bool": {
        "filter": {
          "bool": {
            "must": [
              { "nested": {
                  "path": "test",
                  "query": {
                    "bool": {
                      "must": [
                        { "terms": { "test.RowKey": ["abc"] } },
                        { "range": { "test.dates": { "lte": "2018-01-01", "gte": "2010-08-01" } } },
                        { "range": { "test.DatesCount": { "gte": 2 } } },
                        { "script": { "script": {
                            "id": "my_painless_script",
                            "params": { "min_occurs": 1, "dateField": "test.dates", "RowKey": ["abc"], "fromDate": "2010-08-01", "toDate": "2018-01-01" }
                        } } }
                      ]
                    }
                  }
              } }
            ]
          }
        }
      }
    }
  },
  "dest": {
    "index": "my_dest_idx"
  },
  "script": {
    "source": <My painless script>
  }
}
I am aware I can perform a search on the source index, then create and bulk load the response records into the new index, but I want to be able to do this all in one shot, as I have a painless script to glean some information that is pertinent to the queries that will search the secondary index. Performance is a concern, as the application will be chaining subsequent queries together against the destination index. Does anyone know how to accomplish this using Jest?
It appears this particular functionality is not yet supported in Jest. The Jest API has a way to pass in a script (not a query) as a parameter, but even that was giving me problems.
EDIT:
After some hacking with a coworker, we found a way around this...
Step 1) Extend the GenericResultAbstractAction class, with edits to handle the script:
import java.util.HashMap;
import java.util.Map;

import org.elasticsearch.script.Script;

import com.google.common.collect.ImmutableMap;
import com.google.gson.Gson;

import io.searchbox.action.GenericResultAbstractAction;

public class GenericResultReindexActionHack extends GenericResultAbstractAction {
GenericResultReindexActionHack(GenericResultReindexActionHack.Builder builder) {
super(builder);
Map<String, Object> payload = new HashMap<>();
payload.put("source", builder.source);
payload.put("dest", builder.dest);
if (builder.conflicts != null) {
payload.put("conflicts", builder.conflicts);
}
if (builder.size != null) {
payload.put("size", builder.size);
}
if (builder.script != null) {
Script script = (Script) builder.script;
// Note the script parameter needs to be formatted differently to conform to the ES _reindex API:
payload.put("script", new Gson().toJson(ImmutableMap.of("id", script.getIdOrCode(), "params", script.getParams())));
}
this.payload = ImmutableMap.copyOf(payload);
setURI(buildURI());
}
@Override
protected String buildURI() {
return super.buildURI() + "/_reindex";
}
@Override
public String getRestMethodName() {
return "POST";
}
@Override
public String getData(Gson gson) {
if (payload == null) {
return null;
} else if (payload instanceof String) {
return (String) payload;
} else {
// We need to remove the incorrect formatting for the query, dest, and script fields:
// TODO: Need to consider spaces in the JSON
return gson.toJson(payload).replaceAll("\\\\n", "")
.replace("\\", "")
.replace("query\":\"", "query\":")
.replace("\"},\"dest\"", "},\"dest\"")
.replaceAll("\"script\":\"","\"script\":")
.replaceAll("\"}","}")
.replaceAll("},\"script\"","\"},\"script\"");
}
}
public static class Builder extends GenericResultAbstractAction.Builder<GenericResultReindexActionHack , GenericResultReindexActionHack.Builder> {
private Object source;
private Object dest;
private String conflicts;
private Long size;
private Object script;
public Builder(Object source, Object dest) {
this.source = source;
this.dest = dest;
}
public GenericResultReindexActionHack.Builder conflicts(String conflicts) {
this.conflicts = conflicts;
return this;
}
public GenericResultReindexActionHack.Builder size(Long size) {
this.size = size;
return this;
}
public GenericResultReindexActionHack.Builder script(Object script) {
this.script = script;
return this;
}
public GenericResultReindexActionHack.Builder waitForCompletion(boolean waitForCompletion) {
return setParameter("wait_for_completion", waitForCompletion);
}
public GenericResultReindexActionHack.Builder waitForActiveShards(int waitForActiveShards) {
return setParameter("wait_for_active_shards", waitForActiveShards);
}
public GenericResultReindexActionHack.Builder timeout(long timeout) {
return setParameter("timeout", timeout);
}
public GenericResultReindexActionHack.Builder requestsPerSecond(double requestsPerSecond) {
return setParameter("requests_per_second", requestsPerSecond);
}
public GenericResultReindexActionHack build() {
return new GenericResultReindexActionHack(this);
}
}
}
Step 2) Using this class with a query then requires you to pass the query in as part of the source and strip the '\n' characters:
ImmutableMap<String, Object> sourceMap = ImmutableMap.of("index", sourceIndex, "query", qb.toString().replaceAll("\\\\n", ""));
ImmutableMap<String, Object> destMap = ImmutableMap.of("index", destIndex);
GenericResultReindexActionHack reindex = new GenericResultReindexActionHack.Builder(sourceMap, destMap)
.waitForCompletion(false)
.conflicts("proceed")
.size(5000L)
.script(reindexScript)
.setParameter("slices", 10)
.build();
JestResult result = handleResult(reindex);
String task = result.getJsonString();
return (task);
Note that the reindexScript parameter is of type org.elasticsearch.script.Script.
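For reference, a stored-script reference like the one used in the query above could be built roughly as follows. This is only a sketch: it assumes the Elasticsearch 6.x Script constructor (type, lang, idOrCode, params), where the lang argument is null for stored scripts, and the parameter values are just examples.
import java.util.HashMap;
import java.util.Map;

import org.elasticsearch.script.Script;
import org.elasticsearch.script.ScriptType;

public class ReindexScriptFactory {
    // Sketch: build a reference to the stored painless script by id, with example params.
    static Script buildReindexScript() {
        Map<String, Object> params = new HashMap<>();
        params.put("min_occurs", 1);
        params.put("dateField", "test.dates");
        return new Script(ScriptType.STORED, null, "my_painless_script", params);
    }
}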
This is a messy, hack-y way of getting around the limitations of Jest, but it seems to work. I understand that by doing it this way there may be some limitations to what may be acceptable in the input formatting...
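As an alternative (not part of the original workaround): if the string manipulation in getData feels too fragile, the _reindex body from the question can also be POSTed directly over HTTP, for example with Apache HttpClient, which Jest already uses under the hood. This is a sketch under the assumption that the cluster endpoint accepts unsigned HTTPS requests; if your AWS domain requires signed requests, you would still need to route the call through your signing setup. The endpoint URL is a placeholder.
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.entity.ContentType;
import org.apache.http.entity.StringEntity;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.util.EntityUtils;

public class RawReindexCall {
    // Sketch: send the _reindex body from the question as-is, bypassing Jest's action classes.
    static String reindex(String reindexBodyJson) throws Exception {
        HttpPost post = new HttpPost("https://my-es-endpoint/_reindex?slices=10&wait_for_completion=false");
        post.setEntity(new StringEntity(reindexBodyJson, ContentType.APPLICATION_JSON));
        try (CloseableHttpClient client = HttpClients.createDefault();
             CloseableHttpResponse response = client.execute(post)) {
            // The response JSON contains the task id when wait_for_completion=false.
            return EntityUtils.toString(response.getEntity());
        }
    }
}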
I want to group exceptions with Sentry. The exceptions come from different servers, but I want all exceptions of the same type grouped together, for example, all NPEs grouped as one. I know you can extend EventBuilderHelper, and that this is how Sentry groups things, but sentry-java doesn't provide a feature to send an event with a fingerprint built from the method, error type, etc., the way other SDKs do, like this example from docs.sentry.io:
function makeRequest(method, path, options) {
return fetch(method, path, options).catch(err => {
Sentry.withScope(scope => {
// group errors together based on their request and response
scope.setFingerprint([method, path, err.statusCode]);
Sentry.captureException(err);
});
});
}
This is what I am trying to do, but in this scope I don't have access to the method, the error, etc.:
package com.test;
import io.sentry.SentryClient;
import io.sentry.event.EventBuilder;
import io.sentry.event.helper.ContextBuilderHelper;
public class FingerprintEventBuilderHelper extends ContextBuilderHelper {
private static final String EXCEPTION_TYPE = "exception_type";
public FingerprintEventBuilderHelper(SentryClient sentryClient) {
super(sentryClient);
}
@Override
public void helpBuildingEvent(EventBuilder eventBuilder) {
super.helpBuildingEvent(eventBuilder);
//Get the exception type
String exceptionType = null; // <-- how do I get the exception type here?
if (exceptionType != null) {
eventBuilder.withTag(EXCEPTION_TYPE, exceptionType);
}
//Get method information and params (paramX is pseudocode; I don't know how to obtain it either)
String paramX = null;
if (paramX != null) {
eventBuilder.withTag("PARAM", paramX);
}
}
}
The JSON sent to the server has some information about the exception, but I don't know how to get at it:
...
"release": null,
"dist": null,
"platform": "java",
"culprit": "com.sun.ejb.containers.BaseContainer in checkExceptionClientTx",
"message": "Task execution failed",
"datetime": "2019-06-26T14:13:29.000000Z",
"time_spent": null,
"tags": [
["logger", "com.test.TestService"],
["server_name", "localhost"],
["level", "error"]
],
"errors": [],
"extra": {
"Sentry-Threadname": "MainThread",
"rid": "5ff37e943-f4b4-4hc9-870b-4f8c4d18cf84"
},
"fingerprint": ["{{ default }}"],
"key_id": 3,
"metadata": {
"type": "NullPointerException",
"value": ""
},
...
You can get the type of the exception that was raised, but I have my doubts about getting the parameters of the functions in the trace:
EventBuilderHelper myEventBuilderHelper = new EventBuilderHelper() {
public void helpBuildingEvent(EventBuilder eventBuilder) {
eventBuilder.withMessage("Overwritten by myEventBuilderHelper!");
Map<String, SentryInterface> ifs = eventBuilder.getEvent().getSentryInterfaces();
if (ifs.containsKey("sentry.interfaces.Exception"))
{
ExceptionInterface exI = (ExceptionInterface) ifs.get("sentry.interfaces.Exception");
for (SentryException ex: exI.getExceptions()){
String exceptionType = ex.getExceptionClassName();
}
}
}
};
If you look at the sendException method of the client, it initializes the ExceptionInterface with the actual Throwable:
public void sendException(Throwable throwable) {
EventBuilder eventBuilder = (new EventBuilder()).withMessage(throwable.getMessage()).withLevel(Level.ERROR).withSentryInterface(new ExceptionInterface(throwable));
this.sendEvent(eventBuilder);
}
And the corresponding constructors look like this:
public ExceptionInterface(Throwable throwable) {
this(SentryException.extractExceptionQueue(throwable));
}
public ExceptionInterface(Deque<SentryException> exceptions) {
this.exceptions = exceptions;
}
So each exception gets converted to a SentryException, but the original exception is not stored. If you need the params as well, you will need to throw a custom exception carrying those parameters and also override the sendException method, which is not straightforward.
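To tie this back to the original grouping question: a minimal sketch of a helper that fingerprints events by exception class name might look like the following. It assumes sentry-java 1.x, where EventBuilder exposes withFingerprint(String...); treat it as an illustration rather than a verified drop-in.
import java.util.Map;

import io.sentry.SentryClient;
import io.sentry.event.EventBuilder;
import io.sentry.event.helper.ContextBuilderHelper;
import io.sentry.event.interfaces.ExceptionInterface;
import io.sentry.event.interfaces.SentryException;
import io.sentry.event.interfaces.SentryInterface;

public class FingerprintByTypeHelper extends ContextBuilderHelper {

    public FingerprintByTypeHelper(SentryClient sentryClient) {
        super(sentryClient);
    }

    @Override
    public void helpBuildingEvent(EventBuilder eventBuilder) {
        super.helpBuildingEvent(eventBuilder);
        Map<String, SentryInterface> ifs = eventBuilder.getEvent().getSentryInterfaces();
        SentryInterface sentryInterface = ifs.get("sentry.interfaces.Exception");
        if (sentryInterface instanceof ExceptionInterface) {
            // Use the first reported exception's class name as the fingerprint so that,
            // for example, all NullPointerExceptions end up in the same group.
            for (SentryException ex : ((ExceptionInterface) sentryInterface).getExceptions()) {
                eventBuilder.withFingerprint(ex.getExceptionClassName());
                break;
            }
        }
    }
}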
I am pretty new to Cloudant but have developed in SQL on DB2 for some time. I am running into an issue where I think I am using the Lucene query engine and Cloudant indexes to return results from my query, but only two out of three expected fields are returning data. The field that isn't returning data is an array. Our application runs on Java and is deployed on IBM Bluemix with the WebSphere Liberty Profile. I have packaged the cloudant-client-2.8.0.jar and cloudant-http-2.8.0.jar files to access the Cloudant database. We have many queries that are working, so the connection itself is fine.
Here is my code that accesses the Cloudant API:
....
Search searchSaaSData = ServiceManagerSingleton.getInstance().getSaaSCapabilitiesDatabase().search("versioning-ddoc/versioning-indx").includeDocs(true);
SearchResult<DocTypeInfo> result = searchSaaSData.querySearchResult(this.getJsonSelector(deliverableId), DocTypeInfo.class);
....
Here is the this.getJsonSelector code:
private String getVersioningSearch(String deliverableId) {
String query = "";
if (deliverableId != null && !deliverableId.isEmpty()) {
// Search based on deliverableId
query += "(";
query += "deliverableId:\"" + deliverableId + "\"" + ")";
}
logger.log(Level.INFO, "Search query is: " + query);
// The query is simple, will look like this:
// deliverableId:"0B439290AB5011E6BE74C84817AAB206")
return query;
}
As you can see from the Java code above, I am using the DocTypeInfo class to hold the data returned from the search. Here is the class:
public class DocTypeInfo {
private final String docType;
private final String version;
private String isPublishedForReports;
public DocTypeInfo(String docType, String version) {
this.docType = docType;
this.version = version;
}
/**
* @return the docType
*/
private String getDocType() {
return docType;
}
/**
* @return the version
*/
private String getVersion() {
return version;
}
/**
* @return the isPublishedForReports
*/
public String getIsPublishedForReports() {
return isPublishedForReports;
}
/**
* @param isPublishedForReports the isPublishedForReports to set
*/
public void setIsPublishedForReports(String isPublishedForReports) {
this.isPublishedForReports = isPublishedForReports;
}
}
I have set up a design doc and index using the Cloudant dashboard as follows:
{
"_id": "_design/versioning-ddoc",
"_rev": "22-0e0c0ccfc2b5fe7245352da7e5b1ebd3",
"views": {},
"language": "javascript",
"indexes": {
"versioning-indx": {
"analyzer": {
"name": "perfield",
"default": "standard",
"fields": {
"deliverableId": "whitespace",
"docType": "standard",
"version": "standard",
"isPublishedForReports": "keyword"
}
},
"index": "function (doc) {
index(\"deliverableId\", doc.deliverableId, {\"store\":true, \"boost\":1.0});
index(\"docType\", doc.docType, {\"store\":true, \"boost\":1.0});
index(\"version\", doc.version, {\"store\":true, \"boost\":1.0});
if (Array.isArray(doc.publishSettings)) {
for (var i in doc.publishSettings) {
if (doc.publishSettings[i].options) {
for (var j in doc.publishSettings[i].options) {
index(\"isPublishedForReports\", doc.publishSettings[i].options[j].optionId, {\"store\":true, \"boost\":1.0});
}
}
}
}
}"
}
}
}
When I execute the search, the docType and version fields are populated; however, the isPublishedForReports field is ALWAYS null or missing. When I run a query against the index in the Cloudant dashboard I can see the isPublishedForReports value returned, so I don't know why it's not populated in the object. Maybe I am misunderstanding how these get built?
Here is a screenshot where I query the DB and can see the results I want:
Please help!
-Doug
I think you are accessing the doc property of each row in result.rows instead of the fields property. When you run your search with .includeDocs(true) you will see a result similar to the following:
{
"total_rows":3,
"bookmark":"g1AAA",
"rows":[
{
"id":"263a81ea76528dead3a4185df3676f62",
"order":[
1.0,
0
],
"fields":{
"docType":"xxx",
"deliverableId":"yyy",
"isPublishedForReports":"pu_1_2"
},
"doc":{
"_id":"263a81ea76528dead3a4185df3676f62",
"_rev":"3-116dd3831c182fb13c12b05a8b0996e4",
"docType":"xxx",
"deliverableId":"yyy",
"publishSettings":[...]
}
}
...
]
}
Notice how you can see the fields you defined in your search index in the fields property and the full document in the doc property. The full document does not include isPublishedForReports.
To get the isPublishedForReports value you need to access the fields property:
for (SearchResult<DocTypeInfo>.SearchResultRow row : result.getRows()) {
...
row.getFields().getIsPublishedForReports()
...
}
Also, if you don't need the whole doc you can set .includeDocs(false) and only access the fields property.
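Putting it together, the loop over the search result might look roughly like this. It is a sketch built from the calls already shown above (cloudant-client 2.x); the variable names are illustrative.
// Sketch: read isPublishedForReports from the indexed fields, not from the document body.
Search search = ServiceManagerSingleton.getInstance()
        .getSaaSCapabilitiesDatabase()
        .search("versioning-ddoc/versioning-indx")
        .includeDocs(false); // the doc body is not needed for the indexed fields

SearchResult<DocTypeInfo> result =
        search.querySearchResult(this.getJsonSelector(deliverableId), DocTypeInfo.class);

for (SearchResult<DocTypeInfo>.SearchResultRow row : result.getRows()) {
    DocTypeInfo fields = row.getFields();
    String published = fields.getIsPublishedForReports();
    // ... use published ...
}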
I'm trying to make my Android app download images from AWS S3. However, the following exception keeps coming up:
com.amazonaws.AmazonServiceException: Request ARN is invalid (Service: AWSSecurityTokenService; Status Code: 400; Error Code: ValidationError; Request ID: 3481bd5f-1db2-11e5-8442-cb6f713243b6)
at com.amazonaws.http.AmazonHttpClient.handleErrorResponse(AmazonHttpClient.java:710)
at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:385)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:196)
at com.amazonaws.services.securitytoken.AWSSecurityTokenServiceClient.invoke(AWSSecurityTokenServiceClient.java:875)
at com.amazonaws.services.securitytoken.AWSSecurityTokenServiceClient.assumeRoleWithWebIdentity(AWSSecurityTokenServiceClient.java:496)
at com.amazonaws.auth.CognitoCredentialsProvider.populateCredentialsWithSts(CognitoCredentialsProvider.java:671)
at com.amazonaws.auth.CognitoCredentialsProvider.startSession(CognitoCredentialsProvider.java:555)
at com.amazonaws.auth.CognitoCredentialsProvider.refresh(CognitoCredentialsProvider.java:503)
at com.application.app.utils.helper.S3Utils.getCredProvider(S3Utils.java:35)
at com.application.app.utils.helper.S3Utils.getS3Client(S3Utils.java:45)
at com.application.app.integration.volley.CustomImageRequest.parseNetworkError(CustomImageRequest.java:73)
at com.android.volley.NetworkDispatcher.parseAndDeliverNetworkError(NetworkDispatcher.java:144)
at com.android.volley.NetworkDispatcher.run(NetworkDispatcher.java:135)
I have a bucket and an identity pool, and I have created the required roles.
My Cognito_APPUnauth_Role has the following INLINE POLICY:
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "Stmt1435504517000",
"Effect": "Allow",
"Action": [
"s3:GetObject",
"s3:ListBucket"
],
"Resource": [
"arn:aws:s3:::mybucket/*"
]
}
]
}
I have a Java class named S3Utils with some helper methods.
public class S3Utils {
private static AmazonS3Client sS3Client;
private static CognitoCachingCredentialsProvider sCredProvider;
public static CognitoCachingCredentialsProvider getCredProvider(Context context){
if (sCredProvider == null) {
sCredProvider = new CognitoCachingCredentialsProvider(
context,
Definitions.AWS_ACCOUNT_ID,
Definitions.COGNITO_POOL_ID,
Definitions.COGNITO_ROLE_UNAUTH,
null,
Regions.US_EAST_1
);
}
sCredProvider.refresh();
return sCredProvider;
}
public static String getPrefix(Context context) {
return getCredProvider(context).getIdentityId() + "/";
}
public static AmazonS3Client getS3Client(Context context) {
if (sS3Client == null) {
sS3Client = new AmazonS3Client(getCredProvider(context));
}
return sS3Client;
}
public static String getFileName(String path) {
return path.substring(path.lastIndexOf("/") + 1);
}
public static boolean doesBucketExist() {
return sS3Client.doesBucketExist(Definitions.BUCKET_NAME.toLowerCase(Locale.US));
}
public static void createBucket() {
sS3Client.createBucket(Definitions.BUCKET_NAME.toLowerCase(Locale.US));
}
public static void deleteBucket() {
String name = Definitions.BUCKET_NAME.toLowerCase(Locale.US);
List<S3ObjectSummary> objData = sS3Client.listObjects(name).getObjectSummaries();
if (objData.size() > 0) {
DeleteObjectsRequest emptyBucket = new DeleteObjectsRequest(name);
List<DeleteObjectsRequest.KeyVersion> keyList = new ArrayList<DeleteObjectsRequest.KeyVersion>();
for (S3ObjectSummary summary : objData) {
keyList.add(new DeleteObjectsRequest.KeyVersion(summary.getKey()));
}
emptyBucket.withKeys(keyList);
sS3Client.deleteObjects(emptyBucket);
}
sS3Client.deleteBucket(name);
}
}
Part of the method where the exception occurs, in CustomImageRequest.java:
s3Client = S3Utils.getS3Client(context);
ObjectListing objects = s3Client.listObjects(new ListObjectsRequest().withBucketName(Definitions.BUCKET_NAME).withPrefix(this.urlToRetrieve));
List<S3ObjectSummary> objectSummaries = objects.getObjectSummaries();
//This isn't just an id, it is a full picture name in S3 bucket.
for (S3ObjectSummary summary : objectSummaries)
{
String key = summary.getKey();
if (!key.equals(this.urlToRetrieve)) continue;
S3ObjectInputStream content = s3Client.getObject(Definitions.BUCKET_NAME, key).getObjectContent();
try {
this.s3Image = IOUtils.toByteArray(content);
} catch (IOException e) {
}
return new Object();
}
What am I doing wrong that causes this exception to be thrown every time? Thanks in advance.
I'm guessing there might be an error in the role ARN you specified. An IAM role ARN should look something like
arn:aws:iam::ACCOUNTNUMBER:role/Cognito_APPUnauth_Role
If it is misspelled or part of it is left off, you may get this error. You may also want to consider using the new CognitoCachingCredentialsProvider constructor:
sCredProvider = new CognitoCachingCredentialsProvider(
context,
Definitions.COGNITO_POOL_ID,
Regions.US_EAST_1
);
Note, however, that with this constructor you have to make sure you have specified your role ARN in the Cognito console; doing so should help prevent this issue in the future.
Edited for clarity, formatting, and added that you need to modify your ARN in the console if using new constructor.
In short: I am trying to find some API that can change a value by taking the JSON string as the first parameter, a JSONPath as the second parameter, and the new value as the third. But all I have found is this:
https://code.google.com/p/json-path/
This API allows me to find any value in a JSON string, but I am not finding an easy way to update the value of a key. For example, here is book.json:
{
"store":{
"book":[
{
"category":"reference",
"author":"Nigel Rees",
"title":"Sayings of the Century",
"price":8.95
},
{
"category":"fiction",
"author":"Evelyn Waugh",
"title":"Sword of Honour",
"price":12.99,
"isbn":"0-553-21311-3"
}
],
"bicycle":{
"color":"red",
"price":19.95
}
}
}
I can access the color of the bicycle by doing this:
String bicycleColor = JsonPath.read(json, "$.store.bicycle.color");
But I am looking for a method in JsonPath or another API, something like this:
JsonPath.changeNodeValue(json, "$.store.bicycle.color", "green");
String bicycleColor = JsonPath.read(json, "$.store.bicycle.color");
System.out.println(bicycleColor); // This should print "green" now.
I am excluding these options:
Create a new JSON string.
Create a JSON object, change the value there, and convert it back to a JSON string.
Reason: I have about 500 different requests for different types of services, each returning a different JSON structure, so I do not want to manually create a new JSON string every time. Also, the IDs in the JSON structures are dynamic.
Any idea or direction is much appreciated.
Updating this question with the following answer.
Copy MutableJson.java.
Copy this little snippet and modify it as per your need:
private static void updateJsonValue() {
JSONParser parser = new JSONParser();
JSONObject jsonObject = new JSONObject();
FileReader reader = null;
try {
File jsonFile = new File("path to book.json");
reader = new FileReader(jsonFile);
jsonObject = (JSONObject) parser.parse(reader);
} catch (Exception ex) {
System.out.println(ex.getLocalizedMessage());
}
Map<String, Object> userData = null;
try {
userData = new ObjectMapper().readValue(jsonObject.toJSONString(), Map.class);
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
MutableJson json = new MutableJson(userData);
System.out.println("Before:\t" + json.map());
json.update("$.store.book[0].author", "jigish");
json.update("$.store.book[1].category", "action");
System.out.println("After:\t" + json.map().toString());
}
Use these imports:
import org.json.simple.JSONObject;
import org.json.simple.parser.JSONParser;
import org.codehaus.jackson.map.ObjectMapper;
The thing is that the functionality you want is already an undocumented feature of JsonPath. Example using your JSON structure:
String json = "{ \"store\":{ \"book\":[ { \"category\":\"reference\", \"author\":\"Nigel Rees\", \"title\":\"Sayings of the Century\", \"price\":8.95 }, { \"category\":\"fiction\", \"author\":\"Evelyn Waugh\", \"title\":\"Sword of Honour\", \"price\":12.99, \"isbn\":\"0-553-21311-3\" } ], \"bicycle\":{ \"color\":\"red\", \"price\":19.95 } } }";
DocumentContext doc = JsonPath.parse(json).
set("$.store.bicycle.color", "green").
set("$.store.book[0].price", 9.5);
String newJson = new Gson().toJson(doc.read("$"));
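If the extra Gson round-trip is undesirable, the mutated document can also be serialized directly; a minimal alternative, assuming a json-path version where DocumentContext exposes jsonString():
String newJson = doc.jsonString(); // serialize the mutated document without Gson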
Assuming that parsed JSON can be represented in memory as a Map, you can build an API similar to JsonPath that looks like:
void update(Map<String, Object> json, String path, Object newValue);
I've quickly done a gist of a dirty implementation for simple, specific paths (no support for conditions and wildcards) that can traverse the JSON tree, e.g. $.store.name, $.store.books[0].isbn. Here it is: MutableJson.java. It definitely needs improvement, but it can give you a good start.
Usage example:
import java.util.*;
public class MutableJson {
public static void main(String[] args) {
MutableJson json = new MutableJson(
new HashMap<String, Object>() {{
put("store", new HashMap<String, Object>() {{
put("name", "Some Store");
put("books", Arrays.asList(
new HashMap<String, Object>() {{
put("isbn", "111");
}},
new HashMap<String, Object>() {{
put("isbn", "222");
}}
));
}});
}}
);
System.out.println("Before:\t" + json.map());
json.update("$.store.name", "Book Store");
json.update("$.store.books[0].isbn", "444");
json.update("$.store.books[1].isbn", "555");
System.out.println("After:\t" + json.map());
}
private final Map<String, Object> json;
public MutableJson(Map<String, Object> json) {
this.json = json;
}
public Map<String, Object> map() {
return json;
}
public void update(String path, Object newValue) {
updateJson(this.json, Path.parse(path), newValue);
}
private void updateJson(Map<String, Object> data, Iterator<Token> path, Object newValue) {
Token token = path.next();
for (Map.Entry<String, Object> entry : data.entrySet()) {
if (!token.accept(entry.getKey(), entry.getValue())) {
continue;
}
if (path.hasNext()) {
Object value = token.value(entry.getValue());
if (value instanceof Map) {
updateJson((Map<String, Object>) value, path, newValue);
}
} else {
token.update(entry, newValue);
}
}
}
}
class Path {
public static Iterator<Token> parse(String path) {
if (path.isEmpty()) {
return Collections.<Token>emptyList().iterator();
}
if (path.startsWith("$.")) {
path = path.substring(2);
}
List<Token> tokens = new ArrayList<>();
for (String part : path.split("\\.")) {
if (part.matches("\\w+\\[\\d+\\]")) {
String fieldName = part.substring(0, part.indexOf('['));
int index = Integer.parseInt(part.substring(part.indexOf('[')+1, part.indexOf(']')));
tokens.add(new ArrayToken(fieldName, index));
} else {
tokens.add(new FieldToken(part));
}
};
return tokens.iterator();
}
}
abstract class Token {
protected final String fieldName;
Token(String fieldName) {
this.fieldName = fieldName;
}
public abstract Object value(Object value);
public abstract boolean accept(String key, Object value);
public abstract void update(Map.Entry<String, Object> entry, Object newValue);
}
class FieldToken extends Token {
FieldToken(String fieldName) {
super(fieldName);
}
@Override
public Object value(Object value) {
return value;
}
@Override
public boolean accept(String key, Object value) {
return fieldName.equals(key);
}
@Override
public void update(Map.Entry<String, Object> entry, Object newValue) {
entry.setValue(newValue);
}
}
class ArrayToken extends Token {
private final int index;
ArrayToken(String fieldName, int index) {
super(fieldName);
this.index = index;
}
@Override
public Object value(Object value) {
return ((List) value).get(index);
}
@Override
public boolean accept(String key, Object value) {
return fieldName.equals(key) && value instanceof List && ((List) value).size() > index;
}
@Override
public void update(Map.Entry<String, Object> entry, Object newValue) {
List list = (List) entry.getValue();
list.set(index, newValue);
}
}
A JSON string can be easily parsed into a Map using Jackson:
Map<String,Object> userData = new ObjectMapper().readValue("{ \"store\": ... }", Map.class);
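Putting the pieces together, the end-to-end flow could look roughly like this. It is a sketch that reuses the MutableJson class above; it uses the Jackson 2.x ObjectMapper coordinates, but the older org.codehaus.jackson ObjectMapper shown earlier works the same way.
import java.util.Map;

import com.fasterxml.jackson.databind.ObjectMapper;

public class UpdateByPathExample {
    public static void main(String[] args) throws Exception {
        String json = "{ \"store\": { \"bicycle\": { \"color\": \"red\", \"price\": 19.95 } } }";

        // 1. Parse the JSON string into a mutable Map tree.
        ObjectMapper mapper = new ObjectMapper();
        Map<String, Object> tree = mapper.readValue(json, Map.class);

        // 2. Update a value by path using the MutableJson helper from above.
        MutableJson mutable = new MutableJson(tree);
        mutable.update("$.store.bicycle.color", "green");

        // 3. Serialize the tree back into a JSON string.
        String updated = mapper.writeValueAsString(mutable.map());
        System.out.println(updated);
    }
}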
Just answering for folks landing on this page in the future, for reference.
You could consider using a Java implementation of JSON Patch. The RFC can be found here:
JSON Patch is a format for describing changes to a JSON document. It can be used to avoid sending a whole document when only a part has changed. When used in combination with the HTTP PATCH method it allows partial updates for HTTP APIs in a standards compliant way.
You can specify the operation that needs to be performed (replace, add, ...), the path at which it has to be performed, and the value that should be used.
Again, taking an example from the RFC:
[
{ "op": "test", "path": "/a/b/c", "value": "foo" },
{ "op": "remove", "path": "/a/b/c" },
{ "op": "add", "path": "/a/b/c", "value": [ "foo", "bar" ] },
{ "op": "replace", "path": "/a/b/c", "value": 42 },
{ "op": "move", "from": "/a/b/c", "path": "/a/b/d" },
{ "op": "copy", "from": "/a/b/d", "path": "/a/b/e" }
]
For a Java implementation, I have not used it myself, but you can give https://github.com/fge/json-patch a try.
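For reference, a minimal sketch of applying such a patch with that library might look like the following. It assumes the library's Jackson-based API (JsonPatch.fromJson / apply); check the project README for the exact artifact coordinates and package names before relying on it.
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.github.fge.jsonpatch.JsonPatch;

public class JsonPatchExample {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();

        JsonNode target = mapper.readTree(
                "{ \"store\": { \"bicycle\": { \"color\": \"red\", \"price\": 19.95 } } }");

        // A single "replace" operation, as described in the RFC (paths are JSON Pointers).
        JsonNode patchNode = mapper.readTree(
                "[ { \"op\": \"replace\", \"path\": \"/store/bicycle/color\", \"value\": \"green\" } ]");

        JsonPatch patch = JsonPatch.fromJson(patchNode);
        JsonNode patched = patch.apply(target);

        System.out.println(mapper.writeValueAsString(patched));
    }
}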
So in order to change a value within a JSON string, there are two steps:
Parse the JSon
Modify the appropriate field
You are trying to optimize step 2, but understand that you are not going to be able to avoid step 1. Looking at the json-path source code (which, really, is just a wrapper around Jackson), note that it does a full parse of the JSON string before being able to spit out the read value. It does this parse every time you call read(), i.e. it is not cached.
I think this task is specific enough that you're going to have to write it yourself. Here is what I would do:
Create an object that represents the data in the parsed Json string.
Make sure this object has, as part of its fields, the JSON string pieces that you do not expect to change often.
Create a custom Deserializer in the Json framework of your choice that will populate the fields correctly.
Create a custom Serializer that uses the cached String pieces, plus the data that you expect to change
I think the exact scope of your problem is unusual enough that it is unlikely a library already exists for this. When a program receives a JSON string, most of the time what it wants is the fully deserialized object; it is unusual that it needs to FORWARD this object on to somewhere else.
With Rhino 1.7R4, we can create properties in JavaScript using the Object.defineProperty() method.
public class MyGlobalObject : org.mozilla.javascript.ScriptableObject
{
public static org.mozilla.javascript.Script ___compiledScript = null;
public MyGlobalObject()
{
org.mozilla.javascript.Context con = org.mozilla.javascript.Context.enter();
try
{
con.initStandardObjects(this);
string strScript = "Object.defineProperty(this,\r\n 'onload', \r\n{ set : function(val){this.set_onload(val);},\r\n get : function(){return this.get_onload();}, enumerable: true, configurable: true});";
this.defineFunctionProperties(new string[] { "set_onload", "get_onload" }, typeof(MyGlobalObject), org.mozilla.javascript.ScriptableObject.DONTENUM);
org.mozilla.javascript.Script sc = con.compileString(strScript, "", 1, null);
object result_onload = con.evaluateString(this, "this.onload == undefined;", "", 1, null); // make sure it is not defined.
Console.WriteLine("onload is undefined? : {0}", result_onload);
// Define Properties Now.
sc.exec(con, this);
con.evaluateString(this, "this.onload= function(){var t1 = 1;};", "", 1, null);
object onloadobjectXYZ = con.evaluateString(this, "this.onload;", "", 1, null); // get function now.
Console.WriteLine("Onload object : {0} is found", onloadobjectXYZ);
}
catch (Exception ex)
{
Console.WriteLine(ex.ToString());
}
org.mozilla.javascript.Context.exit();
}
private object __onloadFunction;
public object get_onload()
{
Console.WriteLine("get_onload() called!");
return this.__onloadFunction;
}
//[org.mozilla.javascript.annotations.JSSetter]
public void set_onload(object _val)
{
Console.WriteLine("set_onload() called!");
this.__onloadFunction = _val;
}
public override string getClassName()
{
return "Global";
}
}
How can I create a FunctionObject identical to "onloadobjectXYZ" through pure Rhino object operations (not by evaluating a script like 'strScript')? It seems it may be possible to create a FunctionObject for the setter and getter, but I could not find a good example. Does anyone know how to define such properties?
Thank you in advance!
defineProperty with Java Method setter/getter objects is slightly different from Object.defineProperty():
this.defineProperty("onload", null, javaonloadGetMethod, javaonloadSetMethod, ScriptableObject.PERMANENT);
This works for me as a workaround.
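For illustration, a minimal sketch of wiring that up from Java might look like this. The method names mirror the get_onload/set_onload pair from the question; the reflection lookups and the attribute flag are assumptions to verify against the Rhino javadoc for ScriptableObject.defineProperty(String, Object, Method, Method, int).
import java.lang.reflect.Method;

import org.mozilla.javascript.ScriptableObject;

public class GlobalObject extends ScriptableObject {

    private Object onloadFunction;

    @Override
    public String getClassName() {
        return "Global";
    }

    public Object get_onload() {
        return onloadFunction;
    }

    public void set_onload(Object value) {
        this.onloadFunction = value;
    }

    // Define "onload" with explicit getter/setter Methods instead of evaluating
    // an Object.defineProperty(...) script. delegateTo is null, so the Methods
    // must belong to this class.
    public void defineOnloadProperty() throws NoSuchMethodException {
        Method getter = GlobalObject.class.getMethod("get_onload");
        Method setter = GlobalObject.class.getMethod("set_onload", Object.class);
        this.defineProperty("onload", null, getter, setter, ScriptableObject.PERMANENT);
    }
}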