Is it possible to change the value of a Range Key in a DynamoDB table? - java

I know it may be a very silly question, but I am new to DynamoDB.
My question is: is it possible to update the value of a range key in DynamoDB?
Suppose my table is "TEST":
{
ID : PK/HK
Date : RK
Name : GSI
Add : LSI
}
I want to modify the Date attribute.
The initial values in the table were:
{
ID = "344"
Date = "5656"
Name = "ABC"
}
Running the code below, I am able to change the Name attribute, which is a GSI key.
// Key of the item to update (hash key ID plus range key Date)
Map<String,AttributeValue> item = new HashMap<String,AttributeValue>();
item.put("ID", new AttributeValue("344"));
item.put("Date", new AttributeValue("5656"));
// PUT a new value for the Name attribute
Map<String,AttributeValueUpdate> item1 = new HashMap<String,AttributeValueUpdate>();
AttributeValueUpdate update = new AttributeValueUpdate().withValue(new AttributeValue("AMIT")).withAction("PUT");
item1.put("Name", update);
UpdateItemRequest updateItemreq = new UpdateItemRequest("Test", item, item1);
UpdateItemResult updateItemres = dynamoDBUSEast.updateItem(updateItemreq);
But when I replace this line:
item1.put("Name", update);
with:
item1.put("Date", update);
I get the following error:
Exception in thread "main" com.amazonaws.AmazonServiceException: One or more parameter values were invalid: Cannot update attribute Date. This attribute is part of the key (Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ValidationException; Request ID: HRRP24Q7C48AMD8ASAI992L6MBVV4KQNSO5AEMVJF66Q9ASUAAJG)
at com.amazonaws.http.AmazonHttpClient.handleErrorResponse(AmazonHttpClient.java:820)
at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:439)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:245)
at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.invoke(AmazonDynamoDBClient.java:2908)
at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.updateItem(AmazonDynamoDBClient.java:1256)
So, is it possible to change the range key value?

No. As the exception message states, you "Cannot update attribute Date. This attribute is part of the key."
You can also see this under the AttributeUpdates documentation:
The names of attributes to be modified, the action to perform on each,
and the new value for each. If you are updating an attribute that is
an index key attribute for any indexes on that table, the attribute
type must match the index key type defined in the AttributesDefinition
of the table description. You can use UpdateItem to update any nonkey
attributes.
Note the wording: you can update "an attribute that is an index key attribute for any indexes on that table", just not a table key attribute. When you update an attribute that is projected onto an index, even if it is part of that index's key, the index is also updated to reflect the new item.

From the docs of AttributeValueUpdate:
You cannot use UpdateItem to update any primary key attributes.
Instead, you will need to delete the item, and then use PutItem to
create a new item with new attributes.

It's a little buried, but the docs for UpdateItem say:
"You can use UpdateItem to update any nonkey attributes."
So, currently the only way to update the primary key of an item is to delete the old item and write a new one.
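In Java, that delete-then-put pattern looks roughly like this (a sketch against the same v2 low-level client as in the question; the new Date value "7878" is just a placeholder, and note the two calls are not atomic):
// Read-and-delete the old item in one call by requesting ALL_OLD.
Map<String, AttributeValue> key = new HashMap<String, AttributeValue>();
key.put("ID", new AttributeValue("344"));
key.put("Date", new AttributeValue("5656"));
DeleteItemRequest deleteReq = new DeleteItemRequest()
        .withTableName("Test")
        .withKey(key)
        .withReturnValues("ALL_OLD");
Map<String, AttributeValue> oldItem = dynamoDBUSEast.deleteItem(deleteReq).getAttributes();
// Re-create the item under the new range key value.
Map<String, AttributeValue> newItem = new HashMap<String, AttributeValue>(oldItem);
newItem.put("Date", new AttributeValue("7878"));
dynamoDBUSEast.putItem(new PutItemRequest("Test", newItem));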

Here is my implementation of updating an id in .NET by deleting the item and then recreating it with the new id. I assume Java is very similar:
// Based on https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/LowLevelDotNetItemsExample.html
public class UpdateId
{
    private static string tableName = "MyTableName";
    private static AmazonDynamoDBClient client = new AmazonDynamoDBClient();
    private static bool isVerbose = false;

    public static void ChangeId(string currentId, string newId)
    {
        try
        {
            var deletedItem = DeleteItem(currentId);
            if (deletedItem.Count == 0)
            {
                Console.WriteLine($"ERROR: Item to delete not found: {currentId}");
                return;
            }
            deletedItem["Id"] = new AttributeValue
            {
                S = newId
            };
            CreateItem(deletedItem);
            var updatedItem = RetrieveItem(newId);
            if (updatedItem.Count > 0 && updatedItem["Id"].S == newId)
            {
                Console.WriteLine($"Item id successfully changed from ({currentId}) to ({newId})");
            }
            else
            {
                Console.WriteLine($"ERROR: Item id didn't change from ({currentId}) to ({newId})");
            }
        }
        catch (Exception e)
        {
            Console.WriteLine(e.Message);
            Console.WriteLine("To continue, press Enter");
            Console.ReadLine();
        }
    }

    private static void CreateItem(Dictionary<string, AttributeValue> item)
    {
        var request = new PutItemRequest
        {
            TableName = tableName,
            Item = item
        };
        client.PutItem(request);
    }

    private static Dictionary<string, AttributeValue> RetrieveItem(string id)
    {
        var request = new GetItemRequest
        {
            TableName = tableName,
            Key = new Dictionary<string, AttributeValue>()
            {
                { "Id", new AttributeValue { S = id } }
            },
            ConsistentRead = true
        };
        var response = client.GetItem(request);
        // Check the response.
        var attributeList = response.Item; // attribute list in the response.
        if (isVerbose)
        {
            Console.WriteLine("\nPrinting item after retrieving it ............");
            PrintItem(attributeList);
        }
        return attributeList;
    }

    private static Dictionary<string, AttributeValue> DeleteItem(string id)
    {
        var request = new DeleteItemRequest
        {
            TableName = tableName,
            Key = new Dictionary<string, AttributeValue>()
            {
                { "Id", new AttributeValue { S = id } }
            },
            // Return the entire item as it appeared before the update.
            ReturnValues = "ALL_OLD",
            // ExpressionAttributeNames = new Dictionary<string, string>()
            // {
            //     {"#IP", "InPublication"}
            // },
            // ExpressionAttributeValues = new Dictionary<string, AttributeValue>()
            // {
            //     {":inpub", new AttributeValue { BOOL = false }}
            // },
            // ConditionExpression = "#IP = :inpub"
        };
        var response = client.DeleteItem(request);
        // Check the response.
        var attributeList = response.Attributes; // Attribute list in the response.
        // Print item.
        if (isVerbose)
        {
            Console.WriteLine("\nPrinting item that was just deleted ............");
            PrintItem(attributeList);
        }
        return attributeList;
    }

    private static void PrintItem(Dictionary<string, AttributeValue> attributeList)
    {
        foreach (KeyValuePair<string, AttributeValue> kvp in attributeList)
        {
            string attributeName = kvp.Key;
            AttributeValue value = kvp.Value;
            Console.WriteLine(
                attributeName + " " +
                (value.S == null ? "" : "S=[" + value.S + "]") +
                (value.N == null ? "" : "N=[" + value.N + "]") +
                (value.SS == null ? "" : "SS=[" + string.Join(",", value.SS.ToArray()) + "]") +
                (value.NS == null ? "" : "NS=[" + string.Join(",", value.NS.ToArray()) + "]")
            );
        }
        Console.WriteLine("************************************************");
    }
}
To call it just do this:
UpdateId.ChangeId("OriginalId", "NewId");

Related

How to search for all records of type CustomRecordType in NetSuite?

I am trying to get all the records related to a custom record type. How do I do it with NetSuite SOAP?
Also, is there a way to search records of that custom record type by its record name?
Something like this returns only the first record:
CustomRecordRef customRec = new CustomRecordRef();
customRec.setInternalId("XXX");
customRec.setScriptId("customrecord_lc_mapping");
netsuiteSoapClient.getPort(true).get(customRec);
Here is example code showing how to query all the values of a custom record type using the Java SOAP API:
/**
 * Search cost template values.
 *
 * @return map of record name to internal ID
 * @throws Exception on any error
 */
private Map<String, String> searchCostTemplateValues() throws Exception {
    // Identify the custom record type to search by its internal ID.
    CustomRecordSearchBasic customRecordSearch = new CustomRecordSearchBasic();
    RecordRef recordRef = new RecordRef();
    if (environment.toLowerCase().equals("test") || environment.equals("default")) {
        recordRef.setInternalId("426");
    } else {
        recordRef.setInternalId("426");
    }
    customRecordSearch.setRecType(recordRef);
    SearchResult response = netsuiteSoapClient
            .getPort(true)
            .search(customRecordSearch);
    LOGGER.info("Search Result: " + new ObjectMapper()
            .writerWithDefaultPrettyPrinter().writeValueAsString(response));
    RecordList costTemplateRecordList = response.getRecordList();
    Record[] customRecordArray = costTemplateRecordList.getRecord();
    Map<String, String> costTemplateMap = new HashMap<>();
    for (Record r : customRecordArray) {
        CustomRecord cr = (CustomRecord) r;
        costTemplateMap.put(cr.getName(), cr.getInternalId());
    }
    return costTemplateMap;
}
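As for the second part of the question, the method above already keys the returned map by record name, so a lookup by name is just a map access (a usage sketch; "My Record Name" is a made-up example):
Map<String, String> costTemplates = searchCostTemplateValues();
String internalId = costTemplates.get("My Record Name");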

Getting the first level of categorisation from a Notes view

I have a categorized Notes view; let's say the first categorized column is TypeOfVehicle, the second categorized column is Model, and the third categorized column is Manufacturer.
I would like to collect only the values for the first category and return them as a JSON object.
I am facing two problems:
- I cannot read the value for the category: the column values are empty, and when I try to access the underlying document it is null.
- The script won't hop over to the next category/sibling on the same level.
Can someone explain what I am doing wrong here?
private Object getFirstCategory() {
    JsonJavaObject json = new JsonJavaObject();
    try {
        String server = props.getProperty("server");
        String filepath = props.getProperty("filename");
        Database db = utils.getSession().getDatabase(server, filepath);
        if (db.isOpen()) {
            View vw = db.getView("transport");
            if (null != vw) {
                vw.setAutoUpdate(false);
                ViewNavigator nav = vw.createViewNav();
                JsonJavaArray arr = new JsonJavaArray();
                Integer count = 0;
                ViewEntry tmpentry;
                ViewEntry entry = nav.getFirst();
                while (null != entry) {
                    Vector<?> columnValues = entry.getColumnValues();
                    if (entry.isCategory()) {
                        System.out.println("entry notesid = " + entry.getNoteID());
                        Document doc = entry.getDocument();
                        if (null != doc) {
                            if (doc.hasItem("TypeOfVehicle")) {
                                System.out.println("category has not " + "TypeOfVehicle");
                            } else {
                                System.out.println("category IS " + doc.getItemValueString("TypeOfVehicle"));
                            }
                        } else {
                            System.out.println("doc is null");
                        }
                        JsonJavaObject row = new JsonJavaObject();
                        JsonJavaObject jo = new JsonJavaObject();
                        String TypeOfVehicle = String.valueOf(columnValues.get(0));
                        if (null != TypeOfVehicle) {
                            if (!TypeOfVehicle.equals("")) {
                                jo.put("TypeOfVehicle", TypeOfVehicle);
                            } else {
                                jo.put("TypeOfVehicle", "Not categorized");
                            }
                        } else {
                            jo.put("TypeOfVehicle", "Not categorized");
                        }
                        row.put("request", jo);
                        arr.put(count, row);
                        count++;
                        tmpentry = nav.getNextSibling(entry);
                        entry.recycle();
                        entry = tmpentry;
                    } else {
                        //tmpentry = nav.getNextCategory();
                        //entry.recycle();
                        //entry = tmpentry;
                    }
                }
                json.put("data", arr);
                vw.setAutoUpdate(true);
                vw.recycle();
            }
        }
    } catch (Exception e) {
        OpenLogUtil.logErrorEx(e, JSFUtil.getXSPContext().getUrl().toString(), Level.SEVERE, null);
    }
    return json;
}
What you're doing wrong is trying to treat any single view entry as both a category and a document. A single view entry can only be one of a category, a document, or a total.
If you have an entry for which isCategory() returns true, then for the same entry:
isDocument() will return false.
getDocument() will return null.
getNoteID() will return an empty string.
If the only thing you need is top-level categories, then get the first entry from the navigator and iterate over entries using nav.getNextSibling(entry) as you're already doing, but:
Don't try to get documents, note ids, or fields.
Use entry.getColumnValues().get(0) to get the value of the first column for each category.
If the view contains any uncategorised documents, it's possible that entry.getColumnValues().get(0) might throw an exception, so you should also check that entry.getColumnValues().size() is at least 1 before trying to get a value.
If you need any extra data beyond just top-level categories, then note that subcategories and documents are children of their parent categories.
If an entry has a subcategory, nav.getChild(entry) will get the first subcategory of that entry.
If an entry has no subcategories, but is a category which contains documents, nav.getChild(entry) will get the first document in that category.
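Putting that together, a minimal sketch of the sibling-only walk (same lotus.domino classes as in the question; assume the enclosing method declares throws NotesException):
ViewNavigator nav = vw.createViewNav();
List<String> categories = new ArrayList<String>();
ViewEntry entry = nav.getFirst();
while (entry != null) {
    // Top-level categories only: no documents, note ids, or fields.
    if (entry.isCategory() && entry.getColumnValues().size() >= 1) {
        categories.add(String.valueOf(entry.getColumnValues().get(0)));
    }
    ViewEntry tmpentry = nav.getNextSibling(entry);
    entry.recycle();
    entry = tmpentry;
}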

Debezium - Update Operation Emits Change Event into Kafka Topic with Both Before & After Struct Values, But Ignores null Column/Field in Before Struct

I'm using Debezium to synchronize data between two Postgres DB servers, and I'm facing an issue with the update event/operation. The change event is recorded into the Kafka topic with any null-valued column/field omitted: in the example below, the infodetcode field is missing from the before struct because it was null in the DB, while the same field appears in the after struct because its value changed from null to some value. When I compare the before struct with the after struct to find out which field/column values changed so I can construct a dynamic query, the query is built without the missing column, and I need that column in the query. (Please find below the configuration and the before/after comparison implementation, which returns a result without the null-valued column.) I'd gladly take suggestions/help on this issue.
Note : REPLICA IDENTITY is set to "FULL"
Version:
PostgreSQL - 10.9, debezium - 1.1.1.Final
Before & After Struct-Topic Record(Actual):
before=struct{accountno=01,currencycode=USD,seqno=1,informationcode=S}
after=struct{accountno=01,currencycode=USD,seqno=1 ,informationcode=M ,infodetcode=N}
Before & After Struct-Topic Record(Expected):
before=struct{accountno=01,currencycode=USD,seqno=1,informationcode=S,infodetcode=null}
after=struct{accountno=01,currencycode=USD,seqno=1 ,informationcode=M ,infodetcode=N}
Debezium configuration:
@Bean
public io.debezium.config.Configuration postgreConnectorConfiguration() {
    return io.debezium.config.Configuration.create()
            .with("name", "postgres-connector")
            .with("snapshot.mode", SnapshotMode.CUSTOM)
            .with("snapshot.custom.class", "postgresql.snapshot.CustomSnapshotter")
            .with("connector.class", "io.debezium.connector.postgresql.PostgresConnector")
            .with("database.history", "io.debezium.relational.history.FileDatabaseHistory")
            .with("database.history.file.filename", "/debezium/dbhistory.dat")
            .with("offset.storage", "org.apache.kafka.connect.storage.FileOffsetBackingStore")
            .with("offset.storage.file.filename", "/debezium/offset/postgre-offset.dat")
            .with("offset.flush.interval.ms", 60000)
            .with("snapshot.isolation.mode", "read_committed")
            .with("key.converter.schemas.enable", true)
            .with("value.converter.schemas.enable", true)
            .with("plugin.name", "pgoutput")
            .with("slot.name", "debeziumtest")
            .with("database.server.name", "server-c")
            .with("database.hostname", databaseHost)
            .with("database.port", databasePort)
            .with("database.user", databaseUserName)
            .with("database.password", databasePassword)
            .with("database.dbname", databaseName)
            .with("table.whitelist", TABLES_TO_MONITOR)
            .build();
}
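For context, a configuration bean like this is typically fed to Debezium's embedded engine, which routes each change event to the handleEvent method shown below (a sketch using the EmbeddedEngine API available in 1.1; the single-thread executor wiring is an assumption, not from the original post):
// Run the connector in-process and push every SourceRecord to handleEvent.
EmbeddedEngine engine = EmbeddedEngine.create()
        .using(postgreConnectorConfiguration())
        .notifying(this::handleEvent)
        .build();
Executors.newSingleThreadExecutor().execute(engine);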
Comparison of Struct(Before & After):
private void handleEvent(SourceRecord sourceRecord) {
    Struct sourceRecordEntry = (Struct) sourceRecord.value();
    if (sourceRecordEntry != null) {
        Struct sourceStruct = (Struct) sourceRecordEntry.get(FieldName.SOURCE);
        String tableName = sourceStruct.getString(TABLE);
        Date transactionDate = new Date(System.currentTimeMillis());
        Long transactionTime = (Long) sourceStruct.get(FieldName.TIMESTAMP);
        Time txnTime = new Time(transactionTime);
        Long transactionCode = (Long) sourceStruct.get(TRANSACTION_ID);
        Operation operation = Operation.forCode(sourceRecordEntry.getString(OPERATION));
        if (operation == Operation.UPDATE) {
            List<String> preFieldList = new ArrayList<>();
            List<String> preValueList = new ArrayList<>();
            List<String> postFieldList = new ArrayList<>();
            List<String> postValueList = new ArrayList<>();
            Integer preFieldcount = 0, preValuecount = 0, postFieldcount = 0, postValuecount = 0;
            Struct beforeStruct = (Struct) sourceRecordEntry.get(BEFORE);
            Struct afterStruct = (Struct) sourceRecordEntry.get(AFTER);
            // Note: the null filter below is what drops a column that was null
            // in the before image, so it never reaches the comparison map.
            Map<String, Object> beforeEntryHash = beforeStruct.schema().fields().stream()
                    .map(Field::name)
                    .filter(fieldName -> beforeStruct.get(fieldName) != null)
                    .map(fieldName -> Pair.of(fieldName, beforeStruct.get(fieldName)))
                    .collect(toMap(Pair::getKey, Pair::getValue));
            Map<String, Object> afterEntryHash = afterStruct.schema().fields().stream()
                    .map(Field::name)
                    .filter(fieldName -> afterStruct.get(fieldName) != null)
                    .map(fieldName -> Pair.of(fieldName, afterStruct.get(fieldName)))
                    .collect(toMap(Pair::getKey, Pair::getValue));
            MapDifference<String, Object> rowDifferenceHash = Maps.difference(beforeEntryHash, afterEntryHash);
            for (Entry<String, ValueDifference<Object>> rowEntry : rowDifferenceHash.entriesDiffering().entrySet()) {
                preFieldList.add(PR_PREFIX + rowEntry.getKey());
                postFieldList.add(PO_PREFIX + rowEntry.getKey());
                preValueList.add(SQ + rowEntry.getValue().leftValue() + SQ);
                postValueList.add(SQ + rowEntry.getValue().rightValue() + SQ);
                LOGGER.info("Key : " + rowEntry.getKey() + " Left Value : " + rowEntry.getValue().leftValue() + " Right Value : " + rowEntry.getValue().rightValue());
            }
        }
    }
}
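For what it's worth, one way to keep null columns visible in the comparison (a sketch, not from the original post) is to build the maps over every schema field and substitute a sentinel for null, since Collectors.toMap rejects null values:
// Include null columns so the before/after maps have identical key sets.
private static final Object NULL_SENTINEL = new Object();

private Map<String, Object> toFieldMap(Struct struct) {
    Map<String, Object> map = new HashMap<>();
    for (Field field : struct.schema().fields()) {
        Object value = struct.get(field.name());
        map.put(field.name(), value == null ? NULL_SENTINEL : value);
    }
    return map;
}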
Message:
SourceRecord{sourcePartition={server=server-c}, sourceOffset={transaction_id=null, lsn_proc=4921004793408, lsn=4921004793408, txId=81939856, ts_usec=1588212060567019}} ConnectRecord{topic='server-c.a.accinfo', kafkaPartition=null, key=Struct{accountno=01 ,currencycode=USD,seqno=1 }, keySchema=Schema{server-c.a.accinfo.Key:STRUCT}, value=Struct{before=Struct{accountno=01 ,currencycode=USD,seqno=1 ,informationcode=S },after=Struct{accountno=01 ,currencycode=USD,seqno=1 ,informationcode=P ,infodetcode=I},source=Struct{version=1.2.0.Alpha1,connector=postgresql,name=server-c,ts_ms=1588212060567,db=OTATEMP,schema=a,table=accinfo,txId=81939856,lsn=4921004793408},op=u,ts_ms=1588213782961}, valueSchema=Schema{server-c.a.accinfo.Envelope:STRUCT}, timestamp=null, headers=ConnectHeaders(headers=)}
Schema:
[Field{name=before, index=0, schema=Schema{server-c.aeota.accinfo.Value:STRUCT}}, Field{name=after, index=1, schema=Schema{server-c.aeota.accinfo.Value:STRUCT}}, Field{name=source, index=2, schema=Schema{io.debezium.connector.postgresql.Source:STRUCT}}, Field{name=op, index=3, schema=Schema{STRING}}, Field{name=ts_ms, index=4, schema=Schema{INT64}}, Field{name=transaction, index=5, schema=Schema{STRUCT}}]

Read all response headers, store them in a String, and print them in Rest Assured (Java)

I have a requirement where I need to read all the response headers and assert them against pre-defined values stored in a HashMap. I have written the following code but I am getting an IndexOutOfBoundsException:
boolean isPass = true;
for (Map.Entry<String, String> item : AssertHeaderFromExcel.entrySet()) {
    String key = item.getKey();
    String value = item.getValue();
    Headers headers = response.getHeaders();
    // getValues() returns an empty list when the header is absent, so check
    // the size before calling get(0) to avoid IndexOutOfBoundsException.
    String strjson = headers.getValues(key).size() > 0 ? headers.getValues(key).get(0) : "";
    System.out.println("test " + headers.getValues(key));
    Assert.assertEquals(strjson, value);
    if (!value.equalsIgnoreCase(strjson)) {
        isPass = false;
    }
}
The String value is blank for the first value read from the HashMap, so the key and value cannot be asserted and isPass remains false.
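As an aside, if the goal is simply to assert one expected header at a time, Rest Assured can do that directly on the response (a sketch; key and value come from the map entry as above):
// Assert a single expected header without manual list handling.
response.then().assertThat().header(key, value);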

How to save ODocument in map field in Vertex with Java

Assuming I have a Vertex (let's call it "PokemonMaster")
PokemonMaster
{
name, (STRING)
age, (INTEGER)
pokemons, (EMBEDDEDMAP) of Pokemon
}
in my DB containing an EMBEDDEDMAP (I also tried LINKMAP, but I'm not sure what I'm doing) of a class "Pokemon".
I'm trying, in Java, to create the Vertex and put some pokemons in the field "pokemons".
Let's say a Pokemon looks like:
Pokemon
{
name, (STRING)
}
I'm doing something like:
Vertex v = graph.addVertex("class:PokemonMaster",
        "name", "Sacha",
        "age", "42",
        "pokemons", new ODocument("Pokemon").field("name", "Pikachu"));
I assume this would create a first element (Pikachu) in the map, and I was hoping to be able to add more pokemons to the map later by doing something like:
v.setProperty("pokemons", new ODocument("Pokemon").field("name", "Raichu"));
None of this is actually working, and that's why I'm here. Am I totally wrong?
I get the error:
The field 'PokemonMaster.pokemons' has been declared as EMBEDDEDMAP but an incompatible type is used. Value: Pokemon{name:Pikachu}
Thank you!
Edit
I found the solution.
Creating a map like:
Map<String, ODocument> foo = new HashMap<>();
Putting some pokemons in it :
ODocument doc = new ODocument("Pokemon").field("name", "Pikachu");
ODocument doc2 = new ODocument("Pokemon").field("name", "Raichu");
foo.put("pikachu", doc);
foo.put("raichu", doc2);
doc.save();
doc2.save();
and simply passing the map as a parameter:
Vertex v = graph.addVertex("class:PokemonMaster",
        "name", "Sacha",
        "age", "42",
        "pokemons", foo);
Hope it will help someone !
UPDATE:
In case of embeddedmap, to create the schema:
OrientGraphNoTx graphOne = new OrientGraphNoTx(URL, USER, USER);
try {
    OSchema schema = graphOne.getRawGraph().getMetadata().getSchema();
    OClass pokemon = schema.createClass("Pokemon");
    pokemon.createProperty("name", OType.STRING);
    OClass vClass = schema.getClass("V");
    OClass pokemonMaster = schema.createClass("PokemonMaster");
    pokemonMaster.setSuperClass(vClass);
    pokemonMaster.createProperty("name", OType.STRING);
    pokemonMaster.createProperty("age", OType.INTEGER);
    pokemonMaster.createProperty("pokemons", OType.EMBEDDEDMAP, pokemon);
} finally {
    graphOne.shutdown();
}
Create a master with a pokemon:
String pmRID = "";
OrientGraph graphTwo = new OrientGraph(URL, USER, USER);
try {
    ODocument pokemon = new ODocument("Pokemon");
    pokemon.field("name", "Pikachu");
    Map<String, ODocument> foo = new HashMap<>();
    foo.put("pikachu", pokemon);
    OrientVertex v = graphTwo.addVertex("class:PokemonMaster",
            "name", "Sacha",
            "age", "42",
            "pokemons", foo);
    graphTwo.commit();
    pmRID = v.getIdentity().toString();
} catch (Exception e) {
    // ...
} finally {
    graphTwo.shutdown();
}
Add a second pokemon:
OrientGraph graphThree = new OrientGraph(URL, USER, USER);
try {
    ODocument pokemon = new ODocument("Pokemon");
    pokemon.field("name", "Raichu");
    OrientVertex v = graphThree.getVertex(pmRID);
    Map<String, ODocument> pokemons = v.getProperty("pokemons");
    if (pokemons == null) {
        pokemons = new HashMap<>();
    }
    pokemons.put("raichu", pokemon);
    v.setProperty("pokemons", pokemons);
    graphThree.commit();
} catch (Exception e) {
    // ...
} finally {
    graphThree.shutdown();
}
You could also use an EMBEDDEDLIST. See here.
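For reference, a minimal sketch of that variant (assuming the property were declared as OType.EMBEDDEDLIST instead of EMBEDDEDMAP in the schema above, and graph is an open OrientGraph):
// Schema change: pokemonMaster.createProperty("pokemons", OType.EMBEDDEDLIST, pokemon);
List<ODocument> team = new ArrayList<>();
team.add(new ODocument("Pokemon").field("name", "Pikachu"));
team.add(new ODocument("Pokemon").field("name", "Raichu"));
OrientVertex master = graph.addVertex("class:PokemonMaster",
        "name", "Sacha",
        "age", "42",
        "pokemons", team);
graph.commit();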
