How can I do this query on DynamoDB + Android (Java)?

I've got a DynamoDB table that stores 'n' book titles inside a single item (and this freaks me out).
The table is RentBook, with two attributes:
String StudentID
StringSet BookTitle // something like ["title1","title2","title3"]
In SQL I would have written something like:
SELECT StudentID
FROM RentBook
WHERE BookTitle = "title1" OR BookTitle = "title2"
but with DynamoDB I can't get the right results :(
Can anyone help me?
Question 2: is this table structure still appropriate as the number of items grows?

Try this. Since BookTitle is a StringSet, contains() checks set membership, and two contains() conditions can be combined with OR in a filter expression. Note that the low-level ScanRequest expects a Map<String, AttributeValue>, not the document API's ValueMap:
Map<String, AttributeValue> vals = new HashMap<>();
vals.put(":val1", new AttributeValue().withS("title1"));
vals.put(":val2", new AttributeValue().withS("title2"));
ScanRequest scanRequest = new ScanRequest()
        .withTableName("RentBook")
        .withFilterExpression("contains(BookTitle, :val1) OR contains(BookTitle, :val2)")
        .withExpressionAttributeValues(vals);
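If you need to filter on an arbitrary number of titles, the expression and the placeholder map can be generated. Here is a minimal sketch (the helper class and placeholder names are my own, not part of the AWS SDK): the resulting string would go into ScanRequest.withFilterExpression(...), and each placeholder value would be wrapped in an AttributeValue for withExpressionAttributeValues(...).

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Hypothetical helper: builds a DynamoDB filter expression that ORs one
// contains() check per requested title, plus the matching placeholder map.
class BookFilterBuilder {

    // e.g. ["title1","title2"] -> "contains(BookTitle, :t0) OR contains(BookTitle, :t1)"
    public static String filterExpression(List<String> titles) {
        List<String> parts = new ArrayList<>();
        for (int i = 0; i < titles.size(); i++) {
            parts.add("contains(BookTitle, :t" + i + ")");
        }
        return String.join(" OR ", parts);
    }

    // placeholder name -> raw string value (wrap in AttributeValue before sending)
    public static Map<String, String> placeholders(List<String> titles) {
        Map<String, String> vals = new LinkedHashMap<>();
        for (int i = 0; i < titles.size(); i++) {
            vals.put(":t" + i, titles.get(i));
        }
        return vals;
    }
}
```

For two titles this reproduces the filter expression shown above, and it keeps working if the student is allowed to search for three or more.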

Related

Arithmetic operation in a condition expression in DynamoDB using the AWS SDK for Java

I have a DynamoDB table and I want to do a conditional update on it using the AWS SDK for Java. The table has a hash key named "Id" and a sort key named "Sk". There are 5 other fields in the table, and I have to update 2 of them based on a condition over the remaining ones. The condition is "total_record_count = record_passed + records_failed", where total_record_count, record_passed, and records_failed are fields in the table.
Below is the code:
DynamoDB dynamoDB = new DynamoDB(amazonDynamoDB);
Table table = dynamoDB.getTable("ciw_ocrResponse_upload");

Map<String, String> expressionAttributeNames = new HashMap<>();
expressionAttributeNames.put("#updatedAt", "updatedAt");
expressionAttributeNames.put("#statusOfTracker", "statusOfTracker");

Timestamp timestamp = new Timestamp(System.currentTimeMillis());
Map<String, Object> eav = new HashMap<>();
eav.put(":val1", value);
eav.put(":val2", timestamp.getTime());

UpdateItemSpec updateItemSpec = new UpdateItemSpec()
        .withPrimaryKey("Id", requestId, "Sk", sortKey)
        .withUpdateExpression("set #updatedAt = :val2, #statusOfTracker = :val1")
        .withValueMap(eav)
        .withNameMap(expressionAttributeNames)
        .withConditionExpression("total_record_count = record_passed + records_failed")
        .withReturnValues(ReturnValue.UPDATED_NEW);
table.updateItem(updateItemSpec);
I am getting the error: Invalid ConditionExpression: Syntax error; token: "+", near: "record_passed + records_failed".
According to the docs, arithmetic is not supported in condition expressions: comparators can only compare attribute paths and value placeholders, not computed expressions.
A workaround is to maintain a precomputed attribute (say, records_processed = record_passed + records_failed) that you calculate client-side and write on every update, and backfill on all existing items; the condition then becomes a plain comparison.
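To make the workaround concrete, here is a sketch (the placeholder names and the records_processed idea are illustrative, not from the original table) of doing the arithmetic client-side so the condition expression only needs a plain equality:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: compute the sum in application code and ship it as a value
// placeholder. The update expression would then be
//   "SET record_passed = :p, records_failed = :f"
// with the condition expression
//   "total_record_count = :sum"
// which contains no arithmetic and is accepted by DynamoDB.
class CounterUpdate {

    public static Map<String, Object> valueMap(long passed, long failed) {
        Map<String, Object> vals = new HashMap<>();
        vals.put(":p", passed);
        vals.put(":f", failed);
        vals.put(":sum", passed + failed); // arithmetic done here, not in DynamoDB
        return vals;
    }
}
```

The map returned here would be passed to UpdateItemSpec.withValueMap(...) in place of eav in the code above.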

How to AND multiple setQuery constraints in Elasticsearch using Java?

I am trying to build a search-filter query with Elasticsearch. The query filters results by a search term, a price range, and a brand list. The results for the search term and price range are right, but when a brand list is provided, all results for the selected brands are returned regardless of the other constraints.
I want results matching searchTerm AND price AND brands.
This is my query:
BoolQueryBuilder query = QueryBuilders.boolQuery();
for (String key : brands) {
    query.must(QueryBuilders.matchQuery("brand", key));
}
SearchResponse searchresponse = client
        .prepareSearch("product")
        .setTypes("product")
        .setQuery(QueryBuilders.matchPhraseQuery("name", pSearchTerm))
        .setPostFilter(QueryBuilders.rangeQuery("unit_price").from(min).to(max))
        .setQuery(query)
        .setExplain(true)
        .execute().actionGet();
What am I doing wrong here?
You have two setQuery() calls, so the second one overrides the first. You need to combine all your constraints into one query, like this:
// brand list
BoolQueryBuilder query = QueryBuilders.boolQuery();
for (String key : brands) {
    query.must(QueryBuilders.matchQuery("brand", key));
}
// search term
query.must(QueryBuilders.matchPhraseQuery("name", pSearchTerm));
// price range (filter context, so it doesn't affect scoring)
query.filter(QueryBuilders.rangeQuery("unit_price").from(min).to(max));

SearchResponse searchresponse = client
        .prepareSearch("product")
        .setTypes("product")
        .setQuery(query)
        .setExplain(true)
        .execute().actionGet();

How to perform an updateItem operation with a JSON payload on a DynamoDB table

The DynamoDB document API allows a put operation with a JSON payload:
Item item = Item.fromJSON(payload);
table.putItem(item);
However, I couldn't find a similar way to perform an updateItem with a JSON payload.
Is there DynamoDB support for that?
Here is a snippet showing how you would do it in JavaScript:
let updateExpression = 'SET ';
let expressionAttributeValues = {};
const keys = Object.keys(payload);
for (const key of keys) {
    updateExpression += `${key}=:${key},`;
    expressionAttributeValues[`:${key}`] = payload[key];
}
updateExpression = updateExpression.slice(0, -1); // drop the trailing comma
let params = {
    TableName: 'your_table_name',
    Key: {
        id: id
    },
    UpdateExpression: updateExpression,
    ExpressionAttributeValues: expressionAttributeValues,
    ReturnValues: 'ALL_NEW'
};
db.update(params, (err, result) => {});
I was struggling with this for a while; in the end you have to use a map:
UpdateItemOutcome updateItemOutcome = table.updateItem(
        new UpdateItemSpec()
                .withPrimaryKey("id", "yourId")
                .withUpdateExpression("SET document.field = :field")
                .withValueMap(new ValueMap()
                        .withMap(":field", "map of key value pairs that will get serialized to json")));
The UpdateItemSpec currently doesn't accept an Item as input -- you will have to parse the JSON content first (presumably via Item.fromJSON) and then use either AttributeUpdate or an UpdateExpression to specify the update action on each individual field (SET, ADD, etc.).
Here is the documentation on using an UpdateExpression to update an item: http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/JavaDocumentAPIItemCRUD.html#JavaDocumentAPIItemUpdate
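The same expression-building trick ports to Java. Below is a sketch (the helper class and its methods are my own, not SDK API) that walks a payload map's top-level keys and produces the UpdateExpression plus the value map you would hand to UpdateItemSpec.withUpdateExpression(...).withValueMap(...). One caveat: attribute names that collide with DynamoDB reserved words would additionally need #name placeholders via an ExpressionAttributeNames map.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical helper: turn a parsed JSON payload (as a Map) into a
// DynamoDB UpdateExpression ("SET a = :a, b = :b") and its value map.
class JsonUpdateBuilder {

    public static String updateExpression(Map<String, Object> payload) {
        StringBuilder expr = new StringBuilder("SET ");
        int i = 0;
        for (String key : payload.keySet()) {
            if (i++ > 0) expr.append(", ");
            expr.append(key).append(" = :").append(key);
        }
        return expr.toString();
    }

    public static Map<String, Object> valueMap(Map<String, Object> payload) {
        Map<String, Object> vals = new LinkedHashMap<>();
        for (Map.Entry<String, Object> e : payload.entrySet()) {
            vals.put(":" + e.getKey(), e.getValue()); // ":key" -> value
        }
        return vals;
    }
}
```

This only handles top-level attributes; nested documents would need dotted paths in the expression.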

Parse a csv String and map to a java object

I am trying to parse a CSV string like this:
COL1,COL2,COL3
1,2,3
2,4,5
and map the columns to a Java object:
class Person {
    String COL1, COL2, COL3;
}
Most of the libraries I found on Google are for CSV files, but I am working on Google App Engine, so I can't write or read files. Currently I am using the split method, but the problems with this approach are:
the columns in the CSV string could come in a different order, such as COL1,COL3,COL2
I don't want boilerplate code that splits each line and reads every column by position.
What I need is to read the list of column headers and then, while iterating over the rows, map each column value to the Java object via that header mapping.
There are several questions based on similar requirements, but none of them helped me.
If anyone has done this before, could you please share the idea? Thanks!
After searching and trying several libraries, I was able to solve it. I am sharing the code in case anyone needs it later:
public class CSVParsing {

    public void parseCSV() throws IOException {
        List<Person> list = Lists.newArrayList();
        String str = "COL1,COL2,COL3\n" +
                "A,B,23\n" +
                "S,H,20\n";
        CsvSchema schema = CsvSchema.emptySchema().withHeader();
        // readerFor(...) on Jackson 2.6+; older versions use reader(Person.class)
        ObjectReader mapper = new CsvMapper().readerFor(Person.class).with(schema);
        MappingIterator<Person> it = mapper.readValues(str);
        while (it.hasNext()) {
            list.add(it.next());
        }
        System.out.println("stored list is: " + list);
    }
}
Most of the libraries I found on google are for csv files but I am
working with google app engine so can't write or read files
You can read files from the project's file system.
You can also read and write files in the Blobstore or Google Cloud Storage.
Use a StringTokenizer to split the string into tokens, then set them on the object:
// Split the line into tokens using a comma as the separator
StringTokenizer st = new StringTokenizer(lineFromFile, ",");
List<String> items = new ArrayList<>();
while (st.hasMoreTokens()) {
    // Collect each item
    items.add(st.nextToken());
}
// Set them on the object
Person p = new Person(items.get(0), items.get(1), items.get(2));
If the columns can be reordered, parse the header line first, save its values, and use them to decide which column each token falls under, using, say, a Map:
String[] columns = new String[3]; // fill these with the column names from the header line
Map<String, String> map = new HashMap<>();
int i = 0;
while (st.hasMoreTokens()) {
    // Collect each item under its column name
    map.put(columns[i++], st.nextToken());
}
Then just create the Person:
Person p = new Person(map.get("COL1"), map.get("COL2"), map.get("COL3"));
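Putting the two snippets together, here is a self-contained sketch (the class name is my own) that parses the whole in-memory CSV string, reading the header line first so the column order no longer matters:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.StringTokenizer;

// Header-aware CSV parser for an in-memory string (no file access, so it
// works on App Engine). Returns one header->value map per data row;
// constructing the Person from map.get("COL1") etc. stays the same.
class CsvStringParser {

    public static List<Map<String, String>> parse(String csv) {
        String[] lines = csv.split("\n");
        // First line is the header: collect the column names in order
        StringTokenizer header = new StringTokenizer(lines[0], ",");
        List<String> columns = new ArrayList<>();
        while (header.hasMoreTokens()) {
            columns.add(header.nextToken());
        }
        // Remaining lines are data rows: map each token to its column name
        List<Map<String, String>> rows = new ArrayList<>();
        for (int i = 1; i < lines.length; i++) {
            StringTokenizer st = new StringTokenizer(lines[i], ",");
            Map<String, String> row = new LinkedHashMap<>();
            int c = 0;
            while (st.hasMoreTokens()) {
                row.put(columns.get(c++), st.nextToken());
            }
            rows.add(row);
        }
        return rows;
    }
}
```

For parse("COL1,COL3,COL2\nA,C,B"), get("COL2") on the first row yields "B" regardless of column position. Note that StringTokenizer does not handle quoted fields containing commas; for that, a real CSV library (such as the Jackson CSV module above) is the safer choice.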

How to get column names from an HQL query and produce JSON with the columns as keys

How can I convert an HQL query result directly to JSON? I tried this:
Query query = session.createQuery(SQLUtilsConstants.fQuery);
query.setParameter(StringConstants.TRPID, trId);
List list = query.list();
gson = new Gson();
String jsonStudents = gson.toJson(list);
System.out.println("jsonStudents = " + jsonStudents);
When I convert it, I get a JSON array of values without the property names. The query result contains data from multiple tables, and I want to generate output with the property names as keys. If the query selects customerid and customername, I need a result like this:
[{"customerid" : "abc", "customername" : "rose"}]
but with the above code I get something like this:
[["abc", "rose"]]
How can I do this?
You should build a Map per result row, shaped however you want it, and then serialize that to JSON:
Map<String, Object> newMap = new HashMap<>();
for (Object item : list) {
    // populate the map however you like it to be
}
String jsonStudents = gson.toJson(newMap);
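The map-building step itself is plain Java. Here is a sketch, assuming the query selects known aliases and query.list() returns Object[] rows for a multi-column projection; gson.toJson(RowMapper.toMaps(...)) then yields the desired shape. (Depending on your Hibernate version, query.setResultTransformer(Transformers.ALIAS_TO_ENTITY_MAP) can also return each row as a Map keyed by alias directly.)

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Hypothetical helper: zip each Object[] row with its column aliases so
// that serializing the result produces [{"customerid":"abc",...}, ...]
// instead of [["abc",...], ...].
class RowMapper {

    public static List<Map<String, Object>> toMaps(String[] aliases, List<Object[]> rows) {
        List<Map<String, Object>> result = new ArrayList<>();
        for (Object[] row : rows) {
            Map<String, Object> m = new LinkedHashMap<>();
            for (int i = 0; i < aliases.length; i++) {
                m.put(aliases[i], row[i]); // alias -> column value
            }
            result.add(m);
        }
        return result;
    }
}
```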
