So I was looking at the @PutMapping example on the Spring website: https://spring.io/guides/tutorials/rest/
I noticed that they call the database to get the Employee entity with that id and then update that entity with the name and role from the request.
#PutMapping("/employees/{id}")
Employee replaceEmployee(#RequestBody Employee newEmployee, #PathVariable Long id) {
return repository.findById(id)
.map(employee -> {
employee.setName(newEmployee.getName());
employee.setRole(newEmployee.getRole());
return repository.save(employee);
})
.orElseGet(() -> {
newEmployee.setId(id);
return repository.save(newEmployee);
});
}
That's great for a small demo, but how do you handle this on a more complex entity?
What if Employee had a list of Laptops?
Let's suppose the JSON looks something like this:
{
    "name": "John",
    "role": "MyRole",
    "laptops": [
        {
            "model": "abc",
            "serial": "123"
        },
        {
            "model": "xyz",
            "serial": "789"
        }
    ]
}
Of course, if your mapping is correct, the repository will give you back an Employee entity with the list of laptops, and their laptop ids, on the Java side.
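For reference, here is a minimal sketch of the kind of mapping I have in mind; the cascade, orphanRemoval, and fetch settings are my own assumptions rather than anything from the tutorial:

import java.util.ArrayList;
import java.util.List;
import jakarta.persistence.*; // javax.persistence.* on older Spring Boot versions

@Entity
class Employee {
    @Id @GeneratedValue
    private Long id;
    private String name;
    private String role;

    // The employee owns its laptops; orphanRemoval deletes any laptop that is
    // dropped from this list when the employee is saved.
    @OneToMany(mappedBy = "employee", cascade = CascadeType.ALL, orphanRemoval = true)
    private List<Laptop> laptops = new ArrayList<>();
    // getters and setters ...
}

@Entity
class Laptop {
    @Id @GeneratedValue
    private Long id;
    private String model;
    private String serial;

    @ManyToOne
    private Employee employee;
    // getters and setters ...
}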
But what if the user's update request looks something like this:
{
    "id": 1,
    "name": "John",
    "role": "MyRole",
    "laptops": [
        {
            "model": "abcModified",
            "serial": "123"
        },
        {
            "model": "newModel",
            "serial": "456"
        }
    ]
}
(Here the "xyz" laptop was removed from the list and a new one was added.)
What would you do in this scenario? Were we supposed to send back the foreign keys (the laptop ids)?
If we were to send the foreign keys, what would stop someone from referencing a laptop that didn't belong to the entity?
How do you properly map a complex object that may contain lists of objects, which in turn contain other lists that were modified?
Edit: I'm calling the Employee endpoint because, let's say, I need to update both the role and the list of laptops.
First, if you want to update the child objects along with the parent without sending the child entity ids, you have to delete the previous children and create them again.
Alternatively, you can return the child ids in the GET API so that the client can send them back on update. Then, for the PUT request, you can find those children by id, update them, and add any new ones.
But the best way is to also write a separate API for the child entity: update an existing child by its id and foreign-key reference, and create new children with that separate API. A sketch of the merge-by-id approach is below.
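Here is a minimal sketch of that merge-by-id approach, building on the Employee/Laptop mapping sketched earlier; it assumes the client sends back the laptop ids it got from the GET, and anything beyond the tutorial's names is illustrative:

@PutMapping("/employees/{id}")
Employee replaceEmployee(@RequestBody Employee newEmployee, @PathVariable Long id) {
    return repository.findById(id)
        .map(employee -> {
            employee.setName(newEmployee.getName());
            employee.setRole(newEmployee.getRole());

            // Remove laptops that are no longer in the request; orphanRemoval
            // on the mapping deletes them from the database.
            employee.getLaptops().removeIf(existing ->
                newEmployee.getLaptops().stream()
                    .noneMatch(l -> existing.getId().equals(l.getId())));

            // Update existing laptops by id and add the ones without an id.
            // Ids are only matched against this employee's own laptops, so a
            // request cannot grab a laptop belonging to another employee.
            for (Laptop incoming : newEmployee.getLaptops()) {
                if (incoming.getId() == null) {
                    incoming.setEmployee(employee);
                    employee.getLaptops().add(incoming);
                } else {
                    employee.getLaptops().stream()
                        .filter(l -> l.getId().equals(incoming.getId()))
                        .findFirst()
                        .ifPresent(l -> {
                            l.setModel(incoming.getModel());
                            l.setSerial(incoming.getSerial());
                        });
                }
            }
            return repository.save(employee);
        })
        .orElseGet(() -> {
            newEmployee.setId(id);
            newEmployee.getLaptops().forEach(l -> l.setEmployee(newEmployee));
            return repository.save(newEmployee);
        });
}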
I'm trying to get all the records from the 'Messages' table, but the problem is that 'roles' is saved as an array in DynamoDB, and I need to get all records that have the "INSTRUCTOR" role.
The available records in DynamoDB are as follows.
Record 1
{
    "product": "Maths",
    "messageSummary": "test message",
    "roles": [
        "ADMIN",
        "INSTRUCTOR"
    ],
    "title": "My course1",
    "createdBy": 0,
    "authorName": "test author",
    "id": "1"
}
Record 2
{
    "product": "Maths",
    "messageSummary": "test message",
    "roles": [
        "STUDENT"
    ],
    "title": "My course2",
    "createdBy": 0,
    "authorName": "test author",
    "id": "2"
}
Record 3
{
    "product": "Maths",
    "messageSummary": "test message",
    "roles": [
        "INSTRUCTOR",
        "STUDENT"
    ],
    "title": "My course3",
    "createdBy": 0,
    "authorName": "test author",
    "id": "3"
}
The Message model class, which maps to the "Messages" table, is as follows:
@DynamoDBTable(tableName = "Messages")
public class Message {
    @Id
    @ApiModelProperty(accessMode = ApiModelProperty.AccessMode.READ_ONLY, position = 1)
    private String id;
    /* authorName, messageSummary, title .. attributes go here */
    @ApiModelProperty(required = true, allowableValues = "maths", position = 6)
    private Product product;
    @ApiModelProperty(allowableValues = "student, instructor, admin", position = 7)
    private List<Role> roles;
    // getters and setters
}
The Message repository, which extends CrudRepository, is as follows:
@EnableScan
public interface MessageRepository extends CrudRepository<Message, String> {
    List<Message> findMessageByProductAndRoles(Product product, List<Role> roles); // Need to filter by the given role
}
Since I need to get all the records that have the INSTRUCTOR role, Record 1 and Record 3 should be in the result list. However, I can only filter using the product value; I'm not able to filter using roles.
Additionally, I tried keywords such as Contains, Like, In, etc., but none of them worked for me. From what I observed, those keywords don't support filtering on a specific value inside an array.
And I'm getting this error:
{
    "timestamp": "2022-01-27T08:42:54.786+0000",
    "status": 500,
    "error": "Internal Server Error",
    "message": "Request processing failed; nested exception is com.amazonaws.services.dynamodbv2.model.AmazonDynamoDBException: One or more parameter values were invalid: ComparisonOperator CONTAINS is not valid for SS AttributeValue type (Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ValidationException; Request ID: xxxxxxx)",
    "path": "/api/xxx/my route"
}
How can I achieve that and implement such conditions using the CrudRepository?
Like you pointed out, you cannot use those operators (CONTAINS, LIKE, IN) in a filter expression.
One or more parameter values were invalid: ComparisonOperator CONTAINS is not valid for SS AttributeValue type
FilterExpression criteria follows that of the KeyConditionExpression, with the addition of the not-equals operator.
It seems like your current table structure is incompatible with this access pattern. DynamoDB only works as a solution when the base table is designed around the access patterns, extending the base table's capabilities when necessary with the addition of secondary indexes.
One option would be to simply query by the partition key (and sort key if applicable) and let your application do the filtering. DynamoDB FilterExpressions are applied after the initial query (or scan) is completed anyway, so you pay for all the data you've read. The filter simply returns less of that to your application. There's no real performance or cost benefit besides whatever comes from offloading the burden from your application to DynamoDB.
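For what that first option looks like in practice, here is a minimal sketch of the application-side filtering, assuming Role is an enum exposing an INSTRUCTOR constant (the class and method names are illustrative):

import java.util.List;
import java.util.stream.Collectors;

public class MessageFiltering {

    // Given whatever the keyed query (or scan) returned, do the role
    // filtering in the application instead of in DynamoDB.
    public static List<Message> keepInstructorMessages(List<Message> candidates) {
        return candidates.stream()
                .filter(m -> m.getRoles() != null && m.getRoles().contains(Role.INSTRUCTOR))
                .collect(Collectors.toList());
    }
}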
Another option, rather than adjusting your key structure, would be to leverage secondary indexes. More specifically, sparse indexes (multiple in your case, one for each role), but only if you have the flexibility to break out that String Set attribute into several "key friendly" values. You cannot use Sets as keys. Keep in mind, the practicality of a secondary index depends entirely on the nature of the access pattern.
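As a sketch of what that could look like on the model, using the DynamoDBMapper annotations already in play (the attribute and index names here are assumptions, not anything from your code):

@DynamoDBTable(tableName = "Messages")
public class Message {
    // ... existing id, product, roles, etc. ...

    // Sparse-index attribute: populated only for messages visible to
    // instructors, so only those items ever appear in the GSI below.
    @DynamoDBIndexHashKey(globalSecondaryIndexName = "instructorRole-index")
    private String instructorRole; // e.g. "INSTRUCTOR"; left unset otherwise

    // getters and setters
}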
When thinking about creating an index for your base table, you should always ask yourself:
Does this query happen frequently enough to justify the cost of a secondary index?
If the answer to that is yes, you should then also ask yourself:
Am I projecting enough of the right attributes to the new index so that I can avoid charges for updating unused projections, and for fetches of missing projections?
After those questions, there is plenty more to consider, including secondary-index fundamentals and best practices.
After trying many options to query the database, I was able to get the relevant records by implementing a GSI in DynamoDB. Since I'm using Java, I used QuerySpec to fetch them.
QuerySpec spec = new QuerySpec()
    .withKeyConditionExpression("lifeCycle = :life_cycle AND startDate < :s_date")
    .withFilterExpression("contains (#user_roles, :roles)")
    .withNameMap(expressionAttributeNames)
    .withValueMap(new ValueMap()
        .withString(":life_cycle", lifeCycle)
        .withString(":roles", userRole)
        .with(":s_date", startDate)
    );
ItemCollection<QueryOutcome> items = index.query(spec); // index is the GSI's Index object
I am trying to check whether two JSON files are equal using Java.
This is the first JSON:
{
    "filters": [
        {
            "name": "Data_Type",
            "value": "Database"
        },
        {
            "name": "Begin_Date",
            "value": "2019-05-01"
        },
        {
            "name": "End_Date",
            "value": "2019-10-31"
        }
    ]
}
and this is the second one:
{
    "filters": [
        {
            "name": "End_Date",
            "value": "2019-10-31"
        },
        {
            "name": "Begin_Date",
            "value": "2019-05-01"
        },
        {
            "name": "Data_Type",
            "value": "Database"
        }
    ]
}
I use the zjsonpatch library.
This library is great, but the issue for me is that I want to ignore the order of the array, so my two JSON files should match.
Also, I don't only need to check whether they match; I also need a report with the differences, if any, as zjsonpatch provides.
Any suggestions?
Your problem is relatively simple to solve. I won't write library-specific code, so you will get pseudocode:
1. Create a method to determine whether one JSON object is equal to another according to your rules.
2. Parse JSON A.
3. Place every individual item of JSON A into a collection.
4. Parse JSON B.
5. Place every individual item of JSON B into another collection.
6. Iterate over collection A, removing item by item and trying to remove the current item from collection B.
7. When an item from A can't be removed from B, or when you finish iterating and collection B is not empty, the JSONs are different.
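Since the question mentions Java, here is a minimal Java sketch of that pseudocode, assuming Jackson (which zjsonpatch already uses) for parsing; the class and method names are illustrative:

import java.util.ArrayList;
import java.util.List;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class OrderInsensitiveCompare {

    // Returns the differing items: entries only present in A followed by
    // entries only present in B. An empty result means the arrays match
    // regardless of order.
    public static List<JsonNode> diffIgnoringArrayOrder(String jsonA, String jsonB) throws Exception {
        ObjectMapper mapper = new ObjectMapper();

        // Place every element of each "filters" array into its own collection.
        List<JsonNode> itemsA = new ArrayList<>();
        mapper.readTree(jsonA).get("filters").forEach(itemsA::add);
        List<JsonNode> itemsB = new ArrayList<>();
        mapper.readTree(jsonB).get("filters").forEach(itemsB::add);

        // Remove each item of A from B; whatever cannot be removed exists only
        // in A, and whatever is left over in B exists only in B.
        List<JsonNode> differences = new ArrayList<>();
        for (JsonNode item : itemsA) {
            if (!itemsB.remove(item)) { // JsonNode.equals is a deep comparison
                differences.add(item);
            }
        }
        differences.addAll(itemsB);
        return differences;
    }
}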
I have a JSON like the one shown below (this is just representational):
The issue I am facing is that the Person object can be at different levels in the JSON. E.g. in the case below it is at level 2 under RootNode1, at level 1 under RootNode2, and at level 0. Of course these levels are not limited to 2, and neither are they tied to RootNode in any way. (These node values are not known in advance; the only thing fixed and unique that identifies a Person object is "Type": "Person".)
I have to extract the Person object in all cases.
Is there a way to achieve this through traversal with the JsonPath library (https://github.com/json-path/JsonPath)?
[
    {
        "RootNode1": [
            {
                "ABC": [
                    {
                        "DEF": ""
                    },
                    {
                        "Name": "John",
                        "Type": "Person"
                        ....
                    }
                ]
            },
            {
                "DAC": {}
            }
        ]
    },
    {
        "RootNode2": [
            {
                "Name": "Williams",
                "Type": "Person"
                ....
            }
        ]
    },
    {
        "Name": "Sam",
        "Type": "Person"
        ....
    }
]
Yes, it is completely possible.
If the readme is to be believed, you would write
JsonPath.read(document, "$.RootNode1.ABC[1].Type");
For RootNode1
And
JsonPath.read(document, "$.RootNode2[1].Type");
For RootNode2
And
JsonPath.read(document, "$.Type");
That's the only way I see to do it, but there may be a simpler one; I'm used to using org.json personally.
If I understand you correctly, then this jsonpath expression
$..Type
should output
[
"Person",
"Person",
"Person"
]
at whatever level Type is.
The correct way to do that is with the expression $..[?(@.Type == 'Person')], like:
ptx.parse(json).read("$..[?(@.Type == 'Person')]", List.class)
// ptx is a ParseContext object from com.jayway.jsonpath
With this I can get the list of persons without caring about the level at which the Person object is present.
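For completeness, a minimal self-contained version of that call with the Jayway JsonPath API (the class and method names here are just illustrative):

import java.util.List;
import com.jayway.jsonpath.JsonPath;

public class PersonExtractor {

    // Deep-scans the whole document and keeps only objects whose Type is
    // "Person", regardless of how deeply they are nested.
    public static List<?> extractPersons(String json) {
        return JsonPath.parse(json).read("$..[?(@.Type == 'Person')]", List.class);
    }
}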
This is the object of the CcStorePartner class; there is another object of the Partner class inside it.
Here I want to filter on attributes of the CcStorePartner object and attributes of the Partner class.
This is the web service request body, where I have declared the filter on the objects stored in MongoDB. I'm using MongoTemplate.
{
    "target": "stores",
    "filter": [
        {
            "storeId": "a487c",
            "Type": "contains"
        },
        {
            "partner.partnerCode": "ucb",
            "Type": "contains"
        }
    ]
}
What would the Mongo query be that provides the list?
Here I'm using this:
query.addCriteria(Criteria.where("storeId").regex(".*a487c.*","i"));
query.addCriteria(Criteria.where("partner.partnerCode").regex(".*ucb.*","i"));
I'm receiving this error
Pojo Class
And this is the MongoTemplate code I'm using (a dynamic query).
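For reference, a minimal sketch of how those two criteria could be assembled and run with MongoTemplate against the "stores" collection (the class name, method signature, and collection name are assumptions):

import java.util.List;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

public class StoreQueries {

    public static List<CcStorePartner> findStores(MongoTemplate mongoTemplate) {
        Query query = new Query();
        // "contains" filters map to case-insensitive regex matches; the nested
        // Partner attribute is addressed with dot notation.
        query.addCriteria(Criteria.where("storeId").regex(".*a487c.*", "i"));
        query.addCriteria(Criteria.where("partner.partnerCode").regex(".*ucb.*", "i"));
        return mongoTemplate.find(query, CcStorePartner.class, "stores");
    }
}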
I have a database and some classes. These classes are linked with OneToMany relationships, and so on.
If I print the object itself with Spring, it contains everything. But if I print it with the Resource feature, it contains only the fields that are not collections or otherwise linked to another class.
How can I add the collections to the output?
By default Spring Data REST does not show associated resources except as links. If you want that, you have to define projections that describe the fields you want to see, whether they're simple fields like the ones you describe or associated resources. See
http://docs.spring.io/spring-data/rest/docs/current/reference/html/#projections-excerpts
For example, say you have a Service resource with associations to resources like serviceType, serviceGroup, owner, serviceInstances, and docLinks. If you want those to show up in the response body, you can create a projection:
package my.app.entity.projection;

import org.springframework.data.rest.core.config.Projection;
...

@Projection(name = "serviceDetails", types = Service.class)
public interface ServiceDetails {
    String getKey();
    String getName();
    ServiceType getType();
    ServiceGroup getGroup();
    Person getOwner();
    List<ServiceInstance> getServiceInstances();
    List<DocLink> getDocLinks();
    String getPlatform();
}
Then GET your URL with the projection:
http://localhost:8080/api/services/15?projection=serviceDetails
The result will include the projected properties:
{
    "name" : "MegaphoneService",
    "key" : "megaphone",
    "type" : {
        "key" : "application",
        "name" : "User Application",
        "description" : "A service that allows users to use a megaphone."
    },
    "owner" : null,
    "serviceInstances" : [ {
        "key" : "megaphone-a-dr",
        "description" : null,
        "loadBalanced" : true,
        "minCapacityDeploy" : null,
        "minCapacityOps" : 50
    }, ... ],
    ...
}