I have this type of record present inside my collection:
{
"_id": {
"$oid": "537470dce4b067b395ba47f2"
},
"symbol": "CMC",
"tvol": 76.97
}
While retrieving results, I am unable to map them to my custom object.
This is my client program:
import java.net.UnknownHostException;

import com.mongodb.*;

public class Data {
public static void main(String[] args) {
try {
String textUri = "mongodb://admin:password1@ds043388.mongolab.com:43388/stocks";
MongoURI uri = new MongoURI(textUri);
Mongo m = new Mongo(uri);
DB db = m.getDB("stocks");
DBCollection table = db.getCollection("stock");
BasicDBObject searchQuery = new BasicDBObject();
DBCursor cursor = table.find(searchQuery);
while (cursor.hasNext()) {
Security sec = (Security) cursor.next();
System.out.println(sec.getSymbol());
}
} catch (UnknownHostException e) {
e.printStackTrace();
} catch (MongoException e) {
e.printStackTrace();
}
}
}
Security.java
package com;
import com.mongodb.BasicDBObject;
public class Security extends BasicDBObject {
public Security() {
}
private String symbol;
public String getSymbol() {
return symbol;
}
public void setSymbol(String symbol) {
this.symbol = symbol;
}
}
This is the exception I am getting:
Exception in thread "main" java.lang.ClassCastException: com.mongodb.BasicDBObject cannot be cast to com.Security
at com.Data.main(Data.java:25)
You need to tell the driver what class to instantiate; try adding this line before you run your find():
table.setObjectClass(Security.class);
The getters on your Security class aren't really complete here; you'd need to do something like this:
public String getSymbol() {
return getString("symbol");
}
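Putting the two pieces together, a minimal sketch of the whole class under this approach (legacy 2.x driver API; the tvol getter is added purely for illustration):

package com;

import com.mongodb.BasicDBObject;

public class Security extends BasicDBObject {

    public Security() {
    }

    public String getSymbol() {
        return getString("symbol"); // read the "symbol" key of the underlying document
    }

    public void setSymbol(String symbol) {
        put("symbol", symbol); // write straight into the document map
    }

    public double getTvol() {
        return getDouble("tvol"); // numeric fields work the same way
    }
}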
By default, the MongoDB driver returns each result as a BasicDBObject, which is essentially a Map.
If you want to retrieve results as your own Java POJO, you have to implement the DBObject interface.
Reference
You are extending BasicDBObject, which itself implements DBObject, so you are fine in that respect.
However, you have to explicitly tell Mongo to return a Security, as follows:
collection.setObjectClass(Security.class);
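With that in place, the original loop should work unchanged; a sketch against the same legacy API:

DBCollection table = db.getCollection("stock");
table.setObjectClass(Security.class); // the driver now instantiates Security for each document

DBCursor cursor = table.find(new BasicDBObject());
while (cursor.hasNext()) {
    Security sec = (Security) cursor.next(); // the cast now succeeds
    System.out.println(sec.getSymbol());
}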
Related
I am trying to save an entity with a Spring Data MongoDB repository. I have an EventListener that cascades saves.
The problem is that I need to save an entity first to get its internal id, then perform further state mutations and save the entity again afterwards.
@Test
void testUpdate() {
FooDto fooDto = getResource("/json/foo.json", new TypeReference<FooDto>() {
});
Foo foo = fooMapper.fromDTO(fooDto);
foo = fooService.save(foo);
log.info("Saved foo: " + foo);
foo.setState(FooState.Bar);
foo = fooService.save(foo);
log.info("Updated foo: " + foo);
}
I have an index on a child collection of foo. It does not update the children but tries to insert them twice, which leads to org.springframework.dao.DuplicateKeyException.
Why does it not update but instead try to insert again?
Related:
Spring Data MongoRepository save causing Duplicate Key error
Edit: versions:
MongoDB 4,
Spring Boot 2.3.3.RELEASE
Edit: more details:
Repository:
public interface FooRepository extends MongoRepository<Foo, String>
Entity:
@Document
public class Foo {
@Id
private String id;
private FooState state;
@DBRef
@Cascade
private Collection<Bar> bars = new ArrayList<>();
...
}
CascadeMongoEventListener:
//from https://mflash.dev/blog/2019/07/08/persisting-documents-with-mongorepository/#unit-tests-for-the-accountrepository
public class CascadeMongoEventListener extends AbstractMongoEventListener<Object> {
private @Autowired
MongoOperations mongoOperations;
public @Override void onBeforeConvert(final BeforeConvertEvent<Object> event) {
final Object source = event.getSource();
ReflectionUtils
.doWithFields(source.getClass(), new CascadeSaveCallback(source, mongoOperations));
}
private static class CascadeSaveCallback implements ReflectionUtils.FieldCallback {
private final Object source;
private final MongoOperations mongoOperations;
public CascadeSaveCallback(Object source, MongoOperations mongoOperations) {
this.source = source;
this.mongoOperations = mongoOperations;
}
public @Override void doWith(final Field field)
throws IllegalArgumentException, IllegalAccessException {
ReflectionUtils.makeAccessible(field);
if (field.isAnnotationPresent(DBRef.class) && field.isAnnotationPresent(Cascade.class)) {
final Object fieldValue = field.get(source);
if (Objects.nonNull(fieldValue)) {
final var callback = new IdentifierCallback();
final CascadeType cascadeType = field.getAnnotation(Cascade.class).value();
if (cascadeType.equals(CascadeType.PERSIST) || cascadeType.equals(CascadeType.ALL)) {
if (fieldValue instanceof Collection<?>) {
((Collection<?>) fieldValue).forEach(mongoOperations::save);
} else {
ReflectionUtils.doWithFields(fieldValue.getClass(), callback);
mongoOperations.save(fieldValue);
}
}
}
}
}
}
private static class IdentifierCallback implements ReflectionUtils.FieldCallback {
private boolean idFound;
public @Override void doWith(final Field field) throws IllegalArgumentException {
ReflectionUtils.makeAccessible(field);
if (field.isAnnotationPresent(Id.class)) {
idFound = true;
}
}
public boolean isIdFound() {
return idFound;
}
}
}
Edit: expected behaviour
From the docs in org.springframework.data.mongodb.core.MongoOperations#save(T):
Save the object to the collection for the entity type of the object to
save. This will perform an insert if the object is not already
present, that is an 'upsert'.
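In other words, the insert-or-update decision hinges on the id; a minimal sketch of the documented contract (assuming a String @Id and a plain MongoRepository behind fooService):

Foo foo = new Foo();           // id is null at this point
foo = fooRepository.save(foo); // null id -> insert; MongoDB assigns an id

foo.setState(FooState.Bar);
fooRepository.save(foo);       // id is set -> the existing document is updated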
Edit - new insights:
It might be related to the index on the Bar child collection (@DBRef and @Cascade lead to mongoOperations::save being called from the EventListener).
I created another similar test with another entity and it worked.
The index on the child "Bar" entity (which is held as collection in parent "Foo" entity):
@CompoundIndex(unique = true, name = "fooId_name", def = "{'fooId': 1, 'name': 1}")
Update: I think I found the problem. Since I am using custom serialization/deserialization in my Converter (Document.parse()), the id field is not mapped properly. This results in the id being null, which leads to an insert instead of an update.
I will write an answer once I have resolved this properly.
public class MongoResultConversion {
@Component
@ReadingConverter
public static class ToResultConverter implements Converter<Document, Bar> {
private final ObjectMapper mapper;
@Autowired
public ToResultConverter(ObjectMapper mapper) {
this.mapper = mapper;
}
public Bar convert(Document source) {
String json = toJson(source);
try {
return mapper.readValue(json, new TypeReference<Bar>() {
});
} catch (JsonProcessingException e) {
throw new RuntimeException(e);
}
}
protected String toJson(Document source) {
return source.toJson();
}
}
@Component
@WritingConverter
public static class ToDocumentConverter implements Converter<Bar, Document> {
private final ObjectMapper mapper;
@Autowired
public ToDocumentConverter(ObjectMapper mapper) {
this.mapper = mapper;
}
public Document convert(Bar source) {
String json = toJson(source);
return Document.parse(json);
}
protected String toJson(Bar source) {
try {
return mapper.writeValueAsString(source);
} catch (JsonProcessingException e) {
throw new RuntimeException(e);
}
}
}
}
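For illustration, the extended-JSON round trip is what drops the id here (assuming the stock org.bson and Jackson behaviour):

Document doc = new Document("_id", new ObjectId());
System.out.println(doc.toJson());
// prints something like {"_id": {"$oid": "5f1e7c9b..."}}
// Jackson cannot bind that nested object to Bar's String id field,
// so the id stays null and Spring Data treats every save() as a fresh insert.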
As stated in my last edit, the problem was with the custom serialization/deserialization and Mongo document conversion. This resulted in the id being null, so an insert was done instead of an upsert.
The following code is my implementation of a custom converter that maps the ObjectId:
public class MongoBarConversion {
@Component
@ReadingConverter
public static class ToBarConverter implements Converter<Document, Bar> {
private final ObjectMapper mapper;
@Autowired
public ToBarConverter(ObjectMapper mapper) {
this.mapper = mapper;
}
public Bar convert(Document source) {
JsonNode json = toJson(source);
setObjectId(source, json);
return mapper.convertValue(json, new TypeReference<Bar>() {
});
}
protected void setObjectId(Document source, JsonNode jsonNode) {
ObjectNode modifiableObject = (ObjectNode) jsonNode;
String objectId = getObjectId(source);
modifiableObject.put(ID_FIELD, objectId);
}
protected String getObjectId(Document source) {
String objectIdLiteral = null;
ObjectId objectId = source.getObjectId("_id");
if (objectId != null) {
objectIdLiteral = objectId.toString();
}
return objectIdLiteral;
}
protected JsonNode toJson(Document source) {
JsonNode node = null;
try {
String json = source.toJson();
node = mapper.readValue(json, JsonNode.class);
} catch (JsonProcessingException e) {
throw new RuntimeException(e);
}
return node;
}
}
@Component
@WritingConverter
public static class ToDocumentConverter implements Converter<Bar, Document> {
private final ObjectMapper mapper;
@Autowired
public ToDocumentConverter(ObjectMapper mapper) {
this.mapper = mapper;
}
public Document convert(Bar source) {
try {
JsonNode jsonNode = toJson(source);
setObjectId(source, jsonNode);
String json = mapper.writeValueAsString(jsonNode);
return Document.parse(json);
} catch (JsonProcessingException e) {
throw new RuntimeException(e);
}
}
protected void setObjectId(Bar source, JsonNode jsonNode) throws JsonProcessingException {
ObjectNode modifiableObject = (ObjectNode) jsonNode;
JsonNode objectIdJson = getObjectId(source);
modifiableObject.set("_id", objectIdJson);
modifiableObject.remove(ID_FIELD);
}
protected JsonNode getObjectId(Bar source) throws JsonProcessingException {
ObjectNode _id = null;
String id = source.getId();
if (id != null) {
_id = JsonNodeFactory.instance.objectNode();
_id.put("$oid", id);
}
return _id;
}
protected JsonNode toJson(Bar source) {
return mapper.convertValue(source, JsonNode.class);
}
}
}
So to conclude: two subsequent saves will indeed lead to an upsert if the id is non-null. The bug was in my code.
All MongoDB drivers include functionality to generate ids on the client side. If you only save to get the id, research how to use client-side id generation and remove the first save entirely.
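For example, with a String id the identifier can be pre-assigned on the client (a sketch; the setId call assumes the usual setter on Foo, and ObjectId comes from org.bson.types):

Foo foo = fooMapper.fromDTO(fooDto);
foo.setId(new ObjectId().toHexString()); // id generated client-side, no save needed for it
foo.setState(FooState.Bar);              // mutate freely before persisting
fooService.save(foo);                    // one single save at the end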
I believe you are facing this issue because you save a second time without fetching from the DB first. You are changing the object returned by the save, not the object stored in the DB. Try retrieving the existing foo with a method like findById, then perform the next steps and save it.
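A sketch of that flow (assuming a fooRepository with the standard Spring Data findById; names are illustrative):

Foo saved = fooService.save(foo);
Foo reloaded = fooRepository.findById(saved.getId())
        .orElseThrow(() -> new IllegalStateException("foo not found"));
reloaded.setState(FooState.Bar);
fooService.save(reloaded); // operates on the stored document rather than the detached one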
I'm trying to map a JSON object from my DynamoDB table using DynamoDBMapper with the latest AWS Android SDK (com.amazonaws:aws-android-sdk-ddb-mapper:2.13.0), and I'm seeing this exception: "DynamoDBMappingException: Expected S in value...
The JSON object in my table has 3 attributes, 2 of which are strings and the third of which is a list of complex objects. I've created a class with the @DynamoDBDocument annotation for the complex object and used the proper marshalling annotation, but it doesn't seem to unmarshal the JSON into a Java object correctly.
The complex object is JSON in this format:
{
"allCitiesList": [
{
"city": "Auckland, New Zealand",
"times": {
"recTimes": [
"Jan1",
"Jan2"
]
}
}
]
}
public class CitiesDO {
private String city;
private String country;
private List<AllCitiesObject> allCitiesList;
...get/setters for other fields...
@DynamoDBMarshalling(marshallerClass = AllCitiesJSONMarshaller.class)
public List<AllCitiesObject> getAllCitiesList() {
return allCitiesList;
}
public void setAllCitiesList(List<AllCitiesObject> allCitiesList) {
this.allCitiesList = allCitiesList;
}
}
@DynamoDBDocument
public class AllCitiesObject {
@DynamoDBAttribute(attributeName = "allCitiesList")
private String data;
public AllCitiesObject(){}
public String getData() {
return data;
}
public void setData(String data) {
this.data = data;
}
}
class AllCitiesJSONMarshaller extends JsonMarshaller<AllCitiesObject> {}
I have also tried this approach with a custom marshaller, but with no success:
public class MyCustomMarshaller implements DynamoDBMarshaller<List<AllCitiesObject>> {
private static final ObjectMapper mapper = new ObjectMapper();
private static final ObjectWriter writer = mapper.writer();
@Override
public String marshall(List<AllCitiesObject> obj) {
try {
return writer.writeValueAsString(obj);
} catch (JsonProcessingException e) {
throw new RuntimeException(
"Unable to marshall the instance of " + obj.getClass()
+ "into a string", e);
}
}
@Override
public List<AllCitiesObject> unmarshall(Class<List<AllCitiesObject>> clazz, String json) {
final CollectionType
type =
mapper.getTypeFactory().constructCollectionType(List.class, AllCitiesObject.class);
try {
return mapper.readValue(json, type);
} catch (Exception e) {
throw new RuntimeException("Unable to unmarshall the string " + json
+ "into " + clazz, e);
}
}
}
The exception is:
DynamoDBMappingException: Expected S in value {L: [{M: {times={M: {recTimes={L: [{S: Jan1,}, {S: Jan2,}
I'm having difficulty with the unmarshalling, although I think I have it set up correctly. Can anyone please help me understand what I'm missing and how to approach this issue? I would really appreciate your help!
DynamoDBMarshalling is deprecated, so I suggest using the newer DynamoDBTypeConverted annotation.
There are some useful notes on Mapping Arbitrary Data.
You can also see an example of mine in this answer
In summary, you create a plain Java object for the cities data. You then write a simple converter class that tells DynamoDB how to turn that object into a string for storage. Similarly, the converter tells your application how to turn the string back into a Java object.
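A hedged sketch of such a converter pair for the question's list attribute, storing it as a single JSON string (Jackson is assumed to be available; the converter name is illustrative):

public class AllCitiesConverter implements DynamoDBTypeConverter<String, List<AllCitiesObject>> {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    @Override
    public String convert(List<AllCitiesObject> object) {
        try {
            return MAPPER.writeValueAsString(object); // stored as a single S (string) attribute
        } catch (Exception e) {
            throw new RuntimeException("Unable to serialize allCitiesList", e);
        }
    }

    @Override
    public List<AllCitiesObject> unconvert(String json) {
        try {
            return MAPPER.readValue(json,
                    MAPPER.getTypeFactory().constructCollectionType(List.class, AllCitiesObject.class));
        } catch (Exception e) {
            throw new RuntimeException("Unable to deserialize allCitiesList", e);
        }
    }
}

The getter is then annotated with @DynamoDBTypeConverted(converter = AllCitiesConverter.class) instead of the deprecated marshalling annotation. Note that this stores the attribute as a string, so items already stored as a native DynamoDB list (the L in the exception) would need migrating.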
If anyone else is absolutely stuck on this issue with the DynamoDB mapper, consider using the low-level DynamoDB client to explicitly convert and map your DO object to your table data. Due to time constraints, I'll come back to the mapping issue later and post the answer here in case it helps anyone else.
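A rough sketch of that low-level route (AWS SDK v1-style names; the table name and key are illustrative, and how you construct the client depends on your setup):

public List<String> readRecTimes(AmazonDynamoDB ddb, String cityKey) {
    Map<String, AttributeValue> item = ddb.getItem(new GetItemRequest()
            .withTableName("Cities")
            .withKey(Collections.singletonMap("city", new AttributeValue(cityKey))))
            .getItem();

    List<String> recTimes = new ArrayList<>();
    for (AttributeValue entry : item.get("allCitiesList").getL()) { // L = list attribute
        Map<String, AttributeValue> city = entry.getM();            // M = nested document
        for (AttributeValue t : city.get("times").getM().get("recTimes").getL()) {
            recTimes.add(t.getS());                                 // S = string element
        }
    }
    return recTimes;
}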
I have a collection "documentDev" in the database, with 'dNumber' as the sharding key.
Sample Document :
{
"_id" : "12831221wadaee23",
"dNumber" : "115",
"processed": false
}
If I try to update this document through a query tool using a command like
db.documentDev.update({
"_id" : ObjectId("12831221wadaee23"),
"dNumber":"115"
},{
$set:{"processed": true}},
{ multi: false, upsert: false }
)
It updates the document properly.
But if I use Spring Boot's MongoRepository method
DocumentRepo.save(Object)
it throws an exception:
Caused by: com.mongodb.MongoCommandException: Command failed with error 61: 'query in command must target a single shard key' on server by3prdddc01-docdb-3.documents.azure.com:10255. The full response is { "_t" : "OKMongoResponse", "ok" : 0, "code" : 61, "errmsg" : "query in command must target a single shard key", "$err" : "query in command must target a single shard key" }
This is my DocumentObject:
@Document(collection = "documentDev")
public class DocumentDev
{
@Id
private String id;
private String dNumber;
private String fileName;
private boolean processed;
}
This is my Repository Class -
@Repository
public interface DocumentRepo extends MongoRepository<DocumentDev, String> { }
and the value I am trying to update:
{
"_id" : "12831221wadaee23",
"dNumber" : "115",
"processed": true
}
The code I am trying to execute:
@Autowired
DocumentRepo docRepo;
docRepo.save(doc); // Fails to execute
Note: I have sharding enabled on the dNumber field, and I am able to update successfully using native queries in a NoSQL tool.
I was also able to execute the repository save operation on a non-sharded collection.
Update: I am able to update the document by creating a native query using MongoTemplate. My query looks like this:
public DocumentDev updateProcessedFlag(DocumentDev request) {
Query query = new Query();
query.addCriteria(Criteria.where("_id").is(request.getId()));
query.addCriteria(Criteria.where("dNumber").is(request.getDNumber()));
Update update = new Update();
update.set("processed", request.isProcessed());
mongoTemplate.updateFirst(query, update, request.getClass());
return request;
}
But this is not a generic solution, as any other field might need an update and my document may have other fields as well.
I had the same issue and solved it with the following hack:
@Configuration
public class ReactiveMongoConfig {
@Bean
public ReactiveMongoTemplate reactiveMongoTemplate(ReactiveMongoDatabaseFactory reactiveMongoDatabaseFactory,
MongoConverter converter, MyService service) {
return new ReactiveMongoTemplate(reactiveMongoDatabaseFactory, converter) {
@Override
protected Mono<UpdateResult> doUpdate(String collectionName, Query query, UpdateDefinition update,
Class<?> entityClass, boolean upsert, boolean multi) {
query.addCriteria(new Criteria("shardKey").is(service.getShardKey()));
return super.doUpdate(collectionName, query, update, entityClass, upsert, multi);
}
};
}
}
It would be nice to have an annotation like @ShardKey to mark a document field as the shard key and have it added to the query automatically.
Following the custom repository approach, I got an error because Spring expects a Cosmos entity to be available in the custom implementation {EntityName}CustomRepositoryImpl, so I renamed the implementation. I also added code for:
The case when the entity has inherited fields
The shard key is not always the id; we should add it along with the id: { "shardkeyName": "shardValue" }
Adding the generated ObjectId to the entity for new documents
public class DocumentRepositoryImpl<T> implements CosmosRepositoryCustom<T> {
@Autowired
protected MongoTemplate mongoTemplate;
@Override
public T customSave(T entity) {
WriteResult writeResult = mongoTemplate.upsert(createQuery(entity), createUpdate(entity), entity.getClass());
setIdForEntity(entity,writeResult);
return entity;
}
@Override
public T customSave(T entity, String collectionName) {
WriteResult writeResult = mongoTemplate.upsert(createQuery(entity), createUpdate(entity), collectionName);
setIdForEntity(entity,writeResult);
return entity;
}
@Override
public void customSave(List<T> entities) {
if(CollectionUtils.isNotEmpty(entities)){
entities.forEach(entity -> customSave(entity));
}
}
public Update createUpdate(T entity) {
Update update = new Update();
for (Field field : getAllFields(entity)) {
try {
field.setAccessible(true);
if (field.get(entity) != null) {
update.set(field.getName(), field.get(entity));
}
} catch (IllegalArgumentException | IllegalAccessException e) {
LOGGER.error("Error creating update for entity",e);
}
}
return update;
}
public Query createQuery(T entity) {
Criteria criteria = new Criteria();
for (Field field : getAllFields(entity)) {
try {
field.setAccessible(true);
if (field.get(entity) != null) {
if (field.getName().equals("id")) {
Query query = new Query(Criteria.where("id").is(field.get(entity)));
query.addCriteria(new Criteria(SHARD_KEY_NAME).is(SHARD_KEY_VALUE));
return query;
}
criteria.and(field.getName()).is(field.get(entity));
}
} catch (IllegalArgumentException | IllegalAccessException e) {
LOGGER.error("Error creating query for entity",e);
}
}
return new Query(criteria);
}
private List<Field> getAllFields(T entity) {
List<Field> fields = new ArrayList<>();
fields.addAll(Arrays.asList(entity.getClass().getDeclaredFields()));
Class<?> c = entity.getClass().getSuperclass();
if(!c.equals(Object.class)){
fields.addAll(Arrays.asList(c.getDeclaredFields()));
}
return fields;
}
public void setIdForEntity(T entity, WriteResult writeResult) {
if(null != writeResult && null != writeResult.getUpsertedId()){
Object upsertId = writeResult.getUpsertedId();
entity.setId(upsertId.toString()); // assumes T exposes setId, e.g. via a common base entity bound on T
}
}
}
I am using spring-boot-starter-mongodb:1.5.1 with spring-data-mongodb:1.9.11.
I am hacking around this by creating a custom repository:
public interface CosmosCustomRepository<T> {
void customSave(T entity);
void customSave(T entity, String collectionName);
}
The implementation of this repository:
public class CosmosCustomRepositoryImpl<T> implements CosmosCustomRepository<T> {
@Autowired
private MongoTemplate mongoTemplate;
@Override
public void customSave(T entity) {
mongoTemplate.upsert(createQuery(entity), createUpdate(entity), entity.getClass());
}
@Override
public void customSave(T entity, String collectionName) {
mongoTemplate.upsert(createQuery(entity), createUpdate(entity), collectionName);
}
private Update createUpdate(T entity) {
Update update = new Update();
for (Field field : entity.getClass().getDeclaredFields()) {
try {
field.setAccessible(true);
if (field.get(entity) != null) {
update.set(field.getName(), field.get(entity));
}
} catch (IllegalArgumentException | IllegalAccessException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
return update;
}
private Query createQuery(T entity) {
Criteria criteria = new Criteria();
for (Field field : entity.getClass().getDeclaredFields()) {
try {
field.setAccessible(true);
if (field.get(entity) != null) {
if (field.getName().equals("id")) {
return new Query(Criteria.where("id").is(field.get(entity)));
}
criteria.and(field.getName()).is(field.get(entity));
}
} catch (IllegalArgumentException | IllegalAccessException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
return new Query(criteria);
}
}
Your DocumentRepo will extend this new custom repository:
@Repository
public interface DocumentRepo extends MongoRepository<DocumentDev, String>, CosmosCustomRepository<DocumentDev> { }
To save a new document, just use the new customSave:
@Autowired
DocumentRepo docRepo;
docRepo.customSave(doc);
A more recent and simpler approach I've figured out is to use the @Sharded(shardKey = {...}) Spring annotation, like so:
@Document(collection = "documentDev")
@Sharded(shardKey = {"dNumber"})
public class DocumentDev {
@Id private String id;
private String dNumber;
private String fileName;
private boolean processed;
}
This is going to be picked up by MongoRepository automatically.
And here is the doc: https://docs.spring.io/spring-data/mongodb/docs/current/reference/html/#sharding
Enjoy coding!
I just want to be sure, when inserting a new DBObject into the DB, that it is really unique and the collection doesn't contain duplicates of the key field.
Here is how it looks now:
public abstract class AbstractMongoDAO<ID, MODEL> implements GenericDAO<ID, MODEL> {
protected Mongo client;
protected Class<MODEL> model;
protected DBCollection dbCollection;
/**
* Contains model data : unique key name and name of get method
*/
protected KeyField keyField;
@SuppressWarnings("unchecked")
protected AbstractMongoDAO() {
ParameterizedType genericSuperclass = (ParameterizedType) this.getClass().getGenericSuperclass();
model = (Class<MODEL>) genericSuperclass.getActualTypeArguments()[1];
getKeyField();
}
public void connect() throws UnknownHostException {
client = new MongoClient(Config.getMongoHost(), Integer.parseInt(Config.getMongoPort()));
DB clientDB = client.getDB(Config.getMongoDb());
clientDB.authenticate(Config.getMongoDbUser(), Config.getMongoDbPass().toCharArray());
dbCollection = clientDB.getCollection(getCollectionName(model));
}
public void disconnect() {
if (client != null) {
client.close();
}
}
@Override
public void create(MODEL model) {
Object keyValue = get(model);
try {
ObjectMapper mapper = new ObjectMapper();
String requestAsString = mapper.writerWithDefaultPrettyPrinter().writeValueAsString(model);
// check that it is not already present
BasicDBObject dbObject = new BasicDBObject((String) keyValue, requestAsString);
dbCollection.ensureIndex(dbObject, new BasicDBObject("unique", true));
dbCollection.insert(new BasicDBObject((String) keyValue, requestAsString));
} catch (Throwable e) {
throw new RuntimeException(String.format("Duplicate parameters '%s' : '%s'", keyField.id(), keyValue));
}
}
private Object get(MODEL model) {
Object result = null;
try {
Method m = this.model.getMethod(this.keyField.get());
result = m.invoke(model);
} catch (Exception e) {
throw new RuntimeException(String.format("Couldn't find method by name '%s' at class '%s'", this.keyField.get(), this.model.getName()));
}
return result;
}
/**
* Extract the name of the collection that is specified at the '@Entity' annotation.
*
* @param clazz is the model class object.
* @return the name of the collection that is specified.
*/
private String getCollectionName(Class<MODEL> clazz) {
Entity entity = clazz.getAnnotation(Entity.class);
String tableName = entity.value();
if (tableName.equals(Mapper.IGNORED_FIELDNAME)) {
// think about usual logger
tableName = clazz.getName();
}
return tableName;
}
private void getKeyField() {
for (Field field : this.model.getDeclaredFields()) {
if (field.isAnnotationPresent(KeyField.class)) {
keyField = field.getAnnotation(KeyField.class);
break;
}
}
if (keyField == null) {
throw new RuntimeException(String.format("Couldn't find key field at class : '%s'", model.getName()));
}
}
}
KeyField is a custom annotation:
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
public @interface KeyField {
String id();
String get();
String statusProp() default "ALL";
}
But I'm not sure that this solution really proves this. I'm new to Mongo.
Any suggestions?
Uniqueness can be maintained in MongoDB using the _id field. If we do not provide a value for this field, MongoDB automatically creates a unique id for each document in the collection.
So, in your case, just create a property called _id in Java and assign your unique field value to it. If it is duplicated, an exception will be thrown.
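A minimal sketch with the legacy driver (the collection and field values are illustrative):

DBCollection users = db.getCollection("users");

BasicDBObject user = new BasicDBObject("_id", "john@example.com")
        .append("name", "John");

users.insert(user); // first insert succeeds
users.insert(user); // second insert fails with a duplicate key error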
With Spring Data MongoDB (the question was tagged with spring-data, which is why I suggest it), all you need is this:
// Your types
class YourType {
BigInteger id;
@Indexed(unique = true) String emailAddress;
…
}
interface YourTypeRepository extends CrudRepository<YourType, BigInteger> { }
// Infrastructure setup, if you use Spring as container prefer @EnableMongoRepositories
MongoOperations operations = new MongoTemplate(new MongoClient(), "myDatabase");
MongoRepositoryFactory factory = new MongoRepositoryFactory(operations);
YourTypeRepository repository = factory.getRepository(YourTypeRepository.class);
// Now use it…
YourType first = …; // set email address
YourType second = …; // set same email address
repository.save(first);
repository.save(second); // will throw an exception
The crucial part most related to your original question is @Indexed, as this will cause the required unique index to be created when you create the repository.
What you get beyond that is:
no need to manually implement any repository (deleted code does not contain bugs \o/)
automatic object-to-document conversion
automatic index creation
powerful repository abstraction to easily query data by declaring query methods
For more details, check out the reference documentation.
I have recently started my first software development job out of university, and one of my current tasks is to convert raw JDBC code to use JdbcTemplate in order to get rid of boilerplate code.
I have written an example of a DAO class using JdbcTemplate which retrieves a user's address.
Would someone be able to tell me if this looks like the right pattern/approach, or am I missing anything here?
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.LinkedList;
import java.util.List;
import java.util.Map;

import oracle.jdbc.OracleTypes;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.jdbc.core.SqlOutParameter;
import org.springframework.jdbc.core.SqlParameter;
import org.springframework.jdbc.object.StoredProcedure;

public class AccountsDAO extends StoredProcedure {
    private static final String STOREDPROC = "accounts.getAccountDetails";

    public AccountsDAO(JdbcTemplate jdbcTemplate) {
        super(jdbcTemplate, STOREDPROC);
        declareParameter(new SqlParameter("client_id", OracleTypes.VARCHAR));
        declareParameter(new SqlOutParameter("accounts_csr", OracleTypes.CURSOR, new AccountAddressExtractor()));
        setFunction(false);
        compile();
    }

    private List<String> getAccountAddress(String account) {
        Map<String, Object> params = new HashMap<String, Object>();
        Map<String, Object> results;
        List<String> data = new LinkedList<String>();
        try {
            params.put("client_id", account);
            results = execute(params);
            // the cursor's rows come back under the declared out-parameter name
            data = (List<String>) results.get("accounts_csr");
        } catch (Exception e) {
            // report error
        }
        return data;
    }

    private class AccountAddressExtractor implements RowMapper<List<String>> {
        @Override
        public List<String> mapRow(ResultSet rs, int i) throws SQLException {
            List<String> data = new ArrayList<String>();
            data.add(rs.getString(1));
            data.add(rs.getString(2));
            data.add(rs.getString(3));
            return data;
        }
    }
}