How can I retrieve a mongodb collection using spring-data? - java

I want to retrieve a List<Document> (as an example) of all documents in a MongoDB collection for a given mongo shell query.

You can retrieve a collection without mapping Document to a domain model.
I'm not sure what purpose you are chasing, but here is an example:
package com.answers.stackoverflow.spring.mondbretrievedata.data;
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoCollection;
import org.bson.Document;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Repository;
import java.util.ArrayList;
import java.util.List;
@Repository
public class MongoRepository {
private static final String DatabaseName = "EXAMPLE";
private static final String CollectionName = "example";
@Autowired
private MongoClient client;
public List<String> allDocuments() {
final List<String> list = new ArrayList<>();
final MongoCollection<Document> data = client.getDatabase(DatabaseName).getCollection(CollectionName);
data.find().map(Document::toJson).forEach(list::add);
return list;
}
}
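If you also need to apply the given mongo shell query instead of fetching everything, one hedged option (assuming the filter part of the shell query is available as a JSON string) is to parse it with Document.parse and pass it to find(). The method below is a sketch that would sit in the same repository class; the filter string is illustrative, not from the question:
public List<String> documentsMatching(String shellJsonFilter) {
    // e.g. shellJsonFilter = "{ \"status\": \"ACTIVE\" }" (illustrative)
    final List<String> list = new ArrayList<>();
    final MongoCollection<Document> data = client.getDatabase(DatabaseName).getCollection(CollectionName);
    // Document.parse turns the shell-style JSON filter into a Bson filter for find()
    data.find(Document.parse(shellJsonFilter)).map(Document::toJson).forEach(list::add);
    return list;
}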

When you use MongoRepository, you have to supply a persistent entity type. So use your model class as the type parameter when extending MongoRepository:
public interface YOUR_MODEL_Repository extends MongoRepository<MODEL_CLASS, String> {
}

See the official example on Product -> getAttributes(); for more details, visit Spring Data - MongoDB.
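For example, a hedged sketch with a hypothetical Person model (the class and method names are illustrative, not from the question):
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.repository.MongoRepository;
import java.util.List;

@Document(collection = "example")
public class Person {
    @Id
    private String id;
    private String name;
    // getters and setters omitted for brevity
}

public interface PersonRepository extends MongoRepository<Person, String> {
    List<Person> findByName(String name); // derived query returning mapped documents
}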

Related

Insert embedded document without reading whole document - spring, mongo

I have this factory collection:
@Document(collection = "factory")
public class Factory
{
private List<Product> products;
}
which embeds the Product as products.
When I have to add a product to an existing factory:
@Autowired
private FactoryRepository factoryRepository;
public void addProduct(Long id, Product product) {
Factory f = factoryRepository.findById(id);
f.addProduct(product);
factoryRepository.save(f);
}
However, the issue is that product is a large object which contains a set of heavy attributes and the factory can have 2000 products.
So, the retrieved factory causes large memory consumption although it is not required in this phase. Is there a way to append a new product object directly into the factory document without reading the whole object?
EDIT:
As suggested in the comments, I tried:
public void addProduct(Long id, Product product) {
Document find = new Document("_id",id);
Document listItem = new Document("products",product);
Document push = new Document("$push", listItem);
collection.updateOne(find,push);
}
This gives the error:
org.bson.codecs.configuration.CodecConfigurationException:
Can't find a codec for class product
So I modified it to convert the product to a string before the push:
public void addProduct(Long id, Product product) {
Document find = new Document("_id",id);
ObjectWriter ow = new ObjectMapper().writer().withDefaultPrettyPrinter();
Document listItem = new Document("products",ow.writeValueAsString(product));
Document push = new Document("$push", listItem);
collection.updateOne(find,push);
}
This pushed the object correctly, but when reading it back:
org.springframework.core.convert.ConverterNotFoundException:
No converter found capable of converting from type [java.lang.String] to type [Product]
Still, I got nowhere here. Any ideas on fixing this issue?
You should use MongoTemplate to update the factory, with a push that adds to the existing products. Something like:
package com.example.demo;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.ApplicationRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import java.util.List;
@SpringBootApplication
public class So62173077Application {
public static void main(String[] args) {
SpringApplication.run(So62173077Application.class, args);
}
@Autowired
private MongoTemplate mongoTemplate;
@Document(collection = "factory")
public class Factory
{
private Long id;
private List<Product> products;
}
public Long createFactory() {
Factory factory = new Factory();
factory.id = 1L;
return mongoTemplate.insert(factory).id;
}
public void addProduct(Long id) {
Query query = new Query();
query.addCriteria(Criteria.where("id").is(id));
Update update = new Update();
Product product = new Product();
product.name = "stackoverflow";
update.push("products", product);
mongoTemplate.updateFirst(query, update, Factory.class);
}
private class Product {
private String name;
}
@Bean
public ApplicationRunner runner() {
return args -> {
//Long id = createFactory();
addProduct(1L);
};
}
}
Large arrays in MongoDB may be discouraged (fetching, atomic operations, and complex querying become harder).
Depending on your needs, I would suggest you handle Product in a brand new "product" collection. Your current issue would be solved as well.
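A hedged sketch of that alternative (class, field, and collection names are illustrative, not from the question): keep each product in its own collection and push only its id onto the factory, so adding a product never loads the existing ones.
// Illustrative only: products live in their own collection and the factory
// keeps just a list of product ids, so appending one product stays cheap.
@Document(collection = "product")
public class ProductDoc {
    @Id
    private String id;
    private String name;
    public String getId() { return id; }
}

public void addProduct(Long factoryId, ProductDoc product) {
    ProductDoc saved = mongoTemplate.insert(product);               // 1. insert the product document
    Query query = new Query(Criteria.where("id").is(factoryId));    // 2. match the factory by id
    Update update = new Update().push("productIds", saved.getId()); // 3. append only the id
    mongoTemplate.updateFirst(query, update, Factory.class);        // the factory is never loaded
}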
Regards.

[Spark Dataset]: CreateDataset API fails with Java object containing Abstract Class

I am trying to convert a JavaRDD to a Dataset using the createDataset(RDD<T> data, Encoder<T> evidence) function, but I am getting the following error. I have used a subset of my use case below.
My use case:
I am trying to convert a JavaRDD with a complex nested object (with abstract classes) to a Dataset so that I can write the data in ORC/Parquet format (JavaRDD doesn't support ORC/Parquet).
The input data is in Avro format. There is an infinite recursion problem in createDataFrame for Avro types (see https://issues.apache.org/jira/browse/SPARK-25789), which is why I am loading the data into a JavaRDD first.
Requirements:
Encoders.kryo() and Encoders.javaSerialization() work here, but I want to use Encoders.bean().
Encoders.bean(T) leverages the structure of an object to provide a class-specific storage layout. Since I am using the Parquet (columnar storage) format, each class field can be stored in a different column with Encoders.bean(T), whereas
Encoders.kryo(T) and Encoders.javaSerialization(T) map T into a single byte array (binary) field and thus store the object in one single column.
If a custom serializer is required then please elaborate on the solution.
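Since Encoders.kryo() is already noted to work, here is a minimal hedged sketch of that fallback (the method name and output path are illustrative) to make the trade-off above concrete: the abstract-typed field serializes fine because Kryo records the concrete runtime class, but the whole object lands in a single binary column of the Parquet output rather than one column per field.
// Hedged sketch: a Kryo-based variant of dataset_test() in the DataSetConverter class shown below.
public void dataset_test_kryo() {
    JavaSparkContext jsc = new JavaSparkContext(session.sparkContext());
    JavaRDD<Data> rowsrdd = jsc.parallelize(prepareData());
    // Kryo serializes the concrete runtime type (Bclass), so the abstract Aclass field is handled,
    // but the Data object is stored as one binary column instead of per-field columns.
    Dataset<Data> rows = session.createDataset(rowsrdd.rdd(), Encoders.kryo(Data.class));
    rows.write().mode("overwrite").parquet("/tmp/data-kryo"); // output path is just an example
}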
Classes Used:
import lombok.AllArgsConstructor;
import lombok.NoArgsConstructor;
import java.io.Serializable;
import java.util.Map;
@AllArgsConstructor
@NoArgsConstructor
@lombok.Data
public class Data implements Serializable {
private String id;
private Map<Type, Aclass> data;
}
import com.fasterxml.jackson.annotation.JsonSubTypes;
import com.fasterxml.jackson.annotation.JsonTypeInfo;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import java.io.Serializable;
@AllArgsConstructor
@NoArgsConstructor
@Data
@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, include = JsonTypeInfo.As.PROPERTY, property = "uad")
@JsonSubTypes(value = {@JsonSubTypes.Type(name = "UADAffinity", value = Bclass.class)})
public abstract class Aclass implements Serializable {
String t;
}
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import java.util.Map;
@AllArgsConstructor
@NoArgsConstructor
@Data
public class Bclass extends Aclass {
private Map<String, String> data;
public Bclass(String t, Map<String, String> data) {
super(t);
this.data = data;
}
}
public enum Type {
A, B;
}
Logic:
import com.flipkart.ads.neo.schema.Type;
import com.google.common.collect.Lists;
import com.google.common.collect.Maps;
import com.xyz.schema.Aclass;
import com.xyz.schema.Bclass;
import com.xyz.schema.Data;
import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoders;
import org.apache.spark.sql.SparkSession;
import java.util.List;
import java.util.Map;
public class DataSetConverter {
private SparkSession session;
public DataSetConverter() {
session = initSpark();
}
SparkSession initSpark() {
SparkConf conf = new SparkConf().setMaster("local[*]").setAppName("123");
return SparkSession.builder()
.sparkContext(new SparkContext(conf))
.getOrCreate();
}
public void dataset_test() {
List<Data> dataList = prepareData();
JavaSparkContext jsc = new JavaSparkContext(session.sparkContext());
JavaRDD<Data> rowsrdd = jsc.parallelize(dataList);
Dataset<Data> rows = session.createDataset(rowsrdd.rdd(), Encoders.bean(Data.class));
System.out.println(rows.takeAsList(3));
}
public static void main(String[] args) {
new DataSetConverter().dataset_test();
}
private List<Data> prepareData() {
List<Data> dataList = Lists.newArrayList();
Data sample1 = getData();
Data sample2 = getData();
dataList.add(sample1);
dataList.add(sample2);
return dataList;
}
private Data getData() {
Map<Type, Aclass> data = getUadData("ppv");
return new Data("123", data);
}
private Map<Type, Aclass> getUadData(String id) {
Map<Type, Aclass> result = Maps.newHashMap();
Map<String, String> data = Maps.newHashMap();
data.put(id + "11", "11");
result.put(Type.A, new Bclass("123", data));
return result;
}
}
Error:
org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 150, Column 11: Cannot instantiate abstract "com.xyz.schema.Aclass"
at org.codehaus.janino.UnitCompiler.compileError(UnitCompiler.java:12124)
at org.codehaus.janino.UnitCompiler.compileGet2(UnitCompiler.java:5260)
at org.codehaus.janino.UnitCompiler.access$9800(UnitCompiler.java:215)
at org.codehaus.janino.UnitCompiler$16.visitNewClassInstance(UnitCompiler.java:4433)
at org.codehaus.janino.UnitCompiler$16.visitNewClassInstance(UnitCompiler.java:4394)
at org.codehaus.janino.Java$NewClassInstance.accept(Java.java:5179)
at org.codehaus.janino.UnitCompiler.compileGet(UnitCompiler.java:4394)
at org.codehaus.janino.UnitCompiler.compileGetValue(UnitCompiler.java:5575)
at org.codehaus.janino.UnitCompiler.compileGet2(UnitCompiler.java:4703)
at org.codehaus.janino.UnitCompiler.access$8800(UnitCompiler.java:215)
at org.codehaus.janino.UnitCompiler$16.visitConditionalExpression(UnitCompiler.java:4418)
at org.codehaus.janino.UnitCompiler$16.visitConditionalExpression(UnitCompiler.java:4394)
at org.codehaus.janino.Java$ConditionalExpression.accept(Java.java:4504)
at org.codehaus.janino.UnitCompiler.compileGet(UnitCompiler.java:4394)
at org.codehaus.janino.UnitCompiler.compileGetValue(UnitCompiler.java:5575)
at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:2580)
at org.codehaus.janino.UnitCompiler.access$2700(UnitCompiler.java:215)
at org.codehaus.janino.UnitCompiler$6.visitLocalVariableDeclarationStatement(UnitCompiler.java:1503)
at org.codehaus.janino.UnitCompiler$6.visitLocalVariableDeclarationStatement(UnitCompiler.java:1487)
at org.codehaus.janino.Java$LocalVariableDeclarationStatement.accept(Java.java:3511)
at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:1487)
at org.codehaus.janino.UnitCompiler.compileStatements(UnitCompiler.java:1567)
at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:3388)
at org.codehaus.janino.UnitCompiler.compileDeclaredMethods(UnitCompiler.java:1357)
at org.codehaus.janino.UnitCompiler.compileDeclaredMethods(UnitCompiler.java:1330)
at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:822)
at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:981)
at org.codehaus.janino.UnitCompiler.access$700(UnitCompiler.java:215)
at org.codehaus.janino.UnitCompiler$2.visitMemberClassDeclaration(UnitCompiler.java:414)
at org.codehaus.janino.UnitCompiler$2.visitMemberClassDeclaration(UnitCompiler.java:406)
at org.codehaus.janino.Java$MemberClassDeclaration.accept(Java.java:1295)
at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:406)
at org.codehaus.janino.UnitCompiler.compileDeclaredMemberTypes(UnitCompiler.java:1306)
at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:848)
at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:432)
Please Help!!

How to post a list to Spring Data Rest?

I followed this example, which allows posting a single Person object. I want a REST service where I can post a collection of Person objects at once, e.g. a list (or any collection) named Team with numerous Person objects, in just one call.
I mean, my question is not exactly about the OneToMany relationship, where you send each person in a REST request. This topic is well answered.
I want to send a collection of Person objects taking advantage of @RepositoryRestResource or another feature of Spring Data Rest. Is this possible with Spring Data Rest, or should I work around it by creating a controller, receiving the list, and parsing the Team list to insert each Person?
I found this feature request, which seems to indicate that Spring Data Rest is currently missing what I am looking for, but I am not sure.
In my business requirement, application A will post a list of orders to application B, and I have to save it in the database for future processing. After reading about Spring Data Rest and building some samples, I found its clean architecture amazing and very suitable for my requirement, except that I couldn't figure out how to post a list.
Well, AFAIK you can't do that with spring-data-rest; just read the docs and you will see that there is no mention of posting a list to a collection resource.
The reason for this is unclear to me, but for one thing, REST itself doesn't really specify how you should do batch operations.
So it's unclear how one should approach that feature: should you POST a list to the collection resource? Or should you expose a resource like /someentity/batch that would be able to patch, remove and add entities in one batch? If you add a list, how should you return the ids? For a single POST to a collection, spring-data-rest returns the id in the Location header; for a batch add this cannot be done.
That doesn't justify spring-data-rest missing batch operations. They should implement this IMHO, but at least it may help to understand why it is missing.
What I can say, though, is that you can always add your own controller to the project that handles /someentity/batch properly, and you could probably even make a library out of that so you can use it in other projects. Or even fork spring-data-rest and add this feature. Although I tried to understand how it works and have failed so far.
But you probably know all that, right?
There is a feature request for this.
Based on user1685095's answer, you can make a custom controller PersonRestController and expose posting a collection of Person, as it seems not to be exposed yet by Spring Data Rest:
@RepositoryRestController
@RequestMapping(value = "/persons")
public class PersonRestController {
private final PersonRepository repo;
@Autowired
public PersonRestController(PersonRepository repo) {
this.repo = repo;
}
@RequestMapping(method = RequestMethod.POST, value = "/batch", consumes = "application/json", produces = "application/json")
public @ResponseBody ResponseEntity<?> savePersonList(@RequestBody Resource<PersonWrapper> personWrapper,
PersistentEntityResourceAssembler assembler) {
Resources<Person> resources = new Resources<Person>(repo.save(personWrapper.getContent().getContent()));
//TODO add extra links `assembler`
return ResponseEntity.ok(resources);
}
}
PersonWrapper is introduced to fix:
Can not deserialize instance of org.springframework.hateoas.Resources out of START_ARRAY token\n at [Source: java.io.PushbackInputStream#3298b722; line: 1, column: 1]
Update
public class PersonWrapper{
private List<Person> content;
public List<Person> getContent(){
return content;
}
public void setContent(List<Person> content){
this.content = content;
}
}
public class Person{
private String name;
private String email;
// Other fields
// GETTER & SETTER
}
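An example request body for POST /persons/batch would then look like this (the values are illustrative); note that the top-level "content" key has to match the field name in PersonWrapper:
{
  "content": [
    { "name": "Alice", "email": "alice@example.com" },
    { "name": "Bob", "email": "bob@example.com" }
  ]
}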
I tried to use @RequestBody List<Resource<MyPojo>>.
When the request body does not contain any links, it works well, but
if an element carries a link, the server cannot deserialize the request body.
Then I tried to use @RequestBody Resources<MyPojo>, but I could not figure out the default name of the list.
Finally, I tried a wrapper which contains List<Resource<MyPojo>>, and it works.
Here is my solution:
First create a wrapper class for List<Resource<MyPojo>>:
public class Bulk<T> {
private List<Resource<T>> bulk;
// getter and setter
}
Then use @RequestBody Resource<Bulk<MyPojo>> for parameters.
Finally, example json with links for create bulk data in one request:
{
"bulk": [
{
"title": "Spring in Action",
"author": "http://localhost:8080/authors/1"
},
{
"title": "Spring Quick Start",
"author": "http://localhost:8080/authors/2"
}
]
}
@RequestMapping(method=RequestMethod.POST, value="/batchInsert", consumes = "application/json", produces = "application/json")
@ResponseBody
public ResponseEntity<?> batchInsert(@RequestBody Resources<Person> people, PersistentEntityResourceAssembler assembler) throws Exception {
Iterable<Person> s = repo.save( people.getContent() ); // save entities
List<PersistentEntityResource> list = new ArrayList<PersistentEntityResource>();
Iterator<Person> itr = s.iterator();
while(itr.hasNext()) {
list.add( assembler.toFullResource( itr.next() ) );
}
return ResponseEntity.ok( new Resources<PersistentEntityResource>(list) );
}
Based on totran's answer, this is my code:
These are the dependencies:
springBootVersion = '2.4.2'
springDependencyManagement = '1.0.10.RELEASE'
implementation 'org.springframework.boot:spring-boot-starter-data-jpa'
implementation 'org.springframework.boot:spring-boot-starter-data-rest'
testImplementation 'org.springframework.boot:spring-boot-starter-test'
The code:
import icu.kyakya.rest.jpa.model.Address;
import org.springframework.data.jpa.repository.Modifying;
import org.springframework.data.jpa.repository.Query;
import org.springframework.data.repository.PagingAndSortingRepository;
import org.springframework.data.repository.query.Param;
import org.springframework.data.rest.core.annotation.RepositoryRestResource;
import org.springframework.data.rest.core.annotation.RestResource;
import org.springframework.transaction.annotation.Transactional;
import java.util.List;
@RepositoryRestResource(collectionResourceRel = "address", path = "address")
public interface AddressRepository extends PagingAndSortingRepository<Address, Long> {
//...
}
import lombok.Data;
import java.util.List;
@Data
public class Bulk<T> {
private List<T> bulk;
}
import lombok.RequiredArgsConstructor;
import org.springframework.data.rest.webmvc.BasePathAwareController;
import org.springframework.data.rest.webmvc.RepositoryRestController;
import org.springframework.hateoas.EntityModel;
import org.springframework.hateoas.server.ExposesResourceFor;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import java.util.List;
import java.util.Objects;
@BasePathAwareController // if a base url exists, it needs to be added here
@RepositoryRestController
@RequiredArgsConstructor
@ExposesResourceFor(Address.class)
public class AddressController {
private final AddressRepository repo;
@PostMapping("/address/saveAll")
public ResponseEntity<Iterable<Address>> saveAll(@RequestBody EntityModel<Bulk<Address>> bulk) {
List<Address> addresses = Objects.requireNonNull(bulk.getContent()).getBulk();
Iterable<Address> resp = repo.saveAll(addresses);
return new ResponseEntity<>(resp,HttpStatus.CREATED);
}
}
The way that is more like Spring Data REST:
import lombok.RequiredArgsConstructor;
import org.springframework.data.rest.webmvc.BasePathAwareController;
import org.springframework.data.rest.webmvc.RepositoryRestController;
import org.springframework.data.rest.webmvc.support.RepositoryEntityLinks;
import org.springframework.hateoas.CollectionModel;
import org.springframework.hateoas.EntityModel;
import org.springframework.hateoas.Link;
import org.springframework.hateoas.server.ExposesResourceFor;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import java.util.ArrayList;
import java.util.List;
import java.util.Objects;
@BasePathAwareController // if a base url exists, it needs to be added here
@RepositoryRestController
@RequiredArgsConstructor
@ExposesResourceFor(Address.class)
public class AddressController {
private final AddressRepository repo;
private final RepositoryEntityLinks entityLinks; //get link
/**
* curl -i -X POST -H "Content-Type:application/json" -d '{ "bulk": [ {"country" : "Japan" , "city" : "Tokyo" }, {"country" : "Japan" , "city" : "Osaka" }]} ' http://localhost:8080/api/v1/address/saveAll
*
* @return 201 https://docs.spring.io/spring-data/rest/docs/current/reference/html/#repository-resources.default-status-codes
*/
@PostMapping("/address/saveAll")
public ResponseEntity<CollectionModel<EntityModel<Address>>> saveAll(@RequestBody EntityModel<Bulk<Address>> bulk) {
List<Address> data = Objects.requireNonNull(bulk.getContent()).getBulk();
Iterable<Address> addresses = repo.saveAll(data);
ArrayList<EntityModel<Address>> models = new ArrayList<>();
addresses.forEach(i->{
Link link = entityLinks.linkToItemResource(Address.class, i.getId()).withRel("self");
models.add(EntityModel.of(i).add(link));
});
return new ResponseEntity<>(CollectionModel.of(models),HttpStatus.CREATED);
}
}

querying couchbase with spring-data-couchbase, using multiple columns

I am using Couchbase 3 with spring-data-couchbase, and I want to query data using a Spring Data repository with multiple columns (fields).
public interface UserAccountRepository extends CrudRepository<UserAccount, Long> {
public UserAccount findByEmail(Query eMail);
public UserAccount findByEmailAndStatus(Query query); // finder with multiple fields, but not getting the result
}
How should I write the map and reduce functions for this?
For the function findByEmail(Query eMail) to work, I have added a view with this map function:
function (doc, meta) {
emit(doc.email,doc);
}
This view has email as the key and the document as the value.
But what if I need to query using email and status? How should the view look?
I have seen this link, but it is not very clear:
https://stackoverflow.com/questions/28938755
I was able to make a Spring Data finder method invoke a compound-key view.
My document name is: Data
Compound Key View
function (doc, meta) {
if(doc && doc._class == "com.couchbase.entity.Data"){
emit([doc.key1, doc.key2], doc);
}
}
The Spring Data repository interface is shown below:
package com.couchbase.repository;
import java.util.List;
import org.springframework.data.couchbase.core.view.View;
import org.springframework.data.repository.CrudRepository;
import org.springframework.data.repository.query.Param;
import com.couchbase.client.protocol.views.Query;
import com.couchbase.entity.Data;
public interface DataRepository extends CrudRepository<Data, String> {
@View(designDocument="Data",viewName="findByKey1AndKey2")
public List<Data> findByKey1AndKey2(Query query);
}
The test class is shown below:
import com.couchbase.client.protocol.views.ComplexKey;
import com.couchbase.client.protocol.views.Query;
public class DataTest extends WebAppConfigurationAware{
@Autowired
private DataRepository dataRepository;
@Test
public void testStringDataCompoundQuery(){
Object[] objArr = new Object[2];
objArr[0] = "aaa";
objArr[1] = 1;
Query query = new Query();
query.setKey(ComplexKey.of(objArr));
System.out.println(dataRepository.findByKey1AndKey2(query));
}
}
You could use a compound key as described in the documentation here: http://docs.couchbase.com/developer/dev-guide-3.0/compound-keys.html
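Applied to the original email-and-status case, a hedged sketch of such a compound-key view (same pattern as the map functions above; you may also want a doc._class check as in the earlier answer) would be:
function (doc, meta) {
  // compound key [email, status] so a finder like findByEmailAndStatus can query both fields
  if (doc.email && doc.status) {
    emit([doc.email, doc.status], doc);
  }
}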

Morphia List<Map<String,Object>> returns "Embedded element isn't a DBObject" on find operation

I have tried to do something like this:
package org.dnylabs.kosh.data;
import java.net.UnknownHostException;
import java.util.HashMap;
import java.util.LinkedList;
import java.util.List;
import java.util.Map;
import com.google.code.morphia.Datastore;
import com.google.code.morphia.Morphia;
import com.google.code.morphia.annotations.Entity;
import com.google.code.morphia.annotations.Id;
import com.mongodb.Mongo;
import com.mongodb.MongoException;
@Entity
public class Temp {
@Id String _id;
List<Map<String,Object>> strings;
public Temp(){
strings=new LinkedList<Map<String,Object>>();
}
public static void main(String []args) throws UnknownHostException, MongoException{
Mongo mongo=null;
Morphia morphia=null;
Datastore ds=null;
mongo = new Mongo();
morphia = new Morphia();
morphia.map(Temp.class);
ds = morphia.createDatastore(mongo, "test");
Temp t = new Temp();
t._id ="hi";
Map<String, Object> m = new HashMap<String, Object>();
m.put("Hi","1");
m.put("Hi2",2);
t.strings.add(m);
ds.save(t);
t=ds.get(t);
ds.ensureIndexes();
}
}
When I try to do a findAll() operation I get this exception:
Caused by: java.lang.RuntimeException: org.mongodb.morphia.mapping.MappingException: Embedded element isn't a DBObject! How can it be that is a class java.lang.String
at org.mongodb.morphia.mapping.EmbeddedMapper.fromDBObject(EmbeddedMapper.java:172)
at org.mongodb.morphia.mapping.Mapper.readMappedField(Mapper.java:602)
at org.mongodb.morphia.mapping.Mapper.fromDb(Mapper.java:559)
at org.mongodb.morphia.mapping.EmbeddedMapper.readMapOrCollectionOrEntity(EmbeddedMapper.java:256)
at org.mongodb.morphia.mapping.EmbeddedMapper.readCollection(EmbeddedMapper.java:203)
at org.mongodb.morphia.mapping.EmbeddedMapper.fromDBObject(EmbeddedMapper.java:144)
... 16 more
After numerous attempts I have found that the problem is the nested map.
Can anyone help me understand where I'm wrong? The statement seems correct.
Morphia sees the Map as a DB reference to another document rather than as an embedded class to be treated as a document. The solution would be to annotate the Map @Embedded, but this is not possible since you can't edit the Map class.
There is a way to achieve something similar to what you are trying by creating another class, defining the Map as a property of this class, and annotating it as @Embedded.
Change the Temp class:
public class Temp {
@Id String _id;
@Embedded // CHANGE HERE
List<MapProxy> strings; // CHANGE HERE
public Temp(){
strings=new LinkedList<MapProxy>(); // CHANGE HERE
}
public static void main(String...args) throws UnknownHostException, MongoException{
Mongo mongo=null;
Morphia morphia=null;
Datastore ds=null;
mongo = new Mongo();
morphia = new Morphia();
morphia.map(Temp.class);
ds = morphia.createDatastore(mongo, "test2");
Temp t = new Temp();
t._id ="hi";
MapProxy mp = new MapProxy(); // CHANGE HERE
mp.m.put("Hi","1"); // CHANGE HERE
mp.m.put("Hi2",2); // CHANGE HERE
t.strings.add(mp); // CHANGE HERE
ds.save(t);
t=ds.get(t);
ds.ensureIndexes();
}
}
and create a new class:
@Embedded
public class MapProxy {
public Map<String,Object> m = new HashMap<String, Object>();
}
I have marked the changes I have made.
The structure that this produces is like this:
{
"_id" : "hi",
"className" : "YOUR CLASS NAME HERE",
"strings" :
[ {
"m" :
{
"Hi" : "1" ,
"Hi2" : 2
}
} ]
}
So here's what's going on. Morphia will attempt to serialize any non-transient, non-static field on a class. Now, if that field is of a type annotated with @Entity, Morphia will introspect it and set up proper mappings. If not, it will use some default basic serializers when constructing the DBObjects to hand over to the Java driver. In this case, you have a raw type (Object) and Morphia has to make certain assumptions about its type and structure. Upon finding out that your structure violates those assumptions, it bails. This is why it works when you break things out like Alex has shown. This is probably fixable one way or another, but as I'm planning on redoing the mapping code top to bottom, I don't foresee trying to fix this in the current code. Hope this helps.
