I have this factory collection:
@Document(collection = "factory")
public class Factory
{
    private List<Product> products;
}
which embeds the Product as products.
When I have to add a product to an existing factory:
@Autowired
private FactoryRepository factoryRepository;

public void addProduct(Long id, Product product) {
    Factory f = factoryRepository.findById(id);
    f.addProduct(product);
    factoryRepository.save(f);
}
However, the issue is that Product is a large object with a set of heavy attributes, and a factory can have 2000 products.
So the retrieved factory causes large memory consumption, even though it is not needed in this phase. Is there a way to append a new product object directly into the factory document without reading the whole object?
EDIT:
As suggested in the comments, I tried:
public void addProduct(Long id, Product product) {
    Document find = new Document("_id", id);
    Document listItem = new Document("products", product);
    Document push = new Document("$push", listItem);
    collection.updateOne(find, push);
}
This gives the error:
org.bson.codecs.configuration.CodecConfigurationException:
Can't find a codec for class product
So I modified it to convert the product to a string before the push:
public void addProduct(Long id, Product product) {
    Document find = new Document("_id", id);
    ObjectWriter ow = new ObjectMapper().writer().withDefaultPrettyPrinter();
    Document listItem = new Document("products", ow.writeValueAsString(product));
    Document push = new Document("$push", listItem);
    collection.updateOne(find, push);
}
This pushed the object correctly, but when reading it back:
org.springframework.core.convert.ConverterNotFoundException:
No converter found capable of converting from type [java.lang.String] to type [Product]
Still, I got nowhere here. Any ideas on fixing this issue?
You should use MongoTemplate to update the factory, using push to add the product to the existing products. Something like:
package com.example.demo;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.ApplicationRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;

import java.util.List;

@SpringBootApplication
public class So62173077Application {

    public static void main(String[] args) {
        SpringApplication.run(So62173077Application.class, args);
    }

    @Autowired
    private MongoTemplate mongoTemplate;

    @Document(collection = "factory")
    public class Factory
    {
        private Long id;
        private List<Product> products;
    }

    public Long createFactory() {
        Factory factory = new Factory();
        factory.id = 1L;
        return mongoTemplate.insert(factory).id;
    }

    public void addProduct(Long id) {
        Query query = new Query();
        query.addCriteria(Criteria.where("id").is(id));
        Update update = new Update();
        Product product = new Product();
        product.name = "stackoverflow";
        update.push("products", product);
        mongoTemplate.updateFirst(query, update, Factory.class);
    }

    private class Product {
        private String name;
    }

    @Bean
    public ApplicationRunner runner() {
        return args -> {
            //Long id = createFactory();
            addProduct(1L);
        };
    }
}
Note that large arrays inside a MongoDB document are generally discouraged (fetching cost, atomic operations, complex querying).
Depending on your needs, I would suggest keeping Product in its own "product" collection instead; that would also solve your current issue (a rough sketch follows below).
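A minimal sketch of that alternative, assuming a hypothetical ProductRepository and a factoryId back-reference on Product (both names are illustrative, not from the original post):

import java.util.List;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.repository.MongoRepository;

// Hypothetical standalone "product" collection: each product references its factory by id.
@Document(collection = "product")
public class Product {
    @Id
    private String id;
    private Long factoryId; // illustrative back-reference to the owning factory
    private String name;
    // ... heavy attributes ...
}

// Spring Data derives findByFactoryId from the method name; adding a product is just
// productRepository.save(product), so the factory document is never loaded.
public interface ProductRepository extends MongoRepository<Product, String> {
    List<Product> findByFactoryId(Long factoryId);
}

(The two types would live in separate files; they are shown together here only for brevity.)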
Regards.
I want to retrieve a List<Document> (as an example) of all documents in a MongoDB collection for a given mongo shell query.
You can retrieve a collection without mapping each Document to a domain model.
Not sure what purpose you are chasing, but here you have an example:
package com.answers.stackoverflow.spring.mondbretrievedata.data;

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoCollection;
import org.bson.Document;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Repository;

import java.util.ArrayList;
import java.util.List;

@Repository
public class MongoRepository {

    private static final String DatabaseName = "EXAMPLE";
    private static final String CollectionName = "example";

    @Autowired
    private MongoClient client;

    public List<String> allDocuments() {
        final List<String> list = new ArrayList<>();
        final MongoCollection<Document> data = client.getDatabase(DatabaseName).getCollection(CollectionName);
        data.find().map(Document::toJson).forEach(list::add);
        return list;
    }
}
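If you also need to apply the mongo-shell-style filter mentioned in the question, a small variation of the same method could parse the filter JSON into a Document (documentsMatching is a hypothetical name; this assumes the same injected MongoClient and constants as above):

public List<String> documentsMatching(String shellFilterJson) {
    // e.g. shellFilterJson = "{ \"status\": \"ACTIVE\" }"
    final List<String> list = new ArrayList<>();
    final MongoCollection<Document> data = client.getDatabase(DatabaseName).getCollection(CollectionName);
    // Document.parse turns the shell-style JSON filter into a Bson filter usable by find()
    data.find(Document.parse(shellFilterJson)).map(Document::toJson).forEach(list::add);
    return list;
}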
When you use MongoRepository, you have to give it a persistent entity type. So use your model class as the type parameter when extending MongoRepository:
public interface YOUR_MODEL_Repository extends MongoRepository<MODEL_CLASS, String> {
}
See the official example on Product -> getAttributes(); for more details, visit the Spring Data MongoDB documentation.
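For illustration, a repository declared this way immediately provides the basic CRUD operations without any implementation (YOUR_MODEL_Repository and MODEL_CLASS are the same placeholders as above):

@Autowired
private YOUR_MODEL_Repository repository;

public void example(MODEL_CLASS entity) {
    repository.save(entity);                      // insert or update one document
    List<MODEL_CLASS> all = repository.findAll(); // fetch every document in the collection
}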
I am currently writing a NetBeans Platform application; for persistence I use JPA 2.1 with EclipseLink as the provider. I have the following entity class:
import java.util.UUID;
import javafx.beans.property.SimpleStringProperty;
import javax.persistence.Access;
import javax.persistence.AccessType;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.ManyToOne;
import javax.persistence.Transient;

/**
 *
 * @author forell
 */
@Entity
@Access(AccessType.FIELD)
public class MaterialAbc {

    @Id
    private String id;

    //... several additional fields, that all work fine

    private String substanceInstances = "";

    public MaterialAbc(){
        id = UUID.randomUUID().toString();
    }

    public String getId() {
        return id;
    }

    public void setId(String id) {
        this.id = id;
    }

    //.. getters and setters for the omitted fields

    public String getSubstanceInstances() {
        return substanceInstances;
    }

    public void setSubstanceInstances(String substanceInstances) {
        this.substanceInstances = substanceInstances;
    }
    /* IF I UNCOMMENT THESE LINES THE DATATABLE IS NOT CREATED ANYMORE
       IS THERE A WAY TO SOLVE THIS?
    public List<SubstanceInstance2> getSubs() {
        List<SubstanceInstance2> subs = new ArrayList<>();
        if (!"".equals(substanceInstances)){
            List<String> values = Arrays.asList(substanceInstances.split("|"));
            values.forEach(value->{
                subs.add(SubstanceInstance2.unSerialize(value));
            });
        }
        return subs;
    }

    public void setSubs(List<SubstanceInstance2> subs) {
        substanceInstances = "";
        subs.forEach(sub->{
            substanceInstances = substanceInstances + sub.serialize() + "|";
        });
        substanceInstances = substanceInstances.substring(0, substanceInstances.length()-1);
    }
    */
}
As written, the class works fine, but as soon as I uncomment the two methods at the bottom that "unserialize" an object nested in the substanceInstances string, EclipseLink no longer creates the database tables. Is there a way to solve this, or do I need to add an extra layer?
In the meantime I actually found a solution to the problem. It seems EclipseLink does not turn the entity bean into a table if lambda expressions are used in its methods. In the end I converted
values.forEach(value -> {
    subs.add(SubstanceInstance2.unSerialize(value));
});
into
for (String value : values) {
    subs.add(SubstanceInstance2.unSerialize(value));
}
And everything works nicely. As to the reason why, I have no idea!
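For reference, a sketch of what the converted getSubs() could look like as a whole (this simply mirrors the snippets above; note as an aside that String.split() takes a regular expression, so the pipe separator would need to be escaped as "\\|", which the original snippet does not do):

public List<SubstanceInstance2> getSubs() {
    List<SubstanceInstance2> subs = new ArrayList<>();
    if (!"".equals(substanceInstances)) {
        // split() expects a regex, so the literal pipe must be escaped
        List<String> values = Arrays.asList(substanceInstances.split("\\|"));
        for (String value : values) {
            subs.add(SubstanceInstance2.unSerialize(value));
        }
    }
    return subs;
}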
I'm totally new to Spring MVC but still trying to understand its methods and its way of referencing things. There's a video tutorial course I'm following.
I'm trying to implement a Model through a class.
ProductDaoImpl.java
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Repository;

import java.util.List;

@Repository
public class ProductDaoImpl implements ProductDao {

    @Autowired
    private SessionFactory sessionFactory;

    @Override
    public void addProduct(Product product) {
        Session session = sessionFactory.getCurrentSession();
        session.saveOrUpdate(product);
        session.flush();
    }

    @Override
    public Product getProductById(String id) {
        Session session = sessionFactory.getCurrentSession();
        Product product = (Product) session.get(Product.class.productId);
        // unable to resolve the productId on get()
        return product;
    }

    @Override
    public List<Product> getAllProducts() {
        return null;
    }

    @Override
    public void deleteProduct(String id) {
    }
}
ProductDao.java
import java.util.List;

public interface ProductDao {
    void addProduct(Product product);
    Product getProductById(String id);
    List<Product> getAllProducts();
    void deleteProduct(String id);
}
Product.java Model
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;

@Entity
public class Product {

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO) //tells the system that when an instance is put to database, it will be numbered automatically starting from 1
    private String productName;

    private String productCategory;
    private String productDescription;
    private double productPrice;
    private String productDimension;
    private String productStatus;
    private int unitInStock;
    private String productManufacturer;
    private String productId;
}
However, it's unable to resolve the productId argument in
@Override
public Product getProductById(String id) {
    Session session = sessionFactory.getCurrentSession();
    Product product = (Product) session.get(Product.class.productId);
    return product;
}
Is the get() method able to access the fields of Product.class?
Is it asking for a field, as in Product.class.fieldHere?
I don't understand why it can't resolve productId.
I hope you can help.
Thanks.
The expression Product.class.productId doesn't make sense in Java. Product.class is a class literal, meaning that it's a constant value that represents the Product class, an instance of java.lang.Class. Class has no productId field.
Additionally, you should read the Javadoc for the method--it takes two parameters, a Class object (to tell it what sort of thing you're getting) and an ID. Your call should therefore be session.get(Product.class, id), and if you're using Hibernate 5, you don't need to cast to a Product.
All of this is rather moot, however--instead of hand-rolling a DAO targeting Hibernate, use JPA (which provides advantages such as a generic API, eliminating the need for casting) and Spring Data (which will autogenerate this entire DAO for you from nothing but an empty interface declaration). Additionally, you're dealing with topics that can get complicated, and you would do well to go through some exercises to learn core Java before tackling something like ORM.
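For illustration, the corrected Hibernate call and the Spring Data JPA declaration alluded to above could look roughly like this, shown as fragments of the classes in the question (ProductRepository is a hypothetical name, and JpaRepository comes from org.springframework.data.jpa.repository):

// Corrected DAO method: pass the entity class and the id as two separate arguments.
@Override
public Product getProductById(String id) {
    Session session = sessionFactory.getCurrentSession();
    return session.get(Product.class, id); // no cast needed on Hibernate 5+
}

// Or let Spring Data JPA generate the whole DAO from an empty interface declaration:
public interface ProductRepository extends JpaRepository<Product, String> {
    // findById, save, delete, findAll, ... are all provided automatically
}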
session.get(Product.class.productId);
That's not valid Java code. get() expects two arguments: the entity class, and the ID of the entity to get. Arguments in Java are separated by a comma.
session.get(Product.class, productId);
But your variable isn't even named productId. It's named id. So the code should be
session.get(Product.class, id);
This is beginner Java stuff. I strongly suggest you practice with simpler Java exercises before using Spring and Hibernate, which are complex stuff.
session.get accepts two parameters: the class of your entity and the identifier.
Product product = (Product) session.get(Product.class, id);
I am using Jackson ObjectMapper to (de)serialize a class with a polymorphic nested class. Deserializing JSON into the class works fine, but when I serialize the class to JSON using the writeValueAsString function I observe duplicate values in the output.
public class Movie {

    private String movieName;

    @JsonTypeInfo(use=Id.NAME, include=As.EXTERNAL_PROPERTY, property="movieName")
    @JsonSubTypes({@JsonSubTypes.Type(value = StarWarsParams.class, name = "starwars")})
    private MovieParams movieParams;

    /* Getters and setters follow */
}

/* Empty class */
public class MovieParams {
}

public class StarWarsParams extends MovieParams {

    private String characterName;

    @JsonTypeInfo(use=Id.NAME, include=As.EXTERNAL_PROPERTY, property="characterName")
    @JsonSubTypes({@JsonSubTypes.Type(value = SithParameters.class, name = "Darth Vader")})
    private CharacterParams characterParams;

    /* Getters and setters follow */
}

/* Empty class */
public class CharacterParams {
}

public class SithParameters extends CharacterParams {
    private boolean canShootLightning;
}
The code snippet where the conversion is done is as follows:
Movie movie = new Movie();
movie.setMovieName("starwars");
StarWarsParams starWarsParams = new StarWarsParams();
starWarsParams.setCharacterName("Darth Vader");
SithParameters sithParameters = new SithParameters();
sithParameters.setCanShootLightning(false);
starWarsParams.setCharacterParams(sithParameters);
movie.setMovieParams(starWarsParams);
ObjectMapper mapper = new ObjectMapper();
String jsonStringSample = mapper.writeValueAsString(movie);
System.out.println(jsonStringSample);
The output, in which movieName and characterName are duplicated, is as follows:
{"movieName":"starwars","movieParams":{"characterName":"Darth Vader","characterParams":{"canShootLightning":false},"characterName":"Darth Vader"},"movieName":"starwars"}
This problem appears with older versions of Jackson (e.g. 1.9.2) but not with the latest ones from com.fasterxml. Jackson identifies two properties: one from the @JsonTypeInfo annotation and one from the getter. Two solutions:
Use a more recent version of Jackson from com.fasterxml
Move the @JsonTypeInfo annotation onto the getter instead of the field, e.g.
@JsonTypeInfo(use = Id.NAME, include = As.EXTERNAL_PROPERTY, property = "characterName")
public String getCharacterName() {
    return characterName;
}
Customizing the JSON output with a serializer is very simple.
I have written a class in my project to produce a customized serialized JSON object; here is an idea of how to implement this in a project.
Application (POJO Class)
import java.io.Serializable;
import java.util.List;
import org.webservice.business.serializer.ApplicationSerializer;
import com.fasterxml.jackson.databind.annotation.JsonSerialize;

@JsonSerialize(using = ApplicationSerializer.class)
public class Application implements Serializable {

    private static final long serialVersionUID = 1L;

    private double amount;
    private String businessType;
    private String currency;
    private int duration;

    // getters and setters omitted for brevity (the serializer below uses them)
}
Now the ApplicationSerializer class that contains the customization logic:
package org.webservice.business.serializer;

import java.io.IOException;
import org.webservice.business.dto.Application;
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.SerializerProvider;

public class ApplicationSerializer extends JsonSerializer<Application> {

    @Override
    public void serialize(Application prm_objObjectToSerialize, JsonGenerator prm_objJsonGenerator, SerializerProvider prm_objSerializerProvider) throws IOException, JsonProcessingException {
        if (null == prm_objObjectToSerialize) {
        } else {
            try {
                prm_objJsonGenerator.writeStartObject();
                prm_objJsonGenerator.writeNumberField("amount", prm_objObjectToSerialize.getAmount());
                prm_objJsonGenerator.writeNumberField("duration", prm_objObjectToSerialize.getDuration());
                prm_objJsonGenerator.writeStringField("businesstype", prm_objObjectToSerialize.getBusinessType());
                prm_objJsonGenerator.writeStringField("currency", prm_objObjectToSerialize.getCurrency());
            } catch (Exception v_exException) {
                v_exException.printStackTrace();
            } finally {
                prm_objJsonGenerator.writeEndObject();
            }
        }
    }
}
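For completeness, a minimal usage sketch (assuming the omitted getters and setters exist on Application): the @JsonSerialize annotation on the POJO makes ObjectMapper pick up the custom serializer automatically.

import com.fasterxml.jackson.databind.ObjectMapper;

ObjectMapper mapper = new ObjectMapper();
Application application = new Application();
// ... populate the fields via setters ...
String json = mapper.writeValueAsString(application); // throws JsonProcessingException; uses ApplicationSerializer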
I have tried to do something like this:
package org.dnylabs.kosh.data;

import java.net.UnknownHostException;
import java.util.HashMap;
import java.util.LinkedList;
import java.util.List;
import java.util.Map;

import com.google.code.morphia.Datastore;
import com.google.code.morphia.Morphia;
import com.google.code.morphia.annotations.Entity;
import com.google.code.morphia.annotations.Id;
import com.mongodb.Mongo;
import com.mongodb.MongoException;

@Entity
public class Temp {

    @Id String _id;
    List<Map<String,Object>> strings;

    public Temp(){
        strings=new LinkedList<Map<String,Object>>();
    }

    public static void main(String []args) throws UnknownHostException, MongoException{
        Mongo mongo=null;
        Morphia morphia=null;
        Datastore ds=null;

        mongo = new Mongo();
        morphia = new Morphia();
        morphia.map(Temp.class);
        ds = morphia.createDatastore(mongo, "test");

        Temp t = new Temp();
        t._id ="hi";

        Map<String, Object> m = new HashMap<String, Object>();
        m.put("Hi","1");
        m.put("Hi2",2);

        t.strings.add(m);

        ds.save(t);
        t=ds.get(t);
        ds.ensureIndexes();
    }
}
When I try to do a findAll() operation I get this exception:
Caused by: java.lang.RuntimeException: org.mongodb.morphia.mapping.MappingException: Embedded element isn't a DBObject! How can it be that is a class java.lang.String
at org.mongodb.morphia.mapping.EmbeddedMapper.fromDBObject(EmbeddedMapper.java:172)
at org.mongodb.morphia.mapping.Mapper.readMappedField(Mapper.java:602)
at org.mongodb.morphia.mapping.Mapper.fromDb(Mapper.java:559)
at org.mongodb.morphia.mapping.EmbeddedMapper.readMapOrCollectionOrEntity(EmbeddedMapper.java:256)
at org.mongodb.morphia.mapping.EmbeddedMapper.readCollection(EmbeddedMapper.java:203)
at org.mongodb.morphia.mapping.EmbeddedMapper.fromDBObject(EmbeddedMapper.java:144)
... 16 more
After numerous attempts I have found that the problem is the nested map.
Can anyone help me understand where I'm going wrong? The declaration seems correct to me.
Morphia sees the Map as a DB reference to another document rather than treating it as an embedded class stored within the same document. The solution would be to annotate the Map @Embedded, but this is not possible since you can't edit the Map class.
There is a way to achieve something similar to what you are trying by creating another class, defining the Map as a property of that class, and annotating it as @Embedded.
Change the Temp class:
public class Temp {

    @Id String _id;

    @Embedded // CHANGE HERE
    List<MapProxy> strings; // CHANGE HERE

    public Temp(){
        strings=new LinkedList<MapProxy>(); // CHANGE HERE
    }

    public static void main(String...args) throws UnknownHostException, MongoException{
        Mongo mongo=null;
        Morphia morphia=null;
        Datastore ds=null;

        mongo = new Mongo();
        morphia = new Morphia();
        morphia.map(Temp.class);
        ds = morphia.createDatastore(mongo, "test2");

        Temp t = new Temp();
        t._id ="hi";

        MapProxy mp = new MapProxy(); // CHANGE HERE

        mp.m.put("Hi","1"); // CHANGE HERE
        mp.m.put("Hi2",2); // CHANGE HERE

        t.strings.add(mp); // CHANGE HERE

        ds.save(t);
        t=ds.get(t);
        ds.ensureIndexes();
    }
}
and create a new class:
@Embedded
public class MapProxy {
    public Map<String,Object> m = new HashMap<String, Object>();
}
I have marked the changes I have made.
The structure that this produces is like this:
{
    "_id" : "hi",
    "className" : "YOUR CLASS NAME HERE",
    "strings" : [
        {
            "m" : {
                "Hi" : "1",
                "Hi2" : 2
            }
        }
    ]
}
So here's what's going on. Morphia will attempt to serialize any non-transient, non-static field on a class. If that field is of a type annotated with @Entity, Morphia will introspect it and set up proper mappings. If not, it falls back to some default, basic serializers when constructing the DBObjects to hand over to the Java driver. In this case, you have a raw type (Object) and Morphia has to make certain assumptions about its type and structure. Upon finding out that your structure violates those assumptions, it bails. This is why it works when you break things out like Alex has shown. This is probably fixable in one way or another, but as I'm planning on redoing the mapping code top to bottom, I don't foresee trying to fix this in the current code. Hope this helps.