I have a table "class" which is linked to the tables "student" and "teacher".
A "class" is linked to multiple students and teachers via a foreign key relationship.
When I use Hibernate associations and fetch a large number of entities (tried with 5000), I see that it takes about 4 times more memory than if I just use foreign key placeholders.
Is there something wrong with Hibernate associations?
Can I use a memory profiler to figure out what's using too much memory?
This is the schema:
class(id, className)
student(id, studentName, class_id)
teacher(id, teacherName, class_id)
class_id is the foreign key.
Case #1 - Hibernate Associations
1) In the Class entity, students and teachers are mapped as:
@Entity
@Table(name = "class")
public class Class {
    private Integer id;
    private String className;
    private Set<Student> students = new HashSet<Student>();
    private Set<Teacher> teachers = new HashSet<Teacher>();

    @OneToMany(fetch = FetchType.EAGER, mappedBy = "classRef")
    @Cascade({ CascadeType.ALL })
    @Fetch(FetchMode.SELECT)
    @BatchSize(size = 500)
    public Set<Student> getStudents() {
        return students;
    }
2) In the Student and Teacher entities, the class is mapped as:
@Entity
@Table(name = "student")
public class Student {
    private Integer id;
    private String studentName;
    private Class classRef;

    @ManyToOne
    @JoinColumn(name = "class_id")
    public Class getClassRef() {
        return classRef;
    }
Query used:
sessionFactory.openSession().createQuery("from Class where id<5000");
This, however, was taking a huge amount of memory.
Case #2 - Remove associations and fetch separately
1) No mapping in the Class entity
@Entity
@Table(name = "class")
public class Class {
    private Integer id;
    private String className;
2) Only a placeholder for the foreign key in Student and Teacher
@Entity
@Table(name = "student")
public class Student {
    private Integer id;
    private String studentName;
    private Integer class_id;
Queries used:
sessionFactory.openSession().createQuery("from Class where id<5000");
sessionFactory.openSession().createQuery("from Student where class_id = :classId");
sessionFactory.openSession().createQuery("from Teacher where class_id = :classId");
Note - only the important parts of the code are shown. I am measuring the memory usage of the fetched entities via the JAMM library.
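For reference, this is roughly how such a measurement looks with JAMM's MemoryMeter (a sketch assuming the jamm jar is attached as a -javaagent; the variable names are illustrative):

import org.github.jamm.MemoryMeter;

MemoryMeter meter = new MemoryMeter();
// measureDeep walks the whole object graph reachable from the fetched entities
long bytes = meter.measureDeep(classList);   // classList being the list returned by the query
System.out.println("Deep size of fetched entities: " + bytes + " bytes");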
I also tried marking the query as read-only in case #1, as below. This does not improve memory usage much, only a very little, so that is not the solution.
Query query = sessionFactory.openSession().
createQuery("from Class where id<5000");
query.setReadOnly(true);
List<Class> classList = query.list();
sessionFactory.getCurrentSession().close();
Below are the heap dump snapshots sorted by size. It looks like the entity state maintained by Hibernate is creating the problem.
[Snapshot of the heap dump for the Hibernate associations program]
[Snapshot of the heap dump for fetching with separate entities]
You are doing an EAGER fetch with the annotation below. This fetches all the students even if you never access getStudents(). Make it lazy and it will fetch them only when needed.
From
@OneToMany(fetch = FetchType.EAGER, mappedBy = "classRef")
To
@OneToMany(fetch = FetchType.LAZY, mappedBy = "classRef")
When Hibernate loads a Class entity containing OneToMany relationships, it replaces the collections with its own custom version of them. In the case of a Set, it uses a PersistentSet. As can be seen on grepcode, this PersistentSet object contains quite a bit of stuff, much of it inherited from AbstractPersistentCollection, to help Hibernate manage and track things, particularly dirty checking.
Among other things, the PersistentSet contains a reference to the session, a boolean to track whether it's initialized, a list of queued operations, a reference to the Class object that owns it, a string describing its role (not sure what exactly that's for, just going by the variable name here), the string uuid of the session factory, and more. The biggest memory hog among the lot is probably the snapshot of the unmodified state of the set, which I would expect to approximately double memory consumption by itself.
There's nothing wrong here, Hibernate is just doing more than you realized, and in more complex ways. It shouldn't be a problem unless you are severely short on memory.
Note, incidentally, that when you save a new Class object that Hibernate previously was unaware of, Hibernate will replace the simple HashSet objects you created with new PersistentSet objects, storing the original HashSet wrapped inside the PersistentSet in its set field. All Set operations will be forwarded to the wrapped HashSet, while also triggering PersistentSet dirty tracking and queuing logic, etc. With that in mind, you should not keep and use any external references to the Set from before saving, and should instead fetch a new reference to Hibernate's PersistentSet instance and use that if you need to make any changes (to the set, not to the students or teachers within it) after the initial save.
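To illustrate that last point, a minimal sketch (assuming the Class and Student entities from the question and an open Session named session; the variable names are illustrative):

Class newClass = new Class();
newClass.setStudents(new HashSet<Student>());   // a plain HashSet before the save
session.save(newClass);

// Hibernate now manages the collection as a PersistentSet, so re-read the
// reference through the getter instead of reusing the pre-save HashSet.
Set<Student> managedStudents = newClass.getStudents();
managedStudents.add(newStudent);                // tracked by Hibernate's dirty checking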
Regarding the huge memory consumption you are noticing, one potential reason is that the Hibernate Session has to maintain the state of each entity it has loaded in the form of an EntityEntry object, i.e., one extra EntityEntry object per loaded entity. This is needed for Hibernate's automatic dirty-checking mechanism during the flush stage, which compares the current state of an entity with its original state (the one stored in the EntityEntry).
Note that this EntityEntry is different from the object we get to access in our application code when we call session.load/get/createQuery/createCriteria. It is internal to Hibernate and stored in the first-level cache.
Quoting from the Javadoc for EntityEntry:
We need an entry to tell us all about the current state of an object
with respect to its persistent state. Implementation Warning: Hibernate
needs to instantiate a high amount of instances of this class,
therefore we need to take care of its impact on memory consumption.
One option, assuming the intent is only to read and iterate through the data and not to change those entities, is to use a StatelessSession instead of a Session.
The advantage, as quoted from the Javadoc for StatelessSession:
A stateless session does not implement a first-level cache nor
interact with any second-level cache, nor does it implement
transactional write-behind or automatic dirty checking
With no automatic dirty checking, there is no need for Hibernate to create an EntityEntry for each loaded entity, as it does with a regular Session. This should reduce the pressure on memory.
That said, it does have its own set of limitations, as mentioned in the StatelessSession Javadoc.
One limitation worth highlighting is that it does not lazily load collections. If we use a StatelessSession and want to load the associated collections, we should either join-fetch them using HQL or eagerly fetch them using Criteria.
Another is that it does not interact with the second-level cache, if one is configured.
So, given that it does not have the overhead of a first-level cache, you may want to try a StatelessSession and see whether it fits your requirements and reduces the memory consumption as well.
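For illustration, a minimal read-only sketch with StatelessSession (assuming the sessionFactory and entities from the question; the join fetch is there because collections are not loaded lazily in a stateless session):

StatelessSession statelessSession = sessionFactory.openStatelessSession();
try {
    List<Class> classes = statelessSession
            .createQuery("select distinct c from Class c left join fetch c.students where c.id < 5000")
            .list();
    for (Class c : classes) {
        // read-only processing; no dirty checking, no EntityEntry per entity
    }
} finally {
    statelessSession.close();
}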
Yes, you can use a memory profiler, like VisualVM or YourKit, to see what takes so much memory. One way is to take a heap dump and then load it into one of these tools.
However, you also need to make sure that you compare apples to apples. Your queries in case #2,
sessionFactory.openSession().createQuery("from Student where class_id = :classId");
sessionFactory.openSession().createQuery("from Teacher where class_id = :classId");
select students and teachers for only one class, while in case #1 you select far more. You need to use <= :classId instead.
In addition, it is a little strange that each student and teacher record belongs to exactly one class. A teacher can teach more than one class and a student can be in more than one class. I do not know what exact problem you are solving, but if a student can indeed participate in many classes and a teacher can teach more than one class, you will probably need to design your tables differently.
Try @Fetch(FetchMode.JOIN); this generates a single query instead of multiple select queries. Also review the generated queries. I prefer using Criteria over HQL (just a thought).
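For example, a minimal Criteria sketch (using the Class entity from the question, not java.lang.Class); setFetchMode overrides the mapping's FetchMode.SELECT, and the distinct transformer removes the duplicate roots a join fetch produces:

List<Class> classes = sessionFactory.openSession()
        .createCriteria(Class.class)
        .add(Restrictions.lt("id", 5000))
        .setFetchMode("students", FetchMode.JOIN)
        .setResultTransformer(Criteria.DISTINCT_ROOT_ENTITY)
        .list();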
For profiling, use free tools like VisualVM or JConsole. YourKit is good for advanced profiling, but it is not free; I believe there is a trial version of it.
You can take a heap dump of your application and analyze it with a memory analyzer tool to check for memory leaks.
BTW, I am not exactly sure about the memory usage in the current scenario.
It is likely that the reason is the bidirectional link from Student to Class and from Class to Students. When you fetch Class A (id 4900), the Class object must be hydrated; this in turn pulls in all the Student objects (and teachers, presumably) associated with that class. Each of those Student objects must then be hydrated, which causes the fetch of every class the student is a part of. So although you only wanted Class A, you end up with:
Fetch Class A (id 4900)
Returns Class A with references to 3 students: Student A, B, C.
Student A has refs to Class A, B (id 5500)
Class B needs hydrating
Class B has references to Students C, D
Student C needs hydrating
Student C only has references to Class A and B
Student C hydration complete
Student D needs hydrating
Student D only has a reference to Class B
Student D hydration complete
Class B hydration complete
Student B needs hydrating (from the original load of Class A)
etc.
With eager fetching, this continues until all links are hydrated. The point is that you can end up with classes in memory that you did not actually want, or whose id is not less than 5000.
This could get worse fast.
Also, you should make sure you are overriding the hashCode() and equals() methods. Otherwise you may end up with redundant objects, both in memory and in your set.
One way to improve this is either to switch to LAZY loading, as others have mentioned, or to break the bidirectional links. If you know you will only ever access students per class, then do not keep the link from Student back to Class. For the student/class example a bidirectional link makes sense, but maybe it can be avoided.
As you say, you want "all" the collections, so lazy loading won't help.
Do you need every field of every entity? If not, use a projection to get just the bits you want. See "When to use Hibernate Projections".
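For instance, a minimal HQL projection sketch (entity and property names taken from the question's mapping; it returns scalar Object[] rows instead of fully managed entities):

List<Object[]> rows = sessionFactory.openSession()
        .createQuery("select s.id, s.studentName, s.classRef.id from Student s where s.classRef.id < :classId")
        .setParameter("classId", 5000)
        .list();
for (Object[] row : rows) {
    Integer studentId = (Integer) row[0];
    String studentName = (String) row[1];
    Integer classId = (Integer) row[2];
    // work with the scalar values; no full entity instances are kept in the session
}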
Alternatively, consider having minimalist TeacherLite and StudentLite entities that the full-fat versions extend.
If you do not have time, please have a look at the example below.
I have two types of users: temporary users and permanent users.
Temporary users use the system as guests; they just provide their name and use it, but the system needs to track them.
Permanent users are those that are registered and permanent.
Once a user creates a permanent record for himself, I need to copy all the information that was tracked while he was a guest into his permanent record.
The classes are as follows:
@Entity
public class PermUser {
    @Id
    @GeneratedValue
    private long id;

    @OneToMany
    private List<Favorites> favorites;
    ....
}
@Entity
public class Favorites {
    @Id
    @GeneratedValue
    private long id;

    @OneToMany(cascade = CascadeType.ALL)
    @LazyCollection(LazyCollectionOption.FALSE)
    private List<FavoriteItems> items;
    ...
}
@Entity
public class FavoriteItems {
    @Id
    @GeneratedValue
    private long id;
    private int quantity;

    @ManyToOne
    private Ball ball;
    ..
}
@Entity
public class TempUser extends PermUser {
    private String date;
    ....
}
The problem is:
If I clone the TempUser object, I am copying the id fields as well, so when saving the PermUser object it fails with a message like "Duplicate entry '10' for key ...". I cannot remove the TempUser first and then save the PermUser, because if saving the PermUser fails I will lose the data. And if I try to copy each ball of the FavoriteItems separately, without the item's id, that would not be an efficient approach.
Example (question in one sentence: as shown below, a user may have more than one TempUser record and just one PermUser record, therefore I need to add the information of all the TempUser records to that single PermUser record.)
Type of record | name | favorites | date
1) TempUser    | Jack | 2 items   | 1/1/2013
2) TempUser    | Jack | 3 items   | 1/4/2013
-----------------------------------------------------------------
PermUser       | Jack | 5 items (2 + 3 items from his temp records)
Please note: I need to find a solution, and I do not mind trying a new approach rather than cloning the object.
The reason I have two different classes is that TempUser has a few additional attributes. I may also need to add the favorites of a few TempUsers to the favorites list of one PermUser, and, as mentioned above, a user may have many different, unrelated temp records.
Forgive me if I'm missing something, but I don't think that TempUser and PermUser should be different classes. TempUser extends PermUser, which is an "is-a" relationship, and clearly temporary users are not a type of permanent user. Your question doesn't give enough information to justify making them different; perhaps they're the same class, and the difference can be expressed as a few new attributes? For example:
@Entity
public class User {
    @OneToMany(cascade = CascadeType.ALL)
    private List<Favorites> favorites;
    private boolean isTemporary;
    ....
}
The "transition" from temporary to permanent can be handled by some controller, making sure that isTemporary = false and that the other properties of a permanent user are appropriately set. This would completely side-step the cloning issue and would be much easier on your database.
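A minimal sketch of such a controller/service method (the method name and the extra permanent-only fields are illustrative, not from the original post):

@Transactional
public User promoteToPermanent(long userId, String email, String password) {
    User user = entityManager.find(User.class, userId);
    user.setTemporary(false);   // same row, same id, same favorites
    user.setEmail(email);       // assumed permanent-only attributes
    user.setPassword(password);
    return user;                // managed entity; changes are flushed on commit
}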
I just had the same problem. I had been digging through many interesting articles and questions on boards like SO until I had enough inspiration.
At first I also wanted to have subclasses for different types of users. It turns out that the idea itself is a design flaw:
Don't use inheritance for defining roles!
More information here: Subtle design: inheritance vs roles
Think of a user as a big container which just holds other entities like credentials, preferences, contacts, items, user information, etc.
With this in mind you can easily change certain abilities/behaviour of certain users.
Of course you can define a role that many users can play. Users playing the same role will have the same features.
If you have many entities/objects depending on each other, you should think of a building mechanism/pattern that sets up a certain user role in a well-defined way.
Some thoughts: A proper way for JPA entities instantiation
If you had a builder/factory for users, your other problem wouldn't be that complex anymore.
Example (really basic, do not expect too much!)
public void changeUserRoleToPermanent(User currentUser) {
    UserBuilder builder = new UserBuilder();
    builder.setRole(Role.PERMANENT); // builder internally does all the plumbing

    // copy the stuff you want to keep
    builder.setId(currentUser.getId());
    builder.setPreferences(currentUser.getPreferences());
    // ...

    User newRoleUser = builder.build();
    newRoleUser = entityManager.merge(newRoleUser);
    entityManager.detach(currentUser);

    // delete old stuff
    entityManager.remove(currentUser.getAccountInfo()); // changed to a different implementation...
}
I admit it is some work, but you will have many possibilities once you have the infrastructure ready! You can then "invent" new stuff really fast!
I hope I could spread some ideas. I'm sorry for my miserable English.
I agree with the prior comments that, if possible, you should re-evaluate these entities. But if that is not possible, I suggest that you return a general User from the database and then cast that user as either PermUser or TempUser, both of which would be extensions of User, based on the presence of certain criteria.
For part 2 of your problem:
You are using CascadeType.ALL for the favorites relation. This includes CascadeType.REMOVE, which means a remove operation on the user will cascade to that entity. So specify an array of CascadeType values that doesn't include CascadeType.REMOVE.
See http://webarch.kuzeko.com/2011/11/hibernate-understanding-cascade-types/.
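A minimal sketch of what that might look like on the favorites mapping (keeping every cascade type except REMOVE):

@OneToMany(cascade = { CascadeType.PERSIST, CascadeType.MERGE,
                       CascadeType.REFRESH, CascadeType.DETACH })
private List<Favorites> favorites;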
What I am going to suggest might not be very OO, but I hope it will be effective. I would keep PermUser and TempUser completely separate: do not extend one from the other, and do not bind them into an is-a relationship at all. So I would have two separate tables in the database, one for TempUser and one for PermUser, thereby treating them as two separate entities. Many will find this redundant, but read on: we all know that sometimes redundancy is good. So now:
1) I don't know when a TempUser will want to become a PermUser, so I always keep all TempUsers in a separate table.
2) What would I do if a user always wants to stay a TempUser? I still have a separate TempUser table to refer to.
3) I am assuming that when a TempUser wants to become a PermUser, you read his TempUser name to get his TempUser records.
So now your job is easy. When a TempUser wants to become a PermUser, all you do is read the TempUser objects, populate the required attributes and create a new PermUser object from them (see the sketch below). After that you can keep the TempUser records if you want to, or delete them. :)
If you keep them, you also get a history of how many of your TempUsers actually become permanent, and you can see how long, on average, it takes a TempUser to become permanent.
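A minimal sketch of that promotion step under this design (method and field names are illustrative; TempUser and PermUser are assumed to be independent entities, and favorites is assumed to be initialized to an empty list):

public PermUser promote(String name) {
    List<TempUser> tempRecords = entityManager
            .createQuery("select t from TempUser t where t.name = :name", TempUser.class)
            .setParameter("name", name)
            .getResultList();

    PermUser permUser = new PermUser();
    permUser.setName(name);
    for (TempUser t : tempRecords) {
        permUser.getFavorites().addAll(t.getFavorites());
    }
    entityManager.persist(permUser);
    return permUser;   // the TempUser rows can now be kept for history or removed
}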
I think you should do a manual deep clone. Not exactly a clone, since you have to merge data from several TempUsers into a single PermUser. You can use reflection, and optionally annotations, to automate the copying of information.
To automatically copy fields from an existing object to a new one, you can follow this example. It is not a deep clone, but it may help you as a starting point.
Class 'c' is used as the reference. src and dest must be instances of 'c' or of subclasses of 'c'. The method copies the attributes defined in 'c' and in the superclasses of 'c'.
public static <E> E copyObject(E dest, E src, Class<?> c) throws IllegalArgumentException, IllegalAccessException {
    // TODO: You may want to create a new instance of 'dest' here instead of receiving one as a parameter
    if (!c.isAssignableFrom(src.getClass())) {
        throw new IllegalArgumentException("Incompatible classes: " + src.getClass() + " - " + c);
    }
    if (!c.isAssignableFrom(dest.getClass())) {
        throw new IllegalArgumentException("Incompatible classes: " + dest.getClass() + " - " + c);
    }
    while (c != null && c != Object.class) {
        for (Field aField : c.getDeclaredFields()) {
            // We skip static and final fields
            int modifiers = aField.getModifiers();
            if (Modifier.isStatic(modifiers) || Modifier.isFinal(modifiers)) {
                continue;
            }
            // We skip the fields annotated with @Generated and @GeneratedValue
            if (aField.getAnnotation(GeneratedValue.class) == null &&
                aField.getAnnotation(Generated.class) == null) {
                aField.setAccessible(true);
                Object value = aField.get(src);
                if (aField.getType().isPrimitive() ||
                    String.class == aField.getType() ||
                    Number.class.isAssignableFrom(aField.getType()) ||
                    Boolean.class == aField.getType() ||
                    Enum.class.isAssignableFrom(aField.getType())) {
                    try {
                        // TODO: You may want to recursively copy the value too
                        aField.set(dest, value);
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            }
        }
        c = c.getSuperclass();
    }
    return dest;
}
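A hypothetical usage sketch (entity names follow the question; fields annotated with @GeneratedValue are skipped by copyObject, so the duplicate-key problem does not come up for the id):

PermUser permUser = new PermUser();
for (TempUser temp : tempUsers) {
    copyObject(permUser, temp, PermUser.class);
    // collections such as favorites still have to be merged by hand, since
    // copyObject only copies primitives, Strings, Numbers, Booleans and Enums
}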
Like some have already suggested, I would tackle this problem using inheritance plus either shallow copies (to share references) or deep cloning with libraries that let me exclude/manipulate the auto-generated ids (for when you want to duplicate items).
Since you don't want to bend your database model too much, start with a mapped superclass holding the common attributes. This will not be reflected in your database at all. If you can, I would go with single-table inheritance, which maps closely to your model (but may require some adjustments on the database layer).
@MappedSuperclass
public abstract class User {
    @Id
    @GeneratedValue
    private long id;
    // Common properties and relationships...
}
Then have both PermUser and TempUser inherit from User, so that they will have a lot of common state:
@Entity
@Table(name = "USER")
public class PermUser extends User {
    // Specific properties
}
Now there are several possible approaches. If your classes don't have a lot of state, you can, for instance, write a constructor that builds a PermUser from the data of a List of TempUsers.
Mock code:
@Entity
@Table(name = "PERMANENT_USER")
public class PermUser extends User {

    public PermUser() {} // default constructor

    public PermUser(List<TempUser> userData) {
        final Set<Favorites> f = new LinkedHashSet<>();
        // don't set the id
        for (TempUser u : userData) {
            this.name = u.getName();
            // Shallow copy that guarantees uniqueness and insertion order;
            // Favorites must override equals and hashCode
            f.addAll(u.getFavorites());
        }
        this.favorites = new ArrayList<>(f);
        // Logic to reconcile dates
    }
}
When you persist the PermUser it will generate a new id; cascaded unidirectional relationships should work fine.
On the other hand, if your classes have a lot of attributes and relationships, and there are many situations in which you really need to duplicate objects, then you could use a bean-mapping library such as Dozer (but be warned, cloning objects is a code smell).
Mapper mapper = new DozerBeanMapper();
mapper.map(tempUser.getFavorites(), user.getFavorites());
With Dozer you can configure mappings through annotations, the API or XML to do things such as excluding fields, type casting, etc.
Mock mapping:
<mapping>
    <class-a>my.object.package.TempUser</class-a>
    <class-b>my.object.package.PermUser</class-b>
    <!-- common fields with the same name will be copied by convention -->
    <!-- exclude ids and fields exclusive to temp -->
    <field-exclude>
        <a>fieldToExclude</a>
        <b>fieldToExclude</b>
    </field-exclude>
</mapping>
You can, for example, exclude ids, or maybe copy permUser.id to all of the cloned bidirectional relationships back to User (if there is one), etc.
Also, notice that cloning collections is a cumulative operation by default.
From Dozer documentation:
If you are mapping to a Class which has already been initialized, Dozer will either 'add' or 'update' objects to your List. If your List or Set already has objects in it dozer checks the mapped List, Set, or Array and calls the contains() method to determine if it needs to 'add' or 'update'.
I've used Dozer in several projects; for example, in one project there was a JAXB layer that needed to be mapped to a JPA model layer. They were close enough, but unfortunately I couldn't bend either of them. Dozer worked quite well, was easy to learn and spared me from writing 70% of the boring code. I can deeply (clone) recommend this library from personal experience.
From a pure OO perspective it does not really make sense for an instance to morph from one type into another, Hibernate or not. It sounds like you might want to reconsider the object model independently of its database representation; as an analogy, FourWD seems more like a property of a car than a specialization.
A good way to model this is to create something like a UserData class such that TempUser has-a UserData and PermUser has-a UserData. You could also make TempUser has-a PermUser, though that's going to be less clear. If your application needs to use them interchangeably (something you'd get with the inheritance you were using), then both classes can implement an interface that returns the UserData (or in the second option, getPermUser, where PermUser returns itself).
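A minimal sketch of that composition approach (class, interface and method names are illustrative, not from the original post):

public interface HasUserData {
    UserData getUserData();
}

@Entity
public class UserData {
    @Id @GeneratedValue
    private long id;
    private String name;
    @OneToMany(cascade = CascadeType.ALL)
    private List<Favorites> favorites;
    // getters/setters ...
}

@Entity
public class TempUser implements HasUserData {
    @Id @GeneratedValue
    private long id;
    private String date;
    @OneToOne
    private UserData userData;
    public UserData getUserData() { return userData; }
}

@Entity
public class PermUser implements HasUserData {
    @Id @GeneratedValue
    private long id;
    @OneToOne
    private UserData userData;
    public UserData getUserData() { return userData; }
}

Promoting a guest then means pointing the PermUser at the existing UserData (or merging several of them) instead of cloning it, so no generated ids ever need to be copied.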
If you really want to use inheritance, the easiest approach might be to map it using "table per class hierarchy" and then use straight JDBC to update the discriminator column directly.
In my Java application I am using equal objects multiple times in different places. That means the equals method returns true when comparing these objects. Now I want to update one object and apply the changes to all objects that are equal. Do you know if there is a pattern for that?
My concrete use case is:
I am using JSF, JPA and CDI. A user is on a web page that allows him to edit the detached entity EntityA. The page is session-scoped. EntityA has two references to an EntityB (also detached). These objects can be the same: not the same reference, but they may be equal.
@Entity
public class EntityA {
    @OneToOne
    private EntityB entity1;

    @OneToOne
    private EntityB entity2;
}
The JSF view lets the user select entity1 and entity2 from a selection list. It also shows some details of these EntityBs, and the user is allowed to edit entity1 and entity2 separately. Everything works fine, except when the user has chosen the same (equal) EntityB for entity1 and entity2. Then only the references to these objects are updated. Of course entity1 and entity2 are two different JPA entities and are not the same reference, but I want to distribute the changes to all detached instances of EntityB. I have this situation hundreds of times in my application, so I don't want to keep track of which objects have to be updated in which situations. I need a solution that does it for me automatically. One idea was to keep all the objects I use in this session in a special list and, every time a request is submitted and processed, iterate over this map and change all equal objects. But that sounds very dirty. Maybe there is a built-in JPA function to make all equal objects the same reference; I don't know if this is possible. Do you have a solution for this? Thanks.
I'm going to abstract your problem out a bit here: if a change to one object requires changing a number of other objects, consider putting the field that you're changing in a separate object, and have all those objects reference it.
For example, if you have:
class MyClass {
String info;
int id;
}
and two instances of MyClass with the same 'id' should both be updated when the 'info' field changes, then use this:
class MyClass {
    MyInfoClass info;
    int id;
}
class MyInfoClass {
    String value;
}
and give all instances of MyClass that are equal the same instance of MyInfoClass. Changing myClass.info.value will effectively change all instances of MyClass, because they all hold the same instance of MyInfoClass.
Sorry if I've got the syntax slightly wrong, I jump between languages a lot.
I used this technique in a game I wrote recently, where a switch activates a door: both the switch and the door hold a Circuit object that has a boolean 'powered' field. The door's isOpen() method simply returns circuit.powered, and when the switch is activated I just set switch.circuit.powered = true, and the door is automatically considered 'open'. Previously, I had it searching the game's map for all doors with the same circuit id and changing the powered field on each.
This is classic form-handling logic:
If the user clicks the save button, manipulate the data in the database.
Reload the data every time you render the web page.
Do not cache the data in the web session.
If you need caching, activate it in the persistence layer (e.g. the Hibernate cache).
Suppose I have to develop a simple data model in Java for an Order, which contains Order Items. It seems the Order should hold a reference to a collection of Order Items. Now what if Order and Order Items are stored in a database? Should the Order still hold a reference to the collection, or should a simple function retrieveItemsByOrderId be provided instead?
Now what if Order and Order Items are stored in a database? Should the Order still hold a reference to the collection, or should a simple function retrieveItemsByOrderId be provided instead?
This would depend on how your object model is used by the persistence layer to map classes to the database tables. If you are using Hibernate, JPA, EclipseLink, TopLink or a similar ORM framework, you would merely have a getter method in your Order class that returns the collection of OrderItem instances. A partial code representation would be:
class Order
{
private long id;
private Set<OrderItem> orderItems;
...
public Set<OrderItem> getOrderItems()
{
return orderItems;
}
public void setOrderItems(Set<OrderItem> orderItems)
{
this.orderItems = orderItems;
}
}
class OrderItem
{
private Order order;
...
public Order getOrder()
{
return order;
}
public void setOrder(Order order)
{
this.order = order;
}
}
I haven't listed all the annotations used by the frameworks, including the keys for each entity class, but you'll need to do this to get things working. The salient points, however, are:
Each instance of the Order class contains the id, which may be the natural key (or a generated one).
Invoking the getOrderItems method returns the Set of order items associated with an order. Note that most ORMs fetch collections lazily, so you'll need to understand a few more concepts, like working with managed and detached entities, to actually get this to work; you might need to write an application service that merges detached entities and then fetches the collection.
One of the comments stated that there is no need to reference the Order from the OrderItem class. This would lead to a unidirectional relationship instead of a bidirectional one. You can use unidirectional relationships in most ORM frameworks, but consider the following:
It would not be trivial to maintain referential integrity (using foreign keys) for unidirectional relationships; this would depend on your ORM framework. Some ORM frameworks might allow you to omit the reference to Order from OrderItem without any further effort on your part, while others might require you to use a join table. If you are persisting an object graph in the database, it is imperative to know which OrderItem maps to which Order; by removing the reference from the OrderItem, you are forced to map this information elsewhere, in a different entity and usually a different table: this is the join table referred to previously.
Unidirectional relationships are sufficient for most uses. If the Order is responsible for accessing OrderItem instances, then you do not need bidirectional relationships. But if you find yourself needing to access the Order for an OrderItem, then you will need a bidirectional relationship. I would suggest reading the Mutual Registration Pattern, so that you will always be able to maintain referential integrity irrespective of any mutation operations performed on Order or OrderItem classes in such a case. Without that pattern, you are almost always going to find yourself seeing vague, unexplained and incorrect object graphs resulting in an inconsistent database state.
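For illustration, a minimal sketch of the mutual registration idea applied to Order/OrderItem (the method names are illustrative, not from the original answer); both sides of the bidirectional link are always updated together, so the object graph cannot get out of sync:

class Order {
    private Set<OrderItem> orderItems = new HashSet<OrderItem>();

    public void addOrderItem(OrderItem item) {
        orderItems.add(item);
        item.setOrderInternal(this);   // keep the other side in sync
    }

    public void removeOrderItem(OrderItem item) {
        orderItems.remove(item);
        item.setOrderInternal(null);
    }
}

class OrderItem {
    private Order order;

    // called only by Order; application code goes through Order.addOrderItem
    void setOrderInternal(Order order) {
        this.order = order;
    }
}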
If you are not using an ORM, or you don't intend to, then it would depend on how you are accessing the OrderItem instances; in short, it depends on how you write your persistence layer. If you are using the DAO pattern, then adding a new method retrieveItemsByOrderId to your DAO interface would be the solution.
I think you should keep a reference to the OrderItem collection in your Order model class. Then you could implement a getOrderItems() method that retrieves the items from the DB based on the order id.
This query should be performed only when you need to access the order items (search for LAZY LOADING), not every time you load the Order entity from the DB.
Keeping a reference to the OrderItem collection in your Order model class will also help your application when you need to access the order items twice in the same request-response flow.
A skeleton of the getOrderItems() method would look like this:
public List<OrderItem> getOrderItems() {
    if (this.orderItems == null) {
        // perform the query
        // set this.orderItems with the result
    }
    return this.orderItems;
}
I'm new to JPA/Hibernate and I'm wondering, what is usually the best way of updating a complex entity?
For example, consider the entity below:
@Entity
public class Employee {
    @Id
    private long id;

    @Column
    private String name;

    @ManyToMany
    private List<Position> positions;

    // Getters and setters...
}
What is the best way to update the references to positions? Currently a service is passing me a list of positions that the employee should have. Creating them is easy:
for (long positionId : positionIdList) {
    Position position = entityManager.find(Position.class, positionId);
    employee.getPositions().add(position);
}
entityManager.persist(employee);
However, when it comes to updating the employee, I'm not sure what the best way of updating the employee's positions would be. I figure there are two options:
1) Parse through the list of position ids and determine whether each position needs to be added or deleted (this doesn't seem like a fun process, and may end up issuing many delete queries).
2) Delete all positions and then re-add the specified positions. Is there a way in JPA/Hibernate to delete all children (in this case positions) with one SQL command?
Am I thinking about this the wrong way? What do you guys recommend?
How about
employee.getPositions().clear(); // remove all existing ones
// add all of them again
for (long positionId : positionIdList) {
    Position position = entityManager.find(Position.class, positionId);
    employee.getPositions().add(position);
}
although it may not be the most efficient approach. For a detailed discussion see here.
Cascading won't help much here, because in a ManyToMany relation the positions may not become orphaned: they may be attached to other employee(s), or they should not be deleted at all because they can exist on their own.
JPA/Hibernate has support for this. It's called cascading. By using @ManyToMany(cascade = CascadeType.ALL) (or limiting the cascade types to PERSIST and MERGE), you specify that the collection should be persisted (merged/deleted/etc.) when the owning object is.
Where deletion is concerned, there is a special case in which objects become "orphans" in the database. This is handled by setting orphanRemoval = true.
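A minimal sketch with hypothetical Parent/Child entities (note that orphanRemoval is only supported on @OneToMany and @OneToOne mappings, not on @ManyToMany): removing a Child from the collection deletes its row at flush time.

@Entity
public class Parent {
    @Id
    @GeneratedValue
    private long id;

    @OneToMany(mappedBy = "parent", cascade = CascadeType.ALL, orphanRemoval = true)
    private List<Child> children = new ArrayList<Child>();
}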