Populating an existing entity using Hibernate - Java

It's sort of the same question as this one: Populating an existing entity using NHibernate, only for regular Hibernate rather than NHibernate. Also, that question was not fully answered, IMO.
To sum up: is it possible (in Hibernate) to fill/populate a new entity from a previously attached entity? Something like:
Student managedStudent = daoService.getEntityManager().find(Student.class, 123L);
Student student = new Student(123L); // creating a new, detached student
daoService.fill(student);
The result is supposed to be that the student instance has exactly the same values as the one found by the entity manager.
I can't use merge(), as it returns a new instance (instead of filling the given instance), and I can't use load() if the ID belongs to an already-attached entity.
I want the code to be generic for every entity, so it's not practical to just set the values manually. I'd rather not use reflection in this context because of all the potential pitfalls (lazy initialization, inheritance, get methods with no corresponding set methods and vice versa, etc.).
I'm working on a legacy system that uses both JPA entities and non-entities (with JDBC), and this would solve some major problems for me.
Thanks in advance for the help!

The syntax is not completely accurate since I'm writing this off the top of my head:
Student student = new Student();
ClassMetadata metadata = sessionFactory.getClassMetadata(Student.class);
// copy the identifier from the managed instance
Serializable id = metadata.getIdentifier(managedStudent, (SessionImplementor) session);
metadata.setIdentifier(student, id, (SessionImplementor) session);
// copy every mapped property value across
metadata.setPropertyValues(student, metadata.getPropertyValues(managedStudent));
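A generic variant of the same idea (a sketch, assuming Hibernate 4's ClassMetadata API and an open Session; fill() itself is the hypothetical helper from the question):

public <T> void fill(T target) {
    ClassMetadata metadata = sessionFactory.getClassMetadata(target.getClass());
    Serializable id = metadata.getIdentifier(target, (SessionImplementor) session);
    Object managed = session.get(target.getClass(), id); // the attached instance
    // copy every mapped property from the managed instance onto the target
    metadata.setPropertyValues(target, metadata.getPropertyValues(managed));
}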

Related

Hibernate associations using too much memory

I have a table "class" which is linked to tables "student" and "teacher".
A "class" is linked to multiple students and teachers via a foreign key relationship.
When I use Hibernate associations and fetch a large number of entities (tried with 5000), I see that it takes 4 times more memory than if I just use foreign key placeholders.
Is there something wrong with Hibernate associations?
Can I use a memory profiler to figure out what's using too much memory?
This is the schema:
class(id, className)
student(id, studentName, class_id)
teacher(id, teacherName, class_id)
class_id is the foreign key.
Case #1 - Hibernate Associations
1) In the Class entity, students and teachers are mapped as:
@Entity
@Table(name = "class")
public class Class {
    private Integer id;
    private String className;
    private Set<Student> students = new HashSet<Student>();
    private Set<Teacher> teachers = new HashSet<Teacher>();

    @OneToMany(fetch = FetchType.EAGER, mappedBy = "classRef")
    @Cascade({ CascadeType.ALL })
    @Fetch(FetchMode.SELECT)
    @BatchSize(size = 500)
    public Set<Student> getStudents() {
        return students;
    }
2) In Student and Teacher, the class is mapped as:
@Entity
@Table(name = "student")
public class Student {
    private Integer id;
    private String studentName;
    private Class classRef;

    @ManyToOne
    @JoinColumn(name = "class_id")
    public Class getClassRef() {
        return classRef;
    }
Query used:
sessionFactory.openSession().createQuery("from Class where id < 5000");
This, however, was taking a huge amount of memory.
Case #2 - Remove associations and fetch separately
1) No mapping in the Class entity
@Entity
@Table(name = "class")
public class Class {
    private Integer id;
    private String className;
2) Only a placeholder for the foreign key in Student and Teacher
@Entity
@Table(name = "student")
public class Student {
    private Integer id;
    private String studentName;
    private Integer class_id;
Queries used:
sessionFactory.openSession().createQuery("from Class where id < 5000");
sessionFactory.openSession().createQuery("from Student where class_id = :classId");
sessionFactory.openSession().createQuery("from Teacher where class_id = :classId");
Note - only the important parts of the code are shown. I am measuring the memory usage of the fetched entities via the JAMM library.
I also tried marking the query as read-only in case #1, as below, but it improves memory usage only very slightly, so that's not the solution.
Query query = sessionFactory.openSession()
        .createQuery("from Class where id < 5000");
query.setReadOnly(true);
List<Class> classList = query.list();
sessionFactory.getCurrentSession().close();
Below are the heap-dump snapshots sorted by size. It looks like the entities maintained by Hibernate are creating the problem.
[Snapshot of heap dump for the Hibernate-associations program]
[Snapshot of heap dump for fetching using separate entities]
You are doing an EAGER fetch with the annotation below. This fetches all the students even if you never access getStudents(). Make it lazy and it will fetch only when needed.
From
@OneToMany(fetch = FetchType.EAGER, mappedBy = "classRef")
to
@OneToMany(fetch = FetchType.LAZY, mappedBy = "classRef")
When Hibernate loads a Class entity containing OneToMany relationships, it replaces the collections with its own custom version of them. In the case of a Set, it uses a PersistentSet. As can be seen on grepcode, this PersistentSet object contains quite a bit of stuff, much of it inherited from AbstractPersistentCollection, to help Hibernate manage and track things, particularly dirty checking.
Among other things, the PersistentSet contains a reference to the session, a boolean to track whether it's initialized, a list of queued operations, a reference to the Class object that owns it, a string describing its role (not sure what exactly that's for, just going by the variable name here), the string uuid of the session factory, and more. The biggest memory hog among the lot is probably the snapshot of the unmodified state of the set, which I would expect to approximately double memory consumption by itself.
There's nothing wrong here, Hibernate is just doing more than you realized, and in more complex ways. It shouldn't be a problem unless you are severely short on memory.
Note, incidentally, that when you save a new Class object that Hibernate previously was unaware of, Hibernate will replace the simple HashSet objects you created with new PersistentSet objects, storing the original HashSet wrapped inside the PersistentSet in its set field. All Set operations will be forwarded to the wrapped HashSet, while also triggering PersistentSet dirty tracking and queuing logic, etc. With that in mind, you should not keep and use any external references to the Set from before saving, and should instead fetch a new reference to Hibernate's PersistentSet instance and use that if you need to make any changes (to the set, not to the students or teachers within it) after the initial save.
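To make that concrete, here is a minimal sketch (assuming the Class/Student mappings above, a setStudents() setter, and an open Session; the PersistentSet class name shown is Hibernate 4's, it lives in org.hibernate.collection in Hibernate 3):

Class cls = new Class();
cls.setStudents(new HashSet<Student>()); // a plain HashSet, for now
session.save(cls);
session.flush(); // Hibernate substitutes its tracking collection here
// cls.getStudents() now returns an org.hibernate.collection.internal.PersistentSet
// wrapping the original HashSet, so re-read the reference before modifying the set:
Set<Student> tracked = cls.getStudents();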
Regarding the huge memory consumption you are noticing, one potential reason is that the Hibernate Session has to maintain the state of each entity it has loaded in the form of an EntityEntry object, i.e., one extra EntityEntry object per loaded entity. This is needed for Hibernate's automatic dirty-checking mechanism: during the flush stage, the current state of each entity is compared with its original state (the one stored as an EntityEntry).
Note that this EntityEntry is different from the object we access in application code when we call session.load/get/createQuery/createCriteria. It is internal to Hibernate and stored in the first-level cache.
Quoting from the Javadoc for EntityEntry:
We need an entry to tell us all about the current state of an object with respect to its persistent state. Implementation Warning: Hibernate needs to instantiate a high amount of instances of this class, therefore we need to take care of its impact on memory consumption.
One option: assuming the intent is only to read and iterate through the data and not make any changes to those entities, you can consider using a StatelessSession instead of a Session.
The advantage, as quoted from the Javadoc for StatelessSession:
A stateless session does not implement a first-level cache nor
interact with any second-level cache, nor does it implement
transactional write-behind or automatic dirty checking
With no automatic dirty checking, there is no need for Hibernate to create an EntityEntry for each loaded entity as it does with a regular Session. This should reduce the pressure on memory.
That said, it does have its own set of limitations, as mentioned in the StatelessSession Javadoc.
One limitation worth highlighting: it does not lazy-load collections. If we use a StatelessSession and want to load the associated collections, we should either join-fetch them using HQL or fetch them eagerly using Criteria.
Another is that it does not interact with any second-level cache, if one is configured.
So, given that it carries no first-level-cache overhead, you may want to try a StatelessSession and see whether it fits your requirements and reduces memory consumption.
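A minimal sketch of that approach (assuming the mappings above; the join fetch is required precisely because a StatelessSession never lazy-loads):

StatelessSession statelessSession = sessionFactory.openStatelessSession();
try {
    List<?> classes = statelessSession
            .createQuery("select distinct c from Class c join fetch c.students where c.id < 5000")
            .list();
    // iterate read-only here: no first-level cache, no EntityEntry per entity
} finally {
    statelessSession.close();
}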
Yes, you can use a memory profiler, like VisualVM or YourKit, to see what takes so much memory. One way is to take a heap dump and then load it into one of these tools.
However, you also need to make sure that you compare apples to apples. Your queries in case #2,
sessionFactory.openSession().createQuery("from Student where class_id = :classId");
sessionFactory.openSession().createQuery("from Teacher where class_id = :classId");
select students and teachers for only one class, while in case #1 you select far more. You need to use <= :classId instead.
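For example (a sketch; maxClassId is a hypothetical upper bound covering the same id range as case #1's "id < 5000"):

List<?> students = sessionFactory.openSession()
        .createQuery("from Student where class_id <= :classId")
        .setParameter("classId", maxClassId)
        .list();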
In addition, it is a little strange that each student and teacher record belongs to exactly one class. A teacher can teach more than one class, and a student can be in more than one class. I do not know the exact problem you're solving, but if indeed a student can participate in many classes and a teacher can teach more than one class, you will probably need to design your tables differently.
Try @Fetch(FetchMode.JOIN); this generates a single query instead of multiple select queries. Also review the generated queries. I prefer using Criteria over HQL (just a thought).
For profiling, use free tools like VisualVM or JConsole. YourKit is good for advanced profiling, but it is not free; I believe there is a trial version of it.
You can take a heap dump of your application and analyze it with any memory-analyzer tool to check for memory leaks.
BTW, I am not exactly sure about the memory usage in this particular scenario.
It's likely the reason is the bidirectional link from Student to Class and from Class to Students. When you fetch class A (id 4900), the Class object must be hydrated; in turn, this must pull in all the Student objects (and teachers, presumably) associated with this class. When this happens, each Student object must be hydrated, which causes the fetch of every class the student is a part of. So although you only wanted class A, you end up with:
Fetch Class A (id 4900)
Returns Class A with reference to 3 students, Student A, B, C.
Student A has ref to Class A, B (id 5500)
Class B needs hydrating
Class B has reference to Students C,D
Student C needs hydrating
Student C only has reference to Class A and B
Student C hydration complete.
Student D needs hydrating
Student D only has reference to Class B
Student B hydration complete
Class B hydration complete
Student B needs hydrating (from original class load class A)
etc. With eager fetching, this continues until all links are hydrated. The point is that you can end up with classes in memory that you didn't actually want, or whose id is not less than 5000.
This could get worse fast.
Also, you should make sure you are overriding the hashCode() and equals() methods. Otherwise you may get redundant objects, both in memory and in your sets.
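For example, an identifier-based sketch placed inside the Student entity above (one common strategy; a getId() getter is assumed, and the constant hashCode keeps the contract stable while the id is still unassigned before persisting):

@Override
public boolean equals(Object o) {
    if (this == o) return true;
    if (!(o instanceof Student)) return false;
    Student other = (Student) o;
    return id != null && id.equals(other.getId());
}

@Override
public int hashCode() {
    return 31; // constant, so the hash doesn't change when the id is generated
}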
One way to improve things is either to change to LAZY loading, as others have mentioned, or to break the bidirectional links. If you know you will only ever access students per class, then don't have the link from Student back to Class. For the student/class example the bidirectional link makes sense, but maybe it can be avoided.
As you say, "I want all the collections", so lazy loading won't help.
Do you need every field of every entity? If not, use a projection to get just the bits you want. See "when to use Hibernate Projections".
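For instance, with the Criteria API you could pull only the columns you need (a sketch, assuming the Class entity above):

List<Object[]> rows = session.createCriteria(Class.class)
        .add(Restrictions.lt("id", 5000))
        .setProjection(Projections.projectionList()
                .add(Projections.property("id"))
                .add(Projections.property("className")))
        .list();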
Alternatively, consider having minimalist Teacher-Lite and Student-Lite entities that the full-fat versions extend.

How to use OptaPlanner ValueRange from planning entity?

I'm attempting to limit the planning variables that can be associated with a particular entity. In the OptaPlanner manual in section 4.3.4.2.2, an example is shown, but it isn't clear how the list of variables should be generated. What should the list contain? Are these planning variables themselves? Can they be copies? If copies are allowed, then how are they compared? If not, the planning variable is not in scope when defining the planning entity - I realize that this is a Java question, but it isn't apparent how to access the list of planning variables from the planning entity definition.
Is this a 6.1 feature that was not supported in earlier versions?
Will the Working Memory size be constrained by using this feature? That is my goal.
Your assistance is greatly appreciated!
Here's the example from the manual:
@PlanningVariable
@ValueRange(type = ValueRangeType.FROM_PLANNING_ENTITY_PROPERTY, planningEntityProperty = "possibleRoomList")
public Room getRoom() {
    return room;
}

public List<Room> getPossibleRoomList() {
    return getCourse().getTeacher().getPossibleRoomList();
}
Let's set the terminology straight first: the planning variable (for example getRoom() in the example) has a value range (which is a list of planning values), and that value range can differ from entity instance to entity instance.
About such a List of planning values:
Each entity has its own List instance, although multiple entities can share the same List instance if they have the exact same value range.
No copies: a planning value instance should exist only once in a Solution. So two entities with different value ranges, but with the same planning value in both ranges, should use the same planning value instance.
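For example, two entities sharing one value range instance (a sketch; the Teacher and its setPossibleRoomList() setter are assumed, following the manual's example):

List<Room> sharedRooms = Arrays.asList(roomA, roomB); // a single List instance
teacher1.setPossibleRoomList(sharedRooms); // both teachers reference the same list...
teacher2.setPossibleRoomList(sharedRooms); // ...and the same Room instances: no copies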

Hibernate: Domain Model to JPA Entity/DTO & merge() - design pattern or best practice

The recommended way of using merge() is to fetch the DTO first, before applying the changes.
public void merge(PersonModel model) {
    Person inputDTO = PersonBuilder.build(model);
    Person dto = get(pk);
    dto.setName(inputDTO.getName());
    dto.getChildren().clear();
    Iterator<Child> iter = inputDTO.getChildren().iterator();
    while (iter.hasNext()) {
        dto.getChildren().add(iter.next());
    }
    entityManager.merge(dto);
}
Is there a more elegant way of performing such an operation (translating the domain model to a DTO and merging it) so that no data is accidentally deleted?
Example of problem:
Hibernate: prevent delete orphan when using merge();
I find the need to clear the list and re-add everything very wasteful.
Can someone recommend a design pattern or a way to code this properly?
Thank you
ADD ON:
1) Is it possible to use a Hibernate HashSet to replace the List? Will a Hibernate set replace elements based on primary keys?
any help?
"The recommended way of using merge() is to first get the DTO first before inputting the changes"
Who recommended you to do this?
"Is there a more elegant way of performing such operation translating domain model to dto and merging it so that no data are accidentally deleted."
I don't think you can translate domain objects to DTOs. A DTO is just about data, a domain object is data, behaviour and context. Completely different.
If you don't have behaviour and context in your domain objects (a.k.a. anemic domain model), you don't need an extra DTO layer that just duplicates the objects.
Because you tagged this question with Hibernate and mentioned it in your question: you don't need to call merge() yourself, because you just got the object from the database, and Hibernate will flush the session to synchronize the changes with the database.
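A sketch of what that looks like (assuming a managed Person inside a transaction; em is a hypothetical EntityManager and pk your primary key):

em.getTransaction().begin();
Person person = em.find(Person.class, pk); // a managed instance
person.setName(model.getName());           // mutate the managed object directly
em.getTransaction().commit();              // the flush at commit writes the change; no merge() needed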
"Possible to use Hibernate Hashset to replace List? Will hibernate hashset replace elements base on primary keys?"
I would replace the List with a HashSet, since the table where the data is stored is a set, not a list (you can't have duplicate records). A HashSet will not replace elements based on primary keys. A set (any set; Hibernate's implementation is no different) works by preventing duplicates. It uses your equals() and hashCode() implementations to find out whether an equal object is already in the set. If there is, the new one won't be added and the original is kept.

How to handle Target Unreachable in an elegant way

I have 2 questions regarding the common "Target Unreachable" exception.
What's the best practice to handle it? For example, you have:
Country has City, City has Street.
- Do you put new City() in Country's constructor and new Street() in City's constructor
(so that you have them in one centralized place, but always create objects you might not need),
or do you initialize the objects in the various places in your code where you need them (spread all over your code)?
- And if the user doesn't type anything, say for Street: in order to prevent the insertion of a blank row in the DB,
you set the street back to null. Where's the best place to set it back to null?
(Say you have Cascade.ALL or an extended persistence context; otherwise you'd just not save it if you knew it was empty.)
PS: why doesn't JSF just instantiate what it needs, and why doesn't Hibernate skip persisting entities whose persistent fields are all empty?
Is it for performance, or why? Then again, is it bad to have empty rows in the DB, with just a PK and FKs?
I think it depends on the relationship between the entities in your app. In some cases I do create a related instance of another object in the constructor, but only where one entity wouldn't exist without the other.
One alternative is to create the objects lazily in the getter:
public class Country {
    private City city;

    public City getCity() {
        if (this.city == null) {
            this.city = new City(); // created on first access, only if needed
        }
        return this.city;
    }
}
As for the PS questions: JSF doesn't instantiate objects for you (I'm not sure that would be desirable), but if you use the lazy-getter approach you effectively get the same thing. Hibernate persists an entity once it has been instantiated, since it persists the current state of the persistable object model; if it didn't persist that entity, it wouldn't be working as expected.
I typically don't worry about a few null rows, as I chose Hibernate knowing that an ORM comes with a small performance cost. To me it is still well worth it for the abstraction of persistence.

Changing the type of an entity preserving its ID

I am using Hibernate as the persistence layer. There are two entities that live in the same table, extending one superclass with the single-table inheritance strategy.
@Entity
@Inheritance(strategy = InheritanceType.SINGLE_TABLE)
public abstract class A {
    @Id
    @GeneratedValue
    protected Long id;
    // some common fields for B and C
}

@Entity
public class B extends A {
    // B-specific fields
}

@Entity
public class C extends A {
    // C-specific fields
}
I have an instance of B with id=4. How do I change the type of this instance to C, preserving its ID (4)?
B b = em.find(B.class, 4L);
C c = convertToC(b);
c.setId(b.getId());
em.remove(b);
em.persist(c);
The code above fails with
org.hibernate.PersistentObjectException: detached entity passed to persist: C
Is it possible at all?
Hibernate attempts to make persistence as transparent as it possibly can - which means it tries to follow the same principles as normal Java objects. Now, rephrasing your question in Java, you'd get:
How can I convert an instance of B class into an instance of (incompatible) C class?
And you know the answer to that - you can't. You can create a new instance of C and copy necessary attributes, but B will always be B, never C. Thus the answer to your original question is - it cannot be done via JPA or Hibernate API.
However, unlike plain Java, with Hibernate you can cheat :-) InheritanceType.SINGLE_TABLE is mapped using a @DiscriminatorColumn, and in order to convert B into C you need to update its value from whatever is specified for B to whatever is specified for C. The trick is, you cannot do this using the Hibernate API; you need to do it via plain SQL. You can, however, map this update statement as a named SQL query and execute it using Hibernate facilities.
The algorithm, therefore, is:
1. Evict B from the session if it's there (this is important).
2. Execute your named query.
3. Load what-is-now-known-as-C using the former B's id.
4. Update / set attributes as needed.
5. Persist C.
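A sketch of those steps using the Session API (assuming the default table name A, Hibernate's default DTYPE discriminator column, and the default discriminator values, which are the entity names; adjust all three to your mapping):

session.evict(b); // 1. make the session forget the B instance
session.createSQLQuery("update A set DTYPE = 'C' where id = :id")
        .setParameter("id", b.getId())
        .executeUpdate(); // 2. flip the discriminator with plain SQL
C c = (C) session.get(C.class, b.getId()); // 3. load what is now a C
// 4. update C-specific attributes as needed
session.flush(); // 5. persist the changes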
In this case, "c" is an object which the Hibernate session knows nothing about, but it has an ID, so Hibernate assumes that the object has already been persisted. In that context, persist() makes no sense, and so it fails.
The javadoc for Hibernate Session.persist() (I know you're not using the Hibernate API, but the semantics are the same, and the hibernate docs are better) says "Make a transient instance persistent". If your object already has an ID, it's not transient. Instead, it thinks it's a detached instance (i.e. an instance that has been persisted, but is not associated with the current session).
I suggest you try merge() instead of persist().
You can use your own id (not generated) and do the following:
1. Retrieve B.
2. Open a transaction.
3. Delete B.
4. Commit the transaction.
5. Open a new transaction.
6. Create C and persist it.
7. Close the second transaction.
In this way you will clear the id from the table before re-inserting it as C.
How do you distinguish between the two entities in the table? I assume there is some field value (or values) that you can change to make a B into a C?
You could have a method where you load the superclass A, change the distinguishing values, and save. Then, in your next Hibernate Session, your B will be a C.
I think skaffman is right here: once the id has been set, it won't persist; and further, because the id is generated, Hibernate expects the sequence to be in charge of assigning the id number.
You could possibly not mark the id as @GeneratedValue, or use one of the other generator strategy types, to avoid the merge generating a new sequence value, but I suspect that would be problematic.
