I have two entities mapped in my application to model a Team and its Members. A Team can have many Members, and a Member can belong to at most one Team. Handling these concepts works fine. The problem comes when I try to move a Member from one existing Team to another.
The entities are shown below in simplified form. The last method, transfer(), is the one that should remove a given Member from its Team and move it to another one.
@Entity
public class Member extends Person {
@ManyToOne
private Team team;
protected Member() {
super();
}
public Member(Team team, String name) {
super(name);
this.team = team;
}
// Trivial getters and setters...
public Team getTeam() {
return team;
}
protected void setTeam(Team team) {
this.team = team;
}
}
@Entity
public class Team {
@Id
private long id;
private String name;
@OneToMany(mappedBy="team", cascade=CascadeType.ALL)
private List<Member> members = new ArrayList<Member>();
protected Team() {
}
public Team(String name) {
this.name = name;
}
// trivial getters and setters...
public Member addMember(String name) {
Member member = new Member(this, name);
members.add(member);
return member;
}
protected void addMember(Member member) {
members.add(member);
member.setTeam(this);
}
public void removeMember(Member member) {
members.remove(member);
}
public Member memberByName(String memberName) {
for(Member member : members)
if(member.getName().equals(memberName))
return member;
return null;
}
public Collection<Member> getMembers() {
return Collections.unmodifiableCollection(members);
}
public void transfer(Member member, Team destination) {
members.remove(member);
destination.addMember(member);
}
}
I have this unit test code that is intended to validate the transfer service
Team teamA = teamRepository.teamById(idTeamA);
Team teamB = teamRepository.teamById(idTeamB);
Team teamC = teamRepository.teamById(idTeamC);
Member zaki = teamA.memberByName("Zaki");
Member denise = teamA.memberByName("Denise");
EntityTransaction t = teamRepository.transactionBegin();
teamA.transfer(zaki, teamB);
teamA.transfer(denise, teamC);
t.commit();
I get the following exception on the commit() line:
javax.persistence.PersistenceException: org.hibernate.PersistentObjectException: detached entity passed to persist: application.domain.Member
Any ideas?
UPDATE 1:
I decided to perform a little test and changed the code of the transfer() method as follows
public void transfer(Member member, Team destination) {
member.setTeam(this);
}
The result was curious: no error, but also no update on the tables. Hibernate couldn't track the update and the transfer simply didn't happen.
UPDATE 2:
I decided to try the suggestion from Steve K and changed the transfer() method to the following:
public void transfer(Member member, Team destination) {
destination.addMember(member);
members.remove(member);
}
Looking at the addMember() and removeMember() methods (below), we see that the Team is being updated too.
protected void addMember(Member member) {
members.add(member);
member.setTeam(this);
}
public void removeMember(Member member) {
members.remove(member);
}
So, the member is being added to the destination collection, its Team is being set to the destination Team and then the member is being removed from the current collection (current Team).
The test case was changed to
EntityTransaction t = teamRepository.transactionBegin();
teamA.transfer(zaki, teamB);
teamA.getEntityManager().refresh(teamA); // I get an exception here!!!
t.commit();
In the refresh() line I have the following exception:
javax.persistence.PersistenceException: org.hibernate.PersistentObjectException: detached entity passed to persist: domain.Member
at org.hibernate.jpa.spi.AbstractEntityManagerImpl.convert(AbstractEntityManagerImpl.java:1763)
// ... many calls
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)
Caused by: org.hibernate.PersistentObjectException: detached entity passed to persist: domain.Member
at org.hibernate.event.internal.DefaultPersistEventListener.onPersist(DefaultPersistEventListener.java:139)
// ... many calls
at org.hibernate.jpa.spi.AbstractEntityManagerImpl.flush(AbstractEntityManagerImpl.java:1335)
... 28 more
It seems, after all, that transferring instances from one collection to another (which implement a simple aggregation) is not supported by Hibernate!
Your mappings and data access logic are just fine.
You need to move the transaction boundary to the beginning of the code:
EntityTransaction t = teamRepository.transactionBegin();
Team teamA = teamRepository.teamById(idTeamA);
...
teamA.transfer(denise, teamC);
t.commit();
But the exception you got is not from the code you've shown us:
javax.persistence.PersistenceException: org.hibernate.PersistentObjectException: detached entity passed to persist: application.domain.Member
There might be a pending persist action being queued up and only translated at flush time (commit time in your case). So you need to enable logging of the executed SQL and try to find out which entity is detached.
Moving a Member from one list to the other is not going to change the team value. A better idea would be to change the team directly from A to B on the Member entity.
There are a couple of things that are not clear. First, how do you handle the Hibernate session in this test? Is it active throughout the whole test? Second, does teamRepository.teamById(idTeamA) also create the members? Member zaki = teamA.memberByName("Zaki") will return null if Zaki hasn't been created.
My setup would be something like this:
In the @Before method, the initial data is created (teamA with Zaki and Denise, teamB and teamC)
In the test method, begin a transaction and get those entities by id
teamRepository.transactionBegin();
Team teamA = teamRepository.teamById(idTeamA); // does this create a team, or returns an existing one? here, it should only return existing team
... // get all teams
Member zaki = teamA.memberByName("Zaki");
... // get all members
Transfer members
teamA.transfer(zaki, teamB);
teamA.transfer(denise, teamC);
Commit transaction
t.commit();
The transfer() method is completely redundant. All you should need to do is call member.setTeam(destination) and Hibernate will update the collections accordingly.
Your Update's code:
member.setTeam(this);
does nothing. It should be
member.setTeam(destination);
You're seeing no updates because you're not changing the value. The member's already a member of this Team.
UPDATE:
Try just changing your transfer() method to this:
public void transfer(Member member, Team destination){
member.setTeam(destination);
}
After you've moved around all the team members you want to move, you can commit or flush the transaction before moving on to the next unit of work. Moving the items around in the local lists is not necessary unless your units of work are badly defined. Transfer your people, then commit. Then you're ready for the next bit of processing. If you need to do so in the middle of a transaction, then call entityManager.flush() which will execute the currently queued updates but still allow you to later commit or rollback. Flushing in this manner is an anti-pattern, though. If you need to know which team a member is a part of during your processing, ask the member, not the team (which is much more efficient anyway).
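For illustration, a minimal sketch of that unit of work, reusing the repository API from the question (teamRepository, transactionBegin() and the entity names come from the question; the assertion is only illustrative):
// All transfers happen inside one transaction; the queued UPDATEs run at commit.
EntityTransaction t = teamRepository.transactionBegin();
Team teamA = teamRepository.teamById(idTeamA);
Team teamB = teamRepository.teamById(idTeamB);
Member zaki = teamA.memberByName("Zaki");
teamA.transfer(zaki, teamB);    // internally just member.setTeam(destination)
// Ask the member, not the team, for its current assignment.
assert zaki.getTeam() == teamB;
t.commit();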
Related
I am having trouble publishing events from an aggregate-root in a Spring Boot application. What I basically want is to publish an "Update" event every time some information about a person is changed.
The code for this is pretty straightforward:
@Entity
public class Person {
@Transient
private final Collection<AbstractPersonRelatedEvent> events = new ArrayList<>();
Person(Person other) {
// copy other fields
other.events.forEach(events::add);
}
// other stuff
public Person updateInformation(...) {
Person updated = new Person(this);
// setting new data on the updated person
if (!hasUpdateEventRegistered()) {
updated.registerEvent(PersonDataUpdatedEvent.forPerson(updated));
}
return updated;
}
void registerEvent(AbstractPersonRelatedEvent event) {
events.add(event);
}
@DomainEvents
Collection<AbstractPersonRelatedEvent> getModificationEvents() {
return Collections.unmodifiableCollection(events);
}
@AfterDomainEventPublication
void clearEvents() {
events.clear();
}
}
I am managing Person instances through a manager:
@Service
@Transactional
class PersistentPersonManager implements PersonManager {
// other methods are omitted
@Override
public Person save(Person person) {
return personRepository.save(person);
}
}
However, when I call the manager (manager.save(person.updateInformation(...))), the events seem to go "missing":
Upon calling the save() method all events are still present, but when Spring invokes getModificationEvents() the collection is empty. The events seem to have vanished somewhere in between (with only Spring code being executed).
As this is pretty basic, I must be missing something essential but got stuck in a rut.
So how do I get back on track here?
I assume you are using JPA here.
For JPA, the save operation actually does a merge on the JPA EntityManager.
For a detached entity, merge loads/finds the entity with the same id from the database or the current session and copies all the (changed) fields over. This ignores transient fields like the events.
You are dealing with detached entities because you are creating a new entity every time you call updateInformation.
So here is what is happening:
You load an entity (e1) from the database. It does not have any events registered.
By calling updateInformation you create a new detached entity (e2). You also register events with e2.
When calling save JPA finds the matching e1 and copies all changes from e2 into it, except the events. So e1 still has no events registered.
Events get triggered, but there aren't any because only e1 is used.
In order to fix this: Do not create new instances of the entity in updateInformation.
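A rough sketch of that fix (the name field and parameter are illustrative; hasUpdateEventRegistered(), registerEvent() and PersonDataUpdatedEvent come from the question): the managed entity is modified in place, so the instance passed to save() still carries its events.
// Hypothetical in-place variant of updateInformation()
public Person updateInformation(String name) {
    this.name = name;                      // mutate the managed instance
    if (!hasUpdateEventRegistered()) {
        registerEvent(PersonDataUpdatedEvent.forPerson(this));
    }
    return this;                           // same, still-managed entity
}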
I'm using EJB 3 with Hibernate.
I have a stateless session Bean. There is a method deleteItem in that bean.
When a client calls the deleteItem method, the delete happens without any problem.
But if I call the deleteItem method in a for loop that runs 5-10 times, the delete sometimes fails, but not always.
The delete operation actually deletes data from two tables, the child table and the parent table.
Each delete is committed by performing a flush operation.
As I already mentioned, if I execute the deletes one by one there is no problem; it only happens when I try to run them concurrently. The exception I'm getting is below:
Caused by: java.sql.BatchUpdateException: Cannot delete or update a parent row: a foreign key
constraint fails (`functionTest/MotherTable`, CONSTRAINT `FKBC6CB0E6D5545EFD` FOREIGN KEY
(`MotherTable_FieldId`) REFERENCES `ChildTable` (`childTableId`))
There is no way for a concurrent delete operation to happen here, and an Item delete is not related to any other Item delete operation. So even if concurrency does occur, deleting multiple items at the same time should not be a problem.
So I came to the conclusion that maybe the clients are accessing the same bean instance from multiple threads. In such a situation, two threads hold the same EntityManager in different states: one tries to flush the persistence context while the other has not yet completed the removal of the child item.
At that point the BatchUpdateException occurs. That is my observation; I'm not 100% sure about it.
So, to overcome this situation, I have gone for optimistic locking. I have created a version column in the mother table. Now I'm getting the OptimisticLockException, but I'm not able to catch it. Below is the code I'm using to catch the OptimisticLockException.
private boolean deleteItem(Item itemId) {
Item item= getItem(itemId);
removeChildTableData(item);
mEm.remove(item);
try
{
mEm.flush();
}
catch (OptimisticLockException e)
{
try {
Thread.sleep(1000);
}
catch (InterruptedException e1) {
e1.printStackTrace();
}
deleteItem(itemId);
}
catch(Exception ex)
{
if (ex.getCause() instanceof OptimisticLockException)
{
try {
Thread.sleep(1000);
} catch (InterruptedException x) {
}
deleteItem(itemId);
}
}
return true;
}
So my goal is to catch the OptimisticLockException and re-execute the delete operation.
I have checked the exception class name and it is EntityNotFound, but in the stack trace I see OptimisticLockException as well as StaleObjectStateException.
So, can anybody please guide me on how I should catch this OptimisticLockException?
You shouldn't. Also +1 to what JB said. This exception is trying to tell you something. You are trying to delete the parent row of a foreign key relation while a child is still referencing it. What are parent and child? Well:
a foreign key constraint fails (`functionTest/MotherTable`, CONSTRAINT `FKBC6CB0E6D5545EFD` FOREIGN KEY (`MotherTable_FieldId`) REFERENCES `ChildTable` (`childTableId`))
So MotherTable.MotherTable_FieldId references ChildTable.childTableId. And you're trying to delete a child while its mother is still pointing to it. That won't work.
I'm curious why you would have the relation this way, though. It seems that your model looks like this:
@Entity
@Table(name="MotherTable")
class Mother {
@Id
Long id;
@ManyToOne
@JoinColumn(name="MotherTable_FieldId")
Child child;
}
@Entity
@Table(name="ChildTable")
class Child {
@Id
@Column(name="childTableId")
Long id;
@OneToMany(mappedBy="child")
Set<Mother> mothers;
}
which is odd since now your child can have many mothers. Maybe you wanted this instead:
@Entity
class Mother {
@Id
Long id;
@OneToMany(mappedBy="mother")
Set<Child> children;
}
@Entity
class Child {
@Id
Long id;
@ManyToOne
@JoinColumn(name="mother_id")
Mother mother;
}
In this case, your DAO method would look like this:
@Transactional
public void deleteFamily(Mother mother) {
for (Child c: mother.getChildren()) {
em.remove(c);
}
em.remove(mother);
}
You could also use cascading:
@Entity
class Mother {
@Id
Long id;
@OneToMany(mappedBy="mother", cascade=CascadeType.ALL)
Set<Child> children;
}
which simplifies the DAO method to:
@Transactional
public void deleteFamily(Mother mother) {
em.remove(mother);
}
And even:
@Entity
class Mother {
@Id
Long id;
@OneToMany(mappedBy="mother", cascade=CascadeType.ALL, orphanRemoval=true)
Set<Child> children;
}
and now you don't have to em.remove() children:
@Transactional
public void deleteChild(Child child) {
Mother m = child.getMother();
m.getChildren().remove(child);
}
Also, you shouldn't try to commit transactions with em.flush(), that's wrong on several counts:
Transactions are committed with em.getTransaction().commit()
Think about what you're trying to do: is the deleteFamily supposed to happen in one transaction? Yes? Then implement it that way. Don't try to do a partial commit after deleting the children.
It's much more convenient to let someone else manage the transactions for you. Just mark the methods as #Transactional and let your JTA framework handle the details.
And DAO methods shouldn't even try to do transactions anyway. Think about this: you might want to implement a service later on that uses several DAO methods. If each of them tries to commit themselves in separate transactions the service call in toto cannot be a transaction. That's bad. So if you want to reuse your DAO methods, pull the transactional stuff into a separate layer above them.
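As a sketch of that layering (the class, DAO and method names below are invented for illustration), a stateless bean can compose several DAO calls into one container-managed transaction, with no flush or commit inside the DAOs:
// Hypothetical service bean: one container-managed transaction wraps both calls.
@Stateless
public class FamilyService {

    @EJB
    private FamilyDao familyDao;          // invented DAO facade

    public void removeFamily(long motherId) {
        Mother mother = familyDao.findMother(motherId);
        familyDao.deleteFamily(mother);   // deleteFamily() as shown above
    }
}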
Can anyone tell me whether I can return Hibernate entities as return values from JAX-WS web service methods?
I have some entities like these:
@Entity
public class Parent {
...
private Childone childoneByChildoneid;
@ManyToOne
@javax.persistence.JoinColumn(name="ChildOneId", referencedColumnName="Id")
public Childone getChildoneByChildoneid() {
return childoneByChildoneid;
}
public void setChildoneByChildoneid(Childone childoneByChildoneid) {
this.childoneByChildoneid = childoneByChildoneid;
}
...
}
@Entity
public class Childone {
...
private Collection<Parent> parentsById;
@OneToMany(mappedBy = "childoneByChildoneid")
public Collection<Parent> getParentsById() {
return parentsById;
}
public void setParentsById(Collection<Parent> parentsById) {
this.parentsById = parentsById;
}
...
}
And have a service like this:
@Stateless
@WebService()
public class MasterDataService {
@EJB
private MasterDataManager manager;
@WebMethod
public Parent getParent(int parentId) {
return manager.getParent(parentId);
}
}
@Stateless
public class MasterDataManager {
@PersistenceContext
EntityManager em;
public Parent getParent(int parentId) {
Parent parent = (Parent) em.createQuery(
"select p from Parent p where p.id=:parentId")
.setParameter("parentId", parentId).getSingleResult();
return parent;
}
}
When I call this web method from a client I get a LazyInitializationException :(
I tried implementing the Serializable and Cloneable interfaces and overriding the clone method, but unfortunately it doesn't work. I also used em.detach(parent) in the manager, but it still doesn't work.
Can anyone help me?
Thanks
It is debatable. Generally, you have two options:
Return the entities, but make sure they are initialized. Either mark the @*ToMany associations with fetch=FetchType.EAGER or use Hibernate.initialize(..). The reason for the exception is that, by default, collections in entities are not fetched from the database until requested; by the time the JAX-WS serializer requests them, the Hibernate session is already closed. Technically you could use some OpenSessionInViewInterceptor, but I don't think there's anything ready-to-use for JAX-WS, and it might be a problem to write one. If you don't want to transfer these collections, you can annotate them with @XmlTransient (or @JsonIgnore, depending on the serialization technique). It makes the entity somewhat of a mess, but I still prefer it to code duplication.
Use DTOs (data transfer objects) - transfer all data from the entity to a new object with a similar structure that will be exposed by the web service. Again, you'd have to make sure you populate the DTO while the Hibernate session is active.
I prefer the first option, because it requires less boilerplate code, but I agree one should be very careful with entity state management when using it.
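For instance, a rough sketch of the first option applied to the manager from the question (the Hibernate.initialize(..) call and the null check are additions of mine, not part of the original code):
// Initialize the lazy collection while the persistence context is still open,
// before the entity reaches the JAX-WS serializer.
public Parent getParent(int parentId) {
    Parent parent = em.createQuery(
            "select p from Parent p where p.id = :parentId", Parent.class)
        .setParameter("parentId", parentId)
        .getSingleResult();
    if (parent.getChildoneByChildoneid() != null) {
        // Hibernate-specific call; forces loading of the lazy @OneToMany
        Hibernate.initialize(parent.getChildoneByChildoneid().getParentsById());
    }
    return parent;
}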
Is there a nice and elegant way to set a bean value (column) before Hibernate persists an entity? Basically I have a field called "modification_date". It's on a bunch of entities. Whenever one of these entities is updated/modified, I'd basically like that field set automatically.
I could write the code in the service layer to set the date every time the object is saved/updated manually...
I also have a DAO layer. Every DAO extends from a support class that contains a save() method. I could just use reflection and set the value inside this method. I could check whether that class has a field named "modificationDate", and if it does, set it to new Date().
Is there a better way than this? Or is using my generic save() method the best approach? This is something I'd like to be robust and not have to worry about it ever again. I will be happy knowing that by simply making a "modificationDate" property that this will be taken care of for me automatically from this point on. Using the save() method seems like the best place, but if there's a better way, I'd like to become aware of it.
Check out event listeners:
@Entity
@EntityListeners(LastUpdateListener.class)
public class Cat {
@Id private Integer id;
private String name;
private Calendar dateOfBirth;
@Transient private int age;
private Date lastUpdate;
@PostLoad
public void calculateAge() {
...
}
}
public class LastUpdateListener {
/**
* automatic property set before any database persistence
*/
@PreUpdate
@PrePersist
public void setLastUpdate(Cat o) {
o.setLastUpdate( new Date() );
}
}
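If the same column lives on many entities, a common variant (a sketch of mine, not part of the original answer) is to put the field and the callback on a mapped superclass instead of a separate listener:
// Hypothetical base class; lifecycle callbacks declared here are inherited
// by every entity that extends it.
@MappedSuperclass
public abstract class Auditable {

    private Date modificationDate;

    @PrePersist
    @PreUpdate
    public void touch() {
        this.modificationDate = new Date();
    }

    public Date getModificationDate() {
        return modificationDate;
    }
}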
In my code, I did as follows:
Queried for a course entity
Populated it with the given course data
Called courseDao.update(entity), which internally calls the persist(entity) method
Surprisingly, the data got updated successfully.
I am confused by this behaviour of the persist method.
Please help me out.
code is as below:
//My Service......
#Service("myService")
#Transactional
public class MyServiceImpl implements MyService {
@Transactional(rollbackFor = { Throwable.class })
public void updateCourse(final Course course) throws MyServiceException {
------
------
CourseEntity courseEntity = courseDao.findById(course.getId());
populateCourseEntity(courseEntity, course);
courseDao.update(courseEntity);
}
}
//CourseDao.....
public class CourseDaoImpl implements CourseDao {
--------
public void update(final T entity) throws MyDaoException {
if (entity != null) {
this.entityManager.persist(entity);
}
else {
String errMsg = "Object to be updated cannot be null.";
throw new MyDaoException(errMsg);
}
}
}
When an entity is currently managed (attached to a session), all updates to it are directly reflected to the underlying storage even without calling persist().
In your case, you load your entity, so it's in the session. Then even if you don't call persist() it will be updated in the database on transaction commit.
The persist() description from the javadoc:
Make an entity instance managed and persistent.
This means that the method doesn't do anything in your case, since your entity is both persistent and managed.
P.S. Where I say "session", understand "entity manager"
JPA tries very hard to be a helpful API, such that anything you get from it (or save to it) will subsequently be tracked by JPA. This means that any further changes will be automatically handled for you by JPA without any additional work on your part.
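A minimal sketch of that behaviour, reusing the service from the question (renameCourse() and setName() are illustrative names, not from the original code):
// No explicit persist()/update() call is needed: the entity returned by
// findById() is managed, so dirty checking issues the UPDATE at commit.
@Transactional(rollbackFor = { Throwable.class })
public void renameCourse(final Long courseId, final String newName) {
    CourseEntity courseEntity = courseDao.findById(courseId);
    courseEntity.setName(newName);
    // the change is flushed automatically when the transaction commits
}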