Grails replace object in one-to-one association - java

I have the following two domain classes:
class User {
    UserData userData
}
class UserData {
    static belongsTo = [user: User]
}
At some point I want to merge two users into one, i.e. delete one User instance and attach its userData to another user.
I've tried:
User zombieUser
User liveUser
UserData data = zombieUser.userData
zombieUser.delete()
liveUser.userData = data
data.user = liveUser
data.save()
liveUser.save()
I've actually tried different variants and different orderings, seemingly all possible ways, but it always fails with an exception. The code above fails with:
deleted object would be re-saved by cascade (remove deleted object from associations): [UserData#1]
If I move zombieUser.delete() to the bottom, after the save() calls, I get:
Field error in object 'User' on field 'userData': rejected value [UserData : 1]; codes ... default message [Property [{0}] of class [{1}] with value [{2}] must be unique]
Is there any way to reattach an existing object from one owner to another?
Working code:
UserData userData = zombieUser.userData
userData.user = null
zombieUser.userData = null
zombieUser.save(flush: true)
userData.save()
liveUser.userData = userData
userData.user = liveUser
liveUser.save()
userData.save(flush: true)

The problem is that, because of your belongsTo declaration and the resulting cascading rules, userData is deleted from the database as soon as zombieUser is deleted. When liveUser.save() is called, userData would be saved again. By the way, your call to userData.save() is not needed: again because of the belongsTo declaration, liveUser.save() cascades to userData.
I see two options for solving your problem:
Option 1: Perform a deep copy of userData to a new UserData object, and attach this object to liveUser, like this:
withTransaction {
    UserData data = new UserData(prop: zombieUser.userData.prop)
    liveUser.userData = data
    zombieUser.delete()
    liveUser.save()
}
The old UserData object will be deleted by cascade when deleting zombieUser and a new one will be created when saving liveUser.
Option 2: Remove belongsTo from UserData. You won't benefit from cascading anymore, and will have to manage saving userData yourself and make sure the UserData is deleted from the database when no user references it (don't forget to use transactions for that, as always when you have several calls to save or delete in one method).
Which approach you choose depends on how coupled your objects are. Typically, if they are tightly coupled, you will benefit from cascading (through belongsTo) and may go for option 1. If they are not tightly coupled (i.e. there may be several users with the same userData), then belongsTo is wrong and you should choose option 2.
The concepts presented in GORM gotchas will greatly help you if you have a GORM model with complicated dependencies.

You need to add a mapping to your domain class to tell it what to do when you delete an object. For example, should deleting a parent also delete all its children?
static mapping = {
userData cascade: "all-delete-orphan"
}

Related

How to update only a subset of fields and update the repository?

I'm making a Spring Boot application, and I want to update an existing entry in the DB through my service and controller. In my service layer I have the method below. I'm retrieving the fields associated with a caseID, creating a ModelMapper which maps my entity class to my VO, mapping the retrieved data onto my DTO, and then saving via my repository. The goal is to update only the fields I specify in my request message, i.e. if I only want to update 1 field out of 20, that field is updated and the rest are left untouched. The code below runs successfully, but the field I specify in my Postman request does not update in the DB. Why is this? I have tried mapping different objects and saving different variables to the repository, but nothing seems to update the DB.
public StoredOutboundErrorCaseVO updateCase(OutboundErrorCaseVO outboundErrorCaseVO, Long caseNumber) {
    OutboundErrorCaseData existingCaseData = ErrorCaseDataRepository.findById(caseNumber).get();
    ModelMapper mm = new ModelMapper();
    mm.getConfiguration().setAmbiguityIgnored(true);
    OutboundErrorCaseData uiOutboundErrorCaseData = mm.map(outboundErrorCaseVO, OutboundErrorCaseData.class);
    mm.map(existingCaseData, uiOutboundErrorCaseData);
    ErrorCaseDataRepository.save(uiOutboundErrorCaseData);
    return mm.map(uiOutboundErrorCaseData, StoredOutboundErrorCaseVO.class);
}
Controller - code omitted for brevity, POST method (I usually use PUT for updates but I believe I can still use POST)
StoredOutboundErrorCaseVO updatedCase = outboundErrorService.updateCase(outboundErrorCaseVO,
caseNumber);
Repo
@Repository
public interface OutboundErrorCaseDataRepository extends JpaRepository<OutboundErrorCaseData, Long> {
}
You are fetching data into existingCaseData but saving uiOutboundErrorCaseData. My guess is that Hibernate is adding a new object to the database with a new id and your updated value. It of course depends on your model definition, especially the id.
I also think Hibernate won't let you save uiOutboundErrorCaseData with the same id if you already have an object with that id in the Hibernate session. So why don't you update existingCaseData with the new values and save that back?
I created a working solution; although I realise it can be improved, it certainly works. The only drawback is that I need to specify every field which can be updated; ideally I want a solution which takes in n fields and updates the record.
OutboundErrorCaseData existingCaseDta = ErrorCaseDataRepository.findById(caseNumber).get();
if (outboundErrorCaseVO.getChannel() != null) {
    existingCaseDta.setChannel(outboundErrorCaseVO.getChannel());
}
ErrorCaseDataRepository.save(existingCaseDta);
ModelMapper mm = new ModelMapper();
return mm.map(existingCaseDta, StoredOutboundErrorCaseVO.class);
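If the goal is a generic partial update without listing every field, one option is to map the incoming VO onto the loaded entity using ModelMapper's skip-null setting, so null fields in the request leave the entity untouched. This is only a sketch, reusing the type and repository names from the question and assuming that "null" means "do not change":
public StoredOutboundErrorCaseVO patchCase(OutboundErrorCaseVO outboundErrorCaseVO, Long caseNumber) {
    // Load the managed entity first (orElseThrow avoids a bare get())
    OutboundErrorCaseData existingCaseData = ErrorCaseDataRepository.findById(caseNumber)
            .orElseThrow(() -> new IllegalArgumentException("No case " + caseNumber));

    ModelMapper mm = new ModelMapper();
    mm.getConfiguration().setSkipNullEnabled(true); // null source properties are skipped

    mm.map(outboundErrorCaseVO, existingCaseData);  // copy only the populated VO fields onto the entity
    ErrorCaseDataRepository.save(existingCaseData); // persist the merged state

    return mm.map(existingCaseData, StoredOutboundErrorCaseVO.class);
}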

Struts2 and Hibernate insert operation error [duplicate]

org.hibernate.HibernateException: identifier of an instance
of org.cometd.hibernate.User altered from 12 to 3
In fact, my user table really must change its values dynamically; my Java app is multithreaded.
Any ideas how to fix it?
Are you changing the primary key value of a User object somewhere? You shouldn't do that. Check that your mapping for the primary key is correct.
What does your mapping XML file or mapping annotations look like?
You must detach your entity from the session before modifying its ID fields.
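A minimal sketch of that idea with plain JPA (the entity, setter, and id values are invented for illustration; with a Hibernate Session the equivalent call is session.evict(entity)):
User user = entityManager.find(User.class, 12L); // managed instance
entityManager.detach(user);                      // remove it from the persistence context
user.setId(3L);                                  // the change is no longer tracked by Hibernate
// persist or merge explicitly afterwards if the modified object should become a new row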
In my case, the PK field in hbm.xml was of type "integer" but in the bean code it was long.
In my case, the getter and setter names were different from the variable name.
private Long stockId;
public Long getStockID() {
    return stockId;
}
public void setStockID(Long stockID) {
    this.stockId = stockID;
}
where it should be
public Long getStockId() {
    return stockId;
}
public void setStockId(Long stockID) {
    this.stockId = stockID;
}
In my case, I solved it by changing the @Id field type from long to Long.
In my particular case, this was caused by a method in my service implementation that needed Spring's @Transactional(readOnly = true) annotation. Once I added that, the issue was resolved. Unusual, though, since it was just a select statement.
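For illustration only (the repository and entity names here are invented, not taken from the answer above), the fix amounts to something like:
@Transactional(readOnly = true)
public User findUser(Long id) {
    // a pure select, but now it runs inside a (read-only) transaction
    return userRepository.findById(id).orElse(null);
}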
Make sure you aren't trying to use the same User object more than once while changing the ID. In other words, if you were doing something in a batch type operation:
User user = new User(); // Using the same one over and over, won't work
List<Customer> customers = fetchCustomersFromSomeService();
for (Customer customer : customers) {
    // User user = new User(); <-- This would work, you get a new one each time
    user.setId(customer.getId());
    user.setName(customer.getName());
    saveUserToDB(user);
}
In my case, a template had a typo: instead of checking for equality (==) it was using assignment (=).
So I changed the template logic from:
if (user1.id = user2.id) ...
to
if (user1.id == user2.id) ...
and now everything is fine. So, check your views as well!
It is a problem in your update method. Just instantiate a new User before you save the changes and you will be fine. If you map between DTO and entity classes, do this before mapping.
I had this error also. I had a User object and was trying to change its Location; Location was a FK in the User table. I solved the problem with:
@Transactional
public void update(User input) throws Exception {
    User userDB = userRepository.findById(input.getUserId()).orElse(null);
    userDB.setLocation(new Location());
    userMapper.updateEntityFromDto(input, userDB);
    User user = userRepository.save(userDB);
}
Also ran into this error message, but the root cause was of a different flavor from those referenced in the other answers here.
Generic answer:
Make sure that once hibernate loads an entity, no code changes the primary key value in that object in any way. When hibernate flushes all changes back to the database, it throws this exception because the primary key changed. If you don't do it explicitly, look for places where this may happen unintentionally, perhaps on related entities that only have LAZY loading configured.
In my case, I am using a mapping framework (MapStruct) to update an entity. In the process, other referenced entities were also being updated, as mapping frameworks tend to do by default. Because I was later replacing the original referenced entity with a new one (in DB terms, changing the value of the foreign key to reference a different row in the related table), the primary key of the previously referenced entity had already been updated, and Hibernate attempted to persist this update on flush.
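One way to guard against this, sketched here with invented names rather than the original code, is to tell the MapStruct update method to leave the primary key (and any managed association you replace yourself) alone:
import org.mapstruct.Mapper;
import org.mapstruct.Mapping;
import org.mapstruct.MappingTarget;

@Mapper(componentModel = "spring")
public interface UserMapper {

    @Mapping(target = "id", ignore = true)       // never let the mapper touch the PK
    @Mapping(target = "location", ignore = true) // replace managed associations explicitly instead
    void updateEntityFromDto(UserDto dto, @MappingTarget User entity);
}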
I was facing this issue, too.
The target table is a relation table, wiring two IDs from different tables. I have a UNIQUE constraint on the value combination, replacing the PK.
When updating one of the values of a tuple, this error occurred.
This is what the table looks like (MySQL):
CREATE TABLE my_relation_table (
    mrt_left_id BIGINT NOT NULL,
    mrt_right_id BIGINT NOT NULL,
    UNIQUE KEY uix_my_relation_table (mrt_left_id, mrt_right_id),
    FOREIGN KEY (mrt_left_id) REFERENCES left_table(lef_id),
    FOREIGN KEY (mrt_right_id) REFERENCES right_table(rig_id)
);
The Entity class for the RelationWithUnique entity looks basically like this:
@Entity
@IdClass(RelationWithUnique.class)
@Table(name = "my_relation_table")
public class RelationWithUnique implements Serializable {

    ...

    @Id
    @ManyToOne
    @JoinColumn(name = "mrt_left_id", referencedColumnName = "left_table.lef_id")
    private LeftTableEntity leftId;

    @Id
    @ManyToOne
    @JoinColumn(name = "mrt_right_id", referencedColumnName = "right_table.rig_id")
    private RightTableEntity rightId;

    ...
I fixed it by
// usually, we need to detach the object as we are updating the PK
// (rightId being part of the UNIQUE constraint) => PK
// but this would produce a duplicate entry,
// therefore, we simply delete the old tuple and add the new one
final RelationWithUnique newRelation = new RelationWithUnique();
newRelation.setLeftId(oldRelation.getLeftId());
newRelation.setRightId(rightId); // here, the value is updated actually
entityManager.remove(oldRelation);
entityManager.persist(newRelation);
Thanks a lot for the hint of the PK, I just missed it.
The problem can also be a mismatch between the type of the object's PK ("User" in your case) and the type you ask Hibernate for in session.get(type, id).
In my case the error was identifier of an instance of <skipped> was altered from 16 to 32.
The object's PK type was Integer, but Hibernate was asked for a Long.
In my case it was because the property was long on the object but int in the mapping XML; this exception should be clearer.
If you are using Spring MVC or Spring Boot, try to avoid @ModelAttribute("user") in one controller and model.addAttribute("user", userRepository.findOne(someId)) in another controller.
This situation can produce such an error.
This is an old question, but I'm going to add the fix for my particular issue (Spring Boot, JPA using Hibernate, SQL Server 2014) since it doesn't exactly match the other answers included here:
I had a foreign key, e.g. my_id = '12345', but the value in the referenced column was my_id = '12345 '. It had an extra space at the end which hibernate didn't like. I removed the space, fixed the part of my code that was allowing this extra space, and everything works fine.
Faced the same issue.
I had an association between two beans. In bean A I had defined the variable type as Integer, and in bean B I had defined the same variable as Long.
I changed both of them to Integer. This solved my issue.
I solved this by assigning a new instance of the dependent object instead of reusing the old one. For example:
instanceA.setInstanceB(new InstanceB());
// or, with the new value you want:
instanceA.setInstanceB(yourNewValue);
In my case I had a primary key in the database that contained an accent, while the foreign key in the other table didn't. For some reason, MySQL allowed this.
It looks like you have changed the identifier of an instance of the org.cometd.hibernate.User object managed by the JPA entity context.
In this case, create a new User entity object with the appropriate id and set it in place of the original User object.
Are you using multiple transaction managers from the same service class, i.e. does your project have two or more transaction configurations?
If so, separate them first.
I got the issue when I tried fetching an existing DB entity, modified a few fields and executed
session.save(entity)
instead of
session.merge(entity)
Since the entity already exists in the DB, we should merge() instead of save().
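A minimal sketch of that pattern (the entity and the helper returning a detached instance are invented for illustration):
Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();
try {
    User detached = loadUserFromCache(42L); // hypothetical helper returning a previously loaded, now detached instance
    detached.setName("new name");
    session.merge(detached);                // copies the state onto the managed row instead of re-saving it
    tx.commit();
} finally {
    session.close();
}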
You may have modified the primary key of a fetched entity and then tried to save it within the same transaction to create a new record from an existing one.

JPA handle merge() of relationship

I have a unidirectional relation Project -> ProjectType:
@Entity
public class Project extends NamedEntity
{
    @ManyToOne(optional = false)
    @JoinColumn(name = "TYPE_ID")
    private ProjectType type;
}

@Entity
public class ProjectType extends Lookup
{
    @Min(0)
    private int progressive = 1;
}
Note that there's no cascade.
Now, when I insert a new Project I need to increment the type progressive.
This is what I'm doing inside an EJB, but I'm not sure it's the best approach:
public void create(Project project)
{
    em.persist(project);
    /* is it necessary to merge the type? */
    ProjectType type = em.merge(project.getType());
    /* is it necessary to set the type again? */
    project.setType(type);
    int progressive = type.getProgressive();
    type.setProgressive(progressive + 1);
    project.setCode(type.getPrefix() + progressive);
}
I'm using EclipseLink 2.6.0, but I'd like to know if there's an implementation-independent best practice and/or if there are behavioral differences between persistence providers in this specific scenario.
UPDATE
To clarify the context when entering the EJB create method (it is invoked by a JSF @ManagedBean):
project.projectType is DETACHED
project is NEW
no transaction (I'm using JTA/CMT) is active
I am not asking about the difference between persist() and merge(). I'm asking:
whether em.persist(project) automatically "reattaches" project.projectType (I suppose not)
whether the call order is legal: first em.persist(project), then em.merge(projectType), or whether it should be inverted
since em.merge(projectType) returns a different instance, whether it is required to call project.setType(managedProjectType)
An explanation of why it works one way and not another is also welcome.
You need merge(...) only to make a detached or transient entity managed by your entity manager. Depending on the JPA implementation (not sure about EclipseLink), the instance returned by the merge call might be a different copy of the original object.
MyEntity unmanaged = new MyEntity();
MyEntity managed = entityManager.merge(unmanaged);
assert(entityManager.contains(managed)); // true if everything worked out
assert(managed != unmanaged); // probably true, depending on JPA impl.
If you call merge(entity) where entity is already managed, nothing will happen.
Calling persist(entity) will also make your entity managed, but it returns no copy. Instead it makes the original object itself managed, and it might also call an ID generator (e.g. a sequence), which is not the case when using merge.
See this answer for more details on the difference between persist and merge.
Here's my proposal:
public void create(Project project) {
    ProjectType type = project.getType(); // maybe check if null
    if (!entityManager.contains(type)) {  // type is transient
        type = entityManager.merge(type); // or load the type
        project.setType(type);            // update the reference
    }
    int progressive = type.getProgressive();
    type.setProgressive(progressive + 1); // mark as dirty, update on flush
    // set "code" before persisting "project" ...
    project.setCode(type.getPrefix() + progressive);
    entityManager.persist(project);
    // ... now no additional UPDATE is required after the
    // INSERT on "project".
}
UPDATE
whether em.persist(project) automatically "reattaches" project.projectType (I suppose not)
No. You'll probably get an exception (Hibernate does, anyway) stating that you're trying to merge with a transient reference.
Correction: I tested it with Hibernate and got no exception. The project was created with the unmanaged project type (which was managed and then detached before persisting the project). But the project type's progressive was not incremented, as expected, since it wasn't managed. So yes, manage it before persisting the project.
whether the call order is legal: first em.persist(project), then em.merge(projectType), or whether it should be inverted
It's best practice to merge first. But when both statements are executed within the same batch (before the entity manager gets flushed), it may even work to merge the type after persisting the project; in my test it worked anyway. But as I said, it's better to merge the existing entities before persisting new ones.
since em.merge(projectType) returns a different instance, whether it is required to call project.setType(managedProjectType)
Yes. See example above. A persistence provider may return the same reference, but it isn't required to. So to be sure, call project.setType(mergedType).
Do you need to merge? Well it depends. According to merge() javadoc:
Merge the state of the given entity into the current persistence
context
How did you get the instance of ProjectType that you attach to your Project? If that instance is already managed then all you need to do is:
type.setProgressive(type.getProgressive() + 1)
and JPA will automatically issue an update effective on next context flush.
Otherwise if the type is not managed then you need to merge it first.
Although not directly related, this question has some good insight about persist vs merge: JPA EntityManager: Why use persist() over merge()?
Regarding the call order of em.persist(project) vs em.merge(projectType), you should probably ask yourself what should happen if the type is gone from the database. If you merge the type first, it will get re-inserted; if you persist the project first and you have a FK constraint, the insert will fail (because it's not cascading).
Here in this code, merge basically gives you back the stored record in a different object. Say there is an Account POJO:
Account account = new Account();
account = entityManager.merge(account);
You can then work with the result returned by merge.
But in your code you are using merge in a different situation:
public void create(Project project)
{
    em.persist(project);
    /* is it necessary to merge the type? */
    ProjectType type = em.merge(project.getType());
}
Here Project and ProjectType are two different POJOs; you use merge on the same POJO whose managed copy you want back. If there is a relationship between your POJOs, you can also use it for that.

Saving OneToOne mapped object in the database?

I'm trying to save a UserOnline object, which has a OneToOne relationship with User, in the database. I want to create a new one if it doesn't exist, and if it does exist, simply change the room.
This is the code I'm using:
UserOnline uo = (UserOnline) UserOnline.find("byUser", getConnectedUser()).first();
if (uo == null) {
    uo = new UserOnline(getConnectedUser(), room);
    room.save();
} else {
    uo.currentRoom = room;
    uo.save();
}
For some reason, even though uo is actually null, the new object isn't actually saved. Any ideas why that is? It's not giving me an error; it just isn't creating the record. I'm also wondering how I could create a UserOnline object starting from the User object.
Something like
User user = User.findById(1);
user.onlineStatus.room = room;
user.save();
Can related objects be saved this way (created if they don't exist, and otherwise edited)?
User.java
@OneToOne(mappedBy="user")
public UserOnline onlineStatus;
The save() method is from the Play Framework.
I guess this question is related to the Play Framework. If so, it would be better to tag it as such, since the use of Hibernate in the Play Framework has some peculiarities; see Explicit save.
Regarding the question, Play Framework's save() is cascaded on relationships that have cascade=CascadeType.ALL on them. If your relationship in question is configured this way, it should work fine.
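As a sketch of that configuration (reusing the mapping from the question; the cascade attribute is the part being added):
// User.java
@OneToOne(mappedBy = "user", cascade = CascadeType.ALL)
public UserOnline onlineStatus;
With that in place, something like the following should create or update the related object when the user is saved:
User user = User.findById(1L);
user.onlineStatus = new UserOnline(user, room); // or just update user.onlineStatus.currentRoom
user.save();                                    // cascades to onlineStatus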

How to create entities in one Entity group?

I am building an app based on google app engine (Java) using JDO for persistence.
Can someone give me an example or point me to some code which shows how to persist multiple entities (of the same type) using javax.jdo.PersistenceManager.makePersistentAll() within a transaction?
Basically I need to understand how to put multiple entities in one entity group so that they can be saved using makePersistentAll() inside a transaction.
This section of the docs deals with exactly that.
I did this:
public static final Key root_key = KeyFactory.createKey("Object", "RootKey");
...
so a typical datastore persistent object will set the id in the constructor instead of getting one automatically
public DSO_MyType(String name, Key parent)
{
    KeyFactory.Builder b = new KeyFactory.Builder(parent);
    id = b.addChild(DSO_MyType.class.getSimpleName(), name).getKey();
}
and you pass root_key as the parent.
I'm not sure if you can pass different parents to objects of the same kind.
Thanks for the response Nick.
That document only talks about implicit handling of entity groups by App Engine when there is a parent-child relationship. I want to save multiple objects of the same type using PersistenceManager.makePersistentAll(list) within a transaction. If the objects are not in the same entity group, this throws an exception. Currently I can do it as below, but I think there must be a better and more appropriate approach:
User u1 = new User("a");
UserDAO.getInstance().addObject(u1);
// UserDAO.addObject uses PersistenceManager.makePersistent() in a transaction, and the user
// object now has its Key set. I want to avoid this step.
User u2 = new User("x");
u2.setKey(KeyFactory.createKey(u1.getKey(), User.class.getSimpleName(), 100 /* some random id */));
User u3 = new User("p");
u3.setKey(KeyFactory.createKey(u1.getKey(), User.class.getSimpleName(), 200));
UserDAO.getInstance().addObjects(Arrays.asList(new User[]{u2, u3}));
// UserDAO.addObjects uses PersistenceManager.makePersistentAll() in a transaction.
Although this approach works, the problem with this is that you have to depend on an already persistent entity to create an entity group.
Gopi, AFAIK you don't have to do that... this should work (haven't tested it):
List<User> userList = new ArrayList<User>();
userList.add(new User("a"));
userList.add(new User("b"));
userList.add(new User("c"));
UserDAO.getInstance().addObjects(userList);
Again, AFAIK, this should put all these objects in the same entity group. I'd love to know if I am wrong.
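For completeness, here is an untested sketch of the same idea done explicitly: give every entity a key with a shared (invented) ancestor up front and persist them in one transaction, so no entity has to be saved first just to obtain a parent key. PMF is assumed to be the usual singleton PersistenceManagerFactory wrapper.
PersistenceManager pm = PMF.get().getPersistenceManager();
Transaction tx = pm.currentTransaction();
try {
    tx.begin();
    Key parent = KeyFactory.createKey("UserGroup", "group-1"); // shared ancestor for the whole group
    User u1 = new User("a");
    u1.setKey(KeyFactory.createKey(parent, User.class.getSimpleName(), 1L));
    User u2 = new User("b");
    u2.setKey(KeyFactory.createKey(parent, User.class.getSimpleName(), 2L));
    pm.makePersistentAll(Arrays.asList(u1, u2)); // both writes stay inside one entity group
    tx.commit();
} finally {
    if (tx.isActive()) {
        tx.rollback();
    }
    pm.close();
}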
