How to create entities in one entity group? (Java)

I am building an app on Google App Engine (Java) using JDO for persistence.
Can someone give me an example, or point me to some code, that shows persisting multiple entities (of the same type) using javax.jdo.PersistenceManager.makePersistentAll() within a transaction?
Basically I need to understand how to put multiple entities in one Entity Group so that they can be saved with makePersistentAll() inside a transaction.

This section of the docs deals with exactly that.

I did this:
public static final Key root_key = KeyFactory.createKey("Object", "RootKey");
...
so a typical datastore persistent object will set the id in its constructor instead of getting one automatically:
public DSO_MyType(String name, Key parent)
{
KeyFactory.Builder b = new KeyFactory.Builder(parent);
id = b.addChild(DSO_MyType.class.getSimpleName(), name).getKey();
}
and you pass root_key as the parent.
I'm not sure if you can pass different parents to objects of the same kind.
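Roughly, you would then use that constructor together with makePersistentAll() inside a transaction like this. A sketch only (I haven't tested it); the persistence unit name "transactions-optional" is the usual App Engine default, and it assumes DSO_MyType is persistence-capable with its Key id set as in the constructor above:
import java.util.Arrays;
import java.util.List;
import javax.jdo.JDOHelper;
import javax.jdo.PersistenceManager;
import javax.jdo.PersistenceManagerFactory;
import javax.jdo.Transaction;
import com.google.appengine.api.datastore.Key;
import com.google.appengine.api.datastore.KeyFactory;

public class BatchSaveExample {
    // shared synthetic root key; it never has to be persisted itself
    private static final Key ROOT_KEY = KeyFactory.createKey("Object", "RootKey");

    private static final PersistenceManagerFactory PMF =
            JDOHelper.getPersistenceManagerFactory("transactions-optional");

    public static void saveAll() {
        PersistenceManager pm = PMF.getPersistenceManager();
        Transaction tx = pm.currentTransaction();
        try {
            tx.begin();
            // every key is built as a child of the same ROOT_KEY,
            // so all three objects end up in one entity group
            List<DSO_MyType> objects = Arrays.asList(
                    new DSO_MyType("a", ROOT_KEY),
                    new DSO_MyType("b", ROOT_KEY),
                    new DSO_MyType("c", ROOT_KEY));
            pm.makePersistentAll(objects);
            tx.commit();
        } finally {
            if (tx.isActive()) {
                tx.rollback();
            }
            pm.close();
        }
    }
}
Since all the keys share the same parent, the batch satisfies the single-entity-group rule for transactions.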

Thanks for the response Nick.
This document only covers the implicit handling of entity groups by App Engine when there is a parent-child relationship. I want to save multiple objects of the same type using PersistenceManager.makePersistentAll(list) within a transaction. If the objects are not in the same Entity Group, this throws an exception. Currently I can do it as below, but I think there must be a better and more appropriate approach:
User u1 = new User("a");
UserDAO.getInstance().addObject(user1);
// UserDAO.addObject uses PersistentManager.makePersistent() in transaction and user
// object now has its Key set. I want to avoid this step.
User u2 = new User("x");
u2.setKey(KeyFactory.createKey(u1.getKey(),User.class.getSimpleName(), 100 /*some random id*/));
User u3 = new User("p");
u3.setKey(KeyFactory.createKey(u1.getKey(), User.class.getSimpleName(), 200));
UserDAO.getInstance().addObjects(Arrays.asList(new User[]{u2, u3}));
// UserDAO.addObjects uses PersistenceManager.makePersistentAll() in a transaction.
Although this approach works, the problem is that you have to depend on an already-persistent entity to create an entity group.

Gopi, AFAIK you don't have to do that... this should work (haven't tested it):
List<User> userList = new ArrayList<User>();
userList.add(new User("a"));
userList.add(new User("b"));
userList.add(new User("c"));
UserDAO.getInstance().addObjects(userList);
Again, AFAIK, this should put all these objects in the same entity group. I'd love to know if I am wrong.

Related

HibernateException when updating a collection configured with delete orphan: can't save the parent object

I am working on a Java project and I have to write a new module in order to copy some data from one database to another (same tables).
I have an entity Contrat containing several fields, including the following one:
@OneToMany(mappedBy = "contrat", fetch = FetchType.LAZY)
@Fetch(FetchMode.SUBSELECT)
@Cascade( { org.hibernate.annotations.CascadeType.ALL, org.hibernate.annotations.CascadeType.DELETE_ORPHAN })
@BatchSize(size = 50)
private Set<MonElement> elements = new HashSet<MonElement>();
I must read some "Contrat" objects from a database and write them in another database.
I am hesitating between 2 solutions:
use JDBC to query the first database, get the objects, and then write those objects into the second database (paying attention to the order and the different keys). This would be long and tedious.
as the project currently uses Hibernate and contains all the Hibernate mapping classes, I was thinking about opening a first session to the first database, reading the Hibernate Contrat object, setting the ids to null in the child elements, and writing the object to the destination database with a second session. It should be quicker.
I wrote a test class for the second approach and the process fails with the following exception:
org.hibernate.HibernateException: Don't change the reference to a collection with cascade="all-delete-orphan"
I think the reference must change when I set the ids to null, but I am not sure: I don't understand how changing a field of a Collection member can change the Collection reference.
Note that if I remove DELETE_ORPHAN from the configuration, everything works: all the objects and their dependencies are written to the database.
So I would like to use the Hibernate solution, which is faster, but I have to keep the DELETE_ORPHAN feature because the application currently relies on it to ensure that every MonElement removed from the elements Set is deleted from the database.
My new module does not need this feature, but I cannot remove it.
Also, I need to set the MonElement ids to null in order to generate new ones, because their ids from the first database may already exist in the target database.
Here is the code I wrote; it works well when I remove the DELETE_ORPHAN option.
SessionFactory sessionFactory = new AnnotationConfiguration().configure("/hibernate.cfg.src.xml").buildSessionFactory();
Session session = sessionFactory.openSession();
// search the Contrat object
Criteria crit = session.createCriteria(Contrat.class);
CriteriaUtil.addEqualCriteria(crit, "column", "65465454");
Contrat contrat = (Contrat)crit.list().get(0);
session.close();
SessionFactory sessionFactoryDest = new AnnotationConfiguration().configure("/hibernate.cfg.dest.xml").buildSessionFactory();
Session sessionDest = sessionFactoryDest.openSession();
Transaction transaction = sessionDest.beginTransaction();
// setting id to null, also for the elements in the elements Set
contrat.setId(null);
for (MonElement element:contrat.getElements()) {
element.setId(null);
}
// writing the object in the database
sessionDest.save(contrat);
transaction.commit();
sessionDest.flush();
sessionDest.close();
This is much faster than managing the queries and the primary/foreign keys and dependencies between objects myself.
Does anyone have an idea how to get rid of this exception?
Or maybe I should change the state of the Set.
In fact I'm not trying to delete any element of this Set; I just want them to be considered as new objects.
If I don't find a solution, I will do something dirty: duplicate all the Hibernate entity objects in my new project and remove the DELETE_ORPHAN parameter in the newly created Contrat.
So the application will continue using its mapping and my new project will use my specific mapping. But I want to avoid that.
Thanks
A correct solution was given by crizzis as a comment on my question.
I quote him:
I'd try wrapping contrat.elements in a new collection (contrat.setElements(new HashSet<>(contrat.getElements()))) before trying to persist the contract with the new session
It works well.
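For reference, here is roughly how the copy code from the question changes with that fix applied. A sketch only; it slots into the snippet above and assumes Contrat has a setElements(...) setter (not shown in the question):
// wrap the elements in a brand-new Set so Hibernate's all-delete-orphan
// bookkeeping on the original collection instance is left untouched
contrat.setId(null);
Set<MonElement> copiedElements = new HashSet<MonElement>(contrat.getElements());
for (MonElement element : copiedElements) {
    element.setId(null);
}
contrat.setElements(copiedElements);
sessionDest.save(contrat);
transaction.commit();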

JPA handle merge() of relationship

I have a unidirectional relation Project -> ProjectType:
@Entity
public class Project extends NamedEntity
{
@ManyToOne(optional = false)
@JoinColumn(name = "TYPE_ID")
private ProjectType type;
}
@Entity
public class ProjectType extends Lookup
{
@Min(0)
private int progressive = 1;
}
Note that there's no cascade.
Now, when I insert a new Project I need to increment the type progressive.
This is what I'm doing inside an EJB, but I'm not sure it's the best approach:
public void create(Project project)
{
em.persist(project);
/* is it necessary to merge the type? */
ProjectType type = em.merge(project.getType());
/* is it necessary to set the type again? */
project.setType(type);
int progressive = type.getProgressive();
type.setProgressive(progressive + 1);
project.setCode(type.getPrefix() + progressive);
}
I'm using EclipseLink 2.6.0, but I'd like to know if there's an implementation-independent best practice and/or if there are behavioral differences between persistence providers in this specific scenario.
UPDATE
to clarify the context when entering the EJB create method (it is invoked by a JSF @ManagedBean):
project.projectType is DETACHED
project is NEW
no transaction (I'm using JTA/CMT) is active
I am not asking about the difference between persist() and merge(); I'm asking:
whether em.persist(project) automatically "reattaches" project.projectType (I suppose not)
whether the call order is legal: first em.persist(project), then em.merge(projectType), or whether it should be inverted
since em.merge(projectType) returns a different instance, whether it is required to call project.setType(managedProjectType)
An explanation of why this works one way and not another is also welcome.
You need merge(...) only to make an unmanaged entity managed by your entity manager. Depending on the JPA implementation (not sure about EclipseLink), the instance returned by the merge call might be a different copy of the original object.
MyEntity unmanaged = new MyEntity();
MyEntity managed = entityManager.merge(unmanaged);
assert(entityManager.contains(managed)); // true if everything worked out
assert(managed != unmanaged); // probably true, depending on JPA impl.
If you call merge(entity) on an entity that is already managed, nothing will happen.
Calling persist(entity) will also make your entity managed, but it does not return a copy: the original instance itself becomes managed. Persisting might also invoke an ID generator (e.g. a sequence), which is not the case when using merge.
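For contrast, here is the persist() counterpart of the merge snippet above (same placeholder MyEntity; just a sketch):
MyEntity entity = new MyEntity();
entityManager.persist(entity);          // the very same instance becomes managed
assert(entityManager.contains(entity)); // true after persist
// entity's ID may already be populated here if an ID generator was invoked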
See this answer for more details on the difference between persist and merge.
Here's my proposal:
public void create(Project project) {
ProjectType type = project.getType(); // maybe check if null
if (!entityManager.contains(type)) { // type is transient
type = entityManager.merge(type); // or load the type
project.setType(type); // update the reference
}
int progressive = type.getProgressive();
type.setProgressive(progressive + 1); // mark as dirty, update on flush
// set "code" before persisting "project" ...
project.setCode(type.getPrefix() + progressive);
entityManager.persist(project);
// ... now no additional UPDATE is required after the
// INSERT on "project".
}
UPDATE
whether em.persist(project) automatically "reattaches" project.projectType (I suppose not)
No. You'll probably get an exception (Hibernate throws one, anyway) stating that you're trying to merge with a transient reference.
Correction: I tested it with Hibernate and got no exception. The project was created with the unmanaged project type (which had been managed and then detached before persisting the project). But the project type's progressive was not incremented, as expected, since it wasn't managed. So yeah, manage it before persisting the project.
whether the call order is legal: first em.persist(project), then em.merge(projectType), or whether it should be inverted
It's best practice to do so. But when both statements are executed within the same batch (before the entity manager gets flushed), it may even work (merging the type after persisting the project). In my test it worked anyway. But as I said, it's better to merge the entities before persisting new ones.
since em.merge(projectType) returns a different instance, whether it is required to call project.setType(managedProjectType)
Yes. See example above. A persistence provider may return the same reference, but it isn't required to. So to be sure, call project.setType(mergedType).
Do you need to merge? Well, it depends. According to the merge() Javadoc:
Merge the state of the given entity into the current persistence context.
How did you get the instance of ProjectType that you attach to your Project? If that instance is already managed, then all you need to do is
type.setProgressive(type.getProgressive() + 1);
and JPA will automatically issue an update effective on the next context flush.
Otherwise, if the type is not managed, then you need to merge it first.
Although not directly related, this question has some good insight about persist vs merge: JPA EntityManager: Why use persist() over merge()?
With the call order of em.persist(project) vs em.merge(projectType), you should probably ask yourself what should happen if the type is gone from the database. If you merge the type first, it will get re-inserted; if you persist the project first and you have an FK constraint, the insert will fail (because it's not cascading).
In this code, merge() basically stores the record in a different object. Say there is an Account POJO:
Account account = new Account();
account = entityManager.merge(account);
You can then work with the returned result.
But in your code you are using merge() in a different situation:
public void create(Project project)
{
em.persist(project);
/* is it necessary to merge the type? */
ProjectType type = em.merge(project.getType());
}
Here Project and ProjectType are two different POJOs; you can only use merge() for the POJO instance you pass in. Or, if there is a relationship between your POJOs, then you can also use it through that relationship.

EJB named query preference

I have a database with this structure.
I am using JSP + Servlet + Entity Classes from Database + Session Beans for Entity Classes. As you can see, my tables are normalized, which in turn makes it necessary to join tables to obtain the full details of a patient/staff member. As I studied https://netbeans.org/kb/docs/javaee/ecommerce/intro.html, I saw that they access the database using facade.find() and so on. Considering my case, I have also tried using the same approach.
For example, I have a session bean (Profile Manager) which accesses the entities and puts them in a map.
public Map getPatientDetails(int patientID)
{
Map patientMap = new HashMap();
Patient patient = patientFacade.find(patientID);
User user = userFacade.find(patient.getUserId().getId());
UserContact userContact = user.getUserContact();
Family family = familyFacade.find(patient.getFamilyId().getId());
String patientDOB = new SimpleDateFormat("MMMMM dd, yyyy").format(user.getDateOfBirth());
patientMap.put("familyRecord", family);
patientMap.put("patientRecord", patient);
patientMap.put("patientDOB", patientDOB);
patientMap.put("userRecord", user);
patientMap.put("userContactRecord", userContact);
return patientMap;
}
As I gave myself time to think about it, I thought that I could join the entities using a named query instead, making it a single access. Which is the right way to do this? Do you think using facades to access my database is better than constructing an inner-join query to get all the information at once? What would you guys suggest? Thanks!
I would suggest avoiding joins in your SQL as, in my experience, they are one of the main root causes of performance issues in the data access layer.
I would suggest fetching the entities one by one (as Hibernate does). With this method there will be more round trips to the database, but the SQL statements will be simple and thus faster.
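For completeness, the single-query alternative mentioned in the question would look roughly like this with a fetch-join named query. Just a sketch: the relationship names userId and familyId are assumptions taken from the getters used in getPatientDetails above, and the annotation would be placed on the generated Patient entity class:
@NamedQuery(
        name = "Patient.findWithDetails",
        query = "SELECT p FROM Patient p "
              + "JOIN FETCH p.userId "
              + "JOIN FETCH p.familyId "
              + "WHERE p.id = :patientId")

// usage, e.g. inside the Profile Manager session bean (em is an injected EntityManager):
Patient patient = em.createNamedQuery("Patient.findWithDetails", Patient.class)
                    .setParameter("patientId", patientID)
                    .getSingleResult();
The UserContact could still be navigated lazily from the fetched User, or added as a further (provider-dependent) nested fetch join.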

Saving OneToOne mapped object in the database?

I'm trying to save a UserOnline object, which has a OneToOne relationship with User, in the database. I want to create a new one if it doesn't exist, and if it does exist, simply change the room.
This is the code I'm using:
UserOnline uo = (UserOnline) UserOnline.find("byUser",
getConnectedUser()).first();
if (uo == null) {
uo = new UserOnline(getConnectedUser(), room);
room.save();
} else {
uo.currentRoom = room;
uo.save();
}
For some reason, even when uo is actually null, the object isn't saved. Any ideas why that is? It's not giving me an error; it just isn't creating the record. I'm also wondering how I could create a UserOnline object starting from the User object.
Something like
User user = User.findById(1);
user.onlineStatus.room = room;
user.save();
Can related objects be (created if they don't exist and otherwise edited) saved this way?
User.java
@OneToOne(mappedBy="user")
public UserOnline onlineStatus;
The save() method is from the play framework.
I guess this question is related to the Play Framework. If so, it would be better to tag it as such, since the use of Hibernate in the Play Framework has some peculiarities; see Explicit save.
Regarding the question, Play Framework's save() is cascaded to relationships that have cascade=CascadeType.ALL on them. If your relationship in question is configured this way, it should work fine.
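As a rough, untested sketch of what that could look like with the mapping from the question (either cascade the save through the relationship, or save the UserOnline explicitly):
// option 1: let user.save() cascade to the related UserOnline
@OneToOne(mappedBy = "user", cascade = CascadeType.ALL)
public UserOnline onlineStatus;

// option 2: save the UserOnline instance itself in both branches; the uo.save()
// call appears to be missing from the original snippet when uo == null
UserOnline uo = (UserOnline) UserOnline.find("byUser", getConnectedUser()).first();
if (uo == null) {
    uo = new UserOnline(getConnectedUser(), room);
}
uo.currentRoom = room;
uo.save();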

JSF pages don't show newly added values

I'm using Java EE 6 and JSF to make a simple CRUD application.
In many of my JSF pages, I have a selectOneMenu for the user to select an existing item. For example, if the user is adding an "Exam", he/she can choose a "Department" from the combo box, since they have a one-to-many relationship.
The problem is that whenever a new Department is added, the combo box values are not updated until the session times out. I need to use session-scoped backing beans because I need the values to persist across multiple requests.
Here is a function I'm using to populate the selectOneMenu (not exactly as the example described above, but very similar):
public SelectItem[] getExamsByDepartment(){
if(departmentMaster!=null){
Collection<ExamMaster> examMasterCollection = departmentMaster.getExamMasterCollection();
//Problem: newly added exams aren't shown until session is re-created
if (examMasterCollection != null && examMasterCollection.size() > 0) {
SelectItem[] selectItem = new SelectItem[examMasterCollection.size()];
Iterator<ExamMaster> i = examMasterCollection.iterator();
int count = 0;
while (i.hasNext()) {
ExamMaster tmpExamMaster = i.next();
selectItem[count++] = new SelectItem(tmpExamMaster, tmpExamMaster.getExamName());
}
return selectItem;
}
}
SelectItem[] selectItem = new SelectItem[1];
selectItem[0] = new SelectItem("", "[No exams found]");
return selectItem;
}
Is there any workaround by which I can destroy and re-create the Session? Or any other way by which I can solve this problem?
EDIT: I guess the issue boils down to why the Collection doesn't update after inserting a record, even though the one-to-many relationship is defined in the Persistence class
EDIT 2: Instead of using the collection, I am using a Named Query now, and it fetches the new record as expected. I had assumed that the Collection should be updated automagically. Perhaps that is not true?
Generally it's not a good idea to use session-scoped beans for these cases. Make your bean @ViewScoped.
Also, whenever you add an item, re-fetch the collection (this covers the case where the same user adds an item and then reads the collection).
EDIT 2: Instead of using the collection, I am using a Named Query now, and it fetches the new record as expected. I had assumed that the Collection should be updated automagically. Perhaps that is not true?
An object's collection is only updated when it's re-synced with the database. If you're not explicitly re-syncing with the database, it won't update.
The object in the session has its own copy of the values and won't go back to the database unless something actually tells it to.
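A minimal sketch of the "re-fetch after adding" idea from the answer above; the facade and setter names here are hypothetical, assuming a facade/EntityManager setup similar to the other questions on this page:
// persist the new exam, then re-sync the parent so its mapped
// examMasterCollection reflects the insert
public void addExam(ExamMaster newExam) {
    newExam.setDepartmentMaster(departmentMaster); // assumed setter on ExamMaster
    examFacade.create(newExam);                    // hypothetical facade that persists the exam

    // either refresh the managed parent so the collection is reloaded ...
    // em.refresh(departmentMaster);

    // ... or simply re-fetch it, which also works for detached instances
    departmentMaster = departmentFacade.find(departmentMaster.getId());
}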
