I'm trying to save an entity called Hotel with greenDAO. Each hotel has a one-to-many relation with some agreements, and each agreement has got... well, a picture is worth a thousand words.
Now, what I do is the following:
daoSession.runInTx(new Runnable() {
    @Override
    public void run() {
        ArrayList<Hotel> listOfHotels = getData().getAvailability();
        for (Hotel h : listOfHotels) {
            List<HotelAgreement> hotelAgreements = h.getAgreements();
            for (HotelAgreement ha : hotelAgreements) {
                ha.setHotel_id(h.getHotel_id());
                HotelAgreementDeadline hotelAgreementDeadline = ha.getDeadline();
                List<HotelRemark> hr = hotelAgreementDeadline.getRemarks();
                List<HotelAgreementDeadlinePolicies> hadp = hotelAgreementDeadline.getPolicies();

                daoSession.getHotelReportDao().insertOrReplaceInTx(h.getReports());
                daoSession.getHotelPictureDao().insertOrReplaceInTx(h.getPictures());
                daoSession.getHotelRemarkDao().insertOrReplaceInTx(hr);
                daoSession.getHotelAgreementDeadlinePoliciesDao().insertOrReplaceInTx(hadp);
                daoSession.getHotelAgreementDeadlineDao().insertOrReplace(hotelAgreementDeadline);
                daoSession.getHotelAgreementDao().insertOrReplace(ha);
            }
            // daoSession.getHotelReportsDao().insertOrReplace( getData().getReports() );
        }
        daoSession.getHotelDao().insertOrReplaceInTx(listOfHotels);
    }
});
This, of course, does not work. I get an "Entity is detached from DAO context" error on the following line:
HotelAgreementDeadline hotelAgreementDeadline = ha.getDeadline();
I understand this is because I'm trying to get the agreements from a Hotel entity that does not come from the database but from another source (a web service, in this case). But why does this happen with ha.getDeadline() and not with h.getAgreements()?
Now, I have the Hotel object and it does include pretty much all data: agreements, deadline, policies, remarks, pictures, report. I'd just like to tell GreenDAO: save it! And if I can't and I have to cycle through the tree - which is what I'm trying to do with the code above - how am I supposed to do it?
Here I read that I have to "store/load the object first using a Dao". Pretty awesome, but... how does it work? I read the greenDAO documentation about relations but couldn't find anything.
Thank you to everybody who's willing to help :-)
At some point, when you get the response from the web service, you are creating new entity objects and filling them with the info. Try inserting each new object in the DB right after that.
If you want, you can insert, for example, all n Agreements for a Hotel using insertOrReplaceInTx, but you shouldn't use any relation getter before all the involved objects are in the DB.
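For instance, here is a minimal sketch of that order of operations, reusing the entity and DAO names from the question. The hypothetical getAgreementsFromResponse() stands in for however you keep the parsed list on the Hotel object, so the DAO-backed getAgreements() is never called on a detached entity:

daoSession.runInTx(new Runnable() {
    @Override
    public void run() {
        for (Hotel h : listOfHotels) {
            // insert the parent row first, so the children can reference its key
            daoSession.getHotelDao().insertOrReplace(h);
            for (HotelAgreement ha : h.getAgreementsFromResponse()) {
                ha.setHotel_id(h.getHotel_id());
                daoSession.getHotelAgreementDao().insertOrReplace(ha);
            }
            // only now, with everything persisted, is it safe to use
            // relation getters such as h.getAgreements()
        }
    }
});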
I think the greenDAO team should add the following check to the generated to-one getter (getToOneField()), just like the one in getToManyList():
if (property == null) {
    // ...code already generated by the greenDAO plugin...
}
return property;
So, in your case, in the HotelAgreement class:
@Keep
public DeadLine getDeadLine() {
    if (deadLine == null) {
        long __key = this.deadLineId;
        if (deadLine__resolvedKey == null || !deadLine__resolvedKey.equals(__key)) {
            final DaoSession daoSession = this.daoSession;
            if (daoSession == null) {
                throw new DaoException("Entity is detached from DAO context");
            }
            DeadLineDao targetDao = daoSession.getDeadLineDao();
            DeadLine deadLineNew = targetDao.load(__key);
            synchronized (this) {
                deadLine = deadLineNew;
                deadLine__resolvedKey = __key;
            }
        }
    }
    return deadLine;
}
By adding the check

if (deadLine == null) {
    ...
}

the getter behaves like the to-many getters: if you receive data from a REST/JSON response, the object is already populated and getDeadLine() returns the field from the object instead of hitting the database. Then you can insert or replace it. Later, when you load (or load deeply) the object from the DB, the field is null and greenDAO fetches it from the database.
I have a rest application where one of the resources can be updated. Below are two methods responsible for achieving this task:
updateWithRelatedEntities(String, Store): receives an id and a new Store object which was constructed by deserializing the PUT request entity, sets the version (used for optimistic locking) on the new object and calls update in a transaction.
public Store updateWithRelatedEntities(String id, Store newStore) {
    Store existingStore = this.get(id);
    newStore.setVersion(existingStore.getVersion());

    em.getTransaction().begin();
    newStore = super.update(id, newStore);
    em.getTransaction().commit();

    return newStore;
}
update(String, T): a generic method for making an update. Checks that the ids match and performs the merge operation.
public T update(String id, T newObj) {
    if (newObj == null) {
        throw new EmptyPayloadException(type.getSimpleName());
    }

    Type superclass = getClass().getGenericSuperclass();
    if (superclass instanceof Class) {
        superclass = ((Class) superclass).getGenericSuperclass();
    }
    Class<T> type = (Class<T>) (((ParameterizedType) superclass).getActualTypeArguments()[0]);

    T obj = em.find(type, id);
    if (!newObj.getId().equals(obj.getId())) {
        throw new IdMismatchException(id, newObj.getId());
    }

    return em.merge(newObj);
}
The problem is that the call T obj = em.find(type, id); triggers an update of the store object in the database, which means we get an OptimisticLockException when calling merge (because the versions now differ).
Why is this happening? What would be the correct way to achieve this?
I kind of don't want to copy properties from newStore to existingStore and use existingStore for merge - which would, I think, solve the optimistic lock problem.
This code is not running on an application server and I am not using JTA.
EDIT:
If I detach existingStore before calling update, T obj = em.find(type, id); doesn't trigger an update of the store object, so this solves the problem. The question still remains though: why does it trigger the update when the entity is not detached?
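For reference, here is a rough sketch of that detach workaround, using the same updateWithRelatedEntities shown above (whether this is the right fix for the underlying flush behaviour is exactly what the question asks):

public Store updateWithRelatedEntities(String id, Store newStore) {
    Store existingStore = this.get(id);
    newStore.setVersion(existingStore.getVersion());

    // stop tracking the managed copy so find()/merge() no longer flush a dirty update for it
    em.detach(existingStore);

    em.getTransaction().begin();
    newStore = super.update(id, newStore);
    em.getTransaction().commit();

    return newStore;
}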
I can't see your entity in the code you added, but I believe you are missing a key point of optimistic locking: the @Version annotation on the version field.
If you have this field on your entity, the container should be able to perform the merge without problems. Please take a look at Optimistic Locking; the article "don't break optimistic locking" is also a good read.
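In case it helps, a minimal sketch of what that looks like (illustrative class and field names, not taken from the question):

@Entity
public class Store {

    @Id
    private String id;

    @Version
    private long version;   // JPA increments this on every update and checks it on merge

    // ...
}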
Does Objectify throw a ConcurrentModificationException if an entity with the same key (without a parent) is created at the same time (when it did not exist before) in two different transactions? I only found information covering the case where the entity already exists and is modified, but not the case where it does not yet exist...
ofy().transactNew(20, new VoidWork() {
    @Override
    public void vrun() {
        Key<GameRequest> key = Key.create(GameRequest.class, numberOfPlayers + "_" + rules);
        Ref<GameRequest> ref = ofy().load().key(key);
        GameRequest gr = ref.get();
        if (gr == null) {
            // create new gamerequest and add...
            // <-- HERE
        } else {
            ...
        }
    }
});
Thanks!
Yes, you will get CME if anything in that entity group changes - including entity creation and deletion.
The code you show should work fine. Unless you really know what you are doing, you're probably better off just using the transact() method without trying to limit retries or forcing a new transaction. 99% of the time, transact() just does the right thing.
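If it's useful, here is a hedged sketch of the plain transact() variant, keeping the same body as the question (same hypothetical GameRequest lookup; only the retry/new-transaction handling changes):

ofy().transact(new VoidWork() {
    @Override
    public void vrun() {
        Key<GameRequest> key = Key.create(GameRequest.class, numberOfPlayers + "_" + rules);
        Ref<GameRequest> ref = ofy().load().key(key);
        GameRequest gr = ref.get();
        if (gr == null) {
            // create the new GameRequest and save it; if another transaction
            // creates the same key concurrently, the commit fails with a
            // ConcurrentModificationException and transact() retries this work
        } else {
            // ...
        }
    }
});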
I have an entity class that has a lazy field like this:
@Entity
public class Movie implements Serializable {
    ...
    @Basic(fetch = FetchType.LAZY)
    private String story;
    ...
}
The story field should normally be loaded lazily because it's usually large. Sometimes, however, I need to load it eagerly, but I don't want to write something ugly like movie.getStory() just to force the loading. For a lazy relationship I know a fetch join can force eager loading, but that doesn't work for a lazy basic field. How do I write a query that eagerly loads the story field?
I'd try Hibernate.initialize(movie). But calling the getter (and adding a comment that this forces initialization) is not that wrong.
One possible solution is:
SELECT movie
FROM Movie movie LEFT JOIN FETCH movie.referencedEntities
WHERE...
Another option could be to use @Transactional on a method in a ManagedBean or a Stateless bean and access movie.getReferencedEntities().size() to load it, but that creates the N+1 problem, i.e. it generates N additional queries, one per relationship, which isn't efficient in many cases.
You can use the fetch all properties keywords in your query:
SELECT movie
FROM Movie movie FETCH ALL PROPERTIES
WHERE ...
To quote the JPA spec (2.0, 11.1.6):
The LAZY strategy is a hint to the persistence provider runtime that data should be fetched
lazily when it is first accessed. The implementation is permitted to eagerly fetch data for which the
LAZY strategy hint has been specified.
Hibernate only supports what you are trying to do if you use its bytecode enhancement features. There are a few ways to do that. The first is to use the build-time enhancement tool. The second is (class-)load-time enhancement: in Java EE environments you can enable it on Hibernate JPA using the 'hibernate.ejb.use_class_enhancer' setting (set it to true; false is the default), while in Java SE environments you need to enhance the classes as they are loaded, either on your own or by leveraging org.hibernate.bytecode.spi.InstrumentedClassLoader.
If you don't mind having a POJO as the query result you can use a constructor query. This requires your object to have a constructor with all the needed parameters and a query like this:
select new Movie(m.id, m.story) from Movie m
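A rough sketch of the matching constructor that query would need (assuming a Long id; only id and story get populated, every other field keeps its default):

@Entity
public class Movie implements Serializable {

    @Id
    private Long id;

    @Basic(fetch = FetchType.LAZY)
    private String story;

    public Movie() {
        // default constructor required by JPA
    }

    public Movie(Long id, String story) {
        // used by: select new Movie(m.id, m.story) from Movie m
        this.id = id;
        this.story = story;
    }
}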
I would suggest traversing the objects using Java reflection, calling all methods starting with "get" and repeating this for every returned object that has an @Entity annotation.
Not the most beautiful way, but it should be a robust workaround. Something like this (not tested yet):
public static <T> void deepDetach(EntityManager emanager, T entity) {
    IdentityHashMap<Object, Object> detached = new IdentityHashMap<Object, Object>();
    try {
        deepDetach(emanager, entity, detached);
    } catch (IllegalAccessException e) {
        throw new RuntimeException("Error deep detaching entity [" + entity + "].", e);
    } catch (InvocationTargetException e) {
        throw new RuntimeException("Error deep detaching entity [" + entity + "].", e);
    }
}

private static <T> void deepDetach(EntityManager emanager, T entity, IdentityHashMap<Object, Object> detached) throws IllegalAccessException, InvocationTargetException {
    if (entity == null || detached.containsKey(entity)) {
        return;
    }

    Class<?> clazz = entity.getClass();
    Entity entityAnnotation = clazz.getAnnotation(Entity.class);
    if (entityAnnotation == null) {
        return; // Not an entity. No need to detach.
    }

    emanager.detach(entity);
    detached.put(entity, null); // value doesn't matter. Using a map, because there is no IdentitySet.

    Method[] methods = clazz.getMethods();
    for (Method m : methods) {
        String name = m.getName();
        if (m.getParameterTypes().length == 0) {
            if (name.length() > 3 && name.startsWith("get") && Character.isUpperCase(name.charAt(3))) {
                Object res = m.invoke(entity, new Object[0]);
                deepDetach(emanager, res, detached);
            }
            // It is actually not needed for searching for lazy instances, but it will load
            // this instance, if it was represented by a proxy
            if (name.length() > 2 && name.startsWith("is") && Character.isUpperCase(name.charAt(2))) {
                Object res = m.invoke(entity, new Object[0]);
                deepDetach(emanager, res, detached);
            }
        }
    }
}
I am learning GAE and am getting a bit stuck. If I use the following, with a finally to make sure the persistence manager is closed, I get an exception when trying to actually read the Note objects:
public class Notes {

    public List<Note> getAll() {
        PersistenceManager pm = PMF.instance().getPersistenceManager();
        try {
            Query query = pm.newQuery("select from com.uptecs.google1.model.Note order by subject");
            return (List<Note>) query.execute();
        } finally {
            pm.close();
        }
    }
}
The exception I get is this:
Object Manager has been closed
org.datanucleus.exceptions.NucleusUserException: Object Manager has been closed
at org.datanucleus.ObjectManagerImpl.assertIsOpen(ObjectManagerImpl.java:3876)
at org.datanucleus.ObjectManagerImpl.getFetchPlan(ObjectManagerImpl.java:376)
at org.datanucleus.store.query.Query.getFetchPlan(Query.java:497)
Try detaching the object from the graph with detachable="true":
@PersistenceCapable(identityType = IdentityType.APPLICATION, detachable = "true")
public class Note {

    @PrimaryKey
    @Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY)
    private Long key;
    ...
}
Note: I totally understand the need for this; sometimes you need to retrieve the objects and lists in a controller, close the PM in the controller, then pass the models to the views. Until I know of better solutions, this is how I am doing it on JDO/GAE, with no problems so far.
List:
It seems to me that you will have to detach all the items in the list if you want to be able to use them after the PM is closed. I'd use this to get specific lists of items. A full getAll() can be very big in size.
public List<Note> getList() {
    List<Note> detachedList = null, list = null;
    try {
        String query = "select from " + Note.class.getName();
        pm = PMF.get().getPersistenceManager();
        list = (List<Note>) pm.newQuery(query).execute();

        detachedList = new ArrayList<Note>();
        for (Note obj : list) {
            detachedList.add(pm.detachCopy(obj));
        }
    } finally {
        pm.close();
    }
    return detachedList;
}
By Key:
public Note findByKey(Long key) {
    Note detachedCopy = null, object = null;
    try {
        pm = PMF.get().getPersistenceManager();
        object = pm.getObjectById(Note.class, key);
        detachedCopy = pm.detachCopy(object);
    } catch (JDOObjectNotFoundException e) {
        return null; // or whatever
    } finally {
        pm.close(); // close here
    }
    return detachedCopy;
}
After the close, you have a detached copy with which you can work.
Reference: http://www.datanucleus.org/products/accessplatform_1_1/jdo/attach_detach.html
When the result is returned in the list, the objects are retrieved lazily (only when you ask for them). Since your persistence manager is closed, you get an exception. By "detaching" the objects you are effectively telling the persistence manager to retrieve them eagerly.
In addition to the answer from bakkal, I would say that you absolutely need the detachable="true" annotation parameter, otherwise you will never get it to work.
To detach a list of objects, you can also use pm.detachCopyAll(your_query_result_list), which will be a bit faster than your iterating implementation and will save you a few lines of code. Thanks, JDO! ;-) But be aware that this method requires an explicit cast of its results.
Here's a working example I currently use in my latest app (the key used in the query is an encoded String):
pm = PMF.get().getPersistenceManager();

Query query = pm.newQuery(TandemSubscription.class);
query.setFilter("groupSubscriptionKey==groupSubscriptionKeyParam");
query.setOrdering("dateRDV desc");
query.declareParameters("String groupSubscriptionKeyParam");

// Get Data
@SuppressWarnings("unchecked")
List<TandemSubscription> savedSubscriptions =
        (List<TandemSubscription>) query.execute(Key);

// Detach all objects in the list
savedSubscriptions =
        (List<TandemSubscription>) pm.detachCopyAll(savedSubscriptions);

pm.close();

// Now you can use the list and its content.
I hope this helps a bit.
I'm using straight Hibernate 3.0 without annotations.
When saving or updating domain objects, I would like to have Hibernate automatically generate the CREATE_DT and UPDATE_DT fields, as opposed to using database triggers.
What are the best practices for accomplishing this?
The background is that I have an object graph being passed from a client that contains multiple objects, some of which will end up being inserted and others updated. I could set the dates on the client, but this would be a bad idea. Setting the dates on the server means I would have to rifle through the graph and detect the changes.
It seems to me that Hibernate would have a facility for making this happen, but it is not jumping out at me.
The Hibernate way to do this without using triggers would be to use Hibernate's event architecture and to register listeners for PreInsertEvent, PreUpdateEvent or SaveOrUpdateEvent (have a look at the org.hibernate.event package for a full list) to set and update the create/update dates.
Another option would be to use an interceptor, either Session-scoped or SessionFactory-scoped, to set the create and update dates in onSave(...) and the update date in onFlushDirty(...).
Maybe have a look at this previous answer for other options.
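To make the interceptor option concrete, here is a minimal sketch of how one is typically registered in Hibernate 3 (AuditInterceptor is whatever Interceptor implementation you write; the accepted answer below shows a full one):

// Session-scoped: pass the interceptor when opening a Session
Session session = sessionFactory.openSession(new AuditInterceptor());

// SessionFactory-scoped: set it on the Configuration before building the factory
Configuration cfg = new Configuration().configure();
cfg.setInterceptor(new AuditInterceptor());
SessionFactory factory = cfg.buildSessionFactory();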
The simplest way, in my opinion, would be to have those fields as properties on your object class and to set them privately through your constructor. Whenever a property value changes, set your updateDate (for instance) to the current date (new Date() in Java).
Once your entity is persisted, those dates are automatically saved along with it to your underlying database.
It is not done directly with Hibernate, but I would consider this solution easier to implement than playing with interceptors.
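A rough sketch of that idea, with an illustrative base class that is not from the original post (the setters of your entities would call touch()):

import java.util.Date;

public abstract class AuditedEntity {

    private Date createDate = new Date();   // set once, when the object is built
    private Date updateDate = new Date();

    // call this from every setter that changes state
    protected void touch() {
        this.updateDate = new Date();
    }

    public Date getCreateDate() { return createDate; }
    public Date getUpdateDate() { return updateDate; }
}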
Hope this helps!
I'm marking Pascal's answer as the correct one, as he pointed me directly to the part of the documentation that provided details and example code. But for the reader, I'm adding more detail here.
I tried both the Hibernate events and the interceptors, and found that an interceptor worked better in my situation. It was relatively easy to implement.
Thanks for the help!
Below is the code for the interceptor:
public class AuditInterceptor extends EmptyInterceptor {

    private Log log = LogFactory.getLog(this.getClass());

    private int updates;
    private int creates;

    @Override
    public boolean onSave(Object entity, Serializable id, Object[] state,
            String[] propertyNames, Type[] types) {
        if (entity instanceof AuditableVO) {
            creates++;

            // Find the create date and change it
            for (int i = 0; i < propertyNames.length; i++) {
                if (propertyNames[i].equals("createDate")) {
                    state[i] = new Date();
                    return true;
                }
            }
        }
        return false;
    }

    @Override
    public boolean onFlushDirty(Object entity, Serializable id,
            Object[] currentState, Object[] previousState,
            String[] propertyNames, Type[] types) {
        if (entity instanceof AuditableVO) {
            updates++;

            // Find the update date and change it
            for (int i = 0; i < propertyNames.length; i++) {
                if (propertyNames[i].equals("updateDate")) {
                    currentState[i] = new Date();
                    return true;
                }
            }
        }
        return false;
    }

    @Override
    public void afterTransactionCompletion(Transaction tx) {
        if (tx.wasCommitted()) {
            log.info("Creations: " + creates + ", Updates: " + updates);
        }
        creates = 0;
        updates = 0;

        super.afterTransactionCompletion(tx);
    }
}