MyObject myObject = repositoryHibernateImpl.getMyObjectFromDatabase();
//transaction is finished, and no, there is not an option to reopen it
ThirdPartyUtility.doStuffWithMyObjectType( myObject );
At this point you've already defined what is lazily and eagerly loaded, and the third-party utility will try to call all of the methods on your "myObject" instance. That's fine, because you don't want to return anything for the lazily loaded properties. Unfortunately, the getter doesn't return null; it throws a LazyInitializationException.
This happens because you're actually calling the method on Hibernate's proxy of your object, which knows that it hasn't fetched that data and therefore throws the exception.
Is it even possible to get the underlying object with null values, so that a getter simply returns null instead of throwing an exception? Essentially, I want to detach the object so that Hibernate is no longer aware of it at all. The accessor for a lazily loaded property must return null; it cannot return the actual values. We want to be able to convert the entity into a POJO without having to create a class that looks just like the entity and remap all the values.
Let's say you have a lazily loaded field; in its getter you could do:
MyField getMyField() {
    if (Hibernate.isInitialized(myField)) {
        return myField;
    }
    return null;
}
From the javadoc of org.hibernate.Hibernate:
public static boolean isInitialized(Object proxy): check if the proxy or persistent collection is initialized.
If you don't want to couple your domain to Hibernate, another possibility is to have your DAO instantiate your own instance of the entity from inside getMyObjectFromDatabase() and populate that with the appropriate fields from Hibernate's proxy. I've done this and it works well.
Obviously this is more code, but you're guaranteed a "pure" instance of your entity (with uninitialized properties left as null) if that's what you want.
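A rough sketch of what that might look like inside the DAO. The property names (getName(), getChildren(), and so on) are made up for the example; the idea is simply to copy eagerly loaded values and leave uninitialized ones as null:

public MyObject getMyObjectFromDatabase() {
    MyObject proxy = (MyObject) session.get(MyObject.class, id);
    MyObject pojo = new MyObject();          // plain instance, unknown to Hibernate
    pojo.setId(proxy.getId());
    pojo.setName(proxy.getName());           // eagerly loaded fields copied as-is
    if (Hibernate.isInitialized(proxy.getChildren())) {
        pojo.setChildren(proxy.getChildren());
    }                                        // otherwise the collection stays null
    return pojo;
}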
Check my solution below.
Minimal example:
I do not load the property from the object itself, but via the controller.
Code:
{..}
MyProp prop = employeeController.getMyProp(employee);
{..}
This initializes the property via the repository object and returns it.
EmployeeController.java:
public Set<MyProp> getMyProp(Employee employee) {
    if (this.employeeRepository.InitMyProp(employee)) {
        return employee.getMyProp();
    }
    return null;
}
The repository gets/opens the session, reloads the employee object (important!), and initializes the lazily loaded field.
EmployeeRepository.java:
public boolean InitMyProp(Employee employee) {
    if (Hibernate.isInitialized(employee.getMyProp())) {
        return true;
    }
    try {
        Session session = getSession();
        session.refresh(employee);
        Hibernate.initialize(employee.getMyProp());
    } catch (Exception ex) {
        return false;
    }
    return true;
}
private Session getSession() {
    if (session == null || !session.isConnected() || !session.isOpen()) {
        session = HibernateUtil.getSessionFactory().getCurrentSession();
    }
    if (!session.getTransaction().isActive()) {
        session.beginTransaction();
    }
    return session;
}
In my solution I have a TableView with several thousand records and two further TableViews with details of the record selected in the first TableView.
Hope it helps.
I am attempting to write to multiple databases using Hibernate. I have encapsulated write and read/write sessions within a single session object. However, when I go to save, I get a lot of errors saying that the objects are already associated with another session: "Illegal attempt to associate a collection with two open sessions"
Here is my code:
public class MultiSessionObject implements Session {
    private Session writeOnlySession;
    private Session readWriteSession;

    @Override
    public void saveOrUpdate(Object arg0) throws HibernateException {
        readWriteSession.saveOrUpdate(arg0);
        writeOnlySession.saveOrUpdate(arg0);
    }
}
I have tried evicting the object and flushing; however, that causes problems with "Row was updated or deleted by another transaction"... even though both sessions point to different databases.
public class MultiSessionObject implements Session {
    private Session writeOnlySession;
    private Session readWriteSession;

    @Override
    public void saveOrUpdate(Object arg0) throws HibernateException {
        readWriteSession.saveOrUpdate(arg0);
        readWriteSession.flush();
        readWriteSession.evict(arg0);
        writeOnlySession.saveOrUpdate(arg0);
        writeOnlySession.flush();
        writeOnlySession.evict(arg0);
    }
}
In addition to the above, I have also attempted using the replicate functionality of hibernate. This was also unsuccessful without errors.
Has anyone successfully saved an object to two databases that have the same schema?
saveOrUpdate tries to reattach the given entity to the currently running Session, so proxies (LAZY associations) are bound to the Hibernate Session. Try using merge instead of saveOrUpdate, because merge simply copies a detached entity's state onto a newly retrieved managed entity. This way, the supplied argument never gets attached to a Session.
Another problem is Transaction Management. If you use Thread-bound Transaction, then you need two explicit transactions if you want to update two DataSources from the same Thread.
Try to set the transaction boundaries explicitly too:
public class MultiSessionObject implements Session {
    private Session writeOnlySession;
    private Session readWriteSession;

    @Override
    public void saveOrUpdate(Object arg0) throws HibernateException {
        Transaction readWriteSessionTx = null;
        try {
            readWriteSessionTx = readWriteSession.beginTransaction();
            readWriteSession.merge(arg0);
            readWriteSessionTx.commit();
        } catch (RuntimeException e) {
            if (readWriteSessionTx != null && readWriteSessionTx.isActive()) {
                readWriteSessionTx.rollback();
            }
            throw e;
        }
        Transaction writeOnlySessionTx = null;
        try {
            writeOnlySessionTx = writeOnlySession.beginTransaction();
            writeOnlySession.merge(arg0);
            writeOnlySessionTx.commit();
        } catch (RuntimeException e) {
            if (writeOnlySessionTx != null && writeOnlySessionTx.isActive()) {
                writeOnlySessionTx.rollback();
            }
            throw e;
        }
    }
}
As mentioned in other answers, if you are using Session then you probably need to separate the two updates into two different transactions. The detached instance of the entity (after evict) should be reusable in the second update operation.
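A rough sketch of that Session-based variant (readWriteSessionFactory and writeOnlySessionFactory are illustrative names, not from your code):

Session readWriteSession = readWriteSessionFactory.openSession();
Session writeOnlySession = writeOnlySessionFactory.openSession();
try {
    Transaction tx1 = readWriteSession.beginTransaction();
    readWriteSession.saveOrUpdate(entity);
    tx1.commit();
    readWriteSession.evict(entity);          // detach so the second session can reuse it

    Transaction tx2 = writeOnlySession.beginTransaction();
    writeOnlySession.saveOrUpdate(entity);
    tx2.commit();
} finally {
    readWriteSession.close();
    writeOnlySession.close();
}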
Another approach is to use StatelessSession, like this (I tried it in a simple program, so I had to handle the transactions myself; I assume you will handle the transactions differently):
public static void main(final String[] args) throws Exception {
    final StatelessSession session1 = HibernateUtil.getReadOnlySessionFactory().openStatelessSession();
    final StatelessSession session2 = HibernateUtil.getReadWriteSessionFactory().openStatelessSession();
    try {
        Transaction transaction1 = session1.beginTransaction();
        Transaction transaction2 = session2.beginTransaction();
        ErrorLogEntity entity = (ErrorLogEntity) session1.get(ErrorLogEntity.class, 1);
        entity.setArea("test");
        session1.update(entity);
        session2.update(entity);
        transaction1.commit();
        transaction2.commit();
        System.out.println("Entry details: " + entity);
    } finally {
        session1.close();
        session2.close();
        HibernateUtil.getReadOnlySessionFactory().close();
        HibernateUtil.getReadWriteSessionFactory().close();
    }
}
The issue with StatelessSession is that it does not use any cache and does not support cascading of associated objects. You need to handle that manually.
Yeah, the problem is exactly what it's telling you. The way to successfully achieve this is to treat it as two different things with two different commits.
Create a composite Dao. In it you have a
Collection<Dao>
Each Dao in the collection is just an instance of your existing code, configured for one of the two data sources. Then, in your composite DAO, when you call save, you actually save to both independently.
Out-of-band, you said it's best effort. So that's easy enough: use spring-retry to create a pointcut around your individual DAO save methods so that they retry a few times and eventually give up.
public interface Dao<T> {
    void save(T type);
}
Create new instances of this using an applicationContext.xml where each instance points to a different database. While you're in there, use spring-retry to place a retry pointcut around your save method. Go to the bottom for the application context example.
public class RealDao<T> implements Dao<T> {

    @Autowired
    private Session session;

    @Override
    public void save(T type) {
        // save to DB here
    }
}
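As a rough, annotation-driven illustration of the spring-retry idea (the XML pointcut configuration mentioned above achieves the same thing; @Retryable and @EnableRetry come from org.springframework.retry.annotation, and the attempt count here is just an example):

// inside RealDao<T>
@Retryable(maxAttempts = 3)   // retry the save a few times before giving up
@Override
public void save(T type) {
    session.saveOrUpdate(type);
}

// and somewhere on a @Configuration class:
// @EnableRetry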
The composite
public class CompositeDao<T> implements Dao<T> {
    // these instances are actually of type RealDao<T>
    private Set<Dao<T>> delegates;

    public CompositeDao(Dao ... daos) {
        this.delegates = new LinkedHashSet<>(Arrays.asList(daos));
    }

    @Override
    public void save(T stuff) {
        for (Dao<T> delegate : delegates) {
            try {
                delegate.save(stuff);
            } catch (Exception e) {
                // skip it. Best effort
            }
        }
    }
}
Each 'stuff' is saved in its own separate session, or not at all. Since the session lives on the 'RealDao' instances, you know that by the time the first one completes, it has either been fully saved or it failed. Hibernate might want you to use a different ID for them so that hash/equals differ, but I don't think so.
I get an exception when trying to get data lazily (the exception is at the very end).
// application gets data by the following DAO
public T findById(PK id) {
    T result = getHibernateTemplate().get(this.type, id);
    getHibernateTemplate().flush();
    return result;
}
// JUnit test calls serviceX.getById
@Transactional
public SomeObject getById(int x) {
    return (SomeObject) aboveDao.findById(x);
}
// Within the JUnit test
SomeObject someObj = serviceX.getById(3);
someObj.getAnotherObject().y.equals("3"); // **Exception** at this line

// SomeObject class has the following property
@OneToMany(cascade = { CascadeType.ALL }, fetch = FetchType.LAZY)
private AnotherObject anotherObject;
I get the following exception when trying to access anotherObject in the JUnit test.
Methods already tried + extra configuration
We use Spring's annotation-driven transaction management.
<tx:annotation-driven /> specified in the config file.
Also, I tried to add @Transactional(propagation = Propagation.REQUIRED) on top of the JUnit test, but this did not solve the issue. If I run the application itself, it works without any issues.
How to solve this type of issue for JUnit?
org.hibernate.LazyInitializationException: failed to lazily initialize a collection of role xxxxx, no session or session was closed
Here's what happens
SomeObject someObj = serviceX.getById(3); // @Transactional boundary, no more session
someObj.getAnotherObject().y.equals("3"); // session doesn't exist, can't fetch the LAZY loaded object
Because your AnotherObject is LAZY fetched, it doesn't actually get loaded in the getById() method. The Session it was associated with is lost when the @Transactional method ends, i.e. when execution returns from getById(). Because there is no longer a Session, you get the exception.
You can change your FetchType to EAGER. If you're going to access that field of your object, you need to initialize it inside your transaction boundaries.
If you only sometimes need anotherObject, a possible solution is to create a @Transactional method that calls getById and eagerly loads the object.
@Transactional
public SomeObject eagerGetById(int x) {
    SomeObject obj = getById(x);
    obj.getAnotherObject(); // will force loading the object
    return obj;
}
Call this method whenever you need to eagerly load the object.
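If you would rather make the intent explicit than rely on the side effect of calling the getter, Hibernate.initialize can be used inside the same transactional method; a small variation on the sketch above:

@Transactional
public SomeObject eagerGetById(int x) {
    SomeObject obj = getById(x);
    Hibernate.initialize(obj.getAnotherObject()); // force the fetch while the session is open
    return obj;
}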
This could also be useful to you: LazyInitializationException
org.hibernate.LazyInitializationException: failed to lazily initialize a collection of role: com.t4bt.gov.persistence.entities.Experts.institutaionList, no session or session was closed
You provide very few details in your question (code?), so this will have to be a generalized answer about lazy loading. In the future, if you want answers, please provide concrete information about the actual problem, as well as a description of what you have tried in order to solve it.
A LazyInitializationException occurs when you try to access a lazily loaded property after the session has been closed (which is usually after the transaction has ended). The way lazy initialization works is that the lazily loaded properties are not fetched when you fetch the object; instead, when you actually try to access one, Hibernate issues another query to the database to fetch it.
The following would produce such an error:
public class Something {
    [...]
    @OneToMany(fetch = FetchType.LAZY)
    private List<SomethingElse> somethingElse;

    public List<SomethingElse> getSomethingElse() {
        return somethingElse;
    }
}

public class SomethingDao {
    @Inject
    private EntityManager em;

    @Transactional
    public Something getById(final Integer id) {
        return em.find(Something.class, id);
    }
}

public class SomethingService {
    @Inject
    private SomethingDao dao;

    public List<SomethingElse> getSomethingElseForSomething(final Integer somethingId) {
        final Something something = dao.getById(somethingId);
        return something.getSomethingElse(); // Throws LazyInitializationException
    }
}
Here the transaction (and thus the session) only exists in the DAO class. Once you leave the DAO method, the session is gone. So when you try to access a lazily loaded property in the service, it will fail when Hibernate tries to contact the session in order to retrieve it.
To avoid this, there are several possibilities.
Change the annotation on the Something class to @OneToMany(fetch = FetchType.EAGER).
The property is no longer lazily loaded, so no more problems.
Add @Transactional to the service method. Then the call to getSomethingElse() runs in the same transaction as the fetching of the Something object, and the session will still be alive when it does.
Add a call to getSomethingElse() in the DAO method. Then the property is initialized (fetched from the database) before leaving the DAO class (and the transaction), and it will be available outside the transaction, with no need to contact the session in order to retrieve it.
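A rough sketch of that third option, reusing the getById DAO method from above (Hibernate.initialize is one way to force the fetch when Hibernate is the JPA provider; simply calling size() on the collection inside the transaction works too):

@Transactional
public Something getByIdWithSomethingElse(final Integer id) {
    final Something something = em.find(Something.class, id);
    // Touch the lazy collection while the session is still open,
    // so it is fully fetched before the transaction ends.
    Hibernate.initialize(something.getSomethingElse());
    return something;
}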
I'm trying to use Hibernate to build up a local cache of data that I pull from source websites. I have the objects configured with JPA (if it makes a difference) and can read/write them fine in a test application.
Now I want to move the code into a generic "Caching" class so that I can request an object from the Cache and process it as normal. I can read the object from the database and pass it back to the calling class, but when I try and access the collections in the object I get the dreaded lazy initialization exception.
I know what causes this, the class that I read the object from commits the transaction after it's read the object from the database and before it returns the object to the calling class.
I have tried various methods to work around this, and the simplest (for me) seems to be to try and access all of the collections in the object to ensure they are loaded before closing the transaction and returning the object.
The problem with that approach is that I don't know the structure of the object that I am retrieving from the database (part of the benefit of Hibernate for me) and therefore I can't call the appropriate methods to load the data. How do I overcome this? I don't really want to do eager fetching of the objects as they may be used by other applications. I don't want to use hbm files if I can avoid it.
This is the call to the Cache class:
Series series = (Series) Cache.getFromCache(id, Series.class)
In the Cache class:
public static Object getFromCache(String key, Class clazz) {
    Object dbObject = HibernateUtil.loadObject(clazz, key);
    if (dbObject != null) {
        logger.debug("Cache (Get): Got object (" + clazz.getSimpleName() + ") for " + key);
        return dbObject;
    }
    return null; // nothing found for this key
}
And HibernateUtil does:
public static Object loadObject(Class clazz, Serializable key) {
    Session session = sessionFactory.getCurrentSession();
    Object dbObject;
    try {
        session.beginTransaction();
        dbObject = clazz.cast(session.get(clazz, key));
    } finally {
        session.getTransaction().commit();
    }
    return dbObject;
}
First of all, you could avoid a type cast by making your loadObject method parameterized:
public static <T> T loadObject(Class<T> clazz, Serializable key) {
    Session session = sessionFactory.getCurrentSession();
    T dbObject;
    try {
        session.beginTransaction();
        dbObject = clazz.cast(session.get(clazz, key));
    } finally {
        session.getTransaction().commit();
    }
    return dbObject;
}
Second: why do you open and commit the transaction in this method? Letting the caller open a transaction and commit it once it has finished using the object loaded from the cache would solve the problem.
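That caller-managed variant might look roughly like this, assuming loadObject no longer begins/commits the transaction itself and that the caller can reach the same SessionFactory:

Session session = sessionFactory.getCurrentSession();
session.beginTransaction();
try {
    Series series = (Series) Cache.getFromCache(id, Series.class);
    // ... access any lazy collections here, while the session is still open ...
} finally {
    session.getTransaction().commit();
}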
Third: if you really want to let this method open and close the transaction, let it take an Initializer<T> instance as parameter. This initializer would have the responsibility to initialize all the necessary associations before returning the entity.
public static <T> T loadObject(Class<T> clazz,
                               Serializable key,
                               Initializer<T> initializer) {
    Session session = sessionFactory.getCurrentSession();
    T dbObject;
    try {
        session.beginTransaction();
        dbObject = clazz.cast(session.get(clazz, key));
        initializer.initialize(dbObject);
    } finally {
        session.getTransaction().commit();
    }
    return dbObject;
}
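The Initializer<T> callback itself is not defined above; a minimal version might look like this (Series and getEpisodes() are hypothetical, only there to show the usage):

public interface Initializer<T> {
    void initialize(T entity);
}

// Usage: the caller states which associations it needs loaded.
Series series = HibernateUtil.loadObject(Series.class, id,
        new Initializer<Series>() {
            public void initialize(Series s) {
                Hibernate.initialize(s.getEpisodes());
            }
        });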
I have a Hibernate project where a call to update() needs to compare the modified object in memory to the data that has already been saved to the database. For example, my business logic states that if a record is "effective" (the effective date is today or earlier), an update cannot change the effective date. In order to accomplish this, I have the following code (it's a little long and involved):
Manager
public class LogicManager {

    @Autowired
    SessionFactory sessionFactory;

    private Session getSession() {
        return sessionFactory.getCurrentSession();
    }

    public MemberRecord findRecord(Integer id) {
        // << Code to check authorization >>
        return memberRecordDAO.findById(id);
    }

    public void updateRecord(MemberRecord record) {
        getSession().evict(record);
        MemberRecord oldRecord = memberRecordDAO.findById(record.getId());
        Date oldEffectiveDate = oldRecord.getEffectiveDate();
        if (isEffective(oldEffectiveDate) &&
                !oldEffectiveDate.equals(record.getEffectiveDate())) {
            throw new IllegalArgumentException("Cannot change date");
        }
        // << Other data checks >>
        memberRecordDAO.update(record);
    }
}
DAO
public class MemberRecordDAO {

    @Autowired
    private SessionFactory sessionFactory;

    private Session getSession() {
        return sessionFactory.getCurrentSession();
    }

    public MemberRecord findById(Integer id) {
        return (MemberRecord) getSession()
                .getNamedQuery("findMemberById")
                .setInteger("id", id)
                .uniqueResult();
    }
}
Client Code
// ...
public void changeEffectiveDate(Integer recordId, Date newDate) {
    LogicManager manager = getBean("logicManager");
    MemberRecord record = manager.findById(recordId);
    record.setEffectiveDate(newDate);
    manager.updateRecord(record);
}
Before I added the evict() call in the Manager, I noticed that the manager was behaving in unexpected ways. In order to update a record, I'd first have to get that record by calling findById(), which would put the record into the Session cache. I'd make changes on that object, then call updateRecord() which would call findById() to get the (supposedly) persisted data. I realized that this second call to findById() would not look at the database data, but just pull the object from the cache. This would result in my oldEffectiveDate always being the same as my newly changed date, since record and oldRecord would be the exact same object.
To counteract this, I added the call to evict(), which I understood to mean that the object would be removed from the cache, forcing Hibernate to go to the database to get the MemberRecord. After I made that change, my MemberRecordDAO throws an exception when it calls uniqueResult(), which says AssertionFailed: possible nonthreadsafe access to session. When I run the debugger, I see that both LogicManager and MemberRecordDAO are using the same Session, which is what I thought was correct.
So, my questions:
Is my thinking/algorithm correct? Is evict() the correct thing to do? Is there a better way? I am not too savvy on Sessions, caching or evict(). I want to make sure that this logic is correct before dealing with threading issues.
Why is it that accessing the Session from the DAO is not threadsafe?
The evict() approach will work, but I believe the 'preferred hibernate way of doing things' would be to use Session.merge(), as in:
public MemberRecord updateRecord(MemberRecord newRecord) {
    MemberRecord oldRecord = memberRecordDAO.findById(newRecord.getId());
    Date oldEffectiveDate = oldRecord.getEffectiveDate();
    if (isEffective(oldEffectiveDate) &&
            !oldEffectiveDate.equals(newRecord.getEffectiveDate())) {
        throw new IllegalArgumentException("Cannot change date");
    } else {
        MemberRecord merged = (MemberRecord) session.merge(newRecord);
        return merged;
    }
}
Just keep in mind that Session.merge() will update all of the fields of oldRecord with the values from newRecord.
This was the solution that passed my tests, but it still seems a little gross to me:
Manager
public void updateRecord(MemberRecord record) {
    MemberRecord oldRecord = record;
    record = record.clone(); // Added a clone() to MemberRecord
    getSession().evict(record);
    getSession().evict(oldRecord);
    getSession().refresh(oldRecord);
    // At this point, record has all of the new values, but none of the Hibernate
    // data attached to it, due to the clone().
    // oldRecord is populated with the data currently in the database.
    Date oldEffectiveDate = oldRecord.getEffectiveDate();
    if (isEffective(oldEffectiveDate) &&
            !oldEffectiveDate.equals(record.getEffectiveDate())) {
        throw new IllegalArgumentException("Cannot change date");
    }
    // << Other data checks >>
    memberRecordDAO.update(record);
}
If this type of thing can be done cleaner, please tell me.