Given the following mapping
<class name="com.domain.Season" table="cm.pub.jsn_mstr">
<id name="seasonCode" column="season_code" length="1"/>
<property name="name" type="string" column="name" length="20"/>
<set name="promotions" lazy="false">
<key column="season_code"/>
<one-to-many class="com.domain.Promotion" not-found="ignore"/>
</set>
</class>
How can I choose whether or not to load the promotions? I could use lazy="true", but I'm using Jackson to serialize the result, and that happens after the session is closed.
public Collection<Season> getSeasons(boolean withPromotions) {
    final Session session = sessionFactory.getCurrentSession();
    try {
        session.beginTransaction();
        return (List<Season>) session.createQuery("from Season s").list();
    } finally {
        session.getTransaction().commit();
    }
}
UPDATE: Problem with using lazy loading.
The getSeasons method above is used in an MVC controller that retrieves the seasons and then serializes them to JSON with Jackson (via Spring MVC's view resolver), so I never access the objects myself. Any attempt to lazily load the collection therefore results in an exception, because Jackson calls an iterator on every collection property.
Here's an example that shows an exception will get thrown:
public Collection<Season> getSeasons(boolean withPromotions) {
    final Session session = sessionFactory.getCurrentSession();
    final List<Season> r;
    try {
        session.beginTransaction();
        r = (List<Season>) session.createQuery(
                withPromotions
                        ? "from Season s join fetch s.promotions"
                        : "from Season s"
                ).list();
    } finally {
        session.getTransaction().commit();
    }
    try {
        for (Season s : r) {
            for (Promotion p : s.getPromotions()) {
                // Exception thrown here as we attempted to get an iterator.
                LOG.debug("Promotion: " + p.getName());
            }
        }
    } catch (Exception ex) {
        LOG.error("Couldn't get promotions", ex);
    }
    return r;
}
And of course this time the mapping needs lazy="true", otherwise the collection will always be read eagerly.
<class name="com.domain.Season" table="cm.pub.jsn_mstr">
<id name="seasonCode" column="jsn_seas" length="1"/>
<set name="promotions" lazy="true">
<key column="jpr_seas"/>
<one-to-many class="com.domain.Promotion" not-found="ignore"/>
</set>
</class>
Data type for promotions field is Collection<Promotion>.
Try this:
session.createQuery("from Season s join fetch s.promotions").list();
From the Hibernate reference:
A "fetch" join allows associations or collections of values to be initialized along with their parent objects using a single select.
You can use Hibernate.initialize() to initialize the lazy collections while the session is still open. In your example, add the code below.
public Collection<Season> getSeasons(boolean withPromotions) {
    final Session session = sessionFactory.getCurrentSession();
    try {
        session.beginTransaction();
        List<Season> list = session.createQuery("from Season s").list();
        for (Season s : list) {
            if (condition) {                      // e.g. withPromotions
                Hibernate.initialize(s.getPromotions());
            }
        }
        return list;
    } finally {
        session.getTransaction().commit();
    }
}
From the API docs:
The static methods Hibernate.initialize() and Hibernate.isInitialized(), provide the application with a convenient way of working with lazily initialized collections or proxies. Hibernate.initialize(cat) will force the initialization of a proxy, cat, as long as its Session is still open. Hibernate.initialize( cat.getKittens() ) has a similar effect for the collection of kittens.
Read the API here or an example here.
I've found the only way I could do this is to not map the collection and instead send two queries, adding to the collection myself.
This gives me greater control of the association and also improves performance, since I'm only sending two queries instead of 1+n (where n is the number of rows retrieved by the first query).
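A rough sketch of that approach, assuming accessor names like getSeasonCode() on both entities (they are not part of the original mapping):
public Collection<Season> getSeasons(boolean withPromotions) {
    final Session session = sessionFactory.getCurrentSession();
    try {
        session.beginTransaction();
        List<Season> seasons = (List<Season>) session.createQuery("from Season s").list();
        if (withPromotions) {
            // Index the seasons, then attach promotions fetched in a single second query.
            Map<String, Season> bySeasonCode = new HashMap<String, Season>();
            for (Season s : seasons) {
                s.setPromotions(new ArrayList<Promotion>());
                bySeasonCode.put(s.getSeasonCode(), s);
            }
            List<Promotion> promotions =
                    (List<Promotion>) session.createQuery("from Promotion p").list();
            for (Promotion p : promotions) {
                Season owner = bySeasonCode.get(p.getSeasonCode()); // assumed accessor
                if (owner != null) {
                    owner.getPromotions().add(p);
                }
            }
        }
        return seasons;
    } finally {
        session.getTransaction().commit();
    }
}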
Essentially, your problem is Jackson -- reading everything in the bean. Not Hibernate.
Either:
1) Control Jackson properly, so it transmits only the data you actually need -- and avoid what you don't want it to 'pull on' (see the sketch after this list).
2) Use Spring's 'OpenSessionInViewFilter' or similar that binds a Session to a thread-local, and leaves it open until the view has run -- enabling Jackson to still 'pull on' & load the collections at view-rendering stage.
3) or, Map two different classes (one full details, one light-weight) to the same table, and retrieve the one you actually need.
SeasonLite could perhaps be an ancestor of Season (full details), but note that you're not mapping inheritance with Hibernate -- you're mapping two separate views of the same table.
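For option 1, one possible sketch (assuming Jackson 2's @JsonView; Jackson 1.x has an equivalent annotation, and the view names here are invented) lets the controller decide whether promotions are serialized at all:
import java.util.Set;
import com.fasterxml.jackson.annotation.JsonView;

public class Season {

    public static class Views {
        public interface Light {}
        public interface Full extends Light {}
    }

    @JsonView(Views.Light.class)
    private String seasonCode;

    @JsonView(Views.Light.class)
    private String name;

    @JsonView(Views.Full.class)   // written only when the Full view is requested
    private Set<Promotion> promotions;

    // getters and setters omitted
}
If your Spring/Jackson versions don't support selecting a view at the controller level, copying the fields you want into a small hand-written DTO achieves the same effect.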
PS: methods like 'getSomeData(boolean retrieveThisData)' appear to be a common anti-pattern. I've seen it many times before.
Sorry, this is a very late answer, but I only just found your question when looking for something else.
As already stated in some of the answers, the problem is that Hibernate uses a 'placeholder' PersistentCollection for a lazily loaded collection. To stop Hibernate from trying to fill out the collection when it is accessed, you could null out the actual promotions list in each of the Season entities returned by the query. In code, something like this:
if (!withPromotions) {
    for (Season s : r) {
        s.setPromotions(null);
    }
}
Setting the Promotions to null will clear out the reference to the Hibernate collection object. You should be able to do this outside the session.
What we do is create methods on our entities for each of our collections similar to this:
public boolean isPromotionsLoaded() {
    if (((promotions instanceof org.hibernate.collection.PersistentCollection)
            && !((org.hibernate.collection.PersistentCollection) promotions).wasInitialized())
            || (getPromotions() == null)) {
        return false;
    } else {
        return true;
    }
}
Then we can use these methods to find out if we can safely access the collection.
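A hypothetical caller-side check using that helper, following the Season example from the question:
if (season.isPromotionsLoaded()) {
    for (Promotion p : season.getPromotions()) {
        LOG.debug("Promotion: " + p.getName());
    }
}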
Anyway, hope this helps someone.
Related
My use case is as follows: I've inherited a project that is using hibernate. The entity I'm focused on right now is, for the purposes of this exercise, closed to modification. The object of the exercise is to replace the use of the legacy entity with an unrelated implementation that is better suited to the new requirements.
The goal is to be able to move functionality from the old entity to the new incrementally.
As is, the use of the legacy entity looks something like
//...
    final Session currentSession = sessionFactory().getCurrentSession();
    {
        LegacyEntity oldAndBusted = get(currentSession, "12345");
        oldAndBusted.change(...);
        put(oldAndBusted);
    }
}

LegacyEntity get(final Session currentSession, String businessId) {
    return (LegacyEntity) currentSession
            .createQuery("from PurpleMonkeyDishwasher where businessId = ?")
            .setParameter(0, "12345")
            .uniqueResult();
}

void put(final Session currentSession, LegacyEntity changed) {
    currentSession.saveOrUpdate(changed);
}
With configuration magic hidden off in some hbm.xml file
<class name="LegacyEntity" table="PurpleMonkeyDiswasher">
<!-- stuff -->
</class>
How do I arrange analogous code for a new entity mapped to the same table
BuzzwordCompliantEntity get(final Session currentSession, String businessId);
void put(BuzzwordCompliantEntity changed);
without breaking the code paths that are still using LegacyEntity in the same process?
"The entity I'm focused on right now is, for the purposes of this exercise, closed to modification. ... The goal is to be able to move functionality from the old entity to the new incrementally." I find this contradictory. When replacing a class by another, I have always had success by: 1) incrementally changing the old class API to be a subset of the new class API, 2) renaming the old type to have the same name and package as the new class, 3) removing the old class (that we renamed in step 2). While doing all of this, I rely as much as possible on the refactoring capabilities of the IDE.
I want to use a Hibernate filter, but I don't know if what I want to do is possible.
I have two entities:
Message and MessageUser.
A Message has a list of MessageUser.
I want to create a filter so that I can do something like:
final Session filteredSession = sessionFactory.openSession();
final Filter filter = filteredSession.enableFilter("userRecipient");
filter.setParameter("userRecipient", myUser);
filter.validate();
final List<Message> userMessages = filteredSession.createQuery("from Message").list();
and have it return only the messages where myUser is the recipient?
Is that possible, and if so, how?
Thanks a lot!
If you are comfortable with Criteria, you could create criteria like this:
Session hbSession = sessionFactory.openSession();
Criteria criteria = hbSession.createCriteria(Message.class);
criteria.createCriteria("msgUserList", "userListAlias"); // msgUserList is the name of the users list in Message
criteria.add(Restrictions.eq("userListAlias.user", myUser)); // user is the User-typed property in msgUserList's class
List<Message> userMessages = criteria.list();
Have a look at this for reference while creating criteria!
If you only want to use a filter, then I hope you have configured the filter on your user list, something like below.
Via *.hbm.xml:
<hibernate-mapping package="com....">
    <class name="Message" table="message_table">
        ....
        <list name="msgUserList" inverse="true" cascade="all">
            <key column="user_id" />
            <one-to-many class="MessageUsers" />
            <filter name="userRecipient" condition="user_id = :userParam" />
        </list>
    </class>
    <filter-def name="userRecipient">
        <filter-param name="userParam" type="User" /> <!-- User is a class -->
    </filter-def>
</hibernate-mapping>
Or via annotations:
@Entity
@FilterDef(name = "userRecipient",
        parameters = @ParamDef(name = "userParam", type = "PACKAGE.User"))
@Table(name = "message_table", catalog = "your_db")
public class Message {
    ...
    @OneToMany(fetch = FetchType.LAZY, mappedBy = "stock")
    @Filter(name = "userRecipient", condition = "user = :userParam")
    public List<MessageUser> msgUserList;
After this you will be able to get your filter working:
Filter filter = session.enableFilter("userRecipient");
filter.setParameter("userParam", myUser);
Update
The purpose of a Filter is different from that of a Criteria. From my understanding, you can think of a filter as a criteria that is already applied to your class or collection and has an on/off switch. If your Hibernate session has a filter enabled and its parameters set, that filter is on, and all queries relating to the class or collection that declares the filter will return filtered results according to its condition. This means you don't have to define it explicitly every time, and by using getEnabledFilter("filterName") you can change that filter's parameters at any time.
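A small sketch of that on/off behaviour with the collection filter mapped above (the getter name is an assumption, and since the condition binds user_id, an id is passed as the parameter):
Session session = sessionFactory.openSession();
session.enableFilter("userRecipient").setParameter("userParam", myUser.getId());

// While the filter is enabled, the collection is filtered wherever it is loaded:
List<Message> messages = session.createQuery("from Message").list();
for (Message m : messages) {
    List<MessageUser> recipients = m.getMsgUserList(); // only rows matching myUser
}

session.disableFilter("userRecipient"); // switched off: collections load unfiltered again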
An example use of a filter: suppose you have a Movie table and an Actor table with a many-to-many relationship. Leonardo DiCaprio can have many movies, and at the same time Titanic can have many actors. When you load an Actor you obviously want only the movies that this actor performed in, so you can put a filter on the collection of Movies mapped in the Actor class. That way, when you get an Actor object (say, by a simple criteria on name) and access its Movie collection, it returns only the movies that actor performed in, no matter how the Actor object was retrieved from the database.
Criteria, on the other hand, is what you use when you need a result from the database under certain conditions that you do not want replicated across the rest of the Hibernate session. For example, an Actor (say Leonardo DiCaprio) holding the collection of movies that earned him an Oscar nomination: that collection is only populated on an Actor object retrieved through that specific criteria, and is not available on other Actor objects that were not retrieved that way.
I hope this clarifies the basic concepts of filters and criteria; from my understanding of your problem, you would be better off using criteria!
I am using Spring transactions, so the transaction is still active when the POJO to DTO conversion occurs.
I would like to prevent Dozer from triggering lazy loading, so that hidden SQL queries never occur: all fetching has to be done explicitly via HQL (to keep the best control over performance).
Is this good practice? (I can't find it documented anywhere.)
How can it be done safely?
I tried this before the DTO conversion:
PlatformTransactionManager tm = (PlatformTransactionManager) SingletonFactoryProvider.getSingletonFactory().getSingleton("transactionManager");
tm.commit(tm.getTransaction(new DefaultTransactionDefinition()));
I don't know what happens to the transaction, but the Hibernate session doesn't get closed, and the lazy loading still occurs.
I tried this:
SessionFactory sf = (SessionFactory) SingletonFactoryProvider.getSingletonFactory().getSingleton("sessionFactory");
sf.getCurrentSession().clear();
sf.getCurrentSession().close();
This does prevent lazy loading, but is it good practice to manipulate the session directly in the application layer (called the "facade" in my project)? What negative side effects should I fear? (I've already seen that tests involving POJO -> DTO conversions can no longer be run through the AbstractTransactionnalDatasource Spring test classes, because those classes try to trigger a rollback on a transaction which is no longer linked to an active session.)
I've also tried setting the propagation to NOT_SUPPORTED or REQUIRES_NEW, but it reuses the current Hibernate session and doesn't prevent lazy loading.
The only generic solution I have found for managing this (after looking into Custom Converters, Event Listeners & Proxy Resolvers) is to implement a Custom Field Mapper. I found this functionality tucked away in the Dozer API (I don't believe it is documented in the User Guide).
A simple example is as follows:
public class MyCustomFieldMapper implements CustomFieldMapper {

    public boolean mapField(Object source, Object destination, Object sourceFieldValue,
            ClassMap classMap, FieldMap fieldMapping) {

        // Check if the field is a Hibernate collection proxy
        if (!(sourceFieldValue instanceof AbstractPersistentCollection)) {
            // Allow Dozer to map as normal
            return false;
        }

        // Check if the field is already initialized
        if (((AbstractPersistentCollection) sourceFieldValue).wasInitialized()) {
            // Allow Dozer to map as normal
            return false;
        }

        // Tell Dozer the field has been handled; the destination field is simply left unset (null)
        return true;
    }
}
This will return any non-initialized PersistentSet objects as null. I do this so that when they are passed to the client I can differentiate between a NULL (non-loaded) collection and an empty collection. This allows me to define generic behaviour in the client to either use the pre-loaded set, or make another service call to retrieve the set (if required). Additionally, if you decide to eagerly load any collections within the service layer then they will be mapped as usual.
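For illustration, a hypothetical client-side use of that null-versus-empty convention (every name below is invented):
// null means "not loaded"; an empty collection means "loaded, but no rows".
if (orderDto.getLines() == null) {
    orderDto.setLines(orderService.findLines(orderDto.getId())); // second service call
}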
I inject the custom field mapper using spring:
<bean id="dozerMapper" class="org.dozer.DozerBeanMapper" lazy-init="false">
<property name="mappingFiles">
...
</property>
<property name="customFieldMapper" ref="dozerCustomFieldMapper" />
</bean>
<bean id="dozerCustomFieldMapper" class="my.project.MyCustomFieldMapper" />
I hope this helps anyone searching for a solution for this, as I failed to find any examples when searching the Internet.
A variation on the popular version above; it makes sure to catch PersistentBags, PersistentSets, you name it...
public class LazyLoadSensitiveMapper implements CustomFieldMapper {

    public boolean mapField(Object source, Object destination, Object sourceFieldValue,
            ClassMap classMap, FieldMap fieldMapping) {
        // if field is initialized, Dozer will continue mapping

        // Check if field is derived from Persistent Collection
        if (!(sourceFieldValue instanceof AbstractPersistentCollection)) {
            // Allow dozer to map as normal
            return false;
        }
        // Check if field is already initialized
        if (((AbstractPersistentCollection) sourceFieldValue).wasInitialized()) {
            // Allow dozer to map as normal
            return false;
        }
        return true;
    }
}
I didn't get the above to work (probably different versions). However, this works fine:
public class HibernateInitializedFieldMapper implements CustomFieldMapper {

    public boolean mapField(Object source, Object destination, Object sourceFieldValue,
            ClassMap classMap, FieldMap fieldMapping) {
        // if the field is initialized, Dozer will continue mapping
        return !Hibernate.isInitialized(sourceFieldValue);
    }
}
Have you considered disabling lazy loading altogether?
It doesn't really seem to square with the pattern you say you would like to use:
I would like to prevent Dozer from triggering lazy loading, so that hidden sql queries never occur : all fetching has to be done explicitly via HQL (to get the best control on performances).
This suggests you would never want to use lazy loading.
Dozer and the Hibernate-backed beans you pass to it are blissfully ignorant of each other; all Dozer knows is that it is accessing properties in the bean, and the Hibernate-backed bean is responding to calls to get() a lazy-loaded collection just as it would if you were accessing those properties yourself.
Any tricks to make Dozer aware of the Hibernate proxies in your beans or vice versa would, IMO, break down the layers of your app.
If you don't want any "hidden SQL queries" fired at unexpected times, simply disable lazy-loading.
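For completeness, a minimal sketch of what disabling lazy loading looks like with annotations (the property names are assumptions; in an hbm.xml mapping the equivalent is lazy="false" on the collection):
@OneToMany(mappedBy = "parent", fetch = FetchType.EAGER) // loaded together with its owner
private Set<Child> children;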
A short version of this mapper would be:
return sourceFieldValue instanceof AbstractPersistentCollection &&
!( (AbstractPersistentCollection) sourceFieldValue ).wasInitialized();
Using a CustomFieldMapper may not be a good idea, as it will be invoked for every field of your source class, but our concern is only the lazy association mapping (the child object list). So we can return null from the getter of the entity object:
public Set<ChildObject> getChild() {
    if (Hibernate.isInitialized(child)) {
        return child;
    } else {
        return null;
    }
}
Given that you have a lot of domain objects that all interact with one another, it would be very useful to know which objects have changed in a particular transaction.
Is this possible? I would essentially like to do this:
public void someBusinessLogicMethod(someparams) {
    Session s = getSession();
    Transaction tr = s.beginTransaction();
    domainObject = s.load(...);
    domainObject.setSomethingOrOther(...);
    domainObject.getSomeLink().setSomethingElse(...);
    callSomeOtherBusinessLogicMethod();
    tr.commit();
    /* at this point many objects have changed, Hibernate knows which ones */
    for (Object o : tr.getAffectedObjects(?)) {
        ....
    }
}
Does this exist?
Assuming you want to do something like create audit entries for all the changes, you could use a Hibernate Listener or an Interceptor. If you hook the listener/interceptor at the right moment (e.g. onFlushDirty), you have access to the objects and properties that have changed.
More info: http://docs.jboss.org/hibernate/core/3.3/reference/en/html/events.html
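For instance, a minimal Interceptor sketch (the bookkeeping is deliberately simple and not tied to any particular domain class):
import java.io.Serializable;
import java.util.HashSet;
import java.util.Set;

import org.hibernate.EmptyInterceptor;
import org.hibernate.type.Type;

public class DirtyTrackingInterceptor extends EmptyInterceptor {

    private final Set<Object> affectedObjects = new HashSet<Object>();

    public boolean onFlushDirty(Object entity, Serializable id, Object[] currentState,
            Object[] previousState, String[] propertyNames, Type[] types) {
        affectedObjects.add(entity); // called once per dirty entity during flush
        return false;                // we did not modify the entity's state
    }

    public Set<Object> getAffectedObjects() {
        return affectedObjects;
    }
}
Register it when the session is opened (e.g. sessionFactory.openSession(new DirtyTrackingInterceptor())) and read getAffectedObjects() after the flush or commit.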
Hope this helps.
I can't delete a child object from the database. From the org.apache.struts.action.Action.execute() method, I am removing the child from the parent's List, and also calling session.delete(child). I've simplified the code below and only included what I believe to be relevant.
Hibernate Mapping
<class
    name="xxx.xxx.hibernate.Parent"
    table="parent">
    ...
    <list
        name="children"
        cascade="all,delete-orphan"
        lazy="true"
        inverse="true">
        <key column="parent_id"/>
        <index column="list_index"/>
        <one-to-many class="xxx.xxx.hibernate.Child"/>
    </list>
</class>
<class
    name="xxx.xxx.hibernate.Child"
    table="child">
    ...
    <many-to-one
        name="parent"
        class="xxx.xxx.hibernate.Parent"
        not-null="true"
        column="parent_id" />
</class>
Excerpt from execute() method
Transaction tx = session.beginTransaction(); // session is of type org.hibernate.Session
try {
    Parent parent = (Parent) session.get(Parent.class, getParentId());
    Iterator i = form.getDeleteItems().iterator(); // form is of type org.apache.struts.action.ActionForm
    while (i.hasNext()) {
        Child child = (Child) i.next();
        session.delete(child);
        parent.getChildren().remove(child); // getChildren() returns type java.util.List
    }
    session.saveOrUpdate(parent);
    tx.commit();
} ...
I've tried with only session.delete(child); and I've tried with only parent.getChildren().remove(child); and with both lines, all without success. There are no errors or thrown exceptions or anything of the sort. I'm sure this code gets called (I've even used System.out.println(); to trace what's happening), but the database isn't updated. I can add children using similar code, edit non-collection properties of existing children, edit the parent's properties, all of that works, just not deleting!
According to the Hibernate FAQ I'm doing the mapping right, and according to this SO question I've got the right logic. I've looked all over the internet and can't seem to find anything else.
What am I doing wrong? Please help! Thanks.
Notes on versions
Everything is a few years old:
Java 1.4.2
SQL Server 2005
Hibernate 3.0.5
Struts 1.2.7
Apache Tomcat 5.0.28
If you haven't overridden the equals() method, the entity is probably not found in the list, because it has been detached and is now a different instance. That's why the remove isn't working. And even if the delete works, the objects are re-saved by the cascade because they still exist in the collection. Here's what to do:
Either override the equals() (and hashCode()) methods, using either the id (easy) or some sort of business key (more appropriate) (search Stack Overflow for tips on overriding these two methods), and leave only getChildren().remove(child) (see the sketch after this list),
Or iterate over the collection of children inside the first loop, like this:
Iterator<Child> i = form.getDeleteItems().iterator();
while (i.hasNext()) {
    Child child = i.next();
    for (Iterator<Child> it = parent.getChildren().iterator(); it.hasNext();) {
        if (child.getId().equals(it.next().getId())) {
            it.remove(); // this removes the child from the underlying collection
        }
    }
}
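For the first option, an id-based equals()/hashCode() pair on Child could look like the sketch below (generics and @Override are left out to stay close to the Java 1.4 setup mentioned in the question; a business key is generally the safer long-term choice):
public boolean equals(Object obj) {
    if (this == obj) {
        return true;
    }
    if (!(obj instanceof Child)) {
        return false;
    }
    Child other = (Child) obj;
    return getId() != null && getId().equals(other.getId());
}

public int hashCode() {
    // Only stable once the entity has an id; unsaved instances fall back to identity.
    return getId() == null ? super.hashCode() : getId().hashCode();
}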
I'm not sure what causes this behavior in Hibernate, but you can get going by loading the Child first. Separately deleting the Child is not necessary. The updated code should look like:
Transaction tx = session.beginTransaction(); // session is of type org.hibernate.Session
try {
    Parent parent = (Parent) session.get(Parent.class, getParentId());
    Iterator i = form.getDeleteItems().iterator(); // form is of type org.apache.struts.action.ActionForm
    while (i.hasNext()) {
        Child child = (Child) session.get(Child.class, ((Child) i.next()).getChildId());
        parent.getChildren().remove(child); // getChildren() returns type java.util.List
    }
    session.saveOrUpdate(parent);
    tx.commit();
} ...
Show the SQL generated by Hibernate:
<property name="show_sql">true</property>
<property name="format_sql">true</property>
Edit:
Check out Chapter 10, "Working with objects", in the Hibernate reference.
In this case the Child class is the owner of the inverse relation, so Hibernate will look at the child's parent reference to determine whether the relation is still there. Since you don't set the parent to null, the relation still exists and the child may not be deleted. Try doing:
parent.getChildren().remove(child);
child.parent = null;
session.delete(child);
Also remove the not-null="true" from the parent property mapping.
The best thing to do when working with inverse associations is to update both sides in Java code; that way you can continue working with the objects in memory and don't have to worry about which side owns the relation.
A similar situation is discussed here: http://simoes.org/docs/hibernate-2.1/155.html