hibernate/jpa complaining "flush during cascade" - java

My application is crashing with the error below:
org.hibernate.HibernateException: Flush during cascade is dangerous
I am not flushing manually, so Hibernate must be doing it on my behalf.
Specs:
webapp on tomcat
hibernate/jpa for persistence (application-managed entity manager)
This is the code of my util class to manage entity manager:
private static EntityManagerFactory emFactory = Persistence.createEntityManagerFactory("returnit");
private static EntityManager entityManager;
public static EntityManager getEntityManager(){
return entityManager;
}
public static EntityManager initEntityManager(){
if (emFactory == null) {
emFactory = Persistence.createEntityManagerFactory( "returnit" );
}
entityManager = emFactory.createEntityManager();
return entityManager;
}
And this is the method that triggers the error:
@POST
@Consumes(MediaType.APPLICATION_JSON)
public Response post(@HeaderParam(HttpHeaders.AUTHORIZATION) String authHeader, MasterCrossDock mcd) {
EntityManager em = Utils.initEntityManager();
em.getTransaction().begin();
MasterCrossDockDAO.save(mcd);
em.getTransaction().commit();
em.close();
return Response.ok(mcd.getId()).build();
}
public static void save(MasterCrossDock new_mcd) {
List<Receptacle> receptacles = new_mcd.getReceptacles();
List<Long> ids = new ArrayList<Long>();
for (Receptacle r: receptacles) {
ids.add(r.getId());
}
new_mcd.getReceptacles().clear();
EntityManager em = Utils.getEntityManager();
new_mcd.getCountryDestination())
em.createQuery("UPDATE receptacle r"
+ " SET r.masterCrossDock.id = :mcd_id"
+ " WHERE r.id IN :ids")
.setParameter("ids", ids)
.setParameter("mcd_id", new_mcd.getId())
.executeUpdate();
new_mcd.getEreturns());
}
Why am I getting the error above, and how do I fix it?

The entity manager is not thread-safe. Using an EntityManager in container-managed transactions is fine, but here you are managing both the EntityManager and the transaction yourself. Also, the entity manager is static, so you are effectively re-using it across the different requests your controller may receive. An incoming call executes the update query, which invokes a flush.
I also noticed that initEntityManager swaps the static entityManager instance for a new one. What about the old reference that may still be in use by another thread?
Do the following:
Delete your initEntityManager method entirely
Delete private static EntityManager entityManager;
Make your Utils.getEntityManager() method always create a new EntityManager (see the sketch below)
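A minimal sketch of what Utils could look like after those changes, reusing the "returnit" persistence unit from the question:
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;
public class Utils {
    // Built once per application; the factory is thread-safe and expensive to create.
    private static final EntityManagerFactory emFactory =
            Persistence.createEntityManagerFactory("returnit");
    // Always hand out a fresh EntityManager; the caller begins/commits the transaction and closes it.
    public static EntityManager getEntityManager() {
        return emFactory.createEntityManager();
    }
}
The DAO then has to receive that EntityManager from the caller (or obtain and close its own) instead of relying on a shared static instance.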
An alternative solution is to let Spring or your container (if you use one) manage your transactions. Make a service, annotate it with @Transactional, and have Spring/the container inject the EntityManager into it, or just use spring-data repositories.

The flush operation is called on commit by Hibernate's EntityTransaction implementation, which in your case might be JdbcResourceLocalTransactionCoordinatorImpl.
Inside SessionImpl, this is what throws the HibernateException.
private void doFlush() {
checkTransactionNeeded();
checkTransactionSynchStatus();
try {
if ( persistenceContext.getCascadeLevel() > 0 ) {
throw new HibernateException( "Flush during cascade is dangerous" );
}
...
Maybe, and I say maybe, some other thread got a hold on the Session object and is operating on your entities.

Related

Hibernate - Rollback list of entities if one entity fails

I'm working on a project to create and change users in my MySQL database. For this I have a UserService (REST) which creates a user and a GenericDAO class where I can persist users. In my DAO, for each user I begin, persist and commit a transaction. Creating single users or finding users works perfectly.
Now I am facing the problem of persisting or updating a list of users. In particular, if one user cannot be persisted (e.g. duplicates), the whole transaction should be rolled back. That doesn't work in my current setup.
My first idea is to move the commit into a separate method. With a loop over all users I only persist them. At the end of the loop I would call my method to commit everything. If one or more users fail, I can catch that with the rollback. Is that a good approach?
AbstractDAO (current)
public abstract class GenericDAO<T> implements IGenericDAO<T>{
@PersistenceContext
protected EntityManager em = null;
private CriteriaBuilder cb = null;
private Class<T> clazz;
public GenericDAO(Class<T> class1) {
this.clazz = class1;
this.em = EntityManagerUtil.getEntityManager();
this.cb = this.em.getCriteriaBuilder();
}
public final void setClazz(Class<T> clazzToSet) {
this.clazz = clazzToSet;
}
public T create(T entity) {
try {
em.getTransaction().begin();
em.persist(entity);
em.getTransaction().commit();
return entity;
} catch (PersistenceException e) {
em.getTransaction().rollback();
return null;
}
}
public T find(int id) {
return em.find(this.clazz, id);
}
public List<T> findAll() {
return em.createQuery("from "+this.clazz.getName()).getResultList();
}
/** Save changes made to a persistent object. */
public void update(T entity) {
em.getTransaction().begin();
em.merge(entity);
em.getTransaction().commit();
}
/** Remove an object from persistent storage in the database */
public void delete(T entity) {
em.getTransaction().begin();
em.remove(entity);
em.getTransaction().commit();
}
}
Wouldn't the most convenient solution be to simply add methods like createAll()/updateAll()?
Adding separate public methods for starting and persisting the transaction like start() and commit() creates a whole bunch of problems because it means you suddenly introduce a stateful conversation between the Dao and its clients.
The Dao methods now need to be called in a certain order and, worse still, the state of the EntityManager transaction is retained. If you forget to commit() at the end of one service call using your Dao, a subsequent call is going to mistakenly assume a transaction was not yet started, and that call is going to fail 'for no apparent reason' (not to mention that the original call will appear completed when in reality the transaction was left hanging). This creates bugs that are hard to debug, and tricky to recover from.
EDIT As I already pointed out in the comment below this answer, getting programmatic transaction management right is tricky in a multi-layer application structure, so I would recommend having a look at declarative transaction management.
However, if you insist on managing transactions yourself, I would probably introduce something like a TransactionTemplate:
public class TransactionTemplate {
private EntityManager em; //populated in a constructor, for instance
public void executeInTransaction(Runnable action) {
try {
em.getTransaction().begin();
action.run();
em.getTransaction().commit();
} catch (Exception e) {
em.getTransaction().rollback();
} finally {
em.clear(); // since you're using extended persistence context, you might want this line
}
}
}
and use it in a service like so:
public class UserService {
private TransactionTemplate template;
private RoleDao roleDao;
private UserDao userDao; //make sure TransactionTemplate and all Daos use the same EntityManager - for a single transaction, at least
public void saveUsers(Collection<User> users, String roleName) {
template.executeInTransaction(() -> {
Role role = roleDao.findByName(roleName);
users.forEach(user -> {
user.addRole(role);
userDao.create(user);
});
// some other operations
});
}
}
(of course, using the above approach means only one layer - the service layer in this case - is aware of transactions, and so DAOs must always be called from inside a service)

Are there any reasons to call entityManager.flush() right after the manager was created (JPA, Hibernate)

I am following a Hibernate video lesson, and this code was shown:
public class Main {
private static EntityManagerFactory entityManagerFactory;
public static void main(String[] args)
{
entityManagerFactory = Persistence.createEntityManagerFactory("org.hibernate.tutorial.jpa");
addEntities("Client1","Bank1");
entityManagerFactory.close();
}
private static void addEntities(String clientName, String BankName)
{
Client client = new Client();
client.setName(clientName);
Bank bank = new Bank();
bank.setName(BankName);
EntityManager entityManager = entityManagerFactory.createEntityManager();
entityManager.getTransaction().begin();
entityManager.flush();
entityManager.persist(client);
entityManager.persist(bank);
entityManager.getTransaction().commit();
}
}
And I am concerned about this part of code:
EntityManager entityManager = entityManagerFactory.createEntityManager();
entityManager.getTransaction().begin();
entityManager.flush();
We generated a new EntityManager. As I understand it, it has an empty persistence context, since it was just created, doesn't it?
In that case, why do we call the flush() method? What is the purpose?
EntityManager#flush actually pushes the changes to the database immediately.
In the above code, a transaction has just started with entityManager.getTransaction().begin() and there is no change that needs to be pushed to the database, so I would say it is not needed there. You may remove it.
Anyway, it is good practice to let the EntityManager decide when to push data changes to the database instead of taking manual control over it. There could be use cases where different applications or threads try to access the same data at the same time.
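For illustration, one case where an explicit flush() is genuinely useful is to force pending statements to hit the database at a known point, for example so that a constraint violation surfaces immediately rather than at commit. A rough sketch reusing the Client entity and entityManagerFactory from the question (the method name is illustrative):
private static void addClientWithEarlyFlush(String clientName) {
    EntityManager entityManager = entityManagerFactory.createEntityManager();
    entityManager.getTransaction().begin();
    Client client = new Client();
    client.setName(clientName);
    entityManager.persist(client);   // only queued in the persistence context so far
    try {
        entityManager.flush();       // the INSERT runs now; constraint violations surface here
    } catch (javax.persistence.PersistenceException e) {
        entityManager.getTransaction().rollback();
        entityManager.close();
        throw e;
    }
    entityManager.getTransaction().commit();
    entityManager.close();
}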

@Remote EJB which uses @RequestScoped CDI bean

I have an application which has a @Remote @Singleton EJB that injects a @RequestScoped entity manager produced by CDI. Another application on the same server (WildFly 9)/JVM uses this EJB to get a result fetched from the entity manager.
The first invocation of the EJB returns the expected result. It produces the entity manager, fetches the data, and disposes of the entity manager again when the invocation returns. Each subsequent invocation of that EJB throws an error because of a closed entity manager. No produce/dispose for a new entity manager is made.
Is this the expected behavior? Do I have an error in my code?
IFrameworkResourceManager framework = _applicationContext.getFrameworkResourceManager();
final User resolvedUser = framework.resolveUser(username, domain);
// ...
final Rights resolvedRights = framework.resolveRights(resolvedUser.getGuid(), applicationId);
// ...
This piece of code is executed in a CDI producer, which in turn is executed as soon as a new HTTP session is created for a user. Nothing changes if I call getFrameworkResourceManager again before invoking resolveRights.
public IFrameworkResourceManager getFrameworkResourceManager() {
return (IFrameworkResourceManager) ctx
.lookup("java:global/WebFramework/WebFrameworkImpl!my.package.IWebFramework");
}
It doesn't matter if I use a direct JNDI lookup or @EJB injection. The returned instance is reported (toString()) as Proxy for remote EJB StatelessEJBLocator for "/WebFramework/WebFrameworkImpl", view is interface my.package.IWebFramework, affinity is None
@LocalBean
@Singleton
public class WebFrameworkImpl implements IWebFramework, Serializable {
@Inject
private EntityManager _entityManager;
@Override
public User resolveUser(String username, String domain) {
System.out.println(_entityManager + " || " + _entityManager.isOpen());
// execute query using QueryDSL and the injected entityManager
}
@Override
public Rights resolveRights(String guidUser, int applicationId) {
System.out.println(_entityManager + " || " + _entityManager.isOpen());
// execute query using QueryDSL and the injected entityManager
}
}
@Remote
public interface IWebFramework extends IFrameworkResourceManager {
// some methods...
}
public interface IFrameworkResourceManager {
public User resolveUser(String username, String domain);
public Rights resolveRights(String guidUser, int applicationId);
}
Sysout of resolveUser: org.hibernate.jpa.internal.EntityManagerImpl#379e882b || true
Sysout of resolveRights: org.hibernate.jpa.internal.EntityManagerImpl#379e882b || false
Edit 20.11.2015 13:43: The persistence unit is of type RESOURCE_LOCAL. Additionally, all @RequestScoped beans are affected. @PostConstruct and @PreDestroy are only invoked for the first EJB invocation. Each subsequent invocation uses the previous instance of the request-scoped bean, which is not correct.
Edit 20.11.2015 13:55: Everything works as expected if the EJB is invoked from within the same application that provides the EJB. This behavior only appears for invocations from other applications.
Edit 20.11.2015 15:24: JBoss AS 7.1.3.Final, WildFly 9.0.0.Final and WildFly 10.0.0.CR4 are all affected. But according to the CDI spec (1.0 to 1.2), chapter 6.7.4, this should work. I've filed a bug report (WFLY-5716).
When using RESOURCE_LOCAL, you should create your EntityManager from the EntityManagerFactory and handle it yourself, like:
private EntityManagerFactory emf = Persistence.createEntityManagerFactory("unit-name");
public void someMethod(){
EntityManager em = emf.createEntityManager();
EntityTransaction tx = null;
try {
tx = em.getTransaction();
tx.begin();
// do some work
tx.commit();
}
catch (RuntimeException e) {
if ( tx != null && tx.isActive() )
tx.rollback();
throw e; // or display error message
}
finally {
em.close();
}
}
A bugfix for this weird behavior has already been merged in the WELD repository:
WFLY-5716
WELD-2069

JPA, when to open and close entityManager

I've set up a Spring MVC application for a web application and I'm using Hibernate's implementation of JPA 2.1.
I've created my models and am able to interact with the database just fine.
I've also decided to use service classes which will manage returning the entities. What I've done is create a BaseService class, so all other service classes will extend it, and they'll have access to common functions such as create(), delete(), update() and list().
My problem is that I'm unsure when I should create the EntityManager and when I should close it.
Currently, in my controller I'm initiating the required services when the controller loads;
@Controller
@RequestMapping("/mycontroller")
public class TestController {
CarService carService = new CarService();
ShowroomService showroomService = new ShowroomService();
}
Here is the BaseService that each other service extends;
public class Service<Ety> {
EntityManager em = null;
public Class<Ety> entityClass;
public Service(Class<Ety> entityClass) {
this.entityClass = entityClass;
em = JPAUtil.getEntityManager();
}
public Ety get(int id) {
Ety object = null;
em.getTransaction().begin();
object = em.find(entityClass, id);
em.getTransaction().commit();
return object;
}
public List list() {
List<Ety> objects;
em.getTransaction().begin();
objects = em.createQuery("SELECT c FROM "+entityClass.getName()+" c").getResultList();
em.getTransaction().commit();
return objects;
}
public void save(Ety object) {
em.getTransaction().begin();
em.persist(object);
em.getTransaction().commit();
}
public void update(Ety object) {
em.getTransaction().begin();
em.merge(object);
em.getTransaction().commit();
}
public void delete(Ety object) {
em.getTransaction().begin();
em.remove(object);
em.getTransaction().commit();
}
}
Here's an example Service which extends the above;
public class CarService extends Service<Car> {
public CarService() {
super(Car.class);
}
}
As you can see, I'm creating an EntityManager when the service is created, but at the moment I'm not closing it anywhere.
Am I creating the entity manager in the correct place? When should I close it?
I had considered putting the entity manager in a static property, creating it within a filter, and then closing it at the end of the application; however, I believe this wouldn't be thread-safe and would cause issues.
Any advice would be appreciated.
Your CarService should be a Spring bean, and the instance should be created by Spring, NOT by your code. The same goes for the EntityManager. You can inject the EntityManager with the @Autowired annotation.
You open a new EntityManager for each transaction.
This EntityManager is like a Bag mapped to the database, but with zero entity managed inside when it's just opened.
When you work with it, this Bag will be filled with some entities and Hibernate will work to create the adequate requests.
You will close this Bag to save memory at the end of the transaction.
Of course there are some tricks to have many transactions for a given EntityManager, but that is the general idea. As always, it depends...
If you use a framework like Spring or Java EE, it will open and close the EntityManager, as well as start and commit transactions, for you. You only have your business logic to write.
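A rough sketch of that container-managed style with Spring (assuming a configured JPA transaction manager; CarService and Car come from the question, the method names are illustrative):
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
@Service
public class CarService {
    // Spring injects a transaction-bound EntityManager proxy; no manual open/close.
    @PersistenceContext
    private EntityManager em;
    @Transactional
    public void save(Car car) {
        em.persist(car);              // begin/commit/rollback handled by Spring
    }
    @Transactional(readOnly = true)
    public Car get(int id) {
        return em.find(Car.class, id);
    }
}
The controller then receives the CarService via injection (e.g. @Autowired) instead of instantiating it with new.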

Different ways of getting the EntityManager

The usual idiom I see for creating the EntityManager is something like this:
public class BaseDao {
private static final String PERSISTENCE_UNIT_NAME = "Employee";
EntityManagerFactory factory = Persistence.createEntityManagerFactory(PERSISTENCE_UNIT_NAME);
public EntityManager getEntityManager() {
return factory.createEntityManager();
}
}
Then it is used like this:
Employee emp = new Employee();
emp.setName("Joe M");
getEntityManager().persist(emp);
Question is why not do it this way:
public class BaseDao{
private static final String PERSISTENCE_UNIT_NAME = "Employee";
EntityManagerFactory factory = Persistence.createEntityManagerFactory(PERSISTENCE_UNIT_NAME);
private EntityManager entityManager = null;
public void setEntityManager() {
EntityManagerFactory factory = Persistence.createEntityManagerFactory(PERSISTENCE_UNIT_NAME);
this.entityManager = factory.createEntityManager();
}
public EntityManager getEntityManager() {
return this.entityManager;
}
}
In other words, is there a need to always get the entity manager through factory.createEntityManager()? Or can it be created as an instance (or even static) variable and retrieved like that?
To clarify, I am talking about an environment that doesn't use EJB or Spring containers.
Thanks.
There are two ways to create EntityManager instances.
One way is for standalone (Java SE) applications, and I use this way a lot in unit testing. This is what you have in your example:
EntityManagerFactory factory =
Persistence.createEntityManagerFactory(PERSISTENCE_UNIT_NAME);
In Enterprise applications you let the container create them for you and inject them when needed.
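A minimal sketch of that container-injected style, for contrast with the standalone approach below (assuming a Java EE container; EmployeeDao is a hypothetical name and "Employee" is the persistence unit from the question):
import javax.ejb.Stateless;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
@Stateless
public class EmployeeDao {
    // The container creates, scopes and closes this EntityManager for you.
    @PersistenceContext(unitName = "Employee")
    private EntityManager em;
    public void save(Employee emp) {
        em.persist(emp);   // runs inside the container-managed JTA transaction
    }
}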
EntityManager is just a wrapper around a JDBC connection. It's very lightweight and can be created and destroyed without a performance penalty.
Keep in mind that the EntityManager is not thread safe, so if you have one instance, you may need to synchronize access to it. See transaction basics for details.
Here's how I would do it (roughly):
public class BaseDao{
private static final String PERSISTENCE_UNIT_NAME = "Employee";
private static EntityManagerFactory factory =
Persistence.createEntityManagerFactory(PERSISTENCE_UNIT_NAME);
public void create(MyEntiy person){
EntityManager em = factory.createEntityManager();
em.getTransaction().begin();
// do what ever you need
em.getTransaction().commit();
em.close();
}
// add more methods to the dao.
}
Once you get this prototyped and ready, you can use a generic DAO.
Today you should probably look at something like spring-data and @PersistenceUnit for managing your EntityManager.
An EntityManager is more than just a wrapper for a JDBC connection. It defines the scope of a persistence context, which defines the unit of work that should be performed when a transaction is committed (or when you flush queries to the database). Within a persistence context you are also guaranteed that a given entity in the database will result in the same Java object, regardless of whether you load it directly or access it through a OneToMany relation of another entity.
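A small illustration of that identity guarantee, assuming the Employee entity from earlier and a numeric id:
EntityManager em = factory.createEntityManager();
em.getTransaction().begin();
Employee first = em.find(Employee.class, 1);
Employee second = em.find(Employee.class, 1);   // served from the persistence context, no second SELECT needed
// Within one persistence context, the same database row maps to the same Java instance.
System.out.println(first == second);            // prints true
em.getTransaction().commit();
em.close();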
With regard to the original question about obtaining an EntityManagerFactory in a non-Spring setting: you simply call
Persistence.createEntityManagerFactory(PERSISTENCE_UNIT_NAME);
This method is a static factory method, depending on your JPA implementation you either get the same instance for the same PU, or a shallow wrapper that wraps the underlying persistence session (of which there is one per PU).
