I want to update some of my tables in the database, and I want all of this work done in one transaction.
First of all I delete some entries in branchbuilding (a table) and insert new ones after this action.
The problem occurs when I insert an entry with the same buildingname and branch_fk, because I have a unique constraint on this table (uniqueConstraints={@UniqueConstraint(columnNames={"buildingname","branch_fk"})}). When I don't use a Hibernate session and use a plain JDBC transaction instead, I don't have this problem.
List<Integer> allBranchBuilding = branchBuildingDao.getAllBranchBuildingID(pkId, sess);
for (Integer integer : allBranchBuilding) {
    branchBuildingDao.delete(integer, sess); // delete all BranchBuildings and their phone numbers
}
Address myAdr = new Address();
setAddress(myAdr, centralFlag, city, latit, longit, mainstreet, remainAdr, state);
BranchBuildingEntity bbe = new BranchBuildingEntity();
setBranchBuildingEntity(bbe, be, myAdr, city, centralFlag, latit, longit, mainstreet, buildingName, remainAdr, state, des);
branchBuildingDao.save(bbe, sess); // exception occurs here
I get my session at the start of the method:
Session sess = HibernateUtil.getSession();
Transaction tx = sess.beginTransaction();
You're right, everything occurs in the same transaction, and the same Hibernate Session.
The Session keeps track of every entity it manages. Even though you asked for an entity to be deleted in the database, the corresponding object is still memorized in the Session until the Session is closed.
In general, it is possible that Hibernate reorders your operations when sending them to the database, for efficiency reasons.
What you could do is flush (i.e. send the pending statements to the database) the Session before the save (if needed, you could also clear it after flushing, i.e. empty the entities memorized by the Session):
sess.flush();
// sess.clear(); // if needed or convenient for you
branchBuildingDao.save(bbe, sess);
Note also that while your entities are memorized by the Session, modifying them will trigger an automatic update when the Session is flushed (typically at commit time).
In our project, we have a method that efficiently deletes a collection of entities (and another for an array, declared using the convenient ... varargs syntax), removing them from the Session at the same time and taking care of the flushing beforehand. It works for any entity, so it doesn't have to be rewritten for each one. It does the following (see the sketch after this list):
Loop over all entities, delete each one (using sess.delete(e)) and add it to a 'deleteds' list.
Every 50 entities (corresponding to the batch size we configured for efficiency reasons), and at the end:
flush the Session to force Hibernate to send the pending changes to the database immediately,
loop over the 'deleteds' list and evict each entity from the Session (using sess.evict(e)),
empty the 'deleteds' list.
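A minimal sketch of such a helper, assuming a plain Hibernate Session (the names deleteAll and BATCH_SIZE are illustrative, not taken from our actual code):

import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import org.hibernate.Session;

public static <T> void deleteAll(Session sess, Collection<T> entities) {
    final int BATCH_SIZE = 50; // matches the configured JDBC batch size
    List<T> deleteds = new ArrayList<T>();
    for (T e : entities) {
        sess.delete(e);
        deleteds.add(e);
        if (deleteds.size() == BATCH_SIZE) {
            sess.flush();          // force Hibernate to send the deletes to the database now
            for (T d : deleteds) {
                sess.evict(d);     // remove the deleted entity from the Session
            }
            deleteds.clear();      // empty the 'deleteds' list
        }
    }
    sess.flush();                  // same treatment for the remaining entities
    for (T d : deleteds) {
        sess.evict(d);
    }
    deleteds.clear();
}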
Don't worry, flush only sends the SQL to the database. It is still subject to commit or rollback.
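A minimal illustration, reusing sess and tx from the question's snippet:

sess.flush();  // the SQL statements are sent to the database here...
tx.rollback(); // ...but nothing is committed yet, so they are all undone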
Related
I have to improve the performance of very slow code, and I am pretty new to Hibernate. I have studied the code carefully and concluded that the issue is that it loads and then updates/inserts a large set of entities. To translate the algorithm into a more digestible example, let's say we have an algorithm like this:
for each competitionToSave in competitionsToSave
    competition <- load a Competition by competitionToSave from database
    winner <- load Person by competitionToSave.personID
    do some preprocessing
    if (newCompetition) then
        insert competition
    else
        update competition
    end if
end for
This algorithm is of course problematic when there are lots of competitions in competitionsToSave. So my plan is to select all the competitions and winners involved with at most two database queries and preprocess the data, which will speed up the reads; but more importantly, I want to make sure I save the competitions in insert/update batches of 100 instead of saving them one by one. Since I am pretty new to Hibernate, I consulted the documentation and found the following example:
Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();
for ( int i=0; i<100000; i++ ) {
    Customer customer = new Customer(.....);
    session.save(customer);
    if ( i % 20 == 0 ) { //20, same as the JDBC batch size
        //flush a batch of inserts and release memory:
        session.flush();
        session.clear();
    }
}
tx.commit();
session.close();
However, I am not sure I understand it correctly. About the method .save() I read:
Persist the given transient instance, first assigning a generated identifier. (Or using the current value of the identifier property if the assigned generator is used.) This operation cascades to associated instances if the association is mapped with cascade="save-update".
But it is unclear to me whether a request to the database is sent upon every save. Am I correct to assume that, in the example taken from the documentation, session.save(customer) records the modification in the Session without sending a request to the database, and then on every 20th item session.flush() sends the accumulated requests to the database and session.clear() empties the Session's cache?
You are correct in your assumptions, though the inserts will be triggered one-by-one:
insert into Customer (id, name) values (1, 'na1');
insert into Customer (id, name) values (2, 'na2');
insert into Customer (id, name) values (3, 'na3');
You can try to take advantage of the JDBC batch insert feature to increase performance even more.
There is a Hibernate property which you can define as one of the properties of Hibernate's SessionFactory:
<property name="jdbc.batch_size">20</property>
With this batch setting, Hibernate groups the inserts into a single JDBC batch on each flush. If your JDBC driver also rewrites batched statements (for example MySQL with rewriteBatchedStatements=true), the output can even collapse into one multi-row insert:
insert into Customer (id, name) values (1, 'na1'), (2, 'na2'), (3, 'na3'), ...
One round-trip instead of twenty.
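For completeness, a hedged hibernate.cfg.xml snippet for the MySQL driver flag mentioned above (the database name in the URL is illustrative):

<property name="connection.url">jdbc:mysql://localhost:3306/mydb?rewriteBatchedStatements=true</property>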
We have a function in our application that cleans up the database and resets the data. We call a method cleanup() which first deletes all the data from the database and then runs an SQL script file to insert all the necessary default data for our application. Our application supports Oracle, MySQL and MSSQL; the function works fine on Oracle and MySQL but doesn't work as it is supposed to on MSSQL.
The problem is that it clears all the data from the database but does not insert the default data: the first section of the method works fine, but the second section doesn't get committed to the database. Here is the function:
public boolean cleanup(...){
    Session session = sessionFactory.openSession();
    Transaction tx = session.beginTransaction();
    // delete and drop sql queries here...
    tx.commit();
    session.close();
    // end of first section

    session = sessionFactory.openSession();
    tx = session.beginTransaction();
    // insert default data sql queries here...
    tx.commit();
    session.close();
    // end of second section
}
The database gets cleared successfully, but the default data is not being inserted into the database. Please let me know what I am doing wrong here. I tried doing both the delete and insert sections in one transaction, with no luck.
I want to insert an object into the database in a transaction, and after that object is saved, I'd like to delete it once a specific operation is done. Can I restart the transaction again, perform the deletion, and then commit? Is this a correct way of doing it?
Example:
Employee employee = new Employee();
String name = "Ronnie";
entityManager.getTransaction().begin();
employee.setName(name);
entityManager.persist(employee);
entityManager.getTransaction().commit();

// After a few steps
entityManager.getTransaction().begin();
entityManager.remove(employee);
entityManager.getTransaction().commit();
SHORT ANSWER: Yes, you can do that without problems.
LONG ANSWER: Yes, you can.
Every transaction is independent of any other transaction. So if you do some operations and commit them (remember, committing a transaction executes the pending operations in the DB and closes the transaction), and then later open a new one, it is independent of the previous transaction.
You can even stay in the same transaction, without closing it, by flushing the changes to the DB:
Employee employee = new Employee();
String name = "Ronnie";
entityManager.getTransaction().begin();
employee.setName(name);
entityManager.persist(employee);
entityManager.flush();

// After a few steps, the transaction is still the same
entityManager.remove(employee);
entityManager.getTransaction().commit();
A transaction isolates the database state from other transactions, so you can insert and delete within the same transaction; there is no need to commit in between.
I have a database with 3 tables: Slideshows, MediaItemsInSlideshows and Mediaitems. I am using this database on a JSP site with Hibernate.
I would like to be able to delete a slideshow without deleting the media items. The rows in MediaItemsInSlideshows should be deleted, though.
Currently I use the following code to remove the slideshow. When I use this, all media items that were used in the slideshow are gone as well.
Session session = HibernateUtil.getSessionFactory().openSession();
Slideshow s = this.getSlideshowById(id, session);
session.beginTransaction();
session.delete(s);
session.getTransaction().commit();
This is a visual representation of the database (diagram not reproduced here).
Deleting A will set the reference to it in B to null, which is forbidden by the schema. An alternative to changing the order of deletions would be to add a reverse one-to-many collection in B with cascaded deletes; only the deletion of A would then be needed.
(source: Deleting of related objects in hibernate)
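To get the behaviour asked for here (join rows removed, media items kept), the cascades can be arranged so that deleting a slideshow only propagates to the join entity. A hedged mapping sketch with illustrative names, since the original mapping is not shown in the question:

import java.util.ArrayList;
import java.util.List;
import javax.persistence.*;

@Entity
class Slideshow {
    @Id @GeneratedValue
    private Long id;

    // Deleting the slideshow cascades to the join rows only...
    @OneToMany(mappedBy = "slideshow", cascade = CascadeType.ALL, orphanRemoval = true)
    private List<MediaItemInSlideshow> items = new ArrayList<MediaItemInSlideshow>();
}

@Entity
class MediaItemInSlideshow {
    @Id @GeneratedValue
    private Long id;

    @ManyToOne
    private Slideshow slideshow;

    // ...but not to the media item itself: no cascade here.
    @ManyToOne
    private MediaItem mediaItem;
}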
I am using Hibernate to update 20K products in my database.
Right now I am pulling in the 20K products, looping through them, modifying some properties, and then updating the database.
so:
load products
foreach products
    session begintransaction
    productDao.MakePersistant(p);
    session commit();
As of now things are pretty slow compared to standard JDBC. What can I do to speed things up? I am sure I am doing something wrong here.
The right place to look in the documentation for this kind of processing is the whole Chapter 13. Batch processing.
There are several obvious mistakes in your current approach:
you should not start/commit the transaction for each update.
you should enable JDBC batching and set it to a reasonable number (10-50):
hibernate.jdbc.batch_size 20
you should flush() and then clear() the session at regular intervals (every n records, where n equals the hibernate.jdbc.batch_size parameter), or the session will keep growing and may eventually blow up with an OutOfMemoryException.
Below is the example given in section 13.2. Batch updates, illustrating this:
Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();

ScrollableResults customers = session.getNamedQuery("GetCustomers")
    .setCacheMode(CacheMode.IGNORE)
    .scroll(ScrollMode.FORWARD_ONLY);
int count=0;
while ( customers.next() ) {
    Customer customer = (Customer) customers.get(0);
    customer.updateStuff(...);
    if ( ++count % 20 == 0 ) {
        //flush a batch of updates and release memory:
        session.flush();
        session.clear();
    }
}

tx.commit();
session.close();
You may also consider using the StatelessSession.
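A hedged sketch of the StatelessSession variant (it has no first-level cache and no automatic dirty checking, so updates must be issued explicitly; the query name is reused from the example above and the setter call is illustrative):

StatelessSession session = sessionFactory.openStatelessSession();
Transaction tx = session.beginTransaction();

ScrollableResults customers = session.getNamedQuery("GetCustomers")
    .scroll(ScrollMode.FORWARD_ONLY);
while ( customers.next() ) {
    Customer customer = (Customer) customers.get(0);
    customer.setName(customer.getName().trim()); // illustrative modification
    session.update(customer); // no dirty checking: the update must be explicit
}

tx.commit();
session.close();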
Another option would be to use DML-style operations (in HQL!): UPDATE FROM? EntityName (WHERE where_conditions)?. Here is the HQL UPDATE example:
Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();

String hqlUpdate = "update Customer c set c.name = :newName where c.name = :oldName";
// or String hqlUpdate = "update Customer set name = :newName where name = :oldName";
int updatedEntities = session.createQuery( hqlUpdate )
    .setString( "newName", newName )
    .setString( "oldName", oldName )
    .executeUpdate();

tx.commit();
session.close();
Again, refer to the documentation for the details (especially how to deal with the version or timestamp property values using the VERSIONED keyword).
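For reference, a hedged example of that VERSIONED form, reusing the names above (it also bumps the version or timestamp property of the matching rows):

String hqlVersionedUpdate = "update versioned Customer set name = :newName where name = :oldName";
int updatedEntities = session.createQuery( hqlVersionedUpdate )
    .setString( "newName", newName )
    .setString( "oldName", oldName )
    .executeUpdate();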
If this is pseudo-code, I'd recommend moving the transaction outside the loop, or at least using a double loop if having all 20K products in a single transaction is too much:
load products
foreach (batch)
{
    try
    {
        tx = session.beginTransaction()
        foreach (product in batch)
        {
            product.saveOrUpdate()
        }
        tx.commit()
    }
    catch (Exception e)
    {
        e.printStackTrace()
        tx.rollback()
    }
}
Also, I'd recommend that you batch your UPDATEs instead of sending each one individually to the database; there's too much network traffic that way. Bundle each chunk into a single batch and send it all at once.
I agree with the answer above about looking at the chapter on batch processing.
I also wanted to add that you should make sure you only load what is necessary for the changes you need to make to the product.
What I mean is: if the product eagerly loads a large number of other objects that are not important for this transaction, you should consider not loading those joined objects. This will speed up the loading of the products and, depending on their persistence strategy, may also save you time when making the products persistent again.
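For instance, with JPA annotations a heavyweight association can be mapped lazily so it is not pulled in with every product (the Supplier association is illustrative, not from the question):

import javax.persistence.*;

@Entity
class Product {
    @Id @GeneratedValue
    private Long id;

    // Not needed when only simple properties change, so don't fetch it eagerly:
    @ManyToOne(fetch = FetchType.LAZY)
    private Supplier supplier;
}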
The fastest possible way to do a batch update would be to convert it to a single SQL statement and execute it as raw SQL on the session. Something like:
update TABLE set x = y where w = z;
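A hedged sketch of issuing such raw SQL through the Session (the table and column names are illustrative):

Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();
int rows = session.createSQLQuery("update PRODUCT set PRICE = PRICE * 1.1 where CATEGORY = :cat")
    .setString("cat", "books")
    .executeUpdate();
tx.commit();
session.close();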
Failing that, you can try to make fewer transactions and do the updates in batches:

start session
start transaction
products = session.getNamedQuery("GetProducts")
    .setCacheMode(CacheMode.IGNORE)
    .scroll(ScrollMode.FORWARD_ONLY);
count = 0;
foreach product
    update product
    if ( ++count % 20 == 0 ) {
        session.flush();
        session.clear();
    }
commit transaction
close session
For more information, look at the Hibernate Community Docs.