I am using MySQL + Spring + Hibernate.
When I execute the following code it creates a new row:
sessionFactory.getCurrentSession()
.save(configTable);
However, this code below updates the existing row:
sessionFactory.getCurrentSession()
.update(configTable);
I am not sure why the first snippet creates a new row; to my understanding it should update the existing row in both cases.
Any idea what I could be missing? Or what info would you need to help me track down the problem?
You need to use the saveOrUpdate() session method:
sessionFactory.getCurrentSession().saveOrUpdate(configTable);
When updating, make sure the entity's primary key is set; Hibernate uses it to decide whether the row already exists.
Also make sure that your entity has this annotation so that Hibernate generates dynamic INSERT and UPDATE statements:
@org.hibernate.annotations.Entity(dynamicInsert = true, dynamicUpdate = true)
(On recent Hibernate versions this annotation is deprecated; use @DynamicInsert and @DynamicUpdate instead.)
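A minimal sketch of how this can look, assuming a hypothetical ConfigTable entity (the class name, table name, and fields are assumptions, not taken from the question):

import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;
import org.hibernate.annotations.DynamicInsert;
import org.hibernate.annotations.DynamicUpdate;

// Hypothetical entity used for illustration only.
@Entity
@Table(name = "config_table")
@DynamicInsert   // generated INSERT only contains non-null columns
@DynamicUpdate   // generated UPDATE only contains changed columns
public class ConfigTable {

    @Id
    private Long id;            // must match an existing row for an UPDATE

    private String configValue; // assumed column

    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }
    public String getConfigValue() { return configValue; }
    public void setConfigValue(String configValue) { this.configValue = configValue; }
}

With the identifier unset, saveOrUpdate() behaves like save() and issues an INSERT; with the identifier set to an existing value, it behaves like update() and an UPDATE is issued at flush time.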
Related
I have been able to use the unwrapped Session to reduce the SQL from SELECT + UPDATE to a single UPDATE for an entity annotated with @DynamicUpdate, but @DynamicUpdate is not honored in that case.
doInJPA(entityManager -> {
    Session session = entityManager.unwrap( Session.class );
    for ( Post post : posts ) {
        session.update( post );
    }
});
I found that @DynamicUpdate only works when using the entityManager.merge pattern for the update, but for a detached entity that leads to two SQL statements: a SELECT followed by an UPDATE.
doInJPA(entityManager -> {
    for ( Post post : posts ) {
        entityManager.merge( post );
    }
});
Can the session-unwrap pattern be made to use the UPDATE statement that entityManager.merge generates, so that I don't have to re-implement @DynamicUpdate myself?
Notes:
The entity's @Id is a String (a UUID.toString() value) and is not a generated field.
I tried implementing Persistable with @Transient isNew() and getId() without success (i.e. I was unable to reduce the SELECT + UPDATE of the entityManager.merge pattern to an UPDATE only).
I have been able to insert with a single INSERT using the entityManager.createNativeQuery(...).executeUpdate() pattern; only the update fires two queries with the merge pattern. The session unwrap + update approach reduces it to one UPDATE, but @DynamicUpdate does not work there.
I also tried @Transactional at a higher scope so that the GET + UPDATE happen in the same transaction, without success.
@DynamicUpdate only works on managed entities (which requires the entity to be SELECTed at least once), because Hibernate relies on comparing the current state against a snapshot that is created when doing that SELECT.
If you want real dynamic updates, I can recommend you take a look at Blaze-Persistence Entity-Views, which supports that: https://persistence.blazebit.com/documentation/entity-view/manual/en_US/index.html#updatable-entity-views
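To illustrate the managed-entity point, here is a minimal sketch (the setTitle/getTitle accessors are assumptions about the Post class): loading the entity first gives Hibernate a snapshot to compare against, so @DynamicUpdate can restrict the UPDATE to the changed columns. Note that this does not remove the SELECT; it only shows why the snapshot is required.

doInJPA(entityManager -> {
    for (Post detached : posts) {
        // SELECT: the entity becomes managed and a state snapshot is taken
        Post managed = entityManager.find(Post.class, detached.getId());

        // copy over the fields that actually changed (assumed accessor names)
        managed.setTitle(detached.getTitle());

        // at commit, dirty checking compares against the snapshot and
        // @DynamicUpdate emits e.g. "update post set title=? where id=?"
    }
});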
I work on a Java project and I have to write a new module in order to copy some data from one database to another (same tables).
I have an entity Contrat containing several fields and the following field :
@OneToMany(mappedBy = "contrat", fetch = FetchType.LAZY)
@Fetch(FetchMode.SUBSELECT)
@Cascade({ org.hibernate.annotations.CascadeType.ALL, org.hibernate.annotations.CascadeType.DELETE_ORPHAN })
@BatchSize(size = 50)
private Set<MonElement> elements = new HashSet<MonElement>();
I must read some "Contrat" objects from a database and write them in another database.
I am hesitating between two solutions:
use JDBC to query the first database, get the objects, and then write them into the second database (paying attention to the order and the different keys). That would be long.
since the project already uses Hibernate and contains all the Hibernate mapping classes, open a first session to the source database, read the Contrat object, set the ids of the child elements to null, and write the object to the destination database with a second session. That should be quicker.
I wrote a test class for the second use case and the process fails with the following exception :
org.hibernate.HibernateException: Don't change the reference to a
collection with cascade="all-delete-orphan"
I think the reference must change when I set the ids to null, but I am not sure: I don't understand how changing a field of a collection member can change the collection reference.
Note that if I remove DELETE_ORPHAN from the configuration, everything works: all the objects and their dependencies are written to the database.
So I would like to use the Hibernate solution, which is faster, but I have to keep the DELETE_ORPHAN feature because the application currently relies on it to ensure that every MonElement removed from the elements Set is also deleted from the database.
I don't need this feature in my module, but I cannot remove it.
Also, I need to set the MonElement ids to null in order to generate new ones, because their ids in the first database may already exist in the target database.
Here is the code I wrote which works well when I remove the DELETE_ORPHAN option.
SessionFactory sessionFactory = new AnnotationConfiguration().configure("/hibernate.cfg.src.xml").buildSessionFactory();
Session session = sessionFactory.openSession();
// search the Contrat object
Criteria crit = session.createCriteria(Contrat.class);
CriteriaUtil.addEqualCriteria(crit, "column", "65465454");
Contrat contrat = (Contrat)crit.list().get(0);
session.close();
SessionFactory sessionFactoryDest = new AnnotationConfiguration().configure("/hibernate.cfg.dest.xml").buildSessionFactory();
Session sessionDest = sessionFactoryDest.openSession();
Transaction transaction = sessionDest.beginTransaction();
// setting id to null, also for the elements in the elements Set
contrat.setId(null);
for (MonElement element : contrat.getElements()) {
    element.setId(null);
}
// writing the object in the database
sessionDest.save(contrat);
transaction.commit();
sessionDest.flush();
sessionDest.close();
This is way faster than managing the queries and the primary/foreign keys and dependencies between objects myself.
Does anyone have an idea how to get rid of this exception?
Or maybe I should change the state of the Set.
In fact I'm not trying to delete any element of this Set; I just want them to be considered as new objects.
If I don't find a solution, I will do something dirty : duplicate all hibernate entity objects in my new project and remove the DELETE_ORPHAN parameter in the newly created Contrat.
So the application will continue using its mapping and my new project will use my specific mapping. But I want to avoid that.
Thanks
A correct solution was given by crizzis as a comment on my question.
I quote him:
I'd try wrapping contrat.elements in a new collection (contrat.setElements(new HashSet<>(contrat.getElements()))) before trying to persist the contract with the new session
It works well.
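Applied to the copy code above, the key change is the first line below (a sketch using the same variable names as in the question):

// Replace Hibernate's PersistentSet (which still carries the orphan-tracking
// state from the source session) with a plain HashSet holding the same elements.
contrat.setElements(new HashSet<MonElement>(contrat.getElements()));

contrat.setId(null);
for (MonElement element : contrat.getElements()) {
    element.setId(null);
}
sessionDest.save(contrat);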
I'm developing a Java app with a MySQL database, JPA objects and an EntityManagerFactory with EclipseLink to manage the database. Everything works OK, but I have an issue.
One of my JPA objects is like this
public class JPAObject1 {

    @Id
    @GeneratedValue
    private int id;

    @OneToMany(//things here)
    List<JPAObject2> list1;
    ...
}
So the id field will be autogenerated when I store the object in the database. Assuming em is an EntityManager and object is a JPAObject1:
em.getTransaction().begin();
em.persist(object);
em.getTransaction().commit();
//house work closing things
The JPAObject1 is added correctly; I can see all its fields in my database. As the id field is the key used for the find operation, my question is:
Is there a way to get the last added object on the EntityManager on just the moment it is added?
I have other objects that use the JPAObject1 id field as a foreign key, and I need that value as soon as the object is added to the database in order to link the others. The only way I know to get it is to fetch all the JPAObject1 rows and take the last one in the collection. With a few objects that won't be a problem, but if one process inserts into the database and another does the same before process 1 runs its findAll to get the last added row, there will be a coherence error...
I think I've explained it well.
Thanks a lot!
You can use this code:
JPAObject1 en = new JPAObject1();
// ... set whatever fields you need ...
em.persist(en);
em.flush();
System.out.println(en.getId());
The id is generated after the flush.
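Building on that, a minimal sketch of how the freshly assigned id can be used to link a related object in the same transaction (the setParent1Id setter on JPAObject2 is a hypothetical foreign-key field for illustration; with a mapped @ManyToOne you would set the object itself instead):

em.getTransaction().begin();

JPAObject1 parent = new JPAObject1();
em.persist(parent);
em.flush();                          // INSERT runs here, so the id is populated

JPAObject2 child = new JPAObject2();
child.setParent1Id(parent.getId());  // hypothetical FK field
em.persist(child);

em.getTransaction().commit();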
Note that the rows saved in the database form a set, not a list, so they have no inherent order and you can't get the last one you added. If you want to, add a column such as a date/time, and the query will look like:
" SELECT * FROM Table ORDER BY dateColumn DESC LIMIT 1"
I am new to Hibernate. I am using SessionFactory to get the session for the transaction, and one way I found after searching sets a few fields using setParameter, i.e.
Query query = getCurrentSession().createSQLQuery(
"UPDATE table_name set field1=:f1 where ID=:id");
query.setParameter("f1", f1);
query.setParameter("id", id);
but I want to update the whole row. I have already set the values on the entity object; is there a way to pass the values of the whole entity to the database based on the id? The id is the primary key of the table I want to update.
If you already have all the data present in the Hibernate entity object, then just call the session directly:
getCurrentSession().save(myEntity);
to create a new object, or
getCurrentSession().update(myEntity);
to update an existing row.
If you're not sure, you can use:
getCurrentSession().saveOrUpdate(myEntity);
Take a look at Session#update (or saveOrUpdate). This will allow you to persist a complete, mapped object to the database.
To stay as object-oriented as you can, fetch the entity with session.get(entityClass, id);
Then, after modifying the object via its setters, save it back to the DB using the update method: session.update(entity);
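A short sketch of that load-modify-update flow, assuming a hypothetical MyEntity class with a field1 property:

Session session = getCurrentSession();

// SELECT by primary key; the returned instance is managed by the session
MyEntity entity = session.get(MyEntity.class, id);

// change whatever fields you need via the setters (assumed accessor name)
entity.setField1(f1);

// explicit update() is a no-op for an instance the session already manages;
// dirty checking would also flush the change when the transaction commits
session.update(entity);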
I have a couple of objects that are mapped to tables in a database using Hibernate, BatchTransaction and Transaction. BatchTransaction's table (batch_transactions) has a foreign key reference to transactions, named transaction_id.
In the past I have used a batch runner that used internal calls to run the batch transactions and complete the reference from BatchTransaction to Transaction once the transaction is complete. After a Transaction has been inserted, I just call batchTransaction.setTransaction(txn), so I have a @ManyToOne mapping from BatchTransaction to Transaction.
I am changing the batch runner so that it executes its transactions through a Web service. The ID of the newly inserted Transaction will be returned by the service and I'll want to update transaction_id in BatchTransaction directly (rather than using the setter for the Transaction field on BatchTransaction, which would require me to load the newly inserted item unnecessarily).
It seems like the most logical way to do it is to use SQL rather than Hibernate, but I was wondering if there's a more elegant approach. Any ideas?
Here's the basic mapping.
BatchQuery.java
@Entity
@Table(name = "batch_queries")
public class BatchQuery
{
    private Query mQuery;

    @ManyToOne
    @JoinColumn(name = "query_id")
    public Query getQuery()
    {
        return mQuery;
    }
}
Query.java
@Entity
@Table(name = "queries")
public class Query
{
}
The idea is to update the query_id column in batch_queries without setting the "query" property on a BatchQuery object.
Using a direct SQL update, or an HQL update, is certainly feasible.
Without seeing the full problem, it looks to me like you might be making a change to your domain that is worth reflecting in the model itself: you may be moving to a BatchTransaction that holds just the transaction id as a member rather than the full Transaction.
If, in other activities, BatchTransaction still needs to hydrate that Transaction, I'd consider adding a separate mapping for the transaction id and making that the managing mapping (mark the Transaction association as insertable = false, updatable = false).
If BatchTransaction will no longer be concerned with the full Transaction, just remove that association after adding the transaction id field.
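A sketch of that dual mapping, using the BatchQuery/Query names from the question (the surrogate id, the Long type of the foreign key, and field-level access are assumptions for illustration):

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.JoinColumn;
import javax.persistence.ManyToOne;
import javax.persistence.Table;

@Entity
@Table(name = "batch_queries")
public class BatchQuery {

    @Id
    @GeneratedValue
    private Long id;        // assumed surrogate key

    // Writable scalar mapping: set this directly with the id returned by the
    // web service, without loading the Query entity.
    @Column(name = "query_id")
    private Long queryId;

    // Read-only association: still allows navigating to the full Query, but
    // never writes the column, because the scalar field above manages it.
    @ManyToOne
    @JoinColumn(name = "query_id", insertable = false, updatable = false)
    private Query query;

    public Long getQueryId() { return queryId; }
    public void setQueryId(Long queryId) { this.queryId = queryId; }

    public Query getQuery() { return query; }
}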
As you have written, you can use SQL to solve the above problem, but I would suggest not updating primary keys via SQL.
Since you are changing the key, you are effectively creating an altogether new object. For this, you can first delete the existing object with the previous key and then insert a new object with the updated key (in your case, transaction_id).