touch equivalent for Hibernate entity - java

I'd like to implement a repository method void touch(MyEntity myEntity) which forces an SQL UPDATE of the entity's columns to their current values. (The reason behind this is an ON UPDATE trigger which needs to be invoked at a certain point of execution.) The ideal use case is:
void serviceMethod(Long myEntityId) {
    MyEntity myEntity = myEntityRepository.findOne(myEntityId);
    ...
    myEntityRepository.touch(myEntity);
    ...
}
There are already similar questions on SO which don't work for me: Force update in Hibernate (my entity is detached), Implementing “touch” on JPA entity? (making some harmless change works, but it is not general and hurts code readability), Hibernate Idempotent Update (similar example).
I am aware of the session interceptor method findDirty and also of CustomEntityDirtinessStrategy, both described in this article by Vlad Mihalcea. However, it seems that to use findDirty I would have to override the session interceptor, which is not possible from within a repository method since the interceptor is a final field assigned to the session at session creation. And CustomEntityDirtinessStrategy comes from the SessionFactory, which is global. I rather need some one-shot solution to temporarily consider one concrete entity of one concrete class dirty.
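For illustration only: such a strategy would look roughly like the sketch below. This is exactly what I want to avoid (it is SessionFactory-wide), and the ForceTouch marker interface is something I would have to invent; the strategy is registered via the hibernate.entity_dirtiness_strategy property.

import org.hibernate.CustomEntityDirtinessStrategy;
import org.hibernate.Session;
import org.hibernate.persister.entity.EntityPersister;

// hypothetical marker interface for entities that should be forcibly updated
interface ForceTouch {
    boolean isTouchRequested();
    void clearTouchRequest();
}

public class ForceTouchDirtinessStrategy implements CustomEntityDirtinessStrategy {

    @Override
    public boolean canDirtyCheck(Object entity, EntityPersister persister, Session session) {
        // take over dirty checking only for entities we explicitly mark
        return entity instanceof ForceTouch;
    }

    @Override
    public boolean isDirty(Object entity, EntityPersister persister, Session session) {
        // reporting the entity as dirty makes flush() schedule an UPDATE
        return ((ForceTouch) entity).isTouchRequested();
    }

    @Override
    public void resetDirty(Object entity, EntityPersister persister, Session session) {
        ((ForceTouch) entity).clearTouchRequest();
    }

    @Override
    public void findDirty(Object entity, EntityPersister persister, Session session,
                          DirtyCheckContext dirtyCheckContext) {
        // treat every attribute as dirty so all columns end up in the UPDATE
        dirtyCheckContext.doDirtyChecking(attributeInformation -> true);
    }
}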
The best working solution so far is to put an invalid entity snapshot (an array of nulls) into the persistence context, so that the subsequent logic in flush() evaluates the entity as differing from its snapshot and forces the update. This works:
@Override
@Transactional
public void touch(final T entity) {
    SessionImpl session = (SessionImpl) em.getDelegate();
    session.update(entity);
    StatefulPersistenceContext pctx = (StatefulPersistenceContext) session.getPersistenceContext();
    Serializable id = session.getIdentifier(entity);
    EntityPersister persister = session.getEntityPersister(null, entity);
    EntityKey entityKey = session.generateEntityKey(id, persister);
    int length = persister.getPropertyNames().length;
    Field entitySnapshotsByKeyField = FieldUtils.getField(pctx.getClass(), "entitySnapshotsByKey", true);
    Map<EntityKey, Object> entitySnapshotsByKey = (Map<EntityKey, Object>) ReflectionUtils.getField(entitySnapshotsByKeyField, pctx);
    entitySnapshotsByKey.put(entityKey, new Object[length]);
    session.flush();
    em.refresh(entity);
}
The advice in Force update in Hibernate didn't work for me because session.evict(entity) removes the entitySnapshotsByKey entry entirely, which causes the subsequent org.hibernate.event.internal.DefaultFlushEntityEventListener#getDatabaseSnapshot to load a fresh entity from the database. That question is 9 years old and I'm not sure it still applies to the current version of Hibernate (mine is 5.2.17).
I am not satisfied with such a hacky solution, though. Is there some straightforward way, or something simpler I could do?

Related

@Transactional annotation, Spring Boot 2.0 and Hibernate LazyInitializationException

I have the following question. From what I understand, the @Transactional annotation is supposed to keep the session alive, thus enabling lazy fetching of child entities without the need to perform a specific join query.
I have the following scenario where I do not understand why I'm still getting a LazyInitializationException.
My app runs a resolver in order to provide the various controller services with a resolved object so that it can be used directly.
Said resolver intercepts a header from the request and, using its value, attempts to query the db in order to fetch the object. Now, the object in question is quite simple in its doings, albeit it has a list of two sub-entities.
In order to perform the resolving action I'm using an extra service where I basically wrap some JpaRepository methods. The complete service is below:
@Service
public class AppClientServiceImpl implements AppClientService {

    private static final Logger LOGGER = LoggerFactory.getLogger(AppClientServiceImpl.class.getCanonicalName());

    private final AppClientRepository repository;

    @Autowired
    public AppClientServiceImpl(AppClientRepository repository) {
        this.repository = repository;
    }

    @Override
    @Transactional(readOnly = true)
    public AppClient getByAppClientId(final String appClientId) {
        LOGGER.debug("Attempting to retrieve appClient with id:: {}", appClientId);
        return repository.findByAppClientId(appClientId);
    }

    @Override
    @Transactional
    public void saveAndFlush(final AppClient appClient) {
        LOGGER.debug("Attempting to save/update appClient:: {}", appClient);
        repository.saveAndFlush(appClient);
    }
}
As you can see, both methods are annotated with @Transactional, meaning that they should keep the session alive in the context of that method.
Now, my main questions are the following:
1) Using the debugger, I'm seeing that even at that level, in getByAppClientId, the lazily loaded list containing the sub-entities has been resolved just fine.
2) On the resolver itself, where the object has been received from the delegating method, the list fails to be evaluated due to a LazyInitializationException.
3) Finally, on the final controller service method, which is also marked as @Transactional, the same as above occurs, meaning that it eventually fails to do its job (since it performs a get on the list that has failed to initialize).
Based on all the above, I would like to know what the best approach to handling this is. For one, I do not want to use an eager fetch type, and I would also like to avoid using fetch queries. Marking my resolver as @Transactional, thus keeping the session open there as well, is also out of the question.
I thought that @Transactional would keep the session open, thus enabling the final service method to obtain the list of sub-entities. This seems not to be the case.
Based on all the above, it seems that I need a way for the final service method that gets called (which needs the list at hand) to fetch it somehow.
What would be the best approach to handle this? I've read quite a few posts here, but I cannot make out which is the most accepted method as of Spring Boot 2.0 and Hibernate 5.
Update:
It seems that annotating the sub-entity with the following:
@Fetch(FetchMode.SELECT)
@LazyCollection(LazyCollectionOption.TRUE)
resolves the problem, but I still don't know whether this is the best approach.
You initialize the collection by debugging. The debugger usually represents collections in a special way, using the collection methods, which triggers the initialization, so that might be the reason why it seems to work fine during debugging. I suppose the resolver runs outside the scope of getByAppClientId? At that point the session is closed, which is why you see the exception.
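One common workaround (just a sketch, not part of the original answer; getSubEntities() is a placeholder for whatever accessor AppClient actually has) is to touch the lazy collection while the transaction is still open, so it is initialized before the entity leaves the service:

@Override
@Transactional(readOnly = true)
public AppClient getByAppClientId(final String appClientId) {
    AppClient client = repository.findByAppClientId(appClientId);
    if (client != null) {
        // force-load the lazy collection before the session closes
        org.hibernate.Hibernate.initialize(client.getSubEntities());
    }
    return client;
}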
I created Blaze-Persistence Entity Views for exactly that use case. You essentially define DTOs for JPA entities as interfaces and apply them on a query. It supports mapping nested DTOs, collections, etc., essentially everything you'd expect, and on top of that it will improve your query performance, as it will generate queries fetching just the data that you actually require for the DTOs.
The entity views for your example could look like this
@EntityView(AppClient.class)
interface AppClientDto {
    String getName();
}
Querying could look like this
List<AppClientDto> dtos = entityViewManager.applySetting(
        EntityViewSetting.create(AppClientDto.class),
        criteriaBuilderFactory.create(em, AppClient.class)
).getResultList();

How to work with "long living entities" or "longer living persistence context"?

I am currently working on a medium-sized, desktop-based administration and configuration tool implemented in Java using JavaFX, google-guice, and Hibernate as its JPA implementation.
Until now I got away with a single EntityManager injected as a @Singleton, meaning that I had this EntityManager "open" from startup to shutdown. All loaded entities were permanently known in the context and I barely had any problems with this approach, although I know/believe it is not the best solution (but it was easy and I had no time to redesign the application).
Now the application is being extended and I have to use multiple persistence units simultaneously.
I could try to get my current singleton approach working using something like:
@Inject
@PersistenceContext(name = "JPA-Unit1")
@Singleton
private EntityManager em;
It never felt perfect, but that feels "ugly". And since I had severe problems getting multiple persistence contexts working with Guice, I had to do a lot of research on this topic.
And I came across several blogs and SO questions either mentioning that an instance of the EntityManager should only live as long as it is needed, or mentioning extended persistence contexts.
Since I use JavaFX, I use the *Property classes to bind the data directly into the UI.
Simplified user entity (property-based access):
@Entity
@Table(name = "USERS")
@NamedQuery(name = "User.findAll", query = "SELECT u FROM User u")
public class User implements Serializable {
    [...]
    private final SimpleStringProperty loginProperty = new SimpleStringProperty();

    public User() {
    }

    public String getLogin() {
        return this.loginProperty.get();
    }

    public void setLogin(String login) {
        this.loginProperty.set(login);
    }

    public SimpleStringProperty loginProperty() {
        return this.loginProperty;
    }
    [...]
}
If I start editing the user data in the UI, it gets directly updated in the entity:
this.login.textProperty().bindBidirectional(user.loginProperty());
There is no need for extensive "business logic". It is all handled via (input) validation. If all input is valid, I simply save the data via
userService.update(user);
Parts of the UserService (more precisely: its abstract super-class):
public abstract class AbstractService<PK extends Serializable, Type> implements GenericService<PK, Type> {

    protected Class<Type> clazz;

    @PersistenceContext(name = "JPA-Unit1")
    @Inject
    protected Provider<EntityManager> emProvider;

    public AbstractService(Class<Type> clazz) {
        this.clazz = clazz;
    }

    @Transactional
    @Override
    public Type create(Type entity) {
        this.emProvider.get().persist(entity);
        return entity;
    }

    @Transactional
    @Override
    public Type update(Type entity) {
        this.emProvider.get().persist(entity);
        return entity;
    }
}
As you can see, the service class is pretty straightforward. I could even delete all these "service" classes and use the EntityManager directly in my UI controller.
In this service you can see the "problem": the user I edit was loaded earlier by its named query and put into a list. The loading is also done in a @Transactional method.
But every time I call this.emProvider.get() I get a new instance with an empty context. And if I want to save the previously edited user, I have the problem that persist actually performs an insert (I assume because the entity is not known in the context, i.e. detached), which leads to a PK-constraint violation; or, if I clear (null) its ID property, a new user row is inserted.
My actual questions are:
1. Is this approach "OK"? If yes, what do I do with this "always new" persistence context? Call contains and merge every single time?
2. Should I get rid of my service classes and implement the persistence operations directly in my UI controller?
3. Can I do this.emProvider.get() once the User UI controller gets loaded and use it for the entire lifetime of the application?
4. Something totally different?
My understanding is that your app uses Guice Persist.
The answer to this question depends on your use cases; however, you absolutely need to realize one thing:
For as long as an EntityManager is open, its underlying persistence context tracks every single change to each persistent entity.
This means that if you keep an entity manager open for the duration of the application, whenever you call e.g. User.setLogin(), the change you just made is already regarded as persistent. Now, moving to your update method: calling persist on an entity that is already managed has no effect; however, since you're calling it from a @Transactional method, Guice wraps the call in a transaction, and consequently all the changes are flushed to the database once the method ends.
This means that if you modify multiple entities at once within your app, and then call AbstractService.update on one of them, you will actually be saving all the changes your app has done to other entities in the meantime, even if AbstractService.update has not been called on them explicitly.
Using the entity manager-per-transaction approach is indeed much safer. Between transactions, there will be no open persistence context, and as a result all the entities will become detached, which will prevent any updates on them from accidentally being flushed to the database.
However, for the same reason, your update method will need to call em.merge on the entity you want to update in the database. merge is basically telling the entity manager 'please put this entity back into the persistence context, and make it have the exact state that the provided entity has'. Calling persist makes it look as though it was a new entity, and PK-constraint violations will indeed follow.
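As a sketch, under the assumption that you keep your AbstractService and its Provider<EntityManager>, the update method could then look like this:

@Transactional
@Override
public Type update(Type entity) {
    // merge re-attaches the detached instance and returns a managed copy;
    // callers should continue working with the returned instance
    return this.emProvider.get().merge(entity);
}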

JPA: handling merge() of a relationship

I have a unidirectional relation Project -> ProjectType:
@Entity
public class Project extends NamedEntity
{
    @ManyToOne(optional = false)
    @JoinColumn(name = "TYPE_ID")
    private ProjectType type;
}

@Entity
public class ProjectType extends Lookup
{
    @Min(0)
    private int progressive = 1;
}
Note that there's no cascade.
Now, when I insert a new Project I need to increment the type progressive.
This is what I'm doing inside an EJB, but I'm not sure it's the best approach:
public void create(Project project)
{
    em.persist(project);
    /* is it necessary to merge the type? */
    ProjectType type = em.merge(project.getType());
    /* is it necessary to set the type again? */
    project.setType(type);
    int progressive = type.getProgressive();
    type.setProgressive(progressive + 1);
    project.setCode(type.getPrefix() + progressive);
}
I'm using EclipseLink 2.6.0, but I'd like to know whether there's an implementation-independent best practice and/or whether there are behavioral differences between persistence providers in this specific scenario.
UPDATE
To clarify the context when entering the EJB create method (it is invoked by a JSF @ManagedBean):
project.projectType is DETACHED
project is NEW
no transaction (I'm using JTA/CMT) is active
I am not asking about the difference between persist() and merge(); I'm asking:
whether em.persist(project) automatically "reattaches" project.projectType (I suppose not)
whether it is legal to call em.persist(project) first and then em.merge(projectType), or whether the order should be inverted
since em.merge(projectType) returns a different instance, whether it is required to call project.setType(managedProjectType)
An explanation of "why" it works one way and not another is also welcome.
You need merge(...) only to make a transient entity managed by your entity manager. Depending on the implementation of JPA (not sure about EclipseLink) the returned instance of the merge call might be a different copy of the original object.
MyEntity unmanaged = new MyEntity();
MyEntity managed = entityManager.merge(unmanaged);
assert(entityManager.contains(managed)); // true if everything worked out
assert(managed != unmanaged); // probably true, depending on JPA impl.
If you call merge(entity) where entity is already managed, nothing will happen.
Calling persist(entity) will also make your entity managed, but it returns no copy. Instead it makes the original object itself managed, and it might also call an ID generator (e.g. a sequence), which is not the case when using merge.
See this answer for more details on the difference between persist and merge.
Here's my proposal:
public void create(Project project) {
    ProjectType type = project.getType();   // maybe check if null

    if (!entityManager.contains(type)) {    // type is transient
        type = entityManager.merge(type);   // or load the type
        project.setType(type);              // update the reference
    }

    int progressive = type.getProgressive();
    type.setProgressive(progressive + 1);   // mark as dirty, update on flush

    // set "code" before persisting "project" ...
    project.setCode(type.getPrefix() + progressive);
    entityManager.persist(project);
    // ... now no additional UPDATE is required after the
    // INSERT on "project".
}
UPDATE
if em.persist(project) automatically "reattach" project.projectType (I suppose not)
No. You'll probably get an exception (Hibernate throws one, anyway) stating that you're trying to merge with a transient reference.
Correction: I tested it with Hibernate and got no exception. The project was created with the unmanaged project type (which was managed and then detached before persisting the project). But the project type's progressive was not incremented, as expected, since it wasn't managed. So yeah, manage it before persisting the project.
if it is legal the call order: first em.persist(project) then em.merge(projectType) or if it should be inverted
It's best practice to do so. But when both statements are executed within the same batch (before the entity manager gets flushed), it may even work (merging the type after persisting the project). In my test it worked, anyway. But as I said, it's better to merge the entities before persisting new ones.
since em.merge(projectType) returns a different instance, if it is required to call project.setType(managedProjectType)
Yes. See example above. A persistence provider may return the same reference, but it isn't required to. So to be sure, call project.setType(mergedType).
Do you need to merge? Well, it depends. According to the merge() javadoc:
Merge the state of the given entity into the current persistence context.
How did you get the instance of ProjectType that you attach to your Project? If that instance is already managed, then all you need to do is just
type.setProgressive(type.getProgressive() + 1)
and JPA will automatically issue an update effective on the next context flush.
Otherwise if the type is not managed then you need to merge it first.
Although not directly related, this question has some good insight about persist vs merge: JPA EntityManager: Why use persist() over merge()?
Regarding the call order of em.persist(project) vs em.merge(projectType), you should probably ask yourself what should happen if the type is gone from the database. If you merge the type first, it will get re-inserted; if you persist the project first and you have an FK constraint, the insert will fail (because it's not cascading).
In this code, merge basically stores the record in a different object. Let's say there is an Account POJO:
Account account = null;
account = entityManager.merge(account);
Then you can store the result of this.
But in your code you are using merge in a different situation, like:
public void create(Project project)
{
    em.persist(project);
    /* is it necessary to merge the type? */
    ProjectType type = em.merge(project.getType());
}
Here, Project and ProjectType are two different POJOs; you can use merge for the same POJO.
Or, if there is a relationship between your POJOs, then you can also use it there.

JPA, removing an entity which has been found by a different manager

Assume we have a simple entity bean, like the following:
@Entity
public class Schemes implements Serializable {
    ...
    @Id private long id;
    ...
}
I find a record using the find method and it works perfectly. The problem is that I cannot manipulate (remove) it with another EntityManager later. For example, I find it in one method, and later I want to remove it; what is the problem?! If I find it with the same manager again I can remove it, but if the object has been found by another manager I cannot.
@ManagedBean @SessionScoped
class JSFBean {

    private Schemes s;

    public JSFBean() {
        ....
        EntityManager em; //.....
        s = em.find(Schemes.class, 0x10L); // okay!
        ....
    }

    public void remove() { // later
        ....
        EntityManager em; //.....
        em.getTransaction().begin();
        em.remove(s); // Error! some weird error, it throws IllegalArgumentException!
        em.getTransaction().commit();
        ....
    }
}
many thanks.
You are probably getting a java.lang.IllegalArgumentException: Removing a detached instance.
The two EMs do not share a persistence context and for the second EM, your object is considered detached. Trying to remove a detached object will result in an IllegalArgumentException.
You can refetch the entity before the removal:
Schemes originalS = em.find(Schemes.class, s.getId());
em.remove(originalS);
EDIT: You can also delete the entity without fetching it first by using a parameterized bulk query:
DELETE FROM Schemes s WHERE s.id = :id
Be aware that bulk queries can cause problems of their own. First, they bypass the persistence context, meaning that whatever you do with a bulk query will not be reflected by the objects in the persistence context. This is less of an issue for delete queries than for update queries. Secondly, if you have defined any cascading rules on your entities, they will be ignored by a bulk query.
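Run through the EntityManager, the bulk delete above would look roughly like this (just a sketch; it assumes you manage the transaction yourself, as in your remove() method, and that Schemes has a getId() accessor):

em.getTransaction().begin();
// executes the bulk delete directly in the database, bypassing the persistence context
int deleted = em.createQuery("DELETE FROM Schemes s WHERE s.id = :id")
                .setParameter("id", s.getId())
                .executeUpdate();
em.getTransaction().commit();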

JPA managed entities vs JavaFX properties

My current project is done using JavaFX. I use properties to bind (bidirectionally) view fields to beans (with BeanPathAdapter from JFXtras).
I chose to use JPA with ObjectDB as the model.
This is the first time I have used JPA in a standalone project, and here I'm facing the problem of managed entities.
Actually, I bind managed entities to view fields, and when the value of a view field changes, the entity is updated... and the database as well.
I'm trying to find a way to manually persist/merge an entity so I can ask the user whether he wants to save or not.
Here's the code I use to get the list:
EntityManagerFactory emf = Persistence.createEntityManagerFactory("$objectdb/data/db.odb");
EntityManager em = emf.createEntityManager();
List<XXX> entities = em.createQuery("SELECT x FROM XXX x").getResultList();
So when I do
entity.setName("test");
the entity is updated in the database.
What I'm looking for is a way to keep the entity from being updated automatically.
I tried (just after the getResultList)
em.clear();
or
em.detach(entity);
but it loses the related instances, even with CascadeType.DETACH.
I also tried
em.setFlushMode(FlushModeType.COMMIT);
but it still updates automatically...
I also tried to clone the object. But when I want to merge it, it gives me an exception:
Attempt to reuse an existing primary key value
I thought of an alternative solution: use a variable as a 'buffer' and fill the managed bean from the buffer if the user saves. But then BeanPathAdapter loses its point. It's the same as filling the view fields manually and filling the bean fields manually after saving.
Could you help me find a solution?
Thanks,
Smoky
EDIT:
I'm answering my own question :p
After 3 hours of research, I found a solution.
The 'cloning' solution was the 'best' of the ones I quoted, but I don't think it's the best possible one.
The cause of the exception was the code I used to persist/merge my entity. I was persisting a non-managed entity with an already existing id. I thought I was merging...
I wrote a generic method so I don't fail again:
public <T extends IEntity> T persist(T object) {
    em.getTransaction().begin();
    if (object.getId() == null) {
        em.persist(object);
        em.flush();
        em.getTransaction().commit();
        em.refresh(object);
    }
    else {
        object = em.merge(object);
        em.getTransaction().commit();
    }
    return object;
}
So the solution: when I have to bind the entity to the view, I use entity.clone() so I can use the entity as non-managed and merge it when I want.
But if you have a proper solution, I'm interested :)
Thanks again
In addition to the solution above, standard solutions are:
Use detached objects in the model and then merge them into the EntityManager.
Use managed objects in the model, keeping the EntityManager open (with no detach/merge).
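A rough sketch of the first option, under assumptions matching the code earlier in the question (the XXX entity, emf from Persistence.createEntityManagerFactory, and id as a placeholder for the primary key value):

// load with a short-lived EntityManager, then close it so the entity is detached
EntityManager em = emf.createEntityManager();
XXX entity = em.find(XXX.class, id);
em.close();

// bind the detached entity to the JavaFX view; edits stay local to the object

// when the user chooses to save, merge the changes back in a fresh EntityManager
EntityManager em2 = emf.createEntityManager();
em2.getTransaction().begin();
entity = em2.merge(entity);
em2.getTransaction().commit();
em2.close();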
