I am working on a personal project, and I have a strange issue which I can't seem to solve even after many hours of research and debugging, so obviously it must be something very simple I'm overlooking.
Anyway, the context is: JPA + PostgreSQL + GlassFish.
I have an entity (generated by NetBeans), MvUser, with:
@Id
@Basic(optional = false)
@NotNull
@Column(name = "id")
@SequenceGenerator(name = "mv_user_autoincrement_gen", sequenceName = "mv_user_autoincrement", allocationSize = 1)
@GeneratedValue(strategy = GenerationType.IDENTITY, generator = "mv_user_autoincrement_gen")
private Long id;
Then I have an AbstractFacade with generics for all the boilerplate persistence code.
Here I have a method which doesn't do much, just:
@Override
public void create(T entity) {
    getEntityManager().persist(entity);
}
Now, let's say I call this in my service class.
First I inject my facade:
@EJB
IMvUserFacade userFacade;
Then I use it:
@Override
public void saveUser(MvUser user)
{
    userFacade.create(user);
    // more business-specific code follows
}
I call the service like this:
MvUser user = new MvUser();
// ... setters etc.
mvUserService.saveUser(user);
Now, what is happening is that inside the create method the object is persisted; I get the generated id and everything.
Because the same object reference is passed along the whole chain, I assumed that back at the saveUser level I would still be holding the same (managed) object, but no, I am left with a detached entity.
What am I doing wrong?
Thanks.
If I understand you correctly, you want to look up the object you saved in create(). I think the object is not yet persisted to the database; what you retrieve is the object from the first-level cache (http://www.tutorialspoint.com/hibernate/hibernate_caching.htm), since you are still within the transaction. If you retrieve the object as follows, and saveUser() is not running within a transaction, you will get the user object from the database (but detached, since you are outside the transaction):
public void saveUser(MvUser user)
{
    userFacade.create(user);
    MvUser persisted = userFacade.get(user.getId());
    // more business-specific code follows
}
Related
I'm working on a Spring Boot application that uses JPA (Hibernate) for the persistence layer.
I'm currently implementing a migration feature. We basically dump all the existing entities of the system into an XML file; this export includes the ids of the entities as well.
The problem I'm having is on the other side: re-importing the existing data. In this step the XML gets transformed back into Java objects and persisted to the database.
When trying to save the entity, I'm using the merge method of the EntityManager class, which works: everything is saved successfully.
However, when I turn on Hibernate's query logging, I see that before every insert query a select query is executed to check whether an entity with that id already exists. This is because the entity already has an id that I provided.
I understand this behavior and it actually makes sense. However, I'm sure the ids will not exist yet, so the select is pointless in my case. I'm saving thousands of records, which means thousands of select queries against large tables, and that slows down the import drastically.
My question: Is there a way to turn this "checking if an entity exists before inserting" off?
Additional information:
When I use entityManager.persist() instead of merge, I get this exception:
org.hibernate.PersistentObjectException: detached entity passed to persist
To be able to use a supplied/provided id I use this id generator:
@Id
@GeneratedValue(generator = "use-id-or-generate")
@GenericGenerator(name = "use-id-or-generate", strategy = "be.stackoverflowexample.core.domain.UseIdOrGenerate")
@JsonIgnore
private String id;
The generator itself:
public class UseIdOrGenerate extends UUIDGenerator {

    private String entityName;

    @Override
    public void configure(Type type, Properties params, ServiceRegistry serviceRegistry) throws MappingException {
        entityName = params.getProperty(ENTITY_NAME);
        super.configure(type, params, serviceRegistry);
    }

    @Override
    public Serializable generate(SessionImplementor session, Object object) {
        // Use the id already assigned to the entity if present;
        // otherwise fall back to generating a fresh UUID.
        Serializable id = session
                .getEntityPersister(entityName, object)
                .getIdentifier(object, session);
        if (id == null) {
            return super.generate(session, object);
        } else {
            return id;
        }
    }
}
If you are certain that you will never be updating any existing row in the database and that all entities should always be freshly inserted, then I would go for the persist operation instead of merge.
Regarding the update to the question:
In that case (the id field being set up as auto-generated), the only way would be to remove the generation annotations from the id field and leave the configuration as:
@Id
@JsonIgnore
private String id;
So basically the id is always assigned manually. The persistence provider will then consider your entity transient even when the id is present, meaning persist will work and no extra selects will be generated.
I'm not sure whether you fill in the ID or not. In case you fill it in on the application side, check the answer here; I copied it below:
Here is the code of Spring's SimpleJpaRepository that you are using when you go through a Spring Data repository:
@Transactional
public <S extends T> S save(S entity) {
    if (entityInformation.isNew(entity)) {
        em.persist(entity);
        return entity;
    } else {
        return em.merge(entity);
    }
}
It does the following:
By default Spring Data JPA inspects the identifier property of the given entity. If the identifier property is null, then the entity will be assumed as new, otherwise as not new.
Link to Spring Data documentation
So if one of your entities has a non-null ID field, Spring will make Hibernate do a merge (and therefore a SELECT beforehand).
You can override this behavior in the two ways listed in the same documentation. An easy way is to make your entity implement Persistable (instead of Serializable), which forces you to implement the isNew() method.
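As an illustration, here is a minimal sketch of the Persistable approach, assuming a javax.persistence setup; the ImportedEntity name, the isNew flag and the callback method are my own, not from the question:

import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.PostLoad;
import javax.persistence.PostPersist;
import javax.persistence.Transient;

import org.springframework.data.domain.Persistable;

@Entity
public class ImportedEntity implements Persistable<String> {

    @Id
    private String id;

    // Not a column: only used to decide between persist() and merge().
    @Transient
    private boolean isNew = true;

    @Override
    public String getId() {
        return id;
    }

    // The entity counts as "new" until it has been loaded from or written to
    // the database, so save() goes straight to persist() and skips the select.
    @Override
    public boolean isNew() {
        return isNew;
    }

    @PostLoad
    @PostPersist
    void markNotNew() {
        this.isNew = false;
    }
}

With something like this in place, save() calls persist() even though the id is already set, so the extra select disappears.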
I am currently working on a medium-sized, desktop-based administration and configuration tool implemented in Java using JavaFX, Google Guice, and Hibernate as the JPA implementation.
Until now I got away with a single EntityManager injected as a @Singleton, meaning that I had this EntityManager "open" from startup to shutdown. All loaded entities were permanently known in the context, and I barely had any problems with this approach, although I know it is not the best solution (it was easy, and I had no time to redesign the application).
Now the application is being extended and I have to use multiple persistence units simultaneously.
I could try to get my current singleton approach working by using something like:
@Inject
@PersistenceContext(name = "JPA-Unit1")
@Singleton
private EntityManager em;
The singleton approach never felt perfect, and this feels outright "ugly". And since I had severe problems getting multiple persistence contexts working with Guice, I had to do a lot of research on the topic.
I came across several blogs and SO questions mentioning either that an instance of the EntityManager should only live as long as it is needed, or extended persistence contexts.
Since I use JavaFX, I use the *Property classes to bind the data directly into the UI.
Simplified user entity (property-based access):
@Entity
@Table(name = "USERS")
@NamedQuery(name = "User.findAll", query = "SELECT u FROM User u")
public class User implements Serializable {
    [...]
    private final SimpleStringProperty loginProperty = new SimpleStringProperty();

    public User() {
    }

    public String getLogin() {
        return this.loginProperty.get();
    }

    public void setLogin(String login) {
        this.loginProperty.set(login);
    }

    public SimpleStringProperty loginProperty() {
        return this.loginProperty;
    }
    [...]
}
If I start editing the user data in the UI, it gets updated directly in the entity:
this.login.textProperty().bindBidirectional(user.loginProperty());
There is no need for extensive "business logic"; it is all handled via (input) validation. If all input is valid, I simply save the data via
userService.update(user);
Parts of the UserService (more precisely, its abstract superclass):
public abstract class AbstractService<PK extends Serializable, Type> implements GenericService<PK, Type> {

    protected Class<Type> clazz;

    @PersistenceContext(name = "JPA-Unit1")
    @Inject
    protected Provider<EntityManager> emProvider;

    public AbstractService(Class<Type> clazz) {
        this.clazz = clazz;
    }

    @Transactional
    @Override
    public Type create(Type entity) {
        this.emProvider.get().persist(entity);
        return entity;
    }

    @Transactional
    @Override
    public Type update(Type entity) {
        this.emProvider.get().persist(entity);
        return entity;
    }
}
As you can see, the service class is pretty straightforward. I could even delete all these "service" classes and use the EntityManager directly in my UI controllers.
In this service you can see the "problem": the user I am editing was loaded earlier by its named query and put into a list. That loading is also done in a @Transactional method.
But every time I call this.emProvider.get() I get a new instance with an empty context. And if I want to save the previously edited user, persist actually performs an insert (I assume because the entity is not known in the context, i.e. detached), which leads to a PK-constraint violation; or, if I null out its ID property, a new user row gets inserted.
My actual questions are:
1. Is this approach "OK"? If yes, what do I do with this always-new persistence context? Call contains and merge every single time?
2. Should I get rid of my service classes and implement the persistence operations directly in my UI controllers?
3. Can I call this.emProvider.get() once the user UI controller is loaded and use that instance for the entire lifetime of the application?
4. Or something totally different?
My understanding is that your app uses Guice Persist.
The answer to this question depends on your use cases; however, you absolutely need to realize one thing:
For as long as an EntityManager is open, its underlying persistence context tracks every single change to each persistent entity.
This means that if you keep an entity manager open for the duration of the application, whenever you call e.g. User.setLogin(), the change you just made is already regarded as persistent. Now, moving to your update method: calling persist on an entity that is already managed has no effect; however, since you're calling it from a @Transactional method, Guice wraps the call in a transaction, and consequently all the changes are flushed to the database once the method ends.
This means that if you modify multiple entities at once within your app, and then call AbstractService.update on one of them, you will actually be saving all the changes your app has done to other entities in the meantime, even if AbstractService.update has not been called on them explicitly.
Using the entity manager-per-transaction approach is indeed much safer. Between transactions, there will be no open persistence context, and as a result all the entities will become detached, which will prevent any updates on them from accidentally being flushed to the database.
However, for the same reason, your update method will need to call em.merge on the entity you want to update in the database. merge is basically telling the entity manager 'please put this entity back into the persistence context, and make it have the exact state that the provided entity has'. Calling persist makes it look as though it was a new entity, and PK-constraint violations will indeed follow.
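As a minimal sketch against the AbstractService shown in the question (same Guice Persist assumptions as above), the update method could look like this; note that merge() returns the managed copy, which is the instance callers should keep using afterwards:

@Transactional
@Override
public Type update(Type entity) {
    // Re-attach the detached entity: merge() copies its state into the
    // persistence context and returns the managed instance.
    return this.emProvider.get().merge(entity);
}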
I know that when using Wicket with JPA frameworks it is not advisable to serialize entities that have already been persisted to the database (because of problems with lazy fields and to save space). In such cases we are supposed to use LoadableDetachableModel. But what about the following use-case?
Suppose we want to create a new entity (say, a Contract) which will consist, among other things, of already-persisted entities (say, a Client which is selected from a list of clients stored in the DB). The entity under creation is the model object of some Wicket component (say, a Wizard). At the end (when we finish the wizard) we save the new entity to the DB. So my question is: what is the best generic solution to the serialization problem for such model objects? We can't use an LDM because the entity is not in the DB yet, but we also don't want our inner entities (like Client) to be serialized whole.
My idea was to implement a custom Wicket serializer that checks whether the object is an entity and whether it is persisted. If so, store only its id; otherwise use the default serialization. Similarly, when deserializing, use the stored id to fetch the entity from the DB, or deserialize using the default mechanism. I'm not sure, though, how to do that in a generic way. My next thought was that if we can do this, then we do not need any LDM anymore; we can just store all our entities in simple org.apache.wicket.model.Model models and our serialization logic will take care of them, right?
Here's some code:
@Entity
public class Client {
    String clientName;

    @ManyToOne(fetch = FetchType.LAZY)
    ClientGroup group;
}

@Entity
public class Contract {
    Date date;

    @ManyToOne(fetch = FetchType.LAZY)
    Client client;
}

public class ContractWizard extends Wizard {
    ContractWizard(String markupId, IModel<Contract> model) {
        super(markupId);
        setDefaultModel(model);
    }
}
Contract contract = DAO.createEntity(Contract.class);
ContractWizard wizard = new ContractWizard("wizard", ?);
How do we pass the contract? If we just say Model.of(contract), the whole contract will be serialized along with the inner client (and it can be big); moreover, if we access contract.client.group after deserialization we can run into this problem: https://en.wikibooks.org/wiki/Java_Persistence/Relationships#Serialization.2C_and_Detaching
So I wonder how people go about solving such issues, I'm sure it's a fairly common problem.
I guess there are 2 approaches to your problem:
a.) Only save the stuff the user actually sees in Models. In your example that might be "contractStartDate", "contractEndDate", List of clientIds. That's the main approach if you don't want your DatabaseObjects in your view.
b.) Write your own LoadableDetachableModel and make sure you only serialize transient objects. For example (assuming that any negative id means the object is not yet saved to the database):
public class MyLoadableDetachableModel extends LoadableDetachableModel {

    private Object myObject;
    private Integer id;

    public MyLoadableDetachableModel(Object myObject) {
        this.myObject = myObject;
        this.id = myObject.getId();
    }

    @Override
    protected Object load() {
        if (id < 0) {
            return myObject;
        }
        return myObjectDao.getMyObjectById(id);
    }

    @Override
    protected void onDetach() {
        super.onDetach();
        id = myObject.getId();
        if (id >= 0) {
            myObject = null;
        }
    }
}
The downside of this is that you'll have to make your DatabaseObjects Serializable, which is not really ideal and can lead to all kinds of problems. You would also need to decouple the references to other entities from the transient object by using a ListModel.
Having worked with both approaches, I personally prefer the first. From my experience, the whole business of injecting DAO objects into Wicket can lead to disaster. :) I would only use this in view-only projects that aren't too big.
Most projects I know of just accept serializing referenced entities (e.g. your Clients) along with the edited entity (Contract).
Using conversations (keeping a Hibernate/JPA session open over several requests) is a nice alternative for applications with complex entity relations:
The Hibernate session and its entities are kept separate from the page and are never serialized. The component just keeps an identifier to fetch its conversation.
I have an entity class in my Enterprise Java application that has an entity listener attached to it:
@Entity
@EntityListeners(ChangeListener.class)
public class MyEntity {

    @Id
    private long id;
    private String name;
    private Integer result;
    private Boolean dirty;
    ...
}
However, I would like the entity listener to be triggered for all fields except the boolean one. Is there any way to exclude a field from triggering the entity listener without making it transient?
I'm using Java EE 5 with Hibernate.
It is possible, however, if you implement your own solution. I had the same need for an audit-log business requirement, so I designed my own AuditField annotation and applied it to the fields to be audit-logged.
Here's an example in one entity bean, Site:
@AuditField(exclude = {EntityActionType.DELETE})
@Column(name = "site_code", nullable = false)
private String siteCode;
So the example indicates that siteCode is a field to audit-log, except for the DELETE action. (EntityActionType is an enum and it contains the CRUD operations.)
Also, the entity listener has this piece of code:
@PostPersist
public void created(Site pEntity) {
    log(pEntity, EntityActionType.CREATE);
}

@PreUpdate
public void updated(Site pEntity) {
    log(pEntity, EntityActionType.UPDATE);
}

@PreRemove
public void deleted(Site pEntity) {
    log(pEntity, EntityActionType.DELETE);
}
Now what log() has to do is figure out which fields are to be audit-logged and, optionally, which actions are excluded for each of them (see the sketch below).
However, there's another thing to consider.
If you put the annotation on a field that is itself another entity, which fields of that entity have to be logged (i.e. chained logging)?
It's your choice whether to log only what is annotated with @AuditField in that entity or to handle it some other way. In my case, we decided to log only the entity ID, which is the PK of the DB table. However, I wanted to keep it flexible, assuming the business requirements can change. So all the entities must implement an auditValue() method, which comes from a base entity class, and the default (overridable) implementation is to return the ID.
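For illustration, here is a minimal sketch of what log() could look like, assuming the custom @AuditField annotation has runtime retention and its exclude() attribute returns an EntityActionType[] as shown above (imports of java.lang.reflect.Field and java.util.Arrays are assumed; the audit store itself is left out):

private void log(Object entity, EntityActionType action) {
    // Scan the entity's declared fields and pick out the ones marked for auditing.
    for (Field field : entity.getClass().getDeclaredFields()) {
        AuditField audit = field.getAnnotation(AuditField.class);
        if (audit == null || Arrays.asList(audit.exclude()).contains(action)) {
            continue; // not auditable, or this action is excluded for the field
        }
        field.setAccessible(true);
        try {
            Object value = field.get(entity);
            // write the field name, its value and the action to the audit store here
        } catch (IllegalAccessException e) {
            throw new IllegalStateException(e);
        }
    }
}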
There is some mixing of concepts here. EntityListeners are not notified about changes in attribute values, neither for a single attribute nor for all attributes.
For a reason they are called lifecycle callbacks: they are triggered by the following lifecycle events of an entity:
persist (pre/post)
load (post)
update (pre/post)
remove (pre/post)
For each of them there is a matching annotation (see the sketch below). So the answer is that it is not possible to limit this functionality to particular persistent attributes.
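For reference, here is a bare-bones sketch of the listener class named in the question, showing the callback annotations that correspond to the events listed above; each method receives the whole entity and gets no information about which attributes changed (the method names are my own):

import javax.persistence.PostLoad;
import javax.persistence.PrePersist;
import javax.persistence.PreRemove;
import javax.persistence.PreUpdate;

public class ChangeListener {

    @PrePersist
    void beforePersist(MyEntity entity) { /* react to the insert */ }

    @PostLoad
    void afterLoad(MyEntity entity) { /* react to the load */ }

    @PreUpdate
    void beforeUpdate(MyEntity entity) { /* react to the update, whichever field changed */ }

    @PreRemove
    void beforeRemove(MyEntity entity) { /* react to the removal */ }
}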
Is there any way to avoid having JPA automatically persist objects?
I need to use a third-party API, and I have to pull/push data from/to it. I've got a class responsible for interfacing with the API, and it has a method like this:
public User pullUser(int userId) {
Map<String,String> userData = getUserDataFromApi(userId);
return new UserJpa(userId, userData.get("name"));
}
Where the UserJpa class looks like:
@Entity
@Table
public class UserJpa implements User
{
    @Id
    @Column(name = "id", nullable = false)
    private int id;

    @Column(name = "name", nullable = false, length = 20)
    private String name;

    public UserJpa() {
    }

    public UserJpa(int id, String name) {
        this.id = id;
        this.name = name;
    }
}
When I call the method (e.g. pullUser(1)), the returned user is automatically stored in the database. I don't want this to happen; is there a way to avoid it? I know a solution could be to create a new class implementing User and return an instance of that class from pullUser(); is this good practice?
Thank you.
A newly created instance of UserJpa is not persisted in pullUser(). I also assume there is not some odd implementation in getUserDataFromApi() that actually persists something for the same id.
In your case the entity manager knows nothing about the new instance of UserJpa. Generally, entities are persisted via explicit merge/persist calls or as a result of a cascaded merge/persist operation (see the example below). Check for these elsewhere in your code base.
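To illustrate the cascade case, here is a hypothetical example (the PurchaseOrder entity is not from the question): if some other entity holds a cascading reference to UserJpa and is itself persisted or merged, the new UserJpa gets saved along with it without any explicit persist() call on the user.

@Entity
public class PurchaseOrder {

    @Id
    private int id;

    // CascadeType.PERSIST (or ALL) propagates persist() from the order to its user.
    @ManyToOne(cascade = CascadeType.PERSIST)
    private UserJpa user;
}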
The only way in which a new entity gets persisted in JPA is by explicitly calling the EntityManager's persist() or merge() methods. Look in your code for calls to either of them; that is the point where the persist operation occurs. Refactor the code to perform the persistence elsewhere.
Generally, JPA objects are managed objects: they reflect their changes to the database when the transaction completes (and, before that, to the first-level cache). Obviously these objects need to become managed in the first place.
I really think the best practice is to use a DTO to handle the data transfer and use the entity only for persistence purposes; that way you get higher cohesion and lower coupling, with no object poking its nose where it shouldn't. A sketch of the idea follows.
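As a minimal sketch of the DTO idea (the UserDto class and the toEntity() helper are hypothetical names; I leave the User interface out of it since its methods aren't shown in the question):

public class UserDto {

    private final int id;
    private final String name;

    public UserDto(int id, String name) {
        this.id = id;
        this.name = name;
    }

    public int getId() { return id; }

    public String getName() { return name; }

    // Map to the managed entity type only at the point where you actually
    // intend to persist the data.
    public UserJpa toEntity() {
        return new UserJpa(id, name);
    }
}

pullUser() would then return a UserDto, which the persistence provider never manages, and the mapping to UserJpa happens only in the code path that really saves users.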
Hope it helps.