I have an entity which looks something like this: (I'm typing this directly into the web page, so I apologize for any mistakes)
@Entity
public class Entity {
    @Id
    private Long id;
    private String field;
    // Insert getters and setters here...
}
I try to manipulate it using reflection:
Long id = 1L;
Entity entity = myDao.getEntity(id);
entity.setField("set directly");
Field[] fields = entity.getClass().getDeclaredFields();
for (Field f : fields) {
    if (f.getName().equals("field")) {
        f.setAccessible(true);
        f.set(entity, "set using reflection");
        f.setAccessible(false);
    }
}
System.out.println(entity.getField());
This program prints "set using reflection". However, in the database the value set using reflection does not get updated:
SELECT * FROM ENTITY WHERE ID = 1
ID FIELD
1 set directly
This is strange. I could swear that this used to work, but now it doesn't. Is it really so that you cannot manipulate entities using reflection?
I'm using EclipseLink 1.1.1 if that matters.
Changing values of an entity class via reflection is going to be fraught with issues. This is because you're dealing with a class which is persistent, and thus the persistence API needs to know about changes to the fields.
If you make changes via reflection, chances are the persistence API will not know about those changes.
A better solution would be to call the setters via reflection.
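For example, a minimal sketch of invoking the setter reflectively instead of writing to the field (this assumes the standard JavaBean setter name; exception handling is omitted):

// Invoke setField(String) via reflection so that any logic or
// instrumentation inside the setter still runs.
Method setter = entity.getClass().getMethod("setField", String.class);
setter.invoke(entity, "set using reflection");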
I'm pretty sure the Entity you are given by your persistence framework is actually wrapped in another class (possibly the same with stuff tacked on through reflection). Changing the field directly through reflection seems unlikely to work. You might want to check if there's a (generated) setter that you can use. Although if you're going that route one might ask why you don't allow callers to call the setter directly?
Your class might be instrumented and the setters responsible for recording changes. I'm not familiar enough with EclipseLink to check whether the class returned by myDao.getEntity(id) is your actual class or a sub-class generated by EclipseLink.
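For what it's worth, a quick (purely illustrative) way to check what you actually got back:

// Print the runtime class of the returned instance; a woven or proxied
// entity typically shows up as a generated sub-class here.
Entity entity = myDao.getEntity(id);
System.out.println(entity.getClass().getName());
System.out.println(entity.getClass().getSuperclass().getName());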
Related
I'm wondering what's the best practice when using a PUT method to update a specific property of an entity stored in DB.
Let's take, for example, the following JSON that is received by the REST controller:
{"id":1, "surname":"Doe"}
The entity that we have stored looks something like this:
public class Employee {
Long id;
String name;
String surname;
Date createdAt;
Date updatedAt;
}
I omitted the annotations for simplicity.
What I'd like is for the RestController to receive something like this:
@PutMapping
public Employee updateEmployee(@RequestBody Employee employee) {
    return repo.saveAndFlush(employee);
}
So, if I do this, the existing name and timestamp fields will be set to null, because the provided entity doesn't contain those fields.
I'm wondering if there's a way to run the following actions:
Load the entity with the ID provided on DB
Update the fields provided in the Json/Request Body.
Persist the updated entity -> This can be done the same way I've shown in the code.
I'm aware of @JsonIdentityInfo and @JsonIdentityReference(alwaysAsId = true), which I use in conjunction with resolvers to fetch a nested entity from the DB when only its ID is provided rather than the entity itself.
The PATCH method is designed for exactly that functionality.
PUT should be used when you are replacing the whole resource - that means setting null on fields that you didn't provide in request.
PATCH is used for updating a resource: you can update a single field or all the fields, your choice.
Be aware that the actual database update may not automagically work just because you changed the HTTP method. For Hibernate there is @DynamicUpdate, which provides the same functionality. Without @DynamicUpdate the fields set to null will be updated too; with @DynamicUpdate only the fields that were modified will be updated.
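A minimal sketch of that load-then-update flow, assuming a Spring Data JpaRepository named repo (as in the question), Spring Data 2.x's findById, and standard getters/setters on Employee; the endpoint path and method names are illustrative only:

@PatchMapping("/employees/{id}")
public Employee patchEmployee(@PathVariable Long id, @RequestBody Employee patch) {
    // Load the managed entity first, then copy over only the fields
    // actually provided in the request body.
    Employee existing = repo.findById(id)
            .orElseThrow(() -> new EntityNotFoundException("Employee " + id));
    if (patch.getName() != null) {
        existing.setName(patch.getName());
    }
    if (patch.getSurname() != null) {
        existing.setSurname(patch.getSurname());
    }
    // createdAt/updatedAt and any other untouched fields keep their current values.
    return repo.saveAndFlush(existing);
}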
I have a unidirectional relation Project -> ProjectType:
@Entity
public class Project extends NamedEntity
{
    @ManyToOne(optional = false)
    @JoinColumn(name = "TYPE_ID")
    private ProjectType type;
}

@Entity
public class ProjectType extends Lookup
{
    @Min(0)
    private int progressive = 1;
}
Note that there's no cascade.
Now, when I insert a new Project I need to increment the type progressive.
This is what I'm doing inside an EJB, but I'm not sure it's the best approach:
public void create(Project project)
{
    em.persist(project);
    /* is it necessary to merge the type? */
    ProjectType type = em.merge(project.getType());
    /* is it necessary to set the type again? */
    project.setType(type);
    int progressive = type.getProgressive();
    type.setProgressive(progressive + 1);
    project.setCode(type.getPrefix() + progressive);
}
I'm using EclipseLink 2.6.0, but I'd like to know if there's an implementation-independent best practice and/or if there are behavioral differences between persistence providers in this specific scenario.
UPDATE
to clarify the context when entering the EJB create method (it is invoked by a JSF @ManagedBean):
project.projectType is DETACHED
project is NEW
no transaction (I'm using JTA/CMT) is active
I am not asking about the difference between persist() and merge(). I'm asking:
whether em.persist(project) automatically "reattaches" project.projectType (I suppose not)
whether the call order is legal: first em.persist(project), then em.merge(projectType), or whether it should be inverted
since em.merge(projectType) returns a different instance, whether it is required to call project.setType(managedProjectType)
An explanation of why it works one way and not another is also welcome.
You need merge(...) only to make a transient entity managed by your entity manager. Depending on the implementation of JPA (not sure about EclipseLink) the returned instance of the merge call might be a different copy of the original object.
MyEntity unmanaged = new MyEntity();
MyEntity managed = entityManager.merge(unmanaged);
assert(entityManager.contains(managed)); // true if everything worked out
assert(managed != unmanaged); // probably true, depending on JPA impl.
If you call merge(entity) where entity is already managed, nothing will happen.
Calling persist(entity) will also make your entity managed, but it returns no copy. Instead, the original instance itself becomes managed, and persist might also call an ID generator (e.g. a sequence), which is not necessarily the case when using merge.
See this answer for more details on the difference between persist and merge.
Here's my proposal:
public void create(Project project) {
    ProjectType type = project.getType(); // maybe check if null
    if (!entityManager.contains(type)) {  // type is transient
        type = entityManager.merge(type); // or load the type
        project.setType(type);            // update the reference
    }
    int progressive = type.getProgressive();
    type.setProgressive(progressive + 1); // mark as dirty, update on flush
    // set "code" before persisting "project" ...
    project.setCode(type.getPrefix() + progressive);
    entityManager.persist(project);
    // ... now no additional UPDATE is required after the
    // INSERT on "project".
}
UPDATE
if em.persist(project) automatically "reattach" project.projectType (I suppose not)
No. You'll probably get an exception (Hibernate throws one, anyway) stating that you're trying to merge with a transient reference.
Correction: I tested it with Hibernate and got no exception. The project was created with the unmanaged project type (which was managed and then detached before persisting the project). But the project type's progressive was not incremented, as expected, since it wasn't managed. So yeah, make it managed before persisting the project.
if it is legal the call order: first em.persist(project) then em.merge(projectType) or if it should be inverted
It's best practice to do so. But when both statements are executed within the same batch (before the entity manager gets flushed), it may even work the other way round (merging the type after persisting the project). In my test it worked anyway. But as I said, it's better to merge the entities before persisting new ones.
since em.merge(projectType) returns a different instance, if it is required to call project.setType(managedProjectType)
Yes. See example above. A persistence provider may return the same reference, but it isn't required to. So to be sure, call project.setType(mergedType).
Do you need to merge? Well it depends. According to merge() javadoc:
Merge the state of the given entity into the current persistence context.
How did you get the instance of ProjectType that you attach to your Project? If that instance is already managed then all you need to do is
type.setProgressive(type.getProgressive() + 1);
and JPA will automatically issue an update effective on next context flush.
Otherwise if the type is not managed then you need to merge it first.
Although not directly related, this question has some good insight about persist vs merge: JPA EntityManager: Why use persist() over merge()?
Regarding the call order of em.persist(project) vs em.merge(projectType), you should probably ask yourself what should happen if the type is gone from the database. If you merge the type first, it will get re-inserted; if you persist the project first and you have an FK constraint, the insert will fail (because it's not cascading).
In this code, merge basically stores the merged record in a different object. Let's say there is an Account POJO:
Account account = new Account();
account = entityManager.merge(account);
Then you can work with the result of this.
But in your code you are using merge in a different situation:
public void create(Project project)
{
    em.persist(project);
    /* is it necessary to merge the type? */
    ProjectType type = em.merge(project.getType());
}
Here Project and ProjectType are two different POJOs; you would use merge on the same POJO you are persisting, or, if there is a relationship between your POJOs, you can also use it there.
Say I have the following Java class, which is owned by a vendor so I can't change it:
public class Entry {
private String user;
private String city;
// ...
// About 10 other fields
// ...
// Getters, setters, etc.
}
I would like to persist it to a table, using JPA 2.0 (OpenJPA implementation). I cannot annotate this class (as it is not mine), so I'm using orm.xml to do that.
I'm creating a table containing a column per field, plus another column called ID. Then, I'm creating a sequence for it.
My question is: is it at all possible to tell JPA that the ID that I would like to use for this entity doesn't even exist as a member attribute in the Entry class? How do I go about creating a JPA entity that will allow me to persist instances of this class?
EDIT
I am aware of the strategy of extending the class and adding an ID property to it. However, I'm looking for a solution that doesn't involve extending this class, because I need it to also be applicable when it's not just one class that I have to persist but a collection of interlinked classes, none of which has an ID property. In such a scenario, extending doesn't work out.
Eventually, I ended up doing the following:
public class EntryWrapper {
    @Id
    private long id;
    @Embedded
    private Entry entry;
}
So, I am indeed wrapping the entity but differently from the way that had been suggested. As the Entry class is vendor-provided, I did all its ORM work in an orm.xml file. When persisting, I persist EntryWrapper.
I don't have much experience with JPA, but I wouldn't extend your base classes; instead, I would wrap them:
public class PersistMe<T> {
    @Id
    private long id;
    private T objToWrap;

    public PersistMe(T objToWrap) {
        this.objToWrap = objToWrap;
    }
}
I can't test it, if it doesn't work let me know so I can delete the answer.
I'm implementing several DAO classes for a web project and for some reasons I have to use JDBC.
Now I'd like to return an entity like this:
public class Customer {
    // instead of int userId
    private User user;
    // instead of int activityId
    private Activity act;
    // ...
}
Using JPA, user and activity would be loaded easily (and automatically, by specifying relations between entities).
But how, using JDBC? Is there a common way to achieve this? Should I load everything in my CustomerDAO? Is it possible to implement lazy initialization for referenced entities?
My first idea was to implement in my UserDAO:
public void initUser(Customer customer);
and in my ActivityDAO:
public void initActivity(Customer customer);
to initialize variables in customer.
Active Record route
You could do this with AspectJ ITDs and essentially make your entities into Active Record-like objects.
Basically, you make an Aspect that advises classes implementing interfaces called "HasUser" and "HasActivity". Your HasUser and HasActivity interfaces will just define getters.
You will then make Aspects that will weave in the actual implementation of getUser() and getActivity().
Your aspects will do the actual JDBC work. Although the learning curve on AspectJ is initially steep it will make your code far more elegant.
You can take a look at one of my answers on an AspectJ ITD Stack Overflow post.
You should also check out Spring's @Configurable, which will autowire your dependencies (such as your DataSource or JdbcTemplate) into beans not managed by Spring.
Of course, the best place to see this in action is Spring Roo. Just look at the AspectJ files it generates to get an idea (granted, Roo uses JPA) of how you would use @Configurable (make sure to use the active record annotation).
DAO Route
If you really want to go the DAO route, then you need to do this:
public class Customer {
    // instead of int userId
    private Integer userId;
    // instead of int activityId
    private Integer activityId;
}
This is because in the DAO pattern your entity objects are not supposed to have behavior. Your services and/or DAOs will have to create transfer objects, onto which you could attach the lazy loading.
I'm not sure whether there is any automated approach for this. Without an ORM I usually define lazy getters: my reference-type fields are initialized to null by default, i.e. my fetching function loads primitives and Strings and leaves the references as null. Once I need getUser(), the getter checks whether the field is null and, if so, issues another select statement based on the customer's user ID.
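A minimal sketch of that lazy-getter idea, assuming a hypothetical UserDAO.findById(int) and that the initial SELECT stores the foreign-key value in a userId field:

public class Customer {
    private int userId;     // loaded by the initial SELECT
    private User user;      // left null until first access

    private final UserDAO userDao;

    public Customer(UserDAO userDao) {
        this.userDao = userDao;
    }

    public User getUser() {
        // Lazy initialization: only hit the database on first access.
        if (user == null) {
            user = userDao.findById(userId); // second SELECT, by user ID
        }
        return user;
    }
}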
I'm observing a very strange behaviour with an entity class and loading an object of this class with JPA (Hibernate EntityManager 3.3.1.ga). The class has an (embedded) field that is initialized in the declaration. The setter of the field implements a null check (i.e. it would throw an exception when a null value is set).
...
@Entity
public class Participant extends BaseEntity implements Comparable<Participant> {
    ...
    @Embedded
    private AmsData amsData = new AmsData();

    public void setAmsData(AmsData amsData) {
        Checks.verifyArgNotNull(amsData, "amsdata");
        this.amsData = amsData;
    }
    ...
}
When I load this object with JPA, the field is null if there is no data in the DB for the columns of the embedded object.
...
public class ParticipantJpaDao implements ParticipantDao {
    @PersistenceContext
    private EntityManager em;

    @Override
    public Participant getParticipant(Long id) {
        return em.find(Participant.class, id);
    }
    ...
}
I debugged the process with a watchpoint on the field (should halt when the field is accessed or modified), and I see one modification when the field is initialized, but when I get the result from the find call, the field is null.
Can anybody explain why this is so? How can I ensure that the field is not null even when there is no data for the embedded object's fields in the DB (besides setting it manually after the find call)?
The JPA specification doesn't explicitly say how to handle a set of columns representing an embeddable object which are all empty. It could signal a null reference, or an object instance with all null fields. Hibernate chooses a null reference in this case, though other JPA implementations may pick the latter.
The reason why your setter is never called is because Hibernate is accessing your field via reflection, bypassing the setter you implemented. It's doing this because you utilize field-based access rather than property-based access.
Chad's answer would provide the functionality you're looking for, but there is a caveat (see below).
"...The persistent state of an entity
is accessed by the persistence
provider runtime[1] either via
JavaBeans style property accessors or
via instance variables. A single
access type (field or property access)
applies to an entity hierarchy. When
annotations are used, the placement of
the mapping annotations on either the
persistent fields or persistent
properties of the entity class
specifies the access type as being
either field- or property-based access
respectively..." [ejb3 persistence
spec]
So by moving the annotations onto the accessors, you are telling JPA that you want to use property-based access instead of field-based access. You should know, however, that field-based access, as you currently implement it, is preferred over property-based access. There are a couple of reasons why property-based access is discouraged, but one is that you're forced to add getters and setters for all of your persistent entity fields, and you may not want those same fields susceptible to mutation by external clients. In other words, using JPA's property-based access forces you to weaken your entity's encapsulation.
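For illustration, a minimal sketch of what property-based access would look like here, moving the mapping annotation onto the getter (names taken from the question; whether this is desirable depends on the encapsulation concerns above):

@Entity
public class Participant extends BaseEntity implements Comparable<Participant> {
    ...
    private AmsData amsData = new AmsData();

    // Annotating the getter instead of the field selects property-based
    // access, so the provider goes through the accessors rather than
    // writing the field directly via reflection.
    @Embedded
    public AmsData getAmsData() {
        return amsData;
    }

    public void setAmsData(AmsData amsData) {
        Checks.verifyArgNotNull(amsData, "amsdata");
        this.amsData = amsData;
    }
    ...
}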
The answer is (thanks to rcampell): if all the data of an embedded object is null (in the DB), the embedded object will also be null, even when it is initialized in the declaration. The only solution seems to be setting the object manually.
@Override
public Participant getParticipant(Long id) {
    Participant participant = em.find(Participant.class, id);
    if (participant != null && participant.getAmsData() == null) {
        participant.setAmsData(new AmsData());
    }
    return participant;
}
Still feels strange to me ...
Well, it's possible that your object could be getting constructed twice behind the scenes. JPA implementations will usually set those fields directly.
I think you need to put the annotations on the getters and setters themselves if you want them to be used. See this answer:
Empty constructors and setters on JPA Entites
It's 2018 now and I had the same problem in a similar situation.
Using your code as an example, I solved the problem like this:
@Entity
public class Participant extends BaseEntity implements Comparable<Participant> {
    ...
    @Embedded
    private AmsData amsData = new AmsData();

    public void setAmsData(AmsData amsData) {
        Checks.verifyArgNotNull(amsData, "amsdata");
        this.amsData = amsData;
    }

    public AmsData getAmsData() {
        if (amsData == null) {
            amsData = new AmsData();
        }
        return amsData;
    }
    ...
}
I was having the same problem. I just added getters and setters using the @Getter and @Setter Lombok annotations and it started working.
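For reference, a minimal sketch of what that might look like on the embeddable class (the field shown is hypothetical, purely for illustration):

import lombok.Getter;
import lombok.Setter;

@Getter
@Setter
@Embeddable
public class AmsData {
    private String someValue; // hypothetical field
}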