Dozer nulls fields existing only in the dest instead of ignoring - java

I have a Pojo:
class Pojo{
String name;
String surname;
}
and a jpa/hibernate entity:
class Entity{
Long id;
String code;
List<EntityPojo> ep;
}
finally the entity pojo
class EntityPojo{
Long id;
String name;
String surname;
}
In my code I do:
//I receive a pojo from the rest call and I want to update
Entity entity = repo.findByCode("aCode");
//at this point my entity pojo IDs are correctly filled!
dozerMapper.map(pojo, entity);
//after the map the IDs are null!!
repo.save(entity); //BAM!
As you can see, the mapping just deletes the IDs, which leads to a constraint exception in Hibernate... why is that?

You didn't provide code that isolates and reproduces the problem.
From what you provided, here are at least a few things to check:
1) What id is in the pojo instance? If it's null, there's your result. This could be useful: Exclude Mapping Null Values
2) Check access levels. I'm not sure how Dozer currently behaves if fields are not public and have no public accessor methods.
3) Did you verify that the fields in the entity are null, and not just in the database? If you did, why does this question have the spring-data-jpa tag and code related to the persistence layer? If you didn't, why are you sure that something is wrong with Dozer and not with your database mapping?
In general, you don't seem to have worked on the question yourself at all; if you had, you would at least have produced an example with the database persistence layer stripped out, after checking that it's not where you lose your fields. It's really hard to answer a question with a mix of technologies where the problem could be anywhere.
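The "Exclude Mapping Null Values" option mentioned in point 1 can also be set up through Dozer's API. A minimal sketch (not from the question), assuming Dozer's org.dozer.loader.api.BeanMappingBuilder and TypeMappingOptions are available:
DozerBeanMapper mapper = new DozerBeanMapper();
mapper.addMapping(new BeanMappingBuilder() {
    @Override
    protected void configure() {
        // mapNull(false): null fields in the source are skipped,
        // so values already present in the destination are left untouched
        mapping(Pojo.class, Entity.class, TypeMappingOptions.mapNull(false));
    }
});
mapper.map(pojo, entity);
Note that this only helps when a source field exists and is null; what Dozer does with fields that exist only on the destination side is a separate question.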

Related

Bean validation @ElementCollection and @Version conflict and fails validation

I am facing a very strange issue at the moment.
I have an entity that contains a property that is an element collection.
@ElementCollection(targetClass=Integer.class, fetch = FetchType.EAGER)
@CollectionTable(name="campaign_publisher", joinColumns=@JoinColumn(name="campaign_id"))
@Column(name = "publisher_id")
...
@NotEmpty(message = "campaign.publishers.missing")
public Set<Integer> getPublishers() {
return this.publishers;
}
public Campaign setPublishers(Set<Integer> publisherId) {
this.publishers = publisherId;
return this;
}
This all works fine. The values are validated and saved correctly.
I also want this entity to have optimistic concurrency, so I applied a @Version annotation as well.
@Version
private Long etag = 0L;
...
public Long getEtag() {
return etag;
}
public void setEtag(Long etag) {
this.etag = etag;
}
By adding the @Version annotation, the @NotEmpty validation on my set of publishers always returns invalid.
To try and diagnose this I have tried the following:
Creating a custom validator at the entity level so I can inspect the values in the entity. I found that the Set of values has been replaced with an empty PersistentSet, which is causing the validation to always fail.
I created some unit tests for the entity that use a validator retrieved from the ValidatorFactory, and this validator seems to work as expected.
I have also tried to change the ElementCollection to a many-to-many relationship and to a bi-directional one-to-many, but the issue persists.
Right now I am out of ideas. The only thing I have found that works correctly is disabling the Hibernate validation and manually calling the validator just before I save my data.
So my questions are:
Has anyone encountered this issue before?
Any advice on what I could try next?
Thank you all for reading!
Short answer: Set the initial value for etag = null.
// this should do the trick
@Version
private Long etag = null;
Longer answer: when you add optimistic locking via the @Version annotation on a field with a default value, you make Hibernate/Spring Data think that the entity is not a new one (even though the id is null). So on the initial save, instead of persisting the entity, the underlying libraries try to do a merge. Merging a transient entity forces Hibernate to copy all the properties one by one from the source entity (the one you are persisting) to the target one (which is auto-created by Hibernate with all properties set to their default values, i.e. nulls). And here comes the problem: Hibernate will only copy the values of associations of FROM_PARENT type, in other words associations held on the entity side. In your case the association is TO_PARENT (a foreign key from child to parent), so Hibernate will try to postpone persisting the association until after the main entity is saved, but that save will not work because the entity does not pass the @NotEmpty validation.
First, I would suggest removing the default value initialization from your @Version property. This property is maintained by Hibernate and should be initialized by it.
Second: are you sure that you are validating the fully constructed entity? I.e. perhaps you are constructing something, then doing something else, and at the exact persist/flush cycle your entity is in the wrong state.
To clarify this, since you are on the Spring side, I would suggest introducing service-level validation on your DAO layer, i.e. forcing bean validation during the initial call to the DAO rather than validating the entity during flush (yes, Hibernate batches lots of things, and the actual validation happens only during the flush cycle).
To achieve this, mark your DAO @Validated and have your method arguments validated:
FancyEntity store(@Valid @NotNull FancyEntity fancyEntity) {
em.persist(fancyEntity);
em.flush();
return fancyEntity;
}
By doing this, you can be sure that you are storing a valid entity: the validation happens before the store method is called. This will reveal where your entity became invalid: in your service layer, or in a badly behaving Hibernate layer.
I noticed that you use mixed access: methods and fields. In this case you can try to set @Version on the method:
@Version
public Long getEtag() {
return etag;
}
not on the field.

How to load missing object attributes before persisting?

I have a question about Jackson and Hibernate. My application is REST-based and objects are transferred between frontend and backend as JSON, so there are situations where some of an object's attributes are missing when I deserialize JSON into a Java object, and I'd like to load those attributes before persisting changes (because I don't want to lose that data from the database). Does anybody have ideas on how to solve this problem?
Edit
I am not sure that my question was understood correctly, so here is a simple example of what I am trying to say.
So I have following Java class:
@Entity
@Table( name = "employees" )
public class Employee extends BaseEntity<Long> {
private String lastName;
private String firstName;
@Embedded
private Address address;
//... a lot of other attributes and methods..
}
Now I get json data from frontend, which is something like this:
{
"id":17,
"lastName":"Smith",
"firtName":"John"
}
Next I want to save these changes to the database, but my deserialized Java entity is totally incomplete; there are a lot of missing attributes and references (their values are null). How can I load those missing attribute values before persisting the object, without losing the new values that I got from the UI?
I have tried to use the EntityManager's merge method, but it didn't work...
Load the Data from DB (if the record already exists), do a merge & save.
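A minimal sketch of that load-then-merge approach, assuming a Spring Data style repository and that the incoming object carries the id (the repository and variable names are illustrative, not from the question):
Employee existing = employeeRepository.findById(incoming.getId())
        .orElseThrow(EntityNotFoundException::new);
// copy only the fields the frontend actually sent
if (incoming.getLastName() != null) {
    existing.setLastName(incoming.getLastName());
}
if (incoming.getFirstName() != null) {
    existing.setFirstName(incoming.getFirstName());
}
// address and the other attributes keep their database values
employeeRepository.save(existing);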
You can map multiple DTOs as JPA @Entities to the same database table. When you save one such DTO, only its fields are propagated to the DB, without interfering with other database columns that the current DTO hasn't mapped.
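As a rough illustration of that idea (the class name is made up and this is only a sketch), a second entity mapped to the same employees table could expose just the fields one endpoint is allowed to update:
@Entity
@Table(name = "employees")
public class EmployeeNameUpdate {
    @Id
    private Long id;
    private String lastName;
    private String firstName;
    // getters and setters omitted
}
Updating an EmployeeNameUpdate instance then only touches the columns it maps; the address columns of the underlying row are not part of the generated UPDATE statement.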

Save embedded entity with Objectify

I have two entities.
@Entity
public class Recipe {
@Id
private Long id;
private List<Step> steps;
}
@Entity
public class Step {
@Id
private Long id;
private String instruction;
}
And the following Cloud Endpoint:
@ApiMethod(
name = "insert",
path = "recipe",
httpMethod = ApiMethod.HttpMethod.POST)
public Recipe insert(Recipe recipe) {
ofy().save().entities(recipe.getSteps()).now(); //superfluous?
ofy().save().entity(recipe).now();
logger.info("Created Recipe with ID: " + recipe.getId());
return ofy().load().entity(recipe).now();
}
I'm wondering how I can skip the step where I have to save the embedded entity first. The id of neither entity is set; I want Objectify to create those automatically. But if I don't save the embedded entity first, I get an exception:
com.googlecode.objectify.SaveException: Error saving com.devmoon.meadule.backend.entities.Recipe@59e4ff19: You cannot create a Key for an object with a null @Id. Object was com.devmoon.meadule.backend.entities.Step@589a3afb
Since my object structure will get a lot more complex, I need to find a way to skip this manual step.
I presume you are trying to create real embedded objects, not separate objects stored in the datastore and linked. Your extra save() is actually saving separate entities. You don't want that.
You have two options:
Don't give your embedded object an id. Don't give it @Entity and don't give it an id field (or at least eliminate @Id). It's just a POJO. 90% of the time, this is what people want with embedded objects.
Allocate the id yourself with the allocator, typically in your (non-default) constructor.
Assuming you want a true embedded entity with a real key, #2 is probably what you should use. Keep in mind that this key is somewhat whimsical since you can't actually load it; only the container object can be looked up in the datastore.
I suggest going one step further and never use automatic id generation for any entities ever. Always use the allocator in the (non-default) constructor of your entities. This ensures that entities always have a valid, stable id. If you always allocate the id before a transaction start, it fixes duplicate entities that can be created when a transaction gets retried. Populating null ids is just a bad idea all around and really should not have been added to GAE.
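A hedged sketch of option 2, assuming Objectify's id allocator (ObjectifyService.factory().allocateId(...)) is available, using the Step class from the question:
@Entity
public class Step {
    @Id
    private Long id;
    private String instruction;

    private Step() {} // no-arg constructor required by Objectify

    public Step(String instruction) {
        // allocate a stable id up front instead of relying on save-time autogeneration
        this.id = ObjectifyService.factory().allocateId(Step.class).getId();
        this.instruction = instruction;
    }
}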
The concept of embedding is that the embedded content is persisted inside the main entity.
Is this the behaviour you are trying to configure?
The default behaviour for a Collection (List) of @Entity annotated classes is to reference them instead of embedding them. In your current configuration, the List<Step> field does not have any annotation overriding that default, so each Step is a separate entity related to the Recipe.
The error you are getting is because Objectify, when it saves the recipe entity, tries to get the key of each step to create the relationship (and store it in the recipe entity), but a step that has not yet been saved to the datastore does not have a key.
If you are trying to persist the steps inside the recipe entity, you need to set up Objectify like this:
@Entity
public class Recipe {
@Id
private Long id;
private List<Step> steps;
}
public class Step {
private Long id;
private String instruction;
}
As you can see, I removed the @Id annotation (an embedded entity does not require an id because it lives inside another entity) and the @Entity annotation from the Step class. With this configuration, Objectify saves the step entities inside the recipe entity.
Source: https://code.google.com/p/objectify-appengine/wiki/Entities#Embedded_Object_Native_Representation

Mapping JSON object to Hibernate entity

I'm going to start a project of a REST application managed with Spring and with Hibernate for my model.
I know that Spring allows you to get a Java object from the HTTP request (with the @Consumes(JSON) annotation). Is there any conflict if this Java object is also a Hibernate entity? And do nested objects work (like a @ManyToOne relation)?
Maven dependency
The first thing you need to do is to set up the following Hibernate Types Maven dependency in your project pom.xml configuration file:
<dependency>
<groupId>com.vladmihalcea</groupId>
<artifactId>hibernate-types-52</artifactId>
<version>${hibernate-types.version}</version>
</dependency>
Domain model
Now, if you are using PostgreSQL, you need to use the JsonType from Hibernate Types.
In order to use it in your entities, you will have to declare it on either class level or in a package-info.java package-level descriptor, like this:
@TypeDef(name = "json", typeClass = JsonType.class)
And, the entity mapping will look like this:
@Type(type = "json")
@Column(columnDefinition = "json")
private Location location;
If you're using Hibernate 5 or later, then the JSON type is registered automatically by the PostgreSQL92Dialect.
Otherwise, you need to register it yourself:
public class PostgreSQLDialect extends PostgreSQL91Dialect {
public PostgreSQLDialect() {
super();
this.registerColumnType( Types.JAVA_OBJECT, "json" );
}
}
The JsonType works with Oracle, SQL Server, PostgreSQL, MySQL, and H2 as well. Check out the project page for more details about how you can map JSON column types on various relational database systems.
Yes, this wouldn't be a problem and is actually a fairly common practice.
In recent years, however, I have come to realize that it is not always a good idea to build your views directly on your domain model. You can take a look at this post:
http://codebetter.com/jpboodhoo/2007/09/27/screen-bound-dto-s/
It is also known as "Presentation Model":
http://martinfowler.com/eaaDev/PresentationModel.html
The idea behind that is basically the following:
Imagine you have the domain entity User, which looks like this:
@Entity
@Data
public class User {
@Id private UUID userId;
private String username;
@OneToMany private List<Permission> permissions;
}
Let's now imagine you have a view where you want to display that user's name, and you don't care about the permissions at all. If you use your approach of immediately returning the User to the view, Hibernate will make an additional join against the Permissions table, because even though the permissions are lazily loaded by default, there is no easy way to signal to the Jackson serializer (or whatever you are using) that you don't care about them on this particular occasion, so Jackson will try to unproxy them (if your transaction is still alive by the time your object is handed off for JSON serialization; otherwise you get a nasty exception). Yes, you can add a @JsonIgnore annotation on the permissions field, but then if you need it in some other view, you are stuck.
That's a very basic example, but it should give you the idea that sometimes your domain model can't be returned directly to the presentation layer, for both code maintainability and performance reasons.
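For example, a minimal sketch of such a screen-bound DTO (UserSummaryDto is a made-up name) could carry only what this particular view needs:
public class UserSummaryDto {
    private final UUID userId;
    private final String username;

    public UserSummaryDto(User user) {
        this.userId = user.getUserId();
        this.username = user.getUsername();
        // permissions are deliberately not read, so the lazy collection
        // is never initialized for this view
    }

    public UUID getUserId() { return userId; }
    public String getUsername() { return username; }
}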
We were using such an approach to simplify the design and get rid of many DTOs (we had been abusing them too much). Basically, it worked for us.
However, in our REST model we tried not to expose other relations of an object, as you can always create separate REST resources to access them.
So we just put @JsonIgnore annotations on relation mappings like @OneToMany or @ManyToOne, making them transient.
Another problem I see is that if you still want to return these relations, you would have to use the Join.FETCH strategy for them, or move transaction management higher so that the transaction still exists when the response is serialized to JSON (the Open Session in View pattern).
In my opinion these two solutions are not so good.
You can map the JSON request without using any extra library in REST web services (Jersey).
Here is a sample of the code.
This is a Hibernate entity called Book:
@Entity
@Table(name = "book", schema = "cashcall")
public class Book implements java.io.Serializable {
private int id;
private Author author; // another hibernate entity
private String bookName;
//setters and getters
}
This is the web service function:
@POST
@Produces(MediaType.APPLICATION_JSON)
@Consumes(MediaType.APPLICATION_JSON)
public String addBook(Book book) {
String bookName=book.getBookName();
return bookName;
}
This is a sample JSON request:
{
"bookName" : "Head First Java",
"author" : {
"id" : 1
}
}
Since you are just starting, perhaps you could use Spring Data REST?
This is the project: http://projects.spring.io/spring-data-rest/
And here are some simple examples:
https://github.com/spring-projects/spring-data-book/tree/master/rest
https://github.com/olivergierke/spring-restbucks
As you can see in the examples, there are no extra DTOs beyond the @Entity annotated POJOs.
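For instance, a single repository interface like the sketch below (assuming the spring-data-rest dependency is on the classpath; Book stands in for any @Entity annotated POJO) is enough to expose CRUD endpoints under /books without writing controllers or DTOs:
@RepositoryRestResource(collectionResourceRel = "books", path = "books")
public interface BookRepository extends CrudRepository<Book, Long> {
}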

Persisting third-party classes with no IDs

Say I have the following Java class, which is owned by a vendor so I can't change it:
public class Entry {
private String user;
private String city;
// ...
// About 10 other fields
// ...
// Getters, setters, etc.
}
I would like to persist it to a table, using JPA 2.0 (OpenJPA implementation). I cannot annotate this class (as it is not mine), so I'm using orm.xml to do that.
I'm creating a table containing a column per field, plus another column called ID. Then, I'm creating a sequence for it.
My question is: is it at all possible to tell JPA that the ID that I would like to use for this entity doesn't even exist as a member attribute in the Entry class? How do I go about creating a JPA entity that will allow me to persist instances of this class?
EDIT
I am aware of the strategy of extending the class and adding an ID property to it. However, I'm looking for a solution that doesn't involve extending this class, because I need the solution to also be applicable when it's not just one class that I have to persist, but a collection of interlinked classes, none of which has an ID property. In such a scenario, extending doesn't work out.
Eventually, I ended up doing the following:
public class EntryWrapper {
@Id
private long id;
@Embedded
private Entry entry;
}
So, I am indeed wrapping the entity but differently from the way that had been suggested. As the Entry class is vendor-provided, I did all its ORM work in an orm.xml file. When persisting, I persist EntryWrapper.
I don't have much experience with JPA, but I wouldn't extend your base classes; instead I would wrap them:
public class PersistMe<T> {
@Id
private long id;
private T objToWrap;
public PersistMe(T objToWrap) {
this.objToWrap = objToWrap;
}
}
I can't test it, if it doesn't work let me know so I can delete the answer.
