I have the following entities:
public class Activity
{
private Long activityId;
private String name;
private Long year;
}
public class Course extends Activity
{
private Long duration;
private Date startDate;
private Date endDate;
....
}
public class Conference extends Activity
{
private Date dueDate;
private Person speaker;
....
}
I have modeled this in my database as one Activity table with all the attributes of the child entities, and mapped them in Hibernate using the single-table strategy.
I want to retrieve all Activities for a given year. I know how to do that in the data access layer through Hibernate; my problem comes when mapping those polymorphic objects (some of them are Courses and some are Conferences) with the Orika mapper: I always end up with plain Activity objects that lack each concrete entity's attributes.
More specifically, I've got a fourth class, let's call it A, which has a list of Activity elements that may be of the Course or Conference class, and I would like to map it like this:
ADTO adto = map(A, ADTO.class);
I haven't found any info on this issue on the internet...
Orika supports polymorphic mapping, including within collections.
Please take a look at this PolicyElementsTestCase.
There are a lot of use cases in Orika's test code that you can refer to. Hope this helps.
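For reference, a minimal sketch of what that can look like, assuming DTO classes (ActivityDTO, CourseDTO, ConferenceDTO, ADTO) that mirror the entity hierarchy, i.e. CourseDTO and ConferenceDTO extend ActivityDTO. Registering a class map per concrete subtype lets Orika pick the most specific mapping for each element of the collection:
import ma.glasnost.orika.MapperFacade;
import ma.glasnost.orika.MapperFactory;
import ma.glasnost.orika.impl.DefaultMapperFactory;

public class ActivityMappingConfig {

    public static MapperFacade buildMapper() {
        MapperFactory factory = new DefaultMapperFactory.Builder().build();

        // Shared mapping for the base type
        factory.classMap(Activity.class, ActivityDTO.class).byDefault().register();

        // One class map per concrete subtype, inheriting the base configuration
        factory.classMap(Course.class, CourseDTO.class)
                .use(Activity.class, ActivityDTO.class)
                .byDefault()
                .register();
        factory.classMap(Conference.class, ConferenceDTO.class)
                .use(Activity.class, ActivityDTO.class)
                .byDefault()
                .register();

        // A holds a List<Activity>; ADTO is assumed to hold a List<ActivityDTO>
        factory.classMap(A.class, ADTO.class).byDefault().register();

        return factory.getMapperFacade();
    }
}
Usage: ADTO adto = ActivityMappingConfig.buildMapper().map(a, ADTO.class);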
First, here are my entities.
Player :
@Entity
@JsonIdentityInfo(generator=ObjectIdGenerators.UUIDGenerator.class,
property="id")
public class Player {
// other fields
@ManyToOne
@JoinColumn(name = "pla_fk_n_teamId")
private Team team;
// methods
}
Team :
@Entity
@JsonIdentityInfo(generator=ObjectIdGenerators.UUIDGenerator.class,
property="id")
public class Team {
// other fields
@OneToMany(mappedBy = "team")
private List<Player> members;
// methods
}
As many topics already stated, you can avoid the StackOverflowExeption in your WebService in many ways with Jackson.
That's cool and all, but JPA still constructs an entity with infinite recursion into another entity before the serialization. This is just ugly and the request takes much longer. Check this screenshot: IntelliJ debugger
Is there a way to fix it ? Knowing that I want different results depending on the endpoint. Examples :
endpoint /teams/{id} => Team={id..., members=[Player={id..., team=null}]}
endpoint /members/{id} => Player={id..., team={id..., members=null}}
Thank you!
EDIT: maybe the question isn't very clear, given the answers I'm getting, so I'll try to be more precise.
I know that it is possible to prevent the infinite recursion either with Jackson (@JsonIgnore, @JsonManagedReference/@JsonBackReference etc.) or by doing some mapping into DTOs. The problem I still see is this: both of the above are post-query processing. The object that Spring JPA returns will still be (for example) a Team, containing a list of players, containing a team, containing a list of players, etc.
I would like to know if there is a way to tell JPA or the repository (or anything) to not bind entities within entities over and over again?
Here is how I handle this problem in my projects.
I use the concept of data transfer objects, implemented in two versions: a full object and a light object.
I define an object that contains the referenced entities as lists as a Dto (a data transfer object that only holds serializable values), and I define an object without the referenced entities as an Info.
An Info object only holds information about the entity itself, not about its relations.
Now when I deliver a Dto object over a REST API, I simply put Info objects in place of the references.
Let's assume I deliver a PlayerDto over GET /players/1:
public class PlayerDto {
private String playerName;
private String playerCountry;
private TeamInfo team;
}
Whereas the TeamInfo object looks like
public class TeamInfo {
private String teamName;
private String teamColor;
}
compared to a TeamDto
public class TeamDto{
private String teamName;
private String teamColor;
private List<PlayerInfo> players;
}
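For completeness, the PlayerInfo referenced in TeamDto follows the same pattern (a sketch; field names assumed to mirror PlayerDto):
public class PlayerInfo {
private String playerName;
private String playerCountry;
}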
This avoids an endless serialization and also gives your REST resources a logical end, as otherwise you would be able to GET /player/1/team/player/1/team ...
Additionally, the concept clearly separates the data layer from the client layer (in this case the REST API), as you don't pass the actual entity object to the interface. For this, you convert the actual entity inside your service layer to a Dto or Info. I use http://modelmapper.org/ for this, as it's super easy (one short method call).
Also, I fetch all referenced entities lazily. My service method, which gets the entity and converts it to the Dto, therefore runs inside a transaction scope, which is good practice anyway.
Lazy fetching
To tell JPA to fetch an entity lazily, simply modify your relationship annotation by defining the fetch type. The default for @ManyToOne is fetch = FetchType.EAGER, which in your situation is problematic (@OneToMany already defaults to LAZY, but it doesn't hurt to be explicit). That is why you should change it to fetch = FetchType.LAZY
public class TeamEntity {
@OneToMany(mappedBy = "team", fetch = FetchType.LAZY)
private List<PlayerEntity> members;
}
Likewise the Player
public class PlayerEntity {
@ManyToOne(fetch = FetchType.LAZY)
@JoinColumn(name = "pla_fk_n_teamId")
private TeamEntity team;
}
When calling your repository method from your service layer, it is important that this happens within a @Transactional scope; otherwise you won't be able to access the lazily referenced entities. It would look like this:
@Transactional(readOnly = true)
public TeamDto getTeamByName(String teamName){
TeamEntity entity = teamRepository.getTeamByName(teamName);
return modelMapper.map(entity, TeamDto.class);
}
In my case I realized I did not need a bidirectional (One To Many-Many To One) relationship.
This fixed my issue:
// Team Class:
@OneToMany(fetch = FetchType.LAZY, cascade = CascadeType.ALL)
private Set<Player> members = new HashSet<Player>();
// Player Class - These three lines removed:
// @ManyToOne
// @JoinColumn(name = "pla_fk_n_teamId")
// private Team team;
Project Lombok might also produce this issue. If you are using Lombok, try excluding the relation from @ToString and @EqualsAndHashCode:
@Data
@Entity
@EqualsAndHashCode(exclude = { "members" }) // This,
@ToString(exclude = { "members" }) // and this
public class Team implements Serializable {
// ...
This is a nice guide on infinite recursion annotations https://www.baeldung.com/jackson-bidirectional-relationships-and-infinite-recursion
You can use the @JsonIgnoreProperties annotation to avoid the infinite loop, like this:
@JsonIgnoreProperties("members")
private Team team;
or like this:
@JsonIgnoreProperties("team")
private List<Player> members;
or both.
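Alternatively, the @JsonManagedReference/@JsonBackReference pair mentioned in the question works as well: Jackson serializes the managed (forward) side and skips the back reference. A minimal sketch on the entities above:
import com.fasterxml.jackson.annotation.JsonBackReference;
import com.fasterxml.jackson.annotation.JsonManagedReference;

// Team side: forward reference, serialized normally
@OneToMany(mappedBy = "team")
@JsonManagedReference
private List<Player> members;

// Player side: back reference, left out of the JSON
@ManyToOne
@JoinColumn(name = "pla_fk_n_teamId")
@JsonBackReference
private Team team;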
I have a class that is suitable for a builder pattern: there are many params and I'd rather not use a ton of telescoping constructors.
My problem is that this class is a JPA entity and that is very new to me.
Having private final data members is throwing an error, as they are not initialized in the constructor, and as far as I'm aware JPA requires an empty protected constructor.
Can anyone help please? An example would be fantastic, I've included a basic example of the code below but it's very generic. I've omitted many of the accessors and data members to save space/time.
@Entity //(name = "TABLE_NAME") // name of the entity / table name
public class Bean implements Serializable {

    private static final long serialVersionUID = 1L;

    @Id // primary key
    @GeneratedValue
    Long id;

    private final DateTime date;
    private final String title;
    private final String intro;

    // used by JPA
    protected Bean() {}

    private Bean(BeanBuilder beanBuilder) {
        this.date = beanBuilder.date;
        this.title = beanBuilder.title;
        this.intro = beanBuilder.intro;
    }

    public DateTime getDate() {
        return date;
    }

    public String getTitle() {
        return title;
    }

    public static class BeanBuilder {

        private final DateTime date;
        private final String title;
        // private optional fields
        private String intro;
        private String solution;

        public BeanBuilder(DateTime date, String title) {
            this.date = date;
            this.title = title;
        }

        public BeanBuilder intro(String intro) {
            this.intro = intro;
            return this;
        }

        public BeanBuilder solution(String solution) {
            this.solution = solution;
            return this;
        }

        public Bean buildBean() {
            return new Bean(this);
        }
    }
}
Member fields marked as final must have a value assigned during construction and this value is final (i.e. cannot change). As a consequence, all declared constructors must assign a value to all final fields.
This explains your compiler error.
From the JLS:
A blank final instance variable must be definitely assigned at the end of every constructor of the class in which it is declared, or a compile-time error occurs (§8.8, §16.9).
Not sure why you want to do that. Maybe it is better to define the member variable as
#Column(name = "id", nullable = false, updatable = false)
for example
The JPA 2.1 specification, section "2.1 The Entity Class", says:
No methods or persistent instance variables of the entity class may be
final.
...meaning that there's no way for you to build a truly immutable JPA entity. But I don't really see how that can be such a big issue. Just don't let the entity class expose public setters?
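Concretely, a sketch of the Bean from the question adjusted along those lines: the fields lose their final modifier, JPA gets its protected no-arg constructor, and immutability is approximated by simply not exposing setters (the nested BeanBuilder can stay as in the question; DateTime is assumed to be Joda-Time, as in the question):
import java.io.Serializable;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import org.joda.time.DateTime; // assumption: the DateTime in the question is Joda-Time

@Entity
public class Bean implements Serializable {

    private static final long serialVersionUID = 1L;

    @Id
    @GeneratedValue
    private Long id;

    // not final (JPA forbids final persistent fields), but no public setters either
    private DateTime date;
    private String title;
    private String intro;

    protected Bean() {} // required by JPA

    private Bean(BeanBuilder beanBuilder) {
        this.date = beanBuilder.date;
        this.title = beanBuilder.title;
        this.intro = beanBuilder.intro;
    }

    public DateTime getDate() { return date; }
    public String getTitle() { return title; }
    public String getIntro() { return intro; }

    // ... static class BeanBuilder exactly as in the question ...
}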
I'm not sure what you meant by that, but having immutable objects is not a great idea when working with Hibernate (not to say you cannot do it, or that you shouldn't).
Think about it: because Hibernate/JPA defines "states" for objects, they are meant to be mutable; otherwise you would have a static database, or something like an insert-once-and-never-modify database.
Immutability is a well-known concept (nowadays), borrowed mainly from functional programming, that doesn't really apply in the same way to OOP. And if you are working with Hibernate you shouldn't have immutable entities... at least as of today.
UPDATE
If you want to have what they call read-only entities, you can use the @Immutable annotation from Hibernate itself. Pay close attention to collections as entity members.
Entities are meant to be mutable when it comes to strict Java immutability. For example, lazily loaded associations will change the object state once the association is accessed.
If you need to use entity data in a truly immutable fashion (for multi-threading purposes, for example), then consider using DTOs (because entities are not meant to be accessed concurrently either).
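For reference, a minimal sketch of the Hibernate annotation mentioned above; the Country entity here is just an illustration. Hibernate treats such an entity as read-only and ignores updates, which is different from Java-level immutability:
import javax.persistence.Entity;
import javax.persistence.Id;
import org.hibernate.annotations.Immutable;

@Entity
@Immutable // Hibernate treats instances as read-only and ignores updates
public class Country {

    @Id
    private Long id;

    private String name;

    protected Country() {} // required by JPA

    public Long getId() { return id; }
    public String getName() { return name; }
}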
I have the following entities:
@Entity
public class Person {
@Id public Long id;
public String name;
public Ref<Picture> picture;
public String email;
public byte age;
public short birthday; // day of year
public String school;
public String very_long_life_story;
... some extra fields ...
}
@Entity
public class Place {
@Id public Long id;
public String name;
public String comment;
public long createdDateMS;
public long visitors;
@Load public List<Ref<Person>> owners;
}
A few notes:
(A) Maximum size of owners, in Place entity, is 4 (~)
(B) The Person class is presumably very big, and when querying a Place I would like to show only a subset of the Person data. This optimization is aimed both at server-client and server-database communication. Since Objectify (GAE, actually) only loads/saves entire entities, I would like to do the following:
@Entity
public class PersonShort {
@Id public Long id;
public Ref<Picture> picture;
public String name;
}
and inside Place, I would like to have (instead of owners):
@Load public List<PersonShort> owners;
(C) The problem with this approach is that now I have duplication inside the datastore.
Although this isn't such a bad thing in itself, the real problem comes when a Person saves a new picture or changes their name; I will not only have to update the Person entity,
but also search for every Place that has a PersonShort with the same id and update that as well.
(D) So the question is: is there any solution, or am I simply forced to choose between the options?
(1) Loading multiple Person entities, which are big, when all I need is some really small piece of information about each one.
(2) Data duplication with many writes.
If so, which one is better (currently, I believe it's 1)?
EDIT
What about loading the entire class (1), but sending only part of it?
@Entity
public class Person {
@Id public Long id;
public String name;
public Ref<Picture> picture;
public String email;
public byte age;
public short birthday; // day of year
public String school;
public String very_long_life_story;
... some extra fields ...
}
public class PersonShort {
public long id;
public String name;
}
@Entity
public class Place {
@Id public Long id;
public String name;
public String comment;
public long createdDateMS;
public long visitors;
// Ignore saving to datastore
@Ignore
public List<PersonShort> owners;
// Do not serialize when sending to client
@ApiResourceProperty(ignored = AnnotationBoolean.TRUE)
@Load public List<Ref<Person>> ownersRef;
@OnLoad private void loadOwners() {
owners = new ArrayList<PersonShort>();
for (Ref<Person> r : ownersRef) {
owners.add(new PersonShort(r.get()));
}
}
}
It sounds like you are optimizing prematurely. Do you know you have a performance issue?
Unless you're talking about hundreds of K, don't worry about the size of your Person object in advance. There is no practical value in hiding a few extra fields unless the size is severe - and in that case, you should extract the big fields into some sort of meaningful entity (PersonPicture or whatnot).
No definite answer, but some suggestions to look at:
Lifecycle callbacks.
When you put your Person entity, you can have an @OnSave handler to automatically store your new PersonShort entity. This has the advantage of being transparent to the caller, but obviously you are still dealing with 2 entity writes instead of 1.
You may also find that you end up fetching two entities as well: initially you may fetch the PersonShort and then later need some of the detail in the corresponding Person. Remember that Objectify's caching can reduce your trips to Datastore: it's arguably better to have a bigger, cached entity than two separate entities (meaning two RPCs).
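A sketch of the lifecycle-callback suggestion above. The PersonShort(Person) copy constructor is assumed (not part of the original post), and the callback is expected to run inside an Objectify context, as puts normally do:
import static com.googlecode.objectify.ObjectifyService.ofy;

import com.googlecode.objectify.annotation.Entity;
import com.googlecode.objectify.annotation.Id;
import com.googlecode.objectify.annotation.OnSave;

@Entity
public class Person {
    @Id public Long id;
    public String name;
    // ... all the other big fields ...

    @OnSave
    void syncShortVersion() {
        // keep the denormalized PersonShort in step with this entity;
        // note this is still a second entity write, as mentioned above
        ofy().save().entity(new PersonShort(this));
    }
}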
Store your core properties (the ones in PersonShort) as separate properties in your Person class and then have the extended properties as a single JSON string which you can deserialize with Gson.
This has the advantage that you are not duplicating properties, but the disadvantage is that anything you want to be able to search on cannot be in the JSON blob.
Projection Queries. You can tell Datastore to return only certain properties from your entities. The problem with this method is that you can only return indexed properties, so you will probably find you need too many indexes for this to be viable.
Also, use @Load annotations with care. For example, in your Place class, think whether you really need all those owners' Person details when you fetch the owners. Perhaps you only need one of them? i.e., instead of getting a Place and 4 Persons every time you fetch a Place, maybe you are better off just loading the required Person(s) when you need them? (It will depend on your application.)
It is a good practice to return a different entity to your client than the one you get from your database. So you could create a ShortPerson or something that is only used as a return object in your REST endpoints. It will accept a Person in its constructor and fill in the properties you want to return to the client from this more complete object.
The advantage of this approach is actually less about optimization and more that your server models will change over time, but that change can be completely independent of your API. Also, you choose which data is publicly accessible, which is what you are trying to do.
As for the optimization between db and server, I wouldn't worry about it until it is an issue.
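A sketch of that idea against the Person from the question (the class and constructor here are illustrative, not a fixed API):
import com.googlecode.objectify.Ref;

public class ShortPerson {
    public Long id;
    public String name;
    public Ref<Picture> picture;

    public ShortPerson(Person person) {
        // copy only what the client needs
        this.id = person.id;
        this.name = person.name;
        this.picture = person.picture;
    }
}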
Abstract
I have a working application in Appengine using Java and JDO 3.
I found these arguments (auto_now and auto_now_add), which correspond exactly to what I want to implement in Java. So essentially the question is: how do I convert AppEngine's Python DateTimeProperty to Java JDO?
Constraints
Converting my application to Python is not an option.
Adding two Date properties and manually populating these values whenever a create/update happens is not an option.
I'm looking for a solution which corresponds to what JDO/Appengine/Database authors had in mind for this scenario when they created the APIs.
It would be preferable to have a generic option: say I have 4 entities in classes: C1, C2, C3, C4 and the solution is to add a base class C0, which all 4 entities would extend, so the 4 entities don't even know they're being "audited".
[update] I tried (using a simple entity)
@PersistenceCapable public class MyEntity {
@Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY, primaryKey = "true")
private Long id;
@Persistent private String name;
...
1. @Persistent public Date getLastUpdate() { return new Date(); }
As suggested by the answer, but it seems to always update the value, even when I just load the entity from the datastore or modify an unrelated field (e.g. the String name).
You can easily enough have a property (setter/getter) on a java class and have the property persistable (rather than the field). Within that getter you can code whatever you want to control what value goes into the datastore.
Without the following hack, I can't read the value stored in the datastore [and not with the hack either :( ]:
@Persistent public Date getLastUpdate() { return new Date(); }
private Date prevUpdate;
public void setLastUpdate(Date lastUpdate) { this.prevUpdate = lastUpdate; }
public Date getPrevUpdate() { return prevUpdate; }
Is there any way to differentiate if a persistence operation is in progress or my code is calling the getter?
2. @Persistent(customValueStrategy = "auto_now_add") private Date lastUpdate;
I modeled auto_now_add after org.datanucleus.store.valuegenerator.TimestampGenerator replacing Timestamp with java.util.Date.
But it was only populated once at the first makePersistent call, regardless of how many times I modified other fields and called makePersistent. Also note that it doesn't seem to behave as the documentation says (or my English is rusty):
Please note that by defining a value-strategy for a field then it will, by default, always generate a value for that field on persist. If the field can store nulls and you only want it to generate the value at persist when it is null (i.e you haven't assigned a value yourself) then you can add the extension "strategy-when-notnull" as false
3. preStore using PersistenceManager.addInstanceLifecycleListener
Works as expected, but I couldn't make it work across multiple entities using a base class.
pm.addInstanceLifecycleListener(new StoreLifecycleListener() {
@Override public void preStore(InstanceLifecycleEvent event) {
MyEntity entity = (MyEntity)event.getPersistentInstance();
entity.setLastUpdate(new Date());
}
@Override public void postStore(InstanceLifecycleEvent event) {}
}, MyEntity.class);
4. implements StoreCallback and public void jdoPreStore() { this.setLastUpdate(new Date()); }
Works as expected, but I couldn't make it work across multiple entities using a base class.
To satisfy my 4th constraint (using solutions 3 or 4)
Whatever I do I can't make the following structure work:
public abstract class Dateable implements StoreCallback {
@Persistent private Date created;
@Persistent private Date lastUpdate;
public Dateable() { created = new Date(); }
public void jdoPreStore() { this.setLastUpdate(new Date()); }
// ... normal get/set properties for the above two
}
@PersistenceCapable public class MyEntity extends Dateable {
@Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY, primaryKey = "true") private Long id;
@Persistent private String name;
The problems when the enhancer runs:
public abstract class Dateable:
DataNucleus.MetaData Registering class "[...].Dateable" as not having MetaData.
public abstract class Dateable (with the above log, but running the code anyway):
Creation date changes whenever I create or read the data from datastore.
@PersistenceCapable public abstract class Dateable:
DataNucleus.MetaData Class "[...].MyEntity" has been specified with 1 primary key fields, but this class is using datastore identity and should be application identity.
JDO simply provides persistence of Java classes (and their fields/properties), so I don't see what the design of JDO has to do with it.
You can easily enough have a property (setter/getter) on a java class and have the property persistable (rather than the field). Within that getter you can code whatever you want to control what value goes into the datastore. Either that or you use a preStore listener to be able to set things just before persistence so the desired value goes into the datastore.
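A sketch of how the listener route could also cover the base-class requirement without making the base class persistence-capable: register a single listener with null as the class filter (which the JDO API accepts to mean all persistent classes) and have it check for a plain marker interface. The Timestamped interface here is my own illustration, not from the original post:
import java.util.Date;
import javax.jdo.PersistenceManager;
import javax.jdo.listener.InstanceLifecycleEvent;
import javax.jdo.listener.StoreLifecycleListener;

// Plain interface, deliberately NOT @PersistenceCapable, so the enhancer ignores it
public interface Timestamped {
    void setLastUpdate(Date lastUpdate);
}

// Registration, once per PersistenceManager:
pm.addInstanceLifecycleListener(new StoreLifecycleListener() {
    @Override public void preStore(InstanceLifecycleEvent event) {
        Object instance = event.getPersistentInstance();
        if (instance instanceof Timestamped) {
            ((Timestamped) instance).setLastUpdate(new Date());
        }
    }
    @Override public void postStore(InstanceLifecycleEvent event) {}
}, (Class[]) null); // null = all persistent classes, so new entities are covered automatically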
We use annotations to map an entity class to a database table, simply by specifying @Entity and others such as @Id, table joins and many more things. I do not know how these entity variables are mapped to the database table. Can anyone give a short description to help me understand?
Thanks :)
Well, the idea is to translate your objects and their connections to other objects into a relational database. These two ways of representing data (objects defined by classes, and tables in a database) are not directly compatible, and that is where a so-called Object-Relational Mapper (ORM) framework comes into play.
So a class like
class MyObject
{
private String name;
private int age;
private String password;
// Getters and setters
}
will translate into a database table containing a column name of type varchar, a column age of type int and a column password of type varchar.
Annotations in Java simply add additional information (so-called metadata) to your class definitions, which can be read by other tools (e.g. JavaDoc). In the case of the Java Persistence API, they are used by an ORM framework like Hibernate to obtain the additional information needed to translate your object into the database (your database table needs a primary key, and some information, such as what kind of relation an object has to another, can't be determined automatically just by looking at your class definition).
Annotations are very well explained here:
http://docs.jboss.org/hibernate/stable/annotations/reference/en/html_single/
Annotations are just metadata on a class, nothing magical. You can write your own annotations. Those annotations are given a retention policy of RUNTIME (which means you have access to that metadata at runtime). When you call persist etc., the persistence provider iterates through the fields (java.lang.reflect.Field) of your class and checks which annotations are present to build up its SQL statement. Try writing your own annotation and doing something with it; it won't seem very magical after that.
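A tiny self-contained example of that idea: a home-made column annotation read via reflection, which is essentially what a JPA provider does with @Column (all names here are made up for the demo):
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Field;

public class AnnotationDemo {

    @Retention(RetentionPolicy.RUNTIME) // keep the metadata available at runtime
    @Target(ElementType.FIELD)
    @interface MyColumn {
        String name();
    }

    static class User {
        @MyColumn(name = "user_name")
        private String name;
    }

    public static void main(String[] args) {
        // Iterate over the fields exactly like an ORM would when building its SQL
        for (Field field : User.class.getDeclaredFields()) {
            MyColumn column = field.getAnnotation(MyColumn.class);
            if (column != null) {
                System.out.println(field.getName() + " -> column " + column.name());
            }
        }
    }
}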
In your case, mapping an entity class to a table name with annotations looks like this:
@Entity
@Table(name = "CompanyUser")
public class CompanyUserCAB implements java.io.Serializable
{
private long companyUserID;
private int companyID;
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
@Column(name = "companyUserID")
public long getCompanyUserID()
{
return this.companyUserID;
}
public void setCompanyUserID(long companyUserID)
{
this.companyUserID = companyUserID;
}
#Column(name = "companyID")
public int getCompanyID()
{
return this.companyID;
}
public void setCompanyID(int companyID)
{
this.companyID = companyID;
}
}