I have a class House and a class Room. Each of these classes is a NodeEntity and is persisted in a separate repository, HouseRepository and RoomRepository. House contains a Set<Room>. I want to be able to delete a Room instance and automatically have that instance removed from the Set<Room> inside the House class. Since Neo4j is a graph database, I figured that I should be able to declare a relationship between the House and each Room so that deleting a Room instance will take care of this automatically. The two following classes represent House and Room.
@NodeEntity
public class House {
    @GraphId
    private Long id;
    @Indexed(unique=true) String uid;

    @RelatedTo(type="addition", direction=Direction.OUTGOING)
    @Fetch
    private Set<Room> rooms;

    public House(String uid, Set<Room> rooms) {
        this.uid = uid;
        this.rooms = rooms;
    }

    public House() {
        this.uid = //random uid;
    }
}
@NodeEntity
public class Room {
    @GraphId
    private Long id;
    @Indexed(unique=true) String uid;

    public Room(String uid) {
        this.uid = uid;
    }

    public Room() {
        this.uid = //random uid;
    }
}
I am thinking that I should be able to write a Cypher query in the RoomRepository that would take care of this, but I am not sure. I have thought of something like this:
public interface RoomRepository extends BaseRepository<Room> {
    @Query("start room=node({0}) " +
           "match house-[:addition*]->room " +
           "delete room")
    public void deleteRoomAndRemoveRoomFromHouse(String uid);
}
What is the recommended way to handle these types of deletes?
Your approach should almost work. You may bump up against an exception because you try to delete a node without first having deleted its relationships. (Unless SDN magic takes care of that for you, but I don't think it intercepts Cypher queries.)
As your MATCH clause stands, it acts as a filter: whatever is bound to the room identifier in your START clause is only still bound by the time you reach the DELETE clause if it has the relevant relationship at depth 1 or more. If the room you pass as a parameter does not have at least one incoming [:addition] relationship, it is no longer bound and won't be deleted. Perhaps this is intentional; if so, keep it and add a MATCH clause that binds all of the room's relationships and deletes them before you delete the room itself (otherwise, replace your MATCH with that clause). Try something like
START room=node({0})
MATCH room-[r]-()
DELETE r, room
But I think either the repository or Neo4jTemplate should have plenty of voodoo to take care of this kind of operation for you. Or if you use the advanced mapping with AspectJ, you may have all kinds of chocolate chips baked into your room entity. I know there is a NodeEntity#persist(), I think there is a NodeEntity#remove() as well.
Once the node is deleted, it won't show up in the Set<Room> field of your House class. (If you use the simple mapping mode, you may have to re-fetch it or sync it manually with the database; possibly a Neo4jTemplate#fetch(), passing the Set<Room> field, will do that for you.)
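For illustration, a minimal sketch of the repository/template route; the service class and bean names are assumptions, and whether a plain repository delete also cleans up the incoming [:addition] relationship depends on your SDN version:

public class RoomService {
    private final RoomRepository roomRepository;
    private final Neo4jTemplate neo4jTemplate;

    public RoomService(RoomRepository roomRepository, Neo4jTemplate neo4jTemplate) {
        this.roomRepository = roomRepository;
        this.neo4jTemplate = neo4jTemplate;
    }

    // Delete the Room node, then re-read the lazy collection so the
    // in-memory House no longer contains the deleted Room.
    public void deleteRoom(Room room, House house) {
        roomRepository.delete(room);
        neo4jTemplate.fetch(house.getRooms());
    }
}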
Related
First, here are my entities.
Player :
@Entity
@JsonIdentityInfo(generator = ObjectIdGenerators.UUIDGenerator.class,
                  property = "id")
public class Player {
    // other fields

    @ManyToOne
    @JoinColumn(name = "pla_fk_n_teamId")
    private Team team;

    // methods
}
Team :
@Entity
@JsonIdentityInfo(generator = ObjectIdGenerators.UUIDGenerator.class,
                  property = "id")
public class Team {
    // other fields

    @OneToMany(mappedBy = "team")
    private List<Player> members;

    // methods
}
As many topics have already stated, you can avoid the StackOverflowException in your web service in many ways with Jackson.
That's cool and all, but JPA still constructs an entity with infinite recursion to another entity before the serialization. This is just ugly, and the request takes much longer. Check this screenshot: IntelliJ debugger
Is there a way to fix it ? Knowing that I want different results depending on the endpoint. Examples :
endpoint /teams/{id} => Team={id..., members=[Player={id..., team=null}]}
endpoint /members/{id} => Player={id..., team={id..., members=null}}
Thank you!
EDIT: maybe the question isn't very clear given the answers I get, so I'll try to be more precise.
I know that it is possible to prevent the infinite recursion either with Jackson (@JsonIgnore, @JsonManagedReference/@JsonBackReference, etc.) or by mapping into DTOs. The problem I still see is this: both of the above are post-query processing. The object that Spring JPA returns will still be (for example) a Team, containing a list of players, containing a team, containing a list of players, and so on.
I would like to know if there is a way to tell JPA or the repository (or anything else) not to bind entities within entities over and over again.
Here is how I handle this problem in my projects.
I use the concept of data transfer objects, implemented in two versions: a full object and a light object.
I define an object containing the referenced entities as lists as a Dto (a data transfer object that only holds serializable values), and I define an object without the referenced entities as an Info.
An Info object only holds information about the entity itself, not about its relations.
Now when I deliver a Dto object over a REST API, I simply put Info objects in place of the references.
Let's assume I deliver a PlayerDto over GET /players/1:
public class PlayerDto {
    private String playerName;
    private String playerCountry;
    private TeamInfo team;
}
Whereas the TeamInfo object looks like
public class TeamInfo {
    private String teamName;
    private String teamColor;
}
compared to a TeamDto
public class TeamDto {
    private String teamName;
    private String teamColor;
    private List<PlayerInfo> players;
}
This avoids endless serialization and also gives your REST resources a logical end point, as otherwise you would be able to GET /player/1/team/player/1/team and so on.
Additionally, the concept clearly separates the data layer from the client layer (in this case the REST API), as you don't pass the actual entity object to the interface. For this, you convert the actual entity inside your service layer to a Dto or Info. I use http://modelmapper.org/ for this, as it's super easy (one short method call).
Also, I fetch all referenced entities lazily. My service method, which gets the entity and converts it to the Dto, therefore runs inside a transaction scope, which is good practice anyway.
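For completeness, a minimal sketch of how the modelMapper instance used below could be provided; the configuration class name is an assumption:

@Configuration
public class MapperConfig {
    // One shared ModelMapper instance, injected into the service layer below.
    @Bean
    public ModelMapper modelMapper() {
        return new ModelMapper();
    }
}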
Lazy fetching
To tell JPA to fetch an entity lazily, simply modify your relationship annotation by defining the fetch type. The default for @ManyToOne is fetch = FetchType.EAGER (and for @OneToMany it is already LAZY), which in your situation is problematic. That is why you should change it to fetch = FetchType.LAZY:
public class TeamEntity {
    @OneToMany(mappedBy = "team", fetch = FetchType.LAZY)
    private List<PlayerEntity> members;
}
Likewise for the Player:
public class PlayerEntity {
    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "pla_fk_n_teamId")
    private TeamEntity team;
}
When calling your repository method from your service layer, it is important that this happens within a @Transactional scope; otherwise, you won't be able to access the lazily referenced entity. It would look like this:
@Transactional(readOnly = true)
public TeamDto getTeamByName(String teamName) {
    TeamEntity entity = teamRepository.getTeamByName(teamName);
    return modelMapper.map(entity, TeamDto.class);
}
In my case I realized I did not need a bidirectional (OneToMany/ManyToOne) relationship.
This fixed my issue:
// Team class:
@OneToMany(fetch = FetchType.LAZY, cascade = CascadeType.ALL)
private Set<Player> members = new HashSet<Player>();

// Player class - these three lines removed:
// @ManyToOne
// @JoinColumn(name = "pla_fk_n_teamId")
// private Team team;
Project Lombok might also produce this issue. If you are using Lombok, try adding @ToString and @EqualsAndHashCode with the relationship field excluded:
@Data
@Entity
@EqualsAndHashCode(exclude = { "members" }) // This,
@ToString(exclude = { "members" })          // and this
public class Team implements Serializable {
    // ...
}
This is a nice guide on infinite recursion annotations https://www.baeldung.com/jackson-bidirectional-relationships-and-infinite-recursion
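For reference, a minimal sketch of the managed/back-reference approach described in that guide, applied to these entities (only the Jackson annotations are the point here; the rest mirrors the question):

@Entity
public class Team {
    // The parent side: serialized normally.
    @JsonManagedReference
    @OneToMany(mappedBy = "team")
    private List<Player> members;
}

@Entity
public class Player {
    // The child side: skipped during serialization, which breaks the cycle.
    @JsonBackReference
    @ManyToOne
    @JoinColumn(name = "pla_fk_n_teamId")
    private Team team;
}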
You can use the @JsonIgnoreProperties annotation to avoid the infinite loop, like this:
@JsonIgnoreProperties("members")
private Team team;
or like this:
@JsonIgnoreProperties("team")
private List<Player> members;
or both.
What is this called, and how do I solve the following problem with my API? I have to return the same object with different views; some data should not be returned to the user. Here is an example:
Parent:
public class OrginalObject {
    private int id;
    private String name;
    private String surname;
    private int age;
    private String school;
    private String secret;
    private Address Address;
}
Child:
public class Address {
    private int id;
    private String street;
    private String zipCode;
    private String Country;
}
If I want to load a list of complete objects, I would call:
session.createCriteria(OrginalObject.class).list();
1.) But if I don't want someone to know my property secret, I need to hide it. However, I don't know how to query it from the database in a way that returns every other property. Something like:
session.createCriteria(OrginalObjectPublic.class).list();
2.) I would also like to have the option to load only the "important" data, that is, only the properties id, name, and school:
session.createCriteria(OrginalObjectImportant.class).list();
Is there a way to create an adapter/"custom view" that loads it directly from the database? I know I can write plain SQL, but I would like to use this on objects with 20+ properties that have nested lists/objects.
3.) Also, how can I use this transformation to load only a few properties of a nested object along with those from the original? Example JSON (only id, name, school from OrginalObject and id, street from Address):
{
    "id": 1,
    "name": "testname",
    "school": "testschool",
    "Address": {
        "id": 33,
        "street": "testStreet 33"
    }
}
4.) Also, how would I use this on nested lists, if Address were an array:
public class OrginalObject {
    ...
    private List<Address> AddressList;
}
Since Hibernate is a persistence framework and you cannot save/persist to a view, this is not possible. You can create a database view with the same name as the table and map the entity to the view instead, but you will no longer be able to store to that entity.
You can remove the getter (getSecret) from the entity. The database still has the field, but your entity is not aware of it. This may cause problems if you try to store data using that entity, because you may not be able to set the secret.
You can make the getter default (package-level access) and seal the package so that no one other than the sealing project can access the getter.
You can use Spring's method authorization mechanism.
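As a rough illustration of that last point, a minimal sketch using Spring Security's method security; the service class, role name, and the enabling configuration are assumptions:

@Service
public class OrginalObjectService {
    // Only callers with the assumed ADMIN role may read the secret;
    // requires method security to be enabled (e.g. @EnableMethodSecurity).
    @PreAuthorize("hasRole('ADMIN')")
    public String getSecret(OrginalObject object) {
        return object.getSecret();
    }
}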
First, no one should have access to your secret except you, the programmer who is supposed to see it.
Second, if no one is supposed to have it, why store it at all?
And if you want to pull it out, you can use inheritance. Something like:
public abstract class PublicObject {
    ...
}

public class OriginalObject extends PublicObject {
    String secret;
}
Edit:
You can solve your 2nd and 4th questions with HQL:
String hql = "SELECT O.id, O.name, O.school FROM OrginalObject O";
Query query = session.createQuery(hql);
List results = query.list();
As for your 3rd question, it depends on your API. If you're using Jackson, for example, you can use @JsonIgnore.
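For the Jackson route, a minimal sketch; the annotation goes directly on the field you want to hide:

public class OrginalObject {
    // Excluded from Jackson serialization regardless of how the entity was loaded.
    @JsonIgnore
    private String secret;

    // other fields ...
}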
You can map more than one entity to the same table, each one with the set of properties you want to expose.
Take a look at this question.
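A minimal sketch of that idea, assuming the table is named orginal_object; the OrginalObjectImportant name mirrors the question, and @Immutable (a Hibernate annotation) keeps the trimmed-down mapping read-only:

@Entity
@Table(name = "orginal_object")
public class OrginalObject {
    @Id private int id;
    private String name;
    private String surname;
    private int age;
    private String school;
    private String secret;
    // ...
}

// A second, read-only mapping of the same table with only the "important" columns.
@Entity
@Immutable
@Table(name = "orginal_object")
public class OrginalObjectImportant {
    @Id private int id;
    private String name;
    private String school;
}

You could then call session.createCriteria(OrginalObjectImportant.class).list() exactly as in your second example.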
I'm trying to get a list, querying by a Ref.
My classes are like this:
A License runs on a Daemon.
A License can be a LicenseCountryCondition or another subclass holding a Ref (for LicenseCountryCondition, that is a Ref to a Country).
License:
@Entity
@Cache
@Index
public class License {
    @Id
    Long id;
    private String name;
    private String startDate;
    private String expDate;
    private int timeStamp;
    private int status;
    Ref<Daemon> daemon;
    private boolean inactive;
}
LicenseCountryCondition:
@Index
@Subclass(index=true)
public class LicenseCountryCondition extends License {
    Ref<Country> country;
}
If I want to find a list of the LicenseCountryCondition entities working on a specific Daemon, I do this:
Daemon dae = ofy().load().type(Daemon.class).filter("name", "example").first().now();
List<LicenseCountryCondition> test = ofy().load().type(LicenseCountryCondition.class).filter("daemon", dae).list();
for (LicenseCountryCondition i : test) {
    System.out.println(i.getName());
    System.out.println(i.getDaemon().getName());
}
And I get the correct results.
But when I try to get a list of LicenseCountryCondition entities working on a specific Country, it doesn't work:
Country ctr = ofy().load().type(Country.class).filter("name", "France").first().now();
List<LicenseCountryCondition> test = ofy().load().type(LicenseCountryCondition.class).filter("country", ctr).list();
for (LicenseCountryCondition i : test) {
    System.out.println(i.getName());
}
Can I get this list? (I saw this, but it's not the same problem.)
Thanks for your attention.
Make sure your query for France actually returns a real country (not null).
There's nothing obviously wrong in what you have written, but there's too much stuff going on here and too much unspecified database state for someone to be able to answer the question. The best thing to do is put together a test case that creates some entities (so the db state is known) and then demonstrates queries that you think should succeed but nevertheless fail.
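A minimal sketch of such a test; entity registration, the setters, and the Country.name field are assumed to match the question:

// Known state: one Country and one LicenseCountryCondition pointing at it.
Country france = new Country();
france.setName("France");
ofy().save().entity(france).now();

LicenseCountryCondition lcc = new LicenseCountryCondition();
lcc.setName("test-license");
lcc.setCountry(Ref.create(france));
ofy().save().entity(lcc).now();

// The query under test: filtering the subclass by its Ref<Country> field.
List<LicenseCountryCondition> result = ofy().load()
        .type(LicenseCountryCondition.class)
        .filter("country", france)
        .list();

System.out.println(result.size()); // expect 1 if the country field is indexed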
I have the following entities:
@Entity
public class Person {
    @Id public Long id;
    public String name;
    public Ref<Picture> picture;
    public String email;
    public byte age;
    public short birthday; // day of year
    public String school;
    public String very_long_life_story;
    ... some extra fields ...
}

@Entity
public class Place {
    @Id public Long id;
    public String name;
    public String comment;
    public long createdDateMS;
    public long visitors;
    @Load public List<Ref<Person>> owners;
}
A few notes:
(A) The maximum size of owners in the Place entity is roughly 4.
(B) The Person class is presumably very big, and when querying a Place, I would like to show only a subset of the Person data. This optimization is aimed both at server-client and server-database communication. Since Objectify (GAE, actually) only loads/saves entire entities, I would like to do the following:
@Entity
public class PersonShort {
    @Id public Long id;
    public Ref<Picture> picture;
    public String name;
}
and inside Place, I would like to have (instead of owners):
@Load public List<PersonShort> owners;
(C) The problem with this approach is that now I have duplication inside the datastore.
Although this isn't such a bad thing in itself, the real problem is that when a Person saves a new picture or changes their name, I will not only have to update the Person entity,
but also search for every Place that has a PersonShort with the same id and update that as well.
(D) So the question is: is there any solution, or am I simply forced to choose between the options?
(1) Loading multiple Person entities, which are big, when all I need is some really small information about them.
(2) Data duplication with many writes.
If so, which one is better? (Currently, I believe it's 1.)
EDIT
What about loading the entire class (1), but sending only part of it?
@Entity
public class Person {
    @Id public Long id;
    public String name;
    public Ref<Picture> picture;
    public String email;
    public byte age;
    public short birthday; // day of year
    public String school;
    public String very_long_life_story;
    ... some extra fields ...
}

public class PersonShort {
    public long id;
    public String name;
}

@Entity
public class Place {
    @Id public Long id;
    public String name;
    public String comment;
    public long createdDateMS;
    public long visitors;

    // Ignore saving to datastore
    @Ignore
    public List<PersonShort> owners;

    // Do not serialize when sending to client
    @ApiResourceProperty(ignored = AnnotationBoolean.TRUE)
    @Load public List<Ref<Person>> ownersRef;

    @OnLoad private void loadOwners() {
        owners = new ArrayList<PersonShort>();
        for (Ref<Person> r : ownersRef) {
            owners.add(new PersonShort(r.get()));
        }
    }
}
It sounds like you are optimizing prematurely. Do you know you have a performance issue?
Unless you're talking about hundreds of K, don't worry about the size of your Person object in advance. There is no practical value in hiding a few extra fields unless the size is severe - and in that case, you should extract the big fields into some sort of meaningful entity (PersonPicture or whatnot).
No definite answer, but some suggestions to look at:
Lifecycle callbacks.
When you put your Person entity, you can have an @OnSave handler automatically store your new PersonShort entity (see the sketch after these suggestions). This has the advantage of being transparent to the caller, but obviously you are still dealing with 2 entity writes instead of 1.
You may also find you end up fetching two entities: initially you may fetch the PersonShort and then later need some of the detail in the corresponding Person. Remember that Objectify's caching can reduce your trips to the Datastore: it's arguably better to have one bigger, cached entity than two separate entities (meaning two RPCs).
Store your core properties (the ones in PersonShort) as separate properties in your Person class and then have the extended properties as a single JSON string which you can deserialize with Gson.
This has the advantage that you are not duplicating properties, but the disadvantage is that anything you want to be able to search on cannot be in the JSON blob.
Projection Queries. You can tell Datastore to return only certain properties from your entities. The problem with this method is that you can only return indexed properties, so you will probably find you need too many indexes for this to be viable.
Also, use @Load annotations with care. For example, in your Place class, think about whether you really need all those owners' Person details when you fetch a Place. Perhaps you only need one of them? That is, instead of getting a Place and 4 Persons every time you fetch a Place, maybe you are better off just loading the required Person(s) when you need them. (It will depend on your application.)
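Here is a minimal sketch of the lifecycle-callback suggestion above; the PersonShort(Person) constructor and registering PersonShort as an entity are assumptions:

@Entity
public class Person {
    @Id public Long id;
    public String name;
    // ... other fields ...

    // Runs just before this Person is written; keeps the denormalized copy in sync.
    @OnSave
    void syncShortCopy() {
        ofy().save().entity(new PersonShort(this));
    }
}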
It is good practice to return a different object to your client than the one you get from your database. So you could create a ShortPerson, or something similar, that is only used as a return object in your REST endpoints. It accepts a Person in its constructor and fills in the properties you want to return to the client from that more complete object.
The advantage of this approach is actually less about optimization and more that your server models will change over time, but that change can be completely independent of your API. Also, you choose which data is publicly accessible, which is what you are trying to do.
As for the optimization between db and server, I wouldn't worry about it until it is an issue.
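A minimal sketch of such a return object (the field selection is just an example):

public class ShortPerson {
    public Long id;
    public String name;
    public Ref<Picture> picture;

    // Copy only the fields the client should see from the full entity.
    public ShortPerson(Person person) {
        this.id = person.id;
        this.name = person.name;
        this.picture = person.picture;
    }
}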
In the CustomerTransactions entity, I have the following field to record what the customer bought:
@ManyToMany
private List<Item> listOfItemsBought;
When I think more about this field, there's a chance it may not work, because merchants are allowed to change an item's information (e.g. price, discount, etc.). Hence, this field will not record what the customer actually bought at the time the transaction occurred.
At the moment, I can only think of 2 ways to make it work.
I will record the transaction details into a String field. I feel that this way would be messy if I need to extract some information about the transaction later on.
Whenever the merchant changes an item's information, I will not update directly to that item's fields. Instead, I will create another new item with all the new information and keep the old item untouched. I feel that this way is better because I can easily extract information about the transaction later on. However, the bad side is that my Item table may contain a lot of rows.
I'd be very grateful if someone could give me some advice on how I should tackle this problem.
I would try a third option, something like this:
public class Item {
    private String sku;
    private double currentPrice;
}

public class Customer {
    private String name;
    private List<Transaction> transactions;
}

public class Transaction {
    private Item item;
    private Customer customer;
    private double pricePerItem;
    private double quantity;
    private String discountCode;
}
I will leave you to work out the JPA mappings.
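For illustration only, a hedged sketch of what those mappings could look like; the id fields, the table name, and the default cascade/fetch settings are assumptions rather than part of the answer above:

@Entity
public class Item {
    @Id @GeneratedValue
    private Long id;
    private String sku;
    private double currentPrice;
}

@Entity
public class Customer {
    @Id @GeneratedValue
    private Long id;
    private String name;

    // One customer has many transactions; Transaction owns the foreign key.
    @OneToMany(mappedBy = "customer")
    private List<Transaction> transactions;
}

@Entity
@Table(name = "purchase_transaction") // "transaction" can be a reserved word in some databases
public class Transaction {
    @Id @GeneratedValue
    private Long id;

    // Snapshot of what was bought, at the price in effect when the purchase happened.
    @ManyToOne
    private Item item;

    @ManyToOne
    private Customer customer;

    private double pricePerItem;
    private double quantity;
    private String discountCode;
}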