I need to convert an entity that uses @JsonManagedReference and @JsonBackReference to JSON:
@Entity
@Table(name = "myparenttable", schema = "myschema", catalog = "mydb")
@JsonIgnoreProperties(ignoreUnknown = true)
public class Parent implements Serializable {
    private Integer id_parent;
    private String name;

    @JsonManagedReference
    @JsonInclude(JsonInclude.Include.NON_NULL)
    @JsonProperty(access = JsonProperty.Access.WRITE_ONLY)
    private List<Child> children;
    //getters and setters
}

@Entity
@Table(name = "mychildtable", schema = "myschema", catalog = "mydb")
public class Child implements Serializable {
    private Integer id_child;
    private String description;

    @JsonBackReference
    private Parent parent;
    //getters and setters
}
With this setup the persist call is straightforward, I just do a
em.persist(parent);
and both entities are inserted into the database; but I also need to convert those entities to JSON for audit purposes. I get an infinite recursion error when doing this:
ObjectMapper mapper = new ObjectMapper();
String jsonString = mapper
        .writerWithDefaultPrettyPrinter()
        .writeValueAsString(parent);
Is there a way to do both?
You may want to annotate the parent field in the Child class with
@JsonIgnore
private Parent parent;
This way the reference to the parent object isn't included in the serialized JSON.
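For illustration, a minimal sketch of the Child class with that change applied, followed by the persist-then-serialize flow from the question (nothing here is new API, it just combines the snippets above):

@Entity
@Table(name = "mychildtable", schema = "myschema", catalog = "mydb")
public class Child implements Serializable {
    private Integer id_child;
    private String description;

    @JsonIgnore // back reference is skipped by Jackson, which breaks the cycle
    private Parent parent;
    //getters and setters
}

// persist as before, then serialize the same instance for the audit trail
em.persist(parent);
String auditJson = new ObjectMapper()
        .writerWithDefaultPrettyPrinter()
        .writeValueAsString(parent);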
Also check whether you really need to implement the Serializable interface.
This is a perfect use case for using DTOs with Blaze-Persistence Entity Views.
I created the library to allow easy mapping between JPA models and custom interface or abstract class defined models, something like Spring Data Projections on steroids. The idea is that you define your target structure (domain model) the way you like and map attributes (getters) via JPQL expressions to the entity model.
A DTO model for your use case could look like the following with Blaze-Persistence Entity-Views:
@EntityView(Parent.class)
public interface ParentDto {
    @IdMapping
    Integer getId();
    String getName();
    List<ChildDto> getChildren();

    @EntityView(Child.class)
    interface ChildDto {
        @IdMapping
        Integer getId();
        String getDescription();
    }
}
Querying is a matter of applying the entity view to a query, the simplest being just a query by id.
ParentDto a = entityViewManager.find(entityManager, ParentDto.class, id);
The Spring Data integration allows you to use it almost like Spring Data Projections: https://persistence.blazebit.com/documentation/entity-view/manual/en_US/index.html#spring-data-features
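With that integration, a repository method could look roughly like this (the interface name and base type are assumptions, following the pattern used in the linked documentation):

// hypothetical Spring Data repository returning entity views directly
public interface ParentDtoRepository extends Repository<Parent, Integer> {
    Page<ParentDto> findAll(Pageable pageable);
}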
On top of solving your serialization issue, using Blaze-Persistence Entity-Views will also improve performance because it only selects the columns that are actually needed.
Suppose I have two entities joined as follows:
@Entity
public class User {
    @Id
    private Long id;
    private String username;
    private String password;

    @ManyToOne
    @JoinColumn(name = "role_id", referencedColumnName = "id") //Table user in database has foreign key role_id
    private Role role;
}

@Entity
public class Role {
    @Id
    private Long id;
    private String name;
}
How do I create the User entity with only one of the Role attributes instead of the whole entity (for example, only the role name)?
I expect something like
@Entity
public class User {
    @Id
    private Long id;
    private String username;
    private String password;
    // Some prefix or annotation maybe?
    private String role_name;
}
As you already read in the comments, you will need a DTO approach for this, and I think this is a perfect use case for Blaze-Persistence Entity Views.
I created the library to allow easy mapping between JPA models and custom interface or abstract class defined models, something like Spring Data Projections on steroids. The idea is that you define your target structure (domain model) the way you like and map attributes (getters) via JPQL expressions to the entity model.
A DTO model for your use case could look like the following with Blaze-Persistence Entity-Views:
@EntityView(User.class)
public interface UserDto {
    @IdMapping
    Long getId();
    String getUsername();
    String getPassword();

    @Mapping("role.name")
    String getRoleName();
}
Querying is a matter of applying the entity view to a query, the simplest being just a query by id.
UserDto a = entityViewManager.find(entityManager, UserDto.class, id);
The Spring Data integration allows you to use it almost like Spring Data Projections: https://persistence.blazebit.com/documentation/entity-view/manual/en_US/index.html#spring-data-features
Page<UserDto> findAll(Pageable pageable);
The best part is, it will only fetch the state that is actually necessary!
I am copying incoming DTO objects to my JPA entities, which have a bidirectional OneToMany relation, using the Orika library (I tried other libraries such as Dozer and Spring's BeanUtils, and they all have the same effect). The copy itself works fine, but persisting the entity does not update the foreign key of the child entities. I am aware that this happens because the child entities are not synchronized with the parent entity.
But the whole idea of using Orika or a similar library is to avoid the boilerplate of copying each entity/object separately. So I want to know: is there any way I can do this synchronization during the copy itself?
Following are my entity and DTO classes:
@Entity
@Data
public class Parent
{
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;
    private String name;

    @OneToMany(mappedBy = "parent", cascade = CascadeType.ALL, orphanRemoval = true)
    private List<Child> childs;

    public void addChild(Child child)
    {
        childs.add(child);
        child.setParent(this);
    }

    public void removeChild(Child child)
    {
        childs.remove(child);
        child.setParent(null);
    }
}
@Entity
@Data
public class Child
{
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;
    private String name;

    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "parent_id")
    private Parent parent;
}
@Data
public class ParentDto
{
    private Long id;
    private String name;
    private List<ChildDto> childs;
}

@Data
public class ChildDto
{
    private Long id;
    private String name;
}
And the Orika copy logic:
DefaultMapperFactory factory = new DefaultMapperFactory.Builder().build();
factory.classMap(ParentDto.class, Parent.class).byDefault().register();
Parent parentEntity = factory.getMapperFacade().map(parentDtoObject, Parent.class);
//parentEntity.getChilds().forEach(child -> child.setParent(parentEntity));
repository.save(parentEntity);
When repository.save() completes, the parent_id foreign key of the child table is persisted as null instead of the actual value. But if I uncomment the commented line it works fine, which I don't want to do because my entity has many child objects, looping has a performance impact, and the code looks ugly as well. Is there a better way of doing this? Or would a unidirectional OneToMany work?
You can do that with Blaze-Persistence Entity-Views which was designed with this in mind.
I created the library to allow easy mapping between JPA models and custom interface or abstract class defined models, something like Spring Data Projections on steroids. The idea is that you define your target structure (domain model) the way you like and map attributes (getters) via JPQL expressions to the entity model.
A DTO model for your use case could look like the following with Blaze-Persistence Entity-Views:
@EntityView(Parent.class)
@UpdatableEntityView
public interface ParentDto {
    @IdMapping
    Long getId();
    String getName();
    void setName(String name);

    @UpdatableMapping
    Set<ChildDto> getChilds();

    @EntityView(Child.class)
    @UpdatableEntityView
    interface ChildDto {
        @IdMapping
        Long getId();
        String getName();
        void setName(String name);
    }
}
Querying is a matter of applying the entity view to a query, the simplest being just a query by id.
ParentDto a = entityViewManager.find(entityManager, ParentDto.class, id);
The Spring Data integration allows you to use it almost like Spring Data Projections: https://persistence.blazebit.com/documentation/entity-view/manual/en_US/index.html#spring-data-features
Page<ParentDto> findAll(Pageable pageable);
The best part is, it will only fetch the state that is actually necessary!
Saving the state is also easy:
entityViewManager.save(entityManager, parentDto);
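A rough sketch of a full update round trip with this model, using only the find and save calls shown above (the id variables and the mutation itself are illustrative):

// load the updatable view, mutate it, and flush the changes back
ParentDto parent = entityViewManager.find(entityManager, ParentDto.class, parentId);
parent.setName("renamed");
// obsoleteChildId is an assumed variable; collection changes are flushed on save
parent.getChilds().removeIf(child -> child.getId().equals(obsoleteChildId));
entityViewManager.save(entityManager, parent);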
I am using Spring Data and MapStruct, and I don't want Hibernate to blindly load all the elements while mapping an entity to a DTO.
Example:
public class VacancyEntity {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    Integer id;

    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "job_category_id", nullable = false)
    JobCategoryEntity jobCategory;

    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "company_id", nullable = false)
    CompanyEntity company;

    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "employer_created_by", nullable = false)
    EmployerProfileEntity employerCreatedBy;

    @Column(nullable = false)
    String title;
    ....
}
DTO:
public class VacancyDto {
    Integer id;
    String title;
    CompanyDto company;
    EmployerProfileDto employerCreatedBy;
    JobCategoryDto jobCategory;
    ...
}
So I have two methods, findByIdWithCompanyAndCity and findByIdWithJobAndCityAndEmployer, in VacancyRepository to perform only one SQL request each.
And two @Transactional methods in my VacancyService: findWithCompanyAndCity and findWithCompanyAndCityAndEmployer.
Best practice is to return a DTO from the service layer, so we need to map the entity to a DTO in the service.
And I really don't want to leave the whole mapping inside @Transactional (i.e., inside the session), because if I add some field deep in my entity, MapStruct will just trigger the N+1 problem.
The best I came up with is to pass each inner entity into the method and check manually that MapStruct doesn't generate any new methods (it is faster than checking names).
Ex:
@Mapping(target = "id", source = "entity.id")
@Mapping(target = "description", source = "entity.description")
@Mapping(target = "jobCategory", source = "jobCategoryDto")
@Mapping(target = "employerCreatedBy", source = "employerProfileDto")
@Mapping(target = "city", source = "cityDto")
@Mapping(target = "company", ignore = true)
VacancyDto toDto(VacancyEntity entity,
                 JobCategoryDto jobCategoryDto,
                 EmployerProfileDto employerProfileDto,
                 CityDto cityDto);
....
But this doesn't fix the real issue: there is still a session open while mapping, so it can still lead to the N+1 problem.
So I came up with several solutions:
1. Use a special method in the service to trigger the @Transactional method and then map to the DTO outside the session scope. But it seems really ugly to duplicate methods in the service.
2. Return the entity from the service (which is bad practice) and map it to the DTO there.
I know that I'll get a LazyInitializationException in both cases, but that seems more robust and scalable to me than unpredictable SELECTs.
How do I perform the mapping from entity to DTO in the service layer but outside the Hibernate session in an elegant way?
You didn't ask a question, but it seems the question is supposed to be:
How do I perform the mapping from entity to DTO in the service layer but outside the Hibernate session in an elegant way.
I'd recommend the TransactionTemplate for this.
Usage looks like this:
@Autowired
VacancyRepository repo;

@Autowired
TransactionTemplate tx;

VacancyDto someMethod(String company, String city) {
    VacancyEntity vac = tx.execute(__ -> repo.findWithCompanyAndCity(company, city));
    return mapToDto(vac);
}
That said, I think you are using the wrong approach to solve the underlying problem.
I suggest you take a look at having a test to verify the number of SQL statements executed.
See https://vladmihalcea.com/how-to-detect-the-n-plus-one-query-problem-during-testing/ for a way to do that.
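For illustration, one way such a test can count statements is Hibernate's Statistics API (a sketch under the assumption that statistics are enabled; the linked article describes other options, and the service call and expected count here are assumptions):

// assumes hibernate.generate_statistics=true (or statistics enabled programmatically)
Statistics stats = entityManagerFactory.unwrap(SessionFactory.class).getStatistics();
stats.clear();

vacancyService.findWithCompanyAndCity(company, city); // method under test (assumed signature)

// fails if lazy loading silently issued extra statements (the N+1 symptom)
assertEquals(1, stats.getPrepareStatementCount());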
To avoid the N + 1 problem you still need to use an entity graph (sketched below), although I think this is a perfect use case for Blaze-Persistence Entity Views.
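If you stay with plain entities, a hypothetical repository method with such an entity graph could look like this (the method name and attribute paths are assumptions derived from the question):

public interface VacancyRepository extends JpaRepository<VacancyEntity, Integer> {

    // loads the three to-one associations in the same query, avoiding N+1 during mapping
    @EntityGraph(attributePaths = {"jobCategory", "company", "employerCreatedBy"})
    Optional<VacancyEntity> findWithAssociationsById(Integer id);
}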
I created the library to allow easy mapping between JPA models and custom interface or abstract class defined models, something like Spring Data Projections on steroids. The idea is that you define your target structure (domain model) the way you like and map attributes (getters) via JPQL expressions to the entity model.
A DTO model for your use case could look like the following with Blaze-Persistence Entity-Views:
@EntityView(VacancyEntity.class)
public interface VacancyDto {
    @IdMapping
    Integer getId();
    String getTitle();
    CompanyDto getCompany();
    EmployerProfileDto getEmployerCreatedBy();
    JobCategoryDto getJobCategory();

    @EntityView(CompanyEntity.class)
    interface CompanyDto {
        @IdMapping
        Integer getId();
        String getName();
    }

    @EntityView(EmployerProfileEntity.class)
    interface EmployerProfileDto {
        @IdMapping
        Integer getId();
        String getName();
    }

    @EntityView(JobCategoryEntity.class)
    interface JobCategoryDto {
        @IdMapping
        Integer getId();
        String getName();
    }
}
Querying is a matter of applying the entity view to a query, the simplest being just a query by id.
VacancyDto a = entityViewManager.find(entityManager, VacancyDto.class, id);
The Spring Data integration allows you to use it almost like Spring Data Projections: https://persistence.blazebit.com/documentation/entity-view/manual/en_US/index.html#spring-data-features
Page<VacancyDto> findAll(Pageable pageable);
The best part is, it will only fetch the state that is actually necessary!
I have an entity called "Review" that has a OneToOne relationship with a "User" entity and a OneToMany relationship with a "ReviewStage" entity. I have implemented a DTO pattern, so I also have a ReviewDTO, which is what is actually sent to the UI. I am using MapStruct to map the entity to the DTO. All is working well; however, I would rather use UserDTO and ReviewStageDTO in the relationship mappings.
This works well:
@Entity
@Getter @Setter @NoArgsConstructor
public class Review {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long reviewId;

    @OneToOne(cascade = CascadeType.ALL)
    @JoinColumn(name = "ownerId")
    private User owner;

    @OneToMany(mappedBy = "reviewId")
    private Set<ReviewStage> stages;
}
For fun, I tried this, but it obviously doesn't work:
@Entity
@Getter @Setter @NoArgsConstructor
public class Review {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long reviewId;

    @OneToOne(cascade = CascadeType.ALL)
    @JoinColumn(name = "ownerId")
    private UserDTO owner;

    @OneToMany(mappedBy = "reviewId")
    private Set<ReviewStageDTO> stages;
}
I just need a nudge in the right direction. Thanks,
The relationships should only be between entities. If you want a DTO for Review, and inside this DTO you want to return a UserDto, for example, you should create a MapStruct mapper that maps the User entity to UserDto.
Example
class UserDto {
    // put any fields here that you want to map
}

class ReviewDto {
    UserDto user;
}

@Mapper(componentModel = "spring")
interface UserMapper {
    UserDto map(User user);
}

@Mapper(componentModel = "spring", uses = {UserMapper.class})
interface ReviewMapper {
    ReviewDto map(Review review);
}
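Usage is then just a matter of injecting the generated mapper; a minimal sketch, assuming a Spring service (the class and field names are illustrative):

@Service
class ReviewService {

    @Autowired
    ReviewMapper reviewMapper; // implementation generated by MapStruct at build time

    ReviewDto toDto(Review review) {
        return reviewMapper.map(review); // nested User is mapped via UserMapper
    }
}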
If you are concerned about the performance, I can recommend you take a look at what Blaze-Persistence Entity Views has to offer.
I created the library to allow easy mapping between JPA models and custom interface or abstract class defined models, something like Spring Data Projections on steroids. The idea is that you define your target structure (domain model) the way you like and map attributes (getters) via JPQL expressions to the entity model.
A DTO model for your use case could look like the following with Blaze-Persistence Entity-Views:
@EntityView(Review.class)
public interface ReviewDTO {
    @IdMapping
    Long getReviewId();
    UserDTO getOwner();
    Set<ReviewStageDTO> getStages();

    @EntityView(User.class)
    interface UserDTO {
        @IdMapping
        Long getId();
        String getName();
    }

    @EntityView(ReviewStage.class)
    interface ReviewStageDTO {
        @IdMapping
        Long getId();
        String getName();
    }
}
Querying is a matter of applying the entity view to a query, the simplest being just a query by id.
ReviewDTO a = entityViewManager.find(entityManager, ReviewDTO.class, id);
The Spring Data integration allows you to use it almost like Spring Data Projections: https://persistence.blazebit.com/documentation/entity-view/manual/en_US/index.html#spring-data-features
Optional<ReviewDTO> findByReviewId(long reviewId);
Note that this will only fetch the state that is actually required. With MapStruct or other bean-mapping solutions you have to handle efficient fetching yourself.
I'm trying to create a query with Criteria, but I can't manage to map data from a joined entity.
With this Criteria query, the id of the Order entity is overridden with the id of the ShippingCondition entity:
final Criteria criteria = session.createCriteria(Order.class, "o")
        .createAlias("o.shippingCondition", "sc", JoinType.INNER_JOIN)
        .setProjection(Projections.projectionList()
                .add(Projections.property("o.id"), "id")
                .add(Projections.property("o.orderNum"), "orderNum")
                .add(Projections.property("o.notes"), "notes")
                .add(Projections.property("sc.id"), "id"))
        .add(Restrictions.eq("o.id", id))
        .setResultTransformer(Transformers.aliasToBean(Order.class));
return (Order) criteria.uniqueResult();
My entities:
@Table(name = "order", schema = "myschema")
public class Order {
    private Integer id;
    private String orderNum;
    private String notes;
    private ShippingCondition shippingCondition;
    ...
}

@Table(name = "shipping_condition", schema = "myschema")
public class ShippingCondition {
    private Integer id;
    private String shippingCondition;
    private Integer sorting;
    ...
}
I have tried to replace .add(Projections.property("sc.id"), "id") with .add(Projections.property("sc.id"), "shippingCondition.id"), but then I get a ClassCastException (java.lang.ClassCastException: entity.Order cannot be cast to java.util.Map).
Do you have any idea how I can do that?
Thanks
Hibernate doesn't support nested projections. You will need to create a DTO for that. You can extend the DTO from the Order class and add methods that set the fields of ShippingCondition.
class OrderDto extends Order {

    public OrderDto() {
        setShippingCondition(new ShippingCondition());
    }

    public void setShippingConditionId(Integer id) {
        getShippingCondition().setId(id);
    }
}
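Plugged into the query from the question, the projection alias then targets the DTO's synthetic setter; a sketch of the adjusted query (only the alias and the transformer target change):

final Criteria criteria = session.createCriteria(Order.class, "o")
        .createAlias("o.shippingCondition", "sc", JoinType.INNER_JOIN)
        .setProjection(Projections.projectionList()
                .add(Projections.property("o.id"), "id")
                .add(Projections.property("o.orderNum"), "orderNum")
                .add(Projections.property("o.notes"), "notes")
                // maps to OrderDto.setShippingConditionId(...)
                .add(Projections.property("sc.id"), "shippingConditionId"))
        .add(Restrictions.eq("o.id", id))
        .setResultTransformer(Transformers.aliasToBean(OrderDto.class));
return (OrderDto) criteria.uniqueResult();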
You can use a special nested transformer if you don't want to use a DTO:
How to transform a flat result set using Hibernate
Additional notes
JPA doesn't support any transformers at all, and it is hard to implement such a transformer in a consistent way. For example, my transformer doesn't support child collections like @OneToMany, only single associations. Also, you can't use nested projections with HQL, because HQL doesn't support parent.child aliases.