I have the structure below and I want to map it using MapStruct.
class DTO
{
private Integer id;
String comment;
//getters & setters
}
class ParentEntity
{
private Integer id;
CommentEntity comment;
//getters & setters
}
class CommentEntity
{
private Integer id;
private String text;
//getters & setters
}
@Mapper(componentModel = "spring")
public interface SampleMapper
{
@Mapping(source = "entity.comment.text", target = "comment")
public DTO toDTO(final ParentEntity entity);
@Mapping(source = "dto.comment", target = "comment.text")
public ParentEntity toEntity(final DTO dto);
}
Below is the implementation MapStruct generates for the toDTO method:
@Override
public DTO toDTO(ParentEntity entity) {
if ( entity == null ) {
return null;
}
DTO dto = new DTO();
dto.setComment( entityCommentText( entity ) );
....................
}
private String entityCommentText(ParentEntity entity) {
if ( entity == null ) {
return null;
}
CommentEntity comment = entity.getComment();
if ( comment == null ) {
return null;
}
String text = comment.getText();
if ( text == null ) {
return null;
}
return text;
}
Below is the implementation MapStruct generates for the toEntity method:
@Override
public ParentEntity toEntity(DTO dto) {
if ( dto == null ) {
return null;
}
ParentEntity entity = new ParentEntity();
entity.setComment( dtoToCommentEntity( dto ) );
.............
}
protected CommentEntity dtoToCommentEntity(DTO dto) {
if ( dto == null ) {
return null;
}
CommentEntity commentEntity = new CommentEntity();
commentEntity.setText( dto.getComment() );
return commentEntity;
}
My question is: the toDTO() method sets the comment only when the text is not null, but the toEntity() method does not check for a null or empty text.
So if I get "comment": null in my DTO, it still creates a new CommentEntity and sets its text to null.
How can I avoid this?
Can someone explain this behavior and suggest the proper way to handle it?
Thanks!
Like this:
@Mapper( componentModel = "spring" )
public interface MyMapper {
@Mapping(source = "entity.comment.text", target = "comment")
DTO toDTO(final ParentEntity entity);
// make sure the entire parameter dto is mapped to comment
@Mapping(source = "dto", target = "comment")
ParentEntity toEntity(final DTO dto);
// and MapStruct will select your own implementation
default CommentEntity dTOToCommentEntity(DTO dTO) {
if ( dTO == null ) {
return null;
}
CommentEntity commentEntity = null;
if ( dTO.getComment() != null ) {
commentEntity = new CommentEntity();
commentEntity.setText( dTO.getComment() );
}
return commentEntity;
}
}
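With those mappings in place, the implementation MapStruct generates for toEntity should delegate to the hand-written default method, roughly like this (a sketch of the expected output, not the exact generated code):
@Override
public ParentEntity toEntity(DTO dto) {
    if ( dto == null ) {
        return null;
    }
    ParentEntity parentEntity = new ParentEntity();
    // the whole DTO is passed to your default method, which only creates
    // a CommentEntity when dto.getComment() is non-null
    parentEntity.setComment( dTOToCommentEntity( dto ) );
    parentEntity.setId( dto.getId() );
    return parentEntity;
}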
I am trying to use the JPA Criteria API to filter the results and aggregate them using simple count, min, avg and max. I am using Spring Boot 2.7.8, so I am trying to use Interface-projections such that these aggregated results look the same as the simpler queries done automatically by the Spring repositories.
My domain entity (simplified for brevity) looks like this:
@Entity
@Table(name = "vehicle_stopped")
@IdClass(VehicleStopped.VehicleStoppedPK.class)
public class VehicleStopped implements Serializable {
@Id
@Column(name = "stopped_session_uuid", nullable = false)
private String stoppedSessionUuid;
@Id
@Column(name = "start_ts", nullable = false)
private OffsetDateTime startTs;
@Column(name = "end_ts", nullable = false)
private OffsetDateTime endTs;
@Column(name = "duration_seconds")
private Double durationSeconds;
@ManyToOne(fetch = FetchType.EAGER)
@JoinColumn(name = "zone_id")
private CameraZone cameraZone;
@Override
public VehicleStoppedPK getId() {
VehicleStopped.VehicleStoppedPK pk = new VehicleStopped.VehicleStoppedPK();
pk.setStartTs(this.getStartTs());
pk.setStoppedSessionUuid(this.getStoppedSessionUuid());
return pk;
}
public OffsetDateTime getEndTs() {
return endTs;
}
public void setEndTs(OffsetDateTime endTs) {
this.endTs = endTs;
}
public Double getDurationSeconds() {
return durationSeconds;
}
public void setDurationSeconds(Double durationSeconds) {
this.durationSeconds = durationSeconds;
}
public CameraZone getCameraZone() {
return cameraZone;
}
public void setCameraZone(CameraZone cameraZone) {
this.cameraZone = cameraZone;
}
public VehicleType getVehicleType() {
return vehicleType;
}
public void setVehicleType(VehicleType vehicleType) {
this.vehicleType = vehicleType;
}
public String getStoppedSessionUuid() {
return stoppedSessionUuid;
}
public void setStoppedSessionUuid(String stoppedSessionUuid) {
this.stoppedSessionUuid = stoppedSessionUuid;
}
//some details removed for brevity
public static class VehicleStoppedPK implements Serializable {
private OffsetDateTime startTs;
private String stoppedSessionUuid;
public VehicleStoppedPK() {
}
public OffsetDateTime getStartTs() {
return startTs;
}
public void setStartTs(OffsetDateTime startTs) {
this.startTs = startTs;
}
public String getStoppedSessionUuid() {
return stoppedSessionUuid;
}
public void setStoppedSessionUuid(String stoppedSessionUuid) {
this.stoppedSessionUuid = stoppedSessionUuid;
}
@Override
public boolean equals(Object o) {
if (this == o) return true;
if (o == null || getClass() != o.getClass()) return false;
VehicleStoppedPK that = (VehicleStoppedPK) o;
return Objects.equals(startTs, that.startTs) && Objects.equals(stoppedSessionUuid, that.stoppedSessionUuid);
}
@Override
public int hashCode() {
return Objects.hash(startTs, stoppedSessionUuid);
}
@Override
public String toString() {
return "VehicleStoppedPK{" +
"startTs=" + startTs +
", stoppedSessionUuid='" + stoppedSessionUuid + '\'' +
'}';
}
}
}
@Entity
@Table(name = "camera_zone")
public class CameraZone implements Serializable {
@Id
@SequenceGenerator(name = "camera_zone_id_seq", sequenceName = "camera_zone_id_seq", allocationSize = 1)
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "camera_zone_id_seq")
@Column(name = "id", updatable=false)
private Integer id;
@Column(name = "uuid", unique = true)
private String uuid;
@Column(name = "type")
private String type;
@Column(name = "name")
private String name;
public Integer getId() {
return id;
}
public void setId(Integer id) {
this.id = id;
}
public String getUuid() {
return uuid;
}
public void setUuid(String uuid) {
this.uuid = uuid;
}
public String getType() {
return type;
}
public void setType(String type) {
this.type = type;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
@Override
public boolean equals(Object o) {
if (this == o) return true;
if (o == null || getClass() != o.getClass()) return false;
CameraZone that = (CameraZone) o;
return Objects.equals(id, that.id) && Objects.equals(uuid, that.uuid) && Objects.equals(camera, that.camera) && Objects.equals(type, that.type) && Objects.equals(name, that.name);
}
@Override
public int hashCode() {
return Objects.hash(id, uuid, camera, type, name);
}
}
The code that I have in my Repository implementation looks like this:
public class SpecificationVehicleStoppedRepositoryImpl
implements SpecificationVehicleStoppedRepository {
@Autowired private EntityManager em;
@Autowired ProjectionFactory projectionFactory;
@Override
public List<VehicleStoppedAggregate> getStoppedVehiclesCount(Specification<VehicleStopped> spec) {
CriteriaBuilder builder = em.getCriteriaBuilder();
CriteriaQuery<Tuple> query = builder.createTupleQuery();
Root<VehicleStopped> root = query.from(VehicleStopped.class);
Predicate predicate = spec.toPredicate(root, query, builder);
if (predicate != null) {
query.where(predicate);
}
Path<Number> duration = root.get("durationSeconds");
Path<CameraZone> zone = root.get("cameraZone");
query
.multiselect(zone,
builder.count(root).alias("totalVehicles"),
builder.min(duration).alias("minDuration"),
builder.avg(duration).alias("avgDuration"),
builder.max(duration).alias("maxDuration"))
.groupBy(zone);
List<Tuple> rawResultList = em.createQuery(query).getResultList();
return project(rawResultList, VehicleStoppedAggregate.class);
}
private <P> List<P> project(List<Tuple> results, Class<P> projectionClass) {
return results.stream()
.map(tuple -> {
Map<String, Object> mappedResult = new HashMap<>(tuple.getElements().size());
for (TupleElement<?> element : tuple.getElements()) {
String name = element.getAlias();
mappedResult.put(name, tuple.get(name));
}
return projectionFactory.createProjection(projectionClass, mappedResult);
})
.collect(Collectors.toList());
}
}
The interface-based projection I am trying to populate (using SpelAwareProxyProjectionFactory) is this:
public interface VehicleStoppedAggregate {
CameraZone getCameraZone();
Integer getTotalVehicles();
Double getMinDuration();
Double getAvgDuration();
Double getMaxDuration();
}
The call to getStoppedVehiclesCount() fails with the following error:
ERROR: column "camerazone1_.id" must appear in the GROUP BY clause or be used in an aggregate function
This error is coming from the PostgreSQL database, and rightly so, because the SQL Hibernate generates is incorrect:
select
vehiclesto0_.zone_id as col_0_0_,
count(*) as col_1_0_,
min(vehiclesto0_.duration_seconds) as col_2_0_,
avg(vehiclesto0_.duration_seconds) as col_3_0_,
max(vehiclesto0_.duration_seconds) as col_4_0_,
camerazone1_.id as id1_2_,
camerazone1_.name as name2_2_,
camerazone1_.type as type3_2_,
camerazone1_.uuid as uuid4_2_
from
vehicle_stopped vehiclesto0_
inner join
camera_zone camerazone1_
on vehiclesto0_.zone_id=camerazone1_.id cross
where
vehiclesto0_.start_ts>=?
and vehiclesto0_.start_ts<=?
and 1=1
and 1=1
and 1=1
group by
vehiclesto0_.zone_id
It is not grouping by the other fields it is requesting from the joined table.
If I used a normal class instead of a Tuple it would work, but that would mean creating a class with a huge constructor covering every selected field so that Hibernate can populate it, for example:
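(A rough, untested sketch of that constructor-based alternative, using a hypothetical VehicleStoppedAggregateDto class and the standard CriteriaBuilder.construct(...) expression:)
public class VehicleStoppedAggregateDto {
    private final CameraZone cameraZone;
    private final Long totalVehicles;
    private final Double minDuration;
    private final Double avgDuration;
    private final Double maxDuration;

    public VehicleStoppedAggregateDto(CameraZone cameraZone, Long totalVehicles,
            Double minDuration, Double avgDuration, Double maxDuration) {
        this.cameraZone = cameraZone;
        this.totalVehicles = totalVehicles;
        this.minDuration = minDuration;
        this.avgDuration = avgDuration;
        this.maxDuration = maxDuration;
    }
    // getters omitted
}

CriteriaQuery<VehicleStoppedAggregateDto> query = builder.createQuery(VehicleStoppedAggregateDto.class);
Root<VehicleStopped> root = query.from(VehicleStopped.class);
Path<Double> duration = root.get("durationSeconds");
Path<CameraZone> zone = root.get("cameraZone");
query.select(builder.construct(VehicleStoppedAggregateDto.class,
        zone,
        builder.count(root),
        builder.min(duration),
        builder.avg(duration),
        builder.max(duration)))
    .groupBy(zone);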
Somehow, when I use interface-based projections with Spring's repositories rather than my CriteriaQuery, the same scenario works; they manage to populate the one-to-many relationships just fine.
Is there a way to fix this and make Hibernate ask for the right fields?
I am using Hibernate 5.6.14.Final (as bundled with Spring Boot 2.7.8).
I believe the "solution" is two create two "independent" query roots and join them together:
CriteriaBuilder builder = session.getCriteriaBuilder();
CriteriaQuery<Tuple> query = builder.createTupleQuery();
Root<VehicleStopped> root = query.from(VehicleStopped.class);
// instead of Path<CameraZone> zone = root.get("cameraZone")
Root<CameraZone> zone = query.from(CameraZone.class);
query.where(builder.equal(zone, root.get("cameraZone")));
Path<Number> duration = root.get("durationSeconds");
query
.multiselect(zone,
builder.count(root).alias("totalVehicles"),
builder.min(duration).alias("minDuration"),
builder.avg(duration).alias("avgDuration"),
builder.max(duration).alias("maxDuration"))
.groupBy(zone);
session.createQuery(query).getResultList();
In that case Hibernate 5 produces the following SQL (which actually looks weird from my perspective because of the missing columns in the group by clause):
select
naturalidc1_.id as col_0_0_,
count(*) as col_1_0_,
min(naturalidc0_.duration_seconds) as col_2_0_,
avg(naturalidc0_.duration_seconds) as col_3_0_,
max(naturalidc0_.duration_seconds) as col_4_0_,
naturalidc1_.id as id1_0_,
naturalidc1_.name as name2_0_,
naturalidc1_.type as type3_0_,
naturalidc1_.uuid as uuid4_0_
from
vehicle_stopped naturalidc0_ cross
join
camera_zone naturalidc1_
where
naturalidc1_.id=naturalidc0_.zone_id
group by
naturalidc1_.id
FYI: your initial query does work in Hibernate 6, and the produced SQL looks more correct, though still a bit odd:
select
c1_0.id,
c1_0.name,
c1_0.type,
c1_0.uuid,
count(*),
min(v1_0.duration_seconds),
avg(v1_0.duration_seconds),
max(v1_0.duration_seconds)
from
vehicle_stopped v1_0
join
camera_zone c1_0
on c1_0.id=v1_0.zone_id
group by
1,
2,
3,
4
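If upgrading to Hibernate 6 is not an option, another thing that may be worth trying (an untested sketch, keeping the original association path) is to list the joined entity's attributes explicitly in the group by clause so PostgreSQL accepts the query:
Path<Number> duration = root.get("durationSeconds");
Path<CameraZone> zone = root.get("cameraZone");
query
    .multiselect(zone,
        builder.count(root).alias("totalVehicles"),
        builder.min(duration).alias("minDuration"),
        builder.avg(duration).alias("avgDuration"),
        builder.max(duration).alias("maxDuration"))
    // group by every selected CameraZone column, not just the foreign key
    .groupBy(zone.get("id"), zone.get("uuid"), zone.get("type"), zone.get("name"));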
I'm trying to write a mapping using MapStruct, but I don't know how to map the fields from one structure to the other.
I have the classes below:
class DataDomain {
private List<Domain> data;
}
class Domain {
private String codDist;
private String numFun;
private String txtJust;
private Boolean valPar;
private LocalDateTime dateHr;
private Integer numPn;
}
class DataEntity {
private String codDist;
private String numFun;
private List<ParEntity> pares;
}
class ParEntity {
private String numFun;
private String txtJus;
private String indValPar;
private String dateHr;
private String numPn;
}
interface ParOutMapper{
ParOutMapper INSTANCE = Mappers.getMapper(ParOutMapper.class);
#Mapping(target = "data", source = "entity")
DataDomain map(DataEntity entity);
Domain toDomain(DataEntity entity);
default List<Domain> toList(DataEntity entity) {
return entity != null ? singletonList(toDomain(entity)) : new ArrayList<>();
}
default DataEntity map(DataDomain domain) {
return domain != null
&& domain.getData() != null
&& !domain.getData().isEmpty() ? toEntity(domain.getData().get(0)) : null;
}
DataEntity toEntity(Domain domains);
List<Domain> toDomainList(List<DataEntity> domainList);
}
That's what I've done so far, but the mapping diverges because the two structures are different, and I got lost in how to match the source and target fields one by one.
If someone knows a correct and elegant way to do this, I would be very grateful.
I would suggest the following solution:
@Mapper(unmappedTargetPolicy = ReportingPolicy.ERROR,
componentModel = "spring",
collectionMappingStrategy = CollectionMappingStrategy.ADDER_PREFERRED,
builder = @Builder(disableBuilder = true))
public interface ParOutMapper {
@Mapping(target = "data", source = "entity")
DataDomain map(DataEntity entity);
@Mapping(target = "txtJust", source = "pares", qualifiedByName = "txtJust")
@Mapping(target = "valPar", source = "pares", qualifiedByName = "valPar")
@Mapping(target = "dateHr", source = "pares", qualifiedByName = "dateHr")
@Mapping(target = "numPn", source = "pares", qualifiedByName = "numPn")
Domain toDomain(DataEntity entity);
default List<Domain> toList(DataEntity entity) {
return entity != null ? singletonList(toDomain(entity)) : new ArrayList<>();
}
default DataEntity map(DataDomain domain) {
return domain != null
&& domain.getData() != null
&& !domain.getData().isEmpty() ? toEntity(domain.getData().get(0)) : null;
}
#Mapping(target = "pares", ignore = true)
DataEntity toEntity(Domain domains);
List<Domain> toDomainList(List<DataEntity> domainList);
@AfterMapping
default DataEntity valuesToList(Domain domains, @MappingTarget DataEntity dataEntity){
ParEntity parEntity = new ParEntity();
parEntity.setDateHr(domains.getDateHr().toString()); // alternatively, call a custom entity-to-list mapping here
parEntity.setTxtJus(domains.getTxtJust());
parEntity.setNumPn(domains.getNumPn().toString());
parEntity.setNumFun(domains.getNumFun());
parEntity.setIndValPar(domains.getValPar().toString());
dataEntity.setPares(List.of(parEntity));
return dataEntity;
}
#Named("txtJust")
default String mapTxtJust(List<ParEntity> pares) {
return pares.get(0).getTxtJus(); // or custom mapping logic here
}
#Named("valPar")
default Boolean mapValPar(List<ParEntity> pares) {
return Boolean.valueOf(pares.get(0).getIndValPar()); // or custom mapping logic here
}
#Named("dateHr")
default LocalDateTime mapDateHr(List<ParEntity> pares) {
DateTimeFormatter formatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm");
return LocalDateTime.parse(pares.get(0).getDateHr(),formatter); // or custom mapping logic here
}
#Named("numPn")
default Integer mapNumPn(List<ParEntity> pares) {
return Integer.valueOf(pares.get(0).getNumPn()); // or custom mapping logic here
}
}
Since you tagged your question with spring-boot, I assume you are using it; therefore I would suggest using the component model provided by MapStruct in its configuration.
I am unsure how you want to map the list to the entity or the entity to the list. With my approach you can do it value by value or with the entire list; both work either way.
The solution compiles and works for DataEntity toEntity(Domain domains); and Domain toDomain(DataEntity entity);. I did not notice any other problems, since MapStruct is able to generate the required mappings.
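For completeness, a minimal usage sketch with the Spring component model (the service and method names here are just illustrative):
@Service
public class ParService {

    private final ParOutMapper parOutMapper;

    public ParService(ParOutMapper parOutMapper) {
        this.parOutMapper = parOutMapper;
    }

    public DataDomain toDomain(DataEntity entity) {
        // the generated mapping wraps the single entity into DataDomain.data
        return parOutMapper.map(entity);
    }

    public DataEntity toEntity(DataDomain domain) {
        // the default map(DataDomain) picks the first element of data
        return parOutMapper.map(domain);
    }
}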
I have 2 micro-services: let's call them A and B.
The entities handled by B that reference an entity of A hold that reference as a simple Long id (e.g. the groupId):
@Getter
@Setter
@NoArgsConstructor
@Entity
@Table(name = "WorkShifts")
@AllArgsConstructor
@Builder(toBuilder = true)
public class WorkShiftEntity extends BaseEntitySerial {
@Column(name = "group_id")
private Long groupId;
private String description;
@Column(name = "start_time")
private Time startTime;
@Column(name = "end_time")
private Time endTime;
@Column(name = "work_shift")
private Integer workShift;
}
What I want to achieve is to populate the missing group data of B (held by A) using MapStruct.
So far I have tried to use an @AfterMapping method to request the missing data from A. My mapper is:
#Mapper(componentModel = "spring", uses = LocalTimeMapper.class, builder = #Builder(disableBuilder = true), config = CommonMapperConfig.class)
public abstract class WorkShiftMapper extends BaseMapper implements IBaseMapper<WorkShiftEntity, WorkShiftDTO, Long>, LogSupport {
#Autowired
RestTemplate restTemplate;
#Mapping(target = "groupId", source = "group.id")
public abstract WorkShiftEntity dtoToEntity(WorkShiftDTO workShiftDTO);
#AfterMapping
public void afterEntityToDto(final WorkShiftEntity workShiftEntity, #MappingTarget final WorkShiftDTO workShiftDTO) {
if (workShiftEntity == null) {
return;
}
GroupDTO groupDTO = EcofinderUtils.getGroupDTO(restTemplate, workShiftEntity.getGroupId());
try {
GenericUtils.enhanceDTOForAttributeWithDTO(workShiftDTO, groupDTO, "group");
} catch (InvocationTargetException | NoSuchMethodException | IllegalAccessException e) {
getLogger().error(e.getMessage());
}
}
@AfterMapping
public void afterEntityToDtoList(final List<WorkShiftEntity> workShiftEntity, @MappingTarget final List<WorkShiftDTO> workShiftDTO) {
if (workShiftEntity == null || workShiftEntity.size() == 0) {
return;
}
List<GroupDTO> groups = EcofinderUtils.getAllGroupDTOById(restTemplate, workShiftEntity.stream().map(WorkShiftEntity::getGroupId).distinct().toList());
try {
//Compile the resulting DTOs with the data got from the registry
//Group
GenericUtils.enhanceDTOListWithDataFromDTOJoiningEntities(workShiftDTO, groups, workShiftEntity, "group", "groupId");
} catch (AppException | InvocationTargetException | NoSuchMethodException | IllegalAccessException e) {
getLogger().error(e.getMessage());
}
}
}
The implemented interface which gives me the mapping functions is:
public interface IBaseMapper<T extends BaseEntity<K>, D extends IBaseDTO<K>, K extends Serializable> {
D entityToDto(T entity);
List<D> entityToDtoList(List<T> entity);
T dtoToEntity(D dto);
}
The problem with the generated code is that the method that maps a list of entities to a list of DTOs calls entityToDto for every entity, resulting in n requests to A. After that, the other @AfterMapping method is called (the one that collects all the ids and pulls all the data in a single request, which is the only one that should be used when mapping a list).
//GENERATED CODE BY MAPSTRUCT
@Override
public WorkShiftDTO entityToDto(WorkShiftEntity workShiftEntity) {
if ( workShiftEntity == null ) {
return null;
}
WorkShiftDTO workShiftDTO = new WorkShiftDTO();
workShiftDTO.setId( workShiftEntity.getId() );
workShiftDTO.setDescription( workShiftEntity.getDescription() );
workShiftDTO.setStartTime( localTimeMapper.map( workShiftEntity.getStartTime() ) );
workShiftDTO.setEndTime( localTimeMapper.map( workShiftEntity.getEndTime() ) );
workShiftDTO.setWorkShift( workShiftEntity.getWorkShift() );
afterEntityToDto( workShiftEntity, workShiftDTO );
return workShiftDTO;
}
@Override
public List<WorkShiftDTO> entityToDtoList(List<WorkShiftEntity> entity) {
//more code...
List<WorkShiftDTO> list = new ArrayList<WorkShiftDTO>( entity.size() );
for ( WorkShiftEntity workShiftEntity : entity ) {
list.add( entityToDto( workShiftEntity ) );
}
afterEntityToDtoList( entity, list );
return list;
}
Is there a way to make MapStruct implement the entityToDto method twice, where one version calls the @AfterMapping method and the other doesn't, so that entityToDtoList uses the version without the @AfterMapping call?
Something like:
@Override
public WorkShiftDTO entityToDto(WorkShiftEntity workShiftEntity) {
if ( workShiftEntity == null ) {
return null;
}
WorkShiftDTO workShiftDTO = new WorkShiftDTO();
workShiftDTO.setId( workShiftEntity.getId() );
workShiftDTO.setDescription( workShiftEntity.getDescription() );
workShiftDTO.setStartTime( localTimeMapper.map( workShiftEntity.getStartTime() ) );
workShiftDTO.setEndTime( localTimeMapper.map( workShiftEntity.getEndTime() ) );
workShiftDTO.setWorkShift( workShiftEntity.getWorkShift() );
afterEntityToDto( workShiftEntity, workShiftDTO );
return workShiftDTO;
}
public WorkShiftDTO entityToDtoNoAfter(WorkShiftEntity workShiftEntity) {
if ( workShiftEntity == null ) {
return null;
}
WorkShiftDTO workShiftDTO = new WorkShiftDTO();
workShiftDTO.setId( workShiftEntity.getId() );
workShiftDTO.setDescription( workShiftEntity.getDescription() );
workShiftDTO.setStartTime( localTimeMapper.map( workShiftEntity.getStartTime() ) );
workShiftDTO.setEndTime( localTimeMapper.map( workShiftEntity.getEndTime() ) );
workShiftDTO.setWorkShift( workShiftEntity.getWorkShift() );
return workShiftDTO;
}
@Override
public List<WorkShiftDTO> entityToDtoList(List<WorkShiftEntity> entity) {
//more code...
List<WorkShiftDTO> list = new ArrayList<WorkShiftDTO>( entity.size() );
for ( WorkShiftEntity workShiftEntity : entity ) {
list.add( entityToDtoNoAfter( workShiftEntity ) );
}
afterEntityToDtoList( entity, list );
return list;
}
Other approaches are welcome; this one just feels the most natural to me.
Thanks in advance!
After around two days of intensive research and many attempts, I think I have found a decent solution.
I can give the methods a name, and by doing so they effectively get a specific scope.
The mapper interface becomes:
public interface IBaseMapper<T extends BaseEntity<K>, D extends IBaseDTO<K>, K extends Serializable> {
@BeanMapping(qualifiedByName = "EntityToDTO")
D entityToDto(T entity);
@Named("EntityToDTOList")
public abstract D entityToDTOListEntity(T entity);
@IterableMapping(qualifiedByName = "EntityToDTOList")
List<D> entityToDtoList(List<T> entity);
T dtoToEntity(D dto);
}
That way I can group together all the methods I need.
The mapper abstract class:
#Mapping(target = "groupId", source = "group.id")
public abstract WorkShiftEntity dtoToEntity(WorkShiftDTO workShiftDTO);
#AfterMapping
#Named("EntityToDTO")
public void afterEntityToDto(final WorkShiftEntity workShiftEntity, #MappingTarget final WorkShiftDTO workShiftDTO) {
if (workShiftEntity == null) {
return;
}
GroupDTO groupDTO = EcofinderUtils.getGroupDTO(restTemplate, workShiftEntity.getGroupId());
try {
GenericUtils.enhanceDTOForAttributeWithDTO(workShiftDTO, groupDTO, "group");
} catch (InvocationTargetException | NoSuchMethodException | IllegalAccessException e) {
getLogger().error(e.getMessage());
}
}
@AfterMapping
@Named("EntityToDTOList")
public void afterEntityToDtoList(final List<WorkShiftEntity> workShiftEntity, @MappingTarget final List<WorkShiftDTO> workShiftDTO) {
if (workShiftEntity == null || workShiftEntity.size() == 0) {
return;
}
List<GroupDTO> groups = EcofinderUtils.getAllGroupDTOById(restTemplate, workShiftEntity.stream().map(WorkShiftEntity::getGroupId).distinct().toList());
try {
//Compile the resulting DTOs with the data got from the registry
//Group
GenericUtils.enhanceDTOListWithDataFromDTOJoiningEntities(workShiftDTO, groups, workShiftEntity, "group", "groupId");
} catch (AppException | InvocationTargetException | NoSuchMethodException | IllegalAccessException e) {
getLogger().error(e.getMessage());
}
}
By doing so, the generated Impl class is:
@Generated(
value = "org.mapstruct.ap.MappingProcessor",
date = "2022-05-20T15:08:38+0200",
comments = "version: 1.4.2.Final, compiler: javac, environment: Java 17.0.1 (Oracle Corporation)"
)
@Component
public class WorkShiftMapperImpl extends WorkShiftMapper {
@Autowired
private LocalTimeMapper localTimeMapper;
@Override
public WorkShiftDTO entityToDto(WorkShiftEntity entity) {
if ( entity == null ) {
return null;
}
WorkShiftDTO workShiftDTO = new WorkShiftDTO();
workShiftDTO.setId( entity.getId() );
workShiftDTO.setDescription( entity.getDescription() );
workShiftDTO.setStartTime( localTimeMapper.map( entity.getStartTime() ) );
workShiftDTO.setEndTime( localTimeMapper.map( entity.getEndTime() ) );
workShiftDTO.setWorkShift( entity.getWorkShift() );
afterEntityToDto( entity, workShiftDTO );
return workShiftDTO;
}
@Override
public WorkShiftDTO entityToDTOListEntity(WorkShiftEntity entity) {
if ( entity == null ) {
return null;
}
WorkShiftDTO workShiftDTO = new WorkShiftDTO();
workShiftDTO.setId( entity.getId() );
workShiftDTO.setDescription( entity.getDescription() );
workShiftDTO.setStartTime( localTimeMapper.map( entity.getStartTime() ) );
workShiftDTO.setEndTime( localTimeMapper.map( entity.getEndTime() ) );
workShiftDTO.setWorkShift( entity.getWorkShift() );
return workShiftDTO;
}
@Override
public List<WorkShiftDTO> entityToDtoList(List<WorkShiftEntity> entity) {
if ( entity == null ) {
return null;
}
List<WorkShiftDTO> list = new ArrayList<WorkShiftDTO>( entity.size() );
for ( WorkShiftEntity workShiftEntity : entity ) {
list.add( entityToDTOListEntity( workShiftEntity ) );
}
afterEntityToDtoList( entity, list );
return list;
}
@Override
public WorkShiftEntity dtoToEntity(WorkShiftDTO workShiftDTO) {
if ( workShiftDTO == null ) {
return null;
}
WorkShiftEntity workShiftEntity = new WorkShiftEntity();
workShiftEntity.setGroupId( workShiftDTOGroupId( workShiftDTO ) );
workShiftEntity.setId( workShiftDTO.getId() );
workShiftEntity.setDescription( workShiftDTO.getDescription() );
workShiftEntity.setStartTime( localTimeMapper.map( workShiftDTO.getStartTime() ) );
workShiftEntity.setEndTime( localTimeMapper.map( workShiftDTO.getEndTime() ) );
workShiftEntity.setWorkShift( workShiftDTO.getWorkShift() );
return workShiftEntity;
}
private Long workShiftDTOGroupId(WorkShiftDTO workShiftDTO) {
if ( workShiftDTO == null ) {
return null;
}
GroupDTO group = workShiftDTO.getGroup();
if ( group == null ) {
return null;
}
Long id = group.getId();
if ( id == null ) {
return null;
}
return id;
}
}
This is exactly what I was looking for.
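A short usage sketch of the effect (variable names are illustrative):
// a single entity still goes through the per-item @AfterMapping (one request to A)
WorkShiftDTO single = workShiftMapper.entityToDto(entity);

// a list now goes through entityToDTOListEntity, so only the list-level
// afterEntityToDtoList runs and the group data is fetched in one bulk request
List<WorkShiftDTO> all = workShiftMapper.entityToDtoList(entities);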
MapStruct is not using a mapper declared in the "uses" attribute of the @Mapper annotation.
I expected productCategory and productPrices to be mapped by the mappers ProductPricesMapper.class and ProductCategoryMapper.class declared below, but the generated code does not use them.
ProductMapper Interface
#Mapper(componentModel = "spring", nullValuePropertyMappingStrategy = NullValuePropertyMappingStrategy.IGNORE, uses = { ``
ProductPricesMapper.class, ProductCategoryMapper.class })`
public interface ProductMapper {`
#Mapping(target = "lastUpdatedTime", ignore = true)
#Mapping(target = "creationTime", ignore = true)
Product createEntityFromDto(ProductDto productDto);
}
Generated Code:
@Autowired
private ProductPricesMapper productPricesMapper;
@Autowired
private ProductCategoryMapper productCategoryMapper;
@Override
public Product createEntityFromDto(ProductDto productDto) {
if ( productDto == null ) {
return null;
}
Product product = new Product();
product.setDescription( productDto.getDescription() );
product.setId( productDto.getId() );
product.setName( productDto.getName() );
product.setProductCategory( productCategoryDtoToProductCategory(
productDto.getProductCategory() ) );
product.setProductPrices( productPricesDtoToProductPrices( productDto.getProductPrices() ) );
return product;
}
Expected Generated Code
@Component
public class ProductMapperImpl implements ProductMapper {
@Autowired
private ProductPricesMapper productPricesMapper;
@Autowired
private ProductCategoryMapper productCategoryMapper;
@Override
public Product createEntityFromDto(ProductDto productDto) {
if ( productDto == null ) {
return null;
}
Product product = new Product();
product.setDescription( productDto.getDescription() );
product.setId( productDto.getId() );
product.setName( productDto.getName() );
product.setProductCategory(productCategoryMapper.createEntityFromDto( productDto.getProductCategory() ));
product.setProductPrices( productPricesMapper.createEntityFromDto(productDto.getProductPrices()));
return product;
}
CODE FOR PRODUCT CATEGORY DTO TO PRODUCT CATEGORY
protected ProductCategory productCategoryDtoToProductCategory(ProductCategoryDto productCategoryDto) {
if ( productCategoryDto == null ) {
return null;
}
ProductCategory productCategory = new ProductCategory();
productCategory.setDescription( productCategoryDto.getDescription() );
productCategory.setId( productCategoryDto.getId() );
productCategory.setName( productCategoryDto.getName() );
return productCategory;
}
CODE FOR PRODUCT PRICES DTO TO PRODUCT PRICES
protected ProductPrices productPricesDtoToProductPrices(ProductPricesDto productPricesDto) {
if ( productPricesDto == null ) {
return null;
}
ProductPrices productPrices = new ProductPrices();
productPrices.setBuyQuantityOffer( productPricesDto.getBuyQuantityOffer() );
productPrices.setDiscount( productPricesDto.getDiscount() );
productPrices.setGetQuantityOffer( productPricesDto.getGetQuantityOffer() );
productPrices.setGetQuantityOfferProduct( productDtoToProduct( productPricesDto.getGetQuantityOfferProduct() ) );
productPrices.setId( productPricesDto.getId() );
productPrices.setSellingPrice( productPricesDto.getSellingPrice() );
return productPrices;
}
ANOTHER SCENARIO
When I have the following code in the ProductMapper, but in the opposite direction, the generated code is okay, so what is making the previous one fail?
CODE ADDED TO PRODUCT MAPPER
ProductDto createDtoFromEntity(Product product);
CORRECT GENERATED CODE
@Override
public ProductDto createDtoFromEntity(Product product) {
if ( product == null ) {
return null;
}
ProductDto productDto = new ProductDto();
productDto.setDescription( product.getDescription() );
productDto.setId( product.getId() );
productDto.setName( product.getName() );
productDto.setProductCategory( productCategoryMapper.createDtoFromEntity( product.getProductCategory() ) );
productDto.setProductPrices( productPricesMapper.createDtoFromEntity( product.getProductPrices() ) );
return productDto;
}
We have a POJO class with Optional getter fields. MapStruct 1.3.0.Final generates wrong code for the Optional collection field.
We have a ProgramAggregate POJO which contains a Collection of Program (Collection<Program>) exposed through an Optional getter.
When we run with MapStruct 1.2.0.Final, we see proper code generation.
But the same code with 1.3.0.Final generates wrong code: it does not generate the collection mapping method for Optional collection getter methods.
@Data
@NoArgsConstructor
@ToString(callSuper = true)
@FieldDefaults(makeFinal = false, level = AccessLevel.PRIVATE)
public class ProgramAggregate {
public static final long serialVersionUID = 1L;
Collection<Program> programs;
public Optional<Collection<Program>> getPrograms() {
return Optional.ofNullable(programs);
}
}
@Data
@NoArgsConstructor
@FieldDefaults(makeFinal = false, level = AccessLevel.PRIVATE)
@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, property = "_type")
public class Program {
String name;
String number;
public Optional<String> getName() {
return Optional.ofNullable(name);
}
public Optional<String> getNumber() {
return Optional.ofNullable(number);
}
}
@Data
@NoArgsConstructor
@FieldDefaults(makeFinal = false, level = AccessLevel.PRIVATE)
public class ProgramResponseDto {
Collection<ProgramDto> programs;
public Optional<Collection<ProgramDto>> getPrograms() {
return Optional.ofNullable(programs);
}
}
@Data
@NoArgsConstructor
@FieldDefaults(makeFinal = false, level = AccessLevel.PRIVATE)
@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, property = "_type")
public class ProgramDto {
String name;
String number;
String oldNumber;
public Optional<String> getName() {
return Optional.ofNullable(name);
}
public Optional<String> getNumber() {
return Optional.ofNullable(number);
}
public Optional<String> getOldNumber() {
return Optional.ofNullable(oldNumber);
}
}
@Mapper(nullValueCheckStrategy = NullValueCheckStrategy.ALWAYS, unmappedTargetPolicy = ReportingPolicy.WARN,
collectionMappingStrategy = CollectionMappingStrategy.TARGET_IMMUTABLE)
public interface IProgramMapper extends IOptionalMapper, IDefaultMapper {
ProgramResponseDto map(ProgramAggregate programAggregate);
ProgramDto map(Program sourceProgramDto);
Collection<ProgramDto> mapPrograms(Collection<Program> sourcePrograms);
}
Result:
@Generated(
value = "org.mapstruct.ap.MappingProcessor",
date = "2020-10-12T13:10:32+0530",
comments = "version: 1.3.0.Final, compiler: javac, environment: Java 1.8.0_231 (Oracle Corporation)"
)
public class IProgramMapperImpl implements IProgramMapper {
@Override
public ProgramResponseDto map(ProgramAggregate programAggregate) {
if ( programAggregate == null ) {
return null;
}
ProgramResponseDto programResponseDto = new ProgramResponseDto();
Collection<ProgramDto> collection = fromOptional( programAggregate.getPrograms() );
if ( collection != null ) {
programResponseDto.setPrograms( collection );
}
return programResponseDto;
}
@Override
public ProgramDto map(Program sourceProgramDto) {
if ( sourceProgramDto == null ) {
return null;
}
ProgramDto programDto = new ProgramDto();
if ( sourceProgramDto.getName() != null ) {
programDto.setName( fromOptional( sourceProgramDto.getName() ) );
}
if ( sourceProgramDto.getNumber() != null ) {
programDto.setNumber( fromOptional( sourceProgramDto.getNumber() ) );
}
return programDto;
}
@Override
public Collection<ProgramDto> mapPrograms(Collection<Program> sourcePrograms) {
if ( sourcePrograms == null ) {
return null;
}
Collection<ProgramDto> collection = new ArrayList<ProgramDto>( sourcePrograms.size() );
for ( Program program : sourcePrograms ) {
collection.add( map( program ) );
}
return collection;
}
}
Below is the error after the Maven build.
Error:
[ERROR] //mapstruct_latest_example/target/generated-sources/annotations/IProgramMapperImpl.java:[21,57] incompatible types: inference variable T has incompatible bounds
[ERROR] equality constraints: java.util.Collection
[ERROR] upper bounds: java.util.Collection,java.lang.Object
Issue:
Allow generic of generics in types when matching, e.g. a method signature such as
<T> T fromOptional(Optional<T> optional)
where T resolves to another generic type such as Collection.
This might be resolved in MapStruct version 1.4.2.
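Until you are on a version where that is fixed, one possible workaround (an untested sketch; the method name is made up) is to give the mapper an explicit, non-generic method for the Optional collection so the generic fromOptional is no longer needed for it:
default Collection<ProgramDto> mapOptionalPrograms(Optional<Collection<Program>> sourcePrograms) {
    // delegate to the existing collection mapping; return null when the Optional is empty
    return sourcePrograms.map(this::mapPrograms).orElse(null);
}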