nullValuePropertyMappingStrategy not working - java

I have the following mapper:

@Mapper(config = MappingConfig.class)
public interface PokerRoomMapper {
    @Mapping(target = "phase", nullValuePropertyMappingStrategy = NullValuePropertyMappingStrategy.IGNORE)
    PokerRoom pokerRoomDtoToPokerRoom(PokerRoomDto pokerRoomDto);
}
The PokerRoomDto passed to it has a "phase" field that can be null. I want this field to be ignored when it is null, but right now the null value still gets mapped onto the PokerRoom entity.
If I simply ignore the field in the mapper, it works and the default value for phase in PokerRoom stays untouched; however, I don't want to always ignore it.
@Mapper(config = MappingConfig.class)
public interface PokerRoomMapper {
    @Mapping(target = "phase", ignore = true)
    PokerRoom pokerRoomDtoToPokerRoom(PokerRoomDto pokerRoomDto);
}

This works as designed. NullValuePropertyMappingStrategy is only applied to update methods (methods with an @MappingTarget parameter); it is not used for normal mappings.
I think you are looking for NullValueCheckStrategy: if you use NullValueCheckStrategy#ALWAYS, MapStruct will always do a null check on the PokerRoomDto and only invoke the setter on the PokerRoom if the value was not null.
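Assuming the PokerRoom from the question with a phase defaulting to 0, the effect of NullValueCheckStrategy.ALWAYS can be sketched as handwritten Java. This is an illustrative sketch of the behavior, not the actual MapStruct-generated code:

```java
class PokerRoomDto {
    private Integer phase;

    Integer getPhase() { return phase; }
    void setPhase(Integer phase) { this.phase = phase; }
}

class PokerRoom {
    private Integer phase = 0; // default survives when the setter is never called

    Integer getPhase() { return phase; }
    void setPhase(Integer phase) { this.phase = phase; }
}

// Sketch of what the ALWAYS strategy amounts to: the setter is only
// invoked when the source value is non-null.
class PokerRoomMapperImpl {
    PokerRoom pokerRoomDtoToPokerRoom(PokerRoomDto dto) {
        if (dto == null) {
            return null;
        }
        PokerRoom room = new PokerRoom();
        if (dto.getPhase() != null) {
            room.setPhase(dto.getPhase());
        }
        return room;
    }
}
```

With this shape, a DTO whose phase is null leaves the entity's default of 0 in place, while a non-null phase is copied over as usual.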

If you initialise your field at declaration and want to keep that value, I've come up with a solution. It's a bit hacky and not very general (it depends on the generated variable name), but it works.
Assuming:

class PokerRoom {
    Integer phase = 0;
}

You can use:

@Mapping(target = "phase", defaultExpression = "java( pokerRoom.getPhase() )")
PokerRoom pokerRoomDtoToPokerRoom(PokerRoomDto pokerRoomDto);
A simpler solution would be to use the same constant you use at the field declaration:

@Mapping(target = "phase", defaultValue = "0")
PokerRoom pokerRoomDtoToPokerRoom(PokerRoomDto pokerRoomDto);

Related

Map target property with no source property, while avoiding constant and expression

I have a personal war against writing Java code in strings. What I'm trying to achieve is 100% doable with an expression. I would, however, like to find a way of doing it without one.
Situation:

class TargetJpa {
    ...
    Instant createdAt;
}
What I want to do:

// using 1.5.2.Final
@Mapper(...)
public interface JpaMapper {
    @Mapping(target = "createdAt", qualifiedByName = "now")
    TargetJpa toJpa(Source s);

    @Named("now")
    default Instant now() {
        return Instant.now();
    }
}
This obviously does not work: it complains that I'm missing a source. Which is correct, I don't need a source, I'm generating a value. But it can't be a constant, and I personally wish to use Java code in strings as little as humanly possible.
Of course, there is an obvious workaround: pass a dummy source value.
@Mapper(...)
public interface JpaMapper {
    @Mapping(target = "createdAt", source = "s", qualifiedByName = "now")
    TargetJpa toJpa(Source s);

    @Named("now")
    default Instant now(Source s) {
        return Instant.now();
    }
}
But this has a huge disadvantage. If I didn't have to name a source, I could remove this "now" from the interface altogether, make an "InstantMapper" and reuse the code:
@InstantMapper
public class InstantMapperImpl {
    @Now
    public Instant now() {
        return Instant.now();
    }
}
Then use it on anything:

... uses = { ..., InstantMapperImpl.class }
...
@Mapping(target = "createdAt", qualifiedBy = {InstantMapper.class, Now.class})
TargetJpa1 toJpa(Source1 s);

@Mapping(target = "createdAt", qualifiedBy = {InstantMapper.class, Now.class})
TargetJpa2 toJpa(Source2 s);

@Mapping(target = "createdAt", qualifiedBy = {InstantMapper.class, Now.class})
TargetJpa3 toJpa(Source3 s);
Any way to achieve this without tying myself to some source that I simply don't need?
MapStruct currently doesn't support using an arbitrary method to map into a specific property. When using Mapping#target you have a few options for specifying the source of the mapping:
source - map from a specific source property
expression - map using an expression
constant - map using some constant value
ignore - ignore the mapping
All the rest of the annotation's properties are in one way or another used for configuring other aspects of the mapping.
When none of those four options is provided, it is assumed that the source property has the same name as the target.
e.g.

@Mapping(target = "createdAt", qualifiedByName = "now")

actually means

@Mapping(target = "createdAt", source = "createdAt", qualifiedByName = "now")
If you are interested in seeing something like this in MapStruct, I would suggest raising a feature request.

Map null values to default using builder with MapStruct

I want to map a field from the Source to the Target class, and if the source value is null, I would like to convert it to a default value based on the data type ("" for strings, 0 for numeric types, etc.). For setting the values I am not using regular setters but a builder (with protobuf, so the methods are newBuilder() and build()).
class Source {
    private final String value; // getter
}

class Target {
    private final String value;

    public static Builder newBuilder() { return new Builder(); }

    public static class Builder {
        public Builder setValue(String value) { /* Set the field */ }
        public Target build() { /* Return the constructed instance */ }
    }
}
My mapper looks like this:

@Mapper(
    nullValuePropertyMappingStrategy = NullValuePropertyMappingStrategy.SET_TO_DEFAULT,
    nullValueMappingStrategy = NullValueMappingStrategy.RETURN_DEFAULT
)
public interface TargetMapper {
    Target map(Source source);
}
The generated mapper implementation with this code calls target.setValue(source.getValue()) instead of performing the null check and setting the default value if the source returns null. The interesting part is that when I add the following annotation to the map method, the null check is present in the implementation:

@Mapping(source = "value", target = "value", nullValuePropertyMappingStrategy = NullValuePropertyMappingStrategy.SET_TO_DEFAULT)
Is this a bug in MapStruct with builders, or am I missing some configuration to be able to set the null mapping as a default policy, instead of duplicating it on all field mappings?
EDIT: For some reason, adding nullValueCheckStrategy = NullValueCheckStrategy.ALWAYS to the class-level @Mapper annotation adds the null check, but does not explicitly set the value; it just skips the call to setValue. For protobuf this is okay, since this functionality is in the library, but for other implementations the field would remain null.
@Mapping(source = "value", target = "value", nullValuePropertyMappingStrategy = NullValuePropertyMappingStrategy.SET_TO_DEFAULT)

applies to update methods (methods that have an @MappingTarget-annotated parameter).
There's no real counterpart for regular methods:
1. NullValueMappingStrategy applies to the bean argument itself.
2. NullValueCheckStrategy does perform a check on bean properties, but does not return a default.
The naming is not really brilliant and it has a long history. We still have the intention to align this one day.
A solution would be to use an object factory that creates the builder target object and pre-populates it with default values, and then let MapStruct override these.
Perhaps you could do something like this:

@Mapper(
    // to perform a null check
    nullValueCheckStrategy = NullValueCheckStrategy.ALWAYS
)
public interface TargetMapper {
    Target map(Source source);

    // to create a pre-defined object (defaults set a priori). Not sure
    // whether this works with builders.. just try
    @ObjectFactory
    default Target.Builder create() {
        Target.Builder builder = Target.newBuilder();
        builder.setValue( "someDefaultValue" );
        return builder;
    }
}
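Putting the factory and the null check together, the resulting behavior can be sketched as handwritten Java with no MapStruct involved. This is an illustrative sketch: the Target builder mirrors the protobuf-style class from the question, and map() takes a plain String where the real mapper would take a Source bean:

```java
// Protobuf-style value class with a builder, mirroring the question.
class Target {
    private final String value;

    private Target(String value) { this.value = value; }

    String getValue() { return value; }

    static Builder newBuilder() { return new Builder(); }

    static class Builder {
        private String value;

        Builder setValue(String value) {
            this.value = value;
            return this;
        }

        Target build() { return new Target(value); }
    }
}

// Sketch of the combined effect: the factory hands the mapper a builder whose
// defaults are already set, and the ALWAYS null check skips the setter for
// null sources, so the pre-populated default survives.
class TargetMapperImpl {
    private Target.Builder create() { // stands in for the @ObjectFactory
        return Target.newBuilder().setValue("someDefaultValue");
    }

    Target map(String sourceValue) {
        Target.Builder builder = create();
        if (sourceValue != null) {
            builder.setValue(sourceValue);
        }
        return builder.build();
    }
}
```

A null source then yields "someDefaultValue" in the built Target, while a non-null source overrides it.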

Mapstruct, mapping to nested objects from several input parameters

Given a set of five objects:

KeyDto { String id }
ValueDto { String name, String value, String description }
Key { String id, String name }
Value { String value, String description }
Target { Key key, Value value }
I would like to create a mapper with two parameters:
Target dtosToTarget(KeyDto keyDto, ValueDto valueDto);
However, just defining helper methods for Key and Value seems not to be enough:

@Mapping(source = "keyDto.id", target = "id")
@Mapping(source = "valueDto.name", target = "name")
Key keyDtoAndValueDtoToKey(KeyDto keyDto, ValueDto valueDto);

Value valueDtoToValue(ValueDto valueDto);
This gives an error on the actual dtosToTarget method:
Error:(17, 19) java: Can't map property "java.lang.String value" to "mapping.Value value". Consider to declare/implement a mapping method: "mapping.Value map(java.lang.String value)".
The only solution I could think of is defining custom Java expressions to call the necessary methods, like:

@Mapping(target = "key", expression = "java(keyDtoAndValueDtoToKey(keyDto, valueDto))")
@Mapping(target = "value", expression = "java(valueDtoToValue(valueDto))")
Is there a cleaner approach?
The error you are seeing occurs because, by default, MapStruct will try to map valueDto.value into Target.value, which is a String-to-Value mapping.
However, you can configure it like this:
@Mapper
public interface MyMapper {
    @Mapping(target = "key.id", source = "keyDto.id")
    @Mapping(target = "key.name", source = "valueDto.name")
    @Mapping(target = "value", source = "valueDto")
    Target dtosToTarget(KeyDto keyDto, ValueDto valueDto);

    Value valueDtoToValue(ValueDto valueDto);
}
Try:

@Mapping(source = "valueDto.name", target = "name")
void keyDtoAndValueDtoToKey(@MappingTarget Key key, ValueDto valueDto);

This will keep all fields of Key as they are and map the required fields from valueDto as configured.

Json to object deserialization issue in Graphql-spqr

JSON to GraphQLArgument object conversion is failing in graphql-spqr.
I tried adding @GraphQLInterface (with autodiscovery true and scan package) to the abstract classes below,
and @GraphQLType to all concrete classes.
My GraphQL query:

query contactsQuery($searchQuery : QueryInput) { contacts(searchQuery:$searchQuery){id}}

variables:

{"searchQuery":{"bool":{"conditions":[{"must":{"matches":[{"singleFieldMatch":{"boost":null,"field":"firstname","value":"siddiq"}}],"bool":null}}]}}
Java code:

@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, include = JsonTypeInfo.As.WRAPPER_OBJECT)
@JsonSubTypes({@Type(value = Must.class, name = "must"), @Type(value = MustNot.class, name = "mustNot")})
public abstract class Condition

@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, include = JsonTypeInfo.As.WRAPPER_OBJECT)
@JsonSubTypes({@Type(value = SingleFieldMatch.class, name = "singleFieldMatch"), @Type(value = MultiFieldMatch.class, name = "multiFieldMatch")})
public abstract class Match

@GraphQLQuery(name = "contacts")
public List getContacts(@GraphQLArgument(name = "searchQuery") Query query)
It's still throwing an "unknown field" error, and I'm not sure which configuration is missing.
I'm building the GraphQLSchema with AnnotatedResolverBuilder, a JacksonValueMapperFactory configured with the base package, and singleton services.
Hi, this may be a similar issue to the one I ended up having.
Initially I had the following:

@JsonTypeInfo(use = Id.NAME, include = As.PROPERTY, property = "type")
@GraphQLInterface(name = "AbstractClass", implementationAutoDiscovery = true)
public abstract class AbstractClass {

with the following query called:

addNewObject(object: {name: "soft2", id: "asdas"})

To get the conversion functioning, I needed to make the following change:

@JsonTypeInfo(use = Id.NAME, include = As.EXISTING_PROPERTY, property = "type")
@GraphQLInterface(name = "AbstractClass", implementationAutoDiscovery = true)
public abstract class AbstractClass {
    private String type = this.getClass().getSimpleName();
    /**
     * @return the type
     */
    @GraphQLQuery(name = "type", description = "The concrete type of the node. This should match the initialised class. E.g. \"Concrete\", \"DecafCoffee\"")
    public String getType() {
        return type;
    }
with the query now being:
addNewConcreteObject(concrete: {name: "soft2", id: "asdas", type: "Concrete"})
Why this worked (I think):
When converting from JSON to objects in my code using the Jackson converter (ObjectMapper), I had previously noticed that the JSON required knowledge of which class to convert to. Thus the initial use of @JsonTypeInfo(use = Id.NAME, include = As.PROPERTY, property = "type") put a type property in the JSON when it was written to a string.
The inclusion of the @JsonTypeInfo annotation may be picked up by SPQR, which then seems to use a Jackson converter to try to convert your query to the required object.
If I am right, here is the issue.
As the query doesn't contain type, it cannot be correctly converted. Moreover, as the type property was not a member variable of the object but was instead only added by the ObjectMapper, SPQR didn't pick it up, so it wasn't part of the schema for the object. To get around this, I added type as a member variable which is always equal to the actual class, then changed my JsonTypeInfo to look for an existing property.
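The self-describing type field can be sketched without any framework. The field initializer runs during construction of the concrete subclass, and getClass() already returns the runtime class at that point, so each instance carries its own class name (class names below are illustrative, not from the question):

```java
// Each instance stores its concrete class name in a real member variable,
// so both serializers and schema generators see the "type" property.
abstract class Node {
    private final String type = this.getClass().getSimpleName();

    public String getType() {
        return type;
    }
}

class Concrete extends Node {
}

class DecafCoffee extends Node {
}
```

Because type is an actual field rather than something synthesized by the ObjectMapper, a schema scanner like SPQR picks it up, and As.EXISTING_PROPERTY tells Jackson to read the discriminator from that same field.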
I appreciate this isn't a direct answer to your question (and definitely isn't a pretty answer), but hopefully it will help you find your solution.

Mapstruct source is List and target is direct attribute

I want to map between UserDTO and UserGroupDTO, where the user has its address as a list containing all address fields, and the user group has individual address fields. Please let me know how I could map these fields.
There is currently no official support for that, but there is a workaround using expressions, as described in this ticket: https://github.com/mapstruct/mapstruct/issues/1321#issuecomment-339807380
This would work in your case:
@Mapper
public abstract class UserDTOMapper {
    @Mapping(expression = "java(userDTO.getAddress().get(0))", target = "street")
    @Mapping(expression = "java(userDTO.getAddress().get(1))", target = "zipCode")
    @Mapping(expression = "java(userDTO.getAddress().get(2))", target = "country")
    public abstract UserGroupDTO mapTo(UserDTO userDTO);
}
But you have to make sure that the address property implemented as a List will always contain the same number of elements in the correct order; otherwise the mapping based on list index will not work as expected.
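To guard against short or null lists, one option is to route the index access through a small helper and call that from the expressions. This is a sketch; AddressParts and part are hypothetical names, not part of the question:

```java
import java.util.List;

// Hypothetical helper: defensive index-based extraction, so a short or
// missing address list yields null instead of an IndexOutOfBoundsException.
final class AddressParts {
    private AddressParts() {
    }

    static String part(List<String> address, int index) {
        return (address != null && address.size() > index) ? address.get(index) : null;
    }
}
```

The mapping expressions would then read, for example, expression = "java(AddressParts.part(userDTO.getAddress(), 0))", keeping the positional assumption in one place.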
