class A {
    private String a;
    private String b;
    private B innerObject;
}

class B {
    private String c;
}
In my case, String b might come in with a null value. My ModelMapper configuration is as follows:
ModelMapper mapper = new ModelMapper();
mapper.getConfiguration()
    .setFieldMatchingEnabled(true)
    .setMatchingStrategy(MatchingStrategies.LOOSE)
    .setFieldAccessLevel(AccessLevel.PRIVATE)
    .setSkipNullEnabled(true)
    .setSourceNamingConvention(NamingConventions.JAVABEANS_MUTATOR);
When I map the object, I get the target object with b = null.
I'm trying to stay away from the strategy shown here: SO- Question
What am I missing?
Have you tried this configuration:
modelMapper.getConfiguration().setPropertyCondition(Conditions.isNotNull());
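To illustrate: a minimal, self-contained sketch (the class and field names below are invented for the example, not taken from the question) of how Conditions.isNotNull() keeps existing destination values when mapping into an existing object:

```java
import org.modelmapper.Conditions;
import org.modelmapper.ModelMapper;

public class SkipNullDemo {
    // Simplified stand-in for the question's classes, with JavaBean
    // accessors so ModelMapper's default matching can see the properties.
    public static class Source {
        private String a;
        private String b;
        public String getA() { return a; }
        public void setA(String a) { this.a = a; }
        public String getB() { return b; }
        public void setB(String b) { this.b = b; }
    }

    // Merge src into dest, skipping any null source properties.
    public static Source mergeNonNull(Source src, Source dest) {
        ModelMapper mapper = new ModelMapper();
        mapper.getConfiguration().setPropertyCondition(Conditions.isNotNull());
        mapper.map(src, dest); // two-arg map writes into the existing object
        return dest;
    }

    public static void main(String[] args) {
        Source src = new Source();
        src.setA("newA");            // b is left null
        Source dest = new Source();
        dest.setA("oldA");
        dest.setB("oldB");

        mergeNonNull(src, dest);
        System.out.println(dest.getA()); // newA
        System.out.println(dest.getB()); // oldB -- the null was not copied
    }
}
```

Note that skipping nulls only has a visible effect with the two-argument map(source, destination); mapping to a fresh instance via map(source, Class) starts from an empty object, so the skipped property is null anyway, which may be what the question is running into.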
I'd rather do it this way:
@Configuration
public class ModelMapperConfig {

    @Bean
    public ModelMapper modelMapper() {
        ModelMapper modelMapper = new ModelMapper();
        modelMapper.getConfiguration().setSkipNullEnabled(true);
        return modelMapper;
    }
}
Looks like it's not possible.
public <D> D map(Object source, Class<D> destinationType) {
    Assert.notNull(source, "source");
    Assert.notNull(destinationType, "destinationType");
    return this.mapInternal(source, (Object) null, destinationType, (String) null);
}
I solved it with the following wrapper function.
private static <D> D map(Object source, Type destination) {
    return source == null ? null : mapper.map(source, destination);
}
Check this question too: Modelmapper: How to apply custom mapping when source object is null?
I am struggling with mapping a List into responseDTO.getList().
My code:
MessageDTO
@Getter
@Setter
public class MessageDTO {
    private String message;
    ...
}
MessagesDTO
@Getter
@Setter
public class MessagesDTO {
    private List<MessageDTO> messages;
}
MyConverter
public class MyConverter extends AbstractConverter<List<MessageDTO>, MessagesDTO> {
    @Override
    protected MessagesDTO convert(List<MessageDTO> source) {
        MessagesDTO destination = new MessagesDTO();
        destination.setMessages(source);
        return destination;
    }
}
Controller
...
List<MessageDTO> messages = ... // result of service and successful mapping of entity to DTO
ModelMapper mm = new ModelMapper();
Converter conv = new MyConverter();
mm.addConverter(conv);
MessagesDTO messagesDTO = mm.map(messages, MessagesDTO.class);
return messagesDTO; // always null
Any ideas why it is not working? I am successfully using ModelMapper in many other places of my project, even with custom TypeMaps and Converters, but cannot find a way to map a list of some type into a DTO attribute which is a list of that type.
This is because of type erasure. ModelMapper is unable to recognize the generic type of a List and thus does not apply your converter. I'm not sure if it is possible to achieve with the classes you presented, but if it is, it might be a quite complicated task.
One solution would be to declare a class that keeps the element type available at runtime. So like:
@SuppressWarnings("serial")
public static class MessageDTOList extends ArrayList<MessageDTO> {}
and make the required changes to your converter, so it becomes:
public class MyConverter extends AbstractConverter<MessageDTOList, MessagesDTO> {
    @Override
    protected MessagesDTO convert(MessageDTOList source) {
        MessagesDTO destination = new MessagesDTO();
        destination.setMessages(source);
        return destination;
    }
}
If it is hard to get the response directly as a MessageDTOList you can always:
List<MessageDTO> messages = ... // result of service and successful mapping of entity
MessageDTOList messagesDerived = new MessageDTOList();
messagesDerived.addAll(messages);
and then just:
MessagesDTO messagesDTO = mm.map(messagesDerived, MessagesDTO.class);
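For reference, here is the whole workaround assembled into one runnable sketch (the class names follow the thread; the sample data and the wrap helper are invented for illustration):

```java
import java.util.ArrayList;
import java.util.List;
import org.modelmapper.AbstractConverter;
import org.modelmapper.ModelMapper;

public class ListWrapperDemo {
    public static class MessageDTO {
        private final String message;
        public MessageDTO(String message) { this.message = message; }
        public String getMessage() { return message; }
    }

    public static class MessagesDTO {
        private List<MessageDTO> messages;
        public List<MessageDTO> getMessages() { return messages; }
        public void setMessages(List<MessageDTO> messages) { this.messages = messages; }
    }

    // The concrete subclass carries the element type at runtime,
    // so ModelMapper can select the converter despite erasure.
    @SuppressWarnings("serial")
    public static class MessageDTOList extends ArrayList<MessageDTO> {}

    public static MessagesDTO wrap(List<MessageDTO> messages) {
        ModelMapper mm = new ModelMapper();
        mm.addConverter(new AbstractConverter<MessageDTOList, MessagesDTO>() {
            @Override
            protected MessagesDTO convert(MessageDTOList source) {
                MessagesDTO destination = new MessagesDTO();
                destination.setMessages(source);
                return destination;
            }
        });
        MessageDTOList derived = new MessageDTOList();
        derived.addAll(messages);
        return mm.map(derived, MessagesDTO.class);
    }

    public static void main(String[] args) {
        List<MessageDTO> messages =
            List.of(new MessageDTO("hello"), new MessageDTO("world"));
        MessagesDTO dto = wrap(messages);
        System.out.println(dto.getMessages().size()); // 2
    }
}
```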
I have 2 objects:
@Setter
@Getter
public class Agent {
    public int userID;
    public String name;
    public boolean isVoiceRecorded;
    public boolean isScreenRecorded;
    public boolean isOnCall;
    public LocalDateTime startEventDateTime;
}
public class AgentLine {
    public int userID;
    public String name;
    public boolean isVoiceRecorded;
    public boolean isScreenRecorded;
    public boolean isOnCall;
    public String startEventDateTime;
}
I would like to map from AgentLine to Agent. I can't use the default mapping because of the LocalDateTime conversion.
I have defined:
@Bean
ModelMapper getModelMapper() {
    ModelMapper modelMapper = new ModelMapper();
    Converter<AgentLine, Agent> orderConverter = new Converter<AgentLine, Agent>() {
        @Override
        public Agent convert(MappingContext<AgentLine, Agent> mappingContext) {
            AgentLine s = mappingContext.getSource();
            Agent d = mappingContext.getDestination();
            /* d.userID = s.userID;
               d.name = s.name; */
            d.startEventDateTime = LocalDateTime.parse(s.startEventDateTime, DateTimeFormatter.ISO_LOCAL_DATE_TIME);
            return d;
        }
    };
    modelMapper.addConverter(orderConverter);
    return modelMapper;
}
In order to use it:
AgentLine line;
#Autowired
private ModelMapper modelMapper;
Agent agent = modelMapper.map(line, Agent.class);
It works, but I do not want to specify all the Agent properties in the convert method; I would like to specify only the startEventDateTime conversion and have the rest of the properties mapped by default.
In addition I have tried to define :
PropertyMap<AgentLine, Agent> orderMap = new PropertyMap<AgentLine, Agent>() {
    @Override
    protected void configure() {
        map().setName(source.name);
    }
};
modelMapper.addMappings(orderMap);
but in the mapping you can't handle the date conversion.
If I define both a PropertyMap and a Converter for the mapper, the PropertyMap is ignored.
I do not want to specify all Agent properties in the convert method, I would like to specify the startEventDateTime conversion and the rest of the properties would be mapped by default.
Do not use Converter for mapping complex objects. You should use TypeMap for such purposes. Use Converter for custom conversion (for your case String to LocalDateTime).
ModelMapper modelMapper = new ModelMapper();
Converter<String, LocalDateTime> dateTimeConverter =
    ctx -> ctx.getSource() == null ? null : LocalDateTime.parse(ctx.getSource(), DateTimeFormatter.ISO_LOCAL_DATE_TIME);
modelMapper.typeMap(AgentLine.class, Agent.class)
    .addMappings(mapper -> mapper.using(dateTimeConverter).map(AgentLine::getStartEventDateTime, Agent::setStartEventDateTime));
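A runnable version of that answer follows. The getters and setters are added here as an assumption, since the method references in addMappings need accessors rather than the question's public fields:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import org.modelmapper.Converter;
import org.modelmapper.ModelMapper;

public class TypeMapDemo {
    public static class AgentLine {
        private int userID;
        private String startEventDateTime;
        public int getUserID() { return userID; }
        public void setUserID(int userID) { this.userID = userID; }
        public String getStartEventDateTime() { return startEventDateTime; }
        public void setStartEventDateTime(String s) { this.startEventDateTime = s; }
    }

    public static class Agent {
        private int userID;
        private LocalDateTime startEventDateTime;
        public int getUserID() { return userID; }
        public void setUserID(int userID) { this.userID = userID; }
        public LocalDateTime getStartEventDateTime() { return startEventDateTime; }
        public void setStartEventDateTime(LocalDateTime t) { this.startEventDateTime = t; }
    }

    public static Agent toAgent(AgentLine line) {
        ModelMapper modelMapper = new ModelMapper();
        // Custom conversion only for the String -> LocalDateTime property;
        // everything else is mapped implicitly by the TypeMap.
        Converter<String, LocalDateTime> dateTimeConverter =
            ctx -> ctx.getSource() == null
                ? null
                : LocalDateTime.parse(ctx.getSource(), DateTimeFormatter.ISO_LOCAL_DATE_TIME);
        modelMapper.typeMap(AgentLine.class, Agent.class)
            .addMappings(m -> m.using(dateTimeConverter)
                .map(AgentLine::getStartEventDateTime, Agent::setStartEventDateTime));
        return modelMapper.map(line, Agent.class);
    }

    public static void main(String[] args) {
        AgentLine line = new AgentLine();
        line.setUserID(7);
        line.setStartEventDateTime("2020-01-01T10:15:30");
        Agent agent = toAgent(line);
        System.out.println(agent.getUserID());             // 7, mapped implicitly
        System.out.println(agent.getStartEventDateTime()); // 2020-01-01T10:15:30
    }
}
```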
I'm using Spring with Gson for object serialization.
I have model objects that use #Expose annotation e.g.:
public class Zone {
    @Expose
    private String name;
    @Expose
    private String description;
    @Expose
    private List<String> longList;
    private String someIrrelevantVar;
}
I have 2 controllers which serve lists of Zone objects to the user, e.g.:
@RestController
class ZoneController {
    @GetMapping(value = "fullData")
    List<Zone> getFullZones() {
        return zoneService.getZones();
    }
}

@RestController
class SimpleZoneController {
    @GetMapping(value = "simpleData")
    List<Zone> getSimpleZones() {
        return zoneService.getZones();
    }
}
The problem is the List<String> longList var - it usually has a lot of entries (String is only an example; in the real code it could be a complex object).
In my getFullZones() I want to serve zones with this longList, but in getSimpleZones() I want to serve zones without longList (it's not used in any way on the client side).
How to do that?
I could iterate the list of zones and set longList to null, but it's not a very elegant solution.
I'm setting up Spring to use Gson like this:
@Configuration
public class WebMvcConfig extends WebMvcConfigurerAdapter {
    @Override
    public void configureMessageConverters(List<HttpMessageConverter<?>> converters) {
        converters.add(createGsonHttpMessageConverter());
        super.configureMessageConverters(converters);
    }

    private GsonHttpMessageConverter createGsonHttpMessageConverter() {
        Gson gson = new GsonBuilder()
            .excludeFieldsWithoutExposeAnnotation()
            //.registerTypeAdapter - register some deserializers
            .create();
        GsonHttpMessageConverter gsonConverter = new GsonHttpMessageConverter();
        gsonConverter.setGson(gson);
        return gsonConverter;
    }
}
Create a base class ZoneSimple and extend it: Zone extends ZoneSimple. Move the @Expose from fields to methods.
In the base class the method has no annotation. In Zone the method is annotated.
Alternatively you can add a ProxyZone class which keeps a Zone instance and delegates all the calls to it. Annotate the proxy as you need.
Another way is:
Gson gson = new GsonBuilder()
    .addSerializationExclusionStrategy(new ExclusionStrategy() {
        @Override
        public boolean shouldSkipField(FieldAttributes f) {
            return f.getName().toLowerCase().contains("fieldName");
        }

        @Override
        public boolean shouldSkipClass(Class<?> aClass) {
            return false;
        }
    })
    .create();
Got it from this answer.
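A self-contained sketch of that approach (field names and values are invented for the example): one plain Gson for the full view and one with the exclusion strategy for the simple view, so the same Zone serializes differently per endpoint:

```java
import com.google.gson.ExclusionStrategy;
import com.google.gson.FieldAttributes;
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import java.util.List;

public class GsonViewsDemo {
    static class Zone {
        String name = "z1";
        List<String> longList = List.of("a", "b");
    }

    static final Gson FULL = new Gson();
    static final Gson SIMPLE = new GsonBuilder()
        .addSerializationExclusionStrategy(new ExclusionStrategy() {
            @Override
            public boolean shouldSkipField(FieldAttributes f) {
                // Drop the heavy list only in the "simple" view.
                return f.getName().equals("longList");
            }

            @Override
            public boolean shouldSkipClass(Class<?> clazz) {
                return false;
            }
        })
        .create();

    public static void main(String[] args) {
        System.out.println(FULL.toJson(new Zone()));   // includes longList
        System.out.println(SIMPLE.toJson(new Zone())); // {"name":"z1"}
    }
}
```

In a Spring setup this would mean registering a differently configured GsonHttpMessageConverter per view, or serializing to a String in the controller with the appropriate Gson instance.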
I'm using Spring Cloud Brixton.SR4 with Spring Data MongoDB.
I have a very simple entity:
@Document
public class Foo {
    private Period period;
    // getter & setter
}
Because java.time.Period is not supported by the JSR-310 converters, I'm creating custom converters:
class Converters {

    @Component
    @WritingConverter
    static class PeriodToStringConverter implements Converter<Period, String> {
        @Override
        public String convert(Period period) {
            return period.toString();
        }
    }

    @ReadingConverter
    @Component
    static class StringToPeriodConverter implements Converter<String, Period> {
        @Override
        public Period convert(String s) {
            return Period.parse(s);
        }
    }
}
Now I register them in my configuration class extending AbstractMongoConfiguration:
@Bean
@Override
public MappingMongoConverter mappingMongoConverter() throws Exception {
    DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory());
    MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mongoMappingContext());
    final CustomConversions conversions = customConversions();
    log.info("hasCustomWriteTarget(Period.class): " + conversions.hasCustomWriteTarget(Period.class));
    log.info("hasCustomWriteTarget(Period.class, String.class): " + conversions.hasCustomWriteTarget(Period.class, String.class));
    log.info("hasCustomReadTarget(String.class, Period.class): " + conversions.hasCustomReadTarget(String.class, Period.class));
    converter.setCustomConversions(conversions);
    converter.afterPropertiesSet(); // probably not needed, trying out of despair
    return converter;
}
@Bean
@Override
public CustomConversions customConversions() {
    List<Converter> converters = new ArrayList<>();
    converters.add(new Converters.PeriodToStringConverter());
    converters.add(new Converters.StringToPeriodConverter());
    return new CustomConversions(converters);
}
When I start my app I see in the logs:
hasCustomWriteTarget(Period.class): true
hasCustomWriteTarget(Period.class, String.class): true
hasCustomReadTarget(String.class, Period.class): true
Now I create a new Foo and save it to my repository:
Foo foo = new Foo();
foo.setPeriod(Period.of(2, 0, 1));
fooRepository.save(foo);
Now the weirdness happens:
In Mongodb I see:
{
    "_id": ObjectId("xxxx"),
    "period": {
        "years": 0,
        "months": 2,
        "days": 1
    }
}
So that's already wrong - it should have been saved as a String.
When I try to read the object in Java I get:
org.springframework.data.mapping.model.MappingException: No property null found on entity class java.time.Period to bind constructor parameter to!
I debugged the code in MappingMongoConverter:
if (conversions.hasCustomReadTarget(dbo.getClass(), rawType)) {
    return conversionService.convert(dbo, rawType);
}
Because my object was not stored as a String, the dbo variable is actually a BasicDbObject, and therefore there is no converter for it.
Any idea why my write converter is not being used to persist the Period?
I have jackson-datatype-jdk8 on my classpath, could it be the issue? Would jackson be involved at all for persisting in Mongodb?
EDIT
It seems to be a registration issue. When I debug the code, the CustomConversions object used in MappingMongoConverter is different from the one I create, and it doesn't have the custom converters I register.
OK it was extremely stupid...
I was also creating my own MongoTemplate:
@Bean
public MongoTemplate mongoTemplate() throws Exception {
    return new MongoTemplate(mongoDbFactory());
}
Which basically ignores my custom converter. To fix it:
@Bean
public MongoTemplate mongoTemplate() throws Exception {
    return new MongoTemplate(mongoDbFactory(), mappingMongoConverter());
}
I am trying to deserialize simple JSON into a Java object. However, I am getting empty String values for java.lang.String properties. For the rest of the properties, blank values are converted to null (which is what I want).
My JSON and related Java class are listed below.
JSON string:
{
"eventId" : 1,
"title" : "sample event",
"location" : ""
}
EventBean class POJO:
public class EventBean {
    public Long eventId;
    public String title;
    public String location;
}
My main class code:
ObjectMapper mapper = new ObjectMapper();
mapper.disable(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES);
mapper.enable(DeserializationFeature.ACCEPT_EMPTY_STRING_AS_NULL_OBJECT);
try {
    File file = new File(JsonTest.class.getClassLoader().getResource("event.txt").getFile());
    JsonNode root = mapper.readTree(file);
    // find out the applicationId
    EventBean e = mapper.treeToValue(root, EventBean.class);
    System.out.println("It is " + e.location);
} catch (IOException ex) {
    ex.printStackTrace();
}
I was expecting it to print "It is null". Instead, I am getting "It is ". Obviously, Jackson is not treating blank String values as null while converting to my String type.
I read somewhere that this is expected. However, this is something I want to avoid for java.lang.String too. Is there a simple way?
Jackson will give you null for other objects, but for String it will give an empty String.
However, you can use a custom JsonDeserializer to do this:
class CustomDeserializer extends JsonDeserializer<String> {
    @Override
    public String deserialize(JsonParser jsonParser, DeserializationContext context) throws IOException {
        JsonNode node = jsonParser.readValueAsTree();
        if (node.asText().isEmpty()) {
            return null;
        }
        return node.asText(); // asText() returns the raw string; toString() would add quotes
    }
}
In the class, you have to use it on the location field:
class EventBean {
    public Long eventId;
    public String title;

    @JsonDeserialize(using = CustomDeserializer.class)
    public String location;
}
It is possible to define a custom deserializer for the String type, overriding the standard String deserializer:
this.mapper = new ObjectMapper();
SimpleModule module = new SimpleModule();
module.addDeserializer(String.class, new StdDeserializer<String>(String.class) {
    @Override
    public String deserialize(JsonParser p, DeserializationContext ctxt) throws IOException, JsonProcessingException {
        String result = StringDeserializer.instance.deserialize(p, ctxt);
        if (StringUtils.isEmpty(result)) {
            return null;
        }
        return result;
    }
});
mapper.registerModule(module);
This way all String fields will behave the same way.
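A runnable variant of the same idea, using a plain empty-check instead of the StringUtils dependency (the bean and helper names here are invented):

```java
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.deser.std.StdDeserializer;
import com.fasterxml.jackson.databind.deser.std.StringDeserializer;
import com.fasterxml.jackson.databind.module.SimpleModule;
import java.io.IOException;

public class GlobalBlankAsNullDemo {
    static class Bean {
        public String title;
        public String location;
    }

    static ObjectMapper buildMapper() {
        SimpleModule module = new SimpleModule();
        // Replace the default String deserializer for every String field.
        module.addDeserializer(String.class, new StdDeserializer<String>(String.class) {
            @Override
            public String deserialize(JsonParser p, DeserializationContext ctxt) throws IOException {
                String result = StringDeserializer.instance.deserialize(p, ctxt);
                return (result == null || result.isEmpty()) ? null : result;
            }
        });
        return new ObjectMapper().registerModule(module);
    }

    public static void main(String[] args) throws IOException {
        Bean b = buildMapper().readValue("{\"title\":\"\",\"location\":\"x\"}", Bean.class);
        System.out.println(b.title);    // null
        System.out.println(b.location); // x
    }
}
```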
You might first like to see if there has been any progress on the Github issue requesting this exact feature.
For those using Spring Boot: The answer from jgesser was the most helpful to me, but I spent a while trying to work out the best way to configure it in Spring Boot.
Actually, the documentation says:
Any beans of type com.fasterxml.jackson.databind.Module are
automatically registered with the auto-configured
Jackson2ObjectMapperBuilder and are applied to any ObjectMapper
instances that it creates.
So here's jgesser's answer expanded into something you can copy-paste into a new class in a Spring Boot application
@Configuration
public class EmptyStringAsNullJacksonConfiguration {

    @Bean
    SimpleModule emptyStringAsNullModule() {
        SimpleModule module = new SimpleModule();
        module.addDeserializer(
            String.class,
            new StdDeserializer<String>(String.class) {
                @Override
                public String deserialize(JsonParser parser, DeserializationContext context)
                        throws IOException {
                    String result = StringDeserializer.instance.deserialize(parser, context);
                    if (StringUtils.isEmpty(result)) {
                        return null;
                    }
                    return result;
                }
            });
        return module;
    }
}
I could get this with the following configuration:
final ObjectMapper mapper = new ObjectMapper();
mapper.configure(DeserializationFeature.ACCEPT_EMPTY_STRING_AS_NULL_OBJECT, true);
It is possible to use the @JsonCreator annotation. It worked for me:
public class Foo {
    private String field;

    @JsonCreator
    public Foo(@JsonProperty("field") String field) {
        this.field = StringUtils.EMPTY.equals(field) ? null : field;
    }
}
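As a runnable sketch of that answer, with a plain "".equals in place of StringUtils.EMPTY so the example has no extra dependency (the class name is invented):

```java
import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.databind.ObjectMapper;

public class JsonCreatorDemo {
    static class Foo {
        final String field;

        @JsonCreator
        Foo(@JsonProperty("field") String field) {
            // Normalize the empty string to null at construction time.
            this.field = "".equals(field) ? null : field;
        }
    }

    public static void main(String[] args) throws Exception {
        Foo foo = new ObjectMapper().readValue("{\"field\":\"\"}", Foo.class);
        System.out.println(foo.field); // null
    }
}
```

This keeps the normalization next to the object's invariants rather than in mapper configuration, which can be preferable when only a few classes need the behavior.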