I am trying to read JSON into a class. Jackson tries to apply a field of a sub-element to the parent element itself, where it of course does not exist.
This is the JSON:
{
"authorizationRequest":{
"scope":["write","read"],
"resourceIds":["metadata"],
"approved":true,
"authorities":[],
"authorizationParameters":{
"scope":"write read",
"response_type":"token",
"redirect_uri":"",
"state":"",
"stateful":"false",
"client_id":"5102686_metadata"
},
"approvalParameters":{},
"state":"",
"clientId":"5102686_metadata",
"redirectUri":"",
"responseTypes":["token"],
"denied":false
},
"credentials":"",
"clientOnly":false,
"name":"testuser"
}
The classes look like the following:
// The main class that I try to deserialize:
public class DeserializedOAuth2Authentication extends OAuth2Authentication{
private String name;
private boolean clientOnly;
private AuthorizationRequest authorizationRequest = new DefaultAuthorizationRequest("", new ArrayList<>());
public DeserializedOAuth2Authentication() {
super(new DefaultAuthorizationRequest("", new ArrayList<>()), null);
}
@Override
@JsonProperty
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
@Override
@JsonProperty
public boolean isClientOnly() {
return clientOnly;
}
public void setClientOnly(boolean clientOnly) {
this.clientOnly = clientOnly;
}
@Override
@JsonProperty
public AuthorizationRequest getAuthorizationRequest() {
return authorizationRequest;
}
public void setAuthorizationRequest(AuthorizationRequest authorizationRequest) {
this.authorizationRequest = authorizationRequest;
}
}
AuthorizationRequest is an interface with all the getters for the listed elements; it is configured to be deserialized using the DefaultAuthorizationRequest class, which also contains the respective setters and implements fields with corresponding names.
public class DefaultAuthorizationRequest implements AuthorizationRequest, Serializable {
private Set<String> scope = new LinkedHashSet<String>();
private Set<String> resourceIds = new HashSet<String>();
private boolean approved = false;
private Collection<GrantedAuthority> authorities = new HashSet<GrantedAuthority>();
private Map<String, String> authorizationParameters = new ConcurrentHashMap<String, String>();
private Map<String, String> approvalParameters = new HashMap<String, String>();
private String resolvedRedirectUri;
public Map<String, String> getAuthorizationParameters() {
return Collections.unmodifiableMap(authorizationParameters);
}
public Map<String, String> getApprovalParameters() {
return Collections.unmodifiableMap(approvalParameters);
}
public String getClientId() {
return authorizationParameters.get(CLIENT_ID);
}
public Set<String> getScope() {
return Collections.unmodifiableSet(this.scope);
}
public Set<String> getResourceIds() {
return Collections.unmodifiableSet(resourceIds);
}
public Collection<GrantedAuthority> getAuthorities() {
return Collections.unmodifiableSet((Set<? extends GrantedAuthority>) authorities);
}
public boolean isApproved() {
return approved;
}
public boolean isDenied() {
return !approved;
}
public String getState() {
return authorizationParameters.get(STATE);
}
public String getRedirectUri() {
return resolvedRedirectUri == null ? authorizationParameters.get(REDIRECT_URI) : resolvedRedirectUri;
}
public Set<String> getResponseTypes() {
return OAuth2Utils.parseParameterList(authorizationParameters.get(RESPONSE_TYPE));
}
public void setRedirectUri(String redirectUri) {
this.resolvedRedirectUri = redirectUri;
}
public void setScope(Set<String> scope) {
this.scope = scope == null ? new LinkedHashSet<String>() : new LinkedHashSet<String>(scope);
authorizationParameters.put(SCOPE, OAuth2Utils.formatParameterList(scope));
}
public void setResourceIds(Set<String> resourceIds) {
this.resourceIds = resourceIds == null ? new HashSet<String>() : new HashSet<String>(resourceIds);
}
public void setApproved(boolean approved) {
this.approved = approved;
}
public void setAuthorities(Collection<? extends GrantedAuthority> authorities) {
this.authorities = authorities == null ? new HashSet<GrantedAuthority>() : new HashSet<GrantedAuthority>(
authorities);
}
public void setAuthorizationParameters(Map<String, String> authorizationParameters) {
String clientId = getClientId();
Set<String> scope = getScope();
this.authorizationParameters = authorizationParameters == null ? new HashMap<String, String>()
: new HashMap<String, String>(authorizationParameters);
}
public void setApprovalParameters(Map<String, String> approvalParameters) {
this.approvalParameters = approvalParameters == null ? new HashMap<String, String>()
: new HashMap<String, String>(approvalParameters);
}
....
}
On calling read on the above JSON string, I get an exception:
com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException: Unrecognized field "scope" (class de.mvbonline.vlx.auth.oauth2.DeserializedOAuth2Authentication), not marked as ignorable (3 known properties: "name", "authorizationRequest", "clientOnly"])
at [Source: (String)"{ "credentials":"", "clientOnly":false, "authorizationRequest":{ "scope":["write","read"], "resourceIds":["metadata"], "approved":true, "authorities":[], "authorizationParameters":{ "scope":"write read", "response_type":"token", "redirect_uri":"", "state":"", "stateful":"false", "[truncated 316 chars]; line: 1, column: 111] (through reference chain: de.mvbonline.vlx.auth.oauth2.DeserializedOAuth2Authentication["scope"])
Of course the field "scope" is not in the context of DeserializedOAuth2Authentication, but in the context of DefaultAuthorizationRequest. Why is Jackson searching in the wrong class for it?
I am using Jackson version 2.12.4.
Make sure that DefaultAuthorizationRequest can be serialized and deserialized by Jackson. I guess that it cannot be, for several reasons. Two that I can think of:
You have to let Jackson know how to deserialize the DefaultAuthorizationRequest class. One possible solution would be to add a @JsonCreator and @JsonProperty to the class (see the sketch below). The same applies to the GrantedAuthority class.
DefaultAuthorizationRequest has fields of type Map, which need special attention. See these links on how to convert a JSON String to a Map<String, String> or, if the Map has custom objects, how to deserialize into a HashMap of custom objects.
Also, you can take a look at Map Serialization and Deserialization with Jackson
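As a rough illustration of the first point, here is a minimal sketch of what a Jackson-friendly creator could look like. This is not the real Spring Security class; the class name, the chosen constructor parameters and the immutable fields are assumptions based on the JSON and fields shown above:
import java.util.Map;
import java.util.Set;
import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonProperty;

// Hypothetical sketch of a Jackson-friendly authorization request class.
// In the real code this would implement AuthorizationRequest like DefaultAuthorizationRequest does.
public class JacksonFriendlyAuthorizationRequest {

    private final Map<String, String> authorizationParameters;
    private final Set<String> scope;
    private final boolean approved;

    @JsonCreator
    public JacksonFriendlyAuthorizationRequest(
            @JsonProperty("authorizationParameters") Map<String, String> authorizationParameters,
            @JsonProperty("scope") Set<String> scope,
            @JsonProperty("approved") boolean approved) {
        this.authorizationParameters = authorizationParameters;
        this.scope = scope;
        this.approved = approved;
    }
    // ... the remaining properties (resourceIds, authorities, approvalParameters, ...) would follow the same pattern ...
}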
I found my problem.
I formerly mapped my concrete implementation of the interface AuthorizationRequest via a handler:
mapper.addHandler(new DeserializationProblemHandler() {
@Override
public Object handleMissingInstantiator(DeserializationContext ctxt, Class<?> instClass, ValueInstantiator valueInsta, JsonParser p, String msg) throws IOException {
if(instClass.isAssignableFrom(AuthorizationRequest.class)) {
return new DeserializedAuthorizationRequest();
}
return super.handleMissingInstantiator(ctxt, instClass, valueInsta, p, msg);
}
});
This is definitely not the same as annotating the property with the concrete class. The following now works without problems:
public class DeserializedOAuth2Authentication extends OAuth2Authentication{
...
@Override
@JsonProperty("authorizationRequest")
@JsonDeserialize(as = DeserializedAuthorizationRequest.class)
public AuthorizationRequest getAuthorizationRequest() {
return authorizationRequest;
}
public void setAuthorizationRequest(AuthorizationRequest authorizationRequest) {
this.authorizationRequest = authorizationRequest;
}
}
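For completeness, a minimal usage sketch of the fixed setup (the json variable is assumed to hold the JSON document from the question):
// Sketch: with @JsonDeserialize(as = DeserializedAuthorizationRequest.class) on the getter,
// a plain ObjectMapper can read the JSON without any DeserializationProblemHandler.
ObjectMapper mapper = new ObjectMapper();
DeserializedOAuth2Authentication auth = mapper.readValue(json, DeserializedOAuth2Authentication.class);
System.out.println(auth.getAuthorizationRequest().getClientId()); // 5102686_metadata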
I am facing an issue where, when I collect objects from the Flink flatMap collector, the values are not collected correctly. I get object references and not the actual values.
dataStream.filter(new FilterFunction<GenericRecord>() {
@Override
public boolean filter(GenericRecord record) throws Exception {
if (record.get("user_id") != null) {
return true;
}
return false;
}
}).flatMap(new ProfileEventAggregateFlatMapFunction(aggConfig))
.map(new MapFunction<ProfileEventAggregateEmittedTuple, String>() {
@Override
public String map(
ProfileEventAggregateEmittedTuple profileEventAggregateEmittedTupleNew)
throws Exception {
String res=null;
try {
ObjectMapper mapper = new ObjectMapper();
mapper.setVisibility(PropertyAccessor.FIELD, Visibility.ANY);
res= mapper.writeValueAsString(profileEventAggregateEmittedTupleNew);
} catch (Exception e) {
e.printStackTrace();
}
return res;
}
}).print();
public class ProfileEventAggregateFlatMapFunction extends
RichFlatMapFunction<GenericRecord, ProfileEventAggregateEmittedTuple> {
private final ProfileEventAggregateTupleEmitter aggregator;
ObjectMapper mapper = ObjectMapperPool.getInstance().get();
public ProfileEventAggregateFlatMapFunction(String config) throws IOException {
this.aggregator = new ProfileEventAggregateTupleEmitter(config);
}
@Override
public void flatMap(GenericRecord event,
Collector<ProfileEventAggregateEmittedTuple> collector) throws Exception {
try {
List<ProfileEventAggregateEmittedTuple> aggregateTuples = aggregator.runAggregates(event);
for (ProfileEventAggregateEmittedTuple tuple : aggregateTuples) {
collector.collect(tuple);
}
} catch (Exception e) {
e.printStackTrace(); // the catch block was missing in the original snippet
}
}
}
Debug Results:
The tuple that I am collecting in the collector:
tuple = {ProfileEventAggregateEmittedTuple#7880}
profileType = "userprofile"
key = "1152473"
businessType = "keyless"
name = "consumer"
aggregates = {ArrayList#7886} size = 1
0 = {ProfileEventAggregate#7888} "geo_id {geo_id=1} {keyless_select_destination_cnt=1, total_estimated_distance=12.5}"
entityType = "geo_id"
dimension = {LinkedHashMap#7891} size = 1
"geo_id" -> {Integer#7897} 1
key = "geo_id"
value = {Integer#7897} 1
metrics = {LinkedHashMap#7892} size = 2
"keyless_select_destination_cnt" -> {Long#7773} 1
key = "keyless_select_destination_cnt"
value = {Long#7773} 1
"total_estimated_distance" -> {Double#7904} 12.5
key = "total_estimated_distance"
value = {Double#7904} 12.5
This is what I get in my map function .map(new MapFunction<ProfileEventAggregateEmittedTuple, String>()):
profileEventAggregateEmittedTuple = {ProfileEventAggregateEmittedTuple#7935}
profileType = "userprofile"
key = "1152473"
businessType = "keyless"
name = "consumer"
aggregates = {GenericData$Array#7948} size = 1
0 = {ProfileEventAggregate#7950} "geo_id {geo_id=java.lang.Object#863dce2} {keyless_select_destination_cnt=java.lang.Object#7cdb4bfc, total_estimated_distance=java.lang.Object#52e81f57}"
entityType = "geo_id"
dimension = {HashMap#7952} size = 1
"geo_id" -> {Object#7957}
key = "geo_id"
value = {Object#7957}
Class has no fields
metrics = {HashMap#7953} size = 2
"keyless_select_destination_cnt" -> {Object#7962}
key = "keyless_select_destination_cnt"
value = {Object#7962}
Class has no fields
"total_estimated_distance" -> {Object#7963}
Please help me understand what is happening and why I am not getting the correct data.
public class ProfileEventAggregateEmittedTuple implements Cloneable, Serializable {
private String profileType;
private String key;
private String businessType;
private String name;
private List<ProfileEventAggregate> aggregates = new ArrayList<ProfileEventAggregate>();
private long startTime;
private long endTime;
public String getProfileType() {
return profileType;
}
public void setProfileType(String profileType) {
this.profileType = profileType;
}
public String getKey() {
return key;
}
public void setKey(String key) {
this.key = key;
}
public String getBusinessType() {
return businessType;
}
public void setBusinessType(String businessType) {
this.businessType = businessType;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public List<ProfileEventAggregate> getAggregates() {
return aggregates;
}
public void addAggregate(ProfileEventAggregate aggregate) {
this.aggregates.add(aggregate);
}
public void setAggregates(List<ProfileEventAggregate> aggregates) {
this.aggregates = aggregates;
}
public long getStartTime() {
return startTime;
}
public void setStartTime(long startTime) {
this.startTime = startTime;
}
public long getEndTime() {
return endTime;
}
public void setEndTime(long endTime) {
this.endTime = endTime;
}
@Override
public ProfileEventAggregateEmittedTuple clone() {
ProfileEventAggregateEmittedTuple clone = new ProfileEventAggregateEmittedTuple();
clone.setProfileType(this.profileType);
clone.setKey(this.key);
clone.setBusinessType(this.businessType);
clone.setName(this.name);
for (ProfileEventAggregate aggregate : this.aggregates) {
clone.addAggregate(aggregate.clone());
}
return clone;
}
}
public class ProfileEventAggregate implements Cloneable, Serializable {
private String entityType;
private Map<String, Object> dimension =new LinkedHashMap<String, Object>();
private Map<String, Object> metrics = new LinkedHashMap<String, Object>();
public Map<String, Object> getDimension() {
return dimension;
}
public void setDimension(Map<String, Object> dimension) {
this.dimension.putAll(dimension);
}
public void addDimension(String dimensionKey, Object dimensionValue) {
this.dimension.put(dimensionKey, dimensionValue);
}
public Map<String, Object> getMetrics() {
return metrics;
}
public void addMetric(String metricKey, Object metricValue) {
this.metrics.put(metricKey, metricValue);
}
public void setMetrics(Map<String, Object> metrics) {
this.metrics.putAll(metrics);
}
public String getEntityType() {
return entityType;
}
public void setEntityType(String entityType) {
this.entityType = entityType;
}
@Override
public ProfileEventAggregate clone() {
ProfileEventAggregate clone = new ProfileEventAggregate();
clone.setEntityType(this.entityType);
clone.getDimension().putAll(this.getDimension());
clone.getMetrics().putAll(this.metrics);
return clone;
}
}
When you don't enableObjectReuse, objects are copied with your configured serializer (seems to be Avro?).
In your case, you use Map<String, Object> where you cannot infer a plausible schema.
The easiest fix would be to enableObjectReuse (see the sketch below). Otherwise, make sure your serializer matches your data: you could add a unit test that uses AvroSerializer#copy and make sure your POJO is properly annotated if you want to stick with Avro reflection, or, even better, go with a schema-first approach where you generate your Java POJO from an Avro schema and use specific Avro records.
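A minimal sketch of the object-reuse switch (assuming a standard StreamExecutionEnvironment setup; note that object reuse is only safe if your functions do not cache or mutate the records they receive):
// Sketch: with object reuse enabled, records passed between chained operators are
// no longer copied through the configured serializer.
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
env.getConfig().enableObjectReuse();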
Let's discuss some alternatives:
Use GenericRecord. Instead of converting it to a Java type, directly access GenericRecord. This is usually the only way when the full record is flexible (e.g. your job takes any input and writes it out to S3).
Denormalize schema. Instead of having some class Event { int id; Map<String, Object> data; } you would use class EventInformation { int id; String predicate; Object value; }. You would need to group all information for processing. However, you will run into the same type issues with Avro.
Use wide-schema. Looking at the previous approach, if the different predicates are known beforehand, then you can use that to craft a wide-schema class Event { int id; Long predicate1; Integer predicate2; ... String predicateN; } where all of the entries are nullable and most of them are indeed null. Encoding null is very cheap.
Ditch Avro. Avro is fully typed. You may want to use something more dynamic. Protobuf has Any to support arbitrary submessages.
Use Kryo. Kryo can serialize arbitrary object trees at the cost of being slower and having more overhead.
If you want to write the data, you also need to think about a solution where the type information is added for proper deserialization. For an example, check out this JSON question. But there are more ways to implement it.
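If you go with the Kryo fallback from the list above, here is a minimal sketch (assuming the usual ExecutionConfig API; enableForceKryo also routes types that would otherwise qualify as POJOs through Kryo):
// Sketch: fall back to Kryo for object trees Flink cannot serialize well as POJOs/Avro.
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
env.getConfig().enableForceKryo();
// Optionally pre-register the classes for smaller and faster serialization.
env.getConfig().registerKryoType(ProfileEventAggregateEmittedTuple.class);
env.getConfig().registerKryoType(ProfileEventAggregate.class);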
My application is a Kafka consumer which receives a big fat custom message from the producer.
We use Jackson to serialize and deserialize the messages.
A dummy of my consumer is here.
public class LittleCuteConsumer {
@KafkaListener(topics = "${kafka.bigfat.topic}", containerFactory = "littleCuteConsumerFactory")
public void receive(BigFatMessage message) {
// do cute stuff
}
}
And the message that's been transferred
@JsonIgnoreProperties(ignoreUnknown = true)
public class BigFatMessage {
private String fieldOne;
private String fieldTwo;
...
private String fieldTen;
private CustomeFieldOne cf1;
...
private CustomeFieldTen cf10;
// setters and getters
}
Here is the object I want to deserialize the original message to.
@JsonIgnoreProperties(ignoreUnknown = true)
public class ThinMessage {
private String fieldOne;
private String fieldTwo;
// setters and getters
}
Original deserializer
public class BigFatDeserializer implements Deserializer<BigFatMessage> {
@Override
public void configure(Map<String, ?> configs, boolean isKey) {
// Default implementation of configure method
}
@Override
public BigFatMessage deserialize(String topic, byte[] data) {
ObjectMapper mapper = new ObjectMapper();
BigFatMessage biggie = null;
try {
biggie = mapper.readValue(data, BigFatMessage.class);
} catch (Exception e) {
// blame others
}
return biggie;
}
@Override
public void close() {
// Default implementation of close method
}
}
As we can see here, the message contains a lot of fields and dependent objects which are actually useless for my consumer, and I don't want to define all the dependent classes in my consumer as well.
Hence, I need a way to receive the message using a simple, different model class and deserialize it while ignoring the unnecessary fields from the original message!
How I'm trying to deserialize
public class ThinDeserializer implements Deserializer<ThinMessage> {
@Override
public void configure(Map<String, ?> configs, boolean isKey) {
// Default implementation of configure method
}
@Override
public ThinMessage deserialize(String topic, byte[] data) {
ObjectMapper mapper = new ObjectMapper();
ThinMessage cutie = null;
try {
cutie = mapper.readValue(data, ThinMessage.class);
} catch (Exception e) {
// blame others
}
return cutie;
}
@Override
public void close() {
// Default implementation of close method
}
}
And I get the Jackson error below:
com.fasterxml.jackson.databind.exc.InvalidDefinitionException: Cannot construct instance of com.myapp.ThinMessage (no Creators, like default construct, exist): cannot deserialize from Object value (no delegate- or property-based Creator)\n
Accompanied by the Kafka exceptions below.
org.springframework.kafka.listener.ListenerExecutionFailedException: Listener method could not be invoked with the incoming message\n
org.springframework.messaging.handler.annotation.support.MethodArgumentNotValidException: Could not resolve method parameter at index 0
Try to change
public class ThinMessage {
private String fieldOne;
private String fieldTwo;
}
to
@JsonIgnoreProperties(ignoreUnknown = true)
public class ThinMessage {
private String fieldOne;
private String fieldTwo;
public ThinMessage() {
}
public String getFieldOne() {
return fieldOne;
}
public void setFieldOne(String fieldOne) {
this.fieldOne = fieldOne;
}
public String getFieldTwo() {
return fieldTwo;
}
public void setFieldTwo(String fieldTwo) {
this.fieldTwo = fieldTwo;
}
}
and set
ObjectMapper objectMapper = new ObjectMapper();
objectMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
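For instance, a minimal sketch of the deserializer with that setting applied (this reuses the ThinMessage and Deserializer types from the question; holding one pre-configured mapper per deserializer is my own choice, and with kafka-clients 2.x the configure and close methods have default implementations and can be omitted):
public class ThinDeserializer implements Deserializer<ThinMessage> {

    // One shared, pre-configured mapper instead of a new one per record.
    private final ObjectMapper mapper = new ObjectMapper()
            .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);

    @Override
    public ThinMessage deserialize(String topic, byte[] data) {
        try {
            return mapper.readValue(data, ThinMessage.class);
        } catch (Exception e) {
            return null; // or rethrow/log as appropriate
        }
    }
}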
Check this link: https://docs.spring.io/spring-kafka/docs/2.3.x/reference/html/#json
You have two options: remove the type info headers on the producer side, or ignore the type info headers on the consumer side.
@Bean
public DefaultKafkaProducerFactory pf(KafkaProperties properties) {
Map<String, Object> props = properties.buildProducerProperties();
DefaultKafkaProducerFactory pf = new DefaultKafkaProducerFactory(props,
new JsonSerializer<>(MyKeyType.class)
.forKeys()
.noTypeInfo(),
new JsonSerializer<>(MyValueType.class)
.noTypeInfo());
return pf;
}
@Bean
public DefaultKafkaConsumerFactory cf(KafkaProperties properties) {
Map<String, Object> props = properties.buildConsumerProperties();
DefaultKafkaConsumerFactory cf = new DefaultKafkaConsumerFactory(props,
new JsonDeserializer<>(MyKeyType.class)
.forKeys()
.ignoreTypeHeaders(),
new JsonDeserializer<>(MyValueType.class)
.ignoreTypeHeaders());
return cf;
}
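Equivalently, here is a sketch using configuration properties instead of programmatic factories (assuming the property-name constants exposed by Spring Kafka's JsonSerializer/JsonDeserializer in recent 2.x versions; producerProps and consumerProps stand for whatever Map<String, Object> you pass to your factories):
// Producer side: don't add type info headers to outgoing records.
producerProps.put(JsonSerializer.ADD_TYPE_INFO_HEADERS, false);
// Consumer side: ignore type headers if present and always bind to ThinMessage.
consumerProps.put(JsonDeserializer.USE_TYPE_INFO_HEADERS, false);
consumerProps.put(JsonDeserializer.VALUE_DEFAULT_TYPE, ThinMessage.class.getName());
consumerProps.put(JsonDeserializer.TRUSTED_PACKAGES, "com.myapp");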
I'm trying to convert a List<String[]> into a List<Object> using Dozer, but I am unable to map the index values to the property fields using the mapper API configuration.
How can I map the members of the String[] into individual object fields with each index targeting a specific field? (e.g. [0] -> name, and [1] -> role)
DozerBeanMapper mapper = new DozerBeanMapper();
BeanMappingBuilder builder = new BeanMappingBuilder() {
@Override
protected void configure() {
mapping(String[].class, User.class)
.fields(this_(), "name"); // HOW do I specify index?
}
};
mapper.addMapping(builder);
List<String[]> users = new ArrayList<>();
String[] user1 = {"Jill", "SDE"};
String[] user2 = {"Jack", "PM"};
users.add(user1);
users.add(user2);
List<User> userList = mapObjects(mapper, users, User.class);
where mapObjects() is:
private static <T1, T2> List<T2> mapObjects(DozerBeanMapper mapper, List<T1> sourceList, Class<T2> destinationClazz) {
try {
return sourceList.stream()
.map(i -> mapper.map(i, destinationClazz))
.collect(Collectors.toList());
} catch (Exception e) {
...
}
return new ArrayList<>();
}
and the User class:
class User {
String name;
String role;
// getter & setter
}
It worked perfectly with the following configuration:
DozerBeanMapper mapper = new DozerBeanMapper();
BeanMappingBuilder builder = new BeanMappingBuilder() {
@Override
protected void configure() {
mapping(String[].class, User.class)
.fields(this_(), "name", FieldsMappingOptions.customConverterId("arrToName"))
.fields(this_(), "role", FieldsMappingOptions.customConverterId("arrToRole"));
}
};
final Map<String, CustomConverter> customConverterMap = new HashMap<>();
customConverterMap.put("arrToName", new ArrToNameConverter());
customConverterMap.put("arrToRole", new ArrToRoleConverter());
mapper.setCustomConvertersWithId(customConverterMap);
mapper.addMapping(builder);
The logic maps the String[] into the name and role fields separately via custom converters, each targeting a specific index of the input String[]. With Dozer, you can define custom converters, assign them an id, and refer to them by those ids inside the field mappings via FieldsMappingOptions.customConverterId("{id}").
where ArrToNameConverter:
public class ArrToNameConverter extends DozerConverter<String[], String> {
public ArrToNameConverter() {
super(String[].class, String.class);
}
@Override
public String convertTo(String[] strings, String user) {
return strings[0];
}
@Override
public String[] convertFrom(String user, String[] strings) {
return new String[0];
}
}
and ArrToRoleConverter:
public class ArrToRoleConverter extends DozerConverter<String[], String> {
public ArrToRoleConverter() {
super(String[].class, String.class);
}
@Override
public String convertTo(String[] strings, String user) {
return strings[1];
}
@Override
public String[] convertFrom(String user, String[] strings) {
return new String[0];
}
}
With the above mapper, I was able to get the following result:
[User(name=Jill, role=SDE), User(name=Jack, role=PM)]
I receive data in JSON:
{
"status": "INVALID_DATA",
"errors":{ "invalid_id": "Id isn't available",
...
"wrong_address": "Address error msg"
}
}
The keys in the "errors" structure, and how many there are, are unknown to me. I'm trying to map this with the class:
@JsonIgnoreProperties(ignoreUnknown = true)
public class StatusErrors
{
private String status;
private HashMap<String, String> errors = new HashMap<String, String>();
public String getStatus() {
return status;
}
public void setStatus(String status) {
this.status = status;
}
public HashMap<String, String> getErrors() {
return errors;
}
public void setErrors(HashMap<String, String> errors) {
this.errors = errors;
}
}
It works fine if I have "errors", but when the server says "OK" and has no errors, it sends me
{
"status": "OK",
"errors":[]
}
(Don't ask me who wrote the server.)
So the mapper crashes.
I'm trying to write a custom JsonDeserializer (in a generic way):
public abstract class ExcludeEmptyArrayDeserializer<T> extends JsonDeserializer<T> {
private final Class<T> clazz;
protected ExcludeEmptyArrayDeserializer(Class<T> clazz) {
this.clazz = clazz;
}
@Override
public T deserialize(JsonParser jp, DeserializationContext ctxt) throws IOException {
ObjectCodec oc = jp.getCodec();
JsonNode node = oc.readTree(jp);
if(node.has("errors")) {
if (node.get("errors").isArray() && !node.get("errors").getElements().hasNext())
((ObjectNode)node).remove("errors");
}
ObjectMapper objectMapper = new ObjectMapper();
return objectMapper.readValue(node, clazz); // doesn't work
//return oc.treeToValue(node, clazz); // doesn't work too
}
}
public class StatusErrorsDeserializer extends ExcludeEmptyArrayDeserializer<StatusErrors> {
public StatusErrorsDeserializer() {
super(StatusErrors.class);
}
}
The resulting usage code looks like this:
SimpleModule module = new SimpleModule("", Version.unknownVersion());
module.addDeserializer(StatusErrors.class, new StatusErrorsDeserializer());
ObjectMapper mapper = new ObjectMapper().setVisibility(JsonMethod.FIELD, JsonAutoDetect.Visibility.ANY).withModule(module);
MappingJacksonHttpMessageConverter messageConverter = new MappingJacksonHttpMessageConverter();
messageConverter.setObjectMapper(mapper);
getRestTemplate().getMessageConverters().clear();
getRestTemplate().getMessageConverters().add(messageConverter);
The "errors" node deleted correctly but this solution still doesn't work.
I suppose I make a mistake in JsonDeserializer.deserialize method but don't get an idea.
BTW StatusErrors class can be a base class for other complicated messages from server.
The easiest solution is to change the errors variable declaration to Object:
@JsonIgnoreProperties(ignoreUnknown = true)
class StatusErrors {
private String status;
private Object errors;
public Map<String, String> getErrorsMap() {
if (this.errors instanceof Map) {
return (Map)this.errors;
}
return null;
}
....
You don't need any custom serializers or deserializers:
ObjectMapper objectMapper = new ObjectMapper();
StatusErrors result1 = objectMapper.readValue(JSON1, StatusErrors.class);
System.out.println(result1);
System.out.println(result1.getErrors().getClass());
System.out.println(result1.getErrorsMap());
StatusErrors result2 = objectMapper.readValue(JSON2, StatusErrors.class);
System.out.println(result2);
The code above will print:
StatusErrors(status=INVALID_DATA, errors={invalid_id=Id isn't available, wrong_address=Address error msg})
class java.util.LinkedHashMap
{invalid_id=Id isn't available, wrong_address=Address error msg}
StatusErrors(status=OK, errors=[])
I'm looking for a possibility to serialize transient information only in some cases:
@JsonInclude(Include.NON_NULL)
@Entity
public class User {
public static interface AdminView {}
... id, email and others ...
@Transient
private transient Details details;
@JsonIgnore // Goal: ignore all the time, except next line
@JsonView(AdminView.class) // Goal: don't ignore in AdminView
public Details getDetails() {
if (details == null) {
details = ... compute Details ...
}
return details;
}
}
public class UserDetailsAction {
private static final ObjectWriter writer = new ObjectMapper().writer();
private static final ObjectWriter writerAdmin = writer
.withView(User.AdminView.class);
public String getUserAsJson(User user) {
return writer.writeValueAsString(user);
}
public String getUserAsJsonForAdmin(User user) {
return writerAdmin.writeValueAsString(user);
}
}
If I call getUserAsJson, I expect to see id, email and the other fields, but not details. This works fine. But I see the same for getUserAsJsonForAdmin, also without details. If I remove the @JsonIgnore annotation, I do see details in both calls.
What am I doing wrong, and is there a good way to go? Thanks!
You may find dynamic Jackson filtering slightly more elegant for your use case. Here is an example of filtering POJO fields based on a custom annotation, sharing one ObjectMapper instance:
public class JacksonFilter {
static private boolean shouldIncludeAllFields;
@Retention(RetentionPolicy.RUNTIME)
public static @interface Admin {}
@JsonFilter("admin-filter")
public static class User {
public final String email;
@Admin
public final String details;
public User(String email, String details) {
this.email = email;
this.details = details;
}
}
public static class AdminPropertyFilter extends SimpleBeanPropertyFilter {
@Override
protected boolean include(BeanPropertyWriter writer) {
// deprecated since 2.3
return true;
}
@Override
protected boolean include(PropertyWriter writer) {
if (writer instanceof BeanPropertyWriter) {
return shouldIncludeAllFields || ((BeanPropertyWriter) writer).getAnnotation(Admin.class) == null;
}
return true;
}
}
public static void main(String[] args) throws JsonProcessingException {
User user = new User("email", "secret");
ObjectMapper mapper = new ObjectMapper();
mapper.setFilters(new SimpleFilterProvider().addFilter("admin-filter", new AdminPropertyFilter()));
System.out.println(mapper.writerWithDefaultPrettyPrinter().writeValueAsString(user));
shouldIncludeAllFields = true;
System.out.println(mapper.writerWithDefaultPrettyPrinter().writeValueAsString(user));
}
}
Output:
{
"email" : "email"
}
{
"email" : "email",
"details" : "secret"
}
It looks like Jackson has a horrible concept around a very cool feature like @JsonView. The only way I discovered to solve my problem is:
@JsonInclude(Include.NON_NULL)
@Entity
public class User {
public static interface BasicView {}
public static interface AdminView {}
... id and others ...
@JsonView({BasicView.class, AdminView.class}) // And this for EVERY field
@Column
private String email;
@Transient
private transient Details details;
@JsonView(AdminView.class)
public Details getDetails() {
if (details == null) {
details = ... compute Details ...
}
return details;
}
}
public class UserDetailsAction {
private static final ObjectWriter writer = new ObjectMapper()
.disable(MapperFeature.DEFAULT_VIEW_INCLUSION)
.writerWithView(User.BasicView.class);
private static final ObjectWriter writerAdmin = new ObjectMapper()
.disable(MapperFeature.DEFAULT_VIEW_INCLUSION)
.writerWithView(User.AdminView.class);
public String getUserAsJson(User user) {
return writer.writeValueAsString(user);
}
public String getUserAsJsonForAdmin(User user) {
return writerAdmin.writeValueAsString(user);
}
}
Maybe it helps someone. But I hope to find a better solution, and that's why I don't accept my own answer.
EDIT: because an interface can extend (multiple) interfaces, I can use:
public static interface AdminView extends BasicView {}
and just
@JsonView(BasicView.class)
instead of
@JsonView({BasicView.class, AdminView.class})
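Put together, here is a self-contained sketch of that approach (ViewDemo, its User fields and the sample values are hypothetical simplifications for illustration):
import com.fasterxml.jackson.annotation.JsonView;
import com.fasterxml.jackson.databind.MapperFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.ObjectWriter;

// Sketch: because AdminView extends BasicView, a writer for AdminView also includes
// every property annotated with BasicView, so per-field view lists are not needed.
public class ViewDemo {
    public interface BasicView {}
    public interface AdminView extends BasicView {}

    public static class User {
        @JsonView(BasicView.class)
        public String email = "user@example.com";

        @JsonView(AdminView.class)
        public String details = "secret";
    }

    public static void main(String[] args) throws Exception {
        ObjectWriter basic = new ObjectMapper()
                .disable(MapperFeature.DEFAULT_VIEW_INCLUSION)
                .writerWithView(BasicView.class);
        ObjectWriter admin = new ObjectMapper()
                .disable(MapperFeature.DEFAULT_VIEW_INCLUSION)
                .writerWithView(AdminView.class);

        System.out.println(basic.writeValueAsString(new User())); // {"email":"user@example.com"}
        System.out.println(admin.writeValueAsString(new User())); // {"email":"user@example.com","details":"secret"}
    }
}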