I read in the Dozer documentation that I can pass a custom parameter to a custom converter configured on a field mapping.
That is not enough for my case, because the parameter is fixed once, when the mapper is built.
Is there any way to pass this parameter when doing the actual mapping?
mapper.map(sourceObject, Destination.class, "parameter");
My actual problem is that I want to map from a class containing multilingual properties, while the destination should only contain the properties of the chosen language.
Source class
public class Source
{
// Fields in default language
private String prop1;
private String prop2;
// List containing all translations of properties
private List<SourceName> sourceNames;
}
public class SourceName
{
private int lang_id;
private String prop1;
private String prop2;
}
Destination class
public class Destination
{
// Fields translated in chosen language
private String prop1;
private String prop2;
}
My goal is to be able to do something like this:
Destination destination = mapper.map(source, Destination.class, 4); // To lang_id 4
Thanks
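For context, the converter on the field mapping would be a Dozer ConfigurableCustomConverter, roughly like this (a minimal sketch; the class name LanguageConverter and the way the language id is applied are made up for illustration):
public class LanguageConverter implements ConfigurableCustomConverter {

    private String langId; // receives the custom-converter-param from the field mapping

    @Override
    public void setParameter(String parameter) {
        this.langId = parameter;
    }

    @Override
    public Object convert(Object existingDestinationFieldValue, Object sourceFieldValue,
            Class<?> destinationClass, Class<?> sourceClass) {
        // look up the translation for langId here; returning the source value unchanged
        // keeps the sketch compilable without knowing the real model
        return sourceFieldValue;
    }
}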
I have made this function (the FIELDMAP constant holds the string "fieldMap"):
public static <T> T mapWithParam(Object source, Class<T> destinationClass, String param) throws MappingException {
T toReturn = null;
DozerBeanMapper dbm = (DozerBeanMapper) MapperFactory.getMapper();
MappingMetadata mmdt = dbm.getMappingMetadata();
ClassMappingMetadata classMapping = mmdt.getClassMapping(source.getClass(), destinationClass);
List<FieldMappingMetadata> fieldMappingMetadata = classMapping.getFieldMappings();
List<OriginalFieldMap> originalValues = new ArrayList<OriginalFieldMap>();
for (FieldMappingMetadata fmmd : fieldMappingMetadata) {
if (fmmd.getCustomConverter() != null) {
try {
Class<?> cls = Class.forName(fmmd.getCustomConverter());
if (cls.newInstance() instanceof ConfigurableCustomConverter) {
// remember the original converter parameter so it can be restored after the mapping
FieldMap modifiedFieldMap = (FieldMap) ReflectionHelper.executeGetMethod(fmmd, FIELDMAP);
originalValues.add(new OriginalFieldMap(modifiedFieldMap, modifiedFieldMap.getCustomConverterParam()));
modifiedFieldMap.setCustomConverterParam(param);
ReflectionHelper.executeSetMethod(fmmd, FIELDMAP, modifiedFieldMap);
}
} catch (ReflectionException | ClassNotFoundException | InstantiationException | IllegalAccessException e) {
e.printStackTrace();
}
}
}
toReturn = dbm.map(source, destinationClass);
for (OriginalFieldMap ofp : originalValues) {
ofp.getFieldMap().setCustomConverterParam(ofp.getOriginalValue());
}
return toReturn;
}
And OriginalFieldMap class:
import org.dozer.fieldmap.FieldMap;
public class OriginalFieldMap {
FieldMap fieldMap;
String originalValue;
public OriginalFieldMap(FieldMap fieldMap, String originalValue) {
super();
this.fieldMap = fieldMap;
this.originalValue = originalValue;
}
public FieldMap getFieldMap() {
return fieldMap;
}
public String getOriginalValue() {
return originalValue;
}
}
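With that in place, the call from the question becomes (a usage sketch; the language id is passed as a String because that is what setCustomConverterParam expects):
// "4" ends up in each ConfigurableCustomConverter via setParameter before the mapping runs
Destination destination = mapWithParam(source, Destination.class, "4");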
My application is a Kafka consumer which receives a big fat custom message from the producer.
We use Jackson to serialize and deserialize the messages.
A dummy of my consumer is here.
public class LittleCuteConsumer {
@KafkaListener(topics = "${kafka.bigfat.topic}", containerFactory = "littleCuteConsumerFactory")
public void receive(BigFatMessage message) {
// do cute stuff
}
}
And the message that's been transferred
@JsonIgnoreProperties(ignoreUnknown = true)
public class BigFatMessage {
private String fieldOne;
private String fieldTwo;
...
private String fieldTen;
private CustomeFieldOne cf1;
...
private CustomeFieldTen cf10;
// setters and getters
}
Here is the object I want to deserialize the original message to.
@JsonIgnoreProperties(ignoreUnknown = true)
public class ThinMessage {
private String fieldOne;
private String fieldTwo;
// setters and getters
}
Original deserializer
public class BigFatDeserializer implements Deserializer<BigFatMessage> {
@Override
public void configure(Map<String, ?> configs, boolean isKey) {
// Default implementation of configure method
}
@Override
public BigFatMessage deserialize(String topic, byte[] data) {
ObjectMapper mapper = new ObjectMapper();
BigFatMessage biggie = null;
try {
biggie = mapper.readValue(data, BigFatMessage.class);
} catch (Exception e) {
// blame others
}
return biggie;
}
@Override
public void close() {
// Default implementation of close method
}
}
As you can see, the message contains a lot of fields and dependent objects that are actually useless for my consumer, and I don't want to have to define all the dependent classes in my consumer as well.
Hence, I need a way to receive the message using a simpler, different model class and deserialize it so that the unnecessary fields from the original message are ignored!
How I'm trying to deserialize
public class ThinDeserializer implements Deserializer<ThinMessage> {
@Override
public void configure(Map<String, ?> configs, boolean isKey) {
// Default implementation of configure method
}
@Override
public ThinMessage deserialize(String topic, byte[] data) {
ObjectMapper mapper = new ObjectMapper();
ThinMessage cutie = null;
try {
cutie = mapper.readValue(data, ThinMessage.class);
} catch (Exception e) {
// blame others
}
return cutie;
}
@Override
public void close() {
// Default implementation of close method
}
}
And I get the Jackson error below:
com.fasterxml.jackson.databind.exc.InvalidDefinitionException: Cannot construct instance of com.myapp.ThinMessage (no Creators, like default constructor, exist): cannot deserialize from Object value (no delegate- or property-based Creator)
Accompanied by the Kafka exceptions below:
org.springframework.kafka.listener.ListenerExecutionFailedException: Listener method could not be invoked with the incoming message
org.springframework.messaging.handler.annotation.support.MethodArgumentNotValidException: Could not resolve method parameter at index 0
Try to change
public class ThinMessage {
private String fieldOne;
private String fieldTwo;
}
to
@JsonIgnoreProperties(ignoreUnknown = true)
public class ThinMessage {
private String fieldOne;
private String fieldTwo;
public ThinMessage() {
}
public String getFieldOne() {
return fieldOne;
}
public void setFieldOne(String fieldOne) {
this.fieldOne = fieldOne;
}
public String getFieldTwo() {
return fieldTwo;
}
public void setFieldTwo(String fieldTwo) {
this.fieldTwo = fieldTwo;
}
}
and set
ObjectMapper objectMapper = new ObjectMapper();
objectMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
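Then use that configured mapper inside the deserializer, for example like this (a sketch; it reuses one ObjectMapper per deserializer instead of creating one per record, and configure()/close() can stay as in the original):
public class ThinDeserializer implements Deserializer<ThinMessage> {

    private final ObjectMapper objectMapper = new ObjectMapper()
            .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);

    @Override
    public ThinMessage deserialize(String topic, byte[] data) {
        try {
            return objectMapper.readValue(data, ThinMessage.class);
        } catch (Exception e) {
            // decide how to handle unreadable records (log, dead-letter, ...)
            return null;
        }
    }
}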
Check this link: https://docs.spring.io/spring-kafka/docs/2.3.x/reference/html/#json
You have two options: remove the type info on the producer side, or ignore the type headers on the consumer side.
@Bean
public DefaultKafkaProducerFactory pf(KafkaProperties properties) {
Map<String, Object> props = properties.buildProducerProperties();
return new DefaultKafkaProducerFactory(props,
new JsonSerializer<>(MyKeyType.class)
.forKeys()
.noTypeInfo(),
new JsonSerializer<>(MyValueType.class)
.noTypeInfo());
}
@Bean
public DefaultKafkaConsumerFactory cf(KafkaProperties properties) {
Map<String, Object> props = properties.buildConsumerProperties();
return new DefaultKafkaConsumerFactory(props,
new JsonDeserializer<>(MyKeyType.class)
.forKeys()
.ignoreTypeHeaders(),
new JsonDeserializer<>(MyValueType.class)
.ignoreTypeHeaders());
}
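Alternatively, the same can be done purely with consumer properties instead of constructing the deserializer in code (a sketch based on the same spring-kafka 2.3 docs; the trusted package name is an assumption):
Map<String, Object> props = properties.buildConsumerProperties();
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
props.put(JsonDeserializer.VALUE_DEFAULT_TYPE, ThinMessage.class.getName());
props.put(JsonDeserializer.USE_TYPE_INFO_HEADERS, false); // ignore the producer's __TypeId__ headers
props.put(JsonDeserializer.TRUSTED_PACKAGES, "com.myapp");
return new DefaultKafkaConsumerFactory<>(props);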
I am trying to convert my POJO into 2 different CSV representations.
My POJO:
@NoArgsConstructor
@AllArgsConstructor
public static class Example {
@JsonView(View.Public.class)
private String a;
@JsonView(View.Public.class)
private String b;
@JsonView(View.Internal.class)
private String c;
@JsonView(View.Internal.class)
private String d;
public static final class View {
interface Public {}
interface Internal extends Public {}
}
}
The Public view exposes fields a and b, and the Internal view exposes all fields.
The problem is that if I construct the ObjectWriter with .writerWithSchemaFor(Example.class), all my fields become columns and the view merely blanks out the excluded ones. The ObjectWriter builds the schema from Example.class as a whole, and applying .withView afterwards only hides the values, it does not remove the columns.
This means that I must construct the schema manually.
Tests:
@Test
public void testJson() throws JsonProcessingException {
final ObjectMapper mapper = new ObjectMapper();
final Example example = new Example("1", "2", "3", "4");
final String result = mapper.writerWithView(Example.View.Public.class).writeValueAsString(example);
System.out.println(result); // {"a":"1","b":"2"}
}
@Test
public void testCsv() throws JsonProcessingException {
final CsvMapper mapper = new CsvMapper();
final Example example = new Example("1", "2", "3", "4");
final String result = mapper.writerWithSchemaFor(Example.class).withView(Example.View.Public.class).writeValueAsString(example);
System.out.println(result); // 1,2,,
}
@Test
public void testCsvWithCustomSchema() throws JsonProcessingException {
final CsvMapper mapper = new CsvMapper();
CsvSchema schema = CsvSchema.builder()
.addColumn("a")
.addColumn("b")
.build();
final Example example = new Example("1", "2", "3", "4");
final String result = mapper.writer().with(schema).withView(Example.View.Public.class).writeValueAsString(example);
System.out.println(result); // 1,2
}
The testCsv output has 4 columns, 2 of which are left empty. The testCsvWithCustomSchema output has only the columns I want.
Is there a way to get a CsvSchema that matches my @JsonView without having to construct it myself?
Here is a solution I built with reflection. I am not really happy with it, since it is still "manually" building the schema.
It is also flawed in that it ignores mapper configuration such as MapperFeature.DEFAULT_VIEW_INCLUSION.
This seems like something that should already be available from the library.
@AllArgsConstructor
public class GenericPojoCsvSchemaBuilder {
public CsvSchema build(final Class<?> type) {
return build(type, null);
}
public CsvSchema build(final Class<?> type, final Class<?> view) {
return build(CsvSchema.builder(), type, view);
}
public CsvSchema build(final CsvSchema.Builder builder, final Class<?> type) {
return build(builder, type, null);
}
public CsvSchema build(final CsvSchema.Builder builder, final Class<?> type, final Class<?> view) {
final JsonPropertyOrder propertyOrder = type.getAnnotation(JsonPropertyOrder.class);
final List<Field> fieldsForView;
// DO NOT use Arrays.asList because it uses an internal fixed length implementation which cannot use .removeAll (throws UnsupportedOperationException)
final List<Field> unorderedFields = Arrays.stream(type.getDeclaredFields()).collect(Collectors.toList());
if (propertyOrder != null && propertyOrder.value().length > 0) {
final List<Field> orderedFields = Arrays.stream(propertyOrder.value()).map(s -> {
try {
return type.getDeclaredField(s);
} catch (final NoSuchFieldException e) {
throw new IllegalArgumentException(e);
}
}).collect(Collectors.toList());
if (propertyOrder.value().length < type.getDeclaredFields().length) {
unorderedFields.removeAll(orderedFields);
orderedFields.addAll(unorderedFields);
}
fieldsForView = getJsonViewFields(orderedFields, view);
} else {
fieldsForView = getJsonViewFields(unorderedFields, view);
}
final JsonIgnoreFieldFilter ignoreFieldFilter = new JsonIgnoreFieldFilter(type.getDeclaredAnnotation(JsonIgnoreProperties.class));
fieldsForView.forEach(field -> {
if (ignoreFieldFilter.matches(field)) {
builder.addColumn(field.getName());
}
});
return builder.build();
}
private List<Field> getJsonViewFields(final List<Field> fields, final Class<?> view) {
if (view == null) {
return fields;
}
return fields.stream()
.filter(field -> {
final JsonView jsonView = field.getAnnotation(JsonView.class);
return jsonView != null && Arrays.stream(jsonView.value()).anyMatch(candidate -> candidate.isAssignableFrom(view));
})
.collect(Collectors.toList());
}
private class JsonIgnoreFieldFilter implements ReflectionUtils.FieldFilter {
private final List<String> fieldNames;
public JsonIgnoreFieldFilter(final JsonIgnoreProperties jsonIgnoreProperties) {
if (jsonIgnoreProperties != null) {
fieldNames = Arrays.asList(jsonIgnoreProperties.value());
} else {
fieldNames = null;
}
}
@Override
public boolean matches(final Field field) {
if (fieldNames != null && fieldNames.contains(field.getName())) {
return false;
}
final JsonIgnore jsonIgnore = field.getDeclaredAnnotation(JsonIgnore.class);
return jsonIgnore == null || !jsonIgnore.value();
}
}
}
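For reference, usage would then look roughly like this (a sketch mirroring the testCsvWithCustomSchema test above, inside a test method that declares throws JsonProcessingException):
final CsvMapper mapper = new CsvMapper();
final CsvSchema schema = new GenericPojoCsvSchemaBuilder().build(Example.class, Example.View.Public.class);
final Example example = new Example("1", "2", "3", "4");
// prints 1,2 -- only the columns visible in the Public view
System.out.println(mapper.writer().with(schema).withView(Example.View.Public.class).writeValueAsString(example));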
I am creating a Java app which will allow storing objects in a database. What I want is a generic implementation that can load JSON and create a Java object from it. This is what the code should look like:
SomeClass someObject = data.getValue(SomeClass.class);
Let's say that data wraps a JSON object. How should I implement the getValue() method so that it creates an instance of the given class? I don't want SomeClass to extend anything other than Object. I think this should be done with generics, but so far I have not worked with generic classes like this. Can you please point me to the best way to accomplish this? Example code would be best.
Many thanks
You can consult the source code of the Jackson library and look inside (or debug) the method BeanDeserializer#vanillaDeserialize(); there you'll find the loop which traverses all JSON tokens, finds the corresponding fields and sets their values.
As a proof of concept, I've extracted part of the logic from Jackson and wrapped it inside a naive (and fragile) object mapper and a naive (and fragile) JSON parser:
public static class NaiveObjectMapper {
private Map<String, Object> fieldsAndMethods;
private NaiveJsonParser parser;
public <T> T readValue(String content, Class<T> valueType) {
parser = new NaiveJsonParser(content);
try {
// aggregate all value type fields and methods inside a map
fieldsAndMethods = new HashMap<>();
for (Field field : valueType.getDeclaredFields()) {
fieldsAndMethods.put(field.getName(), field);
}
for (Method method : valueType.getMethods()) {
fieldsAndMethods.put(method.getName(), method);
}
// create an instance of value type by calling its default constructor
Constructor<T> constructor = valueType.getConstructor();
Object bean = constructor.newInstance(new Object[0]);
// loop through all json nodes
String propName;
while ((propName = parser.nextFieldName()) != null) {
// find the corresponding field
Field prop = (Field) fieldsAndMethods.get(propName);
// get and set field value
deserializeAndSet(prop, bean);
}
return (T) bean;
} catch (NoSuchMethodException e) {
e.printStackTrace();
} catch (IllegalAccessException e) {
e.printStackTrace();
} catch (InstantiationException e) {
e.printStackTrace();
} catch (InvocationTargetException e) {
e.printStackTrace();
}
return null;
}
private void deserializeAndSet(Field prop, Object bean) {
Class<?> propType = prop.getType();
Method setter = (Method) fieldsAndMethods.get(getFieldSetterName(prop));
try {
if (propType.isPrimitive()) {
if (propType.getName().equals("int")) {
setter.invoke(bean, parser.getIntValue());
}
} else if (propType == String.class) {
setter.invoke(bean, parser.getTextValue());
}
} catch (IllegalAccessException e) {
e.printStackTrace();
} catch (InvocationTargetException e) {
e.printStackTrace();
}
}
private String getFieldSetterName(Field prop) {
String propName = prop.getName();
return "set" + propName.substring(0, 1).toUpperCase() + propName.substring(1);
}
}
class NaiveJsonParser {
String[] nodes;
int currentNodeIdx = -1;
String currentProperty;
String currentValueStr;
public NaiveJsonParser(String content) {
// split the content into 'property:value' nodes
nodes = content.replaceAll("[{}]", "").split(",");
}
public String nextFieldName() {
if ((++currentNodeIdx) >= nodes.length) {
return null;
}
String[] propertyAndValue = nodes[currentNodeIdx].split(":");
currentProperty = propertyAndValue[0].replace("\"", "").trim();
currentValueStr = propertyAndValue[1].replace("\"", "").trim();
return currentProperty;
}
public String getTextValue() {
return String.valueOf(currentValueStr);
}
public int getIntValue() {
return Integer.valueOf(currentValueStr).intValue();
}
}
public static class User {
private int id;
private String name;
public int getId() {
return id;
}
public void setId(int id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
@Override
public String toString() {
return "id = " + id + ", name = \"" + name + "\"";
}
}
To see the deserialization in action run:
String json = "{\"id\":1, \"name\":\"jsmith\"}";
NaiveObjectMapper objectMapper = new NaiveObjectMapper();
User user = objectMapper.readValue(json, User.class);
System.out.println(user);
Or try online.
However, I recommend not reinventing the wheel: use Jackson, and if you need some custom behavior you can use custom deserialization, see here and here.
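For the getValue() shape asked about in the question, the Jackson-based version boils down to a one-line delegation, roughly like this (a sketch; the Data wrapper and its json field are assumptions about how the JSON is held):
import java.io.IOException;
import com.fasterxml.jackson.databind.ObjectMapper;

public class Data {

    private final String json;
    private final ObjectMapper mapper = new ObjectMapper();

    public Data(String json) {
        this.json = json;
    }

    // T only needs a default constructor plus setters (or public fields)
    public <T> T getValue(Class<T> valueType) throws IOException {
        return mapper.readValue(json, valueType);
    }
}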
I want to make a Map<String, Object> like this
{AssessmentId=0, Physical_name='ram', Physical_height=20, Physical_weight=60}
from my POJO class InitialAssessment
public class InitialAssessment {
private long AssessmentId;
private String physical_name;
private String physical_gender;
private int physical_height;
private float physical_weight;
// all getter And setter is Created here
}
without using any external library like Gson, etc.
You can use this approach:
public Map<String, Object> getMapFromPojo(InitialAssessment assessment) throws Exception {
Map<String, Object> map = new HashMap<>();
if (assessment != null) {
Method[] methods = assessment.getClass().getMethods();
for (Method method : methods) {
String name = method.getName();
if (name.startsWith("get") && !name.equalsIgnoreCase("getClass")) {
Object value = "";
try {
value = method.invoke(assessment);
map.put(name.substring(name.indexOf("get") + 3), value);
} catch (Exception e) {
e.printStackTrace();
}
}
}
return map;
}
return null;
}
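Called like this, for example (a sketch; the setter names assume the usual bean naming for the fields above, and the caller has to deal with the declared Exception):
InitialAssessment assessment = new InitialAssessment();
assessment.setPhysical_name("ram");
assessment.setPhysical_height(20);
assessment.setPhysical_weight(60);
Map<String, Object> map = getMapFromPojo(assessment);
System.out.println(map); // {AssessmentId=0, Physical_name=ram, Physical_height=20, ...}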
It will give you a map for the POJO class like this:
Output:
{AssessmentId=0, Physical_name='ram', Physical_gender='Male' , Physical_height=20, Physical_weight=60}
TL;DR: I'd like to know how to extend fit.TypeAdapter so that I can invoke a method that expects parameters, since the default implementation of TypeAdapter invokes the bound method by reflection and assumes it is a no-arg method...
Longer version -
I'm using fit to build a test harness for my system (a service that returns a sorted list of custom objects). In order to verify the system, I thought I'd use fit.RowFixture to assert attributes of the list items.
Since RowFixture expects the data to be either a public attribute or a public method, I thought of using a wrapper over my custom object (say InstanceWrapper) - I also tried to implement the suggestion given in this previous thread about formatting data in RowFixture.
The trouble is that my custom object has around 41 attributes, and I'd like to give testers the option of choosing which attributes they want to verify in this RowFixture. Plus, unless I dynamically add fields/methods to my InstanceWrapper class, how will RowFixture invoke either of my getters, since both expect the attribute name to be passed as a parameter (code copied below)?
I extended RowFixture to bind to my method, but I'm not sure how to extend TypeAdapter so that it invokes the method with the attribute name.
Any suggestions?
public class InstanceWrapper {
private Instance instance;
private Map<String, Object> attrs;
public int index;
public InstanceWrapper() {
super();
}
public InstanceWrapper(Instance instance) {
this.instance = instance;
init(); // initialise map
}
private void init() {
attrs = new HashMap<String, Object>();
String attrName;
for (AttrDef attrDef : instance.getModelDef().getAttrDefs()) {
attrName = attrDef.getAttrName();
attrs.put(attrName, instance.getChildScalar(attrName));
}
}
public String getAttribute(String attr) {
return attrs.get(attr).toString();
}
public String description(String attribute) {
return instance.getChildScalar(attribute).toString();
}
}
public class MyDisplayRules extends fit.RowFixture {
@Override
public Object[] query() {
List<Instance> list = PHEFixture.hierarchyList;
return convertInstances(list);
}
private Object[] convertInstances(List<Instance> instances) {
Object[] objects = new Object[instances.size()];
InstanceWrapper wrapper;
int index = 0;
for (Instance instance : instances) {
wrapper = new InstanceWrapper(instance);
wrapper.index = index;
objects[index++] = wrapper;
}
return objects;
}
@Override
public Class getTargetClass() {
return InstanceWrapper.class;
}
@Override
public Object parse(String s, Class type) throws Exception {
return super.parse(s, type);
}
@Override
protected void bind(Parse heads) {
columnBindings = new TypeAdapter[heads.size()];
for (int i = 0; heads != null; i++, heads = heads.more) {
String name = heads.text();
String suffix = "()";
try {
if (name.equals("")) {
columnBindings[i] = null;
} else if (name.endsWith(suffix)) {
columnBindings[i] = bindMethod("description", name.substring(0, name.length()
- suffix.length()));
} else {
columnBindings[i] = bindField(name);
}
} catch (Exception e) {
exception(heads, e);
}
}
}
protected TypeAdapter bindMethod(String name, String attribute) throws Exception {
Class partypes[] = new Class[1];
partypes[0] = String.class;
return PHETypeAdapter.on(this, getTargetClass().getMethod("getAttribute", partypes), attribute);
}
}
For what it's worth, here's how I eventually worked around the problem:
I created a custom type adapter (extending fit.TypeAdapter) with an additional public String attribute, attrName, and overrode invoke():
@Override
public Object invoke() throws IllegalAccessException, InvocationTargetException {
if ("getAttribute".equals(method.getName())) {
Object params[] = { attrName };
return method.invoke(target, params);
} else {
return super.invoke();
}
}
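For clarity, the surrounding class is just a thin subclass; apart from attrName and the constructor it relies entirely on what it inherits from fit.TypeAdapter (a sketch):
public class PHETypeAdapter extends TypeAdapter {

    public String attrName;

    public PHETypeAdapter(String attrName) {
        this.attrName = attrName;
    }

    // plus the invoke() override shown above
}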
Then I extended fit.RowFixture and made the following overrides:
public Class getTargetClass() - to return my class reference.
protected TypeAdapter bindField(String name) throws Exception - this is a protected method in ColumnFixture, which I modified so that it uses my class's getter method:
@Override
protected TypeAdapter bindField(String name) throws Exception {
String fieldName = camel(name);
// for all attributes, use method getAttribute(String)
Class methodParams[] = new Class[1];
methodParams[0] = String.class;
TypeAdapter a = TypeAdapter.on(this, getTargetClass().getMethod("getAttribute", methodParams));
PHETypeAdapter pheAdapter = new PHETypeAdapter(fieldName);
pheAdapter.target = a.target;
pheAdapter.fixture = a.fixture;
pheAdapter.field = a.field;
pheAdapter.method = a.method;
pheAdapter.type = a.type;
return pheAdapter;
}
I know this is not a neat solution, but it was the best I could come up with. Maybe I'll get some better solutions here :-)