I am using Spring Boot with MongoDB, and I have a document as shown in the image below.
(image of the document in MongoDB)
I mapped it to a Java class:
@Document(collection = "instrument")
public abstract class Instrument extends BaseMongoEntity<String> {
    @NotEmpty
    private String feedProviderId;
    @NotEmpty
    private String code;
    @NotEmpty
    private String name;
    private Map<String, Object> instrumentData;
    private Map<String, Object> commonInfo;
    private List<InstrumentHistoricalData> historicalData;
    private List<DelayedInstrumentData> delayedData;
    private String market;
    // Getters, setters, builders, etc.
}
Of course, the instrumentData field contains lots of data, but for the sake of the argument I only showed those two entries in the document.
My problem is that I can't convert NOW_PRICE to BigDecimal. I can write it with no problem (BigDecimal to Decimal128), but not the other way around.
I have configured both the reading and writing converters as shown below:
@Configuration
public class MongoConfig {

    @Bean
    public MongoCustomConversions mongoCustomConversions() {
        return new MongoCustomConversions(Arrays.asList(
                new BigDecimalDecimal128Converter(),
                new Decimal128BigDecimalConverter()
        ));
    }

    @WritingConverter
    private static class BigDecimalDecimal128Converter implements Converter<BigDecimal, Decimal128> {
        @Override
        public Decimal128 convert(@NonNull BigDecimal source) {
            return new Decimal128(source);
        }
    }

    @ReadingConverter
    private static class Decimal128BigDecimalConverter implements Converter<Decimal128, BigDecimal> {
        @Override
        public BigDecimal convert(@NonNull Decimal128 source) {
            return source.bigDecimalValue();
        }
    }
}
So, checking MappingMongoConverter, I noticed this:
protected Map<Object, Object> readMap(TypeInformation<?> type, Bson bson, ObjectPath path) {
Assert.notNull(bson, "Document must not be null!");
Assert.notNull(path, "Object path must not be null!");
Class<?> mapType = typeMapper.readType(bson, type).getType();
TypeInformation<?> keyType = type.getComponentType();
TypeInformation<?> valueType = type.getMapValueType();
Class<?> rawKeyType = keyType != null ? keyType.getType() : null;
Class<?> rawValueType = valueType != null ? valueType.getType() : null;
Map<String, Object> sourceMap = asMap(bson);
Map<Object, Object> map = CollectionFactory.createMap(mapType, rawKeyType, sourceMap.keySet().size());
if (!DBRef.class.equals(rawValueType) && isCollectionOfDbRefWhereBulkFetchIsPossible(sourceMap.values())) {
bulkReadAndConvertDBRefMapIntoTarget(valueType, rawValueType, sourceMap, map);
return map;
}
for (Entry<String, Object> entry : sourceMap.entrySet()) {
if (typeMapper.isTypeKey(entry.getKey())) {
continue;
}
Object key = potentiallyUnescapeMapKey(entry.getKey());
if (rawKeyType != null && !rawKeyType.isAssignableFrom(key.getClass())) {
key = conversionService.convert(key, rawKeyType);
}
Object value = entry.getValue();
TypeInformation<?> defaultedValueType = valueType != null ? valueType : ClassTypeInformation.OBJECT;
if (value instanceof Document) {
map.put(key, read(defaultedValueType, (Document) value, path));
} else if (value instanceof BasicDBObject) {
map.put(key, read(defaultedValueType, (BasicDBObject) value, path));
} else if (value instanceof DBRef) {
map.put(key, DBRef.class.equals(rawValueType) ? value
: readAndConvertDBRef((DBRef) value, defaultedValueType, ObjectPath.ROOT, rawValueType));
} else if (value instanceof List) {
map.put(key, readCollectionOrArray(valueType != null ? valueType : ClassTypeInformation.LIST,
(List<Object>) value, path));
} else {
map.put(key, getPotentiallyConvertedSimpleRead(value, rawValueType));
}
}
return map;
}
So it only checks whether the value is an instance of Document, BasicDBObject, DBRef, or List. Otherwise it assumes the value is already mapped, which it is not, because it's a numeric value, and that possibility is not being considered.
Am I missing something? Is there a workaround for this problem? Thank you!
If you dig down into getPotentiallyConvertedSimpleRead(…), you will see that we check the CustomConversions instance for whether there is a conversion registered. I can only assume there's something wrong in your configuration, but there's nothing terribly off that I can see. Do you have an example project to share that I can play with?
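While that is being sorted out, one possible workaround (a sketch, not the confirmed root cause) is to post-process the loaded map yourself and convert any Decimal128 leaves to BigDecimal, since readMap(…) hands raw driver values straight through for Map<String, Object> fields. The recursive walker below is plain Java; MapValueConverter and convertInPlace are made-up names, and in the real entity you would pass `v -> v instanceof Decimal128 ? ((Decimal128) v).bigDecimalValue() : v` as the converter.

```java
import java.math.BigDecimal;
import java.util.HashMap;
import java.util.Map;
import java.util.function.UnaryOperator;

final class MapValueConverter {

    // Recursively applies the converter to every non-map value in the map,
    // descending into nested maps (as produced for nested documents).
    @SuppressWarnings("unchecked")
    static void convertInPlace(Map<String, Object> map, UnaryOperator<Object> convert) {
        map.replaceAll((key, value) -> {
            if (value instanceof Map) {
                convertInPlace((Map<String, Object>) value, convert);
                return value;
            }
            return convert.apply(value);
        });
    }
}
```

Demonstrated here with strings standing in for Decimal128 values, the same walker would normalize instrumentData after a find.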
I have an enum like the one below. Until recently, all constants were single-valued. However, now TYPE4 can have one of three acceptable values. I was hoping to simply modify this enum to accommodate TYPE4, but I suspect that having one multi-valued type means I need to use an object for mapping rather than an enum. I would be grateful for any insights. Thank you.
public enum Record {
    TYPE1("TYPE1"),
    TYPE2("TYPE2"),
    TYPE3("TYPE3"),
    TYPE4_MULTI(TYPE_A or TYPE_B or TYPE_C); // pseudocode: needs to accept one of three values

    private final String value;

    public static final Map<Record, String> enumMap = new EnumMap<Record, String>(Record.class);

    static {
        for (Record e : Record.values())
            enumMap.put(e, e.getValue());
    }

    Record(String value) {
        this.value = value;
    }

    public String getValue() {
        return value;
    }
}
Operationally, I use this enum in a factory class to determine which of the four subclass types I should instantiate. I do this by having each subclass know its own type, like this:
@Override
public String getType() {
    return Record.TYPE1.getValue();
}
and then the factory class pre-builds a map of the subclasses like this:
@Component
public class RecordProcessorFactory {

    private static final Map<String, RecordProcessor> processorCache = new HashMap<String, RecordProcessor>();

    @Autowired
    public RecordProcessorFactory(List<RecordProcessor> processors) {
        for (RecordProcessor recordProcessor : processors) {
            processorCache.put(recordProcessor.getType(), recordProcessor);
        }
    }

    public RecordProcessor getSyncProcessor(String type) {
        RecordProcessor service = processorCache.get(type);
        if (service == null) throw new RuntimeException("Unknown service type: " + type);
        return service;
    }
}
You could use a String array to store multiple values; note that your enumMap logic may need to change accordingly.
public enum Record {
    TYPE1("TYPE1"),
    TYPE2("TYPE2"),
    TYPE3("TYPE3"),
    TYPE4_MULTI("TYPE_A", "TYPE_B", "TYPE_C");

    private final String[] values;

    public static final Map<Record, String[]> enumMap = new EnumMap<Record, String[]>(Record.class);

    static {
        for (Record e : Record.values())
            enumMap.put(e, e.getValues());
    }

    Record(String... values) {
        this.values = values;
    }

    public String[] getValues() {
        return values;
    }
}
In case you need to get the Enum from a String value, you could add this static method:
public static Optional<Record> optionalValueOf(final String value) {
    for (Record record : values()) {
        for (String recordValue : record.values) {
            // Objects.equals handles the null case without risking a NullPointerException
            if (Objects.equals(value, recordValue)) {
                return Optional.of(record);
            }
        }
    }
    return Optional.empty();
}
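For completeness, here is a compact, self-contained version of the multi-valued enum plus the reverse lookup (using Objects.equals for the null-safe comparison), so it can be tried in isolation:

```java
import java.util.Objects;
import java.util.Optional;

enum Record {
    TYPE1("TYPE1"),
    TYPE2("TYPE2"),
    TYPE3("TYPE3"),
    TYPE4_MULTI("TYPE_A", "TYPE_B", "TYPE_C");

    private final String[] values;

    Record(String... values) {
        this.values = values;
    }

    // Reverse lookup: which constant accepts the given string value?
    static Optional<Record> optionalValueOf(String value) {
        for (Record record : values()) {
            for (String recordValue : record.values) {
                if (Objects.equals(value, recordValue)) {
                    return Optional.of(record);
                }
            }
        }
        return Optional.empty();
    }
}
```

Any of the three strings maps back to TYPE4_MULTI, and an unknown string yields an empty Optional.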
I think it's better to encapsulate the values in the enum, and they should be immutable (an array is not an immutable data structure).
@lombok.Getter
public enum Record {
    TYPE1("TYPE1"),
    TYPE2("TYPE2"),
    TYPE3("TYPE3"),
    TYPE4_MULTI("TYPE_A", "TYPE_B", "TYPE_C");

    // immutable list
    private final List<String> values;

    Record(String... values) {
        this.values = Collections.unmodifiableList(Arrays.asList(values));
    }
}
P.S. I think the Map<Record, String> enumMap is useless, because you already have a Record, and all you need is to call record.getValues() instead of Record.enumMap.get(record). Also, it breaks encapsulation.
I would like to use Guava as a cache, but I can't seem to find out whether Guava is capable of letting me load multiple items and get multiple items.
I see CacheLoader has the following:
@Override
public Value load(String key) {
    return getKey();
}
And what I need to load is:
@Override
public List<Value> load(List<String> keys) {
    return getKeys();
}
I would also expect to get one item or a list of items from the cache; I am happy even if I have to wrap a single item in a list just to get it.
I'm new to Guava, so I'm not sure whether it has such functionality.
You can use CacheLoader.loadAll() to load multiple items, and LoadingCache.getAll() to get them.
For example:
new CacheLoader<String, Value>() {
    @Override
    public Value load(String key) {
        return getKey();
    }

    // Note: the bulk-load override must be named loadAll, not load
    @Override
    public Map<String, Value> loadAll(Iterable<? extends String> keys) {
        return getKeys();
    }
}
//...
List<String> keys = Arrays.asList("key1", "key2", "key3");
ImmutableMap<String, Value> values = cache.getAll(keys);
You can create a LoadingCache (just as an example) as:
private final LoadingCache<String, Object> cache;
where String could be your key's datatype and Object could be your value's datatype.
You can then initialise it using CacheBuilder as:
cache = CacheBuilder.newBuilder()
        .initialCapacity(10)
        .maximumSize(50)
        .recordStats()
        .build(new CacheLoader<String, Object>() {
            @Override
            public Object load(String s) throws Exception {
                return null; // load and return the value for this key here
            }
        });
and furthermore implement methods to get a value from the cache based on its key, and to put a value into the cache for a key-value pair, in somewhat this format:
public Object get(String key) {
    try {
        return cache.getIfPresent(key);
    } catch (Exception e) {
        System.out.println(e.getMessage());
        return null;
    }
}

public boolean put(String key, Object object) {
    cache.put(key, object);
    return true;
}
public class Cache {

    private final com.google.common.cache.Cache<Key, Value> cache;
    private final DataDAO dataDao;

    public Cache(DataDAO dataDao) {
        this.dataDao = dataDao;
        this.cache = CacheBuilder.newBuilder().build();
    }

    public Value getValue(Key key) {
        Value value = cache.getIfPresent(key);
        if (value == null) {
            value = dataDao.getById(key);
            cache.put(key, value);
        }
        return value;
    }

    public List<Value> getValues(List<Key> keys) {
        List<Key> notInCacheKeys = new ArrayList<>();
        for (Key key : keys) {
            if (cache.getIfPresent(key) == null) {
                notInCacheKeys.add(key);
            }
        }
        // Fetch the missing entries in one call and store them in the cache
        List<Value> newlyRetrievedValues = dataDao.getByIds(notInCacheKeys);
        for (int i = 0; i < notInCacheKeys.size(); i++) {
            cache.put(notInCacheKeys.get(i), newlyRetrievedValues.get(i));
        }
        // Return the values from the cache in the order the keys were requested
        List<Value> values = new ArrayList<>();
        for (Key key : keys) {
            values.add(cache.getIfPresent(key));
        }
        return values;
    }
}
I have decided to abandon CacheLoader and LoadingCache and just work with cache directly.
How can I get Gson to skip serialization of fields with specific default values? I can, for example, annotate the fields with a custom annotation for a TypeAdapter to parse, but I struggle to find out how to write such a TypeAdapter without completely reinventing the wheel (i.e. I could skip the write method of ReflectiveTypeAdapterFactory and write my own with reflection).
Background: I'm sending a GUI over Json, and I want to expand e.g. a panel widget with every possible property (background, border, etc.) but not send all of those as null values because most panels use default values anyway.
POJO:
public class LabelWidget {

    private final String label;
    @DefaultValue("c") private final String align;
    @DefaultValue("12") private final String size;

    public LabelWidget(String label, String align, String size) {
        ...
    }

    public String getLabel() { return label; }
    public String getAlign() { return align == null || align.isEmpty() ? "c" : align; }
    public String getSize() { return size == null || size.isEmpty() ? "12" : size; }
}
Objects:
LabelWidget a = new LabelWidget("foo", "c", "12");
LabelWidget b = new LabelWidget("bar", "l", "50%");
Wanted results:
{"label":"foo"}
{"label":"bar","align":"l","size":"50%"}
I'm not sure how @DefaultValue integrates with Gson, but one solution that works is to actually nullify fields that hold default values at construction time, e.g.:
public LabelWidget(String label, String align, String size) {
    this.label = label;
    this.align = StringUtils.isBlank(align) || "c".equals(align) ? null : align;
    this.size = StringUtils.isBlank(size) || "12".equals(size) ? null : size;
}
In that case your getters will return the correct values, and Gson will not serialize the null fields.
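A plain-Java sketch of that normalization (no Gson needed to see the effect; StringUtils.isBlank is replaced with a simple null/empty check so the snippet is dependency-free):

```java
final class LabelWidget {

    final String label;
    final String align;
    final String size;

    LabelWidget(String label, String align, String size) {
        this.label = label;
        // Store null when the value equals the default; a null-skipping
        // serializer (Gson's default behavior) will then omit the field.
        this.align = isBlank(align) || "c".equals(align) ? null : align;
        this.size = isBlank(size) || "12".equals(size) ? null : size;
    }

    String getAlign() { return align == null ? "c" : align; }

    String getSize() { return size == null ? "12" : size; }

    private static boolean isBlank(String s) {
        return s == null || s.trim().isEmpty();
    }
}
```

The getters restore the defaults, so callers never observe the internal nulls.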
There are no options in Gson to accomplish this out of the box, so you still have to process such an annotation yourself. Ideally, it would be great if Gson provided a visitable ReflectiveTypeAdapterFactory, or enhanced ExclusionStrategy to give access to field values along with the associated fields. However, none of those are available, but it's possible to take one of the following options:
convert your value objects to Map<String, Object> instances (requires intermediate objects to be constructed; probably expensive);
re-implement a @DefaultValue-aware ReflectiveTypeAdapterFactory (I guess this is the best solution, but it could probably be generalized even further);
temporarily strip the @DefaultValue-annotated fields from serialized objects and revert their state back once they are serialized (potentially unsafe and probably a performance hit);
clone values and strip the defaults from the copy, so there is no need to revert anything back (may be expensive too).
Option #3 can be implemented as follows:
@Target(FIELD)
@Retention(RUNTIME)
@interface DefaultValue {
    String value() default "";
}
final class DefaultValueTypeAdapterFactory
implements TypeAdapterFactory {
private static final TypeAdapterFactory defaultValueTypeAdapterFactory = new DefaultValueTypeAdapterFactory();
private DefaultValueTypeAdapterFactory() {
}
static TypeAdapterFactory getDefaultValueTypeAdapterFactory() {
return defaultValueTypeAdapterFactory;
}
@Override
public <T> TypeAdapter<T> create(final Gson gson, final TypeToken<T> typeToken) {
if ( DefaultValueTypeAdapter.hasDefaults(typeToken.getType()) ) {
return new DefaultValueTypeAdapter<>(gson.getDelegateAdapter(this, typeToken));
}
return null;
}
private static final class DefaultValueTypeAdapter<T>
extends TypeAdapter<T> {
private final TypeAdapter<T> delegateAdapter;
private DefaultValueTypeAdapter(final TypeAdapter<T> delegateAdapter) {
this.delegateAdapter = delegateAdapter;
}
@Override
public void write(final JsonWriter out, final T value)
throws IOException {
final Map<Field, Object> defaults = getDefaults(value);
try {
resetFields(value, defaults.keySet());
delegateAdapter.write(out, value);
} finally {
setFields(value, defaults);
}
}
@Override
public T read(final JsonReader in)
throws IOException {
final T value = delegateAdapter.read(in);
trySetAnnotationDefaults(value);
return value;
}
private static boolean hasDefaults(final Type type) {
if ( !(type instanceof Class) ) {
return false;
}
final Class<?> c = (Class<?>) type;
return Stream.of(c.getDeclaredFields())
.flatMap(f -> Stream.of(f.getAnnotationsByType(DefaultValue.class)))
.findAny()
.isPresent();
}
private static Map<Field, Object> getDefaults(final Object o) {
if ( o == null ) {
return emptyMap();
}
final Class<?> c = o.getClass();
final Map<Field, Object> map = Stream.of(c.getDeclaredFields())
.filter(f -> f.isAnnotationPresent(DefaultValue.class))
.filter(f -> !f.getType().isPrimitive()) // primitive fields cause ambiguities
.peek(f -> f.setAccessible(true))
.filter(f -> {
final String defaultValue = f.getAnnotation(DefaultValue.class).value();
final String comparedValue = ofNullable(getUnchecked(o, f)).map(Object::toString).orElse(null);
return defaultValue.equals(comparedValue);
})
.collect(toMap(identity(), f -> getUnchecked(o, f)));
return unmodifiableMap(map);
}
private static void trySetAnnotationDefaults(final Object o) {
if ( o == null ) {
return;
}
final Class<?> c = o.getClass();
Stream.of(c.getDeclaredFields())
.filter(f -> f.isAnnotationPresent(DefaultValue.class))
.forEach(f -> {
f.setAccessible(true);
if ( getUnchecked(o, f) == null ) {
final String annotationValue = f.getAnnotation(DefaultValue.class).value();
setOrDefaultUnchecked(o, f, parseDefaultValue(f.getType(), annotationValue));
}
});
}
private static Object parseDefaultValue(final Class<?> type, final String rawValue) {
if ( type == String.class ) {
return rawValue;
}
if ( type == Boolean.class ) {
return Boolean.valueOf(rawValue);
}
if ( type == Byte.class ) {
return Byte.valueOf(rawValue);
}
if ( type == Short.class ) {
return Short.valueOf(rawValue);
}
if ( type == Integer.class ) {
return Integer.valueOf(rawValue);
}
if ( type == Long.class ) {
return Long.valueOf(rawValue);
}
if ( type == Float.class ) {
return Float.valueOf(rawValue);
}
if ( type == Double.class ) {
return Double.valueOf(rawValue);
}
if ( type == Character.class ) {
final int length = rawValue.length();
if ( length != 1 ) {
throw new IllegalArgumentException("Illegal raw value length: " + length + " for " + rawValue);
}
return rawValue.charAt(0);
}
throw new AssertionError(type);
}
private static void resetFields(final Object o, final Iterable<Field> fields) {
fields.forEach(f -> setOrDefaultUnchecked(o, f, null));
}
private static void setFields(final Object o, final Map<Field, Object> defaults) {
if ( o == null ) {
return;
}
defaults.entrySet().forEach(e -> setOrDefaultUnchecked(o, e.getKey(), e.getValue()));
}
private static Object getUnchecked(final Object o, final Field field) {
try {
return field.get(o);
} catch ( final IllegalAccessException ex ) {
throw new RuntimeException(ex);
}
}
private static void setOrDefaultUnchecked(final Object o, final Field field, final Object value) {
try {
field.set(o, value);
} catch ( final IllegalAccessException ex ) {
throw new RuntimeException(ex);
}
}
}
}
So:
final Gson gson = new GsonBuilder()
.registerTypeAdapterFactory(getDefaultValueTypeAdapterFactory())
.create();
final LabelWidget before = new LabelWidget("label", "c", "12");
out.println(before);
final String json = gson.toJson(before);
out.println(json);
final LabelWidget after = gson.fromJson(json, LabelWidget.class);
out.println(after);
LabelWidget{label='label', align='c', size='12'}
{"label":"label"}
LabelWidget{label='label', align='c', size='12'}
Or you might also reconsider the design of your data transfer architecture and probably proceed with just nulls (that, however, does not let you distinguish between a "really" null value and something like undefined).
If I have class like this:
class MyObject {
    public int myInt;
    public String myString;
}
Is it possible to convert an instance of this class to a HashMap without implementing the conversion code myself?
MyObject obj = new MyObject();
obj.myInt = 1;
obj.myString = "string";
HashMap<String, Object> hs = convert(obj);
hs.getInt("myInt"); // returns 1
hs.getString("myString"); // returns "string"
Does Java provide that kind of solution, or do I need to implement the conversion myself?
My class has more than 50 fields, and writing a converter for each field is not a good idea.
This is also possible with the Jackson library:
MyObject obj = new MyObject();
obj.myInt = 1;
obj.myString = "1";
ObjectMapper mapObject = new ObjectMapper();
Map<String, Object> mapObj = mapObject.convertValue(obj, Map.class);
You can use reflection to implement this behavior: get all fields of the class you want to convert, iterate over those fields, and take the name of each field as the key of the map. This will result in a map from String to Object.
// someInstance is the object you want to convert
Map<String, Object> myObjectAsDict = new HashMap<>();
Field[] allFields = SomeClass.class.getDeclaredFields();
for (Field field : allFields) {
    field.setAccessible(true);
    try {
        // Read the value from the instance being converted, not from a new object
        myObjectAsDict.put(field.getName(), field.get(someInstance));
    } catch (IllegalAccessException e) {
        // skip fields we cannot access
    }
}
Something like that will do the trick:
MyObject obj = new MyObject();
obj.myInt = 1;
obj.myString = "string";

Map<String, Object> map = new HashMap<>();
// Use MyObject.class.getFields() instead of getDeclaredFields()
// if you are interested in public fields only
for (Field field : MyObject.class.getDeclaredFields()) {
    // Skip this if you intend to access public fields only
    if (!field.isAccessible()) {
        field.setAccessible(true);
    }
    map.put(field.getName(), field.get(obj));
}
System.out.println(map);
Output:
{myString=string, myInt=1}
You might consider using a map instead of a class.
Or have your class extend a map such as
public class MyObject extends HashMap<String, Object> {
}
If you don't want to use reflection, then you can use my trick; I hope this may help someone.
Suppose your class looks like this.
public class MyClass {
    private int id;
    private String name;
}
Now override the toString() method in this class (in Eclipse there is a shortcut for generating this method, too).
public class MyClass {

    private int id;
    private String name;

    @Override
    public String toString() {
        StringBuilder builder = new StringBuilder();
        builder.append("MyClass [id=").append(id).append(", name=").append(name).append("]");
        return builder.toString();
    }
}
Now write a method inside this class that will convert your object into Map<String,String>
public Map<String, String> asMap() {
    Map<String, String> map = new HashMap<String, String>();
    String stringRepresentation = this.toString();
    if (stringRepresentation == null || stringRepresentation.trim().equals("")) {
        return map;
    }
    if (stringRepresentation.contains("[")) {
        int index = stringRepresentation.indexOf("[");
        stringRepresentation = stringRepresentation.substring(index + 1, stringRepresentation.length());
    }
    if (stringRepresentation.endsWith("]")) {
        stringRepresentation = stringRepresentation.substring(0, stringRepresentation.length() - 1);
    }
    String[] commaSeparated = stringRepresentation.split(",");
    for (int i = 0; i < commaSeparated.length; i++) {
        String keyEqualsValue = commaSeparated[i].trim();
        if (keyEqualsValue.equals("") || !keyEqualsValue.contains("=")) {
            continue;
        }
        String[] keyValue = keyEqualsValue.split("=", 2);
        if (keyValue.length > 1) {
            map.put(keyValue[0].trim(), keyValue[1].trim());
        }
    }
    return map;
}
Now from anywhere in your application you can simply call this method to get a HashMap from the object. Cheers!
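The parsing above can be exercised in isolation; this condensed helper applies the same split logic to any `ClassName [k=v, k=v]` string (ToStringParser is a made-up name for the demo):

```java
import java.util.HashMap;
import java.util.Map;

final class ToStringParser {

    // Same algorithm as asMap() above, applied to an arbitrary toString() output.
    static Map<String, String> asMap(String s) {
        Map<String, String> map = new HashMap<>();
        if (s == null || s.trim().isEmpty()) {
            return map;
        }
        int open = s.indexOf('[');
        if (open >= 0) {
            s = s.substring(open + 1);
        }
        if (s.endsWith("]")) {
            s = s.substring(0, s.length() - 1);
        }
        for (String pair : s.split(",")) {
            String keyEqualsValue = pair.trim();
            if (keyEqualsValue.isEmpty() || !keyEqualsValue.contains("=")) {
                continue;
            }
            String[] keyValue = keyEqualsValue.split("=", 2);
            map.put(keyValue[0].trim(), keyValue[1].trim());
        }
        return map;
    }
}
```

Note that this breaks as soon as a field value itself contains a comma or an equals sign, which is the inherent fragility of the toString() trick.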
Updated approach using reflection:
public static <T> Map<String, String> parseInnerClass(T classInstance) {
    LinkedHashMap<String, String> ret = new LinkedHashMap<>();
    for (Field attr : classInstance.getClass().getDeclaredFields()) {
        String attrValue = "";
        attr.setAccessible(true);
        try {
            attrValue = attr.get(classInstance).toString();
        } catch (IllegalAccessException | NullPointerException e) {
            // leave the value empty for null or inaccessible fields
        }
        ret.put(attr.getName(), attrValue);
    }
    return ret;
}
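A quick way to sanity-check that approach with plain JDK reflection (Point and Demo are made-up names; the parse method mirrors parseInnerClass above):

```java
import java.lang.reflect.Field;
import java.util.LinkedHashMap;
import java.util.Map;

// A tiny class to convert, including a null field to show the fallback.
class Point {
    private int x = 3;
    private String label = "origin";
    private Object extra = null;
}

final class Demo {

    // Same approach as parseInnerClass: field name -> string value.
    static <T> Map<String, String> parse(T instance) {
        Map<String, String> ret = new LinkedHashMap<>();
        for (Field attr : instance.getClass().getDeclaredFields()) {
            String attrValue = "";
            attr.setAccessible(true);
            try {
                attrValue = attr.get(instance).toString();
            } catch (IllegalAccessException | NullPointerException e) {
                // null or inaccessible fields map to the empty string
            }
            ret.put(attr.getName(), attrValue);
        }
        return ret;
    }
}
```

Null fields come out as empty strings rather than being skipped, which may or may not be what you want.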
Given a single Java class, I'd like to be able to list all properties that are exposed by it and all its ancestors, and to recursively traverse all of their exposed properties (i.e. public or with getters/setters) in the same way.
Easier to explain with a simple example:
public class BaseClass1 {
    private int intProperty; // has getter and setter (not shown)
}

public class SubClass1 extends BaseClass1 {
    private int privateSoNotListed;
    public SubClass2 subClass2Property;
}

public class BaseClass2 {
    public String stringProperty;
}

public class SubClass2 extends BaseClass2 {
    private long longProperty; // has getter and setter (not shown)
}
Given SubClass1 above as input, the output would be something like this:
intProperty - int [from BaseClass1]
subClass2Property.stringProperty - String [from BaseClass2]
subClass2Property.longProperty - long [from SubClass2]
It should be possible to write something like this using a bit of clever reflection, but I'd rather not reinvent the wheel. Is there an existing tool that can do this (perhaps an Eclipse plugin)?
EDIT: Eclipse's Type Hierarchy does a nice job of displaying properties for a single class. The ideal solution in my mind would be a tree view (similar to Package Explorer) with the ability to expand properties that are themselves classes.
See also the duplicate question Recursive BeanUtils.describe(). The following is a custom version we are using (it logs to a log4j logger):
import java.beans.BeanInfo;
import java.beans.IntrospectionException;
import java.beans.Introspector;
import java.beans.PropertyDescriptor;
import java.lang.reflect.Field;
import java.lang.reflect.Method;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;
import java.util.TreeMap;
import org.apache.commons.beanutils.BeanUtilsBean;
import org.apache.commons.beanutils.ConvertUtilsBean;
import org.apache.log4j.Logger;
/*
* See the original version: https://stackoverflow.com/questions/6133660/recursive-beanutils-describe
*/
public class Inspector {
public static void recursivelyDescribeAndLog(Object ob, Logger log){
log.info(ob.getClass());
try {
Map<String, String> props = recursiveDescribe(ob);
for (Map.Entry<String, String> p : props.entrySet()) {
log.info(" -> " + p.getKey() + "="+p.getValue());
}
} catch (Throwable e) {
log.error(e.getMessage(), e);
}
}
public static Map<String, String> recursiveDescribe(Object object) {
Set cache = new HashSet();
return recursiveDescribe(object, null, cache);
}
private static Map<String, String> recursiveDescribe(Object object, String prefix, Set cache) {
if (object == null || cache.contains(object)) return Collections.EMPTY_MAP;
cache.add(object);
prefix = (prefix != null) ? prefix + "." : "";
Map<String, String> beanMap = new TreeMap<String, String>();
Map<String, Object> properties = getProperties(object);
for (String property : properties.keySet()) {
Object value = properties.get(property);
try {
if (value == null) {
//ignore nulls
} else if (Collection.class.isAssignableFrom(value.getClass())) {
beanMap.putAll(convertAll((Collection) value, prefix + property, cache));
} else if (value.getClass().isArray()) {
beanMap.putAll(convertAll(Arrays.asList((Object[]) value), prefix + property, cache));
} else if (Map.class.isAssignableFrom(value.getClass())) {
beanMap.putAll(convertMap((Map) value, prefix + property, cache));
} else {
beanMap.putAll(convertObject(value, prefix + property, cache));
}
} catch (Exception e) {
e.printStackTrace();
}
}
return beanMap;
}
private static Map<String, Object> getProperties(Object object) {
Map<String, Object> propertyMap = getFields(object);
//getters take precedence in case of any name collisions
propertyMap.putAll(getGetterMethods(object));
return propertyMap;
}
private static Map<String, Object> getGetterMethods(Object object) {
Map<String, Object> result = new HashMap<String, Object>();
BeanInfo info;
try {
info = Introspector.getBeanInfo(object.getClass());
for (PropertyDescriptor pd : info.getPropertyDescriptors()) {
Method reader = pd.getReadMethod();
if (reader != null) {
String name = pd.getName();
if (!"class".equals(name)) {
try {
Object value = reader.invoke(object);
result.put(name, value);
} catch (Exception e) {
//you can choose to do something here
}
}
}
}
} catch (IntrospectionException e) {
//you can choose to do something here
} finally {
return result;
}
}
private static Map<String, Object> getFields(Object object) {
return getFields(object, object.getClass());
}
private static Map<String, Object> getFields(Object object, Class<?> classType) {
Map<String, Object> result = new HashMap<String, Object>();
Class superClass = classType.getSuperclass();
if (superClass != null) result.putAll(getFields(object, superClass));
//get public fields only
Field[] fields = classType.getFields();
for (Field field : fields) {
try {
result.put(field.getName(), field.get(object));
} catch (IllegalAccessException e) {
//you can choose to do something here
}
}
return result;
}
private static Map<String, String> convertAll(Collection<Object> values, String key, Set cache) {
Map<String, String> valuesMap = new HashMap<String, String>();
Object[] valArray = values.toArray();
for (int i = 0; i < valArray.length; i++) {
Object value = valArray[i];
if (value != null) valuesMap.putAll(convertObject(value, key + "[" + i + "]", cache));
}
return valuesMap;
}
private static Map<String, String> convertMap(Map<Object, Object> values, String key, Set cache) {
Map<String, String> valuesMap = new HashMap<String, String>();
for (Object thisKey : values.keySet()) {
Object value = values.get(thisKey);
if (value != null) valuesMap.putAll(convertObject(value, key + "[" + thisKey + "]", cache));
}
return valuesMap;
}
private static ConvertUtilsBean converter = BeanUtilsBean.getInstance().getConvertUtils();
private static Map<String, String> convertObject(Object value, String key, Set cache) {
//if this type has a registered converted, then get the string and return
if (converter.lookup(value.getClass()) != null) {
String stringValue = converter.convert(value);
Map<String, String> valueMap = new HashMap<String, String>();
valueMap.put(key, stringValue);
return valueMap;
} else {
//otherwise, treat it as a nested bean that needs to be described itself
return recursiveDescribe(value, key, cache);
}
}
}
Have a look at Apache Commons BeanUtils. It has utility classes that will allow you to list properties (among other things), specifically PropertyUtilsBean.getPropertyDescriptors().
Note that its definition of a "property" is something that is accessible/editable via getter/setter methods; if you want to list fields, you'd need to do something else.
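If you'd rather stay on the plain JDK, java.beans.Introspector gives you the same getter/setter-based notion of a property without the extra dependency (Sample and PropertyLister are made-up names for the sketch):

```java
import java.beans.BeanInfo;
import java.beans.IntrospectionException;
import java.beans.Introspector;
import java.beans.PropertyDescriptor;

// A small bean with two getter/setter-backed properties.
class Sample {
    private int count;
    private String title;

    public int getCount() { return count; }
    public void setCount(int count) { this.count = count; }
    public String getTitle() { return title; }
    public void setTitle(String title) { this.title = title; }
}

final class PropertyLister {

    // Lists bean properties; stopping at Object.class excludes the "class" pseudo-property.
    static String[] propertyNames(Class<?> c) {
        try {
            BeanInfo info = Introspector.getBeanInfo(c, Object.class);
            PropertyDescriptor[] pds = info.getPropertyDescriptors();
            String[] names = new String[pds.length];
            for (int i = 0; i < pds.length; i++) {
                names[i] = pds[i].getName();
            }
            return names;
        } catch (IntrospectionException e) {
            throw new RuntimeException(e);
        }
    }
}
```

Like BeanUtils, this only sees getter/setter pairs, not bare fields.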
I have just found a useful way of achieving something fairly similar to what was originally asked, via Eclipse's Type Hierarchy.
There is a toggle named "Show All Inherited Members", as shown by the red arrow below:
(screenshot of the Type Hierarchy toolbar)
After selecting it, the fields and methods from all superclasses are displayed in addition to those of the selected class (with a clear indication of where each one came from), as shown below:
(screenshot of the expanded Type Hierarchy view)
(Of course, this includes more than just properties, but since the getters are displayed in alphabetical order and there are icons for public/private/protected, it can be used to obtain this information easily enough.)