During a Hibernate Session, I am loading some objects and some of them are loaded as proxies due to lazy loading. It's all OK and I don't want to turn lazy loading off.
But later I need to send some of the objects (actually one object) to the GWT client via RPC. And it happens that this concrete object is a proxy. So I need to turn it into a real object. I can't find a method like "materialize" in Hibernate.
How can I turn these proxies into real objects, knowing their class and ID?
At the moment the only solution I see is to evict that object from Hibernate's cache and reload it, but it is really bad for many reasons.
Here's a method I'm using.
public static <T> T initializeAndUnproxy(T entity) {
    if (entity == null) {
        throw new NullPointerException("Entity passed for initialization is null");
    }
    Hibernate.initialize(entity);
    if (entity instanceof HibernateProxy) {
        entity = (T) ((HibernateProxy) entity).getHibernateLazyInitializer()
                .getImplementation();
    }
    return entity;
}
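For example, to send a lazily loaded association over GWT RPC (a sketch; Order and Customer are hypothetical entities):

Order order = session.get(Order.class, orderId);
// getCustomer() may hand back a HibernateProxy; unwrap it before serializing
Customer customer = initializeAndUnproxy(order.getCustomer());
// customer is now the real entity instance, safe to send to the client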
Since Hibernate ORM 5.2.10, you can do it like this:
Object unproxiedEntity = Hibernate.unproxy(proxy);
Before Hibernate 5.2.10, the simplest way to do that was to use the unproxy method offered by Hibernate's internal PersistenceContext implementation:
Object unproxiedEntity = ((SessionImplementor) session)
        .getPersistenceContext()
        .unproxy(proxy);
Try to use Hibernate.getClass(obj)
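Hibernate.getClass() returns the real entity class behind a proxy; combined with the proxy's identifier you can then load the concrete instance. A minimal sketch, assuming an open Session:

// real class and id behind the proxy
Class<?> realClass = Hibernate.getClass(entity);
Serializable id = ((HibernateProxy) entity).getHibernateLazyInitializer().getIdentifier();
// load the concrete, unproxied instance
Object real = session.get(realClass.getName(), id);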
I've written the following code, which cleans an object graph of proxies (if they are not already initialized):
public class PersistenceUtils {

    private static void cleanFromProxies(Object value, List<Object> handledObjects) {
        if ((value != null) && (!isProxy(value)) && !containsTotallyEqual(handledObjects, value)) {
            handledObjects.add(value);
            if (value instanceof Iterable) {
                for (Object item : (Iterable<?>) value) {
                    cleanFromProxies(item, handledObjects);
                }
            } else if (value.getClass().isArray()) {
                for (Object item : (Object[]) value) {
                    cleanFromProxies(item, handledObjects);
                }
            }
            BeanInfo beanInfo = null;
            try {
                beanInfo = Introspector.getBeanInfo(value.getClass());
            } catch (IntrospectionException e) {
                // LOGGER.warn(e.getMessage(), e);
            }
            if (beanInfo != null) {
                for (PropertyDescriptor property : beanInfo.getPropertyDescriptors()) {
                    try {
                        if ((property.getWriteMethod() != null) && (property.getReadMethod() != null)) {
                            Object fieldValue = property.getReadMethod().invoke(value);
                            if (isProxy(fieldValue)) {
                                fieldValue = unproxyObject(fieldValue);
                                property.getWriteMethod().invoke(value, fieldValue);
                            }
                            cleanFromProxies(fieldValue, handledObjects);
                        }
                    } catch (Exception e) {
                        // LOGGER.warn(e.getMessage(), e);
                    }
                }
            }
        }
    }

    public static <T> T cleanFromProxies(T value) {
        T result = unproxyObject(value);
        cleanFromProxies(result, new ArrayList<Object>());
        return result;
    }

    private static boolean containsTotallyEqual(Collection<?> collection, Object value) {
        if (CollectionUtils.isEmpty(collection)) {
            return false;
        }
        for (Object object : collection) {
            if (object == value) {
                return true;
            }
        }
        return false;
    }

    public static boolean isProxy(Object value) {
        if (value == null) {
            return false;
        }
        if ((value instanceof HibernateProxy) || (value instanceof PersistentCollection)) {
            return true;
        }
        return false;
    }

    private static Object unproxyHibernateProxy(HibernateProxy hibernateProxy) {
        Object result = hibernateProxy.writeReplace();
        if (!(result instanceof SerializableProxy)) {
            return result;
        }
        return null;
    }

    @SuppressWarnings("unchecked")
    private static <T> T unproxyObject(T object) {
        if (isProxy(object)) {
            if (object instanceof PersistentCollection) {
                PersistentCollection persistentCollection = (PersistentCollection) object;
                return (T) unproxyPersistentCollection(persistentCollection);
            } else if (object instanceof HibernateProxy) {
                HibernateProxy hibernateProxy = (HibernateProxy) object;
                return (T) unproxyHibernateProxy(hibernateProxy);
            } else {
                return null;
            }
        }
        return object;
    }

    private static Object unproxyPersistentCollection(PersistentCollection persistentCollection) {
        if (persistentCollection instanceof PersistentSet) {
            return unproxyPersistentSet((Map<?, ?>) persistentCollection.getStoredSnapshot());
        }
        return persistentCollection.getStoredSnapshot();
    }

    private static <T> Set<T> unproxyPersistentSet(Map<T, ?> persistenceSet) {
        return new LinkedHashSet<T>(persistenceSet.keySet());
    }
}
I use this function on the results of my RPC services (via aspects), and it recursively cleans all result objects of proxies (if they are not initialized).
The way I recommend with JPA 2:
Object unproxied = entityManager.unwrap(SessionImplementor.class).getPersistenceContext().unproxy(proxy);
Starting from Hibernate 5.2.10, you can use the Hibernate.unproxy method to convert a proxy to your real entity:
MyEntity myEntity = (MyEntity) Hibernate.unproxy(proxyMyEntity);
Another workaround is to call
Hibernate.initialize(extractedObject.getSubobjectToUnproxy());
just before closing the session.
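For example (a sketch; the entity and getter are the placeholders used above):

ExtractedObject loadFully(Session session, Long id) {
    ExtractedObject extractedObject = session.get(ExtractedObject.class, id);
    // force the lazy association to load while the session is still open
    Hibernate.initialize(extractedObject.getSubobjectToUnproxy());
    return extractedObject;
}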
With Spring Data JPA and Hibernate, I was using subinterfaces of JpaRepository to look up objects belonging to a type hierarchy that was mapped using the "join" strategy. Unfortunately, the queries were returning proxies of the base type instead of instances of the expected concrete types. This prevented me from casting the results to the correct types. Like you, I came here looking for an effective way to get my entities unproxied.
Vlad has the right idea for unproxying these results; Yannis provides a little more detail. Adding to their answers, here's the rest of what you might be looking for:
The following code provides an easy way to unproxy your proxied entities:
import org.hibernate.engine.spi.PersistenceContext;
import org.hibernate.engine.spi.SessionImplementor;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.jpa.repository.JpaContext;
import org.springframework.stereotype.Component;

@Component
public final class JpaHibernateUtil {

    private static JpaContext jpaContext;

    @Autowired
    JpaHibernateUtil(JpaContext jpaContext) {
        JpaHibernateUtil.jpaContext = jpaContext;
    }

    public static <Type> Type unproxy(Type proxied, Class<Type> type) {
        PersistenceContext persistenceContext =
                jpaContext
                        .getEntityManagerByManagedType(type)
                        .unwrap(SessionImplementor.class)
                        .getPersistenceContext();
        Type unproxied = (Type) persistenceContext.unproxyAndReassociate(proxied);
        return unproxied;
    }
}
You can pass either unproxied entities or proxied entities to the unproxy method. If they are already unproxied, they'll simply be returned. Otherwise, they'll get unproxied and returned.
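For example (a sketch; BaseEntity, ConcreteEntity and the repository are hypothetical):

BaseEntity proxied = repository.getOne(id); // may return a proxy of the base type
BaseEntity real = JpaHibernateUtil.unproxy(proxied, BaseEntity.class);
if (real instanceof ConcreteEntity) {
    ConcreteEntity concrete = (ConcreteEntity) real; // the cast that failed on the proxy now works
}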
Hope this helps!
Thank you for the suggested solutions! Unfortunately, none of them worked for my case: receiving a list of CLOB objects from an Oracle database through JPA (Hibernate), using a native query.
All of the proposed approaches gave me either a ClassCastException or just returned a Java proxy object (which deep inside contained the desired Clob).
So my solution is the following (based on several of the above approaches):
Query sqlQuery = manager.createNativeQuery(queryStr);
List resultList = sqlQuery.getResultList();
List<String> resultCollection = new ArrayList<>(); // declared here; the original snippet assumed it exists
for (Object resultProxy : resultList) {
    String unproxiedClob = unproxyClob(resultProxy);
    if (unproxiedClob != null) {
        resultCollection.add(unproxiedClob);
    }
}
private String unproxyClob(Object proxy) {
    try {
        BeanInfo beanInfo = Introspector.getBeanInfo(proxy.getClass());
        for (PropertyDescriptor property : beanInfo.getPropertyDescriptors()) {
            Method readMethod = property.getReadMethod();
            // null-guard: write-only properties have no read method
            if (readMethod != null && readMethod.getName().contains("getWrappedClob")) {
                Object result = readMethod.invoke(proxy);
                return clobToString((Clob) result);
            }
        }
    }
    catch (InvocationTargetException | IntrospectionException | IllegalAccessException | SQLException | IOException e) {
        LOG.error("Unable to unproxy CLOB value.", e);
    }
    return null;
}
private String clobToString(Clob data) throws SQLException, IOException {
    StringBuilder sb = new StringBuilder();
    Reader reader = data.getCharacterStream();
    BufferedReader br = new BufferedReader(reader);
    String line;
    while (null != (line = br.readLine())) {
        sb.append(line);
    }
    br.close();
    return sb.toString();
}
Hope this will help somebody!
I found a solution to deproxy a class using the standard Java and JPA API. Tested with Hibernate, but it does not require Hibernate as a dependency and should work with all JPA providers.
Only one requirement: it's necessary to modify the parent class (Address) and add a simple helper method.
General idea: add a helper method to the parent class which returns itself. When the method is called on a proxy, it will forward the call to the real instance and return that real instance.
Implementation is a little bit more complex, as Hibernate recognizes that the proxied class returns itself and still returns the proxy instead of the real instance. The workaround is to wrap the returned instance in a simple wrapper class, which has a different class type than the real instance.
In code:
class Address {
    public AddressWrapper getWrappedSelf() {
        return new AddressWrapper(this);
    }
    ...
}

class AddressWrapper {
    private Address wrappedAddress;
    ...
}
To cast the Address proxy to the real subclass, use the following:
Address address = dao.getSomeAddress(...);
Address deproxiedAddress = address.getWrappedSelf().getWrappedAddress();
if (deproxiedAddress instanceof WorkAddress) {
    WorkAddress workAddress = (WorkAddress) deproxiedAddress;
}
I've been using modelmapper and Java 8 Optionals all around the application, which was working fine because the fields were primitive types; until I changed one of my model objects' fields to an Optional type. Then all hell broke loose. It turns out many libraries cannot handle generics very well.
Here is the structure:

public class MyObjectDto {
    private Optional<MySubObjectDto> mySubObject;
}

public class MyObject {
    private Optional<MySubObject> mySubObject;
}
When I attempt to map MyObjectDto to MyObject, modelmapper calls
public void setMySubObject(Optional<MySubObject> mySubObject) {
    this.mySubObject = mySubObject;
}
with an Optional<MySubObjectDto>, and I don't understand how that's even possible (there is no inheritance between them). Of course that crashes fast. For now I've changed my setters to accept the DTO type just to survive the day, but that's not going to work in the long run. Is there a better way to get around this, or shall I create an issue?
So I dug into the modelmapper code and came up with this after looking at some of the generic converter implementations:
modelMapper.createTypeMap(Optional.class, Optional.class).setConverter(new OptionalConverter());

public class OptionalConverter implements ConditionalConverter<Optional, Optional> {

    public MatchResult match(Class<?> sourceType, Class<?> destinationType) {
        if (Optional.class.isAssignableFrom(destinationType)) {
            return MatchResult.FULL;
        } else {
            return MatchResult.NONE;
        }
    }

    private Class<?> getElementType(MappingContext<Optional, Optional> context) {
        Mapping mapping = context.getMapping();
        if (mapping instanceof PropertyMapping) {
            PropertyInfo destInfo = ((PropertyMapping) mapping).getLastDestinationProperty();
            Class<?> elementType = TypeResolver.resolveArgument(destInfo.getGenericType(),
                    destInfo.getInitialType());
            return elementType == TypeResolver.Unknown.class ? Object.class : elementType;
        } else if (context.getGenericDestinationType() instanceof ParameterizedType) {
            return Types.rawTypeFor(((ParameterizedType) context.getGenericDestinationType()).getActualTypeArguments()[0]);
        }
        return Object.class;
    }

    public Optional<?> convert(MappingContext<Optional, Optional> context) {
        Class<?> optionalType = getElementType(context);
        Optional source = context.getSource();
        Object dest = null;
        if (source != null && source.isPresent()) {
            MappingContext<?, ?> optionalContext = context.create(source.get(), optionalType);
            dest = context.getMappingEngine().map(optionalContext);
        }
        return Optional.ofNullable(dest);
    }
}
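With the converter registered as shown above, mapping then works through the Optional fields; a small usage sketch (names from the question):

ModelMapper modelMapper = new ModelMapper();
modelMapper.createTypeMap(Optional.class, Optional.class).setConverter(new OptionalConverter());
MyObject myObject = modelMapper.map(new MyObjectDto(), MyObject.class);
// mySubObject is mapped element-wise: Optional<MySubObjectDto> -> Optional<MySubObject>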
I am trying to map an A-DTO object to an A-DO object, each having a collection (a List) of T-DTOs and T-DOs, respectively. I am trying to do it in the context of a REST API. Whether it's the right approach is a separate question; the problem I'm solving is the update case. Basically, if one of the T-DTOs inside the A-DTO changes, I want that change to be mapped onto the corresponding T-DO inside the A-DO.
I found relationship-type="non-cumulative" in the Dozer documentation, so that the object inside the collection is updated if present. But I end up with Dozer inserting a new T-DO into the A-DO's collection!
NOTE: I did implement equals! It is based on the primary key only for now.
Any ideas?
PS: and if you think it's a bad idea to handle updates to a one-to-many dependent entity this way, feel free to point that out. I'm not 100% sure I like this approach, but my REST foo is not very strong.
UPDATE
equals implementation:
@Override
public boolean equals(Object obj) {
    if (obj instanceof MyDOClass) {
        MyDOClass other = (MyDOClass) obj;
        return other.getId().equals(this.getId());
    }
    return false;
}
I just had the same problem and I solved it:
Dozer uses contains() to determine whether a member is already inside a collection.
You should implement hashCode so that contains() works appropriately.
You can see this in the following documentation page:
http://dozer.sourceforge.net/documentation/collectionandarraymapping.html
Under: "Cumulative vs. Non-Cumulative List Mapping (bi-directional)"
Good luck!
Ended up doing a custom mapping.
I ended up doing my own AbstractConverter; please find it below.
It has some constraints which are suitable for me (possibly not for you):
it will update based on the "sameId" implementation;
it will remove orphans (elements from the destination not in the source);
it only works on List (enough for my needs).
While the converter manages the decision to update, the mapping of the objects is delegated back to Dozer, so you don't need to implement the mapping of the elements in your list.
Sample use
public class MyConverter extends AbstractListConverter<ClassX, ClassY> {

    public MyConverter() {
        super(ClassX.class, ClassY.class);
    }

    @Override
    protected boolean sameId(ClassX o1, ClassY o2) {
        return ...; // your custom comparison here: true means o1 and o2 can update each other
    }
}
Declaration in mapper.xml
<mapping>
    <class-a>x.y.z.AClass</class-a>
    <class-b>a.b.c.AnotherClass</class-b>
    <field custom-converter="g.e.MyConverter">
        <a>ListField</a>
        <b>OtherListField</b>
    </field>
</mapping>
public abstract class AbstractListConverter<A, B> implements MapperAware, CustomConverter {

    private Mapper mapper;
    private Class<A> prototypeA;
    private Class<B> prototypeB;

    @Override
    public void setMapper(Mapper mapper) {
        this.mapper = mapper;
    }

    AbstractListConverter(Class<A> prototypeA, Class<B> prototypeB) {
        this.prototypeA = prototypeA;
        this.prototypeB = prototypeB;
    }

    @Override
    public Object convert(Object destination, Object source, Class<?> destinationClass, Class<?> sourceClass) {
        if (destinationClass == null || sourceClass == null || source == null) {
            return null;
        }
        if (List.class.isAssignableFrom(sourceClass) && List.class.isAssignableFrom(destinationClass)) {
            if (destination == null || ((List) destination).size() == 0) {
                return produceNewList((List) source, destinationClass);
            }
            return mergeList((List) source, (List) destination, destinationClass);
        }
        throw new Error("This specific mapper is only to be used when both source and destination are of type java.util.List");
    }

    private boolean same(Object o1, Object o2) {
        if (prototypeA.isAssignableFrom(o1.getClass()) && prototypeB.isAssignableFrom(o2.getClass())) {
            return sameId((A) o1, (B) o2);
        }
        if (prototypeB.isAssignableFrom(o1.getClass()) && prototypeA.isAssignableFrom(o2.getClass())) {
            return sameId((A) o2, (B) o1);
        }
        return false;
    }

    abstract protected boolean sameId(A o, B t);

    private List mergeList(List source, List destination, Class<?> destinationClass) {
        return (List) source.stream().map(from -> {
            Optional to = destination.stream().filter(search -> same(from, search)).findFirst();
            if (to.isPresent()) {
                Object ret = to.get();
                mapper.map(from, ret);
                return ret;
            } else {
                return create(from);
            }
        }).collect(Collectors.toList());
    }

    private List produceNewList(List source, Class<?> destinationClass) {
        if (source.size() == 0) return source;
        return (List) source.stream().map(o -> create(o)).collect(Collectors.toList());
    }

    private Object create(Object o) {
        if (prototypeA.isAssignableFrom(o.getClass())) {
            return mapper.map(o, prototypeB);
        }
        if (prototypeB.isAssignableFrom(o.getClass())) {
            return mapper.map(o, prototypeA);
        }
        return null;
    }
}
I have a domain object that, for the purposes of this question, I will call Person, with the following private variables:
String name
int age
Each of these has getters and setters. Now I also have a Map<String, String> with the following entries:
name, phil
age, 35
I would like to populate a list of all setter methods within the class Person and then loop through this list, invoking each method with the values from the map.
Is this even possible? I cannot see any examples close to this on the net. Examples are very much appreciated.
Sure it's possible! You can get all the methods that start with "set" by doing this:

Class<?> curClass = Person.class;
Method[] allMethods = curClass.getMethods();
List<Method> setters = new ArrayList<Method>();
for (Method method : allMethods) {
    if (method.getName().startsWith("set")) {
        setters.add(method);
    }
}
Now you've got the methods. Do you already know how to call them for your instance of the class?
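For completeness, here's one way to invoke them with the map values (a sketch assuming the Person bean from the question; only String and int parameters are handled, and reflection exceptions are left to the caller):

Map<String, String> values = new HashMap<String, String>();
values.put("name", "phil");
values.put("age", "35");
Person person = new Person();
for (Method setter : setters) {
    // derive the property name: "setName" -> "name"
    String property = Character.toLowerCase(setter.getName().charAt(3)) + setter.getName().substring(4);
    String raw = values.get(property);
    if (raw == null) {
        continue; // no value for this property in the map
    }
    Class<?> paramType = setter.getParameterTypes()[0];
    Object arg = (paramType == int.class) ? (Object) Integer.valueOf(raw) : raw;
    setter.invoke(person, arg); // may throw IllegalAccessException / InvocationTargetException
}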
Have you tried BeanUtils.populate() from Apache Commons BeanUtils?
BeanUtils.populate(yourObject, propertiesMap);
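For the Person example from the question that would be (note BeanUtils converts the String "35" to the int expected by setAge; populate throws checked reflection exceptions):

Map<String, String> propertiesMap = new HashMap<String, String>();
propertiesMap.put("name", "phil");
propertiesMap.put("age", "35");
Person person = new Person();
BeanUtils.populate(person, propertiesMap);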
This is a full solution that verifies the output class beforehand and then calls setters for all the properties that the map contains. It uses purely java.beans and java.lang.reflect.
public Object mapToObject(Map<String, Object> input, Class<?> outputType) {
    Object outputObject = null;
    List<PropertyDescriptor> outputPropertyDescriptors = null;
    // Test if class is instantiable with default constructor
    if (isInstantiable(outputType)
            && hasDefaultConstructor(outputType)
            && (outputPropertyDescriptors = getPropertyDescriptors(outputType)) != null) {
        try {
            outputObject = outputType.getConstructor().newInstance();
            for (PropertyDescriptor pd : outputPropertyDescriptors) {
                Object value = input.get(pd.getName());
                if (value != null) {
                    pd.getWriteMethod().invoke(outputObject, value);
                }
            }
        } catch (InstantiationException | IllegalAccessException | InvocationTargetException | NoSuchMethodException e) {
            throw new IllegalStateException("Failed to instantiate verified class " + outputType, e);
        }
    } else {
        throw new IllegalArgumentException("Specified outputType class " + outputType + " cannot be instantiated with default constructor!");
    }
    return outputObject;
}

private List<PropertyDescriptor> getPropertyDescriptors(Class<?> outputType) {
    List<PropertyDescriptor> propertyDescriptors = null;
    try {
        propertyDescriptors = Arrays.asList(Introspector.getBeanInfo(outputType, Object.class).getPropertyDescriptors());
    } catch (IntrospectionException e) {
        // ignored: returning null signals that the type could not be introspected
    }
    return propertyDescriptors;
}

private boolean isInstantiable(Class<?> clazz) {
    return !clazz.isInterface() && !Modifier.isAbstract(clazz.getModifiers());
}

private boolean hasDefaultConstructor(Class<?> clazz) {
    try {
        clazz.getConstructor();
        return true;
    } catch (NoSuchMethodException e) {
        return false;
    }
}
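Usage with the Person class from the question might look like this (a sketch; unlike BeanUtils.populate, the values must already match the setters' parameter types, e.g. an Integer for the int age):

Map<String, Object> input = new HashMap<String, Object>();
input.put("name", "phil");
input.put("age", 35); // auto-unboxed into setAge(int) during invoke
Person person = (Person) mapToObject(input, Person.class);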
I think you could use a library: Apache Commons BeanUtils. If you have a map that contains field/value pairs, the PropertyUtils class can help you:
Person person = new Person();
for (Map.Entry<String, Object> entry : map.entrySet()) {
    PropertyUtils.setProperty(person, entry.getKey(), entry.getValue());
}

Note that PropertyUtils.setProperty performs no type conversion, so each value must already match the property's type (BeanUtils.setProperty, by contrast, converts Strings).
Consider the three following classes:
EntityTransformer contains a map associating an Entity with a String
Entity is an object containing an ID (used by equals / hashcode), and which contains a reference to an EntityTransformer (note the circular dependency)
Wrapper contains an EntityTransformer, and maintains a Map associating each Entity's identifier with the corresponding Entity object.
The following code will create an EntityTransformer and a Wrapper, add two entities to the Wrapper, serialize it, deserialize it, and test for the presence of the two entities:
public static void main(String[] args) throws Exception {
    EntityTransformer et = new EntityTransformer();
    Wrapper wr = new Wrapper(et);
    Entity a1 = wr.addEntity("a1"); // a1 and a2 are created internally by the Wrapper
    Entity a2 = wr.addEntity("a2");

    byte[] bs = object2Bytes(wr);
    wr = (Wrapper) bytes2Object(bs);

    System.out.println(wr.et.map);
    System.out.println(wr.et.map.containsKey(a1));
    System.out.println(wr.et.map.containsKey(a2));
}
The output is:
{a1=whatever-a1, a2=whatever-a2}
false
true
So basically, the serialization failed somehow, as the map should contain both entities as keys. I suspect the cyclic dependency between Entity and EntityTransformer, and indeed if I make the EntityTransformer instance variable of Entity static, it works.
Question 1: given that I'm stuck with this cyclic dependency, how can I overcome this issue?
Another very weird thing: if I remove the Map maintaining the association between identifiers and Entities in the Wrapper, everything works fine... ??
Question 2: does someone understand what's going on here?
Below is fully functional code if you want to test it:
Thanks in advance for your help :)
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;

public class SerializeTest {

    public static class Entity implements Serializable {

        private EntityTransformer em;
        private String id;

        Entity(String id, EntityTransformer em) {
            this.id = id;
            this.em = em;
        }

        @Override
        public boolean equals(Object obj) {
            if (obj == null) {
                return false;
            }
            if (getClass() != obj.getClass()) {
                return false;
            }
            final Entity other = (Entity) obj;
            if ((this.id == null) ? (other.id != null) : !this.id.equals(other.id)) {
                return false;
            }
            return true;
        }

        @Override
        public int hashCode() {
            int hash = 3;
            hash = 97 * hash + (this.id != null ? this.id.hashCode() : 0);
            return hash;
        }

        public String toString() {
            return id;
        }
    }

    public static class EntityTransformer implements Serializable {
        Map<Entity, String> map = new HashMap<Entity, String>();
    }

    public static class Wrapper implements Serializable {

        EntityTransformer et;
        Map<String, Entity> eMap;

        public Wrapper(EntityTransformer b) {
            this.et = b;
            this.eMap = new HashMap<String, Entity>();
        }

        public Entity addEntity(String id) {
            Entity e = new Entity(id, et);
            et.map.put(e, "whatever-" + id);
            eMap.put(id, e);
            return e;
        }
    }

    public static void main(String[] args) throws Exception {
        EntityTransformer et = new EntityTransformer();
        Wrapper wr = new Wrapper(et);
        Entity a1 = wr.addEntity("a1"); // a1 and a2 are created internally by the Wrapper
        Entity a2 = wr.addEntity("a2");

        byte[] bs = object2Bytes(wr);
        wr = (Wrapper) bytes2Object(bs);

        System.out.println(wr.et.map);
        System.out.println(wr.et.map.containsKey(a1));
        System.out.println(wr.et.map.containsKey(a2));
    }

    public static Object bytes2Object(byte[] bytes) throws IOException, ClassNotFoundException {
        ObjectInputStream oi = null;
        Object o = null;
        try {
            oi = new ObjectInputStream(new ByteArrayInputStream(bytes));
            o = oi.readObject();
        } finally {
            if (oi != null) {
                oi.close();
            }
        }
        return o;
    }

    public static byte[] object2Bytes(Object o) throws IOException {
        ByteArrayOutputStream baos = null;
        ObjectOutputStream oo = null;
        byte[] bytes = null;
        try {
            baos = new ByteArrayOutputStream();
            oo = new ObjectOutputStream(baos);
            oo.writeObject(o);
            bytes = baos.toByteArray();
        } finally {
            if (oo != null) {
                oo.close();
            }
        }
        return bytes;
    }
}
EDIT
There is a good summary of what is potentially in play for this issue:
http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=4957674
The problem is that HashMap's readObject() implementation, in order to re-hash the map, invokes the hashCode() method of some of its keys, regardless of whether those keys have been fully deserialized.

If a key contains (directly or indirectly) a circular reference to the map, the following order of execution is possible during deserialization, if the key was written to the object stream before the hashmap:

1. Instantiate the key
2. Deserialize the key's attributes
2a. Deserialize the HashMap (which was directly or indirectly pointed to by the key)
2a-1. Instantiate the HashMap
2a-2. Read keys and values
2a-3. Invoke hashCode() on the keys to re-hash the map
2b. Deserialize the key's remaining attributes

Since 2a-3 is executed before 2b, hashCode() may return the wrong answer, because the key's attributes have not yet been fully deserialized.
Now, that does not fully explain why the issue can be fixed if the HashMap in Wrapper is removed, or moved to the EntityTransformer class.
This is a problem with circular initialisation. Whilst Java Serialisation can handle arbitrary cycles, the initialisation has to happen in some order.
There's a similar problem in AWT where Component (Entity) contains a reference to its parent Container (EntityTransformer). What AWT does is to make the parent reference in Component transient.
transient Container parent;
So now each Component can complete its initialisation before Container.readObject adds it back in:
for (Component comp : component) {
    comp.parent = this;
}
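Applied to the classes in this question, the same pattern might look like this (a sketch; imports as in the test class above):

public static class Entity implements Serializable {
    transient EntityTransformer em; // transient: breaks the cycle during deserialization
    private String id;
    // constructor, equals and hashCode as before
}

public static class EntityTransformer implements Serializable {
    Map<Entity, String> map = new HashMap<Entity, String>();

    private void readObject(ObjectInputStream in) throws IOException, ClassNotFoundException {
        in.defaultReadObject();
        // restore the back-references once the keys are fully deserialized
        for (Entity e : map.keySet()) {
            e.em = this;
        }
    }
}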
Even stranger, if you do
Map<Entity, String> map = new HashMap<>(wr.et.map);
System.out.println(map.containsKey(a1));
System.out.println(map.containsKey(a2));
After serializing and de-serializing, you will get the correct output.
Also:
for (Entity a : wr.et.map.keySet()) {
    System.out.println(a.toString());
    System.out.println(wr.et.map.containsKey(a));
}
Gives:
a1
false
a2
true
I think you found a bug. Most likely, serialization broke the hashing somehow.
In fact, I think you might have found this bug.
Can you override the serialization to transform the reference into a key value before serializing, and then transform it back on deserialization?
It seems like it would be pretty trivial to find the hash key of the EntityTransformer when serializing and use that value instead (maybe provide a value in the structure called parentKey), and null out the reference. Then, when deserializing, you find the EntityTransformer associated with that key value and assign its reference.
I have the following enum:
public enum PartnershipIndicator {
    VENDOR("VENDOR"), COPARTNER("COPARTNER"), BUYER("BUYER");

    String code;

    private PartnershipIndicator(String code) {
        this.code = code;
    }

    public String getCode() {
        return code;
    }

    public static PartnershipIndicator valueOfCode(String code) {
        for (PartnershipIndicator status : values()) {
            if (status.getCode().equals(code)) {
                return status;
            }
        }
        throw new IllegalArgumentException(
                "Partnership status cannot be resolved for code " + code);
    }

    @Override
    public String toString() {
        return code;
    }
}
I need to convert it to a String and vice versa. Right now, that is done by a custom converter, but I want to do it via Dozer mappings (if possible). If I do not write any mappings in the Dozer config, I get
org.dozer.MappingException: java.lang.NoSuchMethodException: by.dev.madhead.demo.test_java.model.PartnershipIndicator.<init>()
exception. I cannot add a default public constructor to the enum, as that is not possible. So I wrote a trick with an internal code and valueOfCode() / toString(). It does not work. Then I mapped it in the Dozer config:
<mapping>
    <class-a>java.lang.String</class-a>
    <class-b create-method="valueOfCode">by.dev.madhead.demo.test_java.model.PartnershipIndicator</class-b>
</mapping>
It does not work. I tried valueOfCode() and one-way mappings. Nothing works. Enum to String conversion does not work either; I get empty Strings.
Any ideas?
Not sure if this is still an issue, but maybe this will help anyone searching. Here is an implemented solution:
@Override
public Object convert(Object destination, Object source, Class<?> destinationClass, Class<?> sourceClass) {
    if (source == null) {
        return null;
    }
    if (destinationClass != null) {
        if (destinationClass.getSimpleName().equalsIgnoreCase("String")) {
            return this.getString(source);
        } else if (destinationClass.isEnum()) {
            return this.getEnum(destinationClass, source);
        } else {
            throw new MappingException(new StrBuilder("Converter ").append(this.getClass().getSimpleName())
                    .append(" was used incorrectly. Arguments were: ")
                    .append(destinationClass.getClass().getName())
                    .append(" and ")
                    .append(source).toString());
        }
    }
    return null;
}

private Object getString(Object object) {
    String value = object.toString();
    return value;
}

private Object getEnum(Class<?> destinationClass, Object source) {
    Object enumeration = null;
    Method[] ms = destinationClass.getMethods();
    for (Method m : ms) {
        // match only the generated valueOf(String), not the inherited Enum.valueOf(Class, String)
        if (m.getName().equalsIgnoreCase("valueOf") && m.getParameterTypes().length == 1) {
            try {
                enumeration = m.invoke(destinationClass.getClass(), (String) source);
            }
            catch (IllegalArgumentException e) {
                e.printStackTrace();
            }
            catch (IllegalAccessException e) {
                e.printStackTrace();
            }
            catch (InvocationTargetException e) {
                e.printStackTrace();
            }
            return enumeration;
        }
    }
    return null;
}
The StrBuilder class used when building the exception message is from the Apache commons-lang libs, but other than that it's simple reflection. Just add this to a class that implements CustomConverter, and then in your Dozer mapping XML file add the following configuration:
<configuration>
    <custom-converters>
        <converter type="com.yourcompany.manager.utils.dozer.converters.EnumStringBiDirectionalDozerConverter">
            <class-a>java.lang.Enum</class-a>
            <class-b>java.lang.String</class-b>
        </converter>
    </custom-converters>
</configuration>
Note that you can only list a configuration once across all of your mapping files (if you have multiple); otherwise Dozer will complain. What I typically do is place my custom converter configurations in one file for simplicity. Hope this helps!
There isn't a default enum-to-String mapping in Dozer. See Data type conversion in the Dozer docs. So you have two options:
You can write a custom converter that uses generics to handle any enum (a sketch follows after this list).
Or, you could submit a patch to Dozer to add enum<->String mapping to the default mappings.
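For the first option, a sketch (the class name is hypothetical; it goes through the generated valueOf, so it relies on the enum constant names, which for PartnershipIndicator happen to match the codes):

public class EnumStringConverter implements CustomConverter {

    @Override
    public Object convert(Object destination, Object source, Class<?> destinationClass, Class<?> sourceClass) {
        if (source == null) {
            return null;
        }
        if (destinationClass.isEnum()) {
            return toEnum(destinationClass, source.toString());
        }
        if (String.class.equals(destinationClass)) {
            return source.toString();
        }
        throw new MappingException("EnumStringConverter used with non enum/String types");
    }

    @SuppressWarnings("unchecked")
    private static <T extends Enum<T>> T toEnum(Class<?> enumClass, String name) {
        return Enum.valueOf((Class<T>) enumClass, name);
    }
}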