Consider the following three classes:
EntityTransformer contains a map associating an Entity with a String.
Entity is an object containing an ID (used by equals / hashCode), and which holds a reference to an EntityTransformer (note the circular dependency).
Wrapper contains an EntityTransformer, and maintains a Map associating each Entity's identifier with the corresponding Entity object.
The following code will create an EntityTransformer and a Wrapper, add two entities to the Wrapper, serialize it, deserialize it, and test for the presence of the two entities:
public static void main(String[] args)
throws Exception {
EntityTransformer et = new EntityTransformer();
Wrapper wr = new Wrapper(et);
Entity a1 = wr.addEntity("a1"); // a1 and a2 are created internally by the Wrapper
Entity a2 = wr.addEntity("a2");
byte[] bs = object2Bytes(wr);
wr = (Wrapper) bytes2Object(bs);
System.out.println(wr.et.map);
System.out.println(wr.et.map.containsKey(a1));
System.out.println(wr.et.map.containsKey(a2));
}
The output is:
{a1=whatever-a1, a2=whatever-a2}
false
true
So basically, the serialization failed somehow, as the map should contain both entities as keys. I suspect the cyclic dependency between Entity and EntityTransformer, and indeed if I make the EntityTransformer instance variable of Entity static, it works.
Question 1: given that I'm stuck with this cyclic dependency, how could I overcome this issue?
Another very weird thing: if I remove the Map maintaining an association between identifiers and Entities in the Wrapper, everything works fine... ??
Question 2: does someone understand what's going on here?
Below is fully functional code if you want to test it:
Thanks in advance for your help :)
public class SerializeTest {
public static class Entity
implements Serializable
{
private EntityTransformer em;
private String id;
Entity(String id, EntityTransformer em) {
this.id = id;
this.em = em;
}
@Override
public boolean equals(Object obj) {
if (obj == null) {
return false;
}
if (getClass() != obj.getClass()) {
return false;
}
final Entity other = (Entity) obj;
if ((this.id == null) ? (other.id != null) : !this.id.equals(
other.id)) {
return false;
}
return true;
}
@Override
public int hashCode() {
int hash = 3;
hash = 97 * hash + (this.id != null ? this.id.hashCode() : 0);
return hash;
}
public String toString() {
return id;
}
}
public static class EntityTransformer
implements Serializable
{
Map<Entity, String> map = new HashMap<Entity, String>();
}
public static class Wrapper
implements Serializable
{
EntityTransformer et;
Map<String, Entity> eMap;
public Wrapper(EntityTransformer b) {
this.et = b;
this.eMap = new HashMap<String, Entity>();
}
public Entity addEntity(String id) {
Entity e = new Entity(id, et);
et.map.put(e, "whatever-" + id);
eMap.put(id, e);
return e;
}
}
public static void main(String[] args)
throws Exception {
EntityTransformer et = new EntityTransformer();
Wrapper wr = new Wrapper(et);
Entity a1 = wr.addEntity("a1"); // a1 and a2 are created internally by the Wrapper
Entity a2 = wr.addEntity("a2");
byte[] bs = object2Bytes(wr);
wr = (Wrapper) bytes2Object(bs);
System.out.println(wr.et.map);
System.out.println(wr.et.map.containsKey(a1));
System.out.println(wr.et.map.containsKey(a2));
}
public static Object bytes2Object(byte[] bytes)
throws IOException, ClassNotFoundException {
ObjectInputStream oi = null;
Object o = null;
try {
oi = new ObjectInputStream(new ByteArrayInputStream(bytes));
o = oi.readObject();
}
catch (IOException io) {
throw io;
}
catch (ClassNotFoundException cne) {
throw cne;
}
finally {
if (oi != null) {
oi.close();
}
}
return o;
}
public static byte[] object2Bytes(Object o)
throws IOException {
ByteArrayOutputStream baos = null;
ObjectOutputStream oo = null;
byte[] bytes = null;
try {
baos = new ByteArrayOutputStream();
oo = new ObjectOutputStream(baos);
oo.writeObject(o);
bytes = baos.toByteArray();
}
catch (IOException ex) {
throw ex;
}
finally {
if (oo != null) {
oo.close();
}
}
return bytes;
}
}
EDIT
There is a good summary of what is potentially in play for this issue:
http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=4957674
The problem is that HashMap's readObject() implementation, in order to re-hash the map, invokes the hashCode() method of some of its keys, regardless of whether those keys have been fully deserialized.
If a key contains (directly or indirectly) a circular reference to the
map, the following order of execution is possible during
deserialization --- if the key was written to the object stream before
the hashmap:
1. Instantiate the key
2. Deserialize the key's attributes
2a. Deserialize the HashMap (which was directly or indirectly pointed to by the key)
2a-1. Instantiate the HashMap
2a-2. Read keys and values
2a-3. Invoke hashCode() on the keys to re-hash the map
2b. Deserialize the key's remaining attributes
Since 2a-3 is executed before 2b, hashCode() may return the wrong
answer, because the key's attributes have not yet been fully
deserialized.
Now that does not fully explain why the issue can be fixed if the HashMap in Wrapper is removed, or moved to the EntityTransformer class.
This is a problem with circular initialisation. Whilst Java Serialisation can handle arbitrary cycles, the initialisation has to happen in some order.
There's a similar problem in AWT where Component (Entity) contains a reference to its parent Container (EntityTransformer). What AWT does is to make the parent reference in Component transient.
transient Container parent;
So now each Component can complete its initialisation before Container.readObject adds it back in:
for (Component comp : component) {
    comp.parent = this;
}
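Applying the same pattern to the classes in the question, a minimal sketch (assuming the back-reference can be restored in a custom readObject once the map has been read; nested classes can access each other's private fields, as in the original SerializeTest):

public static class Entity implements Serializable {
    private transient EntityTransformer em; // transient: breaks the cycle in the stream
    private String id;
    // constructor, equals and hashCode (based on id) as in the question
}

public static class EntityTransformer implements Serializable {
    Map<Entity, String> map = new HashMap<Entity, String>();

    private void readObject(ObjectInputStream in)
            throws IOException, ClassNotFoundException {
        in.defaultReadObject();
        // the keys are fully deserialized at this point; restore the
        // back-reference, mirroring what Container.readObject does in AWT
        for (Entity e : map.keySet()) {
            e.em = this;
        }
    }
}

Because em is transient, the keys no longer drag the HashMap into their own deserialization, so hashCode() is only invoked on fully initialized keys.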
Even stranger, if you do
Map<Entity, String> map = new HashMap<>(wr.et.map);
System.out.println(map.containsKey(a1));
System.out.println(map.containsKey(a2));
After serializing and de-serializing, you will get the correct output.
Also:
for( Entity a : wr.et.map.keySet() ){
System.out.println(a.toString());
System.out.println(wr.et.map.containsKey(a));
}
Gives:
a1
false
a2
true
I think you found a bug. Most likely, serialization broke the hashing somehow.
In fact, I think you might have found this bug.
Can you override the serialization to transform the reference into a key value before serializing, and then transform it back on deserialization?
It seems like it would be pretty trivial to find the hash key of the EntityTransformer when serializing and use that value instead (maybe provide a value in the structure called parentKey), and null out the reference. Then, when deserializing, you find the EntityTransformer associated with that key value and assign its reference.
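A rough sketch of that idea; the registry and the parentKey field are illustrative assumptions, not part of the original code, and this only works while the registry is populated in the receiving JVM:

public static class EntityTransformer implements Serializable {
    static final Map<String, EntityTransformer> registry = new HashMap<String, EntityTransformer>();
    final String key; // assumed unique per transformer

    EntityTransformer(String key) {
        this.key = key;
        registry.put(key, this);
    }

    Map<Entity, String> map = new HashMap<Entity, String>();
}

public static class Entity implements Serializable {
    private transient EntityTransformer em; // never written to the stream
    private String parentKey;
    private String id;

    private void writeObject(ObjectOutputStream out) throws IOException {
        parentKey = (em != null) ? em.key : null; // serialize the key, not the reference
        out.defaultWriteObject();
    }

    private void readObject(ObjectInputStream in)
            throws IOException, ClassNotFoundException {
        in.defaultReadObject();
        em = EntityTransformer.registry.get(parentKey); // re-attach by key
    }
}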
Related
I just want to be sure, when inserting a new DBObject into the DB, that it is really unique and the collection doesn't contain key field duplicates.
Here is how it looks now:
public abstract class AbstractMongoDAO<ID, MODEL> implements GenericDAO<ID, MODEL> {
protected Mongo client;
protected Class<MODEL> model;
protected DBCollection dbCollection;
/**
* Contains model data : unique key name and name of get method
*/
protected KeyField keyField;
@SuppressWarnings("unchecked")
protected AbstractMongoDAO() {
ParameterizedType genericSuperclass = (ParameterizedType) this.getClass().getGenericSuperclass();
model = (Class<MODEL>) genericSuperclass.getActualTypeArguments()[1];
getKeyField();
}
public void connect() throws UnknownHostException {
client = new MongoClient(Config.getMongoHost(), Integer.parseInt(Config.getMongoPort()));
DB clientDB = client.getDB(Config.getMongoDb());
clientDB.authenticate(Config.getMongoDbUser(), Config.getMongoDbPass().toCharArray());
dbCollection = clientDB.getCollection(getCollectionName(model));
}
public void disconnect() {
if (client != null) {
client.close();
}
}
@Override
public void create(MODEL model) {
Object keyValue = get(model);
try {
ObjectMapper mapper = new ObjectMapper();
String requestAsString = mapper.writerWithDefaultPrettyPrinter().writeValueAsString(model);
// check if not presented
BasicDBObject dbObject = new BasicDBObject((String) keyValue, requestAsString);
dbCollection.ensureIndex(dbObject, new BasicDBObject("unique", true));
dbCollection.insert(new BasicDBObject((String) keyValue, requestAsString));
} catch (Throwable e) {
throw new RuntimeException(String.format("Duplicate parameters '%s' : '%s'", keyField.id(), keyValue));
}
}
private Object get(MODEL model) {
Object result = null;
try {
Method m = this.model.getMethod(this.keyField.get());
result = m.invoke(model);
} catch (Exception e) {
throw new RuntimeException(String.format("Couldn't find method by name '%s' at class '%s'", this.keyField.get(), this.model.getName()));
}
return result;
}
/**
 * Extract the name of the collection that is specified at the '@Entity' annotation.
 *
 * @param clazz is the model class object.
 * @return the name of the collection that is specified.
 */
private String getCollectionName(Class<MODEL> clazz) {
Entity entity = clazz.getAnnotation(Entity.class);
String tableName = entity.value();
if (tableName.equals(Mapper.IGNORED_FIELDNAME)) {
// think about usual logger
tableName = clazz.getName();
}
return tableName;
}
private void getKeyField() {
for (Field field : this.model.getDeclaredFields()) {
if (field.isAnnotationPresent(KeyField.class)) {
keyField = field.getAnnotation(KeyField.class);
break;
}
}
if (keyField == null) {
throw new RuntimeException(String.format("Couldn't find key field at class : '%s'", model.getName()));
}
    }
}
KeyField is a custom annotation:
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
public @interface KeyField {
    String id();
    String get();
    String statusProp() default "ALL";
}
But I'm not sure that this solution really guarantees uniqueness. I'm new to Mongo.
Any suggestions?
Uniqueness can be maintained in MongoDB using the _id field. If we do not provide a value for this field, MongoDB automatically creates a unique id for each document in the collection.
So, in your case, just create a property called _id in Java and assign your unique field value to it. If it is duplicated, the insert will throw an exception.
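For example, a sketch against the same legacy driver API used in the question (keyValue, requestAsString and dbCollection as above; DuplicateKeyException is what newer releases of that driver throw on an _id collision, older ones throw MongoException.DuplicateKey):

// store the unique business key as _id; MongoDB enforces _id uniqueness itself
BasicDBObject doc = new BasicDBObject("_id", keyValue)
        .append("payload", requestAsString);
try {
    dbCollection.insert(doc);
} catch (DuplicateKeyException e) {
    throw new RuntimeException(String.format("Duplicate key '%s'", keyValue), e);
}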
With Spring Data MongoDB (the question was tagged with spring-data, that's why I suggest it), all you need is that:
// Your types
class YourType {
BigInteger id;
@Indexed(unique = true) String emailAddress;
…
}
interface YourTypeRepository extends CrudRepository<YourType, BigInteger> { }
// Infrastructure setup, if you use Spring as container prefer @EnableMongoRepositories
MongoOperations operations = new MongoTemplate(new MongoClient(), "myDatabase");
MongoRepositoryFactory factory = new MongoRepositoryFactory(operations);
YourTypeRepository repository = factory.getRepository(YourTypeRepository.class);
// Now use it…
YourType first = …; // set email address
YourType second = …; // set same email address
repository.save(first);
repository.save(second); // will throw an exception
The crucial part that's most related to your original question is @Indexed, as this will cause the required unique index to be created when you create the repository.
What you get beyond that is:
no need to manually implement any repository (deleted code does not contain bugs \o/)
automatic object-to-document conversion
automatic index creation
powerful repository abstraction to easily query data by declaring query methods
For more details, check out the reference documentation.
I'm trying to implement an equivalent to String.intern(), but for other objects.
My goal is the following:
I've an object A which I will serialize and then deserialize.
If there is another reference to A somewhere, I want the result of the deserialization to be the same reference.
Here is one example of what I would expect.
MyObject A = new MyObject();
A.data1 = 1;
A.data2 = 2;
byte[] serialized = serialize(A);
A.data1 = 3;
MyObject B = deserialize(serialized); // B!=A and B.data1=1, B.data2=2
MyObject C = B.intern(); // Here we should have C == A. Consequently C.data1=3 AND C.data2=2
Here is my current implementation (the MyObject class extends InternableObject):
public abstract class InternableObject {
private static final AtomicLong maxObjectId = new AtomicLong();
private static final Map<Long, InternableObject> dataMap = new ConcurrentHashMap<>();
private final long objectId;
public InternableObject() {
this.objectId = maxObjectId.incrementAndGet();
dataMap.put(this.objectId, this);
}
@Override
protected void finalize() throws Throwable {
super.finalize();
dataMap.remove(this.objectId);
}
public final InternableObject intern() {
return intern(this);
}
public static InternableObject intern(InternableObject o) {
InternableObject r = dataMap.get(o.objectId);
if (r == null) {
throw new IllegalStateException();
} else {
return r;
}
}
}
My unit test (which fails):
private static class MyData extends InternableObject implements Serializable {
public int data;
public MyData(int data) {
this.data = data;
}
}
@Test
public void testIntern() throws Exception {
MyData data1 = new MyData(7);
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ObjectOutputStream oos = new ObjectOutputStream(baos);
oos.writeObject(data1);
oos.flush();
baos.flush();
oos.close();
baos.close();
ByteArrayInputStream bais = new ByteArrayInputStream(baos.toByteArray());
ObjectInputStream ois = new ObjectInputStream(bais);
MyData data2 = (MyData) ois.readObject();
Assert.assertTrue(data1 == data2.intern()); // Fails here
}
The failure is due to the fact that, when deserializing, the no-arg constructor of InternableObject (the first non-serializable superclass) is called, and thus objectId will be 2; since objectId lives in that non-serializable superclass, the value 1 was never written to the stream in the first place.
Any idea about how to solve this particular problem, or another approach to handle the high-level problem?
Thanks guys
Do not use the constructor to create instances. Use a factory method that checks if an instance already exists first, only create an instance if there isn't already a matching one.
To get serialization to cooperate, your class will need to make use of readResolve() / writeReplace(). http://docs.oracle.com/javase/7/docs/platform/serialization/spec/serial-arch.html#4539
The way you implemented your constructor, you're leaking a reference during construction, which can lead to very hard to nail down problems. Also, your instance map isn't protected by any locks, so it's not thread-safe.
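A minimal sketch of the readResolve() approach, under the assumption that InternableObject itself is made Serializable so that objectId is actually preserved in the stream:

public abstract class InternableObject implements Serializable {
    private static final AtomicLong maxObjectId = new AtomicLong();
    private static final Map<Long, InternableObject> dataMap = new ConcurrentHashMap<>();
    private final long objectId;

    public InternableObject() {
        this.objectId = maxObjectId.incrementAndGet();
        dataMap.put(this.objectId, this);
    }

    // invoked by serialization after the stream data has been read;
    // returns the canonical registered instance instead of the fresh copy
    protected Object readResolve() {
        InternableObject original = dataMap.get(objectId);
        return (original != null) ? original : this;
    }
}

Since InternableObject is now Serializable, deserialization no longer runs the constructor, so no bogus objectId is assigned, and the test's data1 == data2.intern() comparison can succeed.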
Typically intern() forms an aspect, and maybe should not be realized as a base class, since that may restrict its usage too much in a more complex constellation.
There are two aspects:
1. Sharing the "same" object.
Internalizing an object only pays off when several objects can be "internalized" to the same object. So I think that an InternableObject with a new sequential number is not really adequate. More important is that the class defines a fitting equals and hashCode.
Then you can use an identity Map<Object, Object>:
public class InternMap {
    private final Map<Object, Object> identityMap = new HashMap<>();

    // note: must not be static, since it uses the instance field identityMap
    public <I extends Internalizable<?>> Object intern(I x) {
        Object first = identityMap.get(x);
        if (first == null) {
            first = x;
            identityMap.put(x, x);
        }
        return first;
    }
}
InternMap could be used for any class, but above we restrict it to Internalizable things.
2. Replacing a dynamically created non-shared object with its .intern().
Which in Java 8 could be realized with a default method in an interface:
interface Internalizable<T> {
    public static final InternMap interns = new InternMap();

    public default T intern(Class<T> klazz) {
        return klazz.cast(interns.intern(this));
    }
}

class C implements Internalizable<C> { ... }

C x = new C();
x = x.intern(C.class);
The Class<T> parameter is needed because of type erasure. Concurrency is disregarded here.
Prior to Java 8, just use an empty interface Internalizable as a marker interface, and use a static InternMap.
I am trying to map an A-DTO object to an A-DO object, each having a collection (a List) of T-DTOs and T-DOs, respectively. I am trying to do it in the context of a REST API. It's a separate question whether that's the right approach - the problem I'm solving is a case of update. Basically, if one of the T-DTOs inside the A-DTO changes, I want that change to be mapped into the corresponding T-DO inside the A-DO.
I found relationship-type="non-cumulative" in Dozer documentation, so that the object inside the collection is updated, if present. But I end up with Dozer inserting a new T-DO into the A-DO's collection!
NOTE: I did implement equals! It is based on the primary key only for now.
Any ideas?
PS: and, if you think this is a bad idea to handle updates to a one-to-many dependent entity, feel free to point that out.. I'm not 100% sure I like that approach, but my REST foo is not very strong.
UPDATE
equals implementation:
@Override
public boolean equals(Object obj) {
if (obj instanceof MyDOClass) {
MyDOClass other = (MyDOClass) obj;
return other.getId().equals(this.getId());
}
return false;
}
I just had the same problem and I solved it:
Dozer uses contains to determine if a member is inside a collection.
You should implement hashCode so that "contains" will work appropriately.
You can see this in the following documentation page:
http://dozer.sourceforge.net/documentation/collectionandarraymapping.html
Under: "Cumulative vs. Non-Cumulative List Mapping (bi-directional)"
Good luck!
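For example, a hashCode() consistent with the id-based equals shown in the question might look like this:

@Override
public int hashCode() {
    // must agree with equals: same id => same hash, so contains() can find the element
    return (getId() != null) ? getId().hashCode() : 0;
}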
Ended up doing a custom mapping.
I did end up writing my own AbstractConverter; please find it below.
It has some constraints which are suitable for me (possibly not for you):
it will update based on the "sameId" implementation
it will remove orphans (elements in the destination that are not in the source)
it only works on List (enough for my needs)
While the converter manages the decision to update, the actual mapping of the objects is delegated back to Dozer, so you don't need to implement the mapping of the elements in your list.
Sample use
public class MyConverter extends AbstractListConverter<ClassX, ClassY> {
    public MyConverter() { super(ClassX.class, ClassY.class); }
    @Override
protected boolean sameId(ClassX o1, ClassY o2) {
return // your custom comparison here... true means the o2 and o1 can update each other.
}
}
Declaration in mapper.xml
<mapping>
<class-a>x.y.z.AClass</class-a>
<class-b>a.b.c.AnotherClass</class-b>
<field custom-converter="g.e.MyConverter">
<a>ListField</a>
<b>OtherListField</b>
</field>
</mapping>
public abstract class AbstractListConverter<A, B> implements MapperAware, CustomConverter {
private Mapper mapper;
private Class<A> prototypeA;
private Class<B> prototypeB;
@Override
public void setMapper(Mapper mapper) {
this.mapper = mapper;
}
AbstractListConverter(Class<A> prototypeA, Class<B> prototypeB) {
this.prototypeA = prototypeA;
this.prototypeB = prototypeB;
}
@Override
public Object convert(Object destination, Object source, Class<?> destinationClass, Class<?> sourceClass) {
if (destinationClass == null || sourceClass == null || source == null) {
return null;
}
if (List.class.isAssignableFrom(sourceClass) && List.class.isAssignableFrom(destinationClass)) {
if (destination == null || ((List) destination).size() == 0) {
return produceNewList((List) source, destinationClass);
}
return mergeList((List) source, (List) destination, destinationClass);
}
throw new Error("This specific mapper is only to be used when both source and destination are of type java.util.List");
}
private boolean same(Object o1, Object o2) {
if (prototypeA.isAssignableFrom(o1.getClass()) && prototypeB.isAssignableFrom(o2.getClass())) {
return sameId((A) o1, (B) o2);
}
if (prototypeB.isAssignableFrom(o1.getClass()) && prototypeA.isAssignableFrom(o2.getClass())) {
return sameId((A) o2, (B) o1);
}
return false;
}
abstract protected boolean sameId(A o, B t);
private List mergeList(List source, List destination, Class<?> destinationClass) {
return (List)
source.stream().map(from -> {
Optional to = destination.stream().filter(search -> same(from, search)).findFirst();
if (to.isPresent()) {
Object ret = to.get();
mapper.map(from, ret);
return ret;
} else {
return create(from);
}
}
).collect(Collectors.toList());
}
private List produceNewList(List source, Class<?> destinationClass) {
if (source.size() == 0) return source;
return (List) source.stream().map(o -> create(o)).collect(Collectors.toList());
}
private Object create(Object o) {
if (prototypeA.isAssignableFrom(o.getClass())) {
return mapper.map(o, prototypeB);
}
if (prototypeB.isAssignableFrom(o.getClass())) {
return mapper.map(o, prototypeA);
}
return null;
}
}
I have a class like this:
public class DeserializedHeader {
    int typeToClassId;
    Object obj;
}
I know what type of object obj is based on the typeToClassId, which is unfortunately only known at runtime.
I want to parse obj out based on typeToClassId - what's the best approach here? Annotations seem like they're out, and something based on ObjectMapper seems right, but I'm having trouble figuring out what the best approach is likely to be.
Something along the lines of
Class clazz = lookUpClassBasedOnId(typeToClassId)
objectMapper.readValue(obj, clazz)
Obviously, this doesn't work since obj is already deserialized... but could I do this in 2 steps somehow, perhaps with convertValue?
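Something like this two-step version, perhaps (a sketch; lookUpClassBasedOnId is the same hypothetical lookup as above, and it assumes obj was bound as a plain map of the raw JSON fields on the first pass):

ObjectMapper mapper = new ObjectMapper();
// step 1: bind the envelope; obj comes out as a LinkedHashMap of the raw fields
DeserializedHeader header = mapper.readValue(json, DeserializedHeader.class);
// step 2: convert the already-deserialized obj into the runtime-selected type
Class<?> clazz = lookUpClassBasedOnId(header.typeToClassId);
Object typed = mapper.convertValue(header.obj, clazz);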
This is a really complex and painful problem. I do not know a sophisticated and elegant solution, but I can share the idea I developed. I have created an example program to show you how you can solve your problem. At the beginning I created two simple POJO classes:
class Product {
private String name;
// getters/setters/toString
}
and
class Entity {
private long id;
// getters/setters/toString
}
Example input JSON for those classes could look like this. For Product class:
{
"typeToClassId" : 33,
"obj" : {
"name" : "Computer"
}
}
and for Entity class:
{
"typeToClassId" : 45,
"obj" : {
"id" : 10
}
}
The main functionality which we want to use is "partial serializing/deserializing". To do this we will disable the FAIL_ON_UNKNOWN_PROPERTIES feature on ObjectMapper. Now we have to create two classes which define the typeToClassId and obj properties.
class HeaderType {
private int typeToClassId;
public int getTypeToClassId() {
return typeToClassId;
}
public void setTypeToClassId(int typeToClassId) {
this.typeToClassId = typeToClassId;
}
@Override
public String toString() {
return "HeaderType [typeToClassId=" + typeToClassId + "]";
}
}
class HeaderObject<T> {
private T obj;
public T getObj() {
return obj;
}
public void setObj(T obj) {
this.obj = obj;
}
@Override
public String toString() {
return "HeaderObject [obj=" + obj + "]";
}
}
And finally, the source code which can parse the JSON:
// Simple binding
Map<Integer, Class<?>> classResolverMap = new HashMap<Integer, Class<?>>();
classResolverMap.put(33, Product.class);
classResolverMap.put(45, Entity.class);
ObjectMapper mapper = new ObjectMapper();
mapper.disable(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES);
String json = "{...}";
// Parse type
HeaderType headerType = mapper.readValue(json, HeaderType.class);
// Retrieve class by integer value
Class<?> clazz = classResolverMap.get(headerType.getTypeToClassId());
// Create dynamic type
JavaType type = mapper.getTypeFactory().constructParametricType(HeaderObject.class, clazz);
// Parse object
HeaderObject<?> headerObject = (HeaderObject<?>) mapper.readValue(json, type);
// Get the object
Object result = headerObject.getObj();
System.out.println(result);
Helpful links:
How To Convert Java Map To / From JSON (Jackson).
java jackson parse object containing a generic type object.
During a Hibernate Session, I am loading some objects and some of them are loaded as proxies due to lazy loading. It's all OK and I don't want to turn lazy loading off.
But later I need to send some of the objects (actually one object) to the GWT client via RPC. And it happens that this concrete object is a proxy. So I need to turn it into a real object. I can't find a method like "materialize" in Hibernate.
How can I turn some of the objects from proxies to reals knowing their class and ID?
At the moment the only solution I see is to evict that object from Hibernate's cache and reload it, but it is really bad for many reasons.
Here's a method I'm using.
public static <T> T initializeAndUnproxy(T entity) {
if (entity == null) {
throw new
NullPointerException("Entity passed for initialization is null");
}
Hibernate.initialize(entity);
if (entity instanceof HibernateProxy) {
entity = (T) ((HibernateProxy) entity).getHibernateLazyInitializer()
.getImplementation();
}
return entity;
}
Since Hibernate ORM 5.2.10, you can do it like this:
Object unproxiedEntity = Hibernate.unproxy(proxy);
Before Hibernate 5.2.10, the simplest way to do that was to use the unproxy method offered by Hibernate's internal PersistenceContext implementation:
Object unproxiedEntity = ((SessionImplementor) session)
.getPersistenceContext()
.unproxy(proxy);
Try to use Hibernate.getClass(obj)
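For example, a small illustrative snippet:

// returns the actual persistent class, even when obj is a HibernateProxy
Class<?> realClass = Hibernate.getClass(obj);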
I've written the following code, which cleans an object from proxies (if they are not already initialized):
public class PersistenceUtils {
private static void cleanFromProxies(Object value, List<Object> handledObjects) {
if ((value != null) && (!isProxy(value)) && !containsTotallyEqual(handledObjects, value)) {
handledObjects.add(value);
if (value instanceof Iterable) {
for (Object item : (Iterable<?>) value) {
cleanFromProxies(item, handledObjects);
}
} else if (value.getClass().isArray()) {
for (Object item : (Object[]) value) {
cleanFromProxies(item, handledObjects);
}
}
BeanInfo beanInfo = null;
try {
beanInfo = Introspector.getBeanInfo(value.getClass());
} catch (IntrospectionException e) {
// LOGGER.warn(e.getMessage(), e);
}
if (beanInfo != null) {
for (PropertyDescriptor property : beanInfo.getPropertyDescriptors()) {
try {
if ((property.getWriteMethod() != null) && (property.getReadMethod() != null)) {
Object fieldValue = property.getReadMethod().invoke(value);
if (isProxy(fieldValue)) {
fieldValue = unproxyObject(fieldValue);
property.getWriteMethod().invoke(value, fieldValue);
}
cleanFromProxies(fieldValue, handledObjects);
}
} catch (Exception e) {
// LOGGER.warn(e.getMessage(), e);
}
}
}
}
}
public static <T> T cleanFromProxies(T value) {
T result = unproxyObject(value);
cleanFromProxies(result, new ArrayList<Object>());
return result;
}
private static boolean containsTotallyEqual(Collection<?> collection, Object value) {
if (CollectionUtils.isEmpty(collection)) {
return false;
}
for (Object object : collection) {
if (object == value) {
return true;
}
}
return false;
}
public static boolean isProxy(Object value) {
if (value == null) {
return false;
}
if ((value instanceof HibernateProxy) || (value instanceof PersistentCollection)) {
return true;
}
return false;
}
private static Object unproxyHibernateProxy(HibernateProxy hibernateProxy) {
Object result = hibernateProxy.writeReplace();
if (!(result instanceof SerializableProxy)) {
return result;
}
return null;
}
@SuppressWarnings("unchecked")
private static <T> T unproxyObject(T object) {
if (isProxy(object)) {
if (object instanceof PersistentCollection) {
PersistentCollection persistentCollection = (PersistentCollection) object;
return (T) unproxyPersistentCollection(persistentCollection);
} else if (object instanceof HibernateProxy) {
HibernateProxy hibernateProxy = (HibernateProxy) object;
return (T) unproxyHibernateProxy(hibernateProxy);
} else {
return null;
}
}
return object;
}
private static Object unproxyPersistentCollection(PersistentCollection persistentCollection) {
if (persistentCollection instanceof PersistentSet) {
return unproxyPersistentSet((Map<?, ?>) persistentCollection.getStoredSnapshot());
}
return persistentCollection.getStoredSnapshot();
}
private static <T> Set<T> unproxyPersistentSet(Map<T, ?> persistenceSet) {
return new LinkedHashSet<T>(persistenceSet.keySet());
}
}
I use this function on the results of my RPC services (via aspects) and it recursively cleans all result objects from proxies (if they are not initialized).
The way I recommend with JPA 2:
Object unproxied = entityManager.unwrap(SessionImplementor.class).getPersistenceContext().unproxy(proxy);
Starting from Hibernate 5.2.10 you can use the Hibernate.unproxy method to convert a proxy to your real entity:
MyEntity myEntity = (MyEntity) Hibernate.unproxy( proxyMyEntity );
Another workaround is to call
Hibernate.initialize(extractedObject.getSubObjectToUnproxy());
just before closing the session.
With Spring Data JPA and Hibernate, I was using subinterfaces of JpaRepository to look up objects belonging to a type hierarchy that was mapped using the "join" strategy. Unfortunately, the queries were returning proxies of the base type instead of instances of the expected concrete types. This prevented me from casting the results to the correct types. Like you, I came here looking for an effective way to get my entities unproxied.
Vlad has the right idea for unproxying these results; Yannis provides a little more detail. Adding to their answers, here's the rest of what you might be looking for:
The following code provides an easy way to unproxy your proxied entities:
import org.hibernate.engine.spi.PersistenceContext;
import org.hibernate.engine.spi.SessionImplementor;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.jpa.repository.JpaContext;
import org.springframework.stereotype.Component;
@Component
public final class JpaHibernateUtil {
private static JpaContext jpaContext;
@Autowired
JpaHibernateUtil(JpaContext jpaContext) {
JpaHibernateUtil.jpaContext = jpaContext;
}
public static <Type> Type unproxy(Type proxied, Class<Type> type) {
PersistenceContext persistenceContext =
jpaContext
.getEntityManagerByManagedType(type)
.unwrap(SessionImplementor.class)
.getPersistenceContext();
Type unproxied = (Type) persistenceContext.unproxyAndReassociate(proxied);
return unproxied;
}
}
You can pass either unproxied entities or proxied entities to the unproxy method. If they are already unproxied, they'll simply be returned. Otherwise, they'll get unproxied and returned.
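Usage might look like this (MyEntity standing in for your own entity type):

MyEntity entity = JpaHibernateUtil.unproxy(proxiedEntity, MyEntity.class);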
Hope this helps!
Thank you for the suggested solutions! Unfortunately, none of them worked for my case: receiving a list of CLOB objects from an Oracle database through JPA - Hibernate, using a native query.
All of the proposed approaches gave me either a ClassCastException or just returned a Java Proxy object (which deep inside contained the desired Clob).
So my solution is the following (based on several above approaches):
Query sqlQuery = manager.createNativeQuery(queryStr);
List resultList = sqlQuery.getResultList();
List<String> resultCollection = new ArrayList<>(); // collects the unproxied values
for ( Object resultProxy : resultList ) {
String unproxiedClob = unproxyClob(resultProxy);
if ( unproxiedClob != null ) {
resultCollection.add(unproxiedClob);
}
}
private String unproxyClob(Object proxy) {
try {
BeanInfo beanInfo = Introspector.getBeanInfo(proxy.getClass());
for (PropertyDescriptor property : beanInfo.getPropertyDescriptors()) {
Method readMethod = property.getReadMethod();
if ( readMethod.getName().contains("getWrappedClob") ) {
Object result = readMethod.invoke(proxy);
return clobToString((Clob) result);
}
}
}
catch (InvocationTargetException | IntrospectionException | IllegalAccessException | SQLException | IOException e) {
LOG.error("Unable to unproxy CLOB value.", e);
}
return null;
}
private String clobToString(Clob data) throws SQLException, IOException {
StringBuilder sb = new StringBuilder();
Reader reader = data.getCharacterStream();
BufferedReader br = new BufferedReader(reader);
String line;
while( null != (line = br.readLine()) ) {
sb.append(line);
}
br.close();
return sb.toString();
}
Hope this will help somebody!
I found a solution to deproxy a class using the standard Java and JPA API. It was tested with Hibernate, but does not require Hibernate as a dependency and should work with all JPA providers.
There is only one requirement: it's necessary to modify the parent class (Address) and add a simple helper method.
General idea: add a helper method to the parent class which returns itself. When the method is called on a proxy, it will forward the call to the real instance and return that real instance.
Implementation is a little bit more complex, as Hibernate recognizes that the proxied class returns itself and still returns the proxy instead of the real instance. The workaround is to wrap the returned instance in a simple wrapper class, which has a different class type than the real instance.
In code:
class Address {
public AddressWrapper getWrappedSelf() {
return new AddressWrapper(this);
}
...
}
class AddressWrapper {
    private Address wrappedAddress;
    public AddressWrapper(Address wrapped) { this.wrappedAddress = wrapped; }
    public Address getWrappedAddress() { return wrappedAddress; }
}
To cast the Address proxy to the real subclass, use the following:
Address address = dao.getSomeAddress(...);
Address deproxiedAddress = address.getWrappedSelf().getWrappedAddress();
if (deproxiedAddress instanceof WorkAddress) {
WorkAddress workAddress = (WorkAddress)deproxiedAddress;
}