I am looking for a way to do a query that requires a JOIN. Is there any way to do this in a prepared statement, or is rawQuery the only option I have? If rawQuery is the only option, is there some way to automatically map the returned rows to the object class of the Dao being implemented?
I've dug through the documents and examples but cannot find anything that will allow me to map the raw database result to an ORM object class.
I am looking for a way to do a query that requires a JOIN.
ORMLite supports simple JOIN queries. You can also use raw queries to accomplish this.
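For example, a simple join can be written with QueryBuilder.join() (a minimal sketch, assuming hypothetical Order and Account entities where Order has a foreign Account field):
// Build a query on the joined table.
QueryBuilder<Account, Integer> accountQb = accountDao.queryBuilder();
accountQb.where().eq("name", "some-account-name");
// Join it to the Order query; ORMLite matches Order's foreign Account field. query() may throw SQLException.
QueryBuilder<Order, Integer> orderQb = orderDao.queryBuilder();
List<Order> orders = orderQb.join(accountQb).query();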
You can use the Dao.getRawRowMapper() to map the queries as you found, or you can create a custom mapper. The documentation has the following sample code, which shows how to map the String[] result columns into your object:
GenericRawResults<Foo> rawResults =
orderDao.queryRaw(
"select account_id,sum(amount) from orders group by account_id",
new RawRowMapper<Foo>() {
public Foo mapRow(String[] columnNames,
String[] resultColumns) {
return new Foo(Long.parseLong(resultColumns[0]),
Integer.parseInt(resultColumns[1]));
}
});
I've found a way to auto map a result set to a model object.
// return the orders with the sum of their amounts per account
GenericRawResults<Order> rawResults =
orderDao.queryRaw(query, orderDao.getRawRowMapper(), param1);
// page through the results
for (Order order : rawResults) {
System.out.println("Account-id " + order.accountId + " has "
+ order.totalOrders + " total orders");
}
rawResults.close();
The key is to pull the row mapper from your object Dao using getRawRowMapper(), which will handle the mapping for you. I hope this helps anyone who finds it.
I still would love the ability to do joins within the QueryBuilder but until that is supported, this is the next best thing in my opinion.
Raw query auto mapping
I had a problem mapping fields from a custom SELECT that returns columns which are not present in any table model. So I made a custom RawRowMapper that can map fields from a custom query to a custom model. This is useful when you have a query whose fields don't correspond to any table-mapped model.
This is the RowMapper which performs the query auto mapping:
public class GenericRowMapper<T> implements RawRowMapper<T> {
private Class<T> entityClass;
private Set<Field> fields = new HashSet<>();
private Map<String, Field> colNameFieldMap = new HashMap<>();
public GenericRowMapper(Class<T> entityClass) {
this.entityClass = entityClass;
Class cl = entityClass;
do {
for (Field field : cl.getDeclaredFields()) {
if (field.isAnnotationPresent(DatabaseField.class)) {
DatabaseField an = field.getAnnotation(DatabaseField.class);
fields.add(field);
colNameFieldMap.put(an.columnName(), field);
}
}
cl = cl.getSuperclass();
} while (cl != Object.class);
}
@Override
public T mapRow(String[] columnNames, String[] resultColumns) throws SQLException {
try {
T entity = entityClass.newInstance();
for (int i = 0; i < columnNames.length; i++) {
Field f = colNameFieldMap.get(columnNames[i]);
if (f == null) {
continue; // skip columns that are not mapped to any @DatabaseField
}
boolean accessible = f.isAccessible();
f.setAccessible(true);
f.set(entity, stringToJavaObject(f.getType(), resultColumns[i]));
f.setAccessible(accessible);
}
return entity;
} catch (InstantiationException e) {
throw new RuntimeException(e);
} catch (IllegalAccessException e) {
throw new RuntimeException(e);
}
}
public Object stringToJavaObject(Class cl, String result) {
if (result == null){
return null;
}else if (cl == Integer.class || int.class == cl) {
return Integer.parseInt(result);
} else if (cl == Float.class || float.class == cl) {
return Float.parseFloat(result);
} else if (cl == Double.class || double.class == cl) {
return Double.parseDouble(result);
} else if (cl == Boolean.class || cl == boolean.class) {
try{
return Integer.valueOf(result) > 0;
}catch (NumberFormatException e){
return Boolean.parseBoolean(result);
}
} else if (cl == Date.class) {
DateLongType lType = DateLongType.getSingleton();
DateStringType sType = DateStringType.getSingleton();
try {
return lType.resultStringToJava(null, result, -1);
} catch (NumberFormatException e) {
try {
return sType.resultStringToJava(null, result, -1);
} catch (SQLException e2) {
throw new RuntimeException(e);
}
}
} else {
return result;
}
}
}
And here is the usage:
class Model{
#DatabaseField(columnName = "account_id")
String accId;
#DatabaseField(columnName = "amount")
int amount;
}
String sql = "select account_id, sum(amount) amount from orders group by account_id";
return queryRaw(sql, new GenericRowMapper<>(Model.class)).getResults();
This will return a List<Model> with the result rows mapped to Model, provided the query column names and the @DatabaseField(columnName = ...) values are the same.
Related
For now I have a method whose body looks like this:
jdbcTemplate.query(queryJoiningTwoTables, (rs, rowNum) -> {
final long id= rs.getLong("id");
MyObject obj = map.get(id);
if (obj == null) {
OffsetDateTime offsetDateTime = rs.getObject("creation_timestamp", OffsetDateTime.class);
...
obj = new MyObject(offsetDateTime ...);
map.put(id, obj);
}
String jobId = rs.getString("job_id");
obj.getJobIds().add(jobId);
return null;
});
return map.values();
It looks like I'm using the API improperly.
Is there a better way to achieve the same result?
P.S.
I tried to use jdbcTemplate#queryForRowSet, but in that case rs.getObject("creation_timestamp", OffsetDateTime.class) throws an exception saying the operation is not supported.
There are many options to map the results using jdbcTemplate, including yours.
Maybe this will help you understand it better:
public List<Action> findAllActions() {
final String selectStatement = "SELECT id,name FROM Actions"; // or your query
try {
return jdbcTemplate.query(selectStatement,(resultSet, rowNum) -> {
int actionId = resultSet.getInt("id");
String actionName = resultSet.getString("name");
Action action = new Action();
action.setId(actionId);
action.setName(actionName);
return action;
});
} catch (EmptyResultDataAccessException e) {
LOGGER.error("Get all actions - empty set", e);
return Collections.emptyList();
}
}
Alternatively, you can extract the RowMapper into its own method and pass it to jdbcTemplate.query like this:
public List<Action> findAllActions() {
final String selectStatement = "SELECT id,name FROM Actions"; // or your query
try {
return jdbcTemplate.query(selectStatement, getActionRowMapper());
} catch (EmptyResultDataAccessException e) {
LOGGER.error("Get all actions - empty set", e);
return Collections.emptyList();
}
}
private RowMapper<Action> getActionRowMapper() {
return (resultSet, rowNum) -> {
int actionId = resultSet.getInt("id");
String actionName = resultSet.getString("name");
Action action = new Action();
action.setId(actionId);
action.setName(actionName);
return action;
};
}
As you can see, the second parameter of the jdbcTemplate.query method takes a RowMapper<Action>, which here is hidden behind a lambda expression. This option restricts you to mapping the resultSet for every row and returning the result. The final result will eventually be a List of actions.
The second option is using a ResultSetExtractor, which lets you loop through the result set and gives you more flexibility. The query will be the same; only the second parameter of the jdbcTemplate.query method changes. For this I would personally implement ResultSetExtractor and override the extractData method, or you can do the same as above if you only need to map the results:
public List<Group> findGroups() {
final String selectStatement = "SELECT stud.id, stud.name, gr.id, gr.name FROM student stud INNER JOIN group gr ON gr.id=stud.group_id ORDER BY stud.id"; // or your query
try {
return jdbcTemplate.query(selectStatement, new GroupExtractor() );
} catch (EmptyResultDataAccessException e) {
LOGGER.error("Get all groups - empty set", e);
return Collections.emptyList();
}
}
public class GroupExtractor implements ResultSetExtractor<List<Group>> {
@Override
public List<Group> extractData(ResultSet resultSet) {
Map<Group, List<Student>> studentsGroup= new HashMap<>();
List<Group> groups = new ArrayList<>();
try {
while (resultSet.next()) {
int studentId = resultSet.getInt("student_id");
String studentName = resultSet.getString("student_name");
int groupId = resultSet.getInt("group_id");
String groupName = resultSet.getString("group_name");
Group group = createGroup(groupId, groupName);
Student student = createStudent(studentId, studentName);
studentsGroup.putIfAbsent(group, new ArrayList<>());
studentsGroup.get(group).add(student);
}
studentsGroup.forEach((group, students) -> {
group.setStudents(students);
groups.add(group);
});
return groups;
} catch (SQLException e) {
LOGGER.info("An error occurred during extracting data", e);
}
return groups;
}
private Student createStudent(int studentId, String studentName)
{
Student student=new Student();
student.setId(studentId);
student.setName(studentName);
return student;
}
//idem for createGroup
}
I am writing an insert query inside #Query annotation in a Spring application with PostGreSQL. So I am extending CRUD repository inside an interface that I have written.
@Repository
public interface PostGreRepository extends CrudRepository<FoodDetails,Long> {
@Modifying
@Query(value="insert into fooddetails(person_id,food_desc) select id,food_desc from person,food where id = " +
"person_id",nativeQuery = true)
void insertIntoPostGre();
}
Now I have the requirement to keep the query as a parameter in the application because it might change later. I cannot use the @Value annotation inside an interface. So how can I parameterize this? Any ideas?
Just as an idea, use reflection to change the annotation value:
Disclaimer: the changeAnnotationValue method is taken from here; I haven't run it myself.
@SuppressWarnings("unchecked")
public static Object changeAnnotationValue(Annotation annotation, String key, Object newValue){
Object handler = Proxy.getInvocationHandler(annotation);
Field f;
try {
f = handler.getClass().getDeclaredField("memberValues");
} catch (NoSuchFieldException | SecurityException e) {
throw new IllegalStateException(e);
}
f.setAccessible(true);
Map<String, Object> memberValues;
try {
memberValues = (Map<String, Object>) f.get(handler);
} catch (IllegalArgumentException | IllegalAccessException e) {
throw new IllegalStateException(e);
}
Object oldValue = memberValues.get(key);
if (oldValue == null || oldValue.getClass() != newValue.getClass()) {
throw new IllegalArgumentException();
}
memberValues.put(key,newValue);
return oldValue;
}
Using query as a parameter:
@Component
public class PostGreRepositoryParameterizer {
//...
@Value("${query}")
private String query;
public void modify() throws NoSuchMethodException {
Method method = PostGreRepository.class.getMethod("insertIntoPostGre");
final Query queryAnnotation = method.getAnnotation(Query.class);
changeAnnotationValue(queryAnnotation, "value", query);
}
//...
}
I have created a generic JSON parser using Java reflection, but there is an error that I am not able to solve.
The method (at the bottom of this question) receives a subclass of my custom Model class. I iterate through the fields and set values from the JSON. If the subclass contains an array property of some other class (which is, again, a subclass of Model), I use a small recursion to fill those objects.
Eg.
class UserModel extends Model
{
#JsonResponseParam(Name="userName")
public String Name;
#JsonResponseParam(Name="friends")
public FriendModel[] Friends;
}
In the end, UserModel should be filled with Friends. (JsonResponseParam is a custom annotation, and its Name value is used as the property name for getting values from the JSON.)
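For reference, such an annotation could be declared like this (a sketch, since the question does not show its actual definition):
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
// Must be retained at runtime so field.getAnnotation(JsonResponseParam.class) can see it.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
public @interface JsonResponseParam {
    String Name(); // the JSON property name to read the value from
}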
The result of this method is an IllegalArgumentException, and it is thrown on
field.set(t, values.toArray());
Here is the method:
protected <T extends Model> T getModel(T t)
{
Field[] fields = t.getClass().getFields();
for (Field field : fields) {
Annotation an = field.getAnnotation(JsonResponseParam.class);
if(an != null){
try
{
if(field.getType() == boolean.class)
field.setBoolean(t, t.getBool(((JsonResponseParam)an).Name()));
if(field.getType() == String.class)
field.set(t, t.getString(((JsonResponseParam)an).Name()));
if(field.getType() == int.class)
field.setInt(t, t.getInt(((JsonResponseParam)an).Name()));
if(field.getType() == Date.class)
field.set(t, t.getDate(((JsonResponseParam)an).Name()));
if(field.getType().isArray()){
ArrayList<Model> modelArray = t.getModelArray(((JsonResponseParam)an).Name());
ArrayList<Model> values = new ArrayList<Model>();
for (Model model : modelArray) {
Class<? extends Model> arrayType = field.getType().getComponentType().asSubclass(Model.class);
Model m = arrayType.newInstance();
m.jsonObject = model.jsonObject;
model.getModel(m);
values.add(m);
}
field.set(t, values.toArray());
}
} catch (IllegalArgumentException e) {
e.printStackTrace();
} catch (IllegalAccessException e) {
e.printStackTrace();
} catch (InstantiationException e) {
e.printStackTrace();
}
}
}
return t;
}
I suspect a class type inconsistency between the field and the values.
Thank you for your time.
toArray() only returns an Object[], so it cannot be assigned to any other array type.
What you want is:
field.set(t, values.toArray((Object[]) Array.newInstance(field.getType().getComponentType(), values.size())));
This will create an array of the type to match the field.
See Array.newInstance
Basically, the pattern is as below:
// First, create the array
Object myArray = Array.newInstance(field.getType().getComponentType(), arraySize);
// Then, adding value to that array
for (int i = 0; i < arraySize; i++) {
// value = ....
Array.set(myArray, i, value);
}
// Finally, set value for that array field
set(data, fieldName, myArray);
The set function is taken from this stackoverflow question:
public static boolean set(Object object, String fieldName, Object fieldValue) {
Class<?> clazz = object.getClass();
while (clazz != null) {
try {
Field field = clazz.getDeclaredField(fieldName);
field.setAccessible(true);
field.set(object, fieldValue);
return true;
} catch (NoSuchFieldException e) {
clazz = clazz.getSuperclass();
} catch (Exception e) {
throw new IllegalStateException(e);
}
}
return false;
}
Apply the above code, we have:
if(field.getType().isArray()){
// ....
int arraySize = modelArray.size();
Object values = Array.newInstance(field.getType().getComponentType(), modelArray.size());
for (int i = 0; i < arraySize; i++) {
// ......
Array.set(values, i, m);
}
field.set(t, values);
}
Say I have a Java bean/entity with 100 fields (inherited or not, it is not relevant in this case). After update operations in a transaction, I want to determine which fields were modified, to track updates like a CVS. What is the easiest way to do this? Any framework suggestions? Should I make two instances of this object, iterate over all fields, and compare the values of the fields? How would the best equals method look in such situations? The following equals() seems very awkward:
return (field1.equals(o.field1)) &&
(field2.equals(o.field2)) &&
(field3.equals(o.field3)) &&
...
(field100.equals(o.field100));
You could use Apache Commons Beanutils. Here's a simple example:
package at.percom.temp.zztests;
import java.lang.reflect.InvocationTargetException;
import org.apache.commons.beanutils.BeanMap;
import org.apache.commons.beanutils.PropertyUtilsBean;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Objects;
import java.util.Set;
public class Main {
public static void main(String[] args) throws IllegalAccessException, InvocationTargetException, NoSuchMethodException {
Main main = new Main();
main.start();
}
public void start() throws IllegalAccessException, InvocationTargetException, NoSuchMethodException {
SampleBean oldSample = new SampleBean("John", "Doe", 1971);
SampleBean newSample = new SampleBean("John X.", "Doe", 1971);
SampleBean diffSample = (SampleBean) compareObjects(oldSample, newSample, new HashSet<>(Arrays.asList("lastName")), 10L);
}
public Object compareObjects(Object oldObject, Object newObject, Set<String> propertyNamesToAvoid, Long deep) {
return compareObjects(oldObject, newObject, propertyNamesToAvoid, deep, null);
}
private Object compareObjects(Object oldObject, Object newObject, Set<String> propertyNamesToAvoid, Long deep,
String parentPropertyPath) {
propertyNamesToAvoid = propertyNamesToAvoid != null ? propertyNamesToAvoid : new HashSet<>();
parentPropertyPath = parentPropertyPath != null ? parentPropertyPath : "";
Object diffObject = null;
try {
diffObject = oldObject.getClass().newInstance();
} catch (Exception e) {
return diffObject;
}
BeanMap map = new BeanMap(oldObject);
PropertyUtilsBean propUtils = new PropertyUtilsBean();
for (Object propNameObject : map.keySet()) {
String propertyName = (String) propNameObject;
String propertyPath = parentPropertyPath + propertyName;
if (!propUtils.isWriteable(diffObject, propertyName) || !propUtils.isReadable(newObject, propertyName)
|| propertyNamesToAvoid.contains(propertyPath)) {
continue;
}
Object property1 = null;
try {
property1 = propUtils.getProperty(oldObject, propertyName);
} catch (Exception e) {
}
Object property2 = null;
try {
property2 = propUtils.getProperty(newObject, propertyName);
} catch (Exception e) {
}
try {
if (property1 != null && property2 != null && property1.getClass().getName().startsWith("com.racing.company")
&& (deep == null || deep > 0)) {
Object diffProperty = compareObjects(property1, property2, propertyNamesToAvoid,
deep != null ? deep - 1 : null, propertyPath + ".");
propUtils.setProperty(diffObject, propertyName, diffProperty);
} else {
if (!Objects.deepEquals(property1, property2)) {
propUtils.setProperty(diffObject, propertyName, property2);
System.out.println("> " + propertyPath + " is different (oldValue=\"" + property1 + "\", newValue=\""
+ property2 + "\")");
} else {
System.out.println(" " + propertyPath + " is equal");
}
}
} catch (Exception e) {
}
}
return diffObject;
}
public static class SampleBean {
public String firstName;
public String lastName;
public int yearOfBirth;
public SampleBean() {
// no-arg constructor so compareObjects() can instantiate the diff object
}
public SampleBean(String firstName, String lastName, int yearOfBirth) {
this.firstName = firstName;
this.lastName = lastName;
this.yearOfBirth = yearOfBirth;
}
public String getFirstName() {
return firstName;
}
public void setFirstName(String firstName) {
this.firstName = firstName;
}
public String getLastName() {
return lastName;
}
public void setLastName(String lastName) {
this.lastName = lastName;
}
public int getYearOfBirth() {
return yearOfBirth;
}
public void setYearOfBirth(int yearOfBirth) {
this.yearOfBirth = yearOfBirth;
}
}
}
Hey, look at Javers. It's exactly what you need: an object auditing and diff framework. With Javers you can persist the changes made to your domain objects with a single javers.commit() call after every update. Once you persist some changes, you can easily read them back with javers.getChangeHistory, e.g.
public static void main(String... args) {
//get Javers instance
Javers javers = JaversBuilder.javers().build();
//create java bean
User user = new User(1, "John");
//commit current state
javers.commit("author", user);
//update operation
user.setUserName("David");
//commit change
javers.commit("author", user);
//read 100 last changes
List<Change> changes = javers.getChangeHistory(instanceId(1, User.class), 100);
//print change log
System.out.println(javers.processChangeList(changes, new SimpleTextChangeLog()));
}
and the output is:
commit 2.0, author:author, 2015-01-07 23:00:10
changed object: org.javers.demo.User/1
value changed on 'userName' property: 'John' -> 'David'
commit 1.0, author:author, 2015-01-07 23:00:10
new object: 'org.javers.demo.User/1
You can use reflection to load the fields, read them on each object, and compare the results.
Example source code might look like this:
public static <T> void Compare(T source, T target) throws IllegalArgumentException, IllegalAccessException {
if(source == null) {
throw new IllegalArgumentException("Null argument not excepted at this point");
}
Field[] fields = source.getClass().getFields();
Object sourceObject;
Object targetObject;
for(Field field : fields){
sourceObject = field.get(source);
targetObject = field.get(target);
//Compare the object
}
}
FYI, this code will only work on the public fields declared for the class.
You can use Apache Commons BeanUtils to check the properties.
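For example, a quick-and-dirty diff can be built on BeanUtils.describe(), which flattens a bean's readable properties into a Map of String values (a hedged sketch, not a complete solution; note that everything is compared as strings):
import java.util.Map;
import org.apache.commons.beanutils.BeanUtils;

public static void printChangedProperties(Object oldBean, Object newBean) throws Exception {
    // describe() returns the readable bean properties, converted to strings
    Map<String, String> oldProps = BeanUtils.describe(oldBean);
    Map<String, String> newProps = BeanUtils.describe(newBean);
    for (Map.Entry<String, String> entry : oldProps.entrySet()) {
        String name = entry.getKey();
        String oldValue = entry.getValue();
        String newValue = newProps.get(name);
        boolean changed = (oldValue == null) ? newValue != null : !oldValue.equals(newValue);
        if (changed && !"class".equals(name)) { // describe() also exposes the "class" property
            System.out.println(name + ": '" + oldValue + "' -> '" + newValue + "'");
        }
    }
}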
During a Hibernate Session, I am loading some objects and some of them are loaded as proxies due to lazy loading. It's all OK and I don't want to turn lazy loading off.
But later I need to send some of the objects (actually one object) to the GWT client via RPC. And it happens that this concrete object is a proxy. So I need to turn it into a real object. I can't find a method like "materialize" in Hibernate.
How can I turn some of the objects from proxies into real objects, knowing their class and ID?
At the moment the only solution I see is to evict that object from Hibernate's cache and reload it, but it is really bad for many reasons.
Here's a method I'm using.
public static <T> T initializeAndUnproxy(T entity) {
if (entity == null) {
throw new
NullPointerException("Entity passed for initialization is null");
}
Hibernate.initialize(entity);
if (entity instanceof HibernateProxy) {
entity = (T) ((HibernateProxy) entity).getHibernateLazyInitializer()
.getImplementation();
}
return entity;
}
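Usage is then just (with a hypothetical entity type; the returned instance is safe to send over GWT RPC):
// 'possiblyProxiedEntity' may be a HibernateProxy or an already-initialized entity.
MyEntity real = initializeAndUnproxy(possiblyProxiedEntity);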
Since Hibernate ORM 5.2.10, you can do it like this:
Object unproxiedEntity = Hibernate.unproxy(proxy);
Before Hibernate 5.2.10, the simplest way to do that was to use the unproxy method offered by Hibernate's internal PersistenceContext implementation:
Object unproxiedEntity = ((SessionImplementor) session)
.getPersistenceContext()
.unproxy(proxy);
Try to use Hibernate.getClass(obj)
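A short sketch of how that can help, combined with the initializeAndUnproxy() helper above (entity names are hypothetical): Hibernate.getClass() resolves the real entity class even when you only hold a lazy proxy, so it works where instanceof against the subclass would fail.
// The proxy extends the mapped base class, so an instanceof check against the
// concrete subclass fails; Hibernate.getClass() looks through the proxy instead.
if (Hibernate.getClass(entity) == ConcreteSubclass.class) {
    ConcreteSubclass concrete = (ConcreteSubclass) initializeAndUnproxy(entity);
    // work with the concrete subclass here
}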
I've written the following code, which recursively cleans an object of proxies (if they are not already initialized):
public class PersistenceUtils {
private static void cleanFromProxies(Object value, List<Object> handledObjects) {
if ((value != null) && (!isProxy(value)) && !containsTotallyEqual(handledObjects, value)) {
handledObjects.add(value);
if (value instanceof Iterable) {
for (Object item : (Iterable<?>) value) {
cleanFromProxies(item, handledObjects);
}
} else if (value.getClass().isArray()) {
for (Object item : (Object[]) value) {
cleanFromProxies(item, handledObjects);
}
}
BeanInfo beanInfo = null;
try {
beanInfo = Introspector.getBeanInfo(value.getClass());
} catch (IntrospectionException e) {
// LOGGER.warn(e.getMessage(), e);
}
if (beanInfo != null) {
for (PropertyDescriptor property : beanInfo.getPropertyDescriptors()) {
try {
if ((property.getWriteMethod() != null) && (property.getReadMethod() != null)) {
Object fieldValue = property.getReadMethod().invoke(value);
if (isProxy(fieldValue)) {
fieldValue = unproxyObject(fieldValue);
property.getWriteMethod().invoke(value, fieldValue);
}
cleanFromProxies(fieldValue, handledObjects);
}
} catch (Exception e) {
// LOGGER.warn(e.getMessage(), e);
}
}
}
}
}
public static <T> T cleanFromProxies(T value) {
T result = unproxyObject(value);
cleanFromProxies(result, new ArrayList<Object>());
return result;
}
private static boolean containsTotallyEqual(Collection<?> collection, Object value) {
if (CollectionUtils.isEmpty(collection)) {
return false;
}
for (Object object : collection) {
if (object == value) {
return true;
}
}
return false;
}
public static boolean isProxy(Object value) {
if (value == null) {
return false;
}
if ((value instanceof HibernateProxy) || (value instanceof PersistentCollection)) {
return true;
}
return false;
}
private static Object unproxyHibernateProxy(HibernateProxy hibernateProxy) {
Object result = hibernateProxy.writeReplace();
if (!(result instanceof SerializableProxy)) {
return result;
}
return null;
}
@SuppressWarnings("unchecked")
private static <T> T unproxyObject(T object) {
if (isProxy(object)) {
if (object instanceof PersistentCollection) {
PersistentCollection persistentCollection = (PersistentCollection) object;
return (T) unproxyPersistentCollection(persistentCollection);
} else if (object instanceof HibernateProxy) {
HibernateProxy hibernateProxy = (HibernateProxy) object;
return (T) unproxyHibernateProxy(hibernateProxy);
} else {
return null;
}
}
return object;
}
private static Object unproxyPersistentCollection(PersistentCollection persistentCollection) {
if (persistentCollection instanceof PersistentSet) {
return unproxyPersistentSet((Map<?, ?>) persistentCollection.getStoredSnapshot());
}
return persistentCollection.getStoredSnapshot();
}
private static <T> Set<T> unproxyPersistentSet(Map<T, ?> persistenceSet) {
return new LinkedHashSet<T>(persistenceSet.keySet());
}
}
I use this function on the results of my RPC services (via aspects) and it recursively cleans all result objects of proxies (if they are not initialized).
The way I recommend with JPA 2:
Object unproxied = entityManager.unwrap(SessionImplementor.class).getPersistenceContext().unproxy(proxy);
Starting from Hibernate 5.2.10 you can use the Hibernate.unproxy method to convert a proxy to your real entity:
MyEntity myEntity = (MyEntity) Hibernate.unproxy( proxyMyEntity );
Another workaround is to call
Hibernate.initialize(extractedObject.getSubobjectToUnproxy());
just before closing the session.
With Spring Data JPA and Hibernate, I was using subinterfaces of JpaRepository to look up objects belonging to a type hierarchy that was mapped using the "join" strategy. Unfortunately, the queries were returning proxies of the base type instead of instances of the expected concrete types. This prevented me from casting the results to the correct types. Like you, I came here looking for an effective way to get my entities unproxied.
Vlad has the right idea for unproxying these results; Yannis provides a little more detail. Adding to their answers, here's the rest of what you might be looking for:
The following code provides an easy way to unproxy your proxied entities:
import org.hibernate.engine.spi.PersistenceContext;
import org.hibernate.engine.spi.SessionImplementor;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.jpa.repository.JpaContext;
import org.springframework.stereotype.Component;
@Component
public final class JpaHibernateUtil {
private static JpaContext jpaContext;
@Autowired
JpaHibernateUtil(JpaContext jpaContext) {
JpaHibernateUtil.jpaContext = jpaContext;
}
public static <Type> Type unproxy(Type proxied, Class<Type> type) {
PersistenceContext persistenceContext =
jpaContext
.getEntityManagerByManagedType(type)
.unwrap(SessionImplementor.class)
.getPersistenceContext();
Type unproxied = (Type) persistenceContext.unproxyAndReassociate(proxied);
return unproxied;
}
}
You can pass either unproxied entities or proxied entities to the unproxy method. If they are already unproxied, they'll simply be returned. Otherwise, they'll get unproxied and returned.
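A hypothetical call site, matching the join-strategy hierarchy scenario above (Animal being the mapped base type, Dog a concrete subclass, and animalRepository a JpaRepository<Animal, Long>):
Animal animal = animalRepository.getOne(animalId); // getOne() returns a lazy reference, i.e. a proxy
Dog dog = (Dog) JpaHibernateUtil.unproxy(animal, Animal.class);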
Hope this helps!
Thank you for the suggested solutions! Unfortunately, none of them worked for my case: receiving a list of CLOB objects from an Oracle database through JPA (Hibernate), using a native query.
All of the proposed approaches gave me either a ClassCastException or just returned a Java Proxy object (which, deep inside, contained the desired Clob).
So my solution is the following (based on several of the approaches above):
Query sqlQuery = manager.createNativeQuery(queryStr);
List resultList = sqlQuery.getResultList();
for ( Object resultProxy : resultList ) {
String unproxiedClob = unproxyClob(resultProxy);
if ( unproxiedClob != null ) {
resultCollection.add(unproxiedClob);
}
}
private String unproxyClob(Object proxy) {
try {
BeanInfo beanInfo = Introspector.getBeanInfo(proxy.getClass());
for (PropertyDescriptor property : beanInfo.getPropertyDescriptors()) {
Method readMethod = property.getReadMethod();
if ( readMethod.getName().contains("getWrappedClob") ) {
Object result = readMethod.invoke(proxy);
return clobToString((Clob) result);
}
}
}
catch (InvocationTargetException | IntrospectionException | IllegalAccessException | SQLException | IOException e) {
LOG.error("Unable to unproxy CLOB value.", e);
}
return null;
}
private String clobToString(Clob data) throws SQLException, IOException {
StringBuilder sb = new StringBuilder();
Reader reader = data.getCharacterStream();
BufferedReader br = new BufferedReader(reader);
String line;
while( null != (line = br.readLine()) ) {
sb.append(line);
}
br.close();
return sb.toString();
}
Hope this will help somebody!
I found a solution to deproxy a class using standard Java and the JPA API. It is tested with Hibernate, but does not require Hibernate as a dependency and should work with all JPA providers.
Only one requirement: it's necessary to modify the parent class (Address) and add a simple helper method.
General idea: add a helper method to the parent class which returns itself. When the method is called on a proxy, it forwards the call to the real instance and returns that real instance.
Implementation is a little bit more complex, as Hibernate recognizes that a proxied class returns itself and still returns the proxy instead of the real instance. The workaround is to wrap the returned instance in a simple wrapper class, which has a different class type than the real instance.
In code:
class Address {
public AddressWrapper getWrappedSelf() {
return new AddressWrapper(this);
}
...
}
class AddressWrapper {
private Address wrappedAddress;
...
}
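The wrapper body is elided above; a minimal version only needs the constructor and the getter used below (a sketch):
class AddressWrapper {
    private Address wrappedAddress;

    AddressWrapper(Address wrappedAddress) {
        this.wrappedAddress = wrappedAddress;
    }

    public Address getWrappedAddress() {
        return wrappedAddress;
    }
}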
To cast the Address proxy to the real subclass, use the following:
Address address = dao.getSomeAddress(...);
Address deproxiedAddress = address.getWrappedSelf().getWrappedAddress();
if (deproxiedAddress instanceof WorkAddress) {
WorkAddress workAddress = (WorkAddress)deproxiedAddress;
}