I am writing an insert query inside a @Query annotation in a Spring application with PostgreSQL, so I am extending CrudRepository in an interface that I have written.
@Repository
public interface PostGreRepository extends CrudRepository<FoodDetails, Long> {

    @Modifying
    @Query(value = "insert into fooddetails(person_id,food_desc) select id,food_desc from person,food where id = " +
            "person_id", nativeQuery = true)
    void insertIntoPostGre();
}
Now I have the requirement to keep the query as a parameter in the application because it might change later. I cannot use the @Value annotation inside an interface. So how can I parameterize this? Ideas?
Just as an idea, use reflection to change the annotation value:
Disclaimer: the changeAnnotationValue method is taken from here; I haven't run it myself.
@SuppressWarnings("unchecked")
public static Object changeAnnotationValue(Annotation annotation, String key, Object newValue) {
    Object handler = Proxy.getInvocationHandler(annotation);
    Field f;
    try {
        f = handler.getClass().getDeclaredField("memberValues");
    } catch (NoSuchFieldException | SecurityException e) {
        throw new IllegalStateException(e);
    }
    f.setAccessible(true);
    Map<String, Object> memberValues;
    try {
        memberValues = (Map<String, Object>) f.get(handler);
    } catch (IllegalArgumentException | IllegalAccessException e) {
        throw new IllegalStateException(e);
    }
    Object oldValue = memberValues.get(key);
    if (oldValue == null || oldValue.getClass() != newValue.getClass()) {
        throw new IllegalArgumentException();
    }
    memberValues.put(key, newValue);
    return oldValue;
}
Using the query as a parameter:
@Component
public class PostGreRepositoryParameterizer {
    //...
    @Value("query")
    private String query;

    public void modify() throws NoSuchMethodException {
        Method method = PostGreRepository.class.getMethod("insertIntoPostGre");
        final Query queryAnnotation = method.getAnnotation(Query.class);
        changeAnnotationValue(queryAnnotation, "value", query);
    }
    //...
}
Related
Imagine there is a class:
@Something(someProperty = "some value")
public class Foobar {
//...
}
Which is already compiled (I cannot control the source), and is part of the classpath when the jvm starts up. I would like to be able to change "some value" to something else at runtime, such that any reflection thereafter would have my new value instead of the default "some value".
Is this possible? If so, how?
Warning: Not tested on OSX - see comment from @Marcel
Tested on OSX. Works fine.
Since I also had the need to change annotation values at runtime, I revisited this question.
Here is a modified version of @assylias's approach (many thanks for the inspiration).
/**
 * Changes the annotation value for the given key of the given annotation to newValue and returns
 * the previous value.
 */
@SuppressWarnings("unchecked")
public static Object changeAnnotationValue(Annotation annotation, String key, Object newValue) {
    Object handler = Proxy.getInvocationHandler(annotation);
    Field f;
    try {
        f = handler.getClass().getDeclaredField("memberValues");
    } catch (NoSuchFieldException | SecurityException e) {
        throw new IllegalStateException(e);
    }
    f.setAccessible(true);
    Map<String, Object> memberValues;
    try {
        memberValues = (Map<String, Object>) f.get(handler);
    } catch (IllegalArgumentException | IllegalAccessException e) {
        throw new IllegalStateException(e);
    }
    Object oldValue = memberValues.get(key);
    if (oldValue == null || oldValue.getClass() != newValue.getClass()) {
        throw new IllegalArgumentException();
    }
    memberValues.put(key, newValue);
    return oldValue;
}
Usage example:
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
public @interface ClassAnnotation {
    String value() default "";
}

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
public @interface FieldAnnotation {
    String value() default "";
}

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface MethodAnnotation {
    String value() default "";
}

@ClassAnnotation("class test")
public static class TestClass {
    @FieldAnnotation("field test")
    public Object field;

    @MethodAnnotation("method test")
    public void method() {
    }
}

public static void main(String[] args) throws Exception {
    final ClassAnnotation classAnnotation = TestClass.class.getAnnotation(ClassAnnotation.class);
    System.out.println("old ClassAnnotation = " + classAnnotation.value());
    changeAnnotationValue(classAnnotation, "value", "another class annotation value");
    System.out.println("modified ClassAnnotation = " + classAnnotation.value());

    Field field = TestClass.class.getField("field");
    final FieldAnnotation fieldAnnotation = field.getAnnotation(FieldAnnotation.class);
    System.out.println("old FieldAnnotation = " + fieldAnnotation.value());
    changeAnnotationValue(fieldAnnotation, "value", "another field annotation value");
    System.out.println("modified FieldAnnotation = " + fieldAnnotation.value());

    Method method = TestClass.class.getMethod("method");
    final MethodAnnotation methodAnnotation = method.getAnnotation(MethodAnnotation.class);
    System.out.println("old MethodAnnotation = " + methodAnnotation.value());
    changeAnnotationValue(methodAnnotation, "value", "another method annotation value");
    System.out.println("modified MethodAnnotation = " + methodAnnotation.value());
}
The advantage of this approach is that one does not need to create a new annotation instance, and therefore does not need to know the concrete annotation class in advance. The side effects should also be minimal, since the original annotation instance stays untouched.
Tested with Java 8.
This code does more or less what you ask for - it is a simple proof of concept:
a proper implementation needs to also deal with the declaredAnnotations
if the implementation of annotations in Class.java changes, the code will break (i.e. it can break at any time in the future)
I have no idea if there are side effects...
Output:
oldAnnotation = some value
modifiedAnnotation = another value
public static void main(String[] args) throws Exception {
    final Something oldAnnotation = (Something) Foobar.class.getAnnotations()[0];
    System.out.println("oldAnnotation = " + oldAnnotation.someProperty());
    Annotation newAnnotation = new Something() {
        @Override
        public String someProperty() {
            return "another value";
        }

        @Override
        public Class<? extends Annotation> annotationType() {
            return oldAnnotation.annotationType();
        }
    };
    Field field = Class.class.getDeclaredField("annotations");
    field.setAccessible(true);
    Map<Class<? extends Annotation>, Annotation> annotations = (Map<Class<? extends Annotation>, Annotation>) field.get(Foobar.class);
    annotations.put(Something.class, newAnnotation);

    Something modifiedAnnotation = (Something) Foobar.class.getAnnotations()[0];
    System.out.println("modifiedAnnotation = " + modifiedAnnotation.someProperty());
}

@Something(someProperty = "some value")
public static class Foobar {
}

@Retention(RetentionPolicy.RUNTIME)
@interface Something {
    String someProperty();
}
This one works on my machine with Java 8. It changes the value of ignoreUnknown in the annotation @JsonIgnoreProperties(ignoreUnknown = true) from true to false.
final List<Annotation> matchedAnnotation = Arrays.stream(SomeClass.class.getAnnotations())
        .filter(annotation -> annotation.annotationType().equals(JsonIgnoreProperties.class))
        .collect(Collectors.toList());
final Annotation modifiedAnnotation = new JsonIgnoreProperties() {
    @Override public Class<? extends Annotation> annotationType() {
        return matchedAnnotation.get(0).annotationType();
    }
    @Override public String[] value() {
        return new String[0];
    }
    @Override public boolean ignoreUnknown() {
        return false;
    }
    @Override public boolean allowGetters() {
        return false;
    }
    @Override public boolean allowSetters() {
        return false;
    }
};
final Method method = Class.class.getDeclaredMethod("getDeclaredAnnotationMap", null);
method.setAccessible(true);
final Map<Class<? extends Annotation>, Annotation> annotations =
        (Map<Class<? extends Annotation>, Annotation>) method.invoke(SomeClass.class, null);
annotations.put(JsonIgnoreProperties.class, modifiedAnnotation);
Spring can do this job very easily; it might be useful for Spring developers.
Follow these steps:
First solution:
1) Create a bean returning a value for someProperty. Here I injected somePropertyValue with the @Value annotation from a DB or property file:
@Value("${config.somePropertyValue}")
private String somePropertyValue;

@Bean
public String somePropertyValue() {
    return somePropertyValue;
}
2) After this, it is possible to inject somePropertyValue into the @Something annotation like this:
@Something(someProperty = "#{@somePropertyValue}")
public class Foobar {
    //...
}
Second solution:
1) Create a getter and setter in a bean:
@Component
public class config {
    @Value("${config.somePropertyValue}")
    private String somePropertyValue;

    public String getSomePropertyValue() {
        return somePropertyValue;
    }

    public void setSomePropertyValue(String somePropertyValue) {
        this.somePropertyValue = somePropertyValue;
    }
}
2) After this, it is possible to inject somePropertyValue into the @Something annotation like this:
@Something(someProperty = "#{config.somePropertyValue}")
public class Foobar {
    //...
}
Try this solution for Java 8:
public static void main(String[] args) throws Exception {
    final Something oldAnnotation = (Something) Foobar.class.getAnnotations()[0];
    System.out.println("oldAnnotation = " + oldAnnotation.someProperty());
    Annotation newAnnotation = new Something() {
        @Override
        public String someProperty() {
            return "another value";
        }

        @Override
        public Class<? extends Annotation> annotationType() {
            return oldAnnotation.annotationType();
        }
    };
    Method method = Class.class.getDeclaredMethod("annotationData");
    method.setAccessible(true);
    Object annotationData = method.invoke(Foobar.class);
    Field declaredAnnotations = annotationData.getClass().getDeclaredField("declaredAnnotations");
    declaredAnnotations.setAccessible(true);
    Map<Class<? extends Annotation>, Annotation> annotations = (Map<Class<? extends Annotation>, Annotation>) declaredAnnotations.get(annotationData);
    annotations.put(Something.class, newAnnotation);

    Something modifiedAnnotation = (Something) Foobar.class.getAnnotations()[0];
    System.out.println("modifiedAnnotation = " + modifiedAnnotation.someProperty());
}

@Something(someProperty = "some value")
public static class Foobar {
}

@Retention(RetentionPolicy.RUNTIME)
@interface Something {
    String someProperty();
}
I am able to access and modify annotations this way in JDK 1.8, but I am not sure why it has no effect:
try {
    Field annotationDataField = myObject.getClass().getClass().getDeclaredField("annotationData");
    annotationDataField.setAccessible(true);
    Field annotationsField = annotationDataField.get(myObject.getClass()).getClass().getDeclaredField("annotations");
    annotationsField.setAccessible(true);
    Map<Class<? extends Annotation>, Annotation> annotations = (Map<Class<? extends Annotation>, Annotation>) annotationsField.get(annotationDataField.get(myObject.getClass()));
    annotations.put(Something.class, newSomethingValue);
} catch (IllegalArgumentException | IllegalAccessException e) {
    e.printStackTrace();
} catch (NoSuchFieldException e) {
    e.printStackTrace();
} catch (SecurityException e) {
    e.printStackTrace();
}
Annotation attribute values have to be constants - so unless you want to do some serious byte code manipulation it won't be possible. Is there a cleaner way, such as creating a wrapper class with the annotation you desire?
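For illustration, such a wrapper could be as simple as the sketch below; the wrapper type is hypothetical, and callers would read the annotation from the wrapper instead of from Foobar:
// Hypothetical wrapper carrying the desired annotation value. Since annotation
// attributes must be compile-time constants, the new value lives on a separate
// type rather than being patched into Foobar at runtime.
@Something(someProperty = "another value")
public class FoobarWrapper {

    private final Foobar delegate = new Foobar();

    public Foobar getDelegate() {
        return delegate;
    }
}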
I have a collection "documentDev" in the database with the shard key 'dNumber'.
Sample document:
{
    "_id" : "12831221wadaee23",
    "dNumber" : "115",
    "processed" : false
}
If I try to update this document through any query tool using a command like
db.documentDev.update({
    "_id" : ObjectId("12831221wadaee23"),
    "dNumber" : "115"
}, {
    $set: { "processed": true }
}, {
    multi: false, upsert: false
})
It updates the document properly.
But if I use Spring Boot's MongoRepository method like
DocumentRepo.save(Object)
it throws an exception:
Caused by: com.mongodb.MongoCommandException: Command failed with error 61: 'query in command must target a single shard key' on server by3prdddc01-docdb-3.documents.azure.com:10255. The full response is { "_t" : "OKMongoResponse", "ok" : 0, "code" : 61, "errmsg" : "query in command must target a single shard key", "$err" : "query in command must target a single shard key" }
This is my document object:
@Document(collection = "documentDev")
public class DocumentDev {
    @Id
    private String id;
    private String dNumber;
    private String fileName;
    private boolean processed;
}
This is my repository class:
@Repository
public interface DocumentRepo extends MongoRepository<DocumentDev, String> { }
and the value I am trying to update:
doc:
{
    "_id" : "12831221wadaee23",
    "dNumber" : "115",
    "processed" : true
}
The code I am trying to execute:
@Autowired
DocumentRepo docRepo;

docRepo.save(doc); // fails to execute
Note: I have sharding enabled on the dNumber field, and I am able to update successfully using native queries in a NoSQL tool.
I was also able to execute the repository save operation on a non-sharded collection.
Update: I am able to update the document by creating a native query using MongoTemplate. My query looks like this:
public DocumentDev updateProcessedFlag(DocumentDev request) {
    Query query = new Query();
    query.addCriteria(Criteria.where("_id").is(request.getId()));
    query.addCriteria(Criteria.where("dNumber").is(request.getDNumber()));
    Update update = new Update();
    update.set("processed", request.isProcessed());
    mongoTemplate.updateFirst(query, update, request.getClass());
    return request;
}
But this is not a generic solution, as any other field might need to be updated and my document may have other fields as well.
I had the same issue and solved it with the following hack:
@Configuration
public class ReactiveMongoConfig {

    @Bean
    public ReactiveMongoTemplate reactiveMongoTemplate(ReactiveMongoDatabaseFactory reactiveMongoDatabaseFactory,
            MongoConverter converter, MyService service) {
        return new ReactiveMongoTemplate(reactiveMongoDatabaseFactory, converter) {
            @Override
            protected Mono<UpdateResult> doUpdate(String collectionName, Query query, UpdateDefinition update,
                    Class<?> entityClass, boolean upsert, boolean multi) {
                query.addCriteria(new Criteria("shardKey").is(service.getShardKey()));
                return super.doUpdate(collectionName, query, update, entityClass, upsert, multi);
            }
        };
    }
}
It would be nice to have a @ShardKey annotation to mark a document field as the shard key and have it added to the query automatically.
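As a rough sketch of that idea (my own illustration, not an existing Spring Data annotation; Spring Data's own @Sharded is shown in a later answer), a custom marker annotation could be scanned reflectively to build the extra criteria. Note that, unlike the doUpdate override above, this helper needs the entity instance in order to read the shard-key value:
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Field;

import org.springframework.data.mongodb.core.query.Criteria;

// Hypothetical marker for the shard-key field of a document class.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
@interface ShardKey {
}

class ShardKeyCriteria {

    // Builds a Criteria for the field annotated with @ShardKey on the given
    // entity, so it can be appended to an update query.
    static Criteria forEntity(Object entity) throws IllegalAccessException {
        for (Field field : entity.getClass().getDeclaredFields()) {
            if (field.isAnnotationPresent(ShardKey.class)) {
                field.setAccessible(true);
                return Criteria.where(field.getName()).is(field.get(entity));
            }
        }
        throw new IllegalStateException("No @ShardKey field on " + entity.getClass());
    }
}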
Following the custom repository approach, I got an error because Spring expects a Cosmos entity to be available in the custom implementation {EntityName}CustomRepositoryImpl, so I renamed the implementation. I also added code for:
the case when the entity has inherited fields
the shard key, which is not always the id and should be added along with the id: { "shardkeyName": "shardValue" }
adding the generated ObjectId to the entity for new documents
public class DocumentRepositoryImpl<T> implements CosmosRepositoryCustom<T> {
@Autowired
protected MongoTemplate mongoTemplate;
@Override
public T customSave(T entity) {
WriteResult writeResult = mongoTemplate.upsert(createQuery(entity), createUpdate(entity), entity.getClass());
setIdForEntity(entity,writeResult);
return entity;
}
@Override
public T customSave(T entity, String collectionName) {
WriteResult writeResult = mongoTemplate.upsert(createQuery(entity), createUpdate(entity), collectionName);
setIdForEntity(entity,writeResult);
return entity;
}
@Override
public void customSave(List<T> entities) {
if(CollectionUtils.isNotEmpty(entities)){
entities.forEach(entity -> customSave(entity));
}
}
public <T> Update createUpdate(T entity){
Update update = new Update();
for (Field field : getAllFields(entity)) {
try {
field.setAccessible(true);
if (field.get(entity) != null) {
update.set(field.getName(), field.get(entity));
}
} catch (IllegalArgumentException | IllegalAccessException e) {
LOGGER.error("Error creating update for entity",e);
}
}
return update;
}
public <T> Query createQuery(T entity) {
Criteria criteria = new Criteria();
for (Field field : getAllFields(entity)) {
try {
field.setAccessible(true);
if (field.get(entity) != null) {
if (field.getName().equals("id")) {
Query query = new Query(Criteria.where("id").is(field.get(entity)));
query.addCriteria(new Criteria(SHARD_KEY_NAME).is(SHARD_KEY_VALUE));
return query;
}
criteria.and(field.getName()).is(field.get(entity));
}
} catch (IllegalArgumentException | IllegalAccessException e) {
LOGGER.error("Error creating query for entity",e);
}
}
return new Query(criteria);
}
private <T> List<Field> getAllFields(T entity) {
List<Field> fields = new ArrayList<>();
fields.addAll(Arrays.asList(entity.getClass().getDeclaredFields()));
Class<?> c = entity.getClass().getSuperclass();
if(!c.equals(Object.class)){
fields.addAll(Arrays.asList(c.getDeclaredFields()));
}
return fields;
}
public <T> void setIdForEntity(T entity,WriteResult writeResult){
if(null != writeResult && null != writeResult.getUpsertedId()){
Object upsertId = writeResult.getUpsertedId();
entity.setId(upsertId.toString());
}
}
}
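The CosmosRepositoryCustom interface implemented above is not shown in this answer; judging from the method signatures used, a matching declaration would presumably look like this:
import java.util.List;

public interface CosmosRepositoryCustom<T> {

    T customSave(T entity);

    T customSave(T entity, String collectionName);

    void customSave(List<T> entities);
}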
I am using spring-boot-starter-mongodb:1.5.1 with spring-data-mongodb:1.9.11
I am hacking around this by creating a custom repository:
public interface CosmosCustomRepository<T> {
void customSave(T entity);
void customSave(T entity, String collectionName);
}
The implementation of this repository:
public class CosmosCustomRepositoryImpl<T> implements CosmosCustomRepository<T> {
@Autowired
private MongoTemplate mongoTemplate;
@Override
public void customSave(T entity) {
mongoTemplate.upsert(createQuery(entity), createUpdate(entity), entity.getClass());
}
@Override
public void customSave(T entity, String collectionName) {
mongoTemplate.upsert(createQuery(entity), createUpdate(entity), collectionName);
}
private Update createUpdate(T entity) {
Update update = new Update();
for (Field field : entity.getClass().getDeclaredFields()) {
try {
field.setAccessible(true);
if (field.get(entity) != null) {
update.set(field.getName(), field.get(entity));
}
} catch (IllegalArgumentException | IllegalAccessException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
return update;
}
private Query createQuery(T entity) {
Criteria criteria = new Criteria();
for (Field field : entity.getClass().getDeclaredFields()) {
try {
field.setAccessible(true);
if (field.get(entity) != null) {
if (field.getName().equals("id")) {
return new Query(Criteria.where("id").is(field.get(entity)));
}
criteria.and(field.getName()).is(field.get(entity));
}
} catch (IllegalArgumentException | IllegalAccessException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
return new Query(criteria);
}
}
Your DocumentRepo will extend this new custom repository.
@Repository
public interface DocumentRepo extends MongoRepository<DocumentDev, String>, CosmosCustomRepository<DocumentDev> { }
To save a new document, just use the new customSave:
@Autowired
DocumentRepo docRepo;

docRepo.customSave(doc);
A more recent and simpler approach I've figured out is to use the @Sharded(shardKey = {shardKey1, shardKey2, ...}) Spring annotation, like so:
@Document(collection = "documentDev")
@Sharded(shardKey = {"dNumber"})
public class DocumentDev {
    @Id private String id;
    private String dNumber;
    private String fileName;
    private boolean processed;
}
This is going to be picked up by MongoRepository automatically.
And here is the doc: https://docs.spring.io/spring-data/mongodb/docs/current/reference/html/#sharding
Enjoy coding!
I have a bean class
public class Group { String name; Type type; }
and another bean
public class Type { String name; }
Now, I want to bind Group using JDBI's @BindBean:
@SqlBatch("INSERT INTO (type_id,name) VALUES((SELECT id FROM type WHERE name=:m.type.name),:m.name)")
@BatchChunkSize(100)
int[] insertRewardGroup(@BindBean("m") Set<Group> groups);
How can I bind the user-defined object's property as a member of the bean?
You could implement your own bind annotation here. I implemented one that I am adapting for this answer; it will unwrap all Type values.
I think it could be made fully generic with a little more work.
Your code would look like this (please note that m.type.name changed to m.type):
#SqlBatch("INSERT ... WHERE name=:m.type),:m.name)")
#BatchChunkSize(100)
int[] insertRewardGroup(#BindTypeBean ("m") Set<Group> groups);
This would be the annotation:
@BindingAnnotation(BindTypeBean.SomethingBinderFactory.class)
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.PARAMETER})
public @interface BindTypeBean {
String value() default "___jdbi_bare___";
public static class SomethingBinderFactory implements BinderFactory {
public Binder build(Annotation annotation) {
return new Binder<BindTypeBean, Object>() {
public void bind(SQLStatement q, BindTypeBean bind, Object arg) {
final String prefix;
if ("___jdbi_bare___".equals(bind.value())) {
prefix = "";
} else {
prefix = bind.value() + ".";
}
try {
BeanInfo infos = Introspector.getBeanInfo(arg.getClass());
PropertyDescriptor[] props = infos.getPropertyDescriptors();
for (PropertyDescriptor prop : props) {
Method readMethod = prop.getReadMethod();
if (readMethod != null) {
Object r = readMethod.invoke(arg);
Class<?> c = readMethod.getReturnType();
if (prop.getName().equals("type") && r instanceof Type) {
r = ((Type) r).getName();
c = r.getClass();
}
q.dynamicBind(c, prefix + prop.getName(), r);
}
}
} catch (Exception e) {
throw new IllegalStateException("unable to bind bean properties", e);
}
}
};
}
}
}
Doing this in JDBI is not possible; you have to bring the property out and pass it as a separate argument.
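Purely as an illustration of bringing the property out (table, column, and parameter names are my own placeholders, and a getName() accessor on Type is assumed):
// Bind the nested value explicitly instead of navigating m.type.name.
@SqlUpdate("INSERT INTO reward_group (type_id, name) " +
        "VALUES ((SELECT id FROM type WHERE name = :typeName), :m.name)")
int insertRewardGroup(@BindBean("m") Group group, @Bind("typeName") String typeName);

// Usage: dao.insertRewardGroup(group, group.getType().getName());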
I am looking for a way to do a query that requires a JOIN. Is there any way to do this in a prepared statement, or is the rawQuery the only option that I have? If rawQuery is the only option, then is there some way to automatically map the returned objects to the objects of the Dao being implemented?
I've dug through the documents and examples but cannot find anything that will allow me to map the raw database result to an ORM object class.
I am looking for a way to do a query that requires a JOIN.
ORMLite supports simple JOIN queries. You can also use raw-queries to accomplish this.
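As a minimal sketch of that join support (DAO names and the matched column are illustrative, assuming the usual foreign-field setup between the two tables):
// Join the order query with an account query; ORMLite derives the join clause
// from the foreign field between the two tables.
QueryBuilder<Account, Integer> accountQb = accountDao.queryBuilder();
accountQb.where().eq("name", "foo");
QueryBuilder<Order, Integer> orderQb = orderDao.queryBuilder();
List<Order> results = orderQb.join(accountQb).query();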
You can use the Dao.getRawRowMapper() to map the queries as you found or you can create a custom mapper. The documentation has the following sample code which shows how to map the String[] into your object:
GenericRawResults<Foo> rawResults =
orderDao.queryRaw(
"select account_id,sum(amount) from orders group by account_id",
new RawRowMapper<Foo>() {
public Foo mapRow(String[] columnNames,
String[] resultColumns) {
return new Foo(Long.parseLong(resultColumns[0]),
Integer.parseInt(resultColumns[1]));
}
});
I've found a way to auto map a result set to a model object.
// return the orders with the sum of their amounts per account
GenericRawResults<Order> rawResults =
    orderDao.queryRaw(query, orderDao.getRawRowMapper(), param1);
// page through the results
for (Order order : rawResults) {
System.out.println("Account-id " + order.accountId + " has "
+ order.totalOrders + " total orders");
}
rawResults.close();
The key is to pull the row mapper from your object Dao using getRawRowMapper(), which will handle the mapping for you. I hope this helps anyone who finds it.
I still would love the ability to do joins within the QueryBuilder but until that is supported, this is the next best thing in my opinion.
Raw query auto mapping
I had a problem mapping fields from a custom SELECT which returns columns that are not present in any table model, so I made a custom RawRowMapper which can map fields from a custom query onto a custom model. This is useful when you have a query whose fields don't correspond to any table mapping model.
This is the RawRowMapper which performs the query auto mapping:
public class GenericRowMapper<T> implements RawRowMapper<T> {
private Class<T> entityClass;
private Set<Field> fields = new HashSet<>();
private Map<String, Field> colNameFieldMap = new HashMap<>();
public GenericRowMapper(Class<T> entityClass) {
this.entityClass = entityClass;
Class cl = entityClass;
do {
for (Field field : cl.getDeclaredFields()) {
if (field.isAnnotationPresent(DatabaseField.class)) {
DatabaseField an = field.getAnnotation(DatabaseField.class);
fields.add(field);
colNameFieldMap.put(an.columnName(), field);
}
}
cl = cl.getSuperclass();
} while (cl != Object.class);
}
@Override
public T mapRow(String[] columnNames, String[] resultColumns) throws SQLException {
try {
T entity = entityClass.newInstance();
for (int i = 0; i < columnNames.length; i++) {
Field f = colNameFieldMap.get(columnNames[i]);
boolean accessible = f.isAccessible();
f.setAccessible(true);
f.set(entity, stringToJavaObject(f.getType(), resultColumns[i]));
f.setAccessible(accessible);
}
return entity;
} catch (InstantiationException e) {
throw new RuntimeException(e);
} catch (IllegalAccessException e) {
throw new RuntimeException(e);
}
}
public Object stringToJavaObject(Class cl, String result) {
if (result == null){
return null;
}else if (cl == Integer.class || int.class == cl) {
return Integer.parseInt(result);
} else if (cl == Float.class || float.class == cl) {
return Float.parseFloat(result);
} else if (cl == Double.class || double.class == cl) {
return Double.parseDouble(result);
} else if (cl == Boolean.class || cl == boolean.class) {
try{
return Integer.valueOf(result) > 0;
}catch (NumberFormatException e){
return Boolean.parseBoolean(result);
}
} else if (cl == Date.class) {
DateLongType lType = DateLongType.getSingleton();
DateStringType sType = DateStringType.getSingleton();
try {
return lType.resultStringToJava(null, result, -1);
} catch (NumberFormatException e) {
try {
return sType.resultStringToJava(null, result, -1);
} catch (SQLException e2) {
throw new RuntimeException(e);
}
}
} else {
return result;
}
}
}
And here is the usage:
class Model {
    @DatabaseField(columnName = "account_id")
    String accId;
    @DatabaseField(columnName = "amount")
    int amount;
}

String sql = "select account_id,sum(amount) amount from orders group by account_id";
return queryRaw(sql, new GenericRowMapper<>(Model.class)).getResults();
This will return a List<Model> with the result rows mapped to Model, provided the query column names and the @DatabaseField(columnName = ...) values are the same.
During a Hibernate Session, I am loading some objects and some of them are loaded as proxies due to lazy loading. It's all OK and I don't want to turn lazy loading off.
But later I need to send some of the objects (actually one object) to the GWT client via RPC. And it happens that this concrete object is a proxy. So I need to turn it into a real object. I can't find a method like "materialize" in Hibernate.
How can I turn some of the objects from proxies into real objects, knowing their class and ID?
At the moment the only solution I see is to evict that object from Hibernate's cache and reload it, but it is really bad for many reasons.
Here's a method I'm using.
public static <T> T initializeAndUnproxy(T entity) {
    if (entity == null) {
        throw new NullPointerException("Entity passed for initialization is null");
    }
    Hibernate.initialize(entity);
    if (entity instanceof HibernateProxy) {
        entity = (T) ((HibernateProxy) entity).getHibernateLazyInitializer()
                .getImplementation();
    }
    return entity;
}
Since Hibernate ORM 5.2.10, you can do it like this:
Object unproxiedEntity = Hibernate.unproxy(proxy);
Before Hibernate 5.2.10, the simplest way to do that was to use the unproxy method offered by Hibernate's internal PersistenceContext implementation:
Object unproxiedEntity = ((SessionImplementor) session)
.getPersistenceContext()
.unproxy(proxy);
Try to use Hibernate.getClass(obj)
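For example, Hibernate.getClass resolves the real entity class behind a proxy (for a non-proxy it simply returns the object's own class):
// Works whether 'entity' is a lazy proxy or a plain instance.
Class<?> realClass = Hibernate.getClass(entity);
System.out.println("Real entity class: " + realClass.getName());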
I've written the following code, which cleans an object from proxies (if they are not already initialized).
public class PersistenceUtils {
private static void cleanFromProxies(Object value, List<Object> handledObjects) {
if ((value != null) && (!isProxy(value)) && !containsTotallyEqual(handledObjects, value)) {
handledObjects.add(value);
if (value instanceof Iterable) {
for (Object item : (Iterable<?>) value) {
cleanFromProxies(item, handledObjects);
}
} else if (value.getClass().isArray()) {
for (Object item : (Object[]) value) {
cleanFromProxies(item, handledObjects);
}
}
BeanInfo beanInfo = null;
try {
beanInfo = Introspector.getBeanInfo(value.getClass());
} catch (IntrospectionException e) {
// LOGGER.warn(e.getMessage(), e);
}
if (beanInfo != null) {
for (PropertyDescriptor property : beanInfo.getPropertyDescriptors()) {
try {
if ((property.getWriteMethod() != null) && (property.getReadMethod() != null)) {
Object fieldValue = property.getReadMethod().invoke(value);
if (isProxy(fieldValue)) {
fieldValue = unproxyObject(fieldValue);
property.getWriteMethod().invoke(value, fieldValue);
}
cleanFromProxies(fieldValue, handledObjects);
}
} catch (Exception e) {
// LOGGER.warn(e.getMessage(), e);
}
}
}
}
}
public static <T> T cleanFromProxies(T value) {
T result = unproxyObject(value);
cleanFromProxies(result, new ArrayList<Object>());
return result;
}
private static boolean containsTotallyEqual(Collection<?> collection, Object value) {
if (CollectionUtils.isEmpty(collection)) {
return false;
}
for (Object object : collection) {
if (object == value) {
return true;
}
}
return false;
}
public static boolean isProxy(Object value) {
if (value == null) {
return false;
}
if ((value instanceof HibernateProxy) || (value instanceof PersistentCollection)) {
return true;
}
return false;
}
private static Object unproxyHibernateProxy(HibernateProxy hibernateProxy) {
Object result = hibernateProxy.writeReplace();
if (!(result instanceof SerializableProxy)) {
return result;
}
return null;
}
@SuppressWarnings("unchecked")
private static <T> T unproxyObject(T object) {
if (isProxy(object)) {
if (object instanceof PersistentCollection) {
PersistentCollection persistentCollection = (PersistentCollection) object;
return (T) unproxyPersistentCollection(persistentCollection);
} else if (object instanceof HibernateProxy) {
HibernateProxy hibernateProxy = (HibernateProxy) object;
return (T) unproxyHibernateProxy(hibernateProxy);
} else {
return null;
}
}
return object;
}
private static Object unproxyPersistentCollection(PersistentCollection persistentCollection) {
if (persistentCollection instanceof PersistentSet) {
return unproxyPersistentSet((Map<?, ?>) persistentCollection.getStoredSnapshot());
}
return persistentCollection.getStoredSnapshot();
}
private static <T> Set<T> unproxyPersistentSet(Map<T, ?> persistenceSet) {
return new LinkedHashSet<T>(persistenceSet.keySet());
}
}
I use this function on the results of my RPC services (via aspects), and it recursively cleans all result objects from proxies (if they are not initialized).
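Wiring it in via aspects could look roughly like this (my own sketch using Spring AOP; the pointcut expression and package names are placeholders):
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.stereotype.Component;

@Aspect
@Component
public class ProxyCleaningAspect {

    // Adjust the pointcut to match your RPC service layer.
    @Around("execution(* com.example.rpc..*Service.*(..))")
    public Object cleanResult(ProceedingJoinPoint pjp) throws Throwable {
        Object result = pjp.proceed();
        // Return the proxy-free copy of the result graph.
        return PersistenceUtils.cleanFromProxies(result);
    }
}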
The way I recommend with JPA 2:
Object unproxied = entityManager.unwrap(SessionImplementor.class).getPersistenceContext().unproxy(proxy);
Starting from Hibernate 5.2.10 you can use the Hibernate.unproxy method to convert a proxy to your real entity:
MyEntity myEntity = (MyEntity) Hibernate.unproxy(proxyMyEntity);
Another workaround is to call
Hibernate.initialize(extractedObject.getSubobjectToUnproxy());
just before closing the session.
With Spring Data JPA and Hibernate, I was using subinterfaces of JpaRepository to look up objects belonging to a type hierarchy that was mapped using the "join" strategy. Unfortunately, the queries were returning proxies of the base type instead of instances of the expected concrete types. This prevented me from casting the results to the correct types. Like you, I came here looking for an effective way to get my entites unproxied.
Vlad has the right idea for unproxying these results; Yannis provides a little more detail. Adding to their answers, here's the rest of what you might be looking for:
The following code provides an easy way to unproxy your proxied entities:
import org.hibernate.engine.spi.PersistenceContext;
import org.hibernate.engine.spi.SessionImplementor;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.jpa.repository.JpaContext;
import org.springframework.stereotype.Component;
@Component
public final class JpaHibernateUtil {
private static JpaContext jpaContext;
@Autowired
JpaHibernateUtil(JpaContext jpaContext) {
JpaHibernateUtil.jpaContext = jpaContext;
}
public static <Type> Type unproxy(Type proxied, Class<Type> type) {
PersistenceContext persistenceContext =
jpaContext
.getEntityManagerByManagedType(type)
.unwrap(SessionImplementor.class)
.getPersistenceContext();
Type unproxied = (Type) persistenceContext.unproxyAndReassociate(proxied);
return unproxied;
}
}
You can pass either unproxied entities or proxied entities to the unproxy method. If they are already unproxied, they'll simply be returned. Otherwise, they'll get unproxied and returned.
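Usage is then a one-liner (entity names are illustrative):
// Resolves the proxy to the concrete entity managed for this type,
// assuming proxiedEntity is referenced as MyEntity.
MyEntity unproxied = JpaHibernateUtil.unproxy(proxiedEntity, MyEntity.class);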
Hope this helps!
Thank you for the suggested solutions! Unfortunately, none of them worked for my case: receiving a list of CLOB objects from Oracle database through JPA - Hibernate, using a native query.
All of the proposed approaches gave me either a ClassCastException or just returned a Java proxy object (which deep inside contained the desired Clob).
So my solution is the following (based on several above approaches):
Query sqlQuery = manager.createNativeQuery(queryStr);
List resultList = sqlQuery.getResultList();
for ( Object resultProxy : resultList ) {
String unproxiedClob = unproxyClob(resultProxy);
if ( unproxiedClob != null ) {
resultCollection.add(unproxiedClob);
}
}
private String unproxyClob(Object proxy) {
try {
BeanInfo beanInfo = Introspector.getBeanInfo(proxy.getClass());
for (PropertyDescriptor property : beanInfo.getPropertyDescriptors()) {
Method readMethod = property.getReadMethod();
if ( readMethod.getName().contains("getWrappedClob") ) {
Object result = readMethod.invoke(proxy);
return clobToString((Clob) result);
}
}
}
catch (InvocationTargetException | IntrospectionException | IllegalAccessException | SQLException | IOException e) {
LOG.error("Unable to unproxy CLOB value.", e);
}
return null;
}
private String clobToString(Clob data) throws SQLException, IOException {
StringBuilder sb = new StringBuilder();
Reader reader = data.getCharacterStream();
BufferedReader br = new BufferedReader(reader);
String line;
while( null != (line = br.readLine()) ) {
sb.append(line);
}
br.close();
return sb.toString();
}
Hope this will help somebody!
I found a solution to deproxy a class using the standard Java and JPA APIs. It is tested with Hibernate, but does not require Hibernate as a dependency and should work with all JPA providers.
There is only one requirement: it is necessary to modify the parent class (Address) and add a simple helper method.
General idea: add a helper method to the parent class which returns itself. When the method is called on a proxy, it forwards the call to the real instance and returns that real instance.
The implementation is a little bit more complex, as Hibernate recognizes that the proxied class returns itself and still returns the proxy instead of the real instance. The workaround is to wrap the returned instance in a simple wrapper class, which has a different class type than the real instance.
In code:
class Address {
    public AddressWrapper getWrappedSelf() {
        return new AddressWrapper(this);
    }
    ...
}

class AddressWrapper {
    private Address wrappedAddress;
    ...
}
To cast an Address proxy to the real subclass, use the following:
Address address = dao.getSomeAddress(...);
Address deproxiedAddress = address.getWrappedSelf().getWrappedAddress();
if (deproxiedAddress instanceof WorkAddress) {
    WorkAddress workAddress = (WorkAddress) deproxiedAddress;
}