I'm trying to learn Gson and I'm struggling with field exclusion. Here are my classes:
public class Student {
private Long id;
private String firstName = "Philip";
private String middleName = "J.";
private String initials = "P.F";
private String lastName = "Fry";
private Country country;
private Country countryOfBirth;
}
public class Country {
private Long id;
private String name;
private Object other;
}
I can use the GsonBuilder and add an ExclusionStrategy for a field name like firstName or country but I can't seem to manage to exclude properties of certain fields like country.name.
Using the method public boolean shouldSkipField(FieldAttributes fa), FieldAttributes doesn't contain enough information to match the field with a filter like country.name.
P.S: I want to avoid annotations since I want to improve on this and use RegEx to filter fields out.
Edit: I'm trying to see if it's possible to emulate the behavior of the Struts2 JSON plugin using Gson:
<interceptor-ref name="json">
<param name="enableSMD">true</param>
<param name="excludeProperties">
login.password,
studentList.*\.sin
</param>
</interceptor-ref>
Edit:
I reopened the question with the following addition:
I added a second field with the same type to further clarify this problem. Basically I want to exclude country.name but not countryOfBirth.name. I also don't want to exclude Country as a type.
So the types are the same it's the actual place in the object graph that I want to pinpoint and exclude.
For any field you don't want serialized in general, use the transient modifier; this also applies to JSON serializers (at least the few that I have used, including Gson).
If you don't want name to show up in the serialized JSON, give it the transient keyword, e.g.:
private transient String name;
More details in the Gson documentation
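For example, a minimal runnable sketch (field values are illustrative):
import com.google.gson.Gson;

public class TransientDemo {
    private Long id = 91L;
    private transient String name = "secret"; // never serialized by Gson

    public static void main(String[] args) {
        System.out.println(new Gson().toJson(new TransientDemo())); // {"id":91}
    }
}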
Nishant provided a good solution, but there's an easier way. Simply mark the desired fields with the @Expose annotation, such as:
@Expose private Long id;
Leave out any fields that you do not want to serialize. Then just create your Gson object this way:
Gson gson = new GsonBuilder().excludeFieldsWithoutExposeAnnotation().create();
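For example, a minimal sketch using a trimmed-down Student (only the annotated field survives):
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import com.google.gson.annotations.Expose;

public class ExposeDemo {
    static class Student {
        @Expose private String firstName = "Philip"; // serialized
        private String lastName = "Fry";             // skipped: no @Expose
    }

    public static void main(String[] args) {
        Gson gson = new GsonBuilder().excludeFieldsWithoutExposeAnnotation().create();
        System.out.println(gson.toJson(new Student())); // {"firstName":"Philip"}
    }
}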
So, you want to exclude firstName and country.name. Here is what your ExclusionStrategy should look like:
public class TestExclStrat implements ExclusionStrategy {
public boolean shouldSkipClass(Class<?> arg0) {
return false;
}
public boolean shouldSkipField(FieldAttributes f) {
return (f.getDeclaringClass() == Student.class && f.getName().equals("firstName"))||
(f.getDeclaringClass() == Country.class && f.getName().equals("name"));
}
}
If you look closely, it returns true for Student.firstName and Country.name, which is what you want to exclude.
You need to apply this ExclusionStrategy like this:
Gson gson = new GsonBuilder()
.setExclusionStrategies(new TestExclStrat())
//.serializeNulls() <-- uncomment to serialize NULL fields as well
.create();
Student src = new Student();
String json = gson.toJson(src);
System.out.println(json);
This returns:
{ "middleName": "J.", "initials": "P.F", "lastName": "Fry", "country": { "id": 91}}
I assume the country object is initialized with id = 91L in the Student class.
You can get fancy. For example, say you do not want to serialize any field whose name contains the string "name". Do this:
public boolean shouldSkipField(FieldAttributes f) {
return f.getName().toLowerCase().contains("name");
}
This will return:
{ "initials": "P.F", "country": { "id": 91 }}
EDIT: Added more info as requested.
This ExclusionStrategy will do the trick, but you need to pass the fully qualified field name. See below:
public class TestExclStrat implements ExclusionStrategy {
private Class<?> c;
private String fieldName;
public TestExclStrat(String fqfn) throws SecurityException, NoSuchFieldException, ClassNotFoundException
{
this.c = Class.forName(fqfn.substring(0, fqfn.lastIndexOf(".")));
this.fieldName = fqfn.substring(fqfn.lastIndexOf(".")+1);
}
public boolean shouldSkipClass(Class<?> arg0) {
return false;
}
public boolean shouldSkipField(FieldAttributes f) {
return (f.getDeclaringClass() == c && f.getName().equals(fieldName));
}
}
Here is how we can use it generically.
Gson gson = new GsonBuilder()
.setExclusionStrategies(new TestExclStrat("in.naishe.test.Country.name"))
//.serializeNulls()
.create();
Student src = new Student();
String json = gson.toJson(src);
System.out.println(json);
It returns:
{ "firstName": "Philip" , "middleName": "J.", "initials": "P.F", "lastName": "Fry", "country": { "id": 91 }}
After reading all the available answers, I found that the most flexible approach, in my case, was to use a custom @Exclude annotation. So I implemented a simple strategy for this (I didn't want to mark all fields using @Expose, nor did I want to use transient, which conflicted with in-app Serializable serialization):
Annotation:
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
public @interface Exclude {
}
Strategy:
public class AnnotationExclusionStrategy implements ExclusionStrategy {
@Override
public boolean shouldSkipField(FieldAttributes f) {
return f.getAnnotation(Exclude.class) != null;
}
@Override
public boolean shouldSkipClass(Class<?> clazz) {
return false;
}
}
Usage:
new GsonBuilder().setExclusionStrategies(new AnnotationExclusionStrategy()).create();
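For instance, a small sketch (class and field names are hypothetical):

import com.google.gson.Gson;
import com.google.gson.GsonBuilder;

public class Account {
    private String username = "fry";
    @Exclude private String password = "secret"; // skipped by AnnotationExclusionStrategy

    public static void main(String[] args) {
        Gson gson = new GsonBuilder()
                .setExclusionStrategies(new AnnotationExclusionStrategy())
                .create();
        System.out.println(gson.toJson(new Account())); // {"username":"fry"}
    }
}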
I ran into this issue, in which I had a small number of fields I wanted to exclude only from serialization, so I developed a fairly simple solution that uses Gson's @Expose annotation with custom exclusion strategies.
The only built-in way to use @Expose is by setting GsonBuilder.excludeFieldsWithoutExposeAnnotation(), but as the name indicates, fields without an explicit @Expose are ignored. As I only had a few fields I wanted to exclude, I found the prospect of adding the annotation to every field very cumbersome.
I effectively wanted the inverse, in which everything was included unless I explicitly used @Expose to exclude it. I used the following exclusion strategies to accomplish this:
new GsonBuilder()
.addSerializationExclusionStrategy(new ExclusionStrategy() {
@Override
public boolean shouldSkipField(FieldAttributes fieldAttributes) {
final Expose expose = fieldAttributes.getAnnotation(Expose.class);
return expose != null && !expose.serialize();
}
@Override
public boolean shouldSkipClass(Class<?> aClass) {
return false;
}
})
.addDeserializationExclusionStrategy(new ExclusionStrategy() {
@Override
public boolean shouldSkipField(FieldAttributes fieldAttributes) {
final Expose expose = fieldAttributes.getAnnotation(Expose.class);
return expose != null && !expose.deserialize();
}
@Override
public boolean shouldSkipClass(Class<?> aClass) {
return false;
}
})
.create();
Now I can easily exclude a few fields with @Expose(serialize = false) or @Expose(deserialize = false) annotations (note that the default value for both @Expose attributes is true). You can of course use @Expose(serialize = false, deserialize = false), but that is more concisely accomplished by declaring the field transient instead (which does still take effect with these custom exclusion strategies).
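As a quick sketch of what that looks like on a class (field names are illustrative):

import com.google.gson.annotations.Expose;

public class Session {
    private String id;                                      // serialized and deserialized
    @Expose(serialize = false) private String clientSecret; // accepted on input, never written out
    @Expose(deserialize = false) private String serverTime; // written out, ignored on input
    private transient String cache;                         // excluded in both directions
}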
You can explore the JSON tree with Gson.
Try something like this:
gson.toJsonTree(student).getAsJsonObject()
.get("country").getAsJsonObject().remove("name");
You can also add some properties:
gson.toJsonTree(student).getAsJsonObject().addProperty("isGoodStudent", false);
Tested with gson 2.2.4.
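Putting both together, a sketch assuming the Student/Country classes from the question:

Gson gson = new Gson();
JsonObject tree = gson.toJsonTree(student).getAsJsonObject();
tree.getAsJsonObject("country").remove("name"); // drop country.name only
tree.addProperty("isGoodStudent", false);       // add an extra property
String json = gson.toJson(tree);                // countryOfBirth.name is untouched

Note that this is the one approach in the thread that naturally distinguishes country.name from countryOfBirth.name, because it operates on the serialized tree rather than on field metadata.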
I came up with a class factory to support this functionality. Pass in any combination of either fields or classes you want to exclude.
public class GsonFactory {
public static Gson build(final List<String> fieldExclusions, final List<Class<?>> classExclusions) {
GsonBuilder b = new GsonBuilder();
b.addSerializationExclusionStrategy(new ExclusionStrategy() {
@Override
public boolean shouldSkipField(FieldAttributes f) {
return fieldExclusions == null ? false : fieldExclusions.contains(f.getName());
}
@Override
public boolean shouldSkipClass(Class<?> clazz) {
return classExclusions == null ? false : classExclusions.contains(clazz);
}
});
return b.create();
}
}
To use, create two lists (each is optional), and create your GSON object:
private static final Gson GSON;
static {
List<String> fieldExclusions = new ArrayList<String>();
fieldExclusions.add("id");
fieldExclusions.add("provider");
fieldExclusions.add("products");
List<Class<?>> classExclusions = new ArrayList<Class<?>>();
classExclusions.add(Product.class);
GSON = GsonFactory.build(fieldExclusions, classExclusions);
}
public String getSomeJson(){
List<Provider> list = getEntitiesFromDatabase();
return GSON.toJson(list);
}
I solved this problem with custom annotations.
This is my "SkipSerialisation" Annotation class:
@Target(ElementType.FIELD)
@Retention(RetentionPolicy.RUNTIME) // without RUNTIME retention, Gson could not see the annotation
public @interface SkipSerialisation {
}
and this is my GsonBuilder:
gsonBuilder.addSerializationExclusionStrategy(new ExclusionStrategy() {
@Override public boolean shouldSkipField(FieldAttributes f) {
return f.getAnnotation(SkipSerialisation.class) != null;
}
@Override public boolean shouldSkipClass(Class<?> clazz) {
return false;
}
});
Example:
public class User implements Serializable {
public String firstName;
public String lastName;
@SkipSerialisation
public String email;
}
Kotlin's @Transient annotation also does the trick, apparently.
data class Json(
#field:SerializedName("serialized_field_1") val field1: String,
#field:SerializedName("serialized_field_2") val field2: String,
#Transient val field3: String
)
Output:
{"serialized_field_1":"VALUE1","serialized_field_2":"VALUE2"}
Alternatively, you can specify which fields will not be exposed with:
Gson gson = gsonBuilder.excludeFieldsWithModifiers(Modifier.TRANSIENT).create();
and mark the attribute in your class:
private transient boolean nameAttribute;
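A runnable sketch of that configuration (note: Gson skips transient and static fields by default anyway; excludeFieldsWithModifiers replaces that default set with exactly the modifiers you pass):

import java.lang.reflect.Modifier;
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;

public class ModifierDemo {
    private String name = "Fry";
    private transient String nameAttribute = "hidden"; // excluded by the TRANSIENT rule

    public static void main(String[] args) {
        Gson gson = new GsonBuilder()
                .excludeFieldsWithModifiers(Modifier.TRANSIENT)
                .create();
        System.out.println(gson.toJson(new ModifierDemo())); // {"name":"Fry"}
    }
}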
I used this strategy:
I excluded every field which is not marked with the @SerializedName annotation, i.e.:
public class Dummy {
#SerializedName("VisibleValue")
final String visibleValue;
final String hiddenValue;
public Dummy(String visibleValue, String hiddenValue) {
this.visibleValue = visibleValue;
this.hiddenValue = hiddenValue;
}
}
public class SerializedNameOnlyStrategy implements ExclusionStrategy {
@Override
public boolean shouldSkipField(FieldAttributes f) {
return f.getAnnotation(SerializedName.class) == null;
}
@Override
public boolean shouldSkipClass(Class<?> clazz) {
return false;
}
}
Gson gson = new GsonBuilder()
.setExclusionStrategies(new SerializedNameOnlyStrategy())
.create();
Dummy dummy = new Dummy("I will see this","I will not see this");
String json = gson.toJson(dummy);
It returns
{"VisibleValue":"I will see this"}
Another approach (especially useful if you need to make a decision to exclude a field at runtime) is to register a TypeAdapter with your gson instance. Example below:
Gson gson = new GsonBuilder()
.registerTypeAdapter(BloodPressurePost.class, new BloodPressurePostSerializer())
.create();
In the case below, the server expected one of two values, but since they were both ints, Gson would serialize them both. My goal was to omit any value that is zero (or less) from the JSON that is posted to the server.
public class BloodPressurePostSerializer implements JsonSerializer<BloodPressurePost> {
@Override
public JsonElement serialize(BloodPressurePost src, Type typeOfSrc, JsonSerializationContext context) {
final JsonObject jsonObject = new JsonObject();
if (src.systolic > 0) {
jsonObject.addProperty("systolic", src.systolic);
}
if (src.diastolic > 0) {
jsonObject.addProperty("diastolic", src.diastolic);
}
jsonObject.addProperty("units", src.units);
return jsonObject;
}
}
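A hedged usage sketch (assuming BloodPressurePost has public int fields systolic and diastolic and a String units):

BloodPressurePost post = new BloodPressurePost();
post.systolic = 120;
post.diastolic = 0; // zero: omitted by the serializer above
post.units = "mmHg";

Gson gson = new GsonBuilder()
        .registerTypeAdapter(BloodPressurePost.class, new BloodPressurePostSerializer())
        .create();
System.out.println(gson.toJson(post)); // {"systolic":120,"units":"mmHg"}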
For me it works just by putting the @Expose annotation; here are the versions I use:
compile 'com.squareup.retrofit2:retrofit:2.0.2'
compile 'com.squareup.retrofit2:converter-gson:2.0.2'
In the Model class:
@Expose
int number;
In the Adapter class:
public class AdapterRestApi {
public EndPointsApi connectRestApi() {
OkHttpClient client = new OkHttpClient.Builder()
.connectTimeout(90000, TimeUnit.SECONDS)
.readTimeout(90000,TimeUnit.SECONDS).build();
Retrofit retrofit = new Retrofit.Builder()
.baseUrl(ConstantRestApi.ROOT_URL)
.addConverterFactory(GsonConverterFactory.create())
.client(client)
.build();
return retrofit.create(EndPointsApi.class);
}
}
I have a Kotlin version:
@Retention(AnnotationRetention.RUNTIME)
@Target(AnnotationTarget.FIELD)
internal annotation class JsonSkip
class SkipFieldsStrategy : ExclusionStrategy {
override fun shouldSkipClass(clazz: Class<*>): Boolean {
return false
}
override fun shouldSkipField(f: FieldAttributes): Boolean {
return f.getAnnotation(JsonSkip::class.java) != null
}
}
and here is how you can add this to the Retrofit GsonConverterFactory:
val gson = GsonBuilder()
.setExclusionStrategies(SkipFieldsStrategy())
//.serializeNulls()
//.setDateFormat(DateFormat.LONG)
//.setFieldNamingPolicy(FieldNamingPolicy.UPPER_CAMEL_CASE)
//.setPrettyPrinting()
//.registerTypeAdapter(Id.class, IdTypeAdapter())
.create()
return GsonConverterFactory.create(gson)
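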
This is what I always use:
The default behaviour implemented in Gson is that null object fields are ignored.
That means the Gson object does not serialize fields with null values to JSON; if a field in a Java object is null, Gson excludes it.
You can use the following function to convert empty objects and empty strings to null, so that they are excluded as well:
/**
 * Convert an object to JSON.
 */
public String toJson(Object obj) {
// Convert empty strings and objects to null so we don't serialize them
setEmptyStringsAndObjectsToNull(obj);
return gson.toJson(obj);
}
/**
 * Sets all empty strings and objects (all fields null), including sets, to null.
 *
 * @param obj any object
 */
public void setEmptyStringsAndObjectsToNull(Object obj) {
for (Field field : obj.getClass().getDeclaredFields()) {
field.setAccessible(true);
try {
Object fieldObj = field.get(obj);
if (fieldObj != null) {
Class fieldType = field.getType();
if (fieldType.isAssignableFrom(String.class)) {
if(fieldObj.equals("")) {
field.set(obj, null);
}
} else if (fieldType.isAssignableFrom(Set.class)) {
for (Object item : (Set) fieldObj) {
setEmptyStringsAndObjectsToNull(item);
}
boolean allItemsNull = true;
for (Object item : (Set) field.get(obj)) {
if (item != null) {
allItemsNull = false;
break;
}
}
if (allItemsNull) {
setFieldToNull(obj, field);
}
} else if (!isPrimitiveOrWrapper(fieldType)) {
setEmptyStringsAndObjectsToNull(fieldObj);
boolean allFieldsNull = true;
for (Field f : fieldObj.getClass().getDeclaredFields()) {
f.setAccessible(true);
if (f.get(fieldObj) != null) {
allFieldsNull = false;
break;
}
}
if (allFieldsNull) {
setFieldToNull(obj, field);
}
}
}
} catch (IllegalAccessException e) {
System.err.println("Error while setting empty string or object to null: " + e.getMessage());
}
}
}
private void setFieldToNull(Object obj, Field field) throws IllegalAccessException {
if(!Modifier.isFinal(field.getModifiers())) {
field.set(obj, null);
}
}
private boolean isPrimitiveOrWrapper(Class fieldType) {
return fieldType.isPrimitive()
|| fieldType.isAssignableFrom(Integer.class)
|| fieldType.isAssignableFrom(Boolean.class)
|| fieldType.isAssignableFrom(Byte.class)
|| fieldType.isAssignableFrom(Character.class)
|| fieldType.isAssignableFrom(Float.class)
|| fieldType.isAssignableFrom(Long.class)
|| fieldType.isAssignableFrom(Double.class)
|| fieldType.isAssignableFrom(Short.class);
}
In Kotlin you can use @Transient to ignore the field, e.g.:
data class MyClass(
@Transient var myVar: Boolean = false
//....
)
Use a different DTO for the cached object.
For example, you can create a UserCached class and keep only the fields you need there.
After that, create a mapper to map objects back and forth. MapStruct is good for that.
Such an approach solves the problem, decouples your application, and makes changes to your primary DTO safer.
I am trying to map an A-DTO object to an A-DO object, each having a collection (a List) of T-DTOs and T-DOs, respectively. I am trying to do it in the context of a REST API. It's a separate question whether it's the right approach - the problem I'm solving is a case of update. Basically, if one of the T-DTOs inside the A-DTO changes, I want that change to be mapped onto the corresponding T-DO inside the A-DO.
I found relationship-type="non-cumulative" in Dozer documentation, so that the object inside the collection is updated, if present. But I end up with Dozer inserting a new T-DO into the A-DO's collection!
NOTE: I did implement equals! It is based on the primary key only, for now.
Any ideas?
PS: and, if you think this is a bad idea to handle updates to a one-to-many dependent entity, feel free to point that out.. I'm not 100% sure I like that approach, but my REST foo is not very strong.
UPDATE
equals implementation:
@Override
public boolean equals(Object obj) {
if (obj instanceof MyDOClass) {
MyDOClass other = (MyDOClass) obj;
return other.getId().equals(this.getId());
}
return false;
}
I just had the same problem and I solved it:
Dozer uses contains to determine if a member is inside a collection.
You should implement hashCode so that "contains" will work appropriately.
You can see this in the following documentation page:
http://dozer.sourceforge.net/documentation/collectionandarraymapping.html
Under: "Cumulative vs. Non-Cumulative List Mapping (bi-directional)"
Good luck!
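For the equals shown in the question's update, a consistent hashCode would look something like this (a minimal sketch, matching on the primary key only):

@Override
public int hashCode() {
    return getId() == null ? 0 : getId().hashCode();
}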
Ended up doing a custom mapping.
I ended up doing my own AbstractConverter; please find it below.
It has some constraints which are suitable for me (possibly not for you):
- it will update based on the "sameId" implementation;
- it will remove orphans (elements from the destination not in the source);
- it only works on List (enough for my needs).
While the converter manages the decision to update, the mapping of the objects is delegated back to Dozer, so you don't need to implement the mapping of the elements in your list.
Sample use
public class MyConverter extends AbstractListConverter<ClassX,ClassY>{
public MyConverter(){ super(ClassX.class, ClassY.class);}
@Override
protected boolean sameId(ClassX o1, ClassY o2) {
return // your custom comparison here... true means the o2 and o1 can update each other.
}
}
Declaration in mapper.xml
<mapping>
<class-a>x.y.z.AClass</class-a>
<class-b>a.b.c.AnotherClass</class-b>
<field custom-converter="g.e.MyConverter">
<a>ListField</a>
<b>OtherListField</b>
</field>
</mapping>
public abstract class AbstractListConverter<A, B> implements MapperAware, CustomConverter {
private Mapper mapper;
private Class<A> prototypeA;
private Class<B> prototypeB;
@Override
public void setMapper(Mapper mapper) {
this.mapper = mapper;
}
AbstractListConverter(Class<A> prototypeA, Class<B> prototypeB) {
this.prototypeA = prototypeA;
this.prototypeB = prototypeB;
}
@Override
public Object convert(Object destination, Object source, Class<?> destinationClass, Class<?> sourceClass) {
if (destinationClass == null || sourceClass == null || source == null) {
return null;
}
if (List.class.isAssignableFrom(sourceClass) && List.class.isAssignableFrom(destinationClass)) {
if (destination == null || ((List) destination).size() == 0) {
return produceNewList((List) source, destinationClass);
}
return mergeList((List) source, (List) destination, destinationClass);
}
throw new Error("This specific mapper is only to be used when both source and destination are of type java.util.List");
}
private boolean same(Object o1, Object o2) {
if (prototypeA.isAssignableFrom(o1.getClass()) && prototypeB.isAssignableFrom(o2.getClass())) {
return sameId((A) o1, (B) o2);
}
if (prototypeB.isAssignableFrom(o1.getClass()) && prototypeA.isAssignableFrom(o2.getClass())) {
return sameId((A) o2, (B) o1);
}
return false;
}
abstract protected boolean sameId(A o, B t);
private List mergeList(List source, List destination, Class<?> destinationClass) {
return (List)
source.stream().map(from -> {
Optional to = destination.stream().filter(search -> same(from, search)).findFirst();
if (to.isPresent()) {
Object ret = to.get();
mapper.map(from, ret);
return ret;
} else {
return create(from);
}
}
).collect(Collectors.toList());
}
private List produceNewList(List source, Class<?> destinationClass) {
if (source.size() == 0) return source;
return (List) source.stream().map(o -> create(o)).collect(Collectors.toList());
}
private Object create(Object o) {
if (prototypeA.isAssignableFrom(o.getClass())) {
return mapper.map(o, prototypeB);
}
if (prototypeB.isAssignableFrom(o.getClass())) {
return mapper.map(o, prototypeA);
}
return null;
}
}
I have an enum:
enum Type {
LIVE, UPCOMING, REPLAY
}
And some JSON:
{
"type": "live"
}
And a class:
class Event {
Type type;
}
When I try to deserialize the JSON using Gson, I receive null for the Event type field, since the case of the type field in the JSON does not match that of the enum.
Event event = new Gson().fromJson(json, Event.class);
If I change the enum to the following, then all works fine:
enum Type {
live, upcoming, replay
}
However, I would like to leave the enum constants as all uppercase.
I'm assuming I need to write an adapter but haven't found any good documentation or examples.
What is the best solution?
Edit:
I was able to get a JsonDeserializer working. Is there a more generic way to write this though, as it would be unfortunate to have to write this each time there is a case mismatch between enum values and JSON strings.
protected static class TypeCaseInsensitiveEnumAdapter implements JsonDeserializer<Type> {
@Override
public Type deserialize(JsonElement json, java.lang.reflect.Type classOfT, JsonDeserializationContext context)
throws JsonParseException {
return Type.valueOf(json.getAsString().toUpperCase());
}
}
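Registering it (a sketch, using the Event and Type from the question):

Gson gson = new GsonBuilder()
        .registerTypeAdapter(Type.class, new TypeCaseInsensitiveEnumAdapter())
        .create();
Event event = gson.fromJson("{\"type\":\"live\"}", Event.class); // event.type == Type.LIVE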
A simpler way I found (just now) to do this is to use the @SerializedName annotation. I found it in the EnumTest.java here (the Gender class, around ln 195):
https://code.google.com/p/google-gson/source/browse/trunk/gson/src/test/java/com/google/gson/functional/EnumTest.java?r=1230
This assumes that all of your Types will come in as lowercase, as opposed to being truly "case insensitive".
public enum Type {
#SerializedName("live")
LIVE,
#SerializedName("upcoming")
UPCOMING,
#SerializedName("replay")
REPLAY;
}
This was the simplest and most generic way I found to do this. Hope it helps you.
Now you can add multiple values for @SerializedName like this:
public enum Type {
#SerializedName(value = "live", alternate = {"LIVE"})
LIVE,
#SerializedName(value = "upcoming", alternate = {"UPCOMING"})
UPCOMING,
#SerializedName(value = "replay", alternate = {"REPLAY"})
REPLAY;
}
I think it's a bit late for you but I hope it will help anyone else!
Conveniently for you, this is very close to the example given in TypeAdapterFactory's Javadoc:
public class CaseInsensitiveEnumTypeAdapterFactory implements TypeAdapterFactory {
public <T> TypeAdapter<T> create(Gson gson, TypeToken<T> type) {
Class<T> rawType = (Class<T>) type.getRawType();
if (!rawType.isEnum()) {
return null;
}
final Map<String, T> lowercaseToConstant = new HashMap<String, T>();
for (T constant : rawType.getEnumConstants()) {
lowercaseToConstant.put(toLowercase(constant), constant);
}
return new TypeAdapter<T>() {
public void write(JsonWriter out, T value) throws IOException {
if (value == null) {
out.nullValue();
} else {
out.value(toLowercase(value));
}
}
public T read(JsonReader reader) throws IOException {
if (reader.peek() == JsonToken.NULL) {
reader.nextNull();
return null;
} else {
return lowercaseToConstant.get(toLowercase(reader.nextString()));
}
}
};
}
private String toLowercase(Object o) {
return o.toString().toLowerCase(Locale.US);
}
}
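Register the factory once and it applies to every enum type (a sketch):

Gson gson = new GsonBuilder()
        .registerTypeAdapterFactory(new CaseInsensitiveEnumTypeAdapterFactory())
        .create();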
This is a rather old question, but the accepted answer didn't work for me, and using @SerializedName is not enough because I want to make sure I can match "value", "Value" and "VALUE".
I managed to make a generic Adapter based on the code posted in the question:
public class UppercaseEnumAdapter implements JsonDeserializer<Enum> {
@Override
public Enum deserialize(JsonElement json, java.lang.reflect.Type type, JsonDeserializationContext context)
throws JsonParseException {
try {
if(type instanceof Class && ((Class<?>) type).isEnum())
return Enum.valueOf((Class<Enum>) type, json.getAsString().toUpperCase());
return null;
} catch (Exception e) {
e.printStackTrace();
return null;
}
}
}
And to use it:
GsonBuilder gsonBuilder = new GsonBuilder();
gsonBuilder.registerTypeAdapter(MyEnum.class, new UppercaseEnumAdapter());
Gson gson = gsonBuilder.create();
I have a Dozer mapping with a custom converter:
<mapping>
<class-a>com.xyz.Customer</class-a>
<class-b>com.xyz.CustomerDAO</class-b>
<field custom-converter="com.xyz.DozerEmptyString2NullConverter">
<a>customerName</a>
<b>customerName</b>
</field>
</mapping>
And the converter:
public class DozerEmptyString2NullConverter extends DozerConverter<String, String> {
public DozerEmptyString2NullConverter() {
super(String.class, String.class);
}
public String convertFrom(String source, String destination) {
String ret = null;
if (source != null) {
if (!source.equals(""))
{
ret = StringFormatter.wildcard(source);
}
}
return ret;
}
public String convertTo(String source, String destination) {
return source;
}
}
When I call the mapper in one direction (Customer -> CustomerDAO), the method convertTo is called.
Since Dozer is able to handle bi-directional mapping, I expect that, as soon as I call the mapper in the opposite direction, the method convertFrom will be called.
But the method convertFrom is never called.
I suspect that the problem is that both types are Strings - but how can I make this work?
As a workaround I created two one-way mappings; is this the standard solution, or is the behavior a bug?
Yes, the problem is that your source and destination classes are the same. Here is the dozer source for DozerConverter:
public Object convert(Object existingDestinationFieldValue, Object sourceFieldValue, Class<?> destinationClass, Class<?> sourceClass) {
Class<?> wrappedDestinationClass = ClassUtils.primitiveToWrapper(destinationClass);
Class<?> wrappedSourceClass = ClassUtils.primitiveToWrapper(sourceClass);
if (prototypeA.equals(wrappedDestinationClass)) {
return convertFrom((B) sourceFieldValue, (A) existingDestinationFieldValue);
} else if (prototypeB.equals(wrappedDestinationClass)) {
return convertTo((A) sourceFieldValue, (B) existingDestinationFieldValue);
} else if (prototypeA.equals(wrappedSourceClass)) {
return convertTo((A) sourceFieldValue, (B) existingDestinationFieldValue);
} else if (prototypeB.equals(wrappedSourceClass)) {
return convertFrom((B) sourceFieldValue, (A) existingDestinationFieldValue);
} else if (prototypeA.isAssignableFrom(wrappedDestinationClass)) {
return convertFrom((B) sourceFieldValue, (A) existingDestinationFieldValue);
} else if (prototypeB.isAssignableFrom(wrappedDestinationClass)) {
return convertTo((A) sourceFieldValue, (B) existingDestinationFieldValue);
} else if (prototypeA.isAssignableFrom(wrappedSourceClass)) {
return convertTo((A) sourceFieldValue, (B) existingDestinationFieldValue);
} else if (prototypeB.isAssignableFrom(wrappedSourceClass)) {
return convertFrom((B) sourceFieldValue, (A) existingDestinationFieldValue);
} else {
throw new MappingException("Destination Type (" + wrappedDestinationClass.getName()
+ ") is not accepted by this Custom Converter ("
+ this.getClass().getName() + ")!");
}
}
Instead of using the convertFrom and convertTo methods (which are part of the new API), do it the original way, in which you implement CustomConverter.convert, as shown in the tutorial.
I had the same problem, and currently (as of Dozer 5.5.x) there's no simple way, but there is a complex one.
Note that it relies on having no security manager enabled in the JVM, or else you will need to add a few permissions in the security rules. That's because this solution uses reflection to access private fields of Dozer classes.
You need to extend 2 classes: DozerBeanMapper and MappingProcessor. You will also need an enum for the direction and an interface to get the direction from the above classes.
The enum:
public enum Direction {
TO,
FROM;
}
The interface:
public interface DirectionAware {
Direction getDirection();
}
The class extending DozerBeanMapper:
public class DirectionAwareDozerBeanMapper extends DozerBeanMapper implements DirectionAware {
private Direction direction;
public DirectionAwareDozerBeanMapper(Direction direction) {
super();
this.direction = direction;
}
public DirectionAwareDozerBeanMapper(Direction direction, List<String> mappingFiles) {
super(mappingFiles);
this.direction = direction;
}
@Override
protected Mapper getMappingProcessor() {
try {
Method m = DozerBeanMapper.class.getDeclaredMethod("initMappings");
m.setAccessible(true);
m.invoke(this);
} catch (NoSuchMethodException|SecurityException|IllegalAccessException|IllegalArgumentException|InvocationTargetException e) {
// Handle the exception as you want
}
ClassMappings arg1 = (ClassMappings) getField("customMappings");
Configuration arg2 = (Configuration) getField("globalConfiguration");
CacheManager arg3 = (CacheManager)getField("cacheManager");
StatisticsManager arg4 = (StatisticsManager)getField("statsMgr");
List<CustomConverter> arg5 = (List<CustomConverter>)getField("customConverters");
DozerEventManager arg6 = (DozerEventManager)getField("eventManager");
Map<String, CustomConverter> arg7 = (Map<String, CustomConverter>)getField("customConvertersWithId");
Mapper mapper = new DirectionAwareMappingProcessor(arg1, arg2, arg3, arg4, arg5,
arg6, getCustomFieldMapper(), arg7, direction);
return mapper;
}
private Object getField(String fieldName) {
try {
Field field = DozerBeanMapper.class.getDeclaredField(fieldName);
field.setAccessible(true);
return field.get(this);
} catch (NoSuchFieldException|SecurityException|IllegalArgumentException|IllegalAccessException e) {
// Handle the exception as you want
}
return null;
}
public Direction getDirection() {
return direction;
}
}
The class extending MappingProcessor:
public class DirectionAwareMappingProcessor extends MappingProcessor implements DirectionAware {
private Direction direction;
protected DirectionAwareMappingProcessor(ClassMappings arg1, Configuration arg2, CacheManager arg3, StatisticsManager arg4, List<CustomConverter> arg5, DozerEventManager arg6, CustomFieldMapper arg7, Map<String, CustomConverter> arg8, Direction direction) {
super(arg1, arg2, arg3, arg4, arg5, arg6, arg7, arg8);
this.direction = direction;
}
public Direction getDirection() {
return direction;
}
}
Now, the usage.
1) Every time you want to map the same primitive type (for example String-String), use a DozerConverter with that type for both arguments as a custom converter in your Dozer mappings file. The implementation of such a converter should extend DozerConverter<String, String> and implement the MapperAware interface. It is important that you have MapperAware available, because having the mapper you will be able to cast it to DirectionAware and then get the direction.
For example:
public class MyMapper extends DozerConverter<String, String> implements MapperAware {
private DirectionAware dirAware;
public MyMapper() {
super(String.class, String.class);
}
@Override
public Object convert(Object existingDestinationFieldValue, Object sourceFieldValue, Class<?> destinationClass, Class<?> sourceClass) {
if (dirAware.getDirection() == Direction.FROM) {
// TODO convert sourceFieldValue for "FROM" direction and return it
} else {
// TODO convert sourceFieldValue for "TO" direction and return it
}
}
@Override
public void setMapper(Mapper mapper) {
dirAware = (DirectionAware)mapper;
}
}
2) You need to create 2 global Dozer mapper objects, one per mapping direction. They should be configured with the same mapping files but with a different direction argument. For example:
DirectionAwareDozerBeanMapper mapperFrom = new DirectionAwareDozerBeanMapper(Direction.FROM, mappingFiles);
DirectionAwareDozerBeanMapper mapperTo = new DirectionAwareDozerBeanMapper(Direction.TO, mappingFiles);
Of course you will need to use the proper mapper (from/to) to tell the custom converters which direction you're mapping in.
I ran into the same kind of issue a couple of years later, and somehow the DozerConverter API, which is the newer API, still does not work properly bi-directionally!
So, rather than getting into all the complex solutions advised here, I also created two one-way mappings to get over this issue, and then my conversions started working. I am using the DozerConverter API like below:
public class MapToStringConverter extends DozerConverter
During a Hibernate Session, I am loading some objects and some of them are loaded as proxies due to lazy loading. It's all OK and I don't want to turn lazy loading off.
But later I need to send some of the objects (actually one object) to the GWT client via RPC. And it happens that this concrete object is a proxy. So I need to turn it into a real object. I can't find a method like "materialize" in Hibernate.
How can I turn some of the objects from proxies into real objects, knowing their class and ID?
At the moment the only solution I see is to evict that object from Hibernate's cache and reload it, but it is really bad for many reasons.
Here's a method I'm using.
public static <T> T initializeAndUnproxy(T entity) {
if (entity == null) {
throw new NullPointerException("Entity passed for initialization is null");
}
Hibernate.initialize(entity);
if (entity instanceof HibernateProxy) {
entity = (T) ((HibernateProxy) entity).getHibernateLazyInitializer()
.getImplementation();
}
return entity;
}
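For example (a sketch with hypothetical Book/Author entities):

// book.getAuthor() may return a HibernateProxy; after this call we hold the real entity
Author author = initializeAndUnproxy(book.getAuthor());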
Since Hibernate ORM 5.2.10, you can do it like this:
Object unproxiedEntity = Hibernate.unproxy(proxy);
Before Hibernate 5.2.10, the simplest way to do that was to use the unproxy method offered by Hibernate's internal PersistenceContext implementation:
Object unproxiedEntity = ((SessionImplementor) session)
.getPersistenceContext()
.unproxy(proxy);
Try to use Hibernate.getClass(obj)
I've written the following code, which cleans an object from proxies (if they are not already initialized):
public class PersistenceUtils {
private static void cleanFromProxies(Object value, List<Object> handledObjects) {
if ((value != null) && (!isProxy(value)) && !containsTotallyEqual(handledObjects, value)) {
handledObjects.add(value);
if (value instanceof Iterable) {
for (Object item : (Iterable<?>) value) {
cleanFromProxies(item, handledObjects);
}
} else if (value.getClass().isArray()) {
for (Object item : (Object[]) value) {
cleanFromProxies(item, handledObjects);
}
}
BeanInfo beanInfo = null;
try {
beanInfo = Introspector.getBeanInfo(value.getClass());
} catch (IntrospectionException e) {
// LOGGER.warn(e.getMessage(), e);
}
if (beanInfo != null) {
for (PropertyDescriptor property : beanInfo.getPropertyDescriptors()) {
try {
if ((property.getWriteMethod() != null) && (property.getReadMethod() != null)) {
Object fieldValue = property.getReadMethod().invoke(value);
if (isProxy(fieldValue)) {
fieldValue = unproxyObject(fieldValue);
property.getWriteMethod().invoke(value, fieldValue);
}
cleanFromProxies(fieldValue, handledObjects);
}
} catch (Exception e) {
// LOGGER.warn(e.getMessage(), e);
}
}
}
}
}
public static <T> T cleanFromProxies(T value) {
T result = unproxyObject(value);
cleanFromProxies(result, new ArrayList<Object>());
return result;
}
private static boolean containsTotallyEqual(Collection<?> collection, Object value) {
if (CollectionUtils.isEmpty(collection)) {
return false;
}
for (Object object : collection) {
if (object == value) {
return true;
}
}
return false;
}
public static boolean isProxy(Object value) {
if (value == null) {
return false;
}
if ((value instanceof HibernateProxy) || (value instanceof PersistentCollection)) {
return true;
}
return false;
}
private static Object unproxyHibernateProxy(HibernateProxy hibernateProxy) {
Object result = hibernateProxy.writeReplace();
if (!(result instanceof SerializableProxy)) {
return result;
}
return null;
}
#SuppressWarnings("unchecked")
private static <T> T unproxyObject(T object) {
if (isProxy(object)) {
if (object instanceof PersistentCollection) {
PersistentCollection persistentCollection = (PersistentCollection) object;
return (T) unproxyPersistentCollection(persistentCollection);
} else if (object instanceof HibernateProxy) {
HibernateProxy hibernateProxy = (HibernateProxy) object;
return (T) unproxyHibernateProxy(hibernateProxy);
} else {
return null;
}
}
return object;
}
private static Object unproxyPersistentCollection(PersistentCollection persistentCollection) {
if (persistentCollection instanceof PersistentSet) {
return unproxyPersistentSet((Map<?, ?>) persistentCollection.getStoredSnapshot());
}
return persistentCollection.getStoredSnapshot();
}
private static <T> Set<T> unproxyPersistentSet(Map<T, ?> persistenceSet) {
return new LinkedHashSet<T>(persistenceSet.keySet());
}
}
I use this function on the results of my RPC services (via aspects) and it recursively cleans all result objects from proxies (if they are not initialized).
The way I recommend with JPA 2:
Object unproxied = entityManager.unwrap(SessionImplementor.class).getPersistenceContext().unproxy(proxy);
Starting from Hibernate 5.2.10 you can use the Hibernate.unproxy method to convert a proxy to your real entity:
MyEntity myEntity = (MyEntity) Hibernate.unproxy( proxyMyEntity );
Another workaround is to call
Hibernate.initialize(extractedObject.getSubobjectToUnproxy());
just before closing the session.
With Spring Data JPA and Hibernate, I was using subinterfaces of JpaRepository to look up objects belonging to a type hierarchy that was mapped using the "join" strategy. Unfortunately, the queries were returning proxies of the base type instead of instances of the expected concrete types. This prevented me from casting the results to the correct types. Like you, I came here looking for an effective way to get my entites unproxied.
Vlad has the right idea for unproxying these results; Yannis provides a little more detail. Adding to their answers, here's the rest of what you might be looking for:
The following code provides an easy way to unproxy your proxied entities:
import org.hibernate.engine.spi.PersistenceContext;
import org.hibernate.engine.spi.SessionImplementor;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.jpa.repository.JpaContext;
import org.springframework.stereotype.Component;
@Component
public final class JpaHibernateUtil {
private static JpaContext jpaContext;
@Autowired
JpaHibernateUtil(JpaContext jpaContext) {
JpaHibernateUtil.jpaContext = jpaContext;
}
public static <Type> Type unproxy(Type proxied, Class<Type> type) {
PersistenceContext persistenceContext =
jpaContext
.getEntityManagerByManagedType(type)
.unwrap(SessionImplementor.class)
.getPersistenceContext();
Type unproxied = (Type) persistenceContext.unproxyAndReassociate(proxied);
return unproxied;
}
}
You can pass either unproxied entites or proxied entities to the unproxy method. If they are already unproxied, they'll simply be returned. Otherwise, they'll get unproxied and returned.
Hope this helps!
Thank you for the suggested solutions! Unfortunately, none of them worked for my case: receiving a list of CLOB objects from an Oracle database through JPA - Hibernate, using a native query.
All of the proposed approaches gave me either a ClassCastException or just returned a Java Proxy object (which deep inside contained the desired Clob).
So my solution is the following (based on several of the approaches above):
Query sqlQuery = manager.createNativeQuery(queryStr);
List resultList = sqlQuery.getResultList();
List<String> resultCollection = new ArrayList<>(); // collect the unproxied CLOB values
for ( Object resultProxy : resultList ) {
String unproxiedClob = unproxyClob(resultProxy);
if ( unproxiedClob != null ) {
resultCollection.add(unproxiedClob);
}
}
private String unproxyClob(Object proxy) {
try {
BeanInfo beanInfo = Introspector.getBeanInfo(proxy.getClass());
for (PropertyDescriptor property : beanInfo.getPropertyDescriptors()) {
Method readMethod = property.getReadMethod();
if ( readMethod.getName().contains("getWrappedClob") ) {
Object result = readMethod.invoke(proxy);
return clobToString((Clob) result);
}
}
}
catch (InvocationTargetException | IntrospectionException | IllegalAccessException | SQLException | IOException e) {
LOG.error("Unable to unproxy CLOB value.", e);
}
return null;
}
private String clobToString(Clob data) throws SQLException, IOException {
StringBuilder sb = new StringBuilder();
Reader reader = data.getCharacterStream();
BufferedReader br = new BufferedReader(reader);
String line;
while( null != (line = br.readLine()) ) {
sb.append(line);
}
br.close();
return sb.toString();
}
Hope this will help somebody!
I found a solution to deproxy a class using the standard Java and JPA API. Tested with Hibernate, but it does not require Hibernate as a dependency and should work with all JPA providers.
There is only one requirement: it's necessary to modify the parent class (Address) and add a simple helper method.
General idea: add a helper method to the parent class which returns itself. When the method is called on a proxy, it forwards the call to the real instance and returns this real instance.
Implementation is a little bit more complex, as Hibernate recognizes that the proxied class returns itself and still returns the proxy instead of the real instance. The workaround is to wrap the returned instance into a simple wrapper class, which has a different class type than the real instance.
In code:
class Address {
public AddressWrapper getWrappedSelf() {
return new AddressWrapper(this);
}
...
}
class AddressWrapper {
private Address wrappedAddress;
...
}
To cast the Address proxy to the real subclass, use the following:
Address address = dao.getSomeAddress(...);
Address deproxiedAddress = address.getWrappedSelf().getWrappedAddress();
if (deproxiedAddress instanceof WorkAddress) {
WorkAddress workAddress = (WorkAddress)deproxiedAddress;
}