Intercept Deserialization in Jackson - java

I want to hook into Jackson's deserialization to optionally deserialize a different JSON document than the one provided. That seems like a really weird use case so let me explain.
I am using the Amazon SQS Extended Client to put messages that are too large for SQS on S3 instead, sending a message that looks like this through SQS:
["com.amazon.sqs.javamessaging.MessageS3Pointer",{"s3BucketName":"my-bucket","s3Key":"f5a0fa29-7f9c-4852-8bbb-53697799efe2"}]
An Elastic Beanstalk worker is listening on the other end, which means those messages are POSTed to a Jersey endpoint my application maintains. Since the messages are POSTed rather than retrieved with an SQS receiveMessage call, the extended client will not fetch the message from S3 itself.
I was thinking it would be pretty clever to make a custom JsonDeserializer that would look at the message to see if it was an S3 pointer, download that file, and deserialize it. Otherwise, just deserialize the provided message. However, that isn't working out quite as smoothly as I hoped.
Here is what I have so far:
public class SQSS3Deserializer<T> extends JsonDeserializer<T> {
    private static final String s3PointerHeader = "com.amazon.sqs.javamessaging.MessageS3Pointer";

    private Class<T> type;
    private ObjectMapper mapper = new ObjectMapper();

    public SQSS3Deserializer() {
        super();
        type = getParameterizedTypeArgument();
    }

    @Override
    public T deserialize(JsonParser jp, DeserializationContext dc) throws IOException, JsonProcessingException {
        if (jp.isExpectedStartArrayToken()) {
            jp.nextToken();
            if (s3PointerHeader.equals(jp.getValueAsString())) {
                jp.nextToken();
                S3Pointer p = jp.readValueAs(S3Pointer.class);
                return mapper.readValue(S3Utils.getInputStream(p.s3BucketName, p.s3Key), type);
            }
        }
        return jp.readValueAs(type);
    }

    @SuppressWarnings("unchecked")
    protected Class<T> getParameterizedTypeArgument() {
        return (Class<T>) ((ParameterizedType) getClass().getGenericSuperclass()).getActualTypeArguments()[0];
    }

    private static class S3Pointer {
        public String s3BucketName;
        public String s3Key;
    }
}
For each POJO I want to deserialize I'll have to create an empty subclass with the correct generic specialization, for example:
public class POJOS3Deserializer extends SQSS3Deserializer<POJO> {}
I also need to add the @JsonDeserialize annotation to the POJO class:
@JsonDeserialize(using=POJOS3Deserializer.class)
public class POJO { ... }
However, doing it this way causes a StackOverflowError, because Jackson continually re-enters my deserializer when it calls JsonParser.readValueAs(), since readValueAs consults the @JsonDeserialize annotation.
So, I have two questions:
How do I change this to keep this fairly generic and still have Jackson do most of the heavy lifting of parsing while avoiding that recursive call?
Is there a way to remove the need to derive from SQSS3Deserializer for each POJO I want to deserialize this way?
Thanks
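Edit: one direction I'm considering (an untested sketch, not a verified answer; it assumes the same S3 pointer detection as above) is to drop the @JsonDeserialize annotation entirely and hook in through a BeanDeserializerModifier, delegating to Jackson's default deserializer so the annotation lookup never happens:
// Untested sketch: wrap the default deserializer instead of replacing it via
// @JsonDeserialize, so delegating back to it cannot re-trigger the annotation lookup.
import java.io.IOException;

import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.BeanDescription;
import com.fasterxml.jackson.databind.DeserializationConfig;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JsonDeserializer;
import com.fasterxml.jackson.databind.deser.BeanDeserializerModifier;
import com.fasterxml.jackson.databind.deser.std.DelegatingDeserializer;

public class S3AwareDeserializerModifier extends BeanDeserializerModifier {

    @Override
    public JsonDeserializer<?> modifyDeserializer(DeserializationConfig config,
                                                  BeanDescription beanDesc,
                                                  JsonDeserializer<?> defaultDeserializer) {
        // Wrap every bean deserializer; filter on beanDesc.getBeanClass() if only
        // certain POJOs should get the S3 treatment.
        return new S3AwareDeserializer(defaultDeserializer, beanDesc.getBeanClass());
    }

    static class S3AwareDeserializer extends DelegatingDeserializer {
        private final Class<?> targetType;

        S3AwareDeserializer(JsonDeserializer<?> delegate, Class<?> targetType) {
            super(delegate);
            this.targetType = targetType;
        }

        @Override
        protected JsonDeserializer<?> newDelegatingInstance(JsonDeserializer<?> newDelegatee) {
            return new S3AwareDeserializer(newDelegatee, targetType);
        }

        @Override
        public Object deserialize(JsonParser jp, DeserializationContext ctxt) throws IOException {
            // If the current JSON is the ["com.amazon.sqs.javamessaging.MessageS3Pointer", {...}]
            // envelope, fetch the real document from S3 (as in SQSS3Deserializer above) and parse
            // it into targetType; otherwise just hand off to the default deserializer.
            return super.deserialize(jp, ctxt);
        }
    }
}

// Registration would then replace the per-POJO @JsonDeserialize annotations:
// SimpleModule module = new SimpleModule();
// module.setDeserializerModifier(new S3AwareDeserializerModifier());
// mapper.registerModule(module);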

Related

Feign client body type serialization failure with Jackson encoder

I'm working on implementing a Spring service and client and would like to use OpenFeign for the client. The client will be deployed with legacy applications that do not want to incur a dependency on Spring, so I'm using OpenFeign directly instead of via Spring Cloud.
I've run into an issue with the Jackson encoder and the body type. It seems that the Jackson encoder cannot serialize an interface implementation when the client method parameter is typed to the interface. For example, if my client method is createFoo(Foo foo), where Foo is an interface, then calling it as createFoo((FooImpl) fooImpl), where FooImpl implements the Foo interface, results in an encoder exception.
I've created an MCCE Gradle project demonstrating the issue here
The client definition is this:
public interface FooClient {

    @RequestLine("POST /submit")
    @Headers("Content-Type: application/json")
    Response createFoo(Foo foo);

    @RequestLine("POST /submit")
    @Headers("Content-Type: application/json")
    Response createFooImpl(FooImpl foo);

    interface Foo { int id(); }

    record FooImpl(int id) implements Foo { }
}
And the failing test demonstrating the issue is this:
@SpringBootTest(webEnvironment = WebEnvironment.RANDOM_PORT)
class FooClientTest {

    @LocalServerPort int port;

    @Test
    public void clientTest() {
        final FooClient lClient = Feign.builder()
                .encoder(new JacksonEncoder(List.of(
                        // Possibly this would be necessary with the original encoder implementation.
                        // new FooModule()
                )))
                .target(FooClient.class, String.format("http://localhost:%s", port));

        Response response = lClient.createFooImpl(new FooImpl(10));
        assertThat(response.status()).isEqualTo(404);

        response = lClient.createFoo(new FooImpl(10));
        assertThat(response.status()).isEqualTo(404); // <<===== This fails with the exception below.
    }

    public static class FooModule extends SimpleModule {
        {
            addAbstractTypeMapping(Foo.class, FooImpl.class);
        }
    }
}
The exception is:
feign.codec.EncodeException: No serializer found for class
codes.asm.feign.mcce.client.FooClient$FooImpl
and no properties discovered to create
BeanSerializer (to avoid exception, disable SerializationFeature.FAIL_ON_EMPTY_BEANS)
This issue was introduced in this commit. It seems to have somehow removed Jackson's ability to resolve a serializer for the implementation class by explicitly requesting a writer for the declared interface type:
JavaType javaType = mapper.getTypeFactory().constructType(bodyType);
template.body(mapper.writerFor(javaType).writeValueAsBytes(object), Util.UTF_8);
Based on some experiments I think the original code would work fine, potentially with some configuration of the encoder via a Module.
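For comparison, serializing by the runtime type rather than the declared parameter type would look roughly like this (an illustration of the idea only, not the actual pre-commit code):
// Illustration: let Jackson pick the serializer from the runtime class (FooImpl)
// rather than the declared interface type (Foo).
template.body(mapper.writeValueAsBytes(object), Util.UTF_8);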
As demonstrated in the test, I can work around the issue by typing the client method with the interface implementation, but this is undesirable for a number of reasons in my context.
I've figured out a workaround, but it's quite ugly. Create a Module and add the following serializer. I expect this to be extremely brittle and will likely just abandon the interface and go with a concrete record definition as a DTO.
addSerializer(new JsonSerializer<Foo>() {
    @Override
    public Class<Foo> handledType() {
        return Foo.class;
    }

    /**
     * This is an ugly hack to work around this: https://github.com/OpenFeign/feign/issues/1608
     * Alternative would be to just make Foo a concrete record
     * instead of an interface. That may be better.
     */
    @Override
    public void serialize(Foo value, JsonGenerator gen,
                          SerializerProvider serializers) throws IOException {
        gen.writeStartObject();
        final Method[] methods = Foo.class.getMethods();
        for (Method method : methods) {
            try {
                final Object result = method.invoke(value);
                gen.writePOJOField(method.getName(), result);
            } catch (IllegalAccessException | InvocationTargetException e) {
                throw new IllegalArgumentException(String.format(
                        "Class %s has method %s which is not an accessible no argument getter",
                        value.getClass(), method.getName()));
            }
        }
        gen.writeEndObject();
    }
});
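Another option I have not verified in this setup: since Foo is my own interface, annotating the accessor should let Jackson discover it as a property even when it writes against the interface type:
// Unverified alternative: expose id() as a JSON property on the interface itself,
// so the serializer Jackson builds for Foo has something to write.
interface Foo {
    @JsonProperty("id")
    int id();
}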

How to implement a custom Spring Http Message Converter for writing a typed Collection

I've got a Spring Boot Controller that returns a List<Person> and I would like to implement a custom HttpMessageConverter that can write collections of type Person.
I see AbstractHttpMessageConverter, but the supports() method only takes a Class, so I can test for Collection, but no way (as far as I know) to test for a Collection of type Person.
I also see GenericHttpMessageConverter and AbstractGenericHttpMessageConverter, which sound promising, but I can't figure out how to properly implement one.
I think I found a solution. The following seems to work...
@Component
static class PersonMessageConverter
        extends AbstractGenericHttpMessageConverter<Collection<Person>> {

    public PersonMessageConverter() {
        super(MediaType.APPLICATION_JSON);
    }

    @Override
    public boolean canWrite(Type type, Class<?> clazz, MediaType mediaType) {
        TypeToken<Collection<Person>> personCollectionType = new com.google.common.reflect.TypeToken<>() {};
        return canWrite(mediaType) && personCollectionType.isSupertypeOf(type);
    }

    @Override
    protected void writeInternal(Collection<Person> persons, Type type, HttpOutputMessage outputMessage)
            throws IOException, HttpMessageNotWritableException {
        // do write...
    }

    // continue with read methods here
}
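For the body of writeInternal, a minimal sketch (assuming Jackson's ObjectMapper is acceptable inside the converter) could be:
// Sketch only: serialize the collection straight onto the response body with Jackson.
private final ObjectMapper mapper = new ObjectMapper();

@Override
protected void writeInternal(Collection<Person> persons, Type type, HttpOutputMessage outputMessage)
        throws IOException, HttpMessageNotWritableException {
    mapper.writeValue(outputMessage.getBody(), persons);
}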

Hazelcast caller member throws timeout exception when a serialization exception happens on callee member

I'm using Hazelcast (Java, version 3.7.5) in a scenario with 2 members. The first member delegates a task to the second member through an IExecutorService. After some processing, the second member tries to send back an unserializable response.
As it's not possible to send back a response, the second member prints the stacktrace related to the HazelcastSerializationException.
As no response arrives, the first member throws an OperationTimeoutException when the operation-heartbeat-timeout is reached.
Currently, when the IExecutorService fails at parsing the Callable result, it prints a stacktrace (on callee side).
Let's say I have a simple caller:
private Future<Object> startFlow() {
    // This throws an OperationTimeoutException
    return hazelcastInstance.getExecutorService("myExecutor").submit(myRunnable);
}
Which calls a simple callee:
@Override
public Object call() throws Exception {
    // The object returned is not serializable, therefore a HazelcastSerializationException is thrown
    return service.execute();
}
The callee prints a stacktrace after it fails to serialize the response (see end of post).
In my case, it's not possible to know what kind of object the service might return, and it's not possible to trust the service to send back serializable objects.
I would like to be able to know the reason of the timeout on the caller-side.
After some search, I found that no configuration/API is available to intercept exceptions thrown by an IExecutorService when it fails to serialize a response.
So I tried to see if it would be possible to check whether an object is serializable by Hazelcast, again without success.
Any ideas?
Thank you
The stacktrace printed by the callee looks like this:
Exception in thread "hz._hzInstance_1_dev.cached.thread-1" com.hazelcast.nio.serialization.HazelcastSerializationException: Failed to serialize 'com.hazelcast.spi.impl.operationservice.impl.responses.NormalResponse'
at com.hazelcast.internal.serialization.impl.SerializationUtil.handleSerializeException(SerializationUtil.java:73)
at com.hazelcast.internal.serialization.impl.AbstractSerializationService.toBytes(AbstractSerializationService.java:143)
at com.hazelcast.internal.serialization.impl.AbstractSerializationService.toBytes(AbstractSerializationService.java:124)
at com.hazelcast.spi.impl.operationservice.impl.OperationServiceImpl.send(OperationServiceImpl.java:427)
at com.hazelcast.spi.impl.operationservice.impl.RemoteInvocationResponseHandler.sendResponse(RemoteInvocationResponseHandler.java:51)
at com.hazelcast.spi.Operation.sendResponse(Operation.java:291)
at com.hazelcast.executor.impl.DistributedExecutorService$CallableProcessor.sendResponse(DistributedExecutorService.java:269)
at com.hazelcast.executor.impl.DistributedExecutorService$CallableProcessor.run(DistributedExecutorService.java:253)
at com.hazelcast.util.executor.CachedExecutorServiceDelegate$Worker.run(CachedExecutorServiceDelegate.java:212)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
at com.hazelcast.util.executor.HazelcastManagedThread.executeRun(HazelcastManagedThread.java:76)
at com.hazelcast.util.executor.HazelcastManagedThread.run(HazelcastManagedThread.java:92)
Caused by: com.hazelcast.nio.serialization.HazelcastSerializationException: Failed to serialize 'com.myomain.UnserialiableObject'
at com.hazelcast.internal.serialization.impl.SerializationUtil.handleSerializeException(SerializationUtil.java:73)
at com.hazelcast.internal.serialization.impl.AbstractSerializationService.writeObject(AbstractSerializationService.java:236)
at com.hazelcast.internal.serialization.impl.ByteArrayObjectDataOutput.writeObject(ByteArrayObjectDataOutput.java:371)
at com.hazelcast.spi.impl.operationservice.impl.responses.NormalResponse.writeData(NormalResponse.java:91)
at com.hazelcast.internal.serialization.impl.DataSerializableSerializer.write(DataSerializableSerializer.java:189)
at com.hazelcast.internal.serialization.impl.DataSerializableSerializer.write(DataSerializableSerializer.java:54)
at com.hazelcast.internal.serialization.impl.StreamSerializerAdapter.write(StreamSerializerAdapter.java:43)
at com.hazelcast.internal.serialization.impl.AbstractSerializationService.toBytes(AbstractSerializationService.java:140)
... 12 more
Caused by: com.hazelcast.nio.serialization.HazelcastSerializationException: There is no suitable serializer for class com.myomain.UnserialiableObject
at com.hazelcast.internal.serialization.impl.AbstractSerializationService.serializerFor(AbstractSerializationService.java:469)
at com.hazelcast.internal.serialization.impl.AbstractSerializationService.writeObject(AbstractSerializationService.java:232)
... 18 more
EDIT (SOLUTION)
So I ended up registering a global serializer that simply sends back an exception whenever it's called. Something like this:
public class GlobalSerializerException implements StreamSerializer<Object> {

    @Override
    public void write(ObjectDataOutput out, Object object) throws IOException {
        String objectInfo;
        if (object == null) {
            objectInfo = "Object was null.";
        } else {
            objectInfo = String.format("Object of class %s and printed as String gives %s",
                    object.getClass().getCanonicalName(), object.toString());
        }
        objectInfo = "Hazelcast was unable to serialize an object. " + objectInfo;
        out.writeUTF(objectInfo);
    }

    @Override
    public Object read(ObjectDataInput in) throws IOException {
        String message = in.readUTF();
        return new HazelcastSerializationException(message);
    }

    @Override
    public int getTypeId() {
        return 63426;
    }

    @Override
    public void destroy() {
    }
}
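For this to take effect, the class has to be registered as the global serializer; the registration looks something like this (a sketch using programmatic config, leaving override-java-serialization off so responses that are Serializable keep going through the normal path):
// Sketch: plug GlobalSerializerException in as Hazelcast's global serializer.
Config config = new Config();
config.getSerializationConfig().setGlobalSerializerConfig(
        new GlobalSerializerConfig()
                .setImplementation(new GlobalSerializerException())
                .setOverrideJavaSerialization(false));
HazelcastInstance instance = Hazelcast.newHazelcastInstance(config);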
The caller fails with a timeout because it doesn't get a response from the target. The target node fails to serialize the response and, as a result, fails to send one. This is the current behaviour, but I think it should also be possible to send a special exception to signal that response failure.
Hazelcast, by default, is able to serialize classes implementing java.io.Serializable, java.io.Externalizable and some Hazelcast-specific interfaces, such as DataSerializable and Portable. It's also possible to define custom serializers or delegate to another serialization library. See the Hazelcast Reference Manual - Serialization section for more info.
In a distributed system, messages exchanged between nodes must be serializable to a binary form to transmit them through the network. So, an entity/service participating in a distributed system must ensure its messages are serializable in some form.
If you don't know the type of the messages, then you can register a global serializer with Hazelcast that first tries to serialize using the known formats (Serializable, Externalizable, etc.) and, if the type is not known, writes a custom error message instead.
Alternatively, you can wrap the result of the service's execution in a custom serializable wrapper object. During serialization, if the original wrapped result fails to serialize, then you again write a custom error message.
For example:
class NonSerializableResponseException extends Exception {}

class ServiceResponseWrapper implements DataSerializable {

    private Object response;

    @Override
    public void writeData(ObjectDataOutput out) throws IOException {
        try {
            out.writeObject(response);
        } catch (HazelcastSerializationException e) {
            out.writeObject(new NonSerializableResponseException());
        }
    }

    @Override
    public void readData(ObjectDataInput in) throws IOException {
        response = in.readObject();
    }
}

Infinite recursion when deserializing with Jackson and Mockito

I am trying to generically print any object with mapper.writeValueAsString, but I am facing an infinite recursion when deserializing objects with Mockito and Jackson. The objects I am trying to deserialize perform underlying Hibernate calls to DBs, etc., and I have no control over the classes themselves.
Currently I am using the following versions of Mockito and Jackson but the same is happening for older 1.X versions of Jackson.
Mockito: org.mockito:mockito-all:jar:1.9.5:compile
Jackson: com.fasterxml.jackson.core:jackson-databind:jar:2.9.7:compile
As specified, I cannot modify the underlying classes with annotations such as @JsonIgnore because they are outside dependencies not under my control. I also cannot create mixins for my use case because I am trying to generically print the contents of any object that is sent in.
I have tried setting the DeserializationConfig FAIL_ON_UNKNOWN_PROPERTIES to false in older Jackson versions, and I have tried setting the DeserializationFeature FAIL_ON_MISSING_CREATOR_PROPERTIES to false.
import org.apache.log4j.Logger;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;

public class PrintUtil {

    // Log4j logger assumed here; the original post references LOG without declaring it.
    private static final Logger LOG = Logger.getLogger(PrintUtil.class);

    public static String printJSON(Object obj) {
        String printstring = "printfailed";
        try {
            ObjectMapper mapper = new ObjectMapper();
            LOG.debug("formatted JSON String ");
            printstring = mapper.writeValueAsString(obj);
        } catch (JsonProcessingException e) {
            e.printStackTrace();
        }
        return printstring;
    }
}
The infinite recursion output appears when running Mockito tests for methods that contain Log4j statements, which in turn call the PrintUtil function; the e.printStackTrace() statement starts printing while the tests run.
Most of the object that are being sent to this utility method are JAXB XML Service Response Objects.
Without being able to modify the classes themselves, I see two possible solutions.
1) Wrap the objects into objects you own, as suggested by @TheHeadRush, and annotate them appropriately. I would suggest using @JsonIdentityInfo so the objects serialize to their ID, rather than being ignored completely with @JsonIgnore.
2) Use a custom deserializer for the object which is causing the recursion. Here is an example:
public class CustomDeserializer extends StdDeserializer<MyObject> {

    // Add constructors as necessary.

    @Override
    public MyObject deserialize(
            JsonParser jsonparser,
            DeserializationContext context)
            throws IOException, JsonProcessingException {
        return null; // Return something that won't recurse here.
    }
}
Then register the deserializer with the ObjectMapper you are using:
// Register the deserializer:
SimpleModule module = new SimpleModule()
.addDeserializer(MyObject.class, new CustomDeserializer());
ObjectMapper objectMapper = new ObjectMapper();
objectMapper.registerModule(module);
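Note that printJSON above builds a new ObjectMapper on every call, so the module has to end up on that instance; one sketch is to hold a single configured mapper and reuse it:
// Sketch: a single shared mapper with the module applied, for PrintUtil to reuse.
private static final ObjectMapper MAPPER = new ObjectMapper()
        .registerModule(new SimpleModule()
                .addDeserializer(MyObject.class, new CustomDeserializer()));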

Json Serializing JDK Dynamic Proxy with Jackson library

I'm trying to serialize a JDK dynamic proxy using the Jackson library, but I get an error (the full exception is shown further down). Here is the setup:
public interface IPlanet {
    String getName();
}

public class Planet implements IPlanet {
    private String name;
    public String getName() { return name; }
    public void setName(String iName) { name = iName; }
}
IPlanet ip = ObjectsUtil.getProxy(IPlanet.class, p);
ObjectMapper mapper = new ObjectMapper();
mapper.writeValueAsString(ip);
// The proxy generation utility is implemented in this way:

/**
 * Create new proxy object that give the access only to the method of the specified
 * interface.
 *
 * @param type
 * @param obj
 * @return
 */
public static <T> T getProxy(Class<T> type, Object obj) {
    class ProxyUtil implements InvocationHandler {
        Object obj;

        public ProxyUtil(Object o) {
            obj = o;
        }

        @Override
        public Object invoke(Object proxy, Method m, Object[] args) throws Throwable {
            Object result = null;
            result = m.invoke(obj, args);
            return result;
        }
    }

    // TODO: The suppress warning is needed cause JDK class java.lang.reflect.Proxy
    // needs generics
    @SuppressWarnings("unchecked")
    T proxy = (T) Proxy.newProxyInstance(type.getClassLoader(), new Class[] { type },
            new ProxyUtil(obj));
    return proxy;
}
I get this exception:
Exception in thread "main" com.fasterxml.jackson.databind.JsonMappingException: No serializer found for class $Proxy11 and no properties discovered to create BeanSerializer (to avoid exception, disable SerializationConfig.SerializationFeature.FAIL_ON_EMPTY_BEANS) )
The problem seems to be the same one that happens when Hibernate-proxied objects are serialized, but I don't know how, or whether, I can use the Jackson Hibernate module to solve my issue.
UPDATE:
The bug was fixed in the Jackson 2.0.6 release.
You can try the Genson library: http://code.google.com/p/genson/.
I just tested your code with it and it works fine; the output is {"name":"foo"}:
Planet p = new Planet();
p.setName("foo");
IPlanet ip = getProxy(IPlanet.class, p);
Genson genson = new Genson();
System.out.println(genson.serialize(ip));
It has a couple of nice features that do not exist in other libraries, such as using constructors with arguments without any annotation, or applying what is called a BeanView to your objects at runtime (it acts as a view of your model); it can also deserialize to concrete types, and more. Take a look at the wiki: http://code.google.com/p/genson/wiki/GettingStarted.
It might be a bug in Jackson -- proxied classes may be explicitly prevented from being considered beans. You could file a bug -- if Genson can handle it, Jackson should too. :-)
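For completeness, with a more recent Jackson another workaround that might sidestep the proxy introspection (not verified against the exact versions above) is to serialize against the declared interface type rather than the proxy's runtime class:
// Hedged workaround sketch: make Jackson use IPlanet's bean description (its getName()
// getter) instead of introspecting the generated $Proxy class.
ObjectMapper mapper = new ObjectMapper();
String json = mapper.writerFor(IPlanet.class).writeValueAsString(ip);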
