I'm trying to create a simple example with Java EE JMS.
When I try to receive an ObjectMessage, the class needs to have exactly the same path (package name) as in the other project, which sends the ObjectMessage.
For example, my sender project has a class called Person in the package "org.queue.sender", and my receiver project has exactly the same class in the package "org.queue.receiver".
As already said, when I try to read the ObjectMessage I get the following exception:
java.lang.ClassNotFoundException: org.queue.sender.Person
If I create a new package named org.queue.sender in my receiver project and move the class Person there, then it works, but I don't think that can really be the solution.
Is there a better solution?
From the JavaDoc:
An ObjectMessage object is used to send a message that contains a serializable object in the Java programming language ("Java object"). It inherits from the Message interface and adds a body containing a single reference to an object. Only Serializable Java objects can be used.
So, objects passed via ObjectMessage are transferred using Java serialization, which means the receiving side must be able to load exactly the same class: the same fully qualified name (package plus class name) and a compatible serialVersionUID.
If you need more flexible handling of messages, I suggest using e.g. a TextMessage and serializing/deserializing the objects with JSON or XML:
ObjectMapper mapper = ... ; // Get hold of a Jackson ObjectMapper
session.createTextMessage(mapper.writeValueAsString(myPojo));
// and on the receiving side
TextMessage message = ....; // From the message receiver
MyPojo myPojo = mapper.readValue(message.getText(), MyPojo.class);
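A slightly more complete sketch of that approach, assuming Person is a plain bean with a no-arg constructor and getters/setters, and that a JMS Session plus a MessageProducer/MessageConsumer for the queue already exist (all names here are placeholders):
import javax.jms.MessageConsumer;
import javax.jms.MessageProducer;
import javax.jms.Session;
import javax.jms.TextMessage;

import com.fasterxml.jackson.databind.ObjectMapper;

public class PersonJsonMessaging {

    private final ObjectMapper mapper = new ObjectMapper();

    // Sender: serialize the Person to JSON and send it as a plain TextMessage.
    public void send(Session session, MessageProducer producer, Person person) throws Exception {
        TextMessage message = session.createTextMessage(mapper.writeValueAsString(person));
        producer.send(message);
    }

    // Receiver: read the JSON text and bind it to the local Person class.
    // Only a structurally compatible Person is needed, not the sender's package name.
    public Person receive(MessageConsumer consumer) throws Exception {
        TextMessage message = (TextMessage) consumer.receive();
        return mapper.readValue(message.getText(), Person.class);
    }
}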
While building a stub test server for testing our newly written client for a legacy system, I would like to deserialize an incoming JSON request body into an object hierarchy provided by a vendor for the same legacy system.
In other words, I would like to use the vendor's classes instead of building my own.
I've managed to make the Eclipse MicroProfile client (running in the latest TomEE) get as far as starting to populate the base legacy object, but then it fails because it cannot instantiate an interface inside the object, which makes sense, as there is no metadata for this.
@POST
@Produces({MediaType.APPLICATION_JSON})
@Consumes({MediaType.APPLICATION_JSON})
public String post(MessageObject messageObject) {
    // ...
}
Root Cause:
javax.json.bind.JsonbException: interface /vendor interface class/ not instantiable
org.apache.johnzon.jsonb.JohnzonJsonb.fromJson(JohnzonJsonb.java:200)
org.apache.johnzon.jaxrs.jsonb.jaxrs.JsonbJaxrsProvider.readFrom(JsonbJaxrsProvider.java:182)
...
As I do not have the source for these classes, I was wondering whether the deserializer could be told in some other way (like providers in dependency injection) how to instantiate the interfaces. I have full control over the client.
I am not very familiar with this, so I would appreciate knowing how to get around it, or is this a lost cause?
Yes, you can bind an implementation for interfaces; see https://johnzon.apache.org/, the johnzon.interfaceImplementationMapping part. It can be set in resources.xml on the configurable provider. The class to configure is https://github.com/apache/johnzon/blob/master/johnzon-jsonb/src/main/java/org/apache/johnzon/jaxrs/jsonb/jaxrs/JsonbJaxrsProvider.java and the property key is interfaceImplementationMapping; its value uses properties syntax.
Hope it helps
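As a rough sketch of what that could look like (the service id and the interface/implementation class names are placeholders, and the exact resources.xml wiring may differ between TomEE versions):
<!-- WEB-INF/resources.xml, sketch only -->
<resources>
  <Service id="jsonb" class-name="org.apache.johnzon.jaxrs.jsonb.jaxrs.JsonbJaxrsProvider">
    <!-- properties syntax: vendorInterface = implementationClass -->
    interfaceImplementationMapping = com.vendor.legacy.SomeInterface=com.example.stub.SomeInterfaceImpl
  </Service>
</resources>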
I'm trying to pick up Java and wanted to experiment with Java's client/server facilities by making the client send a simple object of a self-defined class (Message) over to the server. The problem was that I kept getting a ClassNotFoundException on the server side.
I think the rest of the code is alright because other objects such as String go through without problems.
I had two different netbeans projects in different locations for client and server each.
Each of them has its own copy of the Message class under its respective package.
Message class implements Serializable.
On the client side, I attempt to send a Message object through.
On the server side, upon calling the readObject method, it seems to be looking for the Message class from the client's package instead of its own. printStackTrace showed "java.lang.ClassNotFoundException: client.Message" on the server side.
I have not even tried to cast or store the object received yet. Is there something I left out?
The package name and class name must be exactly the same on both sides. I.e. write the class once, compile it once and then give both sides the same copy. Don't have separate server.Message and client.Message classes, but a single shared.Message class or something like that.
If you can guarantee the same package/class name, but not always that both sides have exactly the same copy, then you need to add a serialVersionUID field with the same value to the class(es) in question:
package shared;
import java.io.Serializable;
public class Message implements Serializable {
private static final long serialVersionUID = 1L;
// ...
}
The reason is that readObject() in ObjectInputStream is practically implemented as:
String s = readClassName();
Class<?> c = Class.forName(s); // Here your code breaks: client.Message is not on the server's classpath
Object o = c.newInstance();
// ... populate o ...
The following exception is thrown while I run my Storm project:
java.lang.RuntimeException: java.io.NotSerializableException: com.youtab.dataType.id.GUID
at backtype.storm.serialization.DefaultSerializationDelegate.serialize(DefaultSerializationDelegate.java:43)
at backtype.storm.utils.Utils.serialize(Utils.java:85)
at backtype.storm.topology.TopologyBuilder.createTopology(TopologyBuilder.java:111)
As part of my Storm project, I am trying to transmit an object of type Event from the first spout to the first bolt and then to use it.
Unfortunately, after making all the needed changes to my configuration variable, as described in the Storm documentation, it still fails to serialize one private field of type GUID, which is a field of my own class Event.
I have created the following serializer class:
import com.esotericsoftware.kryo.Kryo;
import com.esotericsoftware.kryo.Serializer;
import com.esotericsoftware.kryo.io.Input;
import com.esotericsoftware.kryo.io.Output;

public class GUIDSerializer extends Serializer<GUID> {
    @Override
    public void write(Kryo kryo, Output output, GUID guid) {
        // Write as a length-prefixed string so read() can recover it with readString()
        output.writeString(guid.toString());
    }

    @Override
    public GUID read(Kryo kryo, Input input, Class<GUID> aClass) {
        return GUID.of(input.readString());
    }
}
And I registered the serializer as needed:
Config conf = new Config();
conf.registerSerialization(GUID.class, GUIDSerializer.class);
All classes used as data types for attributes/fields must implement Java's Serializable interface. In your case, this applies to your Event class as well as all members of Event, like GUID. Of course, this applies recursively, i.e., if GUID contains custom types, those must implement Serializable, too.
Providing a custom Kryo serializer is actually not required; Storm can use Java's default serializer. For performance reasons, however, it is highly recommended to register custom types. In most cases, it is sufficient to simply register the user type classes via
conf.registerSerialization(MyUserType.class);
In your case
conf.registerSerialization(Event.class);
conf.registerSerialization(GUID.class);
Registering a class allows Storm to use a more efficient (general) Kryo serializer instead of Java's default serializer.
If this general Kryo serializer is still not efficient enough, you can provide your own Kryo serializer (as you mentioned in your question). However, the class must still implement Java's Serializable interface!
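A minimal sketch of what that means for your types, assuming Event and GUID are your own classes (the field names below are placeholders):
import java.io.Serializable;

// Event is Serializable, and so is every member it references, recursively.
public class Event implements Serializable {
    private static final long serialVersionUID = 1L;

    private GUID id;        // GUID itself must implement Serializable as well
    private String payload; // placeholder field

    // getters/setters omitted
}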
I do not know the Serializer class, but judging from the exception, you need to make your GUID class implement the interface java.io.Serializable, like:
public class GUID implements Serializable {
    // ...
}
When in doubt, please post your current code for GUID.
I am playing around with MessagePack and Java. I have had experience with Protobuf and Json (using Jackson and Gson) in the past when it comes to serialization tools.
When it comes to normal serialization and deserialization I have no problems at all. It is when I want to have multiple subclasses of another class that a problem arises.
I am testing this with the following code:
TestMessage.TestMessageSubClass sub = new TestMessage.TestMessageSubClass();
byte[] pack = MsgPack.pack(sub);
Assert.assertTrue(ArrayUtils.isNotEmpty(pack));
List<? extends TestMessage> msg = MsgPack.unpack(pack, TestMessage.class);
Assert.assertNotNull(msg);
Assert.assertFalse(msg.isEmpty());
TestMessage temp = msg.get(0);
Assert.assertNotNull(temp);
Assert.assertTrue(temp instanceof TestMessage.TestMessageSubClass);
TestMessage.TestMessageSubClass sub2 = (TestMessage.TestMessageSubClass) temp;
Assert.assertEquals(sub, sub2);
System.out.println(sub);
System.out.println(sub2);
Those two lines fail because when I deserialize I only get a normal TestMessage, and not a TestMessageSubClass instance.
Assert.assertTrue(temp instanceof TestMessage.TestMessageSubClass);
TestMessage.TestMessageSubClass sub2 = (TestMessage.TestMessageSubClass) temp;
I suppose that this happens because, by default, the MessagePack unpacker has no way of determining the exact class it needs to deserialize into. In fact, this would work just fine if I directly asked it to deserialize into a TestMessageSubClass.
My requirement is that TestMessage might have any number of subclasses with extra data, and with the same code I need to deserialize them into the right class instance without losing anything. I might be deserializing a stream containing a heterogeneous list of those TestMessage instances.
I can have the behaviour I want using the @JsonSubTypes annotation in Jackson, roughly as shown below.
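The sketch below uses my test class names; the type property and the subtype name are arbitrary:
import com.fasterxml.jackson.annotation.JsonSubTypes;
import com.fasterxml.jackson.annotation.JsonTypeInfo;

// Jackson writes a "type" property alongside the data and uses it to pick
// the concrete subclass when deserializing into the base type.
@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, include = JsonTypeInfo.As.PROPERTY, property = "type")
@JsonSubTypes({
        @JsonSubTypes.Type(value = TestMessage.TestMessageSubClass.class, name = "sub")
})
public class TestMessage {
    // base fields ...

    public static class TestMessageSubClass extends TestMessage {
        // extra fields ...
    }
}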
Is there a way to use the official MessagePack client API and obtain that? Is there a known pattern to do that myself?
Here is the code of my MsgPack wrapper class: GIST
Any advice on using MessagePack more efficiently is welcome too.
First off, I don't think this is necessarily a good idea, I'm just seeing if this is really possible. I could see some benefits, such as not having to explicitly convert to objects that we're sending to the client and using an interface to blacklist certain fields that are security concerns. I'm definitely not stuck on the idea, but I'd like to give it a try.
We're using Spring MVC + Jackson to generate JSON directly from objects. We have our domain object that contains necessary data to send to the client and we have a list of error strings that are added to every outgoing JSON request as needed.
So the returned JSON might be something like
{ "name": "woohoo", "location": "wahoo", "errors": ["foo"] }
Currently, we have a class that models what should be on the client side, but we always extend a common base class with the error methods.
So, we have:
interface NameAndLoc {
String getName();
String getLocation();
}
and
interface ResponseErrors {
List<String> getErrors();
void appendError(String error);
}
We have two classes that implement these interfaces and would like to have CGLIB generate a new class that implements:
interface NameAndLocResponse extends NameAndLoc, ResponseErrors {}
Presently, with CGLIB mixins, I can generate an object with the following:
Object mish = Mixin.create(
new Class [] {NameAndLoc.class, ResponseErrors.class},
new Object [] { new NameAndLocImpl(), new ResponseErrorsImpl() } );
I could then cast the object to either NameAndLoc or ResponseErrors. However, what I would like to do is create an object that uses the same backing classes but implements the NameAndLocResponse interface, without having to extend our common error-handling class and then implement NameAndLoc.
If I attempt to cast with what I have, it errors out. I'm sure this is possible.
I think it is very similar to this, but not quite: http://www.jroller.com/melix/entry/alternative_to_delegate_pattern_with
Simply add the NameAndLocResponse interface to the Class array passed to Mixin.create as the last element; the resulting object will implement it. You can find an example of this in this blog entry: http://mydailyjava.blogspot.no/2013/11/cglib-missing-manual.html
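A minimal sketch of that, reusing the NameAndLocImpl and ResponseErrorsImpl backing classes from the question (it assumes NameAndLocResponse adds no methods of its own, so the two delegates still cover everything it declares):
import net.sf.cglib.proxy.Mixin;

// The combined interface goes into the interface array alongside the two base
// interfaces; the delegates are matched to the interfaces they implement.
NameAndLocResponse response = (NameAndLocResponse) Mixin.create(
        new Class[] { NameAndLoc.class, ResponseErrors.class, NameAndLocResponse.class },
        new Object[] { new NameAndLocImpl(), new ResponseErrorsImpl() });

response.appendError("foo");
String name = response.getName(); // dispatched to NameAndLocImpl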