In shared.jar I have:
abstract class MyParent {
}
abstract class MyClass {
abstract MyParent getFoo();
}
server.jar contains
abstract class MyChild extends MyParent {
}
class MyClassConcrete extends MyClass {
MyParent getFoo() { return new MyChild(); }
}
client.jar:
MyParent foo = myClass.getFoo();
If all 3 jars are in one classloader, everything works well.
But the client and server are in different JVMs, where:
JVM-1 contains: server.jar, shared.jar
JVM-2 contains: client.jar, shared.jar
The client makes a call to the server. The server returns an instance of MyChild (via MyClassConcrete.getFoo()), and Java fails to deserialize it on the client (ClassNotFoundException).
What I want to do:
The server serializes the object and sends its data together with the set of the class's ancestors.
The client finds the narrowest ancestor it can deserialize.
Then everything is OK: we have an instance of MyParent on the client, and that is what we need.
I can't believe there is no such engine. Do you know of one?
I am sure remote calls should be as similar to local calls as possible.
Thanks.
Actually I found a solution and made a special package to support it:
MyParent has to be marked with a special SerializableParent annotation. This annotation means any child class should be "converted" to MyParent before the remoting engine serializes it and transfers it over the wire. By setting this annotation you not only tell the system that MyParent exists on the remote JVM, but also that the hierarchy does not require polymorphism: if a child overrides a parent's method, that override will not be available on the remote system, because only data, not code, can be sent.
Before sending the result, the engine finds the narrowest ancestor annotated as SerializableParent.
The object is then serialized to XML (with XStream, for example) and deserialized back using the parent class as an alias for the child. To prevent an "unknown field" error, XStream has to be hacked by overriding wrapMapper and shouldSerializeMember.
As a result, the "serializable parent" is what gets transferred.
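For reference, the wrapMapper / shouldSerializeMember override mentioned above is usually written roughly like this (a sketch of the common XStream idiom, not necessarily the exact code of the package):

import com.thoughtworks.xstream.XStream;
import com.thoughtworks.xstream.mapper.MapperWrapper;

public class LenientXStreamFactory {
    // Builds an XStream instance that silently skips fields the local class does not declare.
    public static XStream create() {
        return new XStream() {
            @Override
            protected MapperWrapper wrapMapper(MapperWrapper next) {
                return new MapperWrapper(next) {
                    @Override
                    public boolean shouldSerializeMember(Class definedIn, String fieldName) {
                        if (definedIn == Object.class) {
                            // the field belongs to no local class: ignore it instead of failing
                            return false;
                        }
                        return super.shouldSerializeMember(definedIn, fieldName);
                    }
                };
            }
        };
    }
}

Combined with xstream.alias(...), the XML written for the child can then be read back as the parent class.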
An object implementing the java.io.Serializable interface can use the writeReplace() and readResolve() methods to substitute another object for the one being serialized/deserialized. I can see this being used to address your problem. However, I have not tried this myself.
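A minimal sketch of that idea applied to the classes above (MyParentData is a made-up concrete stand-in class that would live in shared.jar, and commonField is added only for illustration; as said, untested):

import java.io.ObjectStreamException;
import java.io.Serializable;

// shared.jar: known to both JVMs
abstract class MyParent implements Serializable {
    protected String commonField;
}

// shared.jar: concrete, data-only stand-in the client can always deserialize
class MyParentData extends MyParent {
}

// server.jar: substitutes the stand-in for itself during serialization
class MyChild extends MyParent {
    private Object writeReplace() throws ObjectStreamException {
        MyParentData data = new MyParentData();
        data.commonField = this.commonField; // copy only the MyParent-level state
        return data;                         // this object is written instead of MyChild
    }
}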
Related
I have a Serializable bean class which contains a field dozerMapper of the interface type MapperIF. Everything was working fine before I added the PersistentManager in the context.xml file of my Tomcat server. With the PersistentManager I am storing all the objects in the session as files in a folder. But after I added the PersistentManager, it started throwing NotSerializableException because of the MapperIF field inside my bean class. Adding the transient keyword to the MapperIF field solves the NotSerializableException, but then it ends up with a NullPointerException because dozerMapper is null in the code below. So how can I handle this situation when serializing my bean class?
@Autowired
private transient MapperIF dozerMapper;
public Preferences getUiPreferences() {
if (this.uiPreferences == null) {
this.uiPreferences = ((Preferences) this.dozerMapper.map(
getPrefernces(), Preferences.class));
}
return this.uiPreferences;
}
The MapperIF interface (or its Dozer 5.x replacement Mapper) does not extend Serializable. Its standard implementation classes do not implement it either. Therefore the standard implementations are not going to be serializable.
I can think of ways to solve this:
Don't put the MapperIF reference into an object that you save in the session. It doesn't really belong there. Here's what the javadocs for the DozerBeanMapper class say:
This should be used/defined as a singleton within your application. This class performs several one-time initializations and loads the custom xml mappings, so you will not want to create many instances of it for performance reasons. Typically a system will only have one DozerBeanMapper instance per VM. If you are using an IOC framework (i.e Spring), define the Mapper as singleton="true". If you are not using an IOC framework, a DozerBeanMapperSingletonWrapper convenience class has been provided in the Dozer jar.
This implies that you shouldn't need to put a MapperIF object into a session.
Declare the field as transient and implement a custom readObject method that will repopulate the field (from somewhere) when you deserialize (see the sketch below).
Implement your own custom MapperIF / Mapper class that is serializable. (I haven't looked, but this could be a lot of work ... or impossible.)
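A minimal sketch of the second option (the bean name is made up, and the exact package of DozerBeanMapperSingletonWrapper depends on your Dozer version):

import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.Serializable;

import net.sf.dozer.util.mapping.DozerBeanMapperSingletonWrapper;
import net.sf.dozer.util.mapping.MapperIF;

public class PreferencesBean implements Serializable {

    private transient MapperIF dozerMapper = DozerBeanMapperSingletonWrapper.getInstance();

    private void readObject(ObjectInputStream in) throws IOException, ClassNotFoundException {
        in.defaultReadObject();
        // transient fields come back as null; re-acquire the singleton mapper here
        dozerMapper = DozerBeanMapperSingletonWrapper.getInstance();
    }
}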
This is from the Minecraft server source code, also called the Minecraft Bukkit API; now you know as much as I do.
There is an interface called Server:
public interface Server extends PluginMessageRecipient {
public String getVersion();
}
PluginMessageRecipient is an interface also.
There is a class called Bukkit that instantiates Server:
public final class Bukkit {
private static Server server;
}
Inside methods of the Bukkit class they invoke methods on the server object. For example:
server.getVersion();
The thing is, there is no code for getVersion in the Server interface, just a method signature. There is also no code in the PluginMessageRecipient interface nor does it extend anything.
I have read all the questions and answers on SO that say I need an anonymous class or an inner class and this does not seem to fit those solutions.
There is a class called Bukkit that instantiates Server:
Actually, Bukkit doesn't instantiate Server. The class Bukkit contains a reference to a Server. You haven't shown how that reference gets set, so we don't know the actual class.
However, it is guaranteed that what is assigned to that reference (Bukkit.server), assuming it's not null, is an object of some concrete class that implements Server. That class provides an implementation of getVersion(), and that is what is being called.
Bukkit is just a modding API. If you want to implement Bukkit, you need to create such a Server instance yourself and pass it in.
Take for example the unit tests that Bukkit includes:
https://github.com/Bukkit/Bukkit/blob/f210234e59275330f83b994e199c76f6abd41ee7/src/test/java/org/bukkit/TestServer.java#L77
A real implementation that allows you to run a Bukkit server is Spigot.
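As a rough sketch of creating such a Server instance and handing it to Bukkit (MyServerImpl is a hypothetical class that would have to implement every method of the Server interface):

import org.bukkit.Bukkit;
import org.bukkit.Server;

public class ServerBootstrap {
    public static void main(String[] args) {
        Server server = new MyServerImpl();   // hypothetical implementation of org.bukkit.Server
        Bukkit.setServer(server);             // Bukkit's static helpers now delegate to this instance
        System.out.println(Bukkit.getVersion());
    }
}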
If I recall correctly, the particular concrete class that's being selected is determined at runtime via reflection. Because Minecraft is not open source, all the developers have are the obfuscated compiled class files to work with.
The code searches through each class file within the Minecraft jar, looking for a class that matches certain conditions, and then, using a bytecode library, forces that class to implement the interface.
For example, let's say that the following (obfuscated) class was the real Server class within the Minecraft code
class a {
String x_x317() {
return q_q98;
}
static a a_a1;
static String q_q98 = "1.9.4";
}
In this case, the method x_x317 returns the version string. The tool that allows them to hook into this class might do it based on the following conditions:
1. The class has default access.
2. The class has only one default-access static reference to itself.
3. The class has only one default-access static String field.
4. The class has a single method with default access that returns String, and the returned value is the field found in 3.
This generally returns only one class. In the case that multiple are returned (usually in the dev phase of the new Bukkit version), they get more specific with their conditions to ensure that they only get the right class returned. They do this for every field, class, and method they need to identify.
Since they now know which exact class is the Server class, they can go ahead and make changes to it. First they would need to implement the interface
class a implements org.bukkit.Server
And then implement the method
class a implements org.bukkit.Server {
String x_x317() {
return q_q98;
}
public String getVersion() {
return x_x317();
}
static a a_a1;
static String q_q98 = "1.9.4";
}
Now, we have a class that conforms to the Bukkit API.
When they need to instantiate that class, they just do something along the lines of
Server server = findAndTransformServerClassFromMinecraftJar();
// ...
Server findAndTransformServerClassFromMinecraftJar() {
// load classes from jar
// map them to the appropriate interfaces
// transform and hook the required classes and methods
Class<?> serverClass = doTheFirstThreeSteps();
return (Server) serverClass.newInstance();
}
I have multiple modules with service interfaces bound to their corresponding implementation types, and I am able to get an instance by using
injector.getInstance(MyServiceInterface.class)
I would like to retrieve the instance using
injector.getInstance("MyServiceInterface")
i.e. a string literal instead of the class type
How can I achieve this?
To elaborate my question further: I can retrieve the Class object from the string literal using a Class.forName(literal) call and then use it to retrieve the instance with injector.getInstance(clsInstance).
After retrieving the instance, which I receive as my base service interface type, I need to use reflection to invoke the method of the service object.
so Service serv = injector.getInstance(MyCustomService.class)
Now I need to invoke myCustomMethod() present in MyCustomService through reflection since this invoker is generic and is intended to work with multiple services without being aware of their actual type.
I will also need the Method interceptors configured on the service interfaces to be invoked transparently when I invoke the method on this instance reflectively.
While I'm not certain if there's functionality for that built into Guice itself, you could try getting the relevant Class<?> object yourself.
Something along the lines of:
Class<?> myServiceInterfaceClass = Class.forName("path.to.MyServiceInterface");
injector.getInstance(myServiceInterfaceClass);
This does however require that the current Classloader can access that specific class, etc.
This can't be done within Guice... because it can't be done, period! Think about it: let's say you have two classes with the same name in different packages. Which class would you instantiate?
So at the very least the String would have to have the fully qualified class name, e.g. instead of Integer, it would have java.lang.Integer.
However, if you know which classes you want to support in advance, you can use a MapBinder.
Tweaking their example to match your use case:
public class ServiceModule extends AbstractModule {
protected void configure() {
MapBinder<String, MyServiceInterface> mapbinder
= MapBinder.newMapBinder(binder(), String.class, MyServiceInterface.class);
mapbinder.addBinding("MyServiceInterface").to(MyServiceImpl.class);
bind(MyServiceInterface.class).to(MyServiceImpl.class);
}
}
Now you can inject like this:
class ServiceManager {
@Inject
public ServiceManager(Map<String, MyServiceInterface> services) {
MyServiceInterface service = services.get("MyServiceInterface");
// etc.
}
}
Please note that when you call inj.getInstance() you do have to know the return type of the object you're trying to create, unless you are planning on doing something like:
Object foo = inj.getInstance(Class.forName(myString));
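Putting the pieces together for the reflective part of the question, a rough sketch (the class and method names are just for illustration); since Guice weaves its method interceptors into the classes of the instances it creates, they should still fire on a reflective call:

import java.lang.reflect.Method;

import com.google.inject.Injector;

public class GenericServiceInvoker {

    private final Injector injector;

    public GenericServiceInvoker(Injector injector) {
        this.injector = injector;
    }

    // Looks up a binding by fully qualified class name and invokes a no-argument method on it.
    public Object invoke(String className, String methodName) throws Exception {
        Class<?> serviceClass = Class.forName(className);     // e.g. "com.example.MyCustomService"
        Object service = injector.getInstance(serviceClass);
        Method method = serviceClass.getMethod(methodName);   // e.g. "myCustomMethod"
        return method.invoke(service);
    }
}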
I have some Serializable Objects which I use with GWT's RPC mechanism.
I decided to make them all sub-class an Object containing common fields such as "id", "revision" etc.
However, I've noticed that GWT does not serialize fields of the super-class, so I just get every super-class field as null on the client-side.
How can I get the super-class fields serialized as well without having to write a CustomFieldSerializer for each and every one of my Serializable classes? Is it possible?
Example:
public class Super {
private String id;
public String getId() {
return id;
}
}
public class Sub extends Super implements Serializable {
private String name;
// more stuff here
}
// on the client side, inside an AsyncCallback
onSuccess(Sub sub) {
assert(sub.getId() != null);
}
So, when I send this through GWT's RPC mechanism to the client-side, I get a null value in the 'id' field of any instance of Sub. I ensured that on the server, id is not null. I also tried to make the super-class implement Serializable, without luck.
Any advice is welcome.
To serialize any class in GWT, the super-class has to implement Serializable as well.
To pass a bean you have to fulfill the following requirements (from the GWT site):
1. It implements either the Java Serializable or the GWT IsSerializable interface, either directly or because it derives from a superclass that does.
2. Its non-final, non-transient instance fields are themselves serializable.
3. It has a default (zero-argument) constructor with any access modifier (e.g. private Foo(){} will work).
The problem may have different causes:
1. Verify that the class has a default constructor (without arguments).
2. Verify that the class implements Serializable or IsSerializable, or implements an interface that extends Serializable, or extends a class that implements Serializable.
3. Verify that the class is in a client.* package or …
4. If the class is not in a client.* package, verify that it is included as a source path in your GWT xml module definition. By default <source path="client"/> is present. If your class is in another package you have to add it as a source path too. For example, if your class is under domain.* you should add <source path="domain"/> to the module xml. Be aware that the class cannot belong to a server package! (A sample module file is sketched after this list.)
5. If you are including the class from another GWT project, you have to add an inherits entry to your xml module definition. For example, if your class Foo is in the package com.dummy.domain, you have to add an inherits line for the module that covers com.dummy.domain.
6. If you are including the class from another GWT project released as a jar, verify that the jar also contains the source code, because GWT also recompiles the Java source for the classes passed to the client.
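For illustration, a module definition along the lines of points 4 and 5 might look like this (module and package names are made up):

<!-- MyApp.gwt.xml -->
<module>
    <inherits name="com.google.gwt.user.User"/>
    <!-- "client" is present by default; "domain" is added so its classes can cross the wire -->
    <source path="client"/>
    <source path="domain"/>
</module>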
If you want the data in Super to be serialized, you must make it Serializable.
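A minimal sketch of that change, based on the classes from the question (assuming the fields themselves are serializable):

// Super.java
import java.io.Serializable;

public class Super implements Serializable {
    private String id;

    public Super() {}            // zero-argument constructor, required for GWT-RPC

    public String getId() {
        return id;
    }
}

// Sub.java
import java.io.Serializable;

public class Sub extends Super implements Serializable {
    private String name;

    public Sub() {}
}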
I am working on a legacy system where there is a remote bean that has become too big and monolithic, and I would like to keep the new functionality that I need to add separate.
My initial idea was, instead of adding my new methods to the existing interface, to create a new interface with all my stuff and add a single method that returns a remote object implementing that interface.
The problem I am facing now is that when I'm invoking the method that returns my object, the runtime tries to serialize it instead of sending the stub.
The code layout is more or less like this:
@Stateless
public class OldBean implements OldRemoteInterface {
//lots of the old unrelated methods here
public MyNewStuff getMyNewStuff() {
return new MyNewStuff();
}
}
@Remote
public interface OldRemoteInterface {
//lots of the old unrelated methods declared here
MyNewStuff getMyNewStuff();
}
public class MyNewStuff implements NewRemoteInterface {
//methods implemented here
}
@Remote
public interface NewRemoteInterface {
//new methods declared here
}
And the exception I am getting is:
"IOP00810267: (MARSHAL) An instance of class MyNewStuff could not be marshalled:
the class is not an instance of java.io.Serializable"
I have tried to do it "the old way", extending the java.rmi.Remote interface instead of using the EJB @Remote annotation, and the exception I get is:
"IOP00511403: (INV_OBJREF) Class MyNewStuff not exported, or else is actually
a JRMP stub"
I know I must be missing something that should be obvious... :-/
Your approach here is a bit confusing. When you created the new interface, the next step should have been to have the old bean implement the new interface, like so:
public class OldBean implements OldRemoteInterface, NewRemoteInterface {
Your old bean would get larger, yes, but this is the only way you can expand the functionality of your old bean without creating a new bean or touching the old interface.
The object being returned by getMyNewStuff() is just a plain object -- it is not remote. That's why you're getting serialization errors: RMI is trying to transfer your NewRemoteInterface instance across the network. Annotating it with @Remote doesn't do anything (until you actually use the interface on a bean, deploy that bean, and then retrieve it using DI or a Context lookup).
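For completeness, a rough sketch of the route hinted at in that parenthesis: deploy the new functionality as its own bean and let the container hand out the remote reference instead of returning an object from OldBean. NewStuffBean and the JNDI name are made up; the exact lookup depends on your container and EJB version:

// Server side: a separate bean that exposes the new remote interface.
import javax.ejb.Stateless;

@Stateless
public class NewStuffBean implements NewRemoteInterface {
    // new methods implemented here
}

// Client side: obtain the remote reference from the container, not from OldBean.
import javax.naming.InitialContext;

public class NewStuffClient {
    public static void main(String[] args) throws Exception {
        InitialContext ctx = new InitialContext();
        NewRemoteInterface newStuff =
                (NewRemoteInterface) ctx.lookup("java:global/myapp/NewStuffBean"); // illustrative name
        // use newStuff ...
    }
}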