Recently I was asked this question in an interview: how can you pass an object between two JVMs? My response was using serialization, but I don't know if it's the right answer. How else could an object be passed between two JVMs?
Serialization is perhaps the only way out. Depending on your stack, you have several possibilities:
serialize objects and deserialize them on the other end (remember remote EJBs)
write the object to a file (JSON, etc.) and read it on the other end from a shared folder
or use microservices to send and receive objects
you could also try out tools like Protocol Buffers or Avro, as they tackle the serialization problem specifically
My personal preference would be to have a small server-side component (a service) for exchanging data.
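For illustration, a minimal sketch of the first option: plain Java serialization over a TCP socket between two JVMs. The Message class and the port number are invented for the example, and the class file must be on the classpath of both sides.

```java
// JVM 1 runs sender(), JVM 2 runs receiver(); Message is a made-up class
// that must be identical on both classpaths.
import java.io.*;
import java.net.ServerSocket;
import java.net.Socket;

class Message implements Serializable {
    private static final long serialVersionUID = 1L;
    final String text;
    Message(String text) { this.text = text; }
}

public class SerializationDemo {
    // JVM 1: accept a connection and write the object
    static void sender() throws IOException {
        try (ServerSocket server = new ServerSocket(9999);
             Socket socket = server.accept();
             ObjectOutputStream out = new ObjectOutputStream(socket.getOutputStream())) {
            out.writeObject(new Message("hello from JVM 1"));
        }
    }

    // JVM 2: connect and read the object back
    static void receiver() throws IOException, ClassNotFoundException {
        try (Socket socket = new Socket("localhost", 9999);
             ObjectInputStream in = new ObjectInputStream(socket.getInputStream())) {
            Message m = (Message) in.readObject();
            System.out.println(m.text);
        }
    }
}
```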
You can pass Java objects between two JVMs in a distributed environment as long as both JVM versions are identical. Please note, your JVM is platform dependent (platform meaning hardware/processor plus OS). You also need to make sure all dependent library files are available on both JVMs. Another drawback is that even if your JVMs are the same, the object passing is of no use if you are passing a Java object from one JVM to JRuby running in the receiving JVM, as the data types differ between the two.
Even if the JVMs are the same and all libraries are available, object passing is not recommended: you might decide to upgrade the OS from 32-bit to 64-bit, and in that case you would need to upgrade the JVM as well.
Object passing is something I prefer only within the same JVM.
You can store the object externally. Instead of a file, you can go with a fast-access data structure store like Redis (https://redis.io/). Both (or more) instances can read and write data to the same data store. You can also look into JMS (Java Message Service).
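As a sketch of the Redis route, assuming the Jedis client library (key name, host, and port are arbitrary), one JVM can put() and another can get():

```java
// Store/retrieve a serialized object in Redis so any JVM can access it.
// Assumes the Jedis client; "localhost:6379" is a placeholder.
import java.io.*;
import redis.clients.jedis.Jedis;

public class RedisObjectStore {
    public static void put(String key, Serializable obj) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(obj);
        }
        try (Jedis jedis = new Jedis("localhost", 6379)) {
            jedis.set(key.getBytes(), bytes.toByteArray());
        }
    }

    public static Object get(String key) throws IOException, ClassNotFoundException {
        byte[] data;
        try (Jedis jedis = new Jedis("localhost", 6379)) {
            data = jedis.get(key.getBytes());
        }
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(data))) {
            return in.readObject();
        }
    }
}
```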
I've got a Java web server, and I made it so that the server will respond to a certain HTTP request by sending back a Java object that contains an "execute" method.
I'd like to be able to execute a remote object's method.
I can't use reflection because I don't send the class; I'm thinking about making a local class that has the same method and package name so I can try object.getClass().
I don't want to put the entire block of code in the toString() of the object that I will send (by overriding it).
I can't cast to an interface.
I'm also thinking about making a .jar library that has the definition of the class file that will be created on the server and accessed on the client; how could this work?
I couldn't find another question regarding this, so I will leave this here.
EDIT:
I'm using URLConnection to communicate with a servlet; the servlet makes an instance of the object on the server, then sends it to the client using ObjectOutputStream, with ObjectInputStream on the client to receive it.
Looking for some alternatives to RMI; if there are none, I will look up some RMI tutorials.
Regarding my choice not to use RMI in the first place: maybe I don't want to make a client-server connection every time; maybe I want to deserialize objects and check/invoke their methods.
If you are going to "send" serialized objects from one Java virtual machine (Java process) to another, you need to have the .class files already present at both ends. If you decide to continue with your current approach, you would need the following:
Your client must be Java, or be able to run Java, and have the .class files that correspond to the objects that it is receiving locally available, or must download them from the server before accessing them.
You must somehow wrap serialized object streams within HTTP. HTTP is a protocol for requesting and sending web pages. It is incompatible with Java's serialization protocol (it contains extra headers, for example), and you would need to wrap Java serialization inside HTTP payloads for things to work as you seem to expect.
When you send serialized objects, you are actually sending "object graphs" (the object and all objects accessible by navigating its fields). This can end up being inefficient. Serialization may not be the best answer for you for this reason.
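To make the second point concrete, here is a sketch of a servlet whose HTTP response body is a serialized object, matching the URLConnection/ObjectOutputStream setup from the question; the Task class is invented for the example and must be present on both client and server.

```java
// Server side: the serialized object graph becomes the HTTP payload.
// Task is a hypothetical Serializable class shared by both ends.
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

class Task implements Serializable {
    private static final long serialVersionUID = 1L;
    final String command;
    Task(String command) { this.command = command; }
}

public class ObjectServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        resp.setContentType("application/octet-stream");
        ObjectOutputStream out = new ObjectOutputStream(resp.getOutputStream());
        out.writeObject(new Task("do something"));
        out.flush();
    }
}
```

On the client, new ObjectInputStream(urlConnection.getInputStream()).readObject() reads the object back, provided Task.class is available locally.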
It is far easier to use other mechanisms:
If you avoid HTTP, you avoid the need for extra wrappers. Writing a simple server that, when connected to, receives and sends serialized objects is much easier and more efficient than writing an HTTP wrapper within a traditional Java webapp (Java app servers tend to be resource-hungry).
Consider using Kryo or other Java serialization/networking libraries - they come with built-in servers, and allow very fine-grained control over what is being sent.
Java has built-in support for RMI ("Remote Method Invocation"). This seems to be what you are actually trying to achieve: you no longer need to be aware of whether objects are local or remote; they appear to work the same, and all required networking and serialization is done behind the scenes. Read all about it.
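A minimal RMI sketch of the "remote execute" idea, with all names invented for illustration:

```java
// The client calls execute() on a stub; RMI does the networking and
// serialization behind the scenes.
import java.rmi.Remote;
import java.rmi.RemoteException;
import java.rmi.registry.LocateRegistry;
import java.rmi.registry.Registry;
import java.rmi.server.UnicastRemoteObject;

interface Executor extends Remote {
    String execute(String arg) throws RemoteException;
}

class ExecutorImpl extends UnicastRemoteObject implements Executor {
    ExecutorImpl() throws RemoteException { super(); }
    public String execute(String arg) { return "executed: " + arg; }
}

public class RmiServer {
    public static void main(String[] args) throws Exception {
        Registry registry = LocateRegistry.createRegistry(1099);
        registry.rebind("executor", new ExecutorImpl());
        // Client, in another JVM:
        //   Executor ex = (Executor) LocateRegistry.getRegistry("host", 1099)
        //                                           .lookup("executor");
        //   System.out.println(ex.execute("task"));
    }
}
```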
I want a simple file format to store and retrieve data from disk in Java.
name=value
list=value1,value2,value3
this is mostly going to be used for initial config settings at startup of the app. I could envision having a watcher on the file to notify the app if it changes so the new settings can be applied, but that would be a nice-to-have. The first part would be pretty easy to write; I just don't want to reinvent the wheel if something is already out there for this, and I'd prefer to avoid something as heavy as Spring.
Take a look at the java.util.Properties class.
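A minimal sketch of loading the question's format with Properties; the file name is arbitrary, and note that the comma-separated list needs a manual split:

```java
// Load name=value settings at startup with java.util.Properties.
import java.io.FileReader;
import java.util.Properties;

public class AppSettings {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        try (FileReader reader = new FileReader("app.properties")) {
            props.load(reader);
        }
        String name = props.getProperty("name");
        // Properties has no list type; split the value yourself.
        String[] list = props.getProperty("list", "").split(",");
        System.out.println(name + " / " + list.length + " list entries");
    }
}
```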
You can use the Preferences class. It has a notification system, but alas it doesn't notice changes made outside the running JVM or directly to the underlying configuration store (e.g. the config file). It's a really nice class though.
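A short sketch, bearing in mind the caveat above: the listener fires only for changes made through the Preferences API inside this JVM.

```java
// java.util.prefs.Preferences with a change listener.
import java.util.prefs.Preferences;

public class PrefsDemo {
    public static void main(String[] args) throws Exception {
        Preferences prefs = Preferences.userNodeForPackage(PrefsDemo.class);
        prefs.addPreferenceChangeListener(
                evt -> System.out.println(evt.getKey() + " -> " + evt.getNewValue()));
        prefs.put("name", "value");   // triggers the listener (asynchronously)
        String name = prefs.get("name", "default");
        prefs.flush();                // persist to the backing store
    }
}
```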
Have a look at OWNER API.
It incorporates most of the features of java.util.Properties and adds more.
Version 1.0.4 is under development and will have:
support for arrays and collections (lists, sets, arrays); already implemented on the master branch.
"hot reload": when you change the file, the config object gets reloaded (synchronously or asynchronously, with event notification on reload); already implemented on the master branch.
a lot of features (variable expansion, type conversion), available since version 1.0.3 on the Maven Central repository.
Also planned for 1.0.4 is a validation mechanism that will check that the file is compliant before discarding the old config content during the reload (not implemented yet).
If you need a particular feature, just ask on the GitHub issues or become a contributor.
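A sketch of typical OWNER usage, assuming the list support described above; the file name and keys mirror the question's example:

```java
// OWNER maps interface methods to property keys by name.
import java.util.List;
import org.aeonbits.owner.Config;
import org.aeonbits.owner.Config.Sources;
import org.aeonbits.owner.ConfigFactory;

@Sources("file:app.properties")
interface AppConfig extends Config {
    String name();         // reads the "name" key

    List<String> list();   // reads "list" as a comma-separated list
}

class OwnerDemo {
    public static void main(String[] args) {
        AppConfig cfg = ConfigFactory.create(AppConfig.class);
        System.out.println(cfg.name() + " / " + cfg.list());
    }
}
```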
Very specifically, in JDI and JPDA context, I have the following questions:
Why does ObjectReference not expose its underlying object? Is this based on some specification? Are all implementations, such as the Eclipse Debug Project's, the same in not exposing the underlying object?
Given that you have the uniqueID() from ObjectReference, is there any way to resolve the underlying object from the JVM?
If no to the previous question, then what is the best way to resolve the underlying object? I should add that I am familiar with how Values can be obtained from StackFrame information, but I really need the object reference, not the internal values or structure of the fields.
Why does ObjectReference not expose its underlying object?
I am assuming that you are referring to the com.sun.jdi.ObjectReference interface. If so, it is a combination of two things:
On the face of it, it wouldn't make sense: the ObjectReference lives in the JVM running the debugger, but the corresponding Java object exists on the target machine.
Assuming that it did make sense, it would be a bad thing to expose the actual object addresses and memory contents. This would allow the debugger to do things to the target JVM that would lead to hard crashes.
Given that you have the uniqueID() from ObjectReference, is there any way to resolve the underlying object from the JVM?
No.
If no to the previous question, then what is the best way to resolve the underlying object?
AFAIK, there is no way to do this, apart from writing your own debug agent in C / C++ using the JVM Tool Interface and configuring the target JVM to run it.
I am implementing a log server in C++ that accepts log messages from a Java program (via the log4j socket appender). How do I read these Java logging objects in C++?
You should configure the log4j appender to send messages in XML format. Then it is simply a matter of reading XML in C++.
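For instance, a sketch using log4j 1.x's XMLLayout with a WriterAppender pointed at the C++ server's socket; host and port are placeholders. (The stock SocketAppender always ships serialized LoggingEvent objects and ignores layouts, which is why a writer-based appender is used here.)

```java
// Send XML-formatted log events over a socket instead of serialized
// Java objects. "logserver.example.com:4560" is a placeholder.
import java.net.Socket;
import org.apache.log4j.Logger;
import org.apache.log4j.WriterAppender;
import org.apache.log4j.xml.XMLLayout;

public class XmlLogSetup {
    public static void main(String[] args) throws Exception {
        Socket socket = new Socket("logserver.example.com", 4560);
        Logger.getRootLogger().addAppender(
                new WriterAppender(new XMLLayout(), socket.getOutputStream()));
        Logger.getLogger(XmlLogSetup.class).info("hello in XML");
    }
}
```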
Serialized Java objects are a byte stream which needs meta-information from the Java runtime to be able to reconstruct the Java objects. Without that meta-information available in the system, you must add it yourself, which is tedious and error-prone. I second the idea of sending XML instead; that is what XML serialization was invented for :)
Another very fast way of doing language-agnostic serialization is protobuf. The .proto files (meta-files that describe your data structures) are compiled using protoc, which generates I/O code for various target languages.
I'm using it in my app and did some benchmarking which might give you a clue if it serves your purpose.
The only downside I'm aware of is that protobuf does not handle references at all. If one of your objects contains the same object twice, it will be written twice instead of just once with a reference to the previous instance (which is the case with Java serialization).
Concerning your original question, I agree with Thorbjørn that reading and writing of serialized Java objects will be too hard and error prone.
If you consider going the protobuf way, feel free to use this logging event protobuf file as a starter.
JSON is the best way to go for this kind of problem.
Log4cxx is a Log4j port to C++; perhaps you can glean some ideas from it, or even use it directly?
JSON! JSON! JSON! JSON!
I have one Java program that has to be compiled as 1.4, and another program that could be anything (so 1.4 or 1.6), and the two need to pass serialized objects back and forth. If I define a serializable class in a place where both programs can see it, will Java's serialization still work, or do I need 1.6-to-1.6 or 1.4-to-1.4 only?
Make sure the classes to be serialized define and assign a value to a static final long serialVersionUID field, and you should be OK.
That said, normally I would not do this. My preference is to use normal serialization only within a single process, or between two processes on the same machine that get the serialized classes out of the same jar file. If that's not the case, serializing to XML is the better and safer choice.
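The declaration itself is one line; the class name here is illustrative:

```java
// Pinning serialVersionUID so copies compiled under 1.4 and 1.6 are
// treated as the same class version by the serialization machinery.
import java.io.Serializable;

public class Payload implements Serializable {
    private static final long serialVersionUID = 1L;  // explicit, never computed
    private String data;
}
```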
Along with the serialVersionUID, the package structure has to remain consistent for serialization, so if you had myjar.mypackage.myclass in 1.4, you have to have myjar.mypackage.myclass in 1.6.
It is not uncommon to have the Java version or your release version somewhere in the package structure. Even if the serialVersionUID remains the same between compilations, such a package change will cause an incompatible-version exception to be thrown at runtime.
By the way, if you implement Serializable in your classes, you should get a compiler warning if serialVersionUID is missing.
In my view (and based on some years of quite bitter experience), Java native serialization is fraught with problems and ought to be avoided if possible, especially as there is excellent XML/JSON support. If you do have to serialize natively, then I recommend that you hide your classes behind interfaces and implement a factory pattern in the background which will create an object of the right class when needed.
You can also use this abstraction to detect the incompatible-version exception and do whatever conversion is necessary behind the scenes to migrate the data in your objects.
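A sketch of that factory idea, with every name invented for illustration: callers depend only on an interface, and the factory catches the incompatible-version exception and routes the old bytes to a conversion step.

```java
// Callers see only Document; DocumentFactory hides the concrete class and
// the version-migration logic. All names are hypothetical.
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InvalidClassException;
import java.io.ObjectInputStream;
import java.io.Serializable;

interface Document extends Serializable {
    String text();
}

class DocumentFactory {
    static Document read(InputStream raw) throws IOException, ClassNotFoundException {
        byte[] bytes = raw.readAllBytes();  // Java 9+; buffer manually on older JVMs
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return (Document) in.readObject();
        } catch (InvalidClassException e) {
            return migrate(bytes, e);       // incompatible version: convert old form
        }
    }

    private static Document migrate(byte[] oldForm, InvalidClassException cause) {
        // Real code would parse and convert the old serialized form here.
        throw new UnsupportedOperationException("migration not implemented", cause);
    }
}
```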
Java library classes should have compatible serialised forms between 1.4 and 1.6 unless otherwise stated. Swing explicitly states that it is not compatible between versions, so if you are trying to serialise Swing objects, you are out of luck.
You may run into problems where the code generated by javac is slightly different, which will change the computed serialVersionUID. You should ensure you explicitly declare the UID in all your serialisable classes.
No, different versions of the JVM will not break serialization itself.
If some of the objects you are serializing are from the Java runtime, and their classes have evolved incompatibly, you will see failures. Most core Java classes are careful about this, but there have been discontinuities in some packages in the past.
I've successfully used serialization (in the context of RMI) with classes from different compilations on different machines running different versions of the Java runtime for years.
I don't want to digress too far from the original question, but I want to note that evolving a serialized class always requires care, regardless of the format. It is not an issue specific to Java serialization. You have to deal with the same concepts whether you serialize in XML, JSON, ASN.1, etc. Java serialization gives a fairly clear specification of what is allowed and how to make the changes that are allowed. Sometimes this is restrictive, other times it is helpful to have a prescription.
If both sides use the same jar file, it will work most of the time. However, if you use different versions of the same package/module/framework (for instance, different WebLogic jars, or extended usage of some "rare" exceptions), a lot of integration testing is needed before it can be approved.