Closed. This question is opinion-based. It is not currently accepting answers. Closed 7 years ago.
Recently I was reading the HBase source code, and I noticed that the client uses protobuf to communicate with the server. Java has Serializable, so why not use that?
Efficiency: protocol buffers generally encode the same data much more compactly than Java binary serialization, so there is less to transmit
Portability: Java binary serialization is not widely implemented outside Java, as far as I'm aware (unsurprisingly)
Robustness in the face of unrelated changes: unless you manually specify a serialVersionUID, you can end up making breaking changes in Java without touching the data at all. Ick.
Backward and forward compatibility: old code can read data written by new code, and new code can read data written by old code. (You still need to be careful, and the implications of changes are slightly different between proto2 and proto3, but basically protobuf makes this a lot easier to reason about than Java.)
Safety by construction: with Java binary serialization it's easy to accidentally introduce a non-serializable member, whereas proto descriptor files are all about serialization: you can't refer to an arbitrary class in the first place.
I've worked on projects using protocol buffers, and I've worked on projects using Java binary serialization - and I would be very reluctant to use the latter again...
Protocol Buffers is an open serialization protocol. You could write a client in C++ or C# and still communicate with the server, as long as both ends use the same Protocol Buffer schema. Java Serializable is Java-only.
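To make the efficiency point concrete, here is a minimal sketch of what Java binary serialization costs even for a trivial object. The Point class and its fields are hypothetical, purely for illustration:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class SerializationSize {
    static class Point implements Serializable {
        // Pin the UID explicitly, or unrelated edits can break compatibility.
        private static final long serialVersionUID = 1L;
        int x, y;
        Point(int x, int y) { this.x = x; this.y = y; }
    }

    static byte[] javaSerialize(Object o) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(o);
        }
        return bos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] bytes = javaSerialize(new Point(3, 4));
        // The stream carries the full class name plus field metadata, so even
        // two small ints cost dozens of bytes; a protobuf message with two
        // small varint fields would encode in about 4 bytes.
        System.out.println("java serialized size = " + bytes.length);
    }
}
```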
I'm trying to encrypt data with the AES algorithm using the libsodium library (https://download.libsodium.org/doc/) on Android, and I have several problems. The first is understanding the library: I can't find a clear explanation of its Java bindings, and if I don't know how the algorithm is meant to be used with this library, I can't use it. Back to the question: can I do this at all? Should I implement the same algorithm in both languages, or should I use a cross-language library like libsodium so that I know the algorithm is implemented the same way on both ends?
The AES algorithm is standardized, but the way it should be used is not. So either you have to design your own protocol or you can use an existing one. Using AES securely can be tricky, so in general using a library should be preferred. However, sometimes libraries or container formats cannot be validated to be secure, or they may not be sufficient for the specific use case. In that case there is nothing for it but to implement, and possibly design, the protocol yourself.
You might want to have a look at Fernet, which is also implemented in Java for Java 8.
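If you do end up assembling something yourself, here is a minimal sketch of authenticated encryption using the standard javax.crypto API rather than libsodium. The 96-bit IV and 128-bit tag are the conventional AES-GCM parameters; the class and method names are illustrative, not from any library:

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Arrays;

public class AesGcmDemo {
    static final int IV_BYTES = 12;   // 96-bit IV, the recommended size for GCM
    static final int TAG_BITS = 128;  // full-length authentication tag

    static byte[] encrypt(SecretKey key, byte[] iv, byte[] plaintext) throws Exception {
        Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
        c.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(TAG_BITS, iv));
        return c.doFinal(plaintext);
    }

    static byte[] decrypt(SecretKey key, byte[] iv, byte[] ciphertext) throws Exception {
        Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
        c.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(TAG_BITS, iv));
        return c.doFinal(ciphertext);
    }

    public static void main(String[] args) throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);
        SecretKey key = kg.generateKey();
        byte[] iv = new byte[IV_BYTES];
        new SecureRandom().nextBytes(iv); // never reuse an IV with the same key
        byte[] msg = "hello".getBytes(StandardCharsets.UTF_8);
        byte[] ct = encrypt(key, iv, msg);
        System.out.println(Arrays.equals(decrypt(key, iv, ct), msg));
    }
}
```

Because both sides only need to agree on "AES-256-GCM, 96-bit IV, 128-bit tag", this interoperates across languages without sharing a library.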
I am basically interested in analyzing the object form that we need to send to the client over the network in a client-server model.
I need to know what criteria we should consider when choosing between XML and Java serialization, and which is the best approach for transferring objects over the network.
Why did serialization come into the picture when we already had XML and JSON transformation?
Edit:
I want to know why Java serialization is used when XML/JSON were already in use before its invention.
If XML or JSON works for you, I would stick with that: it is much easier to log the messages and check that the system is doing what you believe it should.
Java Serialization has many advantages; however, unless you actually need them, it is unlikely to be the best solution.
it is built in, no extra JAR is required.
it is integrated with existing remote APIs
it supports all serializable object types and graphs. (XML and JSON only support trees of data, not typed objects)
it supports data types and you only write each reference once.
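The graph-support point above can be sketched as follows. The Node class is made up for illustration; the key observation is that a shared reference stays shared after the round trip, which a tree format like JSON cannot express directly:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.List;

public class GraphDemo {
    static class Node implements Serializable {
        private static final long serialVersionUID = 1L;
        String name;
        Node(String name) { this.name = name; }
    }

    // Serialize an object and read it back, as if it had crossed the network.
    static Object roundTrip(Object o) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(o);
        }
        try (ObjectInputStream ois = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()))) {
            return ois.readObject();
        }
    }

    @SuppressWarnings("unchecked")
    public static void main(String[] args) throws Exception {
        Node shared = new Node("shared");
        // The same object appears twice; the stream writes it only once.
        List<Node> copy = (List<Node>) roundTrip(new ArrayList<>(List.of(shared, shared)));
        // Identity survives: both list entries are one object, as in the original graph.
        System.out.println(copy.get(0) == copy.get(1));
    }
}
```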
However Java Serialization is not
very efficient
easy to read/validate as a human.
flexible for schema changes, especially changes in package or class names.
portable outside Java.
Personally, my preference is YAML. This is because
it was designed to be human-readable, whereas XML is a subset of SGML and JSON a subset of JavaScript.
it supports graphs and data types.
it is almost as efficient as JSON (it's a bit slower as it does more)
I have an API in C#/.NET for uploading files whose sizes vary between 10 B and 100 KB. The system receives around 5 such calls per second. Now I want to pass each file to a Java process (because it will be a Kafka producer and we want it to be JVM-based, while the C# API is legacy). Both will almost always reside on the same machine. What is the best way of doing that?
I read about jni4net and IKVM for interacting with Java from C#. Would they be better, or should I make it socket-based (a Web API in Java accepting the files), or should I read from the local filesystem where the C# app has uploaded the files, or is there another option I am missing?
In an environment with high concurrency, reading from the local file system might not be a good idea.
You could use memory-mapped files, which Java supports via FileChannel.
Depending on operating system, you could also use Named pipes for IPC, here is an article showing how to use pipes between .Net and Java:
http://v01ver-howto.blogspot.com/2010/04/howto-use-named-pipes-to-communicate.html
All options considered, I would go with sockets. They are portable, easy to implement, and will most likely meet your performance requirements.
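A minimal sketch of the socket route on the Java side. The port number and the framing (a 4-byte big-endian length prefix, then the file bytes, which the C# side would write with its own socket API) are assumptions, not anything stated in the question:

```java
import java.io.DataInputStream;
import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;

public class FileReceiver {

    // Read one length-prefixed frame: 4-byte big-endian length, then payload.
    static byte[] readFrame(DataInputStream in) throws IOException {
        int len = in.readInt();
        byte[] payload = new byte[len];
        in.readFully(payload);
        return payload;
    }

    public static void main(String[] args) throws IOException {
        try (ServerSocket server = new ServerSocket(9090)) {
            while (true) {
                try (Socket s = server.accept();
                     DataInputStream in = new DataInputStream(s.getInputStream())) {
                    byte[] file = readFrame(in);
                    // Hand the bytes to the JVM-based Kafka producer here.
                    System.out.println("received " + file.length + " bytes");
                }
            }
        }
    }
}
```

At 5 uploads/second of at most 100 KB each, a single-threaded accept loop like this is well within what loopback sockets handle comfortably.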
You may also use a message queue. You can either put the file on the queue as a binary message, or, if you are storing the files in the file system, put just the file's location on the queue.
A solution based on sockets or message queues also allows a more distributed architecture, i.e. not loading a single machine with too much work.
If you want to actually use the C# API from Java, a bridge is probably the way to go. It'll likely be more efficient than a Web API. Some bridges will also allow you to run the C# and the Java in the same process, which is more efficient still.
In addition to the bridges you mention, you might want to consider JNBridgePro. You can find more information at our website.
Disclosure -- I am affiliated with JNBridge.
For instance, when I know what all the representations should look like, which path has more advantages?
Creating/annotating Java classes first and then documenting them in (let's say) XML documentation.
Writing the XML representation first and then generating the Java classes from it.
From what I may think of:
The first approach would increase development speed, since documentation becomes the last step (when that is feasible).
The second approach lets me put myself in the client's shoes (for my own app) and makes me consider how practical my representation is from a user's standpoint.
But I have no other ideas, and I assume this topic might be important in cases I just don't know about yet.
My answer is "it depends".
I've written quite a lot of REST services and I stick by very simple rules:
If this is a public-facing API (by public I mean that you do not control all the clients consuming it, or that once it is released it is in the wild), then I write the API first and make my Java model fit into it.
If it's purely internal and you have the ability to change it at will, then go for the Java model first and just use whatever representation it spits out.
The latter is the faster development model, but it relies on you not really caring what the actual representation looks like.
Java POJOs can easily be serialized to XML, so I would generate the XML from the existing Java POJOs (although I agree with the commenter that JSON is usually better).
I would go for POJOs, like ghostbust555. The reason is that XML is just one of many serialization formats: you could use JSON or even your own proprietary text (or binary) format.
If you take plain POJOs you can then annotate them for XML marshalling / unmarshalling via JAXB for instance. You could also use them for binary protocols such as Apache Thrift.
Lastly you could actually start from a UML class diagram and use that to generate your POJOs. That helps you document your model better.
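As a minimal, JDK-only illustration of the model-first route, java.beans.XMLEncoder can emit an XML representation straight from a POJO; in practice JAXB or Jackson gives far more control over the output, and the Customer bean here is entirely hypothetical:

```java
import java.beans.XMLEncoder;
import java.io.ByteArrayOutputStream;

public class PojoToXml {
    // A plain JavaBean: public no-arg constructor plus getters/setters.
    public static class Customer {
        private String name = "";
        private int id;
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
        public int getId() { return id; }
        public void setId(int id) { this.id = id; }
    }

    // Serialize any bean to java.beans' XML format.
    static String toXml(Object bean) {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (XMLEncoder enc = new XMLEncoder(bos)) {
            enc.writeObject(bean);
        }
        return bos.toString();
    }

    public static void main(String[] args) {
        Customer c = new Customer();
        c.setName("Alice");
        c.setId(42);
        // Prints an XML document containing the two non-default properties.
        System.out.println(toXml(c));
    }
}
```

The point is the workflow, not the tool: the Java model is the source of truth and the XML falls out of it, which is exactly the trade-off discussed above.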
Creating POJOs is better than writing XML documents.
I currently need to resolve an issue with logic duplicated between a web-based monitoring app (Java) and a big legacy C server app.
For this I need to build new clients for the C app, but I have no idea which formats are easy for Java to read.
Should I use XML, Json, or some other format?
The answer depends entirely on your problem domain. Java has libraries available for reading XML, JSON, and a host of other formats.
You need to be asking questions like:
How much data will I be producing?
Does the data need to be human-readable?
Is storage size an issue?
Is the time to read / write the data an issue?
Do I need to support multiple, versioned protocols?
You can use either; JSON is the de facto standard for web-based messaging, and there are plenty of Java libraries that handle JSON efficiently.
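For example, assuming the widely used Jackson library (jackson-databind) is on the classpath, reading a JSON message from the C app takes only a few lines; the message shape here is made up:

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class JsonDemo {
    public static void main(String[] args) throws Exception {
        // A hypothetical monitoring message the C server might emit.
        String msg = "{\"sensor\":\"cpu\",\"value\":73.5}";

        ObjectMapper mapper = new ObjectMapper();
        JsonNode node = mapper.readTree(msg);

        System.out.println(node.get("sensor").asText());
        System.out.println(node.get("value").asDouble());
    }
}
```

On the C side, JSON is equally easy to produce with a library such as cJSON or jansson, which is part of why it is a comfortable default for this kind of bridge.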
There's no one-stop perfect fit here, but maybe Google's protobuf is a good idea?
There's a native C++ implementation, which is probably of no use to you; there's an active protobuf-c implementation for C that you might be able to use, though.