I am basically interested in analyzing the object form that we need to send to the client over the network in a client-server model.
I need to know what criteria we should consider when choosing between XML and Java serialization. Which is the best approach for transferring objects over the network?
Why did serialization come into the picture when we already have XML and JSON transformations?
Edit:
I wanted to know why serialization is used when XML/JSON were already in use before it came along.
If XML or JSON works for you, I would stick with that. It is much easier to log and to check that it is doing what you believe it should.
Java serialization has many advantages; however, unless you need them, it is unlikely to be the best solution.
it is built in, no extra JAR is required.
it is integrated with existing remote APIs
it supports all serializable object types and graphs. (XML and JSON only support trees of data, not typed objects)
it supports data types and you only write each reference once.
However, Java serialization is not:
very efficient
easy to read or validate as a human
flexible to schema changes, especially changes to package or class names
portable outside Java.
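To make the trade-off concrete, here is a minimal round trip through the built-in mechanism (the PriceList class and its fields are made up). A Serializable graph, cycles included, survives the trip with each reference written only once, but the output is an opaque binary blob rather than something you can log and eyeball:

    import java.io.*;
    import java.util.Arrays;
    import java.util.List;

    public class SerializationDemo {

        // Hypothetical payload class; anything reachable from it must also be Serializable.
        static class PriceList implements Serializable {
            private static final long serialVersionUID = 1L; // guards against accidental incompatibility
            List<String> items;
            PriceList self; // object graphs with cycles are fine, unlike plain XML/JSON trees
        }

        public static void main(String[] args) throws Exception {
            PriceList list = new PriceList();
            list.items = Arrays.asList("bread", "milk");
            list.self = list; // deliberate cycle

            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
                out.writeObject(list); // binary output, not human-readable
            }

            try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray()))) {
                PriceList copy = (PriceList) in.readObject();
                System.out.println(copy.items + ", cycle preserved: " + (copy.self == copy));
            }
        }
    }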
Personally, my preference is YAML. This is because:
it was designed to be human-readable, whereas XML is a subset of SGML and JSON is a subset of JavaScript.
it supports graphs and data types.
it is almost as efficient as JSON (it is a bit slower because it does more).
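For illustration, a small sketch of that route with SnakeYAML (an assumed third-party dependency; the Invoice bean and its fields are made up). The dump is plain, human-readable text and round-trips back into the bean:

    import java.util.Arrays;
    import java.util.List;
    import org.yaml.snakeyaml.Yaml;

    public class YamlDemo {

        // Hypothetical bean; SnakeYAML works off the public getters/setters.
        public static class Invoice {
            private String customer;
            private List<String> lines;

            public String getCustomer() { return customer; }
            public void setCustomer(String customer) { this.customer = customer; }
            public List<String> getLines() { return lines; }
            public void setLines(List<String> lines) { this.lines = lines; }
        }

        public static void main(String[] args) {
            Invoice invoice = new Invoice();
            invoice.setCustomer("ACME");
            invoice.setLines(Arrays.asList("2 x widget", "1 x gadget"));

            Yaml yaml = new Yaml();
            String text = yaml.dump(invoice);                // readable YAML text
            System.out.println(text);

            Invoice back = yaml.loadAs(text, Invoice.class); // round trip
            System.out.println(back.getCustomer());
        }
    }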
I need some guidance in designing an API wrapper for my backend APIs. I have tried to keep it as specific as possible.
Context: We have a project which supports certain file operations like edit, create, merge, etc. All the services are exposed as REST APIs. Now I have to create an API wrapper over this (a client library) in Java. I've been reading about DDD and trying to approach the problem using that.
As I see it, the core object in my project would be File, along with some minor DTOs for talking to the backend. Edit, create and merge will be the verbs here, acting on my domain object. I want to make it as easy as possible for an external developer to integrate the API. I would like the design to be something like this:
For creating a file: File.create(). For editing: File.edit(). The same goes for the other operations. I also want the ability to chain operations (along the lines of fluent interfaces) for readability.
For example, if you want to create a file and then convert it, it should be something like: File.create().convert(Required params)
My problem is that each of the operations is bulky and async. I don't want to write all the async error-handling logic in the File class. Chaining the methods as above won't be easy either if they return CompletableFuture objects, and this will be harder to maintain.
Question: What is a better way of solving this problem?
I am not looking for a spoon-fed design. I just want to be guided towards a design approach which fits the scenario. Feel free to point out if I am misunderstanding DDD.
Very roughly: your domain model is responsible for bookkeeping. The effects on the state of the filesystem are implemented in your infrastructure layer, and you send the results back to the bookkeeper.
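To make that concrete, here is a minimal sketch (all names are hypothetical) of one way to keep the fluent surface while pushing the async plumbing out of the domain class: the REST calls live behind a gateway interface in the infrastructure layer, and a thin pipeline object owns the CompletableFuture and composes each step with thenCompose.

    import java.util.concurrent.CompletableFuture;

    public class FluentAsyncSketch {

        // Infrastructure-layer port; the real implementation would call the REST APIs.
        interface FileGateway {
            CompletableFuture<String> create(String name);              // returns a file id
            CompletableFuture<String> convert(String id, String format);
        }

        // Fluent pipeline: each step composes onto the previous future,
        // so the domain object never has to deal with futures or errors.
        static final class FilePipeline {
            private final FileGateway gateway;
            private final CompletableFuture<String> stage;

            private FilePipeline(FileGateway gateway, CompletableFuture<String> stage) {
                this.gateway = gateway;
                this.stage = stage;
            }

            static FilePipeline create(FileGateway gateway, String name) {
                return new FilePipeline(gateway, gateway.create(name));
            }

            FilePipeline convert(String format) {
                // thenCompose keeps the chain flat instead of nesting futures
                return new FilePipeline(gateway, stage.thenCompose(id -> gateway.convert(id, format)));
            }

            CompletableFuture<String> toFuture() {
                // error handling is centralised here rather than in the File class
                return stage.exceptionally(ex -> {
                    throw new IllegalStateException("file pipeline failed", ex);
                });
            }
        }
    }

A caller would then write something like FilePipeline.create(gateway, "report.docx").convert("pdf").toFuture(), which still reads fluently but keeps all the CompletableFuture handling in one place.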
I'm working on a project in Java which is replacing an old C# program. In the old C# program, users could save the data they were working on to a GZip file, written by serializing a wrapper object which contained all the data and writing it using a GZipStream.
I would like to offer users the option to load in data saved with the old system. Opening the GZip file using Java's GZIPInputStream is easy enough, but how do I deserialize the object?
I did a little research and found this question: Can a serialized simple java object be deserialized by C#?
The answer there said that there's no library to do this, but there is documentation on how Java serializes objects, and it is possible to write your own code to deserialize them.
Is there a known way to do this from C# to Java? Or should I write my own converter from scratch?
You would have to serialize everything to a JSON or XML file; no binary data should be stored in the file at all. Then you can deserialize it in Java using any suitable library.
You might serialize it with JSON.NET and deserialize it with Jackson.
UPDATE: Alternatively, you might write a SaveData class, compile it into a .dll and make it available for COM interop. There are some Java third-party libraries which can make use of the SaveData class inside the DLL so that it can interoperate with the Java application. Then serialize this SaveData into a file (binary format is OK). However, I'm not sure how serialization would work out at the byte level: are you really going to get the same object when you deserialize it on the Java side? This solution might work, but it is ugly as hell and at least as "painful" as rewriting the C# app to write JSON.
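To sketch that JSON route (assuming the C# side is changed to write gzip-compressed JSON, e.g. via JSON.NET, and that SaveData is a hand-written Java POJO mirroring the C# wrapper class; the class and field names here are hypothetical):

    import java.io.FileInputStream;
    import java.io.InputStream;
    import java.util.List;
    import java.util.zip.GZIPInputStream;
    import com.fasterxml.jackson.databind.ObjectMapper;

    public class SaveDataReader {

        // Hypothetical mirror of the C# wrapper object; field names must match the JSON.
        public static class SaveData {
            public String projectName;
            public List<String> entries;
        }

        public static SaveData read(String path) throws Exception {
            ObjectMapper mapper = new ObjectMapper();
            // GZIPInputStream undoes the compression, Jackson maps the JSON onto the POJO.
            try (InputStream in = new GZIPInputStream(new FileInputStream(path))) {
                return mapper.readValue(in, SaveData.class);
            }
        }
    }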
You could embed a tiny .NET application inside your JAR and invoke the .NET application as described in this SO question.
The .NET application would deserialize the file into .NET objects and then convert them to XML or JSON. The XML/JSON could then be deserialized into equivalent Java objects in the JVM.
Recently, I was reading the HBase code and found that the client uses protobuf to communicate with the server.
Java has Serializable; why not use it?
Efficiency: protocol buffers are generally much more efficient at transmitting the same amount of data than Java binary serialization
Portability: Java binary serialization is not widely implemented outside Java, as far as I'm aware (unsurprisingly)
Robustness in the face of unrelated changes: unless you manually specify the serialVersionUID, you can end up making breaking changes in Java without touching the data at all. Ick.
Backward and forward compatibility: old code can read data written by new code, and new code can read data written by old code. (You still need to be careful, and the implications of changes are slightly different between proto2 and proto3, but basically protobuf makes this a lot easier to reason about than Java.)
It's a lot easier to accidentally introduce non-serializable members with Java binary serialization, whereas proto descriptor files are all about serialization... you can't refer to an arbitrary class, etc.
I've worked on projects using protocol buffers, and I've worked on projects using Java binary serialization - and I would be very reluctant to use the latter again...
Protocol Buffers is an open serialization protocol. You could write a client in C++ or C# and still be able to communicate with the server if both ends use the same Protocol Buffers schema. Java Serializable is Java-only.
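As a rough sketch of what that looks like on the Java side (the FileRequest message and its fields are made up, and the FileRequest class is assumed to be the output of compiling a schema like the one in the comment with protoc, generated into the same package):

    // Assumed schema, compiled with protoc --java_out=...:
    //
    //   syntax = "proto3";
    //   message FileRequest {
    //     string name = 1;
    //     int64  size = 2;
    //   }

    public class ProtoRoundTrip {
        public static void main(String[] args) throws Exception {
            // Build a message with the generated builder API.
            FileRequest request = FileRequest.newBuilder()
                    .setName("report.csv")
                    .setSize(1024L)
                    .build();

            // Compact wire format; any language with the same schema can read it.
            byte[] wire = request.toByteArray();

            // Parse it back -- this is what the other end of the connection does.
            FileRequest parsed = FileRequest.parseFrom(wire);
            System.out.println(parsed.getName() + " " + parsed.getSize());
        }
    }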
For instance, when I already know what all the representations should look like, which path has more advantages?
Creating/annotating Java classes and then documenting them in (let's say) XML documentation.
Writing the XML representation and then generating Java classes from it.
From what I can think of:
The first approach would increase development speed, since documentation is the last step (when it's feasible).
The second approach may make me feel as if I am the client (of my own app) and will lead me to consider how practical my representation is from a user's standpoint.
But I have no other ideas, while I assume that this topic might be important in some cases that I just don't know about yet.
My answer is "it depends".
I've written quite a lot of REST services and I stick by very simple rules:
If this is a public-facing API, and by public I mean that you do not control all the clients consuming it, or once they're released they're in the wild, then I will write the API first and make my Java model fit into that.
If it's purely internal and you have the ability to change it at will, then go for the Java model first and just use whatever representation gets spat out.
The latter is the faster development model, but it relies on you not really caring what the actual representation looks like.
Java POJOs can easily be serialized to XML, so I would generate the XML from existing Java POJOs (although I agree with the commenter that JSON is usually better).
I would go for POJOs, like ghostbust555. The reason for that is that XML is just one of many serialization formats. You could use JSON or even your own proprietary text format (or a binary format).
If you take plain POJOs you can then annotate them for XML marshalling/unmarshalling, via JAXB for instance, as in the sketch below. You could also use them for binary protocols such as Apache Thrift.
Lastly, you could actually start from a UML class diagram and use that to generate your POJOs. That helps you document your model better.
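A minimal sketch of that POJO-first route with JAXB (the Customer class and its fields are made up; this assumes a JAXB implementation is available, which is bundled with the JDK up to Java 8 and a separate dependency from Java 11 on):

    import java.io.StringWriter;
    import javax.xml.bind.JAXBContext;
    import javax.xml.bind.Marshaller;
    import javax.xml.bind.annotation.XmlElement;
    import javax.xml.bind.annotation.XmlRootElement;

    @XmlRootElement(name = "customer")
    public class Customer {

        private String name;
        private int id;

        @XmlElement
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }

        @XmlElement
        public int getId() { return id; }
        public void setId(int id) { this.id = id; }

        public static void main(String[] args) throws Exception {
            Customer c = new Customer();
            c.setName("Alice");
            c.setId(42);

            // The marshaller derives the XML representation from the annotations.
            Marshaller marshaller = JAXBContext.newInstance(Customer.class).createMarshaller();
            marshaller.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, Boolean.TRUE);
            StringWriter out = new StringWriter();
            marshaller.marshal(c, out);
            System.out.println(out);
        }
    }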
Creating POJOs is better than writing XML documents.
I found this post earlier today about whether to go XSD --> POJO or POJO --> XSD.
Java to XSD or XSD to Java
It made us question whether we should even bother with making XSDs in our scenario. We assumed we should write an XSD, generate Java POJOs from it, and then use those POJOs to transfer data between our REST server and clients. But what value do those XSDs have if we could just write the POJOs directly with the required annotations?
We thought it might help to write a custom XSD GUI tool instead of hand-coding the XSD, and that could be one benefit of having XSDs. But I assume there are also GUI tools for creating JAXB beans?
Is it worth having XSDs, at the cost of the extra project complexity of maintaining them and needing to generate the classes before compile time, in our scenario?
One benefit of an XML Scheme over a set of JAXB-annotated classes would be that you have a definition for "foreign" communication partners. Maybe you don't need it now, but what about tomorrow?
Another good reason for using XML Schema is that (if used wisely) it restricts your types to what is easily (un)marshalled. It's possible to create POJO classes that are darned hard to serialize.
The third reason I can think of is that you don't have to learn how to cope with all of those javax.xml.bind.annotation classes. It may be that you feel more familiar with them (and they may even permit you to do a trick or two over the XML Schema approach), but I've usually found it more convenient to write an XML Schema and let xjc handle the nitty-gritty details for me.
It depends on your requirements. In some cases the requirements (of a request/response, maybe) are defined by means of an XSD document, since an XSD can provide all the information (data types etc.) we need to build the classes; in those cases, direct generation of the POJOs is helpful, as we already have an XSD in place.
It might be overhead in your case if you have to first write an XSD and then generate the POJO objects from it.
See what suits you, and choose wisely!