I am using msgpack to serialize data, and I have some code that works fine for serializing data:
public void testJackson() throws Exception {
ByteArrayOutputStream out = new ByteArrayOutputStream();
String data1 = "test data";
int data2 = 10;
List<String> data3 = new ArrayList<String>();
data3.add("list data1");
data3.add("list data1");
ObjectMapper mapper = new ObjectMapper();
mapper.writeValue(out, data1);
mapper.writeValue(out, data2);
mapper.writeValue(out, data3);
// TODO: How to deserialize?
}
But now I don't know how to deserialize the data.
I haven't found a solution anywhere. It would be great if anyone could help me figure out how to proceed.
The problem
I have tried many of the readValue methods, but I can only get the first String; I have no idea how to read the second and third values.
The thing is, Jackson always reads the first value, since the data is neither removed from the stream nor did you explicitly tell Jackson that the next value runs from position A to position B.
Solutions
This example works and is similar to your code, but it is not very elegant. Here I explicitly tell Jackson where my data is, but I have to know how it was written, which makes the solution far too specific.
File dataFile = new File("jackson.txt");
if(!dataFile.exists())
dataFile.createNewFile();
FileOutputStream fileOut = new FileOutputStream(dataFile);
ByteArrayOutputStream out = new ByteArrayOutputStream();
FileInputStream fileIn = new FileInputStream(dataFile);
String writeData1 = "test data";
int writeData2 = 10;
List<String> writeData3 = new ArrayList<String>();
writeData3.add("list data1");
writeData3.add("list data1");
ObjectMapper mapper = new ObjectMapper();
byte[] writeData1Bytes = mapper.writeValueAsBytes(writeData1);
out.write(writeData1Bytes);
byte[] writeData2Bytes = mapper.writeValueAsBytes(writeData2);
out.write(writeData2Bytes);
byte[] writeData3Bytes = mapper.writeValueAsBytes(writeData3);
out.write(writeData3Bytes);
out.writeTo(fileOut);
// TODO: How to deserialize?
int pos = 0;
byte[] readData = new byte[1000];
fileIn.read(readData);
String readData1 = mapper.readValue(readData, pos, writeData1Bytes.length, String.class);
pos += writeData1Bytes.length;
Integer readData2 = mapper.readValue(readData, pos, writeData2Bytes.length, Integer.class);
pos += writeData2Bytes.length;
ArrayList readData3 = mapper.readValue(readData, pos, writeData3Bytes.length, ArrayList.class);
pos += writeData3Bytes.length;
System.out.printf("readData1 = %s%n", readData1);
System.out.printf("readData2 = %s%n", readData2);
System.out.printf("readData3 = %s%n", readData3);
The file then looks like this:
"test data"10["list data1","list data1"]
How to do it correctly
A far more elegant way is to encapsulate your data in an object that can be turned into a single valid JSON document; then Jackson won't need any extra information.
public class JacksonTest {
public static class DataNode {
@JsonProperty("data1")
private String data1;
@JsonProperty("data2")
private int data2;
@JsonProperty("data3")
private List<String> data3;
//needed for Jackson
public DataNode() {
}
public DataNode(String data1, int data2, List<String> data3) {
this.data1 = data1;
this.data2 = data2;
this.data3 = data3;
}
}
public static void main(String[] args) throws Exception {
File dataFile = new File("jackson.txt");
if(!dataFile.exists())
dataFile.createNewFile();
FileOutputStream fileOut = new FileOutputStream(dataFile);
ByteArrayOutputStream out = new ByteArrayOutputStream();
FileInputStream fileIn = new FileInputStream(dataFile);
String writeData1 = "test data";
int writeData2 = 10;
List<String> writeData3 = new ArrayList<String>();
writeData3.add("list data1");
writeData3.add("list data1");
DataNode writeData = new DataNode(writeData1, writeData2, writeData3);
ObjectMapper mapper = new ObjectMapper();
mapper.writeValue(out, writeData);
out.writeTo(fileOut);
// TODO: How to deserialize?
DataNode readData = mapper.readValue(fileIn, DataNode.class);
System.out.printf("readData1 = %s%n", readData.data1);
System.out.printf("readData2 = %s%n", readData.data2);
System.out.printf("readData3 = %s%n", readData.data3);
}
}
The content of the file looks like this:
{"data1":"test data","data2":10,"data3":["list data1","list data1"]}
You'll want to use one of the readValue methods from ObjectMapper - probably one that has a Reader or InputStream as the first parameter.
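As a side note, if the values really are written one after another as in the question's first snippet, one option (just a sketch, assuming Jackson 2.x, and not necessarily what this answer had in mind) is to read them from a single JsonParser (com.fasterxml.jackson.core.JsonParser, with TypeReference from com.fasterxml.jackson.core.type), so Jackson keeps track of its position between values:
ObjectMapper mapper = new ObjectMapper();
JsonParser parser = mapper.getFactory().createParser(out.toByteArray()); // "out" from the question
String readData1 = mapper.readValue(parser, String.class);
Integer readData2 = mapper.readValue(parser, Integer.class);
List<String> readData3 = mapper.readValue(parser, new TypeReference<List<String>>() {});
parser.close();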
@Japu_D_Cret Thank you for such a detailed answer!
Actually I want to use msgpack to transfer data, and I made it work with msgpack; here is my code:
ByteArrayOutputStream out = new ByteArrayOutputStream();
String data1 = "test data";
int data2 = 10;
List<String> data3 = new ArrayList<String>();
data3.add("list data1");
data3.add("list data1");
MessagePack packer = new MessagePack();
packer.write(out, data1);
packer.write(out, data2);
packer.write(out, data3);
// TODO: How to deserialize?
BufferUnpacker unpacker = packer.createBufferUnpacker(out.toByteArray());
System.out.println(unpacker.readString());
System.out.println(unpacker.readInt());
System.out.println(unpacker.read(Templates.tList(Templates.TString)));
Then I found jackson-databind on the msgpack website, and it supports the msgpack format as well.
I ran some tests on the two and found that Jackson's serialization performance is better than msgpack's, so I want to use Jackson instead of msgpack.
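For reference, jackson-databind can also read and write the msgpack binary format through the jackson-dataformat-msgpack module. A rough sketch, assuming the org.msgpack:jackson-dataformat-msgpack dependency is on the classpath and reusing the DataNode class from the answer above:
ObjectMapper msgpackMapper = new ObjectMapper(new org.msgpack.jackson.dataformat.MessagePackFactory());
byte[] packed = msgpackMapper.writeValueAsBytes(new DataNode(data1, data2, data3)); // msgpack bytes
DataNode unpacked = msgpackMapper.readValue(packed, DataNode.class);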
I'm currently trying to write a custom stream proxy (let's call it that) which can change the content of a given input stream and produce modified output if necessary. This requirement is really necessary because sometimes I have to modify streams in my application (e.g. compress the data truly on the fly). The following class is pretty simple and uses internal buffering.
private static class ProxyInputStream extends InputStream {
private final InputStream iStream;
private final byte[] iBuffer = new byte[512];
private int iBufferedBytes;
private final ByteArrayOutputStream oBufferStream;
private final OutputStream oStream;
private byte[] oBuffer = emptyPrimitiveByteArray;
private int oBufferIndex;
ProxyInputStream(InputStream iStream, IFunction<OutputStream, ByteArrayOutputStream> oStreamFactory) {
this.iStream = iStream;
oBufferStream = new ByteArrayOutputStream(512);
oStream = oStreamFactory.evaluate(oBufferStream);
}
@Override
public int read() throws IOException {
if ( oBufferIndex == oBuffer.length ) {
iBufferedBytes = iStream.read(iBuffer);
if ( iBufferedBytes == -1 ) {
return -1;
}
oBufferIndex = 0;
oStream.write(iBuffer, 0, iBufferedBytes);
oStream.flush();
oBuffer = oBufferStream.toByteArray();
oBufferStream.reset();
}
return oBuffer[oBufferIndex++];
}
}
Let's assume we also have a sample test output stream that simply adds a space character before every written byte ("abc" -> " a b c") like this:
private static class SpacingOutputStream extends OutputStream {
private final OutputStream outputStream;
SpacingOutputStream(OutputStream outputStream) {
this.outputStream = outputStream;
}
@Override
public void write(int b) throws IOException {
outputStream.write(' ');
outputStream.write(b);
}
}
And the following test method:
private static void test(final boolean useDeflater) throws IOException {
final FileInputStream input = new FileInputStream(SOURCE);
final IFunction<OutputStream, ByteArrayOutputStream> outputFactory = new IFunction<OutputStream, ByteArrayOutputStream>() {
@Override
public OutputStream evaluate(ByteArrayOutputStream outputStream) {
return useDeflater ? new DeflaterOutputStream(outputStream) : new SpacingOutputStream(outputStream);
}
};
final InputStream proxyInput = new ProxyInputStream(input, outputFactory);
final OutputStream output = new FileOutputStream(SOURCE + ".~" + useDeflater);
int c;
while ( (c = proxyInput.read()) != -1 ) {
output.write(c);
}
output.close();
proxyInput.close();
}
This test method simply reads the file content and writes it to another stream, which can then be modified somehow along the way. When the test method runs with useDeflater=false, the approach works as expected. But when it is invoked with useDeflater set to true, it behaves really strangely and writes almost nothing (apart from the 78 9C header). I suspect that the deflater class may not be designed for the approach I'd like to use, but I have always believed that the ZIP format and the deflate compression were designed to work on the fly.
Probably I'm wrong at some point about the specifics of the deflate compression algorithm. What am I really missing? Perhaps there is another approach to writing such a "stream proxy" that behaves exactly the way I want. How can I compress data on the fly while being limited to streams only?
Thanks in advance.
UPDATE: The following basic version works pretty well with a deflater and an inflater:
public final class ProxyInputStream<OS extends OutputStream> extends InputStream {
private static final int INPUT_BUFFER_SIZE = 512;
private static final int OUTPUT_BUFFER_SIZE = 512;
private final InputStream iStream;
private final byte[] iBuffer = new byte[INPUT_BUFFER_SIZE];
private final ByteArrayOutputStream oBufferStream;
private final OS oStream;
private final IProxyInputStreamListener<OS> listener;
private byte[] oBuffer = emptyPrimitiveByteArray;
private int oBufferIndex;
private boolean endOfStream;
private ProxyInputStream(InputStream iStream, IFunction<OS, ByteArrayOutputStream> oStreamFactory, IProxyInputStreamListener<OS> listener) {
this.iStream = iStream;
oBufferStream = new ByteArrayOutputStream(OUTPUT_BUFFER_SIZE);
oStream = oStreamFactory.evaluate(oBufferStream);
this.listener = listener;
}
public static <OS extends OutputStream> ProxyInputStream<OS> proxyInputStream(InputStream iStream, IFunction<OS, ByteArrayOutputStream> oStreamFactory, IProxyInputStreamListener<OS> listener) {
return new ProxyInputStream<OS>(iStream, oStreamFactory, listener);
}
@Override
public int read() throws IOException {
if ( oBufferIndex == oBuffer.length ) {
if ( endOfStream ) {
return -1;
} else {
oBufferIndex = 0;
do {
final int iBufferedBytes = iStream.read(iBuffer);
if ( iBufferedBytes == -1 ) {
if ( listener != null ) {
listener.afterEndOfStream(oStream);
}
endOfStream = true;
break;
}
oStream.write(iBuffer, 0, iBufferedBytes);
oStream.flush();
} while ( oBufferStream.size() == 0 );
oBuffer = oBufferStream.toByteArray();
oBufferStream.reset();
}
}
return !endOfStream || oBuffer.length != 0 ? (int) oBuffer[oBufferIndex++] & 0xFF : -1;
}
}
I don't believe that DeflaterOutputStream.flush() does anything meaningful. The deflater will accumulate data until it has something to write out to the underlying stream. The only way to force the remaining bit of data out is to call DeflaterOutputStream.finish(). However, this would not work for your current implementation, as you can't call finish() until you are entirely done writing.
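With the updated ProxyInputStream from the question, one place to do that is the afterEndOfStream hook. A minimal sketch, assuming IProxyInputStreamListener declares a single afterEndOfStream(OS) method that is allowed to throw IOException:
IProxyInputStreamListener<DeflaterOutputStream> finishOnEof =
        new IProxyInputStreamListener<DeflaterOutputStream>() {
            @Override
            public void afterEndOfStream(DeflaterOutputStream deflaterStream) throws IOException {
                // finish() forces the deflater to emit whatever it is still holding back
                // into the intermediate buffer, so the proxy can hand it out
                deflaterStream.finish();
            }
        };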
It's actually very difficult to write a compressed stream and read it within the same thread. In the RMIIO project I actually do this, but you need an arbitrarily sized intermediate output buffer (and you basically need to push data in until something comes out compressed on the other end, then you can read it). You might be able to use some of the util classes in that project to accomplish what you want to do.
Why not use GZIPOutputStream?
I'm a little lost. But: you should simply use the original outputStream when you don't want to compress and new GZIPOutputStream(outputStream) when you DO want to compress. That's all. Anyway, check that you are flushing the output streams.
Gzip vs zip
Also: GZIP (compressing a stream, which is what you're doing) is one thing, and writing a valid ZIP file (file headers, file directory, entries (header, data)*) is another. Check ZipOutputStream.
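A rough sketch of that suggestion, with a placeholder compress flag and output file name (GZIPOutputStream lives in java.util.zip):
boolean compress = true;                                        // placeholder flag
OutputStream target = new FileOutputStream(SOURCE + ".gz");
OutputStream output = compress ? new GZIPOutputStream(target) : target;
// write to output as usual ...
output.close();                                                 // close() also writes the gzip trailer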
Be careful: if somewhere you use the method
int read(byte[] b, int off, int len) and an exception occurs in the line
final int iBufferedBytes = iStream.read(iBuffer);
you will get stuck in an infinite loop.
I am beginning with Java and testng test cases.
I need to write a class, which reads data from a file and makes an in-memory data structure and uses this data structure for further processing. I would like to test, if this DS is being populated correctly. This would call for dumping the DS into a file and then comparing the input file with the dumped file. Is there any testNG assert available for file matching? Is this a common practice?
I think it would be better to compare the data itself, not the written-out data.
So I would write a method in the class to return this data structure (let's call it getDataStructure()) and then write a unit test to compare with the correct data.
This only requires a correct equals() method in your data structure class; then do:
Assert.assertEquals(yourClass.getDataStructure(), correctData);
Of course if you need to write out the data structure to a file, then you can test the serialization and deserialization separately.
File compare/matching can be extracted to a utility method or something like that.
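A minimal TestNG sketch of this idea; YourClass, the input file name, and buildExpectedDataStructure() are placeholders for your own code:
import org.testng.Assert;
import org.testng.annotations.Test;

public class DataStructureTest {
    @Test
    public void populatesDataStructureFromFile() throws Exception {
        YourClass parser = new YourClass("input.txt");             // hypothetical class under test
        DataStructure expected = buildExpectedDataStructure();     // hand-built expected data
        Assert.assertEquals(parser.getDataStructure(), expected);  // relies on a correct equals()
    }
}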
If you need it only for testing, there are add-ons for JUnit:
http://junit-addons.sourceforge.net/junitx/framework/FileAssert.html
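With junit-addons that boils down to something like this (junitx.framework.FileAssert; the file names are placeholders):
FileAssert.assertEquals(new File("expected-dump.txt"), new File("actual-dump.txt"));   // text comparison
FileAssert.assertBinaryEquals(new File("expected.bin"), new File("actual.bin"));       // byte-by-byte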
If you need file comparison outside the testing environment, you can use this simple function:
public static boolean fileContentEquals(String filePathA, String filePathB) throws Exception {
if (!compareFilesLength(filePathA, filePathB)) return false;
BufferedInputStream streamA = null;
BufferedInputStream streamB = null;
try {
File fileA = new File(filePathA);
File fileB = new File(filePathB);
streamA = new BufferedInputStream(new FileInputStream(fileA));
streamB = new BufferedInputStream(new FileInputStream(fileB));
int chunkSizeInBytes = 16384;
byte[] bufferA = new byte[chunkSizeInBytes];
byte[] bufferB = new byte[chunkSizeInBytes];
int totalReadBytes = 0;
while (totalReadBytes < fileA.length()) {
int readBytes = streamA.read(bufferA);
streamB.read(bufferB);
if (readBytes <= 0) break; // read() returns -1 at end of stream
// CHECKSUM_ALGORITHM is a String constant naming the digest, e.g. "MD5"
MessageDigest digestA = MessageDigest.getInstance(CHECKSUM_ALGORITHM);
MessageDigest digestB = MessageDigest.getInstance(CHECKSUM_ALGORITHM);
digestA.update(bufferA, 0, readBytes);
digestB.update(bufferB, 0, readBytes);
if (!MessageDigest.isEqual(digestA.digest(), digestB.digest()))
{
closeStreams(streamA, streamB);
return false;
}
totalReadBytes += readBytes;
}
closeStreams(streamA, streamB);
return true;
} finally {
closeStreams(streamA, streamB);
}
}
public static void closeStreams(Closeable ...streams) {
for (int i = 0; i < streams.length; i++) {
Closeable stream = streams[i];
closeStream(stream);
}
}
public static boolean compareFilesLength(String filePathA, String filePathB) {
File fileA = new File(filePathA);
File fileB = new File(filePathB);
return fileA.length() == fileB.length();
}
private static void closeStream(Closeable stream) {
try {
stream.close();
} catch (IOException e) {
// ignore exception
}
}
Your choice, but having a utility class with that functionality that can be reused is better, IMHO.
Good luck and have fun.
Personally I would do the opposite. Surely you need a way to compare two of these data structures in the Java world - so the test would read from the file, build the DS, do its processing, and then assert it is equal to an "expected" DS you set up in your test.
(using JUnit4)
@Test
public void testProcessingDoesWhatItShould() {
final DataStructure original = readFromFile(filename);
final DataStructure actual = doTheProcessingYouNeedToDo(original);
final DataStructure expected = generateMyExpectedResult();
Assert.assertEquals("data structure", expected, actual);
}
If this DS is a simple Java Bean, then you can use EqualsBuilder from Apache Commons to compare two objects.
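For example, with commons-lang3 (org.apache.commons.lang3.builder.EqualsBuilder; expectedDs and actualDs are placeholders for your data structures):
// reflectionEquals compares all fields reflectively, so no hand-written equals() is needed
Assert.assertTrue(EqualsBuilder.reflectionEquals(expectedDs, actualDs));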
Compare the bytes loaded from the file system with the bytes you are about to write to the file system.
Pseudo code:
byte[] loadedBytes = loadFileContentFromFile(file) // maybe apache commons IOUtils.toByteArray(InputStream input)
byte[] writeBytes = constructBytesFromDataStructure(dataStructure)
Assert.assertTrue(java.util.Arrays.equals(writeBytes ,loadedBytes));
I want to add an xml:base declaration to an xml file in java. I currently have the xml output in an OutputStream that was generated by some third party code.
The file starts out like this:
<rdf:RDF
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:owl="http://www.w3.org/2002/07/owl#"
xmlns="http://www.mycompany.com/myNS#"
xmlns:xsd="http://www.w3.org/2001/XMLSchema#"
xmlns:rdfs="http://www.w3.org/2000/01/rdf-schema#">
And I want it to look like this:
<rdf:RDF
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:owl="http://www.w3.org/2002/07/owl#"
xmlns="http://www.mycompany.com/myNS#"
xmlns:xsd="http://www.w3.org/2001/XMLSchema#"
xmlns:rdfs="http://www.w3.org/2000/01/rdf-schema#"
xml:base="http://www.mycompany.com/myNS">
I must be having a brain fart or something, because I can't think of a good way to do this programmatically.
Any ideas?
You can change the xml:base used in RDF/XML serialization by obtaining the appropriate RDFWriter and setting its xmlbase property to your chosen base. The following code reads a model from a string (the important part of this question is about how to write the model, not where it comes from) and then writes it in RDF/XML twice, each time with a different xml:base.
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import com.hp.hpl.jena.rdf.model.Model;
import com.hp.hpl.jena.rdf.model.ModelFactory;
import com.hp.hpl.jena.rdf.model.RDFWriter;
public class ChangeBase {
public static void main(String[] args) throws IOException {
final String NS = "http://example.org/";
final String text = "" +
"#prefix ex: <"+NS+">.\n" +
"ex:foo a ex:Foo .\n" +
"ex:foo ex:frob ex:bar.\n";
final Model model = ModelFactory.createDefaultModel();
try ( final InputStream in = new ByteArrayInputStream( text.getBytes() )) {
model.read( in, null, "TTL" );
}
// get a writer for RDF/XML-ABBREV, set its xmlbase to the NS, and write the model
RDFWriter writer = model.getWriter( "RDF/XML-ABBREV" );
writer.setProperty( "xmlbase", NS );
writer.write( model, System.out, null );
// change the base to example.com (.com, not .org) and write again
writer.setProperty( "xmlbase", "http://example.com" );
writer.write( model, System.out, null );
}
}
The output is shown below (notice that in the first case the base is http://example.org/ and in the second it's http://example.com; the difference is .org vs. .com):
<rdf:RDF
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:ex="http://example.org/"
xml:base="http://example.org/">
<ex:Foo rdf:about="foo">
<ex:frob rdf:resource="bar"/>
</ex:Foo>
</rdf:RDF>
<rdf:RDF
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:ex="http://example.org/"
xml:base="http://example.com">
<ex:Foo rdf:about="http://example.org/foo">
<ex:frob rdf:resource="http://example.org/bar"/>
</ex:Foo>
</rdf:RDF>
The ByteArrayInputStream won't scale for large files, and I didn't like the idea of using a temp file. I also thought it was overkill to load the whole file into the DOM just to add the xml:base attribute.
Here's an alternative solution using pipes and simple hand-rolled parsing code to add the attribute.
PipedInputStream pipedInput = new PipedInputStream();
PipedOutputStream pipedOutput = new PipedOutputStream(pipedInput);
new Thread(new ModelExportThread(model, pipedOutput)).start();
int bufferSize = 1024;
byte[] bytes = new byte[bufferSize];
StringBuffer stringBuffer = new StringBuffer();
int bytesRead = pipedInput.read(bytes, 0, bufferSize);
boolean done = false;
String startRDF = "<rdf:RDF";
while (bytesRead > 0) {
if (!done) {
stringBuffer.append(new String(bytes, 0, bytesRead));
int startIndex = stringBuffer.indexOf(startRDF);
if ((startIndex >= 0)) {
stringBuffer.insert(startIndex + startRDF.length(), " xml:base=\"" + namespace + "\"");
outputStream.write(stringBuffer.toString().getBytes());
stringBuffer.setLength(0);
done = true;
}
} else {
outputStream.write(bytes, 0, bytesRead);
}
bytesRead = pipedInput.read(bytes, 0, bufferSize);
}
outputStream.flush();
Here's the threaded code to write to the output pipe.
public class ModelExportThread implements Runnable {
private final OntModel model;
private final OutputStream outputStream;
public ModelExportThread(OntModel model, OutputStream outputStream) {
this.model = model;
this.outputStream = outputStream;
}
public void run() {
try {
model.write(outputStream, "RDF/XML-ABBREV");
outputStream.flush();
outputStream.close();
} catch (IOException ex) {
Logger.getLogger(OntologyModel.class.getName()).log(Level.SEVERE, null, ex);
}
}
}
After some digging, this is what I did.
NOTE: I had the third-party app write the XML to a StringWriter named 'writer' instead of an output stream. 'outputStream' is the name of the stream the resulting XML will be written to.
ByteArrayInputStream inputStream = new ByteArrayInputStream(writer.toString().getBytes());
Document myXML =
DocumentBuilderFactory.newInstance().newDocumentBuilder().parse(inputStream);
myXML.getDocumentElement().setAttribute("xml:base", namespace);
Transformer transformer = TransformerFactory.newInstance().newTransformer();
StreamResult result = new StreamResult(outputStream);
DOMSource source = new DOMSource(myXML);
transformer.transform(source, result);
I really thought this would be easier.
I am able to serialize an object into a file and then restore it again as is shown in the next code snippet. I would like to serialize the object into a string and store into a database instead. Can anyone help me?
LinkedList<Diff_match_patch.Patch> patches = // whatever...
FileOutputStream fileStream = new FileOutputStream("foo.ser");
ObjectOutputStream os = new ObjectOutputStream(fileStream);
os.writeObject(patches);
os.close();
FileInputStream fileInputStream = new FileInputStream("foo.ser");
ObjectInputStream oInputStream = new ObjectInputStream(fileInputStream);
Object one = oInputStream.readObject();
LinkedList<Diff_match_patch.Patch> patches3 = (LinkedList<Diff_match_patch.Patch>) one;
oInputStream.close();
Sergio:
You should use a BLOB. It is pretty straightforward with JDBC.
The problem with the second piece of code you posted is the encoding. You should additionally encode the bytes to make sure none of them gets corrupted.
If you still want to write it into a String, you can encode the bytes using java.util.Base64.
You should still use a CLOB as the data type, because you don't know how long the serialized data is going to be.
Here is a sample of how to use it.
import java.util.*;
import java.io.*;
/**
* Usage sample serializing SomeClass instance
*/
public class ToStringSample {
public static void main( String [] args ) throws IOException,
ClassNotFoundException {
String string = toString( new SomeClass() );
System.out.println(" Encoded serialized version " );
System.out.println( string );
SomeClass some = ( SomeClass ) fromString( string );
System.out.println( "\n\nReconstituted object");
System.out.println( some );
}
/** Read the object from Base64 string. */
private static Object fromString( String s ) throws IOException ,
ClassNotFoundException {
byte [] data = Base64.getDecoder().decode( s );
ObjectInputStream ois = new ObjectInputStream(
new ByteArrayInputStream( data ) );
Object o = ois.readObject();
ois.close();
return o;
}
/** Write the object to a Base64 string. */
private static String toString( Serializable o ) throws IOException {
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ObjectOutputStream oos = new ObjectOutputStream( baos );
oos.writeObject( o );
oos.close();
return Base64.getEncoder().encodeToString(baos.toByteArray());
}
}
/** Test subject. A very simple class. */
class SomeClass implements Serializable {
private final static long serialVersionUID = 1; // See Nick's comment below
int i = Integer.MAX_VALUE;
String s = "ABCDEFGHIJKLMNOP";
Double d = new Double( -1.0 );
public String toString(){
return "SomeClass instance says: Don't worry, "
+ "I'm healthy. Look, my data is i = " + i
+ ", s = " + s + ", d = " + d;
}
}
Output:
C:\samples>javac *.java
C:\samples>java ToStringSample
Encoded serialized version
rO0ABXNyAAlTb21lQ2xhc3MAAAAAAAAAAQIAA0kAAWlMAAFkdAASTGphdmEvbGFuZy9Eb3VibGU7T
AABc3QAEkxqYXZhL2xhbmcvU3RyaW5nO3hwf////3NyABBqYXZhLmxhbmcuRG91YmxlgLPCSilr+w
QCAAFEAAV2YWx1ZXhyABBqYXZhLmxhbmcuTnVtYmVyhqyVHQuU4IsCAAB4cL/wAAAAAAAAdAAQQUJ
DREVGR0hJSktMTU5PUA==
Reconstituted object
SomeClass instance says: Don't worry, I'm healthy. Look, my data is i = 2147483647, s = ABCDEFGHIJKLMNOP, d = -1.0
NOTE: for Java 7 and earlier you can see the original answer here
How about writing the data to a ByteArrayOutputStream instead of a FileOutputStream?
Otherwise, you could serialize the object using XMLEncoder, persist the XML, then deserialize via XMLDecoder.
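A rough sketch of the XMLEncoder/XMLDecoder route (java.beans.XMLEncoder and java.beans.XMLDecoder); note that it only works cleanly for objects following JavaBean conventions, which may or may not be the case for the Patch class:
ByteArrayOutputStream bos = new ByteArrayOutputStream();
XMLEncoder encoder = new XMLEncoder(bos);
encoder.writeObject(patches);                                   // bean-style objects only
encoder.close();
String xml = new String(bos.toByteArray(), StandardCharsets.UTF_8);
XMLDecoder decoder = new XMLDecoder(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
Object restored = decoder.readObject();
decoder.close();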
Thanks for the great and quick replies. I will give some upvotes immediately to acknowledge your help. I have coded what I think is the best solution, based on your answers.
LinkedList<Patch> patches1 = diff.patch_make(text2, text1);
try {
ByteArrayOutputStream bos = new ByteArrayOutputStream();
ObjectOutputStream os = new ObjectOutputStream(bos);
os.writeObject(patches1);
String serialized_patches1 = bos.toString();
os.close();
ByteArrayInputStream bis = new ByteArrayInputStream(serialized_patches1.getBytes());
ObjectInputStream oInputStream = new ObjectInputStream(bis);
LinkedList<Patch> restored_patches1 = (LinkedList<Patch>) oInputStream.readObject();
// patches1 equals restored_patches1
oInputStream.close();
} catch(Exception ex) {
ex.printStackTrace();
}
Note: I did not consider using JSON because it is less efficient.
Note: I will consider your advice about storing the serialized object in the database as byte[] instead of as a String.
A Java 8 approach for converting an Object to/from a String, inspired by the answer from OscarRyz. For encoding and decoding, java.util.Base64 is required and used.
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.Base64;
import java.util.Optional;
final class ObjectHelper {
private ObjectHelper() {}
static Optional<String> convertToString(final Serializable object) {
try (final ByteArrayOutputStream baos = new ByteArrayOutputStream();
ObjectOutputStream oos = new ObjectOutputStream(baos)) {
oos.writeObject(object);
return Optional.of(Base64.getEncoder().encodeToString(baos.toByteArray()));
} catch (final IOException e) {
e.printStackTrace();
return Optional.empty();
}
}
static <T extends Serializable> Optional<T> convertFrom(final String objectAsString) {
final byte[] data = Base64.getDecoder().decode(objectAsString);
try (final ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(data))) {
return Optional.of((T) ois.readObject());
} catch (final IOException | ClassNotFoundException e) {
e.printStackTrace();
return Optional.empty();
}
}
}
XStream provides a simple utility for serializing/deserializing to/from XML, and it's very quick. Storing XML CLOBs rather than binary BLOBS is going to be less fragile, not to mention more readable.
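A minimal XStream sketch (com.thoughtworks.xstream.XStream), assuming the library is on the classpath:
XStream xstream = new XStream();
String xml = xstream.toXML(patches);                            // the object graph need not be Serializable
@SuppressWarnings("unchecked")
LinkedList<Diff_match_patch.Patch> restored =
        (LinkedList<Diff_match_patch.Patch>) xstream.fromXML(xml);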
How about persisting the object as a blob?
If you're storing an object as binary data in the database, then you really should use a BLOB datatype. The database is able to store it more efficiently, and you don't have to worry about encodings and the like. JDBC provides methods for creating and retrieving blobs in terms of streams. Use Java 6 if you can, it made some additions to the JDBC API that make dealing with blobs a whole lot easier.
If you absolutely need to store the data as a String, I would recommend XStream for XML-based storage (much easier than XMLEncoder), but alternative object representations might be just as useful (e.g. JSON). Your approach depends on why you actually need to store the object in this way.
Take a look at the java.sql.PreparedStatement class, specifically the function
http://java.sun.com/javase/6/docs/api/java/sql/PreparedStatement.html#setBinaryStream(int,%20java.io.InputStream)
Then take a look at the java.sql.ResultSet class, specifically the function
http://java.sun.com/javase/6/docs/api/java/sql/ResultSet.html#getBinaryStream(int)
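A hedged sketch of that approach; the connection, the patch_store table, and the column names are assumptions, and the bytes come from standard Java serialization (e.g. the serialize(Object) helper shown further down in this thread):
// hypothetical table: CREATE TABLE patch_store (id INT PRIMARY KEY, data BLOB)
byte[] bytes = serialize(patches);                              // ObjectOutputStream over a ByteArrayOutputStream
PreparedStatement insert = connection.prepareStatement(
        "INSERT INTO patch_store (id, data) VALUES (?, ?)");
insert.setInt(1, 1);
insert.setBinaryStream(2, new ByteArrayInputStream(bytes), bytes.length);
insert.executeUpdate();

PreparedStatement query = connection.prepareStatement(
        "SELECT data FROM patch_store WHERE id = ?");
query.setInt(1, 1);
ResultSet rs = query.executeQuery();
if (rs.next()) {
    ObjectInputStream in = new ObjectInputStream(rs.getBinaryStream("data"));
    Object restored = in.readObject();
}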
Keep in mind that if you serialize an object into a database and then change the object's class in a new version of your code, deserialization can easily fail because the object's signature has changed. I once made this mistake by storing a serialized custom Preferences object and then making a change to the Preferences definition. Suddenly I couldn't read any of the previously serialized information.
You might be better off writing clunky per-property columns in a table and composing and decomposing the object that way instead, to avoid this issue with object versions and deserialization. Or write the properties into a map of some sort, such as a java.util.Properties object, and then serialize that Properties object, which is extremely unlikely to change.
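A small sketch of the Properties idea; the person object and its accessors are hypothetical:
Properties props = new Properties();
props.setProperty("name", person.getName());                    // hypothetical accessors
props.setProperty("age", String.valueOf(person.getAge()));

StringWriter out = new StringWriter();
props.store(out, "person");                                     // plain text, safe for a VARCHAR/CLOB column
String stored = out.toString();

Properties loaded = new Properties();
loaded.load(new StringReader(stored));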
The serialized stream is just a sequence of bytes (octets), so the question is how to convert a sequence of bytes to a String and back again. Furthermore, it needs to use a limited set of character codes if it is going to be stored in a database.
The obvious solution to the problem is to change the field to a binary LOB. If you want to stick with a character LOB, then you'll need to encode the bytes in some scheme such as base64, hex or uu.
You can use the built-in classes sun.misc.BASE64Decoder and sun.misc.BASE64Encoder to convert the binary data from serialization to a string. You do not need additional classes because they are built in.
Simple solution that worked for me:
public static byte[] serialize(Object obj) throws IOException {
ByteArrayOutputStream out = new ByteArrayOutputStream();
ObjectOutputStream os = new ObjectOutputStream(out);
os.writeObject(obj);
return out.toByteArray();
}
Today the most obvious approach is to save the object(s) to JSON.
JSON is readable: it is more readable and easier to work with than XML.
There are a lot of NoSQL databases that allow storing JSON directly.
Your client already communicates with the server using JSON. (If it doesn't, it is very likely a mistake.)
Example using Gson.
Gson gson = new Gson();
Person[] persons = getArrayOfPersons();
String json = gson.toJson(persons);
System.out.println(json);
//output: [{"name":"Tom","age":11},{"name":"Jack","age":12}]
Person[] personsFromJson = gson.fromJson(json, Person[].class);
//...
class Person {
public String name;
public int age;
}
Gson can also convert a List directly; examples are easily googled. I prefer to convert lists to arrays first.
You can also use uuencoding.