Received the following exception when deserializing a HashMap<String, Integer>:
java.io.InvalidClassException: java.lang.Integer; local class incompatible: stream classdesc serialVersionUID = 1360826667802527544, local class serialVersionUID = 1360826667806852920
Serialized and deserialized on the same machine, with the same JRE. JDK 1.6.0_12
From looking at the JDK source, 1360826667806852920 is the correct serialVersionUID for Integer. I wasn't able to find any classes in the JDK with the serialVersionUID 1360826667802527544.
Interestingly, searching for 1360826667802527544 on Google turned up a few other people with this problem, notably this thread on Sun's forums. The problem there was that the person was storing bytes in a String, and the serialized data was getting mangled. Since you're getting the same serialVersionUID it seems very likely that you're running into a similar problem.
Never store bytes in a String. Use a byte array or a class designed to hold bytes, not chars.
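To make the failure mode concrete, here is a minimal sketch (not from the original thread) showing how a String round trip can mangle a serialized stream; whether corruption actually occurs depends on the platform's default charset:
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.util.Arrays;

public class BytesInStringDemo {
    public static void main(String[] args) throws IOException {
        // Serialize an Integer into a byte array.
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ObjectOutputStream oos = new ObjectOutputStream(baos);
        oos.writeObject(Integer.valueOf(42));
        oos.close();
        byte[] original = baos.toByteArray();

        // Round-tripping the bytes through a String is lossy for any byte value
        // the charset cannot represent (e.g. the 0xAC in the stream header under UTF-8).
        byte[] roundTripped = new String(original).getBytes();

        System.out.println("intact after String round trip? "
                + Arrays.equals(original, roundTripped));
    }
}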
That shouldn't happen. Note that the IDs differ only in the last few digits; the second one is the one I see in my JDK sources.
My guess is that the serialized stream got corrupted somehow.
Check the source code for Integer; here is what I have for Integer in several versions of Java:
/** use serialVersionUID from JDK 1.0.2 for interoperability */
private static final long serialVersionUID = 1360826667806852920L;
So I'd say the problem comes from a class of yours that you changed between serialization and deserialization and that has no specific serialVersionUID...
Maybe you should look at this: same problem description, and it looks like incorrect serialization/deserialization code.
I faced the same issue. It happens because when the serialized bytes are stored in a String, the character encoding mangles them, so during deserialization the serialVersionUID that is read back is wrong. That's the root cause of this error.
To avoid this error, Base64-encode the bytes before storing them as a String.
See this answer; it resolved the problem for me.
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.Base64;

/** Read the object from a Base64 string. */
private static Object fromString(String str) throws IOException, ClassNotFoundException {
    byte[] data = Base64.getDecoder().decode(str);
    ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(data));
    Object o = ois.readObject();
    ois.close();
    return o;
}

/** Write the object to a Base64 string. */
private static String toString(Serializable o) throws IOException {
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    ObjectOutputStream oos = new ObjectOutputStream(baos);
    oos.writeObject(o);
    oos.close();
    return Base64.getEncoder().encodeToString(baos.toByteArray());
}
I've run into the same issue with a compiled JasperReports file.
The packed EAR on the server was corrupted because the Ant build applied filtering to it; as a result, the original jasper file and the one inside the EAR had some differences.
I modified the Ant build to copy the file as-is (instead of filtering), so the compiled report is no longer mangled.
I have a Spring Boot application that converts XML to JSON. The conversion is done by calling another Java application internally and passing it some information. The implementation of the called application is out of my control and I do not have access to make any changes to it.
The called Java application requires an OutputStream, so I am using a ByteArrayOutputStream and passing it into the method. After receiving the output I convert the OutputStream to a String, and during that conversion I run into the warning Inefficient conversion from ByteArrayOutputStream.
I wanted to know how I can fix this warning. I researched a bit and found that we can pass an initial size to the ByteArrayOutputStream, but in my case I do not know how large it needs to be, because that depends on the size of the input XML; I am unable to predict it and set it.
Can someone please guide me on what I can do with the ByteArrayOutputStream in my Spring Boot application to fix the warning that I receive in my IntelliJ IDE:
Inefficient conversion from ByteArrayOutputStream
Following is my code sample:
final InputStream inputStream = new ByteArrayInputStream(xmlEvents.getBytes(StandardCharsets.UTF_8));
final var output = new ByteArrayOutputStream();
new Converter().convert(inputStream, new Handler<>(new Validator(), new StreamCollector(output)));
return new String(output.toByteArray());
I am getting the warning for the line:
new String(output.toByteArray())
The explanation for this warning is that
new String(output.toByteArray());
creates a byte[] from the contents of the ByteArrayOutputStream, then creates a String from the byte[]. That is doing an unnecessary copy of the data.
The fix suggested by IntelliJ is:
output.toString(StandardCharsets.UTF_8)
which creates the String in a single operation, without creating an intermediate byte[].
How does it do this?
Well, toString(charset) passes the ByteArrayOutputStream's internal byte[] buffer straight to the String constructor. By contrast, output.toByteArray() first copies the buffer into a new byte[] (so that the caller cannot interfere with the stream's internal buffer), and then the String constructor copies the data a second time.
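Applied to the snippet from the question, the last lines would become something like this (the Charset overload of toString has existed since Java 10):
final var output = new ByteArrayOutputStream();
new Converter().convert(inputStream, new Handler<>(new Validator(), new StreamCollector(output)));
// Decode the buffer in one step instead of copying it into an intermediate byte[].
return output.toString(StandardCharsets.UTF_8);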
I'm trying to figure out how one goes about retrieving the raw bytes stored in a JsonObject and turning them into an InputStream object.
I figured it might be something like:
InputStream fis = new FileInputStream((File)json.getJsonObject("data"));
Granted, I haven't tried that out, but I just wanted to know if anyone has any experience with this and knows the preferred way to do it.
You can convert a JSONObject into its String representation, and then convert that String into an InputStream.
The code in the question casts a JSONObject to File, but I am not sure that works as intended. The following, however, is something I have done before (currently reproduced from memory):
String str = json.getJSONObject("data").toString();
InputStream is = new ByteArrayInputStream(str.getBytes(StandardCharsets.UTF_8)); // use an explicit charset rather than the platform default
Note that the toString() method for JSONObject overrides the one in java.lang.Object class.
From the Javadoc:
Returns: a printable, displayable, portable, transmittable representation of the object, beginning with { (left brace) and ending with } (right brace).
If you want bytes, use this:
json.toString().getBytes()
Or write json.toString() to a File savedFile, and then:
InputStream fis = new FileInputStream(savedFile);
I have a Map:
Map<String, DistributorAdd> map= new TreeMap<String, DistributorAdd>();
and I save it in a file.txt
FileOutputStream fos = new FileOutputStream("Distrib.txt");
ObjectOutputStream oos = new ObjectOutputStream(fos);
oos.writeObject(map);
oos.close();
The problem is that DistributorAdd, which yesterday looked like this:
public DistributorAdd(String distributor, String
emailAdress, String name, String speciality){...}
will tomorrow look like this:
public void ajouter(String Distributor, String EmailAdress,
String Name, String Phone, String Image) {..}
My coworker has already put a lot of info in her Distrib.txt, so what I want is to be able to add a new String to the Map without destroying her data.
I would like to keep Distrib.txt and my DistributorAdd class; is there any easy way to do that?
The kind of error I get is:
ObjectInputStream ois = new ObjectInputStream(new FileInputStream("Distrib.txt"));
VendorA = (DistributorAdd) ois.readObject();
Error:
IOException : table.java => table()java.io.StreamCorruptedException: invalid stream header: ACED0573
at java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:780)
at java.io.ObjectInputStream.<init>(ObjectInputStream.java:277)
at car.Table.<init>(Table.java:185)
at car.Table.main(Table.java:837)
If you have any questions, or there is any more information that I need to give, I will be happy to provide it.
Please note the following things about serialization:
1) It has to be used very judiciously.
2) When we serialize a class, it is as if we are exporting its API in terms of its instance variables.
3) Always explicitly declare a serialVersionUID in your serializable class. If you don't, it will be calculated automatically by the JVM based on the internal structure of your class (instance variables, public methods, etc.), so changing the internal structure of the class changes this serialVersionUID.
Hence, if we serialize a class in one version and then deserialize it in another version (by version change I mean changing some internal structure of the serializable object), a version incompatibility is bound to happen.
I am not sure what exactly caused your code to fail, since I don't know what changes you are making in your Serializable class, but I think you should consider point 3 (a sketch follows below).
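As an illustration of point 3 (only a sketch; the field names are guesses based on the constructors in the question): declare the UID yourself, and adding a field later no longer breaks deserialization of the old Distrib.txt; old records simply come back with the new field set to null.
import java.io.Serializable;

public class DistributorAdd implements Serializable {

    // Fixed by hand, so the UID no longer changes as the class evolves.
    private static final long serialVersionUID = 1L;

    private String distributor;
    private String emailAdress;
    private String name;
    private String speciality;

    // Added in the newer version; old serialized data leaves this null.
    private String phone;
}
Note that if the old data was written by a version of the class that had no explicit serialVersionUID, you would instead declare the UID with the value the JVM computed for that old version (it is reported in the InvalidClassException, or can be obtained with the serialver tool).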
I have a collection of objects:
Map<BufferedImage, Map<ImageTransform, Set<Point>>> map
I want to write those to a file, and then be able to read them back into the same structure.
I can't just write the collection as it is, because BufferedImage doesn't implement the Serializable (or Externalizable) interface, so I need to use the methods of the ImageIO class to write the image.
ImageTransform is a custom object that implements Serializable, so I believe the value part of my map collection should be writable as it is.
Here is what I do to write to the file:
ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream(file));
for (BufferedImage image : map.keySet()) {
ImageIO.write(image, "PNG", out); // write the image to the stream
out.writeObject(map.get(image)); // write the 'value' part of the map
}
Here is what I do to read back from the file:
ObjectInputStream in = new ObjectInputStream(new FileInputStream(file));
while(true) {
try {
BufferedImage image = ImageIO.read(in);
Map<ImageTransform, Set<Point>> value =
(Map<ImageTransform, Set<Point>>) in.readObject(); // marker
map.put(image, value);
} catch (IOException ioe) {
break;
}
}
However, this doesn't work. I get a java.io.OptionalDataException at marker.
java.io.OptionalDataException
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1300)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:368)
My question is, firstly: is the writing concept correct? Is ImageIO#write good for this case, or should I think about using/storing the BufferedImage#getRGB int[] array instead? Is the array more compact (as in, does it take up less space in the file)?
Secondly, how should I be reading the objects back from the file? How do I know when EOF is reached? Why doesn't the above work?
I hope the info provided is enough, if you need more info on something, please tell me.
Thanks in advance.
It's not working because ObjectOutputStream and ObjectInputStream write/expect a particular stream format, and that format is violated when you write raw image data directly into the stream. To use object streams successfully you need to observe the contract they specify.
To do this you will need to create a holder class and use it as the key of your map instead of BufferedImage. This holder class should implement Serializable and three methods (not part of any actual interface) that mark the class as needing special handling during reading and writing. The method signatures must be exactly as specified or serialization won't work.
For more information, have a look at the documentation on ObjectOutputStream.
import java.awt.image.BufferedImage;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.ObjectStreamException;
import java.io.Serializable;
import javax.imageio.ImageIO;

public class ImageHolder implements Serializable {

    // Written and read by hand below, so keep it out of default serialization.
    transient BufferedImage image;

    public ImageHolder(BufferedImage image) {
        this.image = image;
    }

    private void readObject(ObjectInputStream stream)
            throws IOException, ClassNotFoundException {
        image = ImageIO.read(stream);
    }

    private void writeObject(ObjectOutputStream stream) throws IOException {
        ImageIO.write(image, "PNG", stream);
    }

    private void readObjectNoData() throws ObjectStreamException {
        // leave image as null
    }
}
And then serialisation should be as simple as outputStream.writeObject(map). Though you will need to check that the implementing class of ImageTransform is serialisable too.
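A minimal sketch of that round trip (assuming the map and file variables from the question are in scope, and that the map's values are themselves serialisable):
// Write: wrap each BufferedImage key, then write the whole map with a single call.
Map<ImageHolder, Map<ImageTransform, Set<Point>>> holderMap = new HashMap<>();
for (Map.Entry<BufferedImage, Map<ImageTransform, Set<Point>>> e : map.entrySet()) {
    holderMap.put(new ImageHolder(e.getKey()), e.getValue());
}
ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream(file));
out.writeObject(holderMap);
out.close();

// Read: a single readObject() restores everything, so no EOF loop is needed.
ObjectInputStream in = new ObjectInputStream(new FileInputStream(file));
@SuppressWarnings("unchecked")
Map<ImageHolder, Map<ImageTransform, Set<Point>>> restored =
        (Map<ImageHolder, Map<ImageTransform, Set<Point>>>) in.readObject();
in.close();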
One way to 'cheat' and only have a single object to serialize is to add the group of objects to an expandable, serializable list. Then serialize the list.
BTW - I would tend to use XMLEncoder over serialized Objects because they can be restored in later JVMs. There is no such guarantee for serialized Objects.
@Ivan c00kiemon5ter V Kanak: "I'm trying to keep the file as small in size as possible, .."
That is often wasted effort, given disk space is so cheap.
".. so I guess Serialization is better for that."
Don't guess. Measure.
..I'll try using a List and see how that goes. ..
Cool. Note that if using XMLEncoder, I'd recommend zipping the output in most cases; that reduces the file-size cost of the XML cruft. The situation is different when storing images.
Image formats typically incorporate compression of a type that is not conducive to being further compressed by Zip. That can be side-stepped by storing the XML compressed, and the images as 'raw' in separate entries in the Zip. OTOH I think you'll find the amount of bytes saved by compressing the XML alone is not worth the effort - given the final file size of the image entries.
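A rough sketch of that layout (names are made up, and it assumes the transform data follows JavaBean conventions, which XMLEncoder requires): the XML goes in as a normal deflated entry, and the PNG goes in as a STORED entry so the zip doesn't try to recompress it.
import java.awt.image.BufferedImage;
import java.beans.XMLEncoder;
import java.io.ByteArrayOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.zip.CRC32;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;
import javax.imageio.ImageIO;

public class ZipBundleSketch {

    static void writeBundle(Object transformData, BufferedImage image) throws IOException {
        try (ZipOutputStream zip = new ZipOutputStream(new FileOutputStream("bundle.zip"))) {

            // 1. The metadata as XML, deflated (plain XML compresses very well).
            ByteArrayOutputStream xmlBytes = new ByteArrayOutputStream();
            try (XMLEncoder enc = new XMLEncoder(xmlBytes)) {
                enc.writeObject(transformData);
            }
            zip.putNextEntry(new ZipEntry("transforms.xml"));
            zip.write(xmlBytes.toByteArray());
            zip.closeEntry();

            // 2. The image as PNG, stored uncompressed (PNG is already compressed).
            ByteArrayOutputStream pngBytes = new ByteArrayOutputStream();
            ImageIO.write(image, "PNG", pngBytes);
            byte[] png = pngBytes.toByteArray();

            ZipEntry imgEntry = new ZipEntry("image0.png");
            imgEntry.setMethod(ZipEntry.STORED);   // STORED entries need size and CRC up front
            imgEntry.setSize(png.length);
            CRC32 crc = new CRC32();
            crc.update(png);
            imgEntry.setCrc(crc.getValue());

            zip.putNextEntry(imgEntry);
            zip.write(png);
            zip.closeEntry();
        }
    }
}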
I have a file that contains bytes, chars, and an object, all of which need to be written and then read back. What would be the best way to use Java's different I/O streams for writing and reading these data types? More specifically, is there a proper way to add delimiters and to recognize those delimiters, then trigger which stream should be used? I believe I need some clarification on using multiple streams on the same file, something I have never studied before. A thorough explanation would be a sufficient answer. Thanks!
As EJP already suggested, use ObjectOutputStream and ObjectInputStream and wrap your other elements in an object (or objects). I'm giving this as an answer so I can show an example (it's hard to do in a comment). EJP - if you want to embed it in your answer, please do and I'll delete this one.
class MyWrappedData implements Serializable {
    private String string1;
    private String string2;
    private char char1;
    // constructors
    // getters and setters
}
Write to file:
ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream(fileName));
out.writeObject(myWrappedDataInstance);
out.flush();
Read from file:
ObjectInputStream in = new ObjectInputStream(new FileInputStream(fileName));
Object obj = in.readObject();
MyWrappedData wrapped = null;
if (obj instanceof MyWrappedData)
    wrapped = (MyWrappedData) obj;
// get the specific elements from the wrapped object
See a very clear example here: Read and Write
Redesign the file. There is no sensible way of implementing it as presently designed. For example the object presupposes an ObjectOutputStream, which has a header - where's that going to go? And how are you going to know where to switch from bytes to chars?
I would probably use an ObjectOutputStream for the whole thing and write everything as objects. Then Serialization solves all those problems for you. After all you don't actually care what's in the file, only how to read and write it.
Can you change the structure of the file? It is unclear, because the first sentence of your question contradicts being able to add delimiters. If you can change the file structure, you could output the different data types into separate files. I would consider this the 'proper' way to delineate the data streams.
If you are stuck with the file the way it is, then you will need to write an interface to the file's structure, which in practice is a shopping list of read operations and a lot of exception handling. It's a hackish way to program, because it will require a hex editor and a lot of trial and error, but it works in certain cases.
Why not write the file as XML, possibly with a nice simple library like XStream? If you are concerned about space, wrap it in gzip compression.
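A rough sketch of that combination (the file name and myData object are placeholders; it assumes the XStream library is on the classpath, and note that recent XStream versions require configuring allowed types before calling fromXML):
import com.thoughtworks.xstream.XStream;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

// Write the whole object graph as gzipped XML.
try (GZIPOutputStream out = new GZIPOutputStream(new FileOutputStream("data.xml.gz"))) {
    new XStream().toXML(myData, out);
}

// Read it back.
try (GZIPInputStream in = new GZIPInputStream(new FileInputStream("data.xml.gz"))) {
    Object restored = new XStream().fromXML(in);
}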
If you have control over the file format, and it's not an exceptionally large file (i.e. < 1 GiB), have you thought about using Google's Protocol Buffers?
They generate code that parses (and serializes) file/byte[] content. Protocol buffers use a tagging approach on every value that includes (1) a field number and (2) a type, so they have nice properties such as forward/backward compatibility with optional fields, etc. They are fairly well optimized for both speed and file size, adding only ~2 bytes of overhead for a short byte[], with ~2-4 additional bytes to encode the length on larger byte[] fields (varint-encoded lengths).
This could be overkill, but if you have a bunch of different fields & types, protobuf is really helpful. See: http://code.google.com/p/protobuf/.
An alternative is Thrift by Facebook, with support for a few more languages although possibly less use in the wild last I checked.
If the structure of your file is not fixed, consider using a wrapper per type. First you need to create the interface of your wrapper classes….
interface MyWrapper extends Serializable {
void accept(MyWrapperVisitor visitor);
}
Then you create the MyWrapperVisitor interface…
interface MyWrapperVisitor {
void visit(MyString wrapper);
void visit(MyChar wrapper);
void visit(MyLong wrapper);
void visit(MyCustomObject wrapper);
}
Then you create your wrapper classes…
class MyString implements MyWrapper {
public final String value;
public MyString(String value) {
super();
this.value = value;
}
@Override
public void accept(MyWrapperVisitor visitor) {
visitor.visit(this);
}
}
.
.
.
And finally you read your objects…
final InputStream in = new FileInputStream(myfile);
final ObjectInputStream objIn = new ObjectInputStream(in);
final MyWrapperVisitor visitor = new MyWrapperVisitor() {
@Override
public void visit(MyString wrapper) {
//your logic here
}
.
.
.
};
//loop over all your objects here
final MyWrapper wrapper = (MyWrapper) objIn.readObject();
wrapper.accept(visitor);
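For completeness, the write side would be the mirror image. This is only a sketch; it assumes the other wrapper classes (MyChar, and so on) have constructors analogous to MyString's:
final ObjectOutputStream objOut = new ObjectOutputStream(new FileOutputStream(myfile));
objOut.writeObject(new MyString("some text"));
objOut.writeObject(new MyChar('x'));
// ... one writeObject call per wrapped value, in whatever order the file requires
objOut.close();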