I am using Kryo for serialization / deserialization and not registering classes beforehand (I am working on that). That said, upon deserialization, I am getting the exception:
Unable to load class shell.api.model.BatteryStatuo with kryo's ClassLoader. Retrying with current..
Now, my classname is actually shell.api.model.BatteryStatus so I'm not sure what happened during serialization.
Is there a limitation on the length of the classname?
Also, as I am serializing JPA entities which have nested structures and likely have circular references, will that pose a potential issue? I would think I'd see a stack overflow exception if so.
This is a snippet of serializing an object:
protected final Kryo kryo = new Kryo();

try (final ByteArrayOutputStream baos = new ByteArrayOutputStream()) {
    try (final Output output = new Output(baos)) {
        kryo.writeObject(output, data);
    }
    return (baos.toByteArray());
} catch (IOException e) {
    LOGGER.error("error serializing", e);
    throw (new RuntimeException("Error serializing", e));
}
deserialization:
try (final Input input = new Input(inputStream)) {
    return ((Serializable) kryo.readObject(input, entityType));
}
entityType is the parent class, in this case:
shell.api.model.Heartbeat
and, inside Heartbeat are several entities, one of which is BatteryStatus.
Kryo can handle serializing and deserializing complex nested objects and circular references. It's part of the reason why so many people love Kryo!
Since the object you are sending could be one of many possible types, you should use the writeClassAndObject and readClassAndObject methods.
protected final Kryo kryo = new Kryo();

try (final ByteArrayOutputStream baos = new ByteArrayOutputStream()) {
    try (final Output output = new Output(baos)) {
        kryo.writeClassAndObject(output, data);
    }
    // only read the bytes after the Output has been closed (flushed)
    return (baos.toByteArray());
} catch (IOException e) {
    LOGGER.error("error serializing", e);
    throw (new RuntimeException("Error serializing", e));
}
And
try (final Input input = new Input(inputStream)) {
    return ((Serializable) kryo.readClassAndObject(input));
}
One other possibility is you are using two different versions of Kryo jar in your project and different version classes are being used in serialization and deserialization.
Edit for clarification:
The reason this does not appear to be an ordinary null reference is that I have already used this code in another form and had no issue there. I could manually delete the file from my device and it would simply be recreated later. This should be handled by my try/catch, which initializes a new object for me if the file does not exist.
Furthermore, when I get the error in Android Studio upon running it, the error points directly at the line
FileInputStream fis = openFileInput(FILENAME);
Another thing to note is that when I add an item to the object that I am trying to pass into Gson, I have no issues. The object is created, and I have no problem whatsoever adding items to it.
Furthermore as context, I have these two functions in an object class that manages my data for me. I create a single one of these objects as a DataManager and use these functions to save/load (among others) an object that contains all of my data within itself.
When I was putting the class together I had to make the object class
extends Activity
because openFileOutput and openFileInput showed up as unresolved (red) beforehand. However, this seems very sketchy to me (and perhaps explains exactly why I am seeing this error...?)
Original before edit:
In my Android Studio application I have been trying my best to utilize Gson, and at present I am attempting to save a single object containing multiple attributes.
However, I just successfully managed to use this same Gson code to put an ArrayList into a file; I could delete that file myself and everything would work just fine. Now I am trying to store an entire object instead.
The error is: "java.lang.NullPointerException: Attempt to invoke virtual method 'java.io.FileOutputStream android.content.Context.openFileOutput(java.lang.String, int)' on a null object reference"
The code looks like the following:
public void loadFromFile() {
    try {
        FileInputStream fis = openFileInput(FILENAME);
        BufferedReader in = new BufferedReader(new InputStreamReader(fis));
        Gson gson = new Gson();
        // https://google-gson.googlecode.com/svn/trunk/gson/docs/javadocs/com/google/gson/Gson.html, 2015-09-23
        Type objectType = new TypeToken<StatisticsRecordObject>() {}.getType();
        recordObject = gson.fromJson(in, objectType);
    } catch (FileNotFoundException e) {
        recordObject = new StatisticsRecordObject();
    }
}
public void saveInFile() {
    try {
        FileOutputStream fos = openFileOutput(FILENAME, 0);
        BufferedWriter out = new BufferedWriter(new OutputStreamWriter(fos));
        Gson gson = new Gson();
        gson.toJson(recordObject, out);
        out.flush(); // same as fflush as before. Buffer must go. FLUSH AFTER WRITING.
        fos.close();
    } catch (FileNotFoundException e) {
        // TODO Auto-generated catch block
        recordObject = new StatisticsRecordObject();
    } catch (IOException e) {
        // TODO Auto-generated catch block
        throw new RuntimeException(e);
    }
}
I think the NullPointerException says that the Context object on which you call openFileOutput is null.
http://developer.android.com/reference/android/content/Context.html#openFileOutput(java.lang.String, int)
You should probably pass a Context into the method and make sure that the Context is properly initialised.
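To illustrate the idea, here is a hypothetical, Android-agnostic sketch: instead of making the data-manager class extend Activity just to reach openFileInput/openFileOutput, hand it the dependency it needs at construction time. A Path directory stands in for the Android Context here; on Android, the constructor parameter would be a Context and the methods would call context.openFileInput(...)/context.openFileOutput(...). All names are illustrative.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Hypothetical sketch: the dependency (here a directory, standing in for an
// Android Context) is passed in, so it can never be an uninitialised null
// inherited from a class that was never started as a real Activity.
class DataManager {
    private final Path dir; // stand-in for the injected Context

    DataManager(Path dir) {
        this.dir = dir;
    }

    void save(String filename, String data) throws IOException {
        Files.writeString(dir.resolve(filename), data);
    }

    String load(String filename) throws IOException {
        return Files.readString(dir.resolve(filename));
    }
}

public class Main {
    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempDirectory("dm");
        DataManager dm = new DataManager(tmp); // dependency passed in explicitly
        dm.save("record.json", "{\"count\":1}");
        System.out.println(dm.load("record.json"));
    }
}
```

The key point is that a DataManager constructed with `new` never goes through the Android lifecycle, so any Context methods it inherits from Activity operate on an uninitialised object; injecting the Context avoids that entirely.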
I have the below code that basically reads a bunch of JSON strings and converts them to Java objects. My problem is that if the transformation fails for any of the JSON strings, the remaining strings are not processed. What I need is -
Find the string for which the error occurred.
In the exception block do something to continue processing.
Here is my code to convert from JSON to Java.
public static <T> T convertToObject(String jsonString, Class<T> classType) {
    T obj = null;
    try {
        obj = objectMapper.readValue(jsonString, classType);
    } catch (Exception e) {
        // wrap in an unchecked exception; throwing a checked Exception here
        // would not compile without a throws declaration on the method
        throw new RuntimeException("Unable to convert to DTO :" + e.getMessage(), e);
    }
    return obj;
}
I think you need a custom deserializer. Standard ObjectMapper will do all or nothing. Read more about creating a custom deserializer for Jackson ObjectMapper here:
http://www.baeldung.com/jackson-deserialization
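Independent of Jackson specifics, the two requirements (know which string failed, keep processing the rest) can also be met at the call site by wrapping each conversion in its own try/catch. A minimal sketch, where `parse` is a hypothetical stand-in for `objectMapper.readValue(...)`:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: isolate failures per input string so one bad JSON string
// doesn't stop the whole batch. `parse` is a stand-in for the real
// objectMapper.readValue(...) call and simply throws on bad input.
public class Main {
    static int parse(String json) {
        return Integer.parseInt(json); // stand-in conversion that may throw
    }

    public static void main(String[] args) {
        List<String> inputs = List.of("1", "oops", "3");
        List<Integer> results = new ArrayList<>();
        List<String> failed = new ArrayList<>();
        for (String s : inputs) {
            try {
                results.add(parse(s));
            } catch (Exception e) {
                failed.add(s); // remember which string failed, keep going
            }
        }
        System.out.println(results + " failed=" + failed);
    }
}
```

This way `failed` tells you exactly which strings could not be converted, while the loop continues through the rest.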
I have a file containing multiple serialized objects of the same type that I need to deserialize.
public static ArrayList<Dog> readDogs() {
    ArrayList<Dog> dogs = null;
    ObjectInputStream in = null;
    try {
        in = new ObjectInputStream(new BufferedInputStream(new FileInputStream(filename)));
        dogs = new ArrayList<Dog>();
        while (true) {
            dogs.add((Dog) in.readObject());
        }
    } catch (EOFException e) {
        // end of file reached; fall through and return what was read
    } catch (Exception e) {
        System.err.println(e.toString());
    } finally {
        try {
            in.close();
        } catch (IOException e) {
            System.err.println(e.toString());
        }
    }
    return dogs;
}
With the current implementation, I rely on a catch clause to swallow the end-of-file exception. This seems pretty ugly, but I'm not sure how else to handle it.
Don't serialize/deserialize each dog. Serialize/deserialize a single List of Dogs.
All standard List implementations (such as ArrayList) are themselves Serializable, and serializing a list automatically serializes all of its elements.
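A minimal sketch of this approach, using a stand-in Dog class: the whole list goes through one writeObject call and comes back with one readObject call, so no EOFException-driven loop is needed.

```java
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

// Sketch: serialize the List itself instead of each element.
public class Main {
    static class Dog implements Serializable {
        final String name;
        Dog(String name) { this.name = name; }
    }

    public static void main(String[] args) throws Exception {
        Path file = Files.createTempFile("dogs", ".ser");

        ArrayList<Dog> dogs = new ArrayList<>(List.of(new Dog("Rex"), new Dog("Fido")));
        try (ObjectOutputStream out = new ObjectOutputStream(Files.newOutputStream(file))) {
            out.writeObject(dogs); // one call serializes every element
        }

        try (ObjectInputStream in = new ObjectInputStream(Files.newInputStream(file))) {
            @SuppressWarnings("unchecked")
            ArrayList<Dog> loaded = (ArrayList<Dog>) in.readObject();
            System.out.println(loaded.size() + " " + loaded.get(0).name);
        }
    }
}
```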
EOFException represents an exceptional condition caused by something outside of your code. Unless you are capturing an Error or a RuntimeException that represents a bug you could fix in your code, using an exception is the best way to deal with the problem.
If the problem is not in your code, which is correct, but outside of it (network down, file not found, file corrupted), it is OK and usually recommended to deal with it using exceptions.
There are cases where you should validate data before you use it. If there is a large chance of the data arriving in an incorrect format (e.g. user input), it might be more efficient to validate it first. But in rare cases, such as a corrupted file among thousands, it's better to catch the exception when it occurs and deal with it.
You could improve your code by logging the exceptions, so you can trace them later. You should also consider which class should be responsible for dealing with the exception (whether it should catch it and fix the problem, or declare throws and propagate it to the caller).
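The catch-or-propagate choice can be sketched in a few lines; the names here are illustrative only. The low-level method declares `throws` because it has no sensible way to recover, while the caller catches and falls back to a default:

```java
import java.io.IOException;

// Sketch of the two responsibilities mentioned above.
public class Main {
    // Option 1: propagate - this method cannot decide how to recover,
    // so it declares throws and lets the caller handle it.
    static String readConfig(boolean available) throws IOException {
        if (!available) throw new IOException("config file missing");
        return "loaded";
    }

    // Option 2: the caller catches and fixes the problem, because it is
    // the layer that knows a sensible fallback exists.
    static String loadOrDefault(boolean available) {
        try {
            return readConfig(available);
        } catch (IOException e) {
            return "default"; // recover here
        }
    }

    public static void main(String[] args) {
        System.out.println(loadOrDefault(true) + " " + loadOrDefault(false));
    }
}
```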
Use InputStream.available() to test whether data is available in the stream before reading the next dog.
The following code should work:
try {
    FileInputStream fin = new FileInputStream(filename);
    in = new ObjectInputStream(new BufferedInputStream(fin));
    dogs = new ArrayList<Dog>();
    while (fin.available() > 0) {
        dogs.add((Dog) in.readObject());
    }
} catch (EOFException e) {
    // may still occur in exceptional situations
}
Note that you should call available() on the FileInputStream object, not the ObjectInputStream object, which doesn't properly support it. Note also that you should still catch and handle EOFException, since it may still be raised in exceptional situations.
Hi, I have an issue when trying to append new objects to an existing file.
Once the Android app has been launched again, I want to open the existing file, add new objects, and then read the objects back from it. Currently, when I try to read, the code only reads the first object. You can find the code below. Could you please help? Thanks.
I'm using the following method to write an object:
public void saveObject(Person p, File f) {
    try {
        ObjectOutputStream oos = new ObjectOutputStream(new FileOutputStream(f, true));
        oos.writeObject(p);
        oos.reset();
        oos.flush();
        oos.close();
    } catch (Exception ex) {
        Log.v("Serialization Save Error : ", ex.getMessage());
        ex.printStackTrace();
    }
}
And the following method to read the objects:
public Object loadSerializedObject(File f) {
    try {
        ObjectInputStream ois = new ObjectInputStream(new FileInputStream(f));
        try {
            List<Object> objects = new ArrayList<>();
            Object loadedObj;
            while ((loadedObj = ois.readObject()) != null) {
                Log.w(this.getClass().getName(), "ReadingObjects");
                objects.add(loadedObj);
            }
            return objects;
        } finally {
            ois.close();
        }
    } catch (StreamCorruptedException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    } catch (ClassNotFoundException e) {
        e.printStackTrace();
    }
    return null;
}
Unfortunately you can't create a new ObjectOutputStream every time you want to append to the stream and then read everything back with a single stream. The constructor writes a header to the underlying stream before you start writing objects. You are probably seeing the java.io.StreamCorruptedException: invalid type code: AC exception; that's because the stream header starts with the byte 0xAC.
I don't know how many objects you are dealing with, but one option is to read all your objects and then rewrite them all using a single ObjectOutputStream. That can get pricey if there are lots of objects. Alternatively, you might want to consider managing the serialization yourself through Externalizable. It can get painful, though.
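A third option, not in the original code but a commonly used workaround, is to subclass ObjectOutputStream and override writeStreamHeader so that appends write a RESET marker instead of a second header. The reader then sees one valid stream. A self-contained sketch (class and method names are illustrative):

```java
import java.io.EOFException;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.OutputStream;
import java.util.ArrayList;
import java.util.List;

public class Main {
    // When appending, suppress the second stream header and emit a
    // reset marker instead, which ObjectInputStream handles transparently.
    static class AppendingObjectOutputStream extends ObjectOutputStream {
        AppendingObjectOutputStream(OutputStream out) throws IOException {
            super(out);
        }
        @Override
        protected void writeStreamHeader() throws IOException {
            reset(); // writes TC_RESET instead of a new header
        }
    }

    static void append(File f, Object obj) throws IOException {
        boolean hasData = f.exists() && f.length() > 0;
        try (FileOutputStream fos = new FileOutputStream(f, true);
             ObjectOutputStream oos = hasData
                     ? new AppendingObjectOutputStream(fos)
                     : new ObjectOutputStream(fos)) {
            oos.writeObject(obj);
        }
    }

    public static void main(String[] args) throws Exception {
        File f = File.createTempFile("people", ".ser");
        append(f, "alice"); // first write: normal header
        append(f, "bob");   // later write: reset marker, no header

        List<Object> loaded = new ArrayList<>();
        try (ObjectInputStream ois = new ObjectInputStream(new FileInputStream(f))) {
            while (true) {
                loaded.add(ois.readObject());
            }
        } catch (EOFException endOfFile) {
            // end of file: stop reading
        }
        System.out.println(loaded);
    }
}
```

The one rule to respect is that the plain ObjectOutputStream must be used for the very first write (so the file gets its header) and the appending subclass for every write after that.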
I'm using Mallet through Java, and I can't work out how to evaluate new documents against an existing topic model which I have trained.
My initial code to generate my model is very similar to that in the Mallet Developers Guide for Topic Modelling, after which I simply save the model as a Java object. In a later process, I reload that Java object from file, add new instances via .addInstances(), and would then like to evaluate only these new instances against the topics found in the original training set.
This stats.SE thread provides some high-level suggestions, but I can't see how to work them into the Mallet framework.
Any help much appreciated.
Inference is actually also listed in the example link provided in the question (the last few lines).
For anyone interested in the whole code for saving/loading the trained model and then using it for inferring model distribution for new documents - here are some snippets:
After model.estimate() has completed, you have the actual trained model so you can serialize it using a standard Java ObjectOutputStream (since ParallelTopicModel implements Serializable):
try {
    FileOutputStream outFile = new FileOutputStream("model.ser");
    ObjectOutputStream oos = new ObjectOutputStream(outFile);
    oos.writeObject(model);
    oos.close();
} catch (FileNotFoundException ex) {
    // handle this error
} catch (IOException ex) {
    // handle this error
}
Note, though, that when you infer you also need to pass the new sentences (as an Instance) through the same pipeline in order to pre-process them (tokenize etc.). Thus, you also need to save the pipe list (since we're using SerialPipes, we can create an instance and then serialize it):
// initialize the pipe list (used in model training)
SerialPipes pipes = new SerialPipes(pipeList);

try {
    FileOutputStream outFile = new FileOutputStream("pipes.ser");
    ObjectOutputStream oos = new ObjectOutputStream(outFile);
    oos.writeObject(pipes);
    oos.close();
} catch (FileNotFoundException ex) {
    // handle error
} catch (IOException ex) {
    // handle error
}
In order to load the model/pipeline and use them for inference we need to de-serialize:
private static void InferByModel(String sentence) {
    // define model and pipeline
    ParallelTopicModel model = null;
    SerialPipes pipes = null;

    // load the model
    try {
        FileInputStream outFile = new FileInputStream("model.ser");
        ObjectInputStream oos = new ObjectInputStream(outFile);
        model = (ParallelTopicModel) oos.readObject();
    } catch (IOException ex) {
        System.out.println("Could not read model from file: " + ex);
    } catch (ClassNotFoundException ex) {
        System.out.println("Could not load the model: " + ex);
    }

    // load the pipeline
    try {
        FileInputStream outFile = new FileInputStream("pipes.ser");
        ObjectInputStream oos = new ObjectInputStream(outFile);
        pipes = (SerialPipes) oos.readObject();
    } catch (IOException ex) {
        System.out.println("Could not read pipes from file: " + ex);
    } catch (ClassNotFoundException ex) {
        System.out.println("Could not load the pipes: " + ex);
    }

    // if both are properly loaded
    if (model != null && pipes != null) {
        // Create a new instance named "test instance" with empty target
        // and source fields; note we are using the pipes list here
        InstanceList testing = new InstanceList(pipes);
        testing.addThruPipe(new Instance(sentence, null, "test instance", null));

        // here we get an inferencer from our loaded model and use it
        TopicInferencer inferencer = model.getInferencer();
        double[] testProbabilities = inferencer.getSampledDistribution(testing.get(0), 10, 1, 5);
        System.out.println("0\t" + testProbabilities[0]);
    }
}
For some reason I am not getting exactly the same inference with the loaded model as with the original one, but that is a matter for another question (if anyone knows, though, I'd be happy to hear).
And I've found the answer hidden in a slide-deck from Mallet's lead developer:
TopicInferencer inferencer = model.getInferencer();
double[] topicProbs = inferencer.getSampledDistribution(newInstance, 100, 10, 10);