Is there anything wrong with declaring a collection transient? transient Map<String, Car> cars = new HashMap<>() is declared in an Owner instance that is serialized, but the Car class does not implement Serializable.
When the program runs for the first time, the Owner instance creates a Car and inserts it into the cars collection. However, when running the program for the second time, Owner is deserialized and correctly creates the Car instance, but adding it to the collection with cars.put(key, object) causes a NullPointerException. Likewise, only when running after deserialization, cars.containsKey(regNumIn) causes an exception instead of returning true or false. It seems that on the second run, after Owner is recreated, the new HashMap is never created.
Does it have anything to do with hashCode() and equals()? I haven't declared those, and if they are automatically generated by the NetBeans IDE, the program doesn't work at all.
Your problem has nothing at all to do with collections. transient tells Java that you do not want to store the field's value, so when you reload the stored object, transient fields are set to null (or 0, or other respective default values). Therefore, in your example code of cars.put(key, object), you are essentially attempting to do null.put(key, object).
containsKey of course fails for the same reason: you are attempting to call it on something that is null.
If you don't want to serialize your collection, you will have to do something like cars = new HashMap<>() after deserializing the object.
That means the problem is also unrelated to equals and hashCode. However, the information that your program 'breaks' when you have NetBeans generate them suggests that you may have other issues. Good information about equals and hashCode can be found in this related SO question:
What issues should be considered when overriding equals and hashCode in Java?
Java does not call the constructor of a Serializable class (nor run its field initializers) when deserializing an instance. Therefore, your field initializer
transient Map<String, Car> cars = new HashMap<>();
will not be executed.
To accomplish this, you can override the readObject method of your class:
public class ... implements Serializable {
    ...
    private transient Map<String, Car> cars = new HashMap<>();
    ...
    private void readObject(ObjectInputStream stream)
            throws IOException, ClassNotFoundException {
        stream.defaultReadObject();
        // Important! Recreate the transient field cars as an empty HashMap.
        this.cars = new HashMap<>();
    }
    ...
}
I removed the transient modifier and implemented the Serializable interface in the Car class, and it works. I think the problem was that the collection, being transient, was not saved after the first run, and on the second run, when the Owner object was deserialized, the no-argument constructor was not called, so the new cars collection that the constructor would have created never existed. So on the second run the program attempted to add a Car object to a non-existent collection.
I want to map a String to a method that builds a certain object, but not necessarily from the same class for every String. Looking around on here, a nice solution was to have a Map<String, ObjectBuilder>, with ObjectBuilder<T> being an interface with an abstract method T buildObject().
I then have multiple classes, let's say Object1Builder implements ObjectBuilder<Object1>, Object2Builder implements ObjectBuilder<Object2>, and so on.
I can then construct my map like so:
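For reference, a minimal version of that setup might look like this (Object1 and Object1Builder are the placeholder names used above):
interface ObjectBuilder<T> {
    T buildObject();
}

class Object1 { /* ... */ }

class Object1Builder implements ObjectBuilder<Object1> {
    @Override
    public Object1 buildObject() {
        return new Object1();
    }
}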
stringToBuilder = new HashMap<String, ObjectBuilder>(){{
    put(string1, new Object1Builder());
    put(string2, new Object2Builder());
    put(string3, new Object3Builder());
}};
And I can then do Object1 myObject1 = stringToBuilder.get(string1).buildObject()
Problem is, I get an error
Raw use of parameterized class 'ObjectBuilder'
in IntelliJ when I instantiate and construct stringToBuilder. I understand it has something to do with not specifying the generic type of the ObjectBuilder interface when constructing the map, but I don't see how I could circumvent this. Moreover, I'm not very satisfied with the fact that I'm storing these builders in the map; I wish I could access them through the map without having a whole instance in the map.
You've probably noticed I'm quite new to Java and all this, but please be sure I'm trying my best. Thank you in advance :)
What you want will never be possible without explicit casts. The reason is that there is no direct relation between the map keys (strings) and the values (ObjectBuilders).
If you can switch from strings to the target classes (the T types) as map keys, this can be done with a little internal casting.
First, declare your map as Map<Class<?>, ObjectBuilder<?>>. Note the two wildcards; the compiler cannot help us with enforcing that the keys and the values have the same generic type. That is what we need to do ourselves.
Next, initialize it as necessary. I dislike the anonymous class with an initializer that you use, so I'll use Map.of:
Map<Class<?>, ObjectBuilder<?>> classToBuilder = Map.of(
    Object1.class, new Object1Builder(),
    Object2.class, new Object2Builder(),
    Object3.class, new Object3Builder()
);
Finally, we need a method to get the builder:
@SuppressWarnings("unchecked")
private <T> ObjectBuilder<T> getBuilder(Class<T> type) {
    // Omitted: presence check
    return (ObjectBuilder<T>) classToBuilder.get(type);
}
This can now be used as follows:
Object1 object1 = getBuilder(Object1.class).buildObject();
So my issue is a bit complex. I've got one User class, instances of which I put into a ConcurrentHashMap<String, User>. One instance corresponds to one user. The key is the user's ID.
I'm using Gson to serialize this ConcurrentHashMap and save my users' data.
Inside the User class I have multiple variables (ints, Strings, etc.) and a few collections.
The problem is with overwriting the file. Two out of my four ArrayLists serialize as usual, but when I add another ArrayList, or any collection for that matter, the collection won't show up in the file. However, when I add a simple variable such as a String or an int, the file is updated and those values are appended for every user. When a new user is created, those collections show up as if nothing had happened. I need to add those collections for already existing users.
My question is why in hell I can't add another ArrayList to the class, and why it's not showing up in the file.
public class User {
    private String nickname;
    private String id;
    private int coins;
    // ...bunch of variables
    private int bikes = 0;
    private int scooters = 0;
    private int goldIngots = 0;
    private final ArrayList<Car> cars = new ArrayList<>(); // showing up
    private final ArrayList<Hotel> hotels = new ArrayList<>(); // showing up
    private final ArrayList<AwardType> awards = new ArrayList<>(); // not showing up

    // ...constructor
    // ...getters and setters
}
[Screenshots in the original post: sample of UserClass, collections inside UserClass, how it should look, values are not appending.]
EDIT
AwardType is an enum. The list containing AwardType values is not showing up for existing users, only for new ones.
EDIT 1
After adding Gson's serializeNulls() option, the list is added to the file, but as null:
"bikes": 0,
"scooters": 0,
"goldIngots": 0,
"cars": [],
"hotels": [],
"awards": null
As mentioned in the comments, you need to add a no-arg constructor (sometimes also called a "default constructor") to your class. This constructor may be private (so you don't call it by accident).
This constructor is required for Gson to be able to create an instance and then update its field values during deserialization. Other constructors with parameters do not work because Gson cannot determine which JSON property matches which constructor parameter, and assuming default values (e.g. 0 and null) for the parameters might not be correct in all situations either.
If no no-arg constructor is detected, Gson uses a JDK-internal class called sun.misc.Unsafe to create an instance without calling any constructor and without executing any initializer blocks (including field initializers). This can lead to issues such as the one you experienced. Additionally, the Unsafe class might not be available in all environments. For these reasons you should avoid relying on this behavior.
Alternatively you can also create an InstanceCreator for your class, but in most cases it is easier to add a no-arg constructor.
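A minimal sketch of what that could look like for the User class from the question (only a few of its fields are shown; AwardType is the enum mentioned in the question):
import java.util.ArrayList;

public class User {
    private String nickname;
    private String id;
    private ArrayList<AwardType> awards = new ArrayList<>();

    // No-arg constructor for Gson; private so it is not called by accident.
    // Gson creates the instance with it and then fills in the fields from the JSON.
    private User() {
    }

    public User(String nickname, String id) {
        this.nickname = nickname;
        this.id = id;
    }
}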
This code is familiar.
List<Character> list = new ArrayList<Character>();
// populate the list
list.add('X');
list.add('Y');
// make the list unmodifiable
List<Character> immutablelist = Collections.unmodifiableList(list);
Now I have a typical model class with variables, getters and setters. Can I make that object immutable after I have invoked the setters I want? Something like this is my dream...
Person person = new Person();
person.setFirstName("firsName");
person.setLastname("lastName");
// i have a lot to set!!- Person is pretty large
// some way to do this
Person stubborn = Object.immutableObject(person);
I know there is no Object.immutableObject(). But is it possible to achieve something like this?
There's no general way to get this behavior.
You can create an ImmutablePerson class with a constructor that accepts a Person and constructs an immutable version of that Person.
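A rough sketch of that idea, assuming Person exposes getters matching the setters from the question (getFirstName(), getLastname()); the remaining fields would be copied the same way:
public final class ImmutablePerson {
    private final String firstName;
    private final String lastName;

    public ImmutablePerson(Person person) {
        // Copy the mutable object's state once; no setters are exposed afterwards.
        this.firstName = person.getFirstName();
        this.lastName = person.getLastname();
        // ...copy the rest of the fields here
    }

    public String getFirstName() { return firstName; }
    public String getLastname() { return lastName; }
}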
There is no way to do it without doing some work.
You need to either get a compile-time check by creating a new immutable object from the mutable one as Eran suggests, add some code for a runtime check, or get a weaker compile-time check by using a split interface,
e.g.
interface ReadOnlyPerson {
    int getX();
}

interface ModifiablePerson extends ReadOnlyPerson {
    void setX(int x);
}

class Person implements ModifiablePerson {
    private int x;

    public int getX() { return x; }
    public void setX(int x) { this.x = x; }
}
You can then pass out the read-only reference after construction.
However, this pattern does not give a strong guarantee that the object will not be modified, as the ReadOnlyPerson reference can be cast back to the modifiable type, etc.
Sure, just have a boolean flag in the Person object which says whether the object is locked for modifications. If it is locked, just have all setters do nothing, or have them throw exceptions.
When invoking immutableObject(person), just set the flag to true. Setting the flag will also lock/deny the ability to set or change the flag later.
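A rough sketch of that lock-flag idea (the field and method names here are made up for illustration):
public class Person {
    private String firstName;
    private boolean locked = false;

    public void setFirstName(String firstName) {
        if (locked) {
            throw new IllegalStateException("Person is locked for modifications");
        }
        this.firstName = firstName;
    }

    // Once locked, the object cannot be unlocked again.
    public void lock() {
        locked = true;
    }
}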
I'm just learning Java, but I keep running into the same problem over and over again:
How do I revert to an old state of some object efficiently?
public class Example {
MyObject myLargeObject;
public void someMethod(){
MyObject myLargeMyObjectRecovery = myLargeObject;
/**
* Update and change myLargeObject
*/
if(someCondition){
//revert to previous state of myLargeObject
myLargeObject = myLargeMyObjectRecovery;
}
}
}
The above is how I would like the code to work, but it obviously doesn't, since myLargeObject and myLargeMyObjectRecovery are references to the same object.
One solution is to create a copy constructor. This is fine for small objects, but if I have a large object (in my project the object is a large 2D array, meaning I would have to iterate over all of the entries), this way feels wrong.
This must be a very common problem in Java; how do others get around it?
Either deep copy, as you noted, or possibly serialization. You could store a serialized representation and then reconstruct the object from it later.
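For example, a rough sketch of the serialization approach, assuming MyObject (and everything it references) implements Serializable; the helper names here are made up:
import java.io.*;

public final class Snapshots {

    // Capture the object's state as bytes (effectively a deep copy of everything reachable).
    static byte[] snapshot(Serializable obj) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(obj);
        }
        return bytes.toByteArray();
    }

    // Rebuild an object from a previously taken snapshot.
    @SuppressWarnings("unchecked")
    static <T> T restore(byte[] snapshot) throws IOException, ClassNotFoundException {
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(snapshot))) {
            return (T) in.readObject();
        }
    }
}
In someMethod() you would take the snapshot before modifying myLargeObject and call restore() only when someCondition holds.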
The best solution depends on whether you have external references to your MyObject instance or not.
If you use myLargeObject from the Example class only, you can either:
Serialize your object at the savepoint and deserialize it at the restore point (the serialized byte[] must be transient), or
Create a new instance with a copy constructor (doing deep copying) at the savepoint, and replace the reference at the restore point.
If you have access to the MyObject instance from outside, then it becomes a bit more interesting: you must introduce synchronization.
All of your methods on MyObject must be synchronized (to avoid inconsistent reads).
You should have a synchronized void saveState() method which saves your state (either by serialization or by a copy constructor; the latter is better).
You should have a synchronized void restoreState() method, where you internally restore your state (for copying fields you can use a common code fragment with the copy constructor).
In all cases it is recommended to close the transaction (a kind of commit()) at some point; when you get there, you can delete the saved state.
Also, it is very important that, if you have an underlying data structure, you traverse the whole structure when copying. Otherwise you may experience problems with the object references.
Be careful with JPA entities or any other externally managed objects; it is unlikely that any of these methods will work with them.
If you assign one object reference to another, the JVM makes both refer to the same object in memory. Therefore, changes made through one reference will be visible through the other reference.
MyObject myLargeMyObjectRecovery = myLargeObject;
It is the same in your code: myLargeMyObjectRecovery and myLargeObject refer to the same object in memory.
If you want an exact copy of an object you can use the Object.clone() method. This method copies the object and returns a reference to a new object whose fields and their values are the same as the copied object's.
Since the clone method is protected, you cannot access it directly. You can implement the Prototype pattern, depending on your requirements.
http://www.avajava.com/tutorials/lessons/prototype-pattern.html
public class MyObject implements Cloneable {
    // fields, getters, setters and other methods.

    public MyObject doClone() {
        try {
            // You may need to override clone() for a deep copy, depending on your requirements.
            return (MyObject) super.clone();
        } catch (CloneNotSupportedException e) {
            throw new AssertionError(e); // cannot happen: we implement Cloneable
        }
    }
}
and call it in your code
MyObject myLargeMyObjectRecovery = myLargeObject.doClone();
The Memento pattern addresses the design issue of reverting to a previous state. This approach is useful when you need to capture one or multiple states of an object and be able to revert to them. Think of it as an N-step undo operation, just like in a text editor.
In the pattern you have a stateful object, the Originator, which is responsible for saving and restoring snapshots of its state. The state itself is saved in a wrapper class called a Memento, and the mementos are stored in and accessed via a Caretaker.
In this approach the easiest idea is to deep-copy your objects. However, this may be inefficient in terms of performance and space, since you store whole objects rather than only the change-sets.
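A bare-bones sketch of the pattern (the Editor/History names and the String state are made up purely for illustration):
import java.util.ArrayDeque;
import java.util.Deque;

// Originator: the object whose state we want to be able to restore.
class Editor {
    private String text = "";

    void type(String s) { text += s; }
    String getText() { return text; }

    // Memento: an immutable snapshot of the Originator's state.
    record Memento(String text) {}

    Memento save() { return new Memento(text); }
    void restore(Memento m) { text = m.text(); }
}

// Caretaker: stores the snapshots without looking inside them.
class History {
    private final Deque<Editor.Memento> undoStack = new ArrayDeque<>();

    void push(Editor.Memento m) { undoStack.push(m); }
    Editor.Memento pop() { return undoStack.pop(); }
}
Undo then becomes editor.restore(history.pop()).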
Some object persistence libraries provide implementations of transactions and snapshots. Take a look at Prevayler, which is an object persistence library for Java and an implementation of the prevalent system pattern. The library captures the changes to your objects in the form of transactions and stores them in memory. If you need persistent storage of your POJOs, you can save snapshots of your objects to disk periodically and revert to them if needed.
You can find more on serializing POJOs in this SO question: Is there a object-change-tracking/versioning Java API out there?
I have a serializable object MyObject that contains an integer foo. I set the value 10 on foo and save the object to a file using writeObject().
Later I add an integer bar to MyObject. I set the value 15 on bar and then load the old serialized file using readObject().
The old serialized file doesn't contain the integer bar, so bar gets the value 0. I want to keep the value 15 in bar if the old serialized file doesn't contain the variable bar.
Should I override readObject(), or how else could I prevent readObject() from setting "default values" for unknown fields?
I want to do this because I set my own default values in the constructor and would like to use those default values to control versioning.
Serialization doesn't set default values; it defers to Java's default value initialization scheme.
If I can sum up your question: you want serialization to merge what's in the serialized stream with the values in memory. That's not possible with Java serialization, since it controls which objects get created. You can read in your serialized object, then manually write the code to merge the fields you want merged. I'd steer clear of Java serialization if I were you, but let's say you want to continue using it:
public class MyObject {
public void merge( MyObject that ) {
// given some other instance of an object merge this with that.
// write your code here, and you can figure out the rules for which values win.
}
}
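For the foo/bar fields from the question, one possible merge rule might look like this (a sketch; treating 0 as "the old stream didn't contain bar" is an assumption):
public void merge(MyObject that) {
    // Always take foo from the deserialized object.
    this.foo = that.foo;
    // Keep this object's bar (e.g. 15) when the old stream didn't contain bar,
    // in which case the deserialized value is the default 0.
    if (that.bar != 0) {
        this.bar = that.bar;
    }
}
Reading the file and applying the merge then looks like this: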
ObjectInputStream stream = new ObjectInputStream(new FileInputStream(file));
MyObject that = (MyObject) stream.readObject();
someObject.merge(that);
Voilà, you control which fields will be merged from that into someObject. If you want a library to do this merge for you, check out http://flexjson.sourceforge.net. It uses JSON serialization and works with beans rather than POJOs. However, there is a way to take an already populated object and overwrite values from a JSON stream. There are limitations to this. The other benefit is that you can actually read the stream back after your object structure has changed, something Java serialization can technically do, but it's very, very hard.
Would adding the following method to your MyObject work for you?
private void readObject(ObjectInputStream ois) throws IOException, ClassNotFoundException
{
bar = 15; // Set a default value of 15 if it's not in the serialized output file
ois.defaultReadObject();
}
Use the keyword transient to exclude fields from serialization/deserialization.