I am trying to add objects of an Employee class to a TreeSet. I don't implement the Comparable or Comparator interface. But the add method behaves differently on different systems. Why so? Code snippet below:
import java.util.Set;
import java.util.TreeSet;
public class TreeSetTest {
public static void main(String[] args) {
Set<Employee> set = new TreeSet<Employee>();
set.add(new Employee());
// set.add(new Employee());
// set.add(new Employee());
}
}
On my current system (Win 10), whether I call the set.add() method once or thrice, it always throws a ClassCastException at runtime.
But consider this question: Why does TreeSet throw ClassCastException
The user there writes that he doesn't get the exception when he calls the add method only once.
Also, on another system (Win 7), yesterday I had tried adding the object 3 times, calling the add method thrice, and there was no ClassCastException!! The size of the set remained 1, so it appeared that the extra objects were simply not getting added to the set.
So what could be the reason for these different kinds of behavior of the add method?
TreeSet.add() delegates to TreeMap.put(), which has differing behavior in Java 6 and Java 8.
Java 6:
public V put(K key, V value) {
Entry<K,V> t = root;
if (t == null) {
// TBD:
// 5045147: (coll) Adding null to an empty TreeSet should
// throw NullPointerException
//
// compare(key, key); // type check
root = new Entry<K,V>(key, value, null);
size = 1;
modCount++;
return null;
}
...
Java 8:
public V put(K key, V value) {
Entry<K,V> t = root;
if (t == null) {
compare(key, key); // type (and possibly null) check
root = new Entry<>(key, value, null);
size = 1;
modCount++;
return null;
}
...
As you can see, the earlier version had the compare() line commented out for some reason, but it was added back in the later version. Hence the exception you're seeing for the first element.
See here also: Why TreeSet can be used as a key for TreeMap in jdk 1.6?
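For completeness, here is a minimal sketch of the usual fix: give Employee a natural ordering (or pass a Comparator to the TreeSet constructor). The id field below is an assumption for illustration; it is not part of the original post.

import java.util.Set;
import java.util.TreeSet;

public class TreeSetTest {

    // Hypothetical Employee with a natural ordering by id (assumed field).
    static class Employee implements Comparable<Employee> {
        private final int id;

        Employee(int id) {
            this.id = id;
        }

        @Override
        public int compareTo(Employee other) {
            return Integer.compare(this.id, other.id);
        }
    }

    public static void main(String[] args) {
        Set<Employee> set = new TreeSet<Employee>();
        set.add(new Employee(1)); // no ClassCastException: compare(key, key) succeeds
        set.add(new Employee(2));
        set.add(new Employee(3));
        System.out.println(set.size()); // 3
    }
}

With a Comparable element type the behavior no longer depends on whether the first add performs the self-comparison, so it is the same on Java 6 and Java 8.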
I tried to use the Set interface as the value type for a Hazelcast IMap instance, and when I ran my test I found that the test hung inside the ConcurrentMap#compute method.
Why do I get an infinite loop when I use a Hazelcast IMap in this code:
import com.hazelcast.config.Config;
import com.hazelcast.config.MapConfig;
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.IMap;
import java.io.Serializable;
import java.util.*;
public class Main {
public static void main(String[] args) {
IMap<String, HashSet<StringWrapper>> store = Hazelcast.newHazelcastInstance(
new Config().addMapConfig(new MapConfig("store"))
).getMap("store");
store.compute("user", (k, value) -> {
HashSet<StringWrapper> newValues = Objects.isNull(value) ? new HashSet<>() : new HashSet<>(value);
newValues.add(new StringWrapper("user"));
return newValues;
});
store.compute("user", (k, value) -> {
HashSet<StringWrapper> newValues = Objects.isNull(value) ? new HashSet<>() : new HashSet<>(value);
newValues.add(new StringWrapper("user"));
return newValues;
});
System.out.println(store.keySet());
}
// Data class
public static class StringWrapper implements Serializable {
String value;
public StringWrapper() {}
public StringWrapper(String value) {
this.value = value;
}
public String getValue() {
return value;
}
public void setValue(String value) {
this.value = value;
}
@Override
public boolean equals(Object o) {
if (this == o) return true;
if (o == null || getClass() != o.getClass()) return false;
if (!super.equals(o)) return false;
StringWrapper value = (StringWrapper) o;
return Objects.equals(this.value, value.value);
}
@Override
public int hashCode() {
return Objects.hash(super.hashCode(), value);
}
}
}
Hazelcast: 3.9.3
Java: build 1.8.0_161-b12
Operating system: macOS High Sierra 10.13.3
@Alykoff I reproduced the issue based on the above example & the ArrayList version, which is reported as a GitHub issue: https://github.com/hazelcast/hazelcast/issues/12557.
There are 2 separate problems:
1 - When using HashSet, the problem is how Java deserializes the HashSet/ArrayList (collections) & how the compute method works. Inside the compute method (since Hazelcast is compiled against Java 6 & there is no compute method to override, the default implementation from ConcurrentMap is called), this block causes the infinite loop:
// replace
if (replace(key, oldValue, newValue)) {
// replaced as expected.
return newValue;
}
// some other value replaced old value. try again.
oldValue = get(key);
This replace method calls the IMap replace method. IMap checks whether the current value is equal to the user-supplied value. But because of a Java serialization optimization, the check fails. Please check the HashSet.readObject method. You'll see that when deserializing the HashSet, since the element count is known, it creates the inner HashMap with a capacity:
// Set the capacity according to the size and load factor ensuring that
// the HashMap is at least 25% full but clamping to maximum capacity.
capacity = (int) Math.min(size * Math.min(1 / loadFactor, 4.0f),
HashMap.MAXIMUM_CAPACITY);
But your HashSet, created without an initial capacity, has a default capacity of 16, while the deserialized one has an initial capacity of 1. This changes the serialization: index 51 of the byte array contains the current capacity, & it seems the JDK recalculates it based on size when deserializing the object, to minimize the size.
Please see below example:
HazelcastInstance hz = Hazelcast.newHazelcastInstance();
IMap<String, Collection<String>> store = hz.getMap("store");
Collection<String> val = new HashSet<>();
val.add("a");
store.put("a", val);
Collection<String> oldVal = store.get("a");
byte[] dataOld = ((HazelcastInstanceProxy) hz).getSerializationService().toBytes(oldVal);
byte[] dataNew = ((HazelcastInstanceProxy) hz).getSerializationService().toBytes(val);
System.out.println(Arrays.equals(dataNew, dataOld));
This code prints false. But if you create the HashSet with an initial size of 1, then both byte arrays are equal, and in your case you won't get an infinite loop.
2 - When using ArrayList, or any other collection, there's another problem, which you pointed out above. Due to how the compute method is implemented in ConcurrentMap, when you assign the old value to newValue & add a new element, you actually modify oldValue, thus causing the replace method to fail. But when you change the code to new ArrayList(value), you're creating a new ArrayList & the value collection is not modified. It's a best practice to wrap a collection before using it if you don't want to modify the original one. The same works for HashSet if you create it with size 1, due to the first issue I explained.
So in your case, you should use
Collection<String> newValues = Objects.isNull(value) ? new HashSet<>(1) : new HashSet<>(value);
or
Collection<String> newValues = Objects.isNull(value) ? new ArrayList<>() : new ArrayList<>(value);
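Applied to the compute calls from the question, the change is minimal; this is only a sketch (StringWrapper is the question's own class):

store.compute("user", (k, value) -> {
    // The only change from the question's code: an initial capacity of 1 for the
    // empty set, so its serialized form later matches the deserialized form.
    // Wrapping an existing value in a new HashSet also avoids mutating the old value.
    HashSet<StringWrapper> newValues =
            Objects.isNull(value) ? new HashSet<>(1) : new HashSet<>(value);
    newValues.add(new StringWrapper("user"));
    return newValues;
});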
That HashSet case seems to be a JDK issue rather than an optimization. I don't know whether any of these cases can be solved/fixed in Hazelcast, unless Hazelcast overrides the HashXXX collection serialization & overrides the compute method.
I was just wondering what would happen if the key of a HashMap is mutable; the test program below demonstrates that, and I am unable to understand why, when both the equals and hashCode methods return true and the same value, hashMap.containsKey returns false.
public class MutableKeyHashMap {
public static void main(String []a){
HashMap<Mutable, String> map = new HashMap<Mutable, String>();
Mutable m1 = new Mutable(5);
map.put(m1, "m1");
Mutable m2 = new Mutable(5);
System.out.println(map.containsKey(m2));
m2.setA(6);
m1.setA(6);
Mutable m3 = map.keySet().iterator().next();
System.out.println(map.containsKey(m2)+" "+m3.hashCode()+" "+m2.hashCode()+" "+m3.equals(m2));
}
}
class Mutable {
int a;
public Mutable(int a) {
this.a = a;
}
@Override
public boolean equals(Object obj) {
Mutable m = (Mutable) obj;
return m.a == this.a ? true : false;
}
@Override
public int hashCode(){
return a;
}
public void setA(int a) {
this.a = a;
}
public int getA() {
return a;
}
}
This is the output:
true
false 6 6 true
The javadoc explains it
Note: great care must be exercised if mutable objects are used as map keys. The behavior of a map is not specified if the value of an object is changed in a manner that affects equals comparisons while the object is a key in the map.
Basically, don't use mutable objects as keys in a Map; you're going to get burnt.
To extrapolate, because the docs may not appear clear: I believe the pertinent point here is 'changed in a manner that affects equals comparisons', and you seem to be assuming that equals(Object) is called each time containsKey is invoked. The docs don't say that; the wording implies they may be allowed to cache computations.
Looking at the source, it seems that because your hashCode returns a different value (was 5, now 6), it's possible that it's being looked up in a different bucket based on implementation details.
You can think of it this way: the Map has 16 buckets. When you give it an object with A == 5, it tosses it into bucket 5. Now you can change A to 6, but it's still in bucket 5. The Map doesn't know you changed A; it doesn't rearrange things internally.
Now you come over with another object with A == 6, and you ask the Map if it has one of those. It goes and looks in bucket 6 and says "Nope, nothing there." It's not going to go and check all the other buckets for you.
Obviously how things get put into buckets is more complicated than that, but that's how it works at the core.
The HashMap puts your object at the location for hash key 5. Then you change the key to 6 and use containsKey to ask the map whether it contains the object. The map looks at position 6 and finds nothing, so it answers false.
So don't do that, then.
When you put "m1" the first time around, hashCode() was 5. Thus the HashMap used 5 to place the value into the appropriate bucket. After changing m2, its hashCode() was 6, so when you tried looking for the value you put in, the bucket it looked in was different.
A code example to accompany ptomli's answer.
import java.util.*;
class Elem {
private int n;
public Elem(int n) {
this.n = n;
}
public void setN(int n) {
this.n = n;
}
@Override
public int hashCode() {
return n;
}
@Override
public boolean equals(Object e) {
if (this == e)
return true;
if (!(e instanceof Elem))
return false;
Elem an = (Elem) e;
return n == an.n;
}
}
public class MapTest {
public static void main (String [] args) {
Elem e1 = new Elem(1);
Elem e2 = new Elem(2);
HashMap<Elem, Integer> map = new HashMap<Elem, Integer>();
map.put(e1, 100);
map.put(e2, 200);
System.out.println("before modification: " + map.get(e1));
e1.setN(9);
System.out.println("after modification using updated key: " + map.get(e1));
Elem e3 = new Elem(1);
System.out.println("after modification using key which equals to the original key: " + map.get(e3));
}
}
Compile and run it. The result is:
before modification: 100
after modification using updated key: null
after modification using key which equals to the original key: null
I am using Java 6 on Linux.
I have a homework assignment in a data structures course; the question is:
Implementation of doubly-linked list class.
the methods:
display()
length() or size()
insertSorted(Comparable)
insertToEnd(Comparable)
insertToHead(Comparable)
delete(Comparable)
boolean search(Comparable)
You must do this in JAVA
Create an application layer to test your class and its methods.
Compress all of your source files into a file and rename it as CS214HW1_first_lastName.zip Put your name in the filename. If needed, add a ReadMe.txt file for extra information such as compilation.
I implemented everything correctly and the code is working fine, but I used, for example, insertSorted(int) instead of insertSorted(Comparable), because I didn't know how to do it.
I searched online and read the Java documentation for Comparable, but it is not enough :(
Can anybody help, please? It is very important.
Here's some of my code. I can't post it all, because I don't want my friends to get the same code.
I will get a zero if there is identical code.
Code:
class DLL {
class Node {
Node next;
Node prev;
int data;
Node() {
next = null;
prev = null;
data = 0;
}
Node(int dt) {
next = null;
prev = null;
data = dt;
}
}
Node head;
void insertToHead(int dt) {
if (head == null) {
head = new Node(dt);
}
else {
head.prev = new Node(dt);
head.prev.next = head;
head = head.prev;
}
}
public static void main(String args[]) {
DLL dll = new DLL();
dll.insertToHead(1);
dll.insertToHead(2);
dll.insertToHead(3);
}
}
Please, somebody, tell me what to change at the beginning of the class.
Are we going to use extends or implements Comparable<E>, or what?
And what changes should I make to the method insertToHead(Comparable)?
What changes should I make to main?
You would probably like to look into how generics work as well. The basic idea is that you would like to set up your class so that it does not know the exact type of object it holds, but is given a hint about what it can expect of a declared generic type.
In your case, you would like to set up your list so that you can create linked lists of anything that can be compared. Java has an interface for that, which you have mentioned, called Comparable<E>; this tells Java that it will be able to call methods such as compareTo on the provided object.
More specifically to your closing questions:
Use the following style of class declaration MyClass<MyGenericType extends Comparable<MyGenericType>>. In your case DLL<E extends Comparable<E>>.
Switch the method arguments to accept E our declared generic type.
You should use the class Integer instead of the primitive type int, and change the creation of your list to DLL<Integer> dll = new DLL<Integer>().
Fully updated version of provided code:
public class DLL<E extends Comparable<E>> {
class Node {
Node next;
Node prev;
E data;
Node() {
next = null;
prev = null;
data = null;
}
Node(E dt) {
next = null;
prev = null;
data = dt;
}
}
Node head;
void insertToHead(E dt) {
if (head == null) {
head = new Node(dt);
}
else {
head.prev = new Node(dt);
head.prev.next = head;
head = head.prev;
}
}
public static void main(String args[]) {
DLL<Integer> dll = new DLL<Integer>();
dll.insertToHead(1);
dll.insertToHead(2);
dll.insertToHead(3);
}
}
This new implementation should provide a hint for how to proceed with some of the other homework tasks. For instance, you can now compare objects just by their compareTo method, which might be useful for sorting, hint hint.
That doc page gives a very good explanation of how to use this method. You should note that in their docs they use a generic type called T instead of E; it really doesn't make a difference, you can call it whatever you want provided it is used consistently in your program.
Edit:
An extra hint in the sorting direction:
Objects which implement the Comparable interface have a method called compareTo; this method is set up so you can call:
object1.compareTo(object2);
this method returns an int which will be:
> 0 when object1 is greater than object2
= 0 when object1 is equal to object2
< 0 when object1 is less than object2
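For example, Integer implements Comparable, so a quick illustration of those three cases is:

Integer three = 3;
System.out.println(three.compareTo(1)); // positive: 3 is greater than 1
System.out.println(three.compareTo(3)); // 0: equal
System.out.println(three.compareTo(5)); // negative: 3 is less than 5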
I don't want to give away too much as this is a homework assignment but here is my hint:
The way the above code sets up your classes, you would be able to tell the relationship between NodeA and NodeB by calling:
NodeA.data.compareTo(NodeB.data)
this will return an integer which gives you information according to the list above.
The <=,>=,== operators are likely found in the Integer class's compareTo method.
Something like:
public int compareTo(Object o) {
int otherNumber = ((Integer) o).intValue();
int thisNumber = this.intValue();
if (otherNumber > thisNumber) {
return 1;
} else if (otherNumber < thisNumber) {
return -1;
} else {
return 0;
}
}
but more likely they just do something like:
public int compareTo(Object o) {
return this.intValue() - ((Integer) o).intValue(); // possibly normalized to 1, -1, 0
}
See the Docs on Integer for more info on this.
The HashSet Set implementation allows the addition of null. Is there any Collection implementation that will not allow null? I know I can do a remove for the null on the HashSet, but I was wondering if I can restrict it.
public static void main(String[] args) {
Set<String> testing = new HashSet<String>();
testing.add(null);
testing.add("test");
for(String str : testing){
}
}
//TreeSet allows null as well
public static void main(String[] args) {
TreeSet<String> testing = new TreeSet<String>();
testing.add(null);
for(String str : testing){
System.out.println("testing");
}
}
A TreeSet is appropriate for your requirement. Its add(Object) method javadoc states
Throws:
NullPointerException - if the specified element is null and this set
uses natural ordering, or its comparator does not permit null elements
in both Java 6 and Java 7.
Also, if you are looking for Collection implementations (rather than Set implementations), there are others.
If you want to use a HashSet, but with the restriction of no nulls, you could also create a class extending HashSet.
Example:
public class HashSetNullLess<E> extends HashSet<E> {
public boolean add(E e) {
if(e==null)
return false; //Or throw unsupported exception message
return super.add(e);
}
}
This should pretty much take care of the problem; I checked the code of HashSet.java, and even the constructors accepting collections lead to the add method via addAll in AbstractCollection.java.
See:
public boolean addAll(Collection<? extends E> c) {
boolean modified = false;
for (E e : c)
if (add(e))
modified = true;
return modified;
}
I agree with SotiriosDelimanolis. I tried to run this code in JDK 7.
public class Test {
public static void main(String arg[]) {
TreeSet<String> set = new TreeSet<String>();
set.add(null);
set.add("Test");
System.out.println(set);
}
}
But got the following error:
Exception in thread "main" java.lang.NullPointerException
I think by using this, your requirement will be fulfilled.
Consider wrapping the Hashtable class. It does not allow nulls; however, it implements the Map interface. You can do this the same way HashSet wraps HashMap. Take into account that Hashtable is synchronized.
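A rough sketch of that idea (the class name NoNullSet and its structure are mine, for illustration only): a Set backed by a Hashtable, which rejects null keys, so add(null) fails automatically.

import java.util.AbstractSet;
import java.util.Hashtable;
import java.util.Iterator;

public class NoNullSet<E> extends AbstractSet<E> {
    private static final Object PRESENT = new Object();
    private final Hashtable<E, Object> table = new Hashtable<E, Object>();

    @Override
    public boolean add(E e) {
        // Hashtable.put throws NullPointerException for null keys,
        // so null elements are rejected automatically.
        return table.put(e, PRESENT) == null;
    }

    @Override
    public boolean remove(Object o) {
        return table.remove(o) != null;
    }

    @Override
    public boolean contains(Object o) {
        return table.containsKey(o);
    }

    @Override
    public Iterator<E> iterator() {
        return table.keySet().iterator();
    }

    @Override
    public int size() {
        return table.size();
    }
}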
You can also use the following, but it is really implicit and could make the code harder to read:
HashSet<String> set = new HashSet<String>() {
@Override
public boolean add(String s) {
if(s == null)
throw new NullPointerException();
return super.add(s);
}
};
You could also use TreeSet, but add just one element before actually using it, because it accepts null only as the initial element.
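That trick relies on Java 6 behavior, where the type/null check on the very first element is skipped (see the TreeMap.put comparison at the top of this page); a sketch of what it depends on:

TreeSet<String> set = new TreeSet<String>();
set.add("seed");  // first element, accepted without a compare() call on Java 6
set.add(null);    // throws NullPointerException once the set is non-empty

On Java 7 and later even the first add(null) throws, so the workaround is unnecessary there.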
I have the following piece of code:
private final List<WeakReference<T>> slaves;
public void updateOrdering() {
// removes void weak references
// and ensures that weak references are not voided
// during subsequent sort
List<T> unwrapped = unwrap();
assert unwrapped.size() == this.slaves.size();
// **** could be reimplemented without using unwrap() ****
Collections.sort(this.slaves, CMP_IDX_SLV);
unwrapped = null;// without this, ....
}
Method unwrap() just creates a list of the T's referenced by the weak references in slaves,
and as a side effect eliminates the weak references referencing null in slaves.
Then comes the sort, which relies on each member of slaves referencing some T;
otherwise the code yields a NullPointerException.
Since unwrapped holds a reference to each T in slaves, during sorting the GC cannot eliminate a T. Finally, unwrapped = null eliminates the reference to unwrapped
and so allows GC again. This seems to work quite well.
Now my question:
If I remove unwrapped = null; this results in NullPointerExceptions when running many tests under some load. I suspect that the JIT eliminates List<T> unwrapped = unwrap();
and so GC applies to the T's in slaves during sorting.
Do you have another explanation? If you agree with me, is this a bug in the JIT?
I personally think that unwrapped = null should not be necessary, because unwrapped is removed from the frame as soon as updateOrdering() returns. Is there a specification of what may be optimized and what is not?
Or did I do this the wrong way? I have the idea of modifying the comparator so that it allows weak references to null. What do you think about that?
Thanks for suggestions.
Add on (1)
Now I want to add some missing pieces of information:
First of all Java version:
java version "1.7.0_45"
OpenJDK Runtime Environment (IcedTea 2.4.3) (suse-8.28.3-x86_64)
OpenJDK 64-Bit Server VM (build 24.45-b08, mixed mode)
Then someone wanted to see the method unwrap:
private synchronized List<T> unwrap() {
List<T> res = new ArrayList<T>();
T cand;
WeakReference<T> slvRef;
Iterator<WeakReference<T>> iter = this.slaves.iterator();
while (iter.hasNext()) {
slvRef = iter.next();
cand = slvRef.get();
if (cand == null) {
iter.remove();
continue;
}
assert cand != null;
res.add(cand);
} // while (iter.hasNext())
return res;
}
Note that while iterating, cleared (void) references are removed.
In fact, I replaced this method with
private synchronized List<T> unwrap() {
List<T> res = new ArrayList<T>();
for (T cand : this) {
assert cand != null;
res.add(cand);
}
return res;
}
using my own iterator but functionally this should be the same.
Then someone wanted the stack trace. Here is a piece of it:
java.lang.NullPointerException: null
at WeakSlaveCollection$IdxComparator.compare(WeakSlaveCollection.java:44)
at WeakSlaveCollection$IdxComparator.compare(WeakSlaveCollection.java:40)
at java.util.TimSort.countRunAndMakeAscending(TimSort.java:324)
at java.util.TimSort.sort(TimSort.java:189)
at java.util.TimSort.sort(TimSort.java:173)
at java.util.Arrays.sort(Arrays.java:659)
at java.util.Collections.sort(Collections.java:217)
at WeakSlaveCollection.updateOrdering(WeakSlaveCollection.java:183)
It points into the comparator, at the line with the return.
static class IdxComparator
implements Comparator<WeakReference<? extends XSlaveNumber>> {
public int compare(WeakReference<? extends XSlaveNumber> slv1,
WeakReference<? extends XSlaveNumber> slv2) {
return slv2.get().index()-slv1.get().index();
}
} // class IdxComparator
and finally,
private final static IdxComparator CMP_IDX_SLV = new IdxComparator();
is an important constant.
Add on (2)
I have now observed that the NPE indeed occurs even if 'unwrapped = null' is present in updateOrdering().
Weak references may be cleared by the Java runtime
if no strong reference remains after JIT optimization.
The exact source code seems not to matter at all.
I solved the problem the following way:
public void updateOrdering() {
Collections.sort(this.slaves, CMP_IDX_SLV);
}
without any decoration inserted to prevent the slaves from being garbage collected,
and with the comparator in CMP_IDX_SLV able to handle weak references to null:
public int compare(WeakReference<? extends XSlaveNumber> slv1,
WeakReference<? extends XSlaveNumber> slv2) {
XSlaveNumber sSlv1 = slv1.get();
XSlaveNumber sSlv2 = slv2.get();
if (sSlv1 == null) {
return sSlv2 == null ? 0 : -1;
}
if (sSlv2 == null) {
return +1;
}
assert sSlv1 != null && sSlv2 != null;
return sSlv2.index()-sSlv1.index();
}
As a side effect, ordering the underlying list List<WeakReference<T>> slaves
puts the void weak references at the end of the list, where they can be removed later.
I examined your source code, and I got a NullPointerException when the JIT compiled my method corresponding to your method "updateOrdering" and GC occurred during sorting.
But I got the NullPointerException during Collections.sort whether unwrapped = null was present or not.
This may be due to a difference between my sample source code and yours, or a Java version difference. I will examine further if you tell me your Java version.
I use the Java version below:
java version "1.7.0_40"
Java(TM) SE Runtime Environment (build 1.7.0_40-b43)
Java HotSpot(TM) 64-Bit Server VM (build 24.0-b56, mixed mode)
If you want to cheat the JIT compilation, insert the code below into your source instead of unwrapped = null (for example). Then JIT compilation doesn't eliminate the unwrapped reference.
long value = unwrapped.size() * unwrapped.size();
if(value * value % 3 == 2) {
//Because value * value % 3 is always 0 or 1, this block can never be reached.
//Insert here source code that uses the unwrapped list, for example printing the unwrapped list.
}
My examination results are below.
If the JIT doesn't optimize my method corresponding to updateOrdering, no NullPointerException occurs.
If the JIT optimizes my method, then a NullPointerException occurs at some point.
If the JIT optimizes my method with the above cheating code inserted, then no NullPointerException occurs.
So I (and you) suggest that the JIT optimization eliminates the unwrapped reference, and then the NullPointerException occurs.
By the way, if you want to see JIT compiler optimization, invoke java with -XX:+PrintCompilation.
If you want to see GC activity, use -verbose:gc.
Just for information, my sample source code is below.
public class WeakSampleMain {
private static List<WeakReference<Integer>> weakList = new LinkedList<>();
private static long sum = 0;
public static void main(String[] args) {
System.out.println("start");
int size = 1_000_000;
for(int i = 0; i < size; i++) {
Integer value = Integer.valueOf(i);
weakList.add(new WeakReference<Integer>(value));
}
for(int i = 0; i < 10; i++) {
jitSort();
}
GcTask gcTask = new GcTask();
Thread thread = new Thread(gcTask);
thread.start();
for(int i = 0; i < 100000; i++) {
jitSort();
}
thread.interrupt();
System.out.println(sum);
}
public static void jitSort() {
List<Integer> unwrappedList = unwrapped();
removeNull();
Collections.sort(weakList,
new Comparator<WeakReference<Integer>>() {
@Override
public int compare(WeakReference<Integer> o1,
WeakReference<Integer> o2) {
return Integer.compare(o1.get(), o2.get());
}
}
);
for(int i = 0; i < Math.min(weakList.size(), 1000); i++) {
sum += weakList.get(i).get();
}
unwrappedList = null;
// long value = (sum + unwrappedList.size());
// if((value * value) % 3 == 2) {
// for(int i = 0; i < unwrappedList.size(); i++) {
// System.out.println(unwrappedList.get(i));
// }
// }
}
public static List<Integer> unwrapped() {
ArrayList<Integer> list = new ArrayList<Integer>();
for(WeakReference<Integer> ref : weakList) {
Integer i = ref.get();
if(i != null) {
list.add(i);
}
}
return list;
}
public static void removeNull() {
Iterator<WeakReference<Integer>> itr = weakList.iterator();
while(itr.hasNext()) {
WeakReference<Integer> ref = itr.next();
if(ref.get() == null) {
itr.remove();
}
}
}
public static class GcTask implements Runnable {
private volatile int result = 0;
private List<Integer> stockList = new ArrayList<Integer>();
public void run() {
while(true) {
if(Thread.interrupted()) {
break;
}
int size = 1000000;
stockList = new ArrayList<Integer>(size);
for(int i = 0; i < size; i++) {
stockList.add(new Integer(i));
}
if(System.currentTimeMillis() % 1000 == 0) {
System.out.println("size : " + stockList.size());
}
}
}
public int getResult() {
return result;
}
}
}
As of Java 9, the correct way to prevent the JIT from discarding unwrapped is to use Reference.reachabilityFence:
public void updateOrdering() {
List<T> unwrapped = unwrap();
Collections.sort(this.slaves, CMP_IDX_SLV);
Reference.reachabilityFence(unwrapped);
}
The presence of the reachabilityFence call causes unwrapped to be considered strongly reachable before the call, preventing collection of unwrapped or its elements until the sort completes. (The strange way in which reachabilityFence's effects seem to propagate backward in time is because it behaves primarily as a JIT directive.) Without reachabilityFence, unwrapped can be collected once the JIT can prove it will never again be accessed, even though the variable is still in scope.
Your question
If I remove unwrapped = null; this results in NullPointerException when running many tests under some load.
According to my understanding, I do not think that unwrapped = null; makes any difference.
Yes, I have also read that setting objects to null sometimes increases the probability that the referenced object will be GC'ed, but I don't think it matters here, because once the method ends the scope of unwrapped ends and it becomes eligible for GC; and in your function the sorting Collections.sort(this.slaves, CMP_IDX_SLV); is done prior to unwrapped = null;, so it makes no sense that you get the NPE because of adding or removing that statement.
I think it is just a coincidence that you get the NPE; I believe if you run the test again you will get the NPE with that statement present as well.
If you read the Java documentation:
Weak reference objects, which do not prevent their referents from being made finalizable, finalized, and then reclaimed. Weak references are most often used to implement canonicalizing mappings.
Suppose that the garbage collector determines at a certain point in time that an object is weakly reachable. At that time it will atomically clear all weak references to that object and all weak references to any other weakly-reachable objects from which that object is reachable through a chain of strong and soft references. At the same time it will declare all of the formerly weakly-reachable objects to be finalizable. At the same time or at some later time it will enqueue those newly-cleared weak references that are registered with reference queues.
So it is really possible that when you constructed the List from unwrap(), some objects might have been marked finalizable, and while your Collections.sort is working some WeakReferences are cleared to null. And the point stated by Mattias Buelens is perfectly valid: you'll always lose in a fight against the compiler.
If you agree with me, is this a bug in the JIT?
No, surely not; I completely disagree with you.
I have the idea to modify comparator that it allows weak references on null. What do you think about that?
I think it will solve your immediate problem of the NPE, but your requirement ("removes void weak references and ensures that weak references are not voided during subsequent sort") is not satisfied.
Rather, try to call unwrap once again; this will reduce the window for an NPE to almost zero:
List<T> unwrapped = unwrap();
unwrapped = unwrap(); //Again, to reduce the chance of an NPE, as now we would have
//already made strong references to all objects which have not been null