I'm attempting to resize my hash table; however, I keep getting a NullPointerException.
I know that if the load factor goes above 0.75 the table size has to double, and if it drops below 0.50 the table size is halved. So far I have this:
public boolean add(Object x)
{
    int h = x.hashCode();
    if (h < 0) { h = -h; }
    h = h % buckets.length;

    Node current = buckets[h];
    while (current != null)
    {
        if (current.data.equals(x)) { return false; } // already in the set
        current = current.next;
    }

    Node newNode = new Node();
    newNode.data = x;
    newNode.next = buckets[h];
    buckets[h] = newNode;
    currentSize++;

    // The thresholds are fractions of the table length, per the example below.
    double factor1 = buckets.length * load1; // load1 = 0.75
    double factor2 = buckets.length * load2; // load2 = 0.50
    if (currentSize > factor1) { resize(buckets.length * 2); }
    else if (currentSize < factor2) { resize(buckets.length / 2); }
    return true;
}
Example: size = 3, max size (table length) = 5.
If we take the max size and multiply by 0.75 we get 3.75.
This is the threshold: once we pass it, the max size must double.
So if we add an extra element, the size is 4, which is > 3.75, and thus the new max size is 10.
However, once we change the table length, the bucket index of each element changes, so we call resize(int newLength):
private void resize(int newLength)
{
    HashSet newTable = new HashSet(newLength);
    for (int i = 0; i < buckets.length; i++) {
        newTable.add(buckets[i]);
    }
}
Here is my constructor, in case the buckets[i] confuses anyone.
public HashSet(int bucketsLength)
{
    buckets = new Node[bucketsLength];
    currentSize = 0;
}
I feel that the logic is correct, unless my resize method is not retrieving the elements.
If that is all your code for resize(), then you are failing to assign newTable back to a class attribute, i.e. to your old table. Right now you fill it with data and then don't do anything with it, since it is defined inside resize and is therefore not available outside of it.
So you end up thinking you have a larger table now, but in fact you are still using the old one ;-)
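A minimal standalone sketch of that fix, imitating the question's names: resize builds the temporary table, then adopts its bucket array and size counter as its own. The contains method is an addition just for checking; this is a sketch, not the asker's full class.

```java
// Minimal chained hash set: resize() must adopt the new bucket array,
// otherwise the old one keeps being used. Names mirror the question.
public class HashSetSketch {
    static class Node { Object data; Node next; }

    private Node[] buckets;
    private int currentSize;

    public HashSetSketch(int bucketsLength) {
        buckets = new Node[bucketsLength];
        currentSize = 0;
    }

    public boolean add(Object x) {
        int h = Math.abs(x.hashCode()) % buckets.length;
        for (Node n = buckets[h]; n != null; n = n.next) {
            if (n.data.equals(x)) return false;   // already in the set
        }
        Node newNode = new Node();
        newNode.data = x;
        newNode.next = buckets[h];
        buckets[h] = newNode;
        currentSize++;
        if (currentSize > 0.75 * buckets.length) resize(buckets.length * 2);
        return true;
    }

    private void resize(int newLength) {
        HashSetSketch newTable = new HashSetSketch(newLength);
        for (Node bucket : buckets) {
            for (Node n = bucket; n != null; n = n.next) {
                newTable.add(n.data);             // re-hash each element, not the Node
            }
        }
        buckets = newTable.buckets;               // the missing step: keep the new array
        currentSize = newTable.currentSize;
    }

    public boolean contains(Object x) {
        int h = Math.abs(x.hashCode()) % buckets.length;
        for (Node n = buckets[h]; n != null; n = n.next) {
            if (n.data.equals(x)) return true;
        }
        return false;
    }
}
```

Note that resize also walks each chain and skips null buckets, which avoids the NullPointerException that copying raw bucket slots can cause.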
Related
I need to create a reallocate method for a HashSet. I will call it from the add and remove methods, so that the table grows when the load factor is greater than 1 and shrinks when it is less than 0.5. The load factor is elements / size of the set. This is my work so far, but I start losing elements whenever I need to increase the size of my set.
public void reallocate() {
    double loadFactor = (double) currentSize / buckets.length;
    Node[] newBuckets;
    if (loadFactor > 1) {
        newBuckets = new Node[buckets.length * 2];
        for (Node bucket : buckets) {
            if (bucket != null) {
                int h = bucket.hashCode();
                h = Math.abs(h % newBuckets.length);
                newBuckets[h] = bucket;
            }
        }
        buckets = newBuckets;
    } else if (loadFactor < 0.5) {
        newBuckets = new Node[buckets.length / 2];
        for (Node bucket : buckets) {
            if (bucket != null) {
                int h = bucket.hashCode();
                h = Math.abs(h % newBuckets.length);
                newBuckets[h] = bucket;
            }
        }
        buckets = newBuckets;
    }
}
The original array is buckets, and I create newBuckets with the selected size. I used the loop to copy each element to newBuckets and then set buckets = newBuckets. I would prefer some advice instead of handing me the solution, since I want to learn how to do this.
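A hint at where the elements go: newBuckets[h] = bucket moves only the head Node of each chain, and it hashes the Node object rather than its data, so anything further down a chain, or anything that collides in the new array, is dropped. The sketch below shows only the general loop shape for re-hashing every node; the Node fields are assumptions, and wiring it into reallocate is left for you.

```java
// Sketch of a chain-preserving rehash: walk every chain and re-link each
// node by its data's hash. Node fields (data, next) are assumed.
public class RehashSketch {
    static class Node {
        Object data;
        Node next;
        Node(Object d) { data = d; }
    }

    static Node[] rehash(Node[] buckets, int newLength) {
        Node[] newBuckets = new Node[newLength];
        for (Node bucket : buckets) {
            Node current = bucket;
            while (current != null) {            // walk the whole chain
                Node next = current.next;        // save before re-linking
                int h = Math.abs(current.data.hashCode() % newLength);
                current.next = newBuckets[h];    // push node onto its new chain
                newBuckets[h] = current;
                current = next;
            }
        }
        return newBuckets;
    }
}
```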
Write the method removeRightmostHalf of the class LinkedList. Do not call any methods of the class and do not use any auxiliary data structures.
If l contains A -> B -> C -> D -> E, then after calling l.removeRightmostHalf(), l becomes A -> B -> C.
int size = 0;
int halfSize = 0;
current = head;
while (current.next != null) {
    ++size;
    current = current.next;
}
++size;
if (size % 2 == 0) {
    halfSize = (size / 2);
    for (int i = halfSize + 1; i < size; i++) {
    }
}
I do not know how to do the removal inside the for loop.
Any help?
I would suggest using two pointers, a slow pointer and a fast pointer. Initially both point to the start of the linked list.
The slow pointer moves one node at a time.
The fast pointer moves two nodes at a time.
The moment the fast pointer reaches the end of the list, mark the slow pointer's node as the end of the list by setting next = null.
An important note: detecting the end of the list depends on whether the list size is even or odd, so design and test both cases.
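The two-pointer idea can be sketched as follows; the Node class and head field here are assumptions, not the asker's LinkedList.

```java
// Standalone sketch of the slow/fast-pointer technique.
public class TwoPointerDemo {
    static class Node {
        char data;
        Node next;
        Node(char d) { data = d; }
    }

    static Node head;

    // fast advances two nodes per slow node; when fast runs off the end,
    // slow sits on the last node of the left half, so cut there. Starting
    // fast at head.next keeps ceil(n/2) nodes for both even and odd sizes.
    static void removeRightmostHalf() {
        if (head == null) return;
        Node slow = head;
        Node fast = head.next;
        while (fast != null && fast.next != null) {
            slow = slow.next;
            fast = fast.next.next;
        }
        slow.next = null; // drop the right half
    }

    public static void main(String[] args) {
        // Build A -> B -> C -> D -> E
        head = new Node('A');
        head.next = new Node('B');
        head.next.next = new Node('C');
        head.next.next.next = new Node('D');
        head.next.next.next.next = new Node('E');
        removeRightmostHalf();
        StringBuilder sb = new StringBuilder();
        for (Node n = head; n != null; n = n.next) sb.append(n.data);
        System.out.println(sb); // prints ABC
    }
}
```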
This will work: when you reach the half of the list, just cut the link to the rest of it.
public void removeRightMost() {
    int size = 0;
    int halfSize = 0;
    current = head;
    while (current != null) {
        size++;
        current = current.next;
    }
    if (size % 2 == 0) {
        halfSize = (size / 2);
        int count = 0;
        current = head;
        /* If the number of elements is even, decrease halfSize by 1, because
           current should stop exactly at the end of the left half: with 4
           elements, current should stop on element number 2 before leaving
           the loop. */
        while (count < halfSize - 1) {
            current = current.next;
            count++;
        }
        current.next = null; // the deletion: cut off the rest of the list, nothing follows current now
    } else {
        halfSize = (size / 2);
        int count = 0;
        current = head;
        while (count < halfSize) {
            current = current.next;
            count++;
        }
        current.next = null;
    }
    current = head; // return current to the first element (head)
}
good luck
After I check to see if the load factor signals the backing array to be resized, how do I actually do the resizing with quadratic probing?
Here is the code.
It's only part of the class. Also, could you check if I'm implementing the add method correctly?
import java.util.*;

public class HashMap<K, V> implements HashMapInterface<K, V> {

    // Do not make any new instance variables.
    private MapEntry<K, V>[] table;
    private int size;

    /**
     * Create a hash map with no entries.
     */
    public HashMap() {
        table = new MapEntry[STARTING_SIZE];
        size = 0;
    }

    @Override
    public V add(K key, V value) {
        if (key == null || value == null) {
            throw new IllegalArgumentException("Passed in null arguments.");
        }
        if (getNextLoadFactor() > MAX_LOAD_FACTOR) {
            resize();
        }
        MapEntry<K, V> entry = new MapEntry<>(key, value);
        V val = null;
        int index = Math.abs(key.hashCode()) % table.length;
        int temp = index;
        int q = 1;
        do {
            if (table[index] == null) {
                table[index] = entry;
            } else if (table[index].getKey().equals(key)) {
                val = table[index].getValue();
                table[index].setValue(value);
            }
            index = index + q*q % table.length;
            q++;
        } while (temp != index);
        size++;
        return val;
    }

    private double getNextLoadFactor() {
        return (double) size / (double) table.length;
    }

    private void resize() {
        MapEntry<K, V>[] temp = table;
        table = new MapEntry[table.length * 2 + 1];
        for (int i = 0; i < table.length; i++) {
        }
    }
Following this outline from the wiki:
1. Get the key k
2. Set counter j = 0
3. Compute hash function h[k] = k % SIZE
4. If hashtable[h[k]] is empty
(4.1) Insert key k at hashtable[h[k]]
(4.2) Stop
Else
(4.3) The key space at hashtable[h[k]] is occupied, so we need to find the next available key space
(4.4) Increment j
(4.5) Compute new hash function h[k] = ( k + j * j ) % SIZE
(4.6) Repeat Step 4 till j is equal to the SIZE of hash table
5. The hash table is full
6. Stop
According to the above, it seems to me that there is a problem in your add method. Notice steps (4.1) and (4.2): if table[index] == null, a position for the key has been found and you can stop. Your do loop will execute again, because right after the insert you update the index, so temp != index will be true.
You are also calculating the next index incorrectly, change
index = index + q*q % table.length;
to
index = (Math.abs(key.hashCode()) + q*q) % table.length;
The add will thus change to:
MapEntry<K, V> entry = new MapEntry<>(key, value);
V val = null;
int index = Math.abs(key.hashCode()) % table.length;
int q = 0;
while (table[(index = (Math.abs(key.hashCode()) + q*q++) % table.length)] != null);
table[index] = entry;
size++;
return val;
It can be proven that, for a table of size b with b > 3, the first b/2 probe positions are unique, so it is safe to assume you will find an empty position as long as the table is less than half full (fewer than b/2 - 1 entries). This depends on your MAX_LOAD_FACTOR.
For resizing, you will need to rehash every value into the new table. This is because your hash function uses the size of the table as modulus; the hash function has effectively changed, so you need to create the new array (here of length 2 * length + 1) and re-add every element to it.
private void resize() {
    MapEntry<K, V>[] temp = table;
    table = new MapEntry[table.length * 2 + 1];
    size = 0; // add() will count the entries again as they are re-inserted
    for (MapEntry<K, V> entry : temp) {
        if (entry != null) { // skip the empty slots of the old table
            this.add(entry.getKey(), entry.getValue());
        }
    }
}
Note: I did not test this; I only used the theory behind quadratic probing and hash tables to debug your code. Hope it helps!
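Putting the corrected probing loop and rehash-on-resize together, here is a standalone sketch along the same lines. The class itself, the 0.5 maximum load factor, and the Object-typed entries are assumptions for illustration, not the asker's exact interface.

```java
// Standalone sketch of quadratic probing with rehash-on-resize.
public class QuadraticProbeSketch {
    static class Entry {
        Object key, value;
        Entry(Object k, Object v) { key = k; value = v; }
    }

    private Entry[] table = new Entry[11];
    private int size = 0;

    public Object put(Object key, Object value) {
        if ((size + 1) / (double) table.length > 0.5) resize();
        int home = Math.abs(key.hashCode()) % table.length;
        for (int q = 0; q < table.length; q++) {
            int index = (home + q * q) % table.length; // quadratic step
            if (table[index] == null) {
                table[index] = new Entry(key, value);
                size++;
                return null;                           // stop right after inserting
            }
            if (table[index].key.equals(key)) {
                Object old = table[index].value;
                table[index].value = value;            // update in place, size unchanged
                return old;
            }
        }
        throw new IllegalStateException("no free slot found");
    }

    public Object get(Object key) {
        int home = Math.abs(key.hashCode()) % table.length;
        for (int q = 0; q < table.length; q++) {
            int index = (home + q * q) % table.length;
            if (table[index] == null) return null;
            if (table[index].key.equals(key)) return table[index].value;
        }
        return null;
    }

    // The modulus changed, so every old entry is re-hashed into the new
    // array; size is reset because put() counts the entries again.
    private void resize() {
        Entry[] old = table;
        table = new Entry[old.length * 2 + 1];
        size = 0;
        for (Entry e : old) {
            if (e != null) put(e.key, e.value);
        }
    }
}
```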
As you can see in the screenshot, new_mean's capacity is 0 even though I've created it with an initial capacity of 2, so I'm getting an index out of bounds exception.
Does anyone know what I'm doing wrong?
Update: Here's the code
private static Vector<Double> get_new_mean(
        Tuple<Set<Vector<Double>>, Vector<Double>> cluster,
        Vector<Double> v, boolean is_being_added) {
    Vector<Double> previous_mean = cluster.y;
    int n = previous_mean.size(), set_size = cluster.x.size();
    Vector<Double> new_mean = new Vector<Double>(n);
    if (is_being_added) {
        for (int i = 0; i < n; ++i) {
            double temp = set_size * previous_mean.get(i);
            double updated_mean = (temp + v.get(i)) / (set_size + 1);
            new_mean.set(i, updated_mean);
        }
    } else {
        if (set_size > 1) {
            for (int i = 0; i < n; ++i) {
                double temp = set_size * previous_mean.get(i);
                double updated_mean = (temp - v.get(i)) / (set_size - 1);
                new_mean.set(i, updated_mean);
            }
        } else {
            new_mean = null;
        }
    }
    return new_mean;
}
Capacity is the total number of elements you could store.
Size is the number of elements you have actually stored.
In your code, there is nothing stored in the Vector, so you get an IndexOutOfBoundsException when you try to access element 0.
Use set(int, object) to change an EXISTING element. Use add(int, object) to add a NEW element.
This is explained in the javadoc for Vector: elementCount is 0 (the vector is empty), and capacityIncrement is 0 by default; it is only relevant if you go over the capacity you specified (2).
You would need to fill your Vector with null values to make its size equal to the capacity. Capacity is an optimization hint for the collection; it makes no difference to how the collection is used. The collection grows automatically as you add elements, and the capacity increases with it. Initializing with a higher capacity simply requires fewer expansions and fewer memory allocations.
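A small sketch of the capacity/size distinction described above; the vector contents are hypothetical.

```java
import java.util.Vector;

// Capacity reserves backing storage; size counts stored elements.
// set(i, x) only works for an index that already holds an element.
public class VectorCapacityDemo {
    public static void main(String[] args) {
        Vector<Double> v = new Vector<>(2); // capacity 2, size 0
        System.out.println(v.size());       // prints 0: nothing stored yet
        // v.set(0, 1.0);                   // would throw ArrayIndexOutOfBoundsException
        v.add(1.0);                         // size becomes 1
        v.set(0, 2.0);                      // legal now: element 0 exists
        System.out.println(v);              // prints [2.0]
    }
}
```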
I have a Graph class with a bunch of nodes, edges, etc., and I'm trying to perform Dijkstra's algorithm. I start off adding all the nodes to a priority queue. Each node has a boolean flag for whether it is already 'known', a reference to the node that comes before it, and an int dist field that stores its distance from the source node. After adding all the nodes to the PQ and then flagging the source node appropriately, I've noticed that the wrong node is pulled off the PQ first. It should be that the node with the smallest dist field comes off first (since they are all initialized to a very high number except for the source, the first node off the PQ should be the source... except it isn't, for some reason).
Below is my code for the algorithm followed by my compare method within my Node class.
public void dijkstra() throws IOException {
    buildGraph_u();
    PriorityQueue<Node> pq = new PriorityQueue<>(200, new Node());
    for (int y = 0; y < input.size(); y++) {
        Node v = input.get(array.get(y));
        v.dist = 99999;
        v.known = false;
        v.prnode = null;
        pq.add(v);
    }
    source.dist = 0;
    source.known = true;
    source.prnode = null;
    int c = 1;
    while (c != input.size()) {
        Node v = pq.remove();
        //System.out.println(v.name);
        //^ Prints a node that isn't the source
        v.known = true;
        c++;
        List<Edge> listOfEdges = getAdjacent(v);
        for (int x = 0; x < listOfEdges.size(); x++) {
            Edge edge = listOfEdges.get(x);
            Node w = edge.to;
            if (!w.known) {
                int cvw = edge.weight;
                if (v.dist + cvw < w.dist) {
                    w.dist = v.dist + cvw;
                    w.prnode = v;
                }
            }
        }
    }
}
public int compare(Node d1, Node d2) {
    int dist1 = d1.dist;
    int dist2 = d2.dist;
    if (dist1 > dist2)
        return 1;
    else if (dist1 < dist2)
        return -1;
    else
        return 0;
}
Can anyone help me find the issue with my PQ?
A priority queue relies on the assumption that an element's ordering does not change after it has been inserted.
So instead of inserting all of the elements into the priority queue up front, you can:
Start with just one node, the source.
Loop while the priority queue is not empty.
Do nothing if the popped element is already "known".
Whenever you find a smaller distance, add the node to the priority queue with the "right" weight.
So you need to store something else in the priority queue, a pair: the distance at insertion time, and the node itself.
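The steps above can be sketched as follows; the adjacency format (adj[v] = array of {to, weight} edges) is an assumption for the sketch, not the asker's Graph class.

```java
import java.util.Arrays;
import java.util.PriorityQueue;

// Lazy-insertion Dijkstra: a node enters the queue paired with the
// distance it had at insertion time, so the ordering of queued entries
// never has to change; stale pairs are simply skipped when popped.
public class DijkstraSketch {
    public static int[] dijkstra(int[][][] adj, int source) {
        int[] dist = new int[adj.length];
        Arrays.fill(dist, Integer.MAX_VALUE);
        dist[source] = 0;

        // Pair {distance at insertion, node}, ordered by distance.
        PriorityQueue<int[]> pq = new PriorityQueue<>((a, b) -> Integer.compare(a[0], b[0]));
        pq.add(new int[] {0, source});               // start with just the source

        while (!pq.isEmpty()) {
            int[] top = pq.remove();
            int d = top[0], v = top[1];
            if (d > dist[v]) continue;               // stale pair: v already "known"
            for (int[] e : adj[v]) {
                int w = e[0], weight = e[1];
                if (dist[v] + weight < dist[w]) {
                    dist[w] = dist[v] + weight;      // smaller distance found:
                    pq.add(new int[] {dist[w], w});  // re-insert with the right weight
                }
            }
        }
        return dist;
    }
}
```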