Reputation: 409
I can't figure out how to double the size of my hash table. Here is the code:
private void doubleLength () {
    // Remember the old hash table array and allocate a new one 2 times as big
    HashMap<K,V> resizedMap = new HashMap<K,V>(map.length * 2);
    /* Traverse the old hash table, adding each value to the new hash table.
       Add each entry by applying hashing/compression again (compression will
       be DIFFERENT, because the length of the table is doubled, so we compute
       the same hash value but compute the remainder using the DIFFERENT
       TABLE LENGTH). */
    for (int i = 0; i < map.length; i++) {
        for (K key : map[i].entry) { // iterator does not work here
            resizedMap.put(key, map[i].get(key)); // should go here
        }
    }
The hash table is an array of LN objects where LN is defined by:
public static class LN<K1,V1> {
    public Map.Entry<K1,V1> entry;
    public LN<K1,V1> next;

    public LN (Map.Entry<K1,V1> e, LN<K1,V1> n)
    { entry = e; next = n; }
}
I have an iterable within my class but it doesn't allow for map[i].entry.entries().
public Iterable<Map.Entry<K,V>> entries () {
    return new Iterable<Map.Entry<K,V>>() {
        public Iterator<Map.Entry<K,V>> iterator() {
            return new MapEntryIterator();
        }
    };
}
I'm very lost on how I can double the size of public LN[] map;
Upvotes: 0
Views: 741
Reputation: 361
The HashMap already resizes itself when the hash table gets too full. You do not have to resize it yourself.
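To illustrate, here is a small sketch (class and variable names are my own) showing that a HashMap started with a deliberately tiny initial capacity still absorbs far more entries than that capacity, because it rehashes into a larger table on its own:

```java
import java.util.HashMap;
import java.util.Map;

public class AutoResizeDemo {
    // Fill a map that started with a tiny initial capacity; HashMap
    // rehashes into a larger internal table on its own once the load
    // factor threshold is crossed, so no manual resizing is needed.
    static Map<Integer, String> fill(int n) {
        Map<Integer, String> map = new HashMap<>(2); // deliberately small
        for (int i = 0; i < n; i++) {
            map.put(i, "value-" + i);
        }
        return map;
    }

    public static void main(String[] args) {
        Map<Integer, String> map = fill(1000);
        System.out.println(map.size());   // 1000
        System.out.println(map.get(500)); // value-500
    }
}
```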
Upvotes: 3
Reputation: 14212
Your code will not compile. If you want to initialize a map at double the size, it is easier to do this (assuming map is also a Map):
private void doubleLength () {
    // Remember the old hash table and allocate a new one 2 times as big
    HashMap<K,V> resizedMap = new HashMap<K,V>(map.size() * 2);
    resizedMap.putAll(map);
}
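As a sketch of what that copy does (the method and names here are placeholders, not the asker's actual class), the new map ends up holding exactly the same entries as the original:

```java
import java.util.HashMap;
import java.util.Map;

public class CopyDemo {
    // Copy all entries into a map sized at twice the original's entry
    // count, mirroring the doubleLength idea above.
    static Map<String, Integer> doubled(Map<String, Integer> original) {
        Map<String, Integer> resized = new HashMap<>(original.size() * 2);
        resized.putAll(original);
        return resized;
    }

    public static void main(String[] args) {
        Map<String, Integer> m = new HashMap<>();
        m.put("a", 1);
        m.put("b", 2);
        Map<String, Integer> copy = doubled(m);
        System.out.println(copy.equals(m)); // true: same entries
    }
}
```

Note that Map.equals compares by entries, so the copy is equal to the original even though its initial capacity differs.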
Also, you seem to be accessing things strangely. If you need to loop through a map, it should look like:
for (K key : map.keySet()) {
    V value = map.get(key); // get the value from the map
    // do what you need to do
}
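As an aside, if both key and value are needed, iterating entrySet() avoids the extra get() lookup per key; a minimal example (names are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

public class EntrySetDemo {
    // Sum the values by walking entrySet(), which hands back each key
    // and value together instead of requiring a get(key) per iteration.
    static int sumValues(Map<String, Integer> map) {
        int total = 0;
        for (Map.Entry<String, Integer> e : map.entrySet()) {
            total += e.getValue();
        }
        return total;
    }

    public static void main(String[] args) {
        Map<String, Integer> m = new HashMap<>();
        m.put("x", 1);
        m.put("y", 2);
        m.put("z", 3);
        System.out.println(sumValues(m)); // 6
    }
}
```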
Now as already stated, you do not need to resize the HashMap. It already does that. From the JavaDocs:
An instance of HashMap has two parameters that affect its performance: initial capacity and load factor. The capacity is the number of buckets in the hash table, and the initial capacity is simply the capacity at the time the hash table is created. The load factor is a measure of how full the hash table is allowed to get before its capacity is automatically increased. When the number of entries in the hash table exceeds the product of the load factor and the current capacity, the hash table is rehashed (that is, internal data structures are rebuilt) so that the hash table has approximately twice the number of buckets.
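Both parameters from the quoted docs can be set explicitly through the two-argument constructor; the values below are arbitrary examples, not recommendations:

```java
import java.util.HashMap;
import java.util.Map;

public class TuningDemo {
    public static void main(String[] args) {
        // 64 buckets to start; the table is rehashed once the number of
        // entries exceeds 64 * 0.5 = 32 (initial capacity * load factor).
        Map<String, Integer> map = new HashMap<>(64, 0.5f);
        map.put("k", 42);
        System.out.println(map.get("k")); // 42
    }
}
```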
Upvotes: 0