Reputation: 29952
Is it possible to have non-lazy deletion (without tombstones) in open-addressed hash tables with a collision resolution scheme other than linear probing (but still open addressing)?
For linear probing, there is an algorithm here. Is there a corresponding algorithm for non-lazy deletion when quadratic probing or double hashing is used?
Upvotes: 1
Views: 2186
Reputation: 2405
If I save the starting cell of the chain together with the value upon insertion, I can just keep searching until I reach the last occupied cell of the same chain, then swap it with the value I want to delete (if I encountered that value earlier) and clear the last cell. I'm not sure why tombstones are needed at all.
Upvotes: 0
Reputation: 81
The algorithm on the wiki page is confusing and incomplete. Here is a better version, with an optimized test for checking whether k lies outside the cyclic range (i, j], taking into account that j may have wrapped around:
function remove(key): boolean
    i := find_slot(key)
    if not slot[i].used
        return false                    // key is not in the table
    j := i
    loop
        j := (j + 1) modulo num_slots
        if not slot[j].used or j = i    // empty slot, or full cycle if table was 100% full
            break loop
        k := hash(slot[j].key) modulo num_slots
        // move slot[j] back into the hole at i unless its home slot k
        // lies cyclically in (i, j], in which case it must stay where it is
        if (j < i) xor (k <= i) xor (k > j)
            slot[i] := slot[j]
            i := j
    endloop
    slot[i].used := false               // vacate the last slot of the shifted run
    num_used := num_used - 1            // decrement the count of stored entries (not num_slots)
    return true
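For reference, a runnable translation of the above into Python might look like the following; the table layout (a list of (key, value) pairs or None) and the find_slot helper are assumptions for this sketch, not part of the original answer.

def remove(table, key, hash_fn):
    m = len(table)
    i = find_slot(table, key, hash_fn)   # assumed helper: slot holding key, or an empty slot
    if table[i] is None:
        return False                     # key is not in the table
    j = i
    while True:
        j = (j + 1) % m
        if table[j] is None or j == i:   # empty slot, or full cycle if the table was 100% full
            break
        k = hash_fn(table[j][0]) % m     # home slot of the entry currently at j
        # shift table[j] back unless its home lies cyclically in (i, j]
        if (j < i) ^ (k <= i) ^ (k > j):
            table[i] = table[j]
            i = j
    table[i] = None                      # vacate the final slot of the shifted run
    return True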
Upvotes: 4
Reputation: 19601
The problem with deletion by pure removal is that empty slots may cause later searches to terminate before finding an item that really is in the table. If you maintain a counter giving the maximum number of probes taken by any insertion, and terminate each unsuccessful search only after that many probes, then you can delete by simply removing items from their slots; of course, unsuccessful searches become more expensive.
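A minimal sketch of that idea, using assumed names and a toy double-hashing scheme (none of which come from the answer itself): the table records the longest probe run any insertion has needed, deletion simply clears the slot, and an unsuccessful search gives up only after that many probes instead of stopping at the first empty slot.

class ProbeBoundedTable:
    def __init__(self, size=11):                 # prime size so double hashing visits every slot
        self.slots = [None] * size               # (key, value) pairs or None
        self.max_probes = 0                      # longest probe run of any insertion so far

    def _probe(self, key, k):
        m = len(self.slots)
        h1 = hash(key) % m
        h2 = 1 + hash((key, "step")) % (m - 1)   # step in [1, m-1]
        return (h1 + k * h2) % m

    def insert(self, key, value):
        # For brevity this sketch assumes the key is not already present.
        for k in range(len(self.slots)):
            i = self._probe(key, k)
            if self.slots[i] is None:
                self.slots[i] = (key, value)
                self.max_probes = max(self.max_probes, k + 1)
                return
        raise RuntimeError("table full")

    def lookup(self, key):
        # A failed search cannot stop at an empty slot (it may be a deletion hole),
        # so it terminates only after max_probes probes.
        for k in range(self.max_probes):
            i = self._probe(key, k)
            entry = self.slots[i]
            if entry is not None and entry[0] == key:
                return entry[1]
        return None

    def delete(self, key):
        for k in range(self.max_probes):
            i = self._probe(key, k)
            entry = self.slots[i]
            if entry is not None and entry[0] == key:
                self.slots[i] = None             # plain removal, no tombstone
                return True
        return False

The cost shows up exactly where the answer says: every unsuccessful lookup now makes max_probes probes even when the table is nearly empty.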
Upvotes: 2
Reputation: 241701
There is no such algorithm for any non-linear probing scheme that would be worth using. The technique works for linear probing because the probe sequence is reversible, and if the probe sequence is reversible, then all elements follow the same probe sequence (they merely start at different points along it, determined by their initial hash). In that case the secondary hash does nothing to prevent probe sequences from converging, which leads to the clustering of occupied slots that characterises linear probing.
In other words, any probing algorithm that allows deletion by moving non-deleted elements backwards along the probe sequence will have the same sensitivity to load factor as linear probing, but without the locality of reference that linear probing provides.
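To make the convergence point concrete, here is an illustrative sketch (the helper names and h1/h2 are assumptions, not from the answer) of what shifting an element back would require under double hashing: because each key follows its own step size, the only way to tell whether the entry at slot j may be moved into a vacated slot i is to replay j's entire probe sequence, and every occupied slot in the table is a potential candidate rather than just the contiguous run after i that linear probing has to examine.

def could_fill(table, i, j, h1, h2):
    # True if the entry at slot j could legally be moved into the vacated
    # slot i, i.e. if i occurs on j's own probe sequence before j itself.
    # Assumes a prime table size so the probe sequence visits every slot.
    m = len(table)
    key = table[j][0]
    step = 1 + h2(key) % (m - 1)        # double-hashing step, never zero
    pos = h1(key) % m
    while pos != j:                     # replay j's probe sequence up to j
        if pos == i:
            return True                 # i precedes j, so the entry may shift back
        pos = (pos + step) % m
    return False

def candidates_to_fill(table, i, h1, h2):
    # Unlike linear probing, there is no contiguous cluster to scan: every
    # occupied slot has to be checked against its own probe sequence.
    return [j for j in range(len(table))
            if j != i and table[j] is not None and could_fill(table, i, j, h1, h2)]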
Upvotes: 3