Reputation: 1504
Has anyone ever heard of this heap repair technique: SloppyHeapSort? It uses a "Sloppy" sift-down approach. Basically, it takes the element to be repaired and moves it to the bottom of the heap by repeatedly swapping it with its larger child (comparing the two children to each other, but never against the element itself) until it hits the bottom. Then, sift-up is called until it reaches its correct location. This makes just over lg n comparisons (in a heap of size n).
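In code, this is roughly what I have in mind (Python, max-heap stored in a list; the function name is my own, not from any reference):

```python
def sloppy_repair(heap, n):
    """Repair a max-heap of size n whose root (index 0) may be out of place."""
    # Phase 1: push the root element all the way down, always swapping with
    # the larger child; the only comparison per level is child vs. child.
    i = 0
    child = 1
    while child < n:
        if child + 1 < n and heap[child + 1] > heap[child]:
            child += 1
        heap[i], heap[child] = heap[child], heap[i]
        i = child
        child = 2 * i + 1
    # Phase 2: sift the element back up until it is no larger than its parent.
    while i > 0:
        parent = (i - 1) // 2
        if heap[parent] >= heap[i]:
            break
        heap[i], heap[parent] = heap[parent], heap[i]
        i = parent
```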
However, this cannot be used for heap construction, only for heap repair. Why is this? I don't understand why it wouldn't work if you were trying to build a heap.
Upvotes: 3
Views: 184
Reputation: 241911
The algorithm, if deployed properly, could certainly be used as part of the heap construction algorithm. It is slightly complicated by the fact that during heap construction, the root of the subheap being repaired is not the beginning of the array, which affects the implementation of sift-up: it needs to stop when it reaches the root of the subheap being repaired, rather than continuing up to the top of the heap.
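As an illustration, here is one way the repair might be generalized to a subheap and used for bottom-up construction; the only changes from the question's description are the `root` parameter and the stopping condition of the sift-up (a Python max-heap sketch, with names of my own):

```python
def sloppy_repair(heap, root, n):
    """Repair the subheap rooted at `root`, assuming its child subtrees are heaps."""
    # Push heap[root] down along the larger-child path until it reaches a leaf.
    i = root
    child = 2 * i + 1
    while child < n:
        if child + 1 < n and heap[child + 1] > heap[child]:
            child += 1
        heap[i], heap[child] = heap[child], heap[i]
        i = child
        child = 2 * i + 1
    # Sift it back up, stopping at `root` rather than at the top of the array.
    while i > root:
        parent = (i - 1) // 2
        if heap[parent] >= heap[i]:
            break
        heap[i], heap[parent] = heap[parent], heap[i]
        i = parent

def build_heap(heap):
    """Bottom-up heap construction, calling the repair on every internal node."""
    n = len(heap)
    for root in range(n // 2 - 1, -1, -1):
        sloppy_repair(heap, root, n)
```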
It should be noted that the algorithm has the same asymptotic performance as the standard heap-repair algorithm; however, it probably involves fewer comparisons. In part, this is because the standard heap-repair algorithm is called after swapping the root of the heap (the largest element) for the last element in the heap array.
The last element is not necessarily the smallest element in the heap, but it is certainly likely to be close to the bottom. After the swap, the standard algorithm will move the swapped element down by as many as log2N levels, with each step requiring two comparisons; because the element is likely to belong near the bottom of the heap, most of the time the maximum number of comparisons will be performed. But occasionally, only two or four comparisons might be performed.
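For contrast, a sketch of the standard repair; note the two comparisons at every level of the descent (again an illustrative max-heap sketch, not from any particular source):

```python
def standard_repair(heap, n):
    """Classic sift-down from the root: one comparison picks the larger child,
    a second compares it against the element being sifted."""
    i = 0
    while True:
        child = 2 * i + 1
        if child >= n:
            break
        if child + 1 < n and heap[child + 1] > heap[child]:
            child += 1                      # comparison 1: child vs. child
        if heap[i] >= heap[child]:          # comparison 2: element vs. larger child
            break
        heap[i], heap[child] = heap[child], heap[i]
        i = child
```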
The "sloppy" algorithm instead starts by moving the "hole" from the top of the heap to somewhere near the bottom (log2N comparisons) and then moving the last element up until it finds it home, which will usually take only a few comparisons (but could, in the worst case, take nearly log2N comparisons).
Now, in the case of heapify, heap repair is performed not with the last element in the subheap, but rather with a previously unseen element taken from the original vector. This actually doesn't change the average performance analysis much, because if you start heap repair with a random element, instead of an element likely to be small, the expected number of sift-down operations is still close to the maximum. (Half of the heap is in the last level, so the probability of needing the maximum number of sift-downs for a random element is one-half.)
While the sloppy algorithm (probably) improves the number of element comparisons, it increases the number of element moves. The classic algorithm performs at most log2N swaps, while the sloppy algorithm performs at least log2N swaps, plus the additional ones during sift-up. (In both cases, the swaps can be improved to moves by not inserting the new element until its actual position is known, halving the number of memory stores.)
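To illustrate that parenthetical, the sloppy repair can be written with moves instead of swaps by holding the displaced element aside and writing it only once its final slot is known (same assumptions and made-up names as the sketches above):

```python
def sloppy_repair_moves(heap, n):
    """Sloppy repair of the root using single moves rather than swaps."""
    if n == 0:
        return
    item = heap[0]              # hold the displaced element aside
    hole = 0
    child = 1
    while child < n:            # walk the hole down the larger-child path
        if child + 1 < n and heap[child + 1] > heap[child]:
            child += 1
        heap[hole] = heap[child]
        hole = child
        child = 2 * hole + 1
    while hole > 0:             # walk the hole back up until `item` fits
        parent = (hole - 1) // 2
        if heap[parent] >= item:
            break
        heap[hole] = heap[parent]
        hole = parent
    heap[hole] = item           # a single store instead of a swap at each step
```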
As a postscript, I wasn't able to find any reference to your "sloppy" algorithm. On the whole, when asking about a proposed algorithm it is generally better to include a link.
Upvotes: 3
Reputation: 71009
There is a linear-time algorithm to construct a heap. I believe what the author meant is that building a heap with this approach is not efficient and that better algorithms exist. Of course you can build a heap by adding the elements one by one using the described strategy; you simply can do better.
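For example, with Python's heapq module (a min-heap, unlike the max-heaps discussed above), the two construction strategies look like this; heapq.heapify is the linear-time build, while pushing one element at a time costs O(n log n):

```python
import heapq
import random

data = [random.random() for _ in range(100000)]

# Linear-time bottom-up construction.
a = list(data)
heapq.heapify(a)

# O(n log n): building the heap by inserting one element at a time.
b = []
for x in data:
    heapq.heappush(b, x)
```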
Upvotes: 1