Reputation: 553
Does Kruskal's algorithm have a lower bound? Since we sort the edges... Everywhere I see O(m log n)
Upvotes: 0
Views: 168
Reputation: 372814
Kruskal's algorithm proceeds in two stages:

1. Sort the edges by weight.
2. Scan the edges in sorted order, using a union-find (disjoint-set) structure to add an edge to the MST exactly when its endpoints lie in different components.
The runtime cost of step (1) depends on what sorting algorithm is used. For example, if you use quicksort, then step (1) will take Ω(m log n) time in the best case and O(m²) time in the worst case. If you use mergesort, then step (1) will take Ω(m) time and O(m log n) time. If you use a radix sort, and the edge weights range from 0 to U, then step (1) will take Θ(m log U) time. But because this depends on the sorting algorithm used and the particulars of the data fed into the algorithm, we can't give a strong lower bound. (The best lower bound we could give would be Ω(m), since you have to process each edge at least once.)
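To make the radix-sort option concrete, here is a minimal sketch (not from the original answer) of an LSD radix sort over edges with non-negative integer weights; the `(weight, u, v)` tuple format and the base of 256 are assumptions for illustration. It does about log₂₅₆(U) stable passes over the m edges, which is the Θ(m log U) behavior described above.

```python
def radix_sort_edges(edges, max_weight):
    """Sort (weight, u, v) tuples by weight with an LSD radix sort, base 256.
    Roughly log_256(max_weight) stable counting passes over m edges,
    i.e. Theta(m log U) total work."""
    base = 256
    shift = 0
    while shift == 0 or (max_weight >> shift) > 0:
        buckets = [[] for _ in range(base)]
        for edge in edges:
            buckets[(edge[0] >> shift) & (base - 1)].append(edge)  # stable per pass
        edges = [edge for bucket in buckets for edge in bucket]
        shift += 8
    return edges

# Example: radix_sort_edges([(300, 0, 1), (5, 1, 2), (40, 0, 2)], 300)
# returns [(5, 1, 2), (40, 0, 2), (300, 0, 1)]
```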
The runtime cost of step (2) is O(mα(m, n)), where α(m, n) is the inverse Ackermann function, and there is a matching lower bound of Ω(mα(m, n)) here.
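For illustration, here is one possible sketch of the disjoint-set (union-find) structure behind that bound, with union by rank and path halving (a standard variant of path compression); the class and method names are mine, not part of any particular library.

```python
class DisjointSet:
    """Union-find with union by rank and path halving; a sequence of m
    operations on n elements costs O(m * alpha(m, n)) amortized."""

    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        # Path halving: every other node on the path is pointed at its grandparent.
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False               # a and b were already in the same component
        if self.rank[ra] < self.rank[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra           # attach the shorter tree under the taller one
        if self.rank[ra] == self.rank[rb]:
            self.rank[ra] += 1
        return True
```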
So overall the cost of Kruskal's algorithm is "the cost of sorting, plus Θ(mα(m, n))."
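Putting the two stages together, a minimal Kruskal sketch might look like this; it reuses the `DisjointSet` class sketched above, and the `(weight, u, v)` edge format and 0-based vertex labels are assumptions for the example. The `sorted` call is the "cost of sorting" term (here a comparison sort, O(m log m) = O(m log n)), and the union-find loop is the Θ(mα(m, n)) term.

```python
def kruskal(n, edges):
    """n: number of vertices labelled 0..n-1; edges: list of (weight, u, v)."""
    mst = []
    dsu = DisjointSet(n)
    for w, u, v in sorted(edges):      # step (1): sort the edges by weight
        if dsu.union(u, v):            # step (2): keep the edge iff it joins two components
            mst.append((u, v, w))
            if len(mst) == n - 1:      # a spanning tree of a connected graph has n - 1 edges
                break
    return mst

# Example: a 4-vertex graph; the MST found here has total weight 1 + 2 + 3 = 6.
print(kruskal(4, [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 1, 3), (5, 2, 3)]))
```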
Upvotes: 1