Reputation: 51
There is a comparison-based sorting algorithm that runs in O(n*log(sqrt(n))). Given the Omega(n*log(n)) lower bound for comparison-based sorting, how is this possible?
Upvotes: 1
Views: 650
Reputation: 2672
For any comparison-based sorting algorithm you can draw a decision tree: a binary tree in which each internal node is a comparison made by the algorithm and each leaf is a permutation of the input set.
There are n! possible permutations, where n is the size of the set, and only one of them is the sorted order. The path from the root to a leaf represents the comparisons needed to reach the permutation at that leaf.
Now let h be the height of the decision tree and l the number of leaves. Every permutation of the input set must appear at some leaf, so n! <= l. A binary tree of height h has at most 2^h leaves, so n! <= l <= 2^h. Taking logarithms of both sides gives h >= log(n!), and log(n!) is Omega(n*log(n)).
Because the height of the decision tree is the number of comparisons needed in the worst case, this proves that the lower bound for comparison-based sorting is Omega(n*log(n)); no comparison sort can do better. So the only way the claim can be correct is if Omega(n*log(n)) is the same as Omega(n*log(sqrt(n))). Indeed: log(sqrt(n)) = log(n^(1/2)) = (1/2)*log(n), so n*log(sqrt(n)) = (1/2)*n*log(n). Ignoring the constant 1/2 (asymptotic complexity discards constant factors), n*log(sqrt(n)) and n*log(n) have the same order of growth.
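As a quick sanity check of the decision-tree argument (a sketch, not part of the original answer; `min_comparisons` is a name I made up), you can compute the information-theoretic minimum number of comparisons, ceil(log2(n!)), and see that it grows like n*log2(n):

```python
import math

def min_comparisons(n):
    # A decision tree distinguishing all n! permutations needs
    # height at least log2(n!), i.e. ceil(log2(n!)) comparisons.
    return math.ceil(math.log2(math.factorial(n)))

for n in [10, 100, 1000]:
    print(n, min_comparisons(n), round(n * math.log2(n)))
```

For growing n the two columns stay within a constant factor of each other, which is exactly what log(n!) = Omega(n*log(n)) says.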
Upvotes: 0
Reputation: 411
Basically, this problem is asking you to prove that O(n*log(n)) = O(n*log(√n)). Remember that √n = n^(1/2) and that log(n^(1/2)) = 1/2*log(n). So n*log(√n) = 1/2*n*log(n). Since asymptotic notation ignores constant multipliers, O(1/2*n*log(n)) = O(n*log(n)). Voila, the two bounds describe the same class, so there is no contradiction.
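A small numeric illustration of the identity (a sketch of my own, not from the problem): the ratio n*log(n) / (n*log(√n)) is exactly the constant 2 that big-O notation absorbs.

```python
import math

# n*log2(sqrt(n)) = (1/2)*n*log2(n), so the ratio below is always 2.
for n in [16, 1024, 10**6]:
    a = n * math.log2(n)
    b = n * math.log2(math.sqrt(n))
    print(n, a / b)
```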
Upvotes: 7