Programmer

Reputation: 6753

Is worst-case analysis not equal to asymptotic bounds?

Can someone explain to me why this is true? I heard a professor mention this in his lecture.

Upvotes: 3

Views: 723

Answers (3)

Alexandre C.

Reputation: 56976

The two notions are orthogonal.

You can have worst-case asymptotics. If f(n) denotes the worst-case time taken by a given algorithm on inputs of size n, you can have e.g. f(n) = O(n^3) or other asymptotic upper bounds on the worst-case time complexity.

Likewise, you can have g(n) = O(n^2 log n), where g(n) is the average time taken by the same algorithm on (say) uniformly distributed (random) inputs of size n.

Or you can have h(n) = O(n), where h(n) is the average time taken by the same algorithm on particularly distributed random inputs of size n (e.g. almost-sorted sequences for a sorting algorithm).

Asymptotic notation is a "measure". You have to specify what you want to count: worst case, best case, average, etc.

Sometimes, you are interested in stating asymptotic lower bounds of (say) the worst-case complexity. Then you write f(n) = Omega(n^2) to state that in the worst case, the complexity is at least n^2. The big-Omega notation is the converse of big-O: f = Omega(g) if and only if g = O(f).
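To make the distinction concrete, here is a small sketch (mine, not part of the answer; insertion sort is chosen purely for illustration) that counts the element swaps insertion sort performs. The same algorithm has worst-case cost ~n²/2 swaps on reversed input and best-case cost 0 swaps on sorted input, so "the" asymptotic bound depends on which case you measure:

```python
def insertion_sort_steps(xs):
    """Sort a copy of xs; return (sorted list, number of element swaps)."""
    xs = list(xs)
    steps = 0
    for i in range(1, len(xs)):
        j = i
        # Shift xs[i] leftwards until it is in place, counting each swap.
        while j > 0 and xs[j - 1] > xs[j]:
            xs[j - 1], xs[j] = xs[j], xs[j - 1]
            steps += 1
            j -= 1
    return xs, steps

n = 100
_, worst = insertion_sort_steps(range(n - 1, -1, -1))  # reversed input: worst case
_, best = insertion_sort_steps(range(n))               # sorted input: best case
print(worst, best)  # 4950 0  (i.e. n*(n-1)//2 swaps vs. none)
```

So insertion sort is Theta(n^2) in the worst case and Theta(n) in the best case; both statements use asymptotic notation, applied to different quantities.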

Upvotes: 4

GK80

Reputation: 51

Take quicksort as an example. A recursive call of quicksort on an input of size n has a run-time complexity T(n) of

T(n) = O(n) + 2 T[ (n-1)/2 ]

in the 'best case', where the unsorted input list is split into two equal sublists of size (n-1)/2 in each call. Solving for T(n) gives O(n log n) in this case. If the partition is not perfect and the two sublists are not of equal size, i.e.

T(n) = O(n) + T(k) + T(n - 1 - k),

we still obtain O(n log n) as long as k is at least a constant fraction of n (e.g. k ≥ n/10), just with a larger constant factor. This is because the recursion depth stays logarithmic as long as each call splits off at least a constant fraction of the list.

However, in the 'worst case' no real division of the input list takes place, i.e.:

T(n) = O(n) + T(0) + T(n - 1) = O(n) + O(n-1) + T(n-2) = ... .

This happens, e.g., if we take the first element of a sorted list as the pivot element.

Here, T(0) means one of the resulting sublists is empty and therefore takes no computing time (since it has zero elements). All the remaining work T(n-1) falls on the second sublist. Summing O(n) + O(n-1) + ... + O(1), we obtain O(n²) in this case.

If an algorithm had no worst-case scenario, it would not only be O[f(n)] but also o[f(n)] (big-O vs. little-o notation).

Upvotes: 0

Navi

Reputation: 8756

The asymptotic bound is the expected behaviour as n goes to infinity; mathematically, it is just the limit as n goes to infinity. The worst-case behaviour, however, applies to a finite number of operations.

Upvotes: -1
