Ananya Nayak

Reputation: 33

Use of asymptotic notation

Question-based on asymptotic notation


I have a doubt about this particular question. The answer says that a "big-Oh(n^2) algorithm will not run faster than big-Oh(n^3) algorithm", whereas if the notation in both cases were Theta instead, the claim would have been true. Why is that so?

I would love it if anyone could explain this to me in detail, because I couldn't find any source that clarified my doubt.

Upvotes: 0

Views: 430

Answers (2)

Paul Hankin

Reputation: 58221

Part 1 paraphrased (note: the phrasing in the question is ambiguous about where "number" is quantified; it has to be picked after the two algorithms are chosen, but I assume that's what's intended).

Given functions f and g with f=Theta(n^2) and g=Theta(n^3), then there exists a number N such that f(n) < g(n) for all n>N.

Part 2 paraphrased:

Given functions f and g with f=O(n^2) and g=O(n^3), then for all n, f(n) < g(n).

Claim 1 is true, and you can prove it by applying the definition of big-Theta.

Claim 2 is false (as a general statement), and you can disprove it by finding a single pair of functions f and g for which it fails. For example, f(n) = 2 and g(n) = 1. Big O is only an upper bound, so these constant functions qualify, and yet f(n) > g(n) for every n. The counterexamples given in the question are f(n) = n and g(n) = log(n), but the same principle applies.
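To make both claims concrete, here is a small Python sketch (the specific functions are chosen purely for illustration): the constant functions f(n) = 2 and g(n) = 1 are trivially O(n^2) and O(n^3) respectively, yet f is larger everywhere, which refutes claim 2; meanwhile the tight-bound version of claim 1 holds, since n^2 < n^3 for every n > 1.

```python
# Counterexample to claim 2: big O gives only an upper bound.
# f(n) = 2 is O(n^2) and g(n) = 1 is O(n^3), yet f(n) > g(n) for all n.
def f(n):
    return 2  # constant, hence trivially O(n^2)

def g(n):
    return 1  # constant, hence trivially O(n^3)

assert all(f(n) > g(n) for n in range(1, 1000))

# Claim 1 with tight (Theta) bounds: n^2 < n^3 once n > 1,
# so here the threshold N in "for all n > N" can be taken as 1.
assert all(n**2 < n**3 for n in range(2, 1000))
```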

Upvotes: 1

trincot

Reputation: 350147

the answer says that big-Oh(n^2) algorithm will not run faster than big-Oh(n^3) algorithm

It is more subtle: an O(𝑛²) algorithm could run slower than an O(𝑛³) algorithm. It is not will, but could.

The answer gives one reason, but actually there are two:

  1. Big O notation only gives an upper bound. Quoted from Wikipedia:

    A description of a function in terms of big O notation usually only provides an upper bound on the growth rate of the function.

    So anything that is O(𝑛) is also O(𝑛²) and O(𝑛³), but not necessarily vice versa. The answer says that an algorithm that is O(𝑛³) could maybe have a tighter bound that is O(log𝑛). True, this may sound silly, because why should one then say it is O(𝑛³) when it is also O(log𝑛)? Then it seems more reasonable to just talk about O(log𝑛). And this is what we commonly do. But there is also a second reason:

  2. The second option does not have the constraint "n > number" in its claim. This is essential because, irrespective of time complexities, the running time of an algorithm for a given value of 𝑛 cannot be determined from its time complexity. An algorithm that is O(𝑛log𝑛) may take 10 seconds to do its job, while an algorithm that is O(𝑛²) may take 8 seconds to get the same result, even though its time complexity is worse. When comparing time complexities you only get information about asymptotic behaviour, i.e. when 𝑛 is larger than a large enough number.

    Because this extra constraint is part of the first claim, it is indeed true.
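The second point above can be illustrated with a small Python sketch. The running-time formulas below are invented for the example (the constant factor 100 is an assumption, not taken from any real algorithm): an O(𝑛log𝑛) algorithm with a large constant is slower than an O(𝑛²) algorithm for small 𝑛, but the asymptotically better one wins once 𝑛 is large enough.

```python
import math

# Hypothetical running times in arbitrary units (constants invented
# for illustration): t_nlogn is O(n log n) but with a large constant
# factor, t_n2 is O(n^2) with constant factor 1.
def t_nlogn(n):
    return 100 * n * math.log2(n)

def t_n2(n):
    return n * n

# For small n, the algorithm with the "worse" complexity is faster...
assert t_n2(10) < t_nlogn(10)
# ...but beyond some threshold ("n > number"), asymptotics take over.
assert t_nlogn(10_000) < t_n2(10_000)
```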

Upvotes: 0
