SherlockHolmesKePapa

Reputation: 643

How is the time complexity of the following function O(n³)?

I was going through the Algorithm lecture series of Stanford University on Coursera by Tim Roughgarden. There, he gave a multiple choice question to find the running time complexity of a function, which is given below in the image.

[Image: a multiple-choice question about the running time T(n) = n^2/2 + 3n, with options giving bounds in terms of Ω, Θ, and O(n^3)]

In this question, the last three options are correct. I understand how the first two options, for Ω (Omega) and Θ (Theta), are correct just by looking at the function, but I am unable to grasp how the last option is correct, because as far as I can tell the running time complexity should be O(n^2) instead of O(n^3).

Can anyone please explain where I am wrong?

Upvotes: 3

Views: 8380

Answers (5)

Cilenco

Reputation: 7179

O(n^2) is a subset of O(n^3). The Wikipedia article on Big O notation covers this well.

In fact, f(n) is in O(g(n)) if lim |f(n)|/g(n) (as n goes to infinity) is less than infinity. Here it would be

1/2*n^2 + 3n is in O(n^3)
<=> lim |1/2*n^2 + 3n| / n^3 < infinity
 =  lim |n*(n/2 + 3)| / n^3
 =  lim |n/2 + 3| / n^2

As n goes to infinity, n^2 grows faster than n/2 + 3, so for positive n this limit goes to 0, which is less than infinity.

Another way to prove this: can you find a k and a c (both positive) such that f(n) is always smaller than k*g(n) + c? (Not for all n; it is enough if this holds for some n0 and all n greater than n0.) Here you can choose k = 1 and c = 0, because, as mentioned before, n^2 <= n^3 for all n >= 1.
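As a quick sanity check of such constants (a sketch of my own, assuming the question's function T(n) = n^2/2 + 3n), you can verify the bound numerically:

```python
# Sanity check: with k = 1 and c = 0, T(n) = n^2/2 + 3n stays below
# k * n^3 + c for every n >= n0 = 2 (the bound is tight at n = 2: 8 <= 8).
def T(n):
    return 0.5 * n**2 + 3 * n

k, c, n0 = 1, 0, 2
assert all(T(n) <= k * n**3 + c for n in range(n0, 10_000))
print("T(n) <= n^3 holds for all checked n >= 2")
```

Of course a finite check is not a proof, but it is a useful way to catch a wrong choice of constants.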

Here is a nice picture of Big O notation as a Venn diagram.

Upvotes: 1

cadaniluk

Reputation: 15229

First off, I recommend not thinking of asymptotic bounds as a part of computer science; they are a mathematical tool. Do not associate "Big O" strictly with "worst case" or the like. Big O gives an asymptotic upper bound. That's it. It happens to be useful in computer science for describing the worst-case running time of an algorithm, but it is math, not computer science, that describes how it works.
But that's just my opinion, mind you.


Take this definition for Big O notation. (This is the formal definition I've learnt first.)
From Introduction to Algorithms 3rd Edition (I'm still learning about algorithms myself through that book), page 47:

For a given function g(n), we denote by O(g(n)) [...] the set of functions

O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n0 }.

Observe that Big O notation denotes a set and, as such, it can be a subset of some supersets and a superset of other subsets. Writing "n = O(n)" is just a formally incorrect way of saying that the function f(n) = n is a member of the set O(n) (n ∈ O(n)).
Abusing Big O notation like this (placing it in equations) lets us write things like T(n) = n/2 + O(n), which basically means that T(n) equals n/2 plus some function for which the definition of Big O notation given above holds true.

So, you wonder why something like n^2 = O(n^3) holds? Let's show it using the formal definition:

g(n) = n^3
f(n) = n^2

0 ≤ n^2 ≤ cn^3 for all n ≥ n0

Intuitively, we can easily see that there must be some c and n0 (for example, c = 1 and n0 = 1) for this inequality to be true, because a cubic function grows asymptotically faster than a quadratic one.

Do the same for g(n) = n^2 and you'll see that n^2 = O(n^2) as well (again with c = 1 and n0 = 1).

So, n^2 = O(n^3) and n^2 = O(n^2)? This is possible because O(n^3) and O(n^2) are not disjoint; O(n^2) is a subset of O(n^3), because every quadratic function can be bounded from above by a cubic function.
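As a small numeric illustration (a sketch of my own, not from the book), one pair of constants c = 1, n0 = 1 witnesses both set memberships at once:

```python
# Check that f(n) = n^2 satisfies the Big O definition for both g(n) = n^2
# and g(n) = n^3 using the single witness pair c = 1, n0 = 1.
c, n0 = 1, 1
for n in range(n0, 1000):
    f = n**2
    assert 0 <= f <= c * n**2   # f(n) is in O(n^2)
    assert 0 <= f <= c * n**3   # f(n) is in O(n^3), since n^2 <= n^3 here
print("n^2 is in both O(n^2) and O(n^3) for all checked n >= 1")
```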

As you see, asymptotic bounds need not be tight; n = O(n^65536) is true. However, tight bounds are obviously preferred.

Upvotes: 1

Sandipan Dey

Reputation: 23129

T(n) = O(n^2) <=> there exist constants c > 0, n0 >= 0 s.t. T(n) <= c.n^2 for all n >= n0.

Now, c.n^2 <= c.n^3 for all n >= 1 (an identity)

=> T(n) <= c.n^2 <= c.n^3 for all n >= n1 = max(n0, 1)

=> there exist constants c > 0, n1 = max(n0, 1), s.t. T(n) <= c.n^3 for all n >= n1

=>     T(n) = O(n^3)

In fact, we can show that T(n) = O(n^2) => T(n) = O(n^k) for all k >= 2 (e.g., by induction on k).
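The chain of inequalities above can also be checked numerically; here is a sketch (my own illustration, assuming the question's T(n) = n^2/2 + 3n and the hypothetical constants c = 4, n0 = 1):

```python
# Verify T(n) <= c*n^2 <= c*n^3 for all n >= n1 = max(n0, 1),
# which is exactly the chain used in the derivation above.
def T(n):
    return 0.5 * n**2 + 3 * n

c, n0 = 4, 1
n1 = max(n0, 1)
for n in range(n1, 1000):
    assert T(n) <= c * n**2       # the assumed O(n^2) bound
    assert c * n**2 <= c * n**3   # since n^2 <= n^3 for n >= 1
    assert T(n) <= c * n**3       # hence T(n) = O(n^3)
print("chain verified for all checked n in [1, 999]")
```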

Upvotes: 0

Amrinder Arora

Reputation: 726

To be mathematically precise, big O, big Omega, etc. are all sets, not functions. So, when we say T(n) = O(n^3), we really mean that T(n) is in the set O(n^3). But since it is not easy to typeset the "in the set" notation, we usually just end up writing T(n) = O(n^3). This causes a bit of confusion, but basically O(n^3) is simply the set of functions that do not grow faster than n^3. And of course the given T(n) does not grow faster than n^3, so T(n) is in the set O(n^3).

And similarly, T(n) is in the sets O(n^4), O(n^5), O(n^3 log n), O(n^127), O(n^n^n), etc.

So, if you had a fifth option in the question: T(n) = O(n^2), that would be true as well.

Upvotes: 1

IVlad

Reputation: 43517

If you can prove that it's O(n^2) (which you have to do in order to prove it's Big Theta(n^2); don't do it just by "looking at it", do the math), then replace all occurrences of n^2 with n^3 in the proof.

Is the proof still correct?

Remember that Big-oh is an upper bound.

Upvotes: 0
