aniss.bouraba

Reputation: 453

How to calculate Big Oh notation

Can someone please tell me how 2n = O(3n) is derived?

Here are some other examples:

2^4 = O(1)
10n = O(n)
n log2(n) = O(n log n)

Upvotes: 0

Views: 3093

Answers (4)

phimuemue

Reputation: 35983

There are multiple ways of proving this:

One could be the use of the limit:

f(n) = O(g(n)) iff the following holds:

                 f(n)
    limsup    ----------  < infinity
 n->infinity     g(n)

So, if you take the limit of 2n/(3n), you get 2/3, which is finite, so 2n = O(3n).
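As a quick numerical illustration (a Python sketch of my own, not part of the proof), the ratio f(n)/g(n) is constant, so its limsup is trivially finite:

```python
# Numerically sketch the ratio f(n)/g(n) for f(n) = 2n, g(n) = 3n.
# (Illustrative only: a finite table of ratios is evidence, not a proof.)

def ratio(n):
    f = 2 * n
    g = 3 * n
    return f / g

# Every entry is 2/3, so limsup f(n)/g(n) = 2/3 < infinity.
print([ratio(n) for n in (10, 1000, 10**6)])
```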

Upvotes: 0

japreiss

Reputation: 11251

There is a rigorous mathematical definition of big-O:

f(x) is O(g(x)) if there exist values x0 >= 0 and k > 0 such that for all x > x0, f(x) <= k*g(x).

To prove statements about the big-O classification of functions, you must show how to find the x0 and k values.
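A finite-range check can't replace the proof, but it is a handy sanity test for candidate x0 and k values. Here is a brute-force sketch (my own helper name, not a standard API):

```python
# Hedged sketch: brute-force check of the big-O definition over a finite
# range. Passing is evidence that the witnesses (k, x0) work; it is not
# a proof, which must cover all x > x0.

def satisfies_big_o(f, g, k, x0, upto=10_000):
    """Check f(x) <= k * g(x) for every integer x in (x0, upto]."""
    return all(f(x) <= k * g(x) for x in range(x0 + 1, upto + 1))

# Witnesses k = 1, x0 = 0 work for f(x) = 2x, g(x) = 3x:
print(satisfies_big_o(lambda x: 2 * x, lambda x: 3 * x, k=1, x0=0))   # True
# No fixed k works for f(x) = x^2 against g(x) = x; k = 5 fails quickly:
print(satisfies_big_o(lambda x: x * x, lambda x: x, k=5, x0=0))       # False
```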

Upvotes: 5

Patrick87

Reputation: 28292

Generally speaking, use the definition of asymptotic notation, then produce a proof that f(n) = O(g(n)), where f(n) is your function and g(n) is the bound you are trying to prove.

For your first example:

2n =? O(3n)
f(n) = 2n, g(n) = 3n
need c such that for n > n0, f(n) <= c*g(n); guess c = 1
We have f(n) = 2n <= 3n = c*g(n) for n >= 0, so
2n = f(n) = O(g(n)) = O(3n)

For your second example:

2^4 =? O(1)
f(n) = 2^4, g(n) = 1
need c such that for n > n0, f(n) <= c*g(n); guess c = 2^4 + 1
We have f(n) = 2^4 <= 2^4 + 1 = c*g(n), so
2^4 = f(n) = O(g(n)) = O(1)
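The two witness pairs above can be spot-checked numerically. This is a sketch of my own (the helper name is hypothetical), and a finite range only supports the proofs, it doesn't replace them:

```python
# Check f(n) <= c * g(n) for all n in (n0, upto] -- supporting evidence
# for the two proofs above, not a substitute for them.

def holds(f, g, c, n0, upto=1000):
    return all(f(n) <= c * g(n) for n in range(n0 + 1, upto + 1))

# Example 1: f(n) = 2n, g(n) = 3n, with c = 1, n0 = 0
print(holds(lambda n: 2 * n, lambda n: 3 * n, c=1, n0=0))         # True
# Example 2: f(n) = 2^4, g(n) = 1, with c = 2^4 + 1, n0 = 0
print(holds(lambda n: 2 ** 4, lambda n: 1, c=2 ** 4 + 1, n0=0))   # True
```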

Your third example is practically equivalent to the first.

Your fourth example is wrong; it is false that

n log^2(n) = O(n log n)

It is true that

n log(n^2) = O(n log n)

But that's a different expression. To see that n log^2(n) is not O(n log n), we can argue as follows: let c be an arbitrary fixed constant. We can find an n such that c*n*log(n) < n*log^2(n). We get

c*n*log(n) < n*log^2(n)
c < log(n)

So choose n = 2^(c+1); then log(n) = c + 1 > c, and the inequality holds. Therefore, there is no n0 such that, for all n > n0, f(n) <= c*g(n).
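The counterexample can be checked numerically for a range of constants. A sketch of my own (using base-2 logs, consistent with the choice n = 2^(c+1)):

```python
import math

# For any fixed c, at n = 2^(c+1) we have log2(n) = c + 1 > c, so
# c * n * log2(n) < n * log2(n)^2 -- i.e. the candidate bound fails.

def witness_beats_c(c):
    n = 2 ** (c + 1)
    lhs = c * n * math.log2(n)        # c * n * log n
    rhs = n * math.log2(n) ** 2       # n * log^2 n
    return lhs < rhs

print(all(witness_beats_c(c) for c in range(1, 50)))  # True
```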

EDIT: In the fourth example, if by "log2(n)" you mean the log base two of n, then yes, n log2(n) = O(n log n). In algorithmic complexity, the logarithm is typically understood to be base two unless otherwise specified. Sorry for the confusion.

Upvotes: 1

jason

Reputation: 241601

2^4 = 2^4 * 1 <= 2^4 * 1

so 2^4 is O(1) with constant 2^4.

10 * n = 10 * n <= 10 * n

so 10 * n is O(n) with constant 10.

n log2 n = n log n / log 2 <= (1 / log 2) * n log n

so n log2 n is O(n log n) with constant 1 / log 2.
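The change-of-base identity behind that last step, log2(n) = log(n) / log(2), can be checked numerically. A small sketch of my own:

```python
import math

# Change of base: n * log2(n) = (1 / ln 2) * n * ln(n) exactly,
# which is why the constant c = 1 / log 2 works in the bound above.

for n in (2, 10, 1000):
    lhs = n * math.log2(n)
    rhs = (1 / math.log(2)) * n * math.log(n)
    assert math.isclose(lhs, rhs)
print("identity holds on samples")
```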

Upvotes: 2
