Parveen

Reputation: 1

Big O complexity in algorithms

I want to order the terms such that each one is big-O of the next one:

1. sqrt(n) * sqrt(log(n))

2. sqrt(n) * log(n^30)

3. n / (log(n))^2

4. 16^(log(sqrt(n)))

Can anyone help in finding order?

Upvotes: 0

Views: 323

Answers (4)

Larry B.

Reputation: 753

The intuition that n grows faster than log n is pretty clear, so let's make it rigorous.

  limit n->infinity (16 log(sqrt(n))) / (n / (log n)^2)
= limit n->infinity  8 (log n)^3 / n = 0

If we prove the last equality, the order follows. We can use l'Hospital's rule repeatedly to get:

  limit n->infinity  8 (log n)^3 / n
= limit n->infinity  24 (log n)^2 / n
= limit n->infinity  48 (log n) / n
= limit n->infinity  48 / n = 0
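
If you want to double-check that limit mechanically, here is a small sketch using sympy (my addition, not part of the original answer):

import sympy as sp

n = sp.symbols('n', positive=True)

# f(n) = 16*log(sqrt(n)) and g(n) = n/(log n)^2, as in the answer above
f = 16 * sp.log(sp.sqrt(n))
g = n / sp.log(n)**2

# log(sqrt(n)) = log(n)/2, so f/g simplifies to 8*(log n)^3 / n
print(sp.expand_log(f / g))        # 8*log(n)**3/n
print(sp.limit(f / g, n, sp.oo))   # 0, so 16*log(sqrt(n)) is O(n/(log n)^2)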

Upvotes: 0

David Stutz

Reputation: 2618

Claim: 16*log(sqrt(n)) is in O(n/(log(n))^2).

By the definition from Wikipedia, f(n) is in O(g(n)) iff lim sup abs(f(n)/g(n)) < infinity as n approaches infinity. If the limit exists, the lim sup becomes a lim, and using the rule of l'Hospital (assuming its preconditions are fulfilled, see Wikipedia), we have:

lim abs(f(n)/g(n)) = lim ((8*log(n))/n) * log(n) * log(n)
= lim (8*(log(n))^3)/n = lim (24*(log(n))^2)/n
= lim (48*log(n))/n = lim (48/n) = 0

Here, I applied the rule of l'Hospital three times to get rid of (log(n))^3. Hence, the lim exists and is thus equal to the lim sup, and by definition the claim follows.
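
As a purely numerical sanity check (my own sketch, not from the answer), you can also watch f(n)/g(n) shrink toward 0 as n grows:

import math

# ratio of f(n) = 16*log(sqrt(n)) to g(n) = n/(log n)^2; it should tend to 0
def ratio(n):
    f = 16 * math.log(math.sqrt(n))
    g = n / math.log(n) ** 2
    return f / g

for n in (10**3, 10**6, 10**9, 10**12):
    print(n, ratio(n))   # the printed ratios decrease toward 0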

Upvotes: 2

Alex Aparin

Reputation: 4512

If you understand calculus, you can perform the following check:

1) limit (№ 2 / № 1) = (should be infinity)

2) limit (№ 3 / № 2) = (should be infinity)

3) limit (№ 4 / № 3) = (should be infinity)

where № i is the i-th expression from the question; each limit being infinite means the earlier expression is o (and hence O) of the next one.
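
Here is a sketch of that check with sympy (my addition; it reads expression № 4 as 16 raised to the power log(sqrt(n)), exactly as written in the question, whereas the other answers read it as 16*log(sqrt(n))):

import sympy as sp

n = sp.symbols('n', positive=True)

# The four expressions from the question, in the order they were listed
e1 = sp.sqrt(n) * sp.sqrt(sp.log(n))
e2 = sp.sqrt(n) * sp.log(n**30)
e3 = n / sp.log(n)**2
e4 = 16**sp.log(sp.sqrt(n))

# Each limit should be oo, meaning the earlier expression is o() of the later one
for num, den in ((e2, e1), (e3, e2), (e4, e3)):
    print(sp.limit(num / den, n, sp.oo))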

Upvotes: 1

chepner

Reputation: 531055

I'm assuming you mean the following four functions:

  1. (n*log(n))^(1/2)
  2. n^(1/2)*log(n^30) = 30*n^(1/2)*log(n)
  3. n/log(n)^2
  4. 16*log(n^(1/2)) = 8*log(n)

and you want to understand why 8*log(n) = O(n/log(n)^2).

(The following is not intended to be fully rigorous, but just to provide some intuition for why this is true.)


Intuitively, you can start by showing that log(n) = O(n^(1/k)) for any constant k > 0. This means that log(n)^2 = O(n^(1/k)) as well: squaring both sides of the inequality log n < n^(1/k) yields log(n)^2 < n^(2/k), and since k was arbitrary, 2/k is just another arbitrarily small positive constant.

Next, consider the equality n^(1/2) == n/n^(1/2). What happens if you use a smaller root, say the cube root? On the left-hand side, you have a function that grows more slowly. On the right, the ratio grows more quickly, because you are dividing by something "smaller", so that for sufficiently large n, n^(1/3) < n/n^(1/3). This is true for larger constants k as well, so in general n^(1/k) = O(n/n^(1/k)).

Finally, we'll do a bit of handwaving and note that since log(n)^2 grows even more slowly than any root, you can say the following:

log(n)^2 = O(n^(1/k)) = O(n/n^(1/k)) = O(n/log(n)^2)

Multiplying everything by the constant 8 isn't going to affect the above chain, so we can finally conclude (non-rigorously) that

8*log(n)^2 = O(n/log(n)^2)

which is even stronger than needed: since 8*log(n) <= 8*log(n)^2 once n >= e, the original claim 8*log(n) = O(n/log(n)^2) follows as well.
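
To see the resulting order numerically, here is a small sketch (my addition) that evaluates chepner's simplified forms; because of the constants involved, the asymptotic order only becomes visible for fairly large n:

import math

# chepner's simplified forms, listed so that each should be O() of the next
funcs = [
    ("8*log(n)",          lambda n: 8 * math.log(n)),
    ("(n*log(n))^(1/2)",  lambda n: math.sqrt(n * math.log(n))),
    ("30*n^(1/2)*log(n)", lambda n: 30 * math.sqrt(n) * math.log(n)),
    ("n/log(n)^2",        lambda n: n / math.log(n) ** 2),
]

for n in (10**8, 10**16, 10**24):
    print(n, [f"{name} = {f(n):.3g}" for name, f in funcs])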

Upvotes: 1
