user2793618

Reputation: 312

The Mathematical Relationship Between Big-Oh Classes

My textbook describes the relationship as follows:

There is a very nice mathematical intuition which describes these classes too. Suppose we have an algorithm which has running time N0 when given an input of size n, and a running time of N1 on an input of size 2n. We can characterize the rates of growth in terms of the relationship between N0 and N1:

Big-Oh      Relationship

O(log n)    N1 ≈ N0 + c
O(n)        N1 ≈ 2N0
O(n²)       N1 ≈ 4N0
O(2ⁿ)       N1 ≈ (N0)²

Why is this?

Upvotes: 1

Views: 72

Answers (3)

Patrick87

Reputation: 28312

Basically, what they are trying to show is just basic algebra after substituting 2n for n in each function.

O(log n)    
log(2n) = log(2) + log(n)
N1 ≈ c + N0

O(n)
2n = 2(n)        
N1 ≈ 2N0

O(n²)       
(2n)^2 = 4n^2 = 4(n^2)
N1 ≈ 4N0

O(2ⁿ)       
2^(2n) = 2^(n*2) = (2^n)^2
N1 ≈ (N0)²
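
If it helps to see the same algebra show up in actual operation counts, here is a small Python sketch. The step-counting loops below are just my own stand-ins for an algorithm in each class, not anything from the question:

def steps_log(n):
    # about log2(n) steps: halve the problem until it is trivial (binary-search style)
    count = 0
    while n > 1:
        n //= 2
        count += 1
    return count

def steps_linear(n):
    # about n steps: one pass over the input
    return sum(1 for _ in range(n))

def steps_quadratic(n):
    # about n^2 steps: look at every pair of items
    return sum(1 for _ in range(n) for _ in range(n))

def steps_exponential(n):
    # about 2^n steps: visit every subset of n items
    return sum(1 for _ in range(2 ** n))

n = 10
for name, f in [("log n", steps_log), ("n", steps_linear),
                ("n^2", steps_quadratic), ("2^n", steps_exponential)]:
    print(name, f(n), f(2 * n))

With n = 10 this prints (3, 4), (10, 20), (100, 400) and (1024, 1048576): add a constant, double, quadruple, square.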

Upvotes: 1

HackerBoss

Reputation: 829

Since a running time that is O(f(n)) behaves roughly like k * f(n) for some constant k (almost by definition, when the bound is tight), you want to look at what happens when you put 2n in for n. In each case:

N1 ≈ k*log 2n = k*(log 2 + log n) = k*log n + k*log 2 ≈ N0 + c where c = k*log 2

N1 ≈ k*(2n) = 2*k*n ≈ 2N0

N1 ≈ k*(2n)^2 = 4*k*n^2 ≈ 4N0

N1 ≈ k*2^(2n) = k*(2^n)^2 ≈ N0*2^n ≈ N0^2/k

So the last one is not quite right, anyway. Keep in mind that these relationships only hold asymptotically, so the approximations become more accurate as n gets larger. Also, f(n) = O(g(n)) only means that some constant multiple of g(n) is an upper bound on f(n) for large enough n, so it does not necessarily mean f(n) ~ k*g(n). Ideally you want that to be true, since your big-O bound will be tight exactly when it is.
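
For example, here is a quick check of that last case in Python; k = 3 and n = 10 are arbitrary values picked purely for illustration:

k, n = 3, 10              # k is an arbitrary hidden constant, n an arbitrary input size
N0 = k * 2 ** n           # running time at size n
N1 = k * 2 ** (2 * n)     # running time at size 2n
print(N1 == N0 ** 2)      # False: 3145728 vs 9437184
print(N1 == N0 ** 2 // k) # True: doubling the input squares N0 and divides by k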

Upvotes: 1

btilly

Reputation: 46445

That is because if f(n) is in O(g(n)), then it can be thought of as acting like k * g(n) for some constant k.

So for example if f(n) = O(log(n)) then it acts like k log(n), and now f(2n) ≈ k log(2n) = k (log(2) + log(n)) = k log(2) + k log(n) ≈ k log(2) + f(n) and that is your desired equation with c = k log(2).

Note that this is a rough intuition only. An example of where it breaks down is that f(n) = (2 + sin(n)) log(n) = O(log(n)). The oscillating 2 + sin(n) bit means that f(2n)-f(n) can be basically anything.
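
Here is a quick Python sketch of that breakdown; the sample sizes are arbitrary:

import math

def f(n):
    # the counterexample above: still O(log n), but with an oscillating factor
    return (2 + math.sin(n)) * math.log(n)

for n in [10, 100, 1000, 10000, 100000]:
    print(n, round(f(2 * n) - f(n), 2))
# the differences swing up and down instead of settling near a single constant c,
# so N1 ≈ N0 + c does not hold for this f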

I personally find this kind of rough intuition to be misleading and therefore worse than useless. Others find it very helpful. Decide for yourself how much weight you give it.

Upvotes: 2
