Reputation: 129
If a function f(n) grows more slowly than a function g(n), why is f(n) = O(g(n))?
For example, suppose f(n) = 4n^4 and g(n) = log((4n)^(n^4)).
My book says f = O(g(n)) because g = n^4 · log(4n) = n^4(log n + log 4) = O(n^4 · log n). I understand why g = O(n^4 · log n), but I'm not sure how they reached the conclusion that f = O(g(n)) from the big-O of g.
I understand that f(n) grows more slowly than g(n) just by thinking about the graphs, but I'm having trouble understanding asymptotic behavior and why f = O(g(n)) in general.
Upvotes: 1
Views: 563
Reputation: 372764
The formal definition of big-O notation is that
f(n) = O(g(n)) if there are constants n0 and c such that for any n ≥ n0, we have f(n) ≤ c · g(n).
In other words, f(n) = O(g(n)) if for sufficiently large values of n, the value of f(n) is upper-bounded by some constant multiple of g(n).
Notice that this just says that f(n) is eventually upper-bounded by a constant multiple of g(n), not that f(n)'s rate of growth is the same as g(n)'s rate of growth. In that sense, you can think of f(n) = O(g(n)) as akin to saying something like "f ≤ g": f doesn't grow faster than g, leaving open the possibility that g grows a lot faster than f does. In your example, f(n) = 4n^4 and g(n) = n^4 · log(4n), so f(n) ≤ g(n) as soon as log(4n) ≥ 4; taking c = 1 and n0 to be that crossover point satisfies the definition, which is why f(n) = O(g(n)).
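You can sanity-check the definition numerically. Here's a small sketch in Python (assuming natural log; with that base, c = 1 and n0 = 14 are one valid choice of constants for this particular f and g):

```python
import math

def f(n):
    return 4 * n**4

def g(n):
    # g(n) = log((4n)^(n^4)) = n^4 * log(4n)
    return n**4 * math.log(4 * n)

# With c = 1: f(n) <= c * g(n) means 4n^4 <= n^4 * log(4n),
# i.e. log(4n) >= 4, which holds once 4n >= e^4 ≈ 54.6, so n0 = 14 works.
c, n0 = 1, 14
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))
```

The particular values of c and n0 don't matter; the definition only asks that *some* pair exists, and past n0 the inequality holds forever because log(4n) keeps growing.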
Upvotes: 2