PetarMI

Reputation: 390

Big-O - growth rate of a function

I wanted to know more about Big-O and found this piece of information:

'if f(x) = O(g(x)) the growth rate of f(x) is asymptotically less than or equal to the growth rate of g(x)'

What does "asymptotically" mean in this case? Also, I have some difficulty understanding why Big-Theta doesn't depend on the computer we are using. Can anyone please provide some more information on these two questions?

Upvotes: 3

Views: 1584

Answers (1)

templatetypedef

Reputation: 372784

The term "asymptotically" in this context means "as x goes to infinity." When someone says "f(x) grows asymptotically slower than g(x)," they mean that for very large values of x, the function g(x) grows faster than the function f(x). In particular, as long as f(x) ≥ 1 and g(x) ≥ 1, for all sufficiently large x the value of g(x) will be bigger than the value of f(x).
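To make this concrete, here is a small sketch (the functions f and g are made up for illustration): f grows linearly but with a large constant factor, so it starts out bigger, yet the quadratic g eventually overtakes it and stays bigger.

```python
# Hypothetical functions chosen only to illustrate asymptotic dominance:
# f(x) = 1000x is linear (big constant), g(x) = x^2 is quadratic,
# so f(x) = O(g(x)) even though f is larger for small x.
def f(x):
    return 1000 * x

def g(x):
    return x * x

# For small x the constant factor makes f win...
print(f(10), g(10))        # f is much larger here

# ...but past the crossover point (x = 1000), g dominates forever.
print(f(2000), g(2000))    # now g is larger
```

The crossover point can be pushed arbitrarily far out by the constants, but it always exists; that is what the "eventually" in the definition is capturing.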

As to this question:

Also I have some difficulty understanding why doesn't Big-Theta depend on the computer we are using?

Although O, Θ, and Ω notations are used extensively in CS to describe runtimes, that's not actually what they mean. Technically, this notation is used to quantify the growth rates of functions, regardless of what those functions actually mean. For example, you'll find big-O notation used in Stirling's approximation in discrete math, which estimates ln n! as n ln n - n + O(log n), where O(log n) means "some function whose growth rate is O(log n)." In CS, when we say that an algorithm is O(n), what that really means is "the function that describes the runtime of this algorithm is O(n)." More properly, you'd see expressions like "the runtime of the algorithm is O(n)" or "the algorithm takes time O(n)," highlighting that the big-O notation describes the runtime of the algorithm rather than the algorithm itself. In this sense, one answer to your question of "why doesn't Θ notation depend on the computer?" is "Θ notation just quantifies growth rates of functions and has nothing to do with computers."
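As a sketch of the Stirling example above being about functions rather than computers, the check below evaluates the error term ln n! - (n ln n - n) numerically; consistent with the O(log n) bound, the error stays within a constant multiple of ln n as n grows.

```python
import math

# Numerically check the O(log n) error term in Stirling's approximation:
# ln(n!) ~ n ln n - n + O(log n).
for n in (10, 100, 1000, 10000):
    ln_factorial = math.lgamma(n + 1)     # ln(n!) computed without overflow
    stirling = n * math.log(n) - n        # the leading terms of Stirling
    error = ln_factorial - stirling       # should grow roughly like log n
    print(n, round(error, 3), round(error / math.log(n), 3))
```

The last column (error divided by ln n) stays bounded rather than blowing up, which is exactly what "the error is O(log n)" asserts; no statement about any machine is involved.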

In another sense, the reason the particular computer doesn't matter is that Θ notation talks about asymptotic runtimes, not absolute runtimes. If an algorithm has runtime Θ(n), it means that the algorithm's runtime scales as some linear function. The runtime of that algorithm as a function of the input size might actually be something like 100n + 137 or 20,000,000n - 15, because all that matters is the way those runtimes grow, not the runtimes themselves. If you run the same code on different computers, it might take more or less time to run depending on the computer chosen, but it almost certainly won't change from scaling linearly to scaling quadratically. This means that asymptotically the runtimes are the same, though in absolute terms the runtimes might be very different.
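A quick sketch of that point, using the two example runtime functions from above (100n + 137 and 20,000,000n - 15, here treated as hypothetical cost models for the same algorithm on a fast and a slow machine): doubling the input roughly doubles both, so both are linear, even though the absolute values differ by orders of magnitude.

```python
# Hypothetical linear runtime functions for the same Θ(n) algorithm
# on two different machines; the constants differ wildly.
def runtime_fast(n):
    return 100 * n + 137

def runtime_slow(n):
    return 20_000_000 * n - 15

# Doubling n roughly doubles both runtimes: same growth rate (linear),
# very different absolute runtimes.
for n in (1_000, 1_000_000):
    print(runtime_fast(2 * n) / runtime_fast(n))   # close to 2
    print(runtime_slow(2 * n) / runtime_slow(n))   # close to 2
```

A quadratic algorithm would instead show a ratio approaching 4 when the input doubles, which is the kind of difference that survives a change of computer.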

Hope this helps!

Upvotes: 4
