Reputation: 83
I'm trying to wrap my head around the meaning of Landau notation in the context of analysing an algorithm's complexity.
What exactly does the O formally mean in Big-O notation?
So the way I understand it is that O(g(x)) gives a set of functions which grow as rapidly as or slower than g(x), meaning, for example in the case of O(n^2):
where t(x) could be, for instance, x + 3 or x^2 + 5. Is my understanding correct?
Furthermore, are the following notations correct?

f(n) = n^2 + 5n - 2, f(n) is an element of O(n^2)

I saw the following written down by a tutor:

O(n) <= O(n^2) <= O(2^n)

What does this mean? How can you use less or equal, if the O-notation returns a set?
Could I also write something like this?

O(n) is a subset of O(n^2), O(n^2) is a subset of O(2^n)
Upvotes: 2
Views: 680
Reputation: 5409
O(g(x)) gives a set of functions which grow as rapidly as or slower than g(x)

That's technically right, but a bit imprecise. A better description contains the addenda:

O(g(x)) gives the set of functions which are asymptotically bounded above by g(x), up to constant factors.
This may seem like a nitpick, but one inference from the imprecise definition is wrong.
The 'fixed version' of your first equation, if you make the variables match up and use a single limit sign, seems to be:

O(g(n)) = { t(n) : lim (n → ∞) t(n)/g(n) <= 1 }

This is incorrect: the ratio only has to be less than or equal to some fixed constant c > 0.
Here is the correct version:

O(g(n)) = { t(n) : lim (n → ∞) t(n)/g(n) <= c }

where c is some fixed positive real number that does not depend on n.
For example, f(n) = 3n^2 is in O(n^2): one constant c that works for this f is c = 4. Note that the requirement isn't 'for all c > 0', but rather 'for at least one constant c > 0'.
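As a quick sanity check of that definition, here is a small Python sketch (a numerical illustration, not a proof) showing that the witness constant c = 4 works for f(n) = 3n^2:

```python
# Numerically check (not prove) that f(n) = 3n^2 is in O(n^2):
# the ratio f(n)/g(n) never exceeds the witness constant c = 4.

def f(n):
    return 3 * n**2

def g(n):
    return n**2

c = 4
ratios = [f(n) / g(n) for n in range(1, 1001)]
print(max(ratios))                   # the ratio is exactly 3.0 for every n
print(all(r <= c for r in ratios))   # True: c = 4 works as a witness
```

Note that a single witness c is all the definition asks for; c = 3 or c = 100 would do just as well here.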
The rest of your remarks are accurate. The <= signs in that expression are an unusual usage, but it's true if <= means set inclusion. I wouldn't worry about that expression's meaning.
There are other, more subtle reasons to talk about 'boundedness' rather than growth rates. For instance, consider the cosine function. |cos(x)| is in O(1), but its derivative fluctuates from negative one to positive one even as x increases to infinity. If you take 'growth rate' to mean something like the derivative, examples like these become tricky to talk about, but saying |cos(x)| is bounded by 2 is clear.
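The contrast can be made concrete with a short Python sketch: the function's values stay under a constant bound while its derivative, -sin(x), keeps oscillating however large x gets.

```python
import math

# |cos(x)| stays bounded (it is in O(1)), yet its derivative -sin(x)
# keeps swinging between -1 and 1 no matter how large x gets.
xs = [x / 10 for x in range(0, 1000)]
print(max(abs(math.cos(x)) for x in xs) <= 2)  # True: bounded by the constant 2
derivs = [-math.sin(x) for x in xs]
print(min(derivs), max(derivs))                # close to -1 and 1 even for large x
```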
For an even better example, consider the logistic curve. The logistic function is in O(1); however, its derivative, its growth rate, is positive (on the positive numbers). It is strictly increasing, always growing, while the constant 1 has a growth rate of 0. This seems to conflict with the first definition without lots of additional clarifying remarks about what 'grow' means.

An always-growing function in O(1) (image from the Wikipedia link):
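In place of the image, the same point can be checked numerically with a small Python sketch: the logistic function is strictly increasing at every step, yet every value stays below the constant 1.

```python
import math

# The logistic function 1 / (1 + e^-x) is strictly increasing everywhere,
# yet it is in O(1): every value lies below the constant bound 1.
def logistic(x):
    return 1 / (1 + math.exp(-x))

values = [logistic(x) for x in range(-10, 30)]
print(all(a < b for a, b in zip(values, values[1:])))  # True: always growing
print(all(v < 1 for v in values))                      # True: bounded by 1
```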
Upvotes: 1
Reputation: 2832
So the way I understand it is that O(g(x)) gives a set of functions which grow as rapidly as or slower than g(x).

This explanation of Big-O notation is correct.

f(n) = n^2 + 5n - 2, f(n) is an element of O(n^2)

Yes, we can say that. O(n^2), in plain English, represents the "set of all functions that grow as rapidly as or slower than n^2". So, f(n) satisfies that requirement.
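A short Python sketch (an illustration, not a proof) makes the claim tangible: the ratio f(n)/n^2 stays below a fixed constant and approaches 1 as n grows.

```python
# Numerically checking that f(n) = n^2 + 5n - 2 is in O(n^2):
# the ratio f(n)/n^2 is bounded (by c = 4 for n >= 1) and tends to 1.

def f(n):
    return n**2 + 5 * n - 2

ratios = [f(n) / n**2 for n in range(1, 10001)]
print(max(ratios))                  # 4.0, attained at n = 1
print(all(r <= 4 for r in ratios))  # True: c = 4 is a witness constant
print(ratios[-1])                   # close to 1: the lower-order terms fade
```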
O(n) is a subset of O(n^2), O(n^2) is a subset of O(2^n)

This notation is correct, and it follows from the definition. Any function that is in O(n) is also in O(n^2), since its growth rate is slower than n^2. 2^n is an exponential time complexity, whereas n^2 is polynomial. You can take the limit of n^2 / 2^n as n goes to infinity and prove that O(n^2) is a subset of O(2^n), since 2^n grows faster.
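That limit argument can be illustrated with a few lines of Python: the ratio n^2 / 2^n shrinks toward 0, showing the exponential dwarfing the quadratic.

```python
# The ratio n^2 / 2^n tends to 0, which is why every function bounded by
# a multiple of n^2 is eventually dwarfed by 2^n.

ratios = [n**2 / 2**n for n in range(1, 60)]
print(ratios[2], ratios[10], ratios[-1])  # the ratio shrinks rapidly
print(ratios[-1] < 1e-12)                 # True: practically zero by n = 59
```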
O(n) <= O(n^2) <= O(2^n)

This notation is tricky. As explained here, we don't have "less than or equal to" for sets. I think the tutor meant that the time complexity of the functions belonging to the set O(n) is less than (or equal to) the time complexity of the functions belonging to the set O(n^2). Anyway, this notation doesn't really seem standard, and it's best to avoid such ambiguities in textbooks.
Upvotes: 1