Reputation: 1072
I have been through this Big-Oh explanation, understanding the complexity of two loops, the difference between Big-Theta and Big-Oh, and also this question.
I understand that we cannot simply say that Big-Oh is the worst case, Big-Omega the best case, and Big-Theta the average case; Big-Oh has its own best, worst, and average cases. But how do we find out whether a specific algorithm belongs to Big-Oh, Big-Theta, or Big-Omega? And how can we check whether an algorithm belongs to all of these?
Upvotes: 1
Views: 169
Reputation: 28292
A function f(n) is Big-Oh of a function g(n), written f(n) = O(g(n)), if there exist a positive constant c and a natural number n0 such that f(n) <= c * g(n) for all n > n0. A function f(n) is Big-Omega of g(n), written f(n) = Omega(g(n)), if and only if g(n) = O(f(n)). A function f(n) is Big-Theta of g(n), written f(n) = Theta(g(n)), if and only if f(n) = O(g(n)) and f(n) = Omega(g(n)).
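For instance, here is a minimal worked instance of the Big-Oh definition (the particular f, g, c, and n0 are my own illustrative choices, just to make the quantifiers concrete):

```latex
% Worked instance of the Big-Oh definition (illustrative choice of f, g, c, n_0)
\[
  f(n) = 3n^2 + 5n, \qquad g(n) = n^2, \qquad c = 4, \qquad n_0 = 5.
\]
\[
  \text{For all } n > n_0:\quad 3n^2 + 5n \;\le\; 3n^2 + n \cdot n \;=\; 4n^2 \;=\; c \cdot g(n),
\]
\[
  \text{since } 5n \le n^2 \text{ whenever } n \ge 5, \text{ hence } f(n) = O(g(n)).
\]
```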
To prove any of the three, you do it by showing that some function(s) are Big-Oh of some other functions. Showing that one function is Big-Oh of another is a difficult problem in the general case, and any form of mathematical proof may be helpful; induction proofs in conjunction with intuition for the base cases are not uncommon. Basically, guess values for c and n0 and see if they work, or choose one of the two and work out a reasonable value for the other.
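As a quick, non-rigorous way to test such a guess before writing the proof, you can check the inequality numerically over a range of n. This is only a sanity check, not a proof, and the helper name looks_like_big_oh is my own:

```python
# Numerically sanity-check a Big-Oh guess: does f(n) <= c * g(n) hold
# for every sampled n with n0 < n <= n_max?  Passing the check is NOT a
# proof, but failing it immediately rules out the chosen c and n0.

def looks_like_big_oh(f, g, c, n0, n_max=10_000):
    return all(f(n) <= c * g(n) for n in range(n0 + 1, n_max + 1))

if __name__ == "__main__":
    f = lambda n: 3 * n * n + 5 * n   # f(n) = 3n^2 + 5n
    g = lambda n: n * n               # g(n) = n^2
    print(looks_like_big_oh(f, g, c=4, n0=5))   # True: 3n^2 + 5n <= 4n^2 for n > 5
    print(looks_like_big_oh(f, g, c=3, n0=5))   # False: 3n^2 + 5n > 3n^2
```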
Note that a function may not be Big-Theta of any simple reference function: this happens when its tightest bounds from above and from below have different asymptotic rates of growth (see the example below). However, I think it's usually a safe bet that most functions are going to be Big-Oh of something reasonably uncomplicated, and all functions typically looked at from this perspective are at least constant-time in the best case, i.e. Omega(1).
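As an illustration of that first point (the particular function is my own example), consider a function that oscillates between two growth rates:

```latex
% A function whose tightest upper and lower bounds grow at different rates
\[
  f(n) =
  \begin{cases}
    n   & \text{if } n \text{ is even},\\
    n^2 & \text{if } n \text{ is odd}.
  \end{cases}
\]
\[
  f(n) = O(n^2) \text{ and } f(n) = \Omega(n), \text{ but } f(n) \ne \Theta(n)
  \text{ and } f(n) \ne \Theta(n^2),
\]
% because the odd values of n break any bound of the form f(n) <= c * n,
% and the even values of n break any bound of the form f(n) >= c * n^2.
```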
Upvotes: 2