Reputation: 3457
For example, Big O notation can describe the following code:
O(1):
for (int i = 0; i < 10; i++) {
    // do stuff
    a[i] = INT;
}
O(n):
for (int i = 0; i < n; i++) {
    // do stuff
    a[i] = INT;
}
O(n^2):
for (int i = 0; i < n; i++) {
    for (int j = 0; j < n; j++) {
        // do stuff
        a[i][j] = INT;
    }
}
The first loop is O(1) because its iteration count (10) is a constant that does not depend on the input size.
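Since the question is about O(log(n)), the same pattern can be extended with a loop whose counter doubles each pass (a sketch in the style of the answers below; the function name is my own):

```javascript
// O(log n): i doubles every iteration, so the loop body runs
// roughly log2(n) times before i reaches n.
function logSteps(n) {
  let count = 0;
  for (let i = 1; i < n; i *= 2) {
    count++;
  }
  return count;
}
```

For n = 1024 the loop body runs only 10 times, because 2^10 = 1024.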
Another question:
Upvotes: 29
Views: 61255
Reputation: 524
Simplest code with a for loop that you would use to represent:
O(1):
function O_1(n) {
    // constant work, regardless of n
    return 1;
}
O(n):
function O_N(n) {
    let count = 0;
    for (let i = 0; i < n; i++) {
        count++;
    }
    return count;
}
O(n²):
function O_N2(n) {
    let count = 0;
    for (let i = 0; i < n; i++) {
        for (let j = 0; j < n; j++) {
            count++;
        }
    }
    return count;
}
O(log₂(n)):
function O_LOG_2(n) {
    let count = 0;
    for (let i = 1; i < n; i = i * 2) {
        count++;
    }
    return count;
}
O(√n):
function O_SQRT(n) {
    let count = 0;
    for (let i = 1; i * i < n; i++) {
        count++;
    }
    return count;
}
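To round out the family, here is an O(n·log(n)) loop in the same style (my addition, not part of the original answer): a doubling loop nested inside a linear one.

```javascript
// O(n log n): the outer loop runs n times, and for each pass
// the inner loop's counter doubles, running ~log2(n) times.
function O_N_LOG_N(n) {
  let count = 0;
  for (let i = 0; i < n; i++) {
    for (let j = 1; j < n; j *= 2) {
      count++;
    }
  }
  return count;
}
```

For n = 8 this counts 8 × 3 = 24 iterations, matching n·log₂(n).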
Upvotes: 7
Reputation: 95948
Classic example:
while (x > 0) {
x /= 2;
}
This will be:
Iteration | x
----------+-------
0 | x
1 | x/2
2 | x/4
... | ...
... | ...
k | x/2^k
The loop ends when x/2^k reaches 1, i.e. 2^k = x. Applying log₂ to both sides gives k = log₂(x), so the loop runs O(log(x)) times.
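The table can be checked numerically with a small sketch (my own variable names; the loop stops at 1 rather than 0, since JavaScript division is floating-point and never reaches exactly 0):

```javascript
// Counts how many times x can be halved before dropping to 1 or below.
// The count matches ceil(log2(x)), as the derivation predicts.
function halvings(x) {
  let count = 0;
  while (x > 1) {
    x /= 2;
    count++;
  }
  return count;
}
```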
Upvotes: 61
Reputation: 1
In the case of binary search, you are trying to find the maximum number of iterations, and therefore the maximum number of times the search space can be split in half. This is accomplished by dividing the size of the search space, n, by 2 repeatedly until you get to 1.
Let's give the number of times you need to divide n by 2 the label x. Since dividing by 2, x times is equivalent to dividing by 2^x, you end up having to solve for this equation:
n/2^x = 1, which becomes n = 2^x,
Taking the logarithm of both sides gives x = log₂(n), so the Big O of binary search is O(log(n)).
To reiterate: x is the number of times you can split a space of size n in half before it is narrowed down to size 1.
http://www.quora.com/How-would-you-explain-O-log-n-in-algorithms-to-1st-year-undergrad-student
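For reference, a minimal iterative binary search sketch (assumes the input array is sorted) whose loop halves the search space exactly as described above:

```javascript
// Returns the index of target in the sorted array a, or -1 if absent.
// Each iteration halves the [lo, hi] range, so at most ~log2(n)
// iterations can run before the range is empty.
function binarySearch(a, target) {
  let lo = 0, hi = a.length - 1;
  while (lo <= hi) {
    const mid = Math.floor((lo + hi) / 2);
    if (a[mid] === target) return mid;
    if (a[mid] < target) lo = mid + 1;
    else hi = mid - 1;
  }
  return -1;
}
```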
Upvotes: 0
Reputation: 5629
By definition, log(n) (log base 2 here, but the base really doesn't matter) is the number of times you have to multiply 2 by itself to get n. So an O(log(n)) code example is:
i = 1
while(i < n)
i = i * 2
// maybe doing addition O(1) code
In real code you can find O(log(n)) in binary search, balanced binary search trees, many recursive algorithms, and priority queues.
Upvotes: 4
Reputation: 367
It might be worth emphasizing that the lower complexity algorithms you described are subsets of the higher complexity ones. In other words,
for (int i = 0; i < 10; i++) {
// do stuff
a[i] = INT;
}
is in O(1), but also in O(n), O(n²), and, if you wanted to be clever, O(log(n)). Why? Because every constant-time algorithm is bounded above by some linear, quadratic, etc. function.
What solutions are there for "Big O problems" (what to do, when getting a lot of data as an input)?
This question doesn't make a lot of sense to me: "a lot of data" is quite arbitrary. Still, keep in mind that Big O isn't the only measure of time complexity; besides worst-case complexity, we can also examine best-case and average-case behavior, though these can be trickier to calculate.
Upvotes: 0
Reputation: 3854
Binary search is an example of O(log(n)): http://en.wikipedia.org/wiki/Binary_search_algorithm.
Upvotes: 1
Reputation: 2079
For O(log(n)), have a look at any code that uses a divide-and-conquer strategy. Examples: merge sort and quicksort (expected running time O(n log n) in these cases).
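As a concrete divide-and-conquer example, here is a minimal merge sort sketch (my own implementation, not from the answer): the array is split in half about log₂(n) times, and each level of splitting does O(n) merging work, giving O(n log n) overall.

```javascript
function mergeSort(a) {
  if (a.length <= 1) return a;           // base case: already sorted
  const mid = Math.floor(a.length / 2);  // divide: split in half
  const left = mergeSort(a.slice(0, mid));
  const right = mergeSort(a.slice(mid));
  // conquer: merge the two sorted halves in O(n)
  const merged = [];
  let i = 0, j = 0;
  while (i < left.length && j < right.length) {
    merged.push(left[i] <= right[j] ? left[i++] : right[j++]);
  }
  return merged.concat(left.slice(i), right.slice(j));
}
```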
Upvotes: 3