Reputation: 1368
A Big O notation question...What is the Big O for the following code:
for (int i = n; i > 0; i = i / 2) {
    for (int j = 0; j < n; j++) {
        for (int k = 0; k < n; k++) {
            count++;
        }
    }
}
My Thoughts:
So breaking it down, I think the outer loop is O(log2(n)) and each of the inner loops is O(n), which would result in O(n^2 * log2(n)).
Question #1 is that correct?
Question #2: when combining nested loops, is it always as simple as multiplying the Big O of each loop?
Upvotes: 13
Views: 15424
Reputation: 2977
You can proceed formally using Sigma Notation, to faithfully imitate your loops:
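One way to set the sums up (assuming the outer index is halved down from n to 1, so it takes ⌊log₂ n⌋ + 1 values):

```latex
T(n) \;=\; \sum_{i=0}^{\lfloor \log_2 n \rfloor} \;\sum_{j=0}^{n-1} \;\sum_{k=0}^{n-1} 1
\;=\; \left(\lfloor \log_2 n \rfloor + 1\right) \cdot n^2
\;=\; O(n^2 \log n)
```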
Upvotes: 0
Reputation: 1810
To answer this slightly (note: slightly) more formally, say T(n) is the time (or number of operations) required to complete the algorithm. Then, for the outer loop, T(n) = log n * T2(n), where T2(n) is the number of operations inside that loop (ignoring constants). Similarly, T2(n) = n * T3(n) = n * n = n^2.
Then, use the following theorem:
If f1(n) = O(g1(n)) and f2(n) = O(g2(n)), then f1(n)×f2(n) = O(g1(n)×g2(n))
(source and proof)
This leaves us with T(n) = O(n^2 log n).
"Combining nested loops" is just an application of this theorem. The trouble can be in figuring out exactly how many operations each loop uses, which in this case is simple.
Upvotes: 2
Reputation: 372794
When loop counters do not depend on one another, it's always possible to work from the inside outward.
The innermost loop always takes time O(n), because it loops n times regardless of the values of j and i.
When the second loop runs, it runs for O(n) iterations, on each iteration doing O(n) work to run the innermost loop. This takes time O(n^2).
Finally, when the outer loop runs, it does O(n^2) work per iteration. It also runs for O(log n) iterations, since the number of iterations equals the number of times you can divide n by two before reaching 1. Consequently, the total work is O(n^2 log n).
In general, you cannot just multiply loops together, since their bounds might depend on one another. In this case, though, since there is no dependency, the runtimes can just be multiplied. Hopefully the above reasoning sheds some light on why this is - it's because if you work from the inside out thinking about how much work each loop does and how many times it does it, the runtimes end up getting multiplied together.
Hope this helps!
Upvotes: 17
Reputation: 726579
The outer loop runs O(log N) times; the other two are O(N) each, for a total of O(N^2 log N).
Upvotes: 4