Reputation: 411
What would be the Big O of this function:
def foo(n):
    if n <= 1:
        return 0
    else:
        return n + foo(n/2) + foo(n/2)
I think it might be O(2^(log n)), because each call makes two further calls, and n is halved until it reaches 1, hence the log n.
Upvotes: 0
Views: 262
Reputation: 18838
Moreover, you can use the master theorem as well (the first case). In your case the recurrence is T(n) = 2T(n/2) + 1. Writing it in the form T(n) = aT(n/b) + f(n), we have:

a = 2, b = 2, f(n) = 1
=> c_crit = log₂(2) = 1

Since f(n) = 1 = O(n^c) for every c > 0, we can pick a c with 0 < c < c_crit = 1. Hence the first case of the master theorem is satisfied, and T(n) = Θ(n^c_crit) = Θ(n).
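As a quick numerical check of this conclusion (a sketch of my own, not part of the master-theorem argument): evaluating the recurrence T(n) = 2T(n/2) + 1 directly, with T(1) = 1 assumed as the base cost, gives exactly 2n - 1 for powers of two, which is Θ(n).

```python
# Evaluate T(n) = 2*T(n/2) + 1 with base case T(1) = 1 (assumed unit cost)
# and confirm it matches the closed form 2n - 1 for powers of two.
def T(n):
    if n <= 1:
        return 1
    return 2 * T(n // 2) + 1

for k in range(1, 11):
    n = 2**k
    assert T(n) == 2 * n - 1
    print(n, T(n))
```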
Upvotes: 1
Reputation: 350212
Yes, it is O(2^(log n)), which really is equivalent to O(n).
You can analyse it as follows:
T(n) = 1 + 2T(n/2)
= 1 + 2(1 + 2T(n/4)) = (2²-1) + 2²T(n/4)
= (2²-1) + 2²(1 + 2T(n/8)) = (2³-1) + 2³T(n/8)
...
= (2^(log₂n)-1) + 2^(log₂n)·T(1)
= (n-1) + n
= 2n-1
= O(n)
This may be a surprising result, but a snippet measuring the ratio between the number of calls of foo and n illustrates this time complexity:
let counter;

function foo(n) {
    counter++;
    if (n <= 1)
        return 0;
    else
        return n + foo(n/2) + foo(n/2);
}

for (let n = 1; n < 100000; n *= 2) {
    counter = 0;
    foo(n);
    console.log(counter / n);
}
Upvotes: 3