Vik

Reputation: 53

Calculating Big O Notation with Recursion

I have been trying to understand worst-case runtime in Big O notation, but I still don't quite get it.

This is some code that I wrote recently:

def g(n):
    if n==0:
        return 1
    elif n==1:
        return 2
    else:
        n_div=n//2
        n_mod=n%2
        return g(n_div)*g(n_mod)

So I hope that I'm at least right that:

def g(n):
    if n==0:
        return 1

and:

elif n==1:
    return 2

are O(1), so constant.

But what about the else part?

Is it O(n) because it depends on the n that I choose?

Can anyone explain what the Big O complexity of the else part is?

Upvotes: 5

Views: 2270

Answers (2)

Noufal Ibrahim

Reputation: 72795

Big O notation isn't really about a part of the program. It measures how the running time grows as the input size increases. In this case, the time-consuming part is the final else branch.

You need to work out how this behaves for your program. Here's a simple empirical analysis that should help. If we instrument the program a little to print out how many times the final else branch runs for a given input, we get this:

   n | times
-----+-----
   2 | 1
   4 | 2
   8 | 3
  16 | 4
  32 | 5
  64 | 6
 128 | 7
 256 | 8
 512 | 9
1024 | 10
2048 | 11
4096 | 12
8192 | 13

If you plot this, you'll see something like this.

(plot: input size n against the number of else-branch calls, growing logarithmically)

You can see that, as the size of the input increases, the number of calls also increases, but sublinearly. In fact, it's logarithmic, since your n is halved on each recursive call.

Upvotes: 0

Nick is tired

Reputation: 7055

Well, you've noticed that you can separate your function into 3 individual cases, and you've already identified that the first 2 are O(1). The third is slightly trickier, but you can split it into two parts as well.

The recursion clearly occurs from:

g(n//2)*g(n%2)

We can immediately see that n%2 will evaluate to either 0 or 1, which resolves to one of the first 2 cases again, so we can disregard it. That leaves us with g(n//2). Rewriting this as a printing loop and examining the output, you'll notice something:

>>> n = 37
>>> while n > 1:
...     n //= 2
...     print(n)
...
18
9
4
2
1

As you can see, the terms decrease by half each time, and the same happens in the recursion. This is logarithmic.

As a result, the worst case for this function is O(log n).
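To see why that halving count is logarithmic, you can compare it directly with floor(log2(n)). (The `halvings` helper is just my own name for the loop above.)

```python
import math

def halvings(n):
    """Count how many times n can be halved (floor division) before reaching 1."""
    count = 0
    while n > 1:
        n //= 2
        count += 1
    return count

# The number of halvings matches floor(log2(n)), hence O(log n) recursion depth.
print(halvings(37), math.floor(math.log2(37)))  # → 5 5
```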

You can learn more about what logn actually means in Big-O notation by looking at this question.

Upvotes: 2
