X.enia

Reputation: 33

Calculating Big O complexity of Recursive Algorithms

Somehow, I find it much harder to derive Big O complexities for recursive algorithms than for iterative ones. Could you provide some insight into how I should go about solving these two questions?

*assume that submethod has linear complexity

def myMethod(n)
  if n > 0
    submethod(n)
    myMethod(n / 2)
  end
end

def myMethod(k, n)
  if n > 0
    submethod(k)
    myMethod(k, n / 2)
  end
end

Upvotes: 1

Views: 316

Answers (2)

Sumeet

Reputation: 8292

For your first problem, the recurrence will be:

    T(n) = n + T(n/2)

    T(n/2) = n/2 + T(n/4)
    ...
    ...
    ...
    T(2) = 2 + T(1)

    T(1) = 1 + T(0) // assuming 1/2 equals 0 (integer division)

adding up we get:

    T(n) = n + n/2 + n/4 + n/8 + ..... 1 + T(0)

         = n(1 + 1/2 + 1/4 + 1/8 + .....) + k // assuming k = T(0)

         = n * 1/(1 - 1/2) + k // sum of geometric series a/(1 - r) as n tends to infinity

         = 2n + k

Therefore, T(n) = O(n). Remember, I have assumed that n tends to infinity, because that is what we do in asymptotic analysis.
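The telescoping sum can also be checked numerically. Here is a small sketch that charges n units of work per call (modelling the linear-time submethod) and confirms the total stays below 2n; the `work_done` helper is an illustrative name, not part of the original code:

```ruby
# Model myMethod(n)'s cost: n units per level for the linear submethod(n),
# then recurse on n / 2 (integer division, so the chain bottoms out at 0).
def work_done(n)
  return 0 if n <= 0
  n + work_done(n / 2)
end

[16, 1024, 1_000_000].each do |n|
  puts "n=#{n}: total work #{work_done(n)} (2n = #{2 * n})"
end
```

For example, work_done(16) = 16 + 8 + 4 + 2 + 1 = 31, which is below the 2n bound of 32.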

For your second problem, it is easy to see that we perform k primitive operations each time until n becomes 0. This happens log(n) times. Therefore, T(n) = O(k*log(n)).
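The number of submethod(k) calls can be counted with a small sketch: n is halved on every call, so the recursion runs floor(log2(n)) + 1 times, and with k units of work per call the total is k * log(n). The `call_count` helper is an illustrative addition:

```ruby
# Count how many times submethod(k) runs before n reaches 0:
# one call per halving of n.
def call_count(n)
  return 0 if n <= 0
  1 + call_count(n / 2)
end

puts call_count(8)     # 8 -> 4 -> 2 -> 1 -> 0: four calls
puts call_count(1024)  # floor(log2(1024)) + 1 = 11
```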

Upvotes: 2

Selçuk Cihan

Reputation: 2034

All you need to do is count how many times a basic operation is executed. This is true for analysing any kind of algorithm. In your case, we will count the number of times submethod is called.

You could break down the running time of the call myMethod(n) as 1 + myMethod(n / 2), which you can further break down as 1 + (1 + myMethod(n / 4)). At some point you will reach the base case, at the log(n)th step. That gives you a log(n) algorithm.

The second one is no different: since k stays constant throughout, it will again take log(n) time, assuming submethod takes constant time regardless of its input.
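Counting the basic operation directly, as this answer suggests, can be done by instrumenting the code. Here is a sketch with a stubbed submethod and a global counter (both illustrative additions, not part of the original code):

```ruby
$calls = 0

def submethod(n)
  $calls += 1   # stub: only counts invocations, does no real work
end

def myMethod(n)
  if n > 0
    submethod(n)
    myMethod(n / 2)
  end
end

[7, 8, 1000].each do |n|
  $calls = 0
  myMethod(n)
  puts "n=#{n}: submethod called #{$calls} times"
end
```

For n = 1000 the chain is 1000, 500, 250, 125, 62, 31, 15, 7, 3, 1, giving 10 calls, which matches floor(log2(1000)) + 1.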

Upvotes: 0
