shtuken

Reputation: 161

Determining the Big O of a recursive method with two recursive cases?

I am currently struggling with computing the Big O for a recursive exponent function that takes a shortcut whenever n%2 == 0. The code is as follows:

public static int fasterExponent(int x, int n){
    if ( n == 0 ) return 1;
    if ( n%2 == 0 ){
        int temp = fasterExponent(x, n/2);
        return temp * temp;
    }
    return x * fasterExponent(x, n - 1); // odd case: peel off one factor of x
}

I understand that, without the (n%2 == 0) case, this recursive exponent function would be O(n). The inclusion of the (n%2 == 0) case speeds up the execution time, but I do not know how to determine either its complexity or the value of its witness constant c.

Upvotes: 3

Views: 190

Answers (2)

MartinS

Reputation: 2790

The answer is O(log n).

Reason: fasterExponent(x, n/2) halves the input at each step, and when it reaches 0 we are done; that clearly takes log n steps. But what about fasterExponent(x, --n)? We take that branch when the input is odd, and in the next step n is even, so we fall back to the n/2 case. Consider the worst case, where this happens every time we divide n by 2: then the second recursive step runs once for every first recursive step, so we need 2 * log n operations. That is still O(log n). I hope my explanation helps.
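As a sketch of this argument (an instrumented variant with a call counter added for illustration; the counter is my addition, not part of the original code), worst-case inputs of the form n = 2^k - 1 force a decrement before every halving, and the call count comes out to roughly 2 * log2(n):

```java
public class CallCounter {
    static int calls = 0;

    // Same algorithm as in the question, with a call counter added.
    static int fasterExponent(int x, int n) {
        calls++;
        if (n == 0) return 1;
        if (n % 2 == 0) {
            int temp = fasterExponent(x, n / 2);
            return temp * temp;
        }
        return x * fasterExponent(x, n - 1);
    }

    public static void main(String[] args) {
        // n = 2^k - 1 is the worst case: every halving is preceded by a decrement.
        for (int n : new int[]{15, 255, 4095}) {
            calls = 0;
            fasterExponent(1, n); // base 1 keeps the result small
            System.out.println("n = " + n + ", calls = " + calls);
        }
        // Prints 8, 16 and 24 calls: about 2 * log2(n) each time, still O(log n).
    }
}
```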

Upvotes: 6

Aravind

Reputation: 3189

It's intuitive to see that at each stage you are cutting the problem size in half. For instance, to find x^4, you find x^2 (let's call this A) and return the result as A*A. x^2 itself is found the same way, from x times x.

Considering multiplication of two numbers as a primitive operation, you can see that the recurrence is:

T(N) = T(N/2) + O(1)

Solving this recurrence (using, say, the Master Theorem) yields:

T(N) = O(log N)
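To see the recurrence play out, a small trace helper (my own illustration, not part of the answer) prints the exponent at each recursive call, showing N halving until it reaches 0:

```java
public class Trace {
    // The fast-exponent algorithm from the question, printing n at each call.
    static int fasterExponent(int x, int n) {
        System.out.println("n = " + n);
        if (n == 0) return 1;
        if (n % 2 == 0) {
            int temp = fasterExponent(x, n / 2);
            return temp * temp;
        }
        return x * fasterExponent(x, n - 1);
    }

    public static void main(String[] args) {
        int result = fasterExponent(3, 4);
        // Visits n = 4, 2, 1, 0 -- four calls for n = 4, matching T(N) = O(log N).
        System.out.println("3^4 = " + result); // prints 3^4 = 81
    }
}
```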

Upvotes: 3

Related Questions