Reputation: 309
I'm given the following code for an algorithm:
1 sum = 0;
2 for (i = 0; i < n; i++)
3     for (j = 1; j <= n; j *= 3)
4         sum++;
I'm told this algorithm runs in O(n log n), where the log is base 3.
I get that the second line runs n times, and since line 3 is independent of line 2 I would multiply the two to get the Big O. However, I'm not sure how the answer comes out to n log n (log base 3). Is there a guaranteed way to figure this out every time? It seems like with nested loops there's a different case each time.
Upvotes: 3
Views: 278
Reputation: 169
Yes, the algorithm runs in O(n log n) time, where the log is base 3.
The easiest way to work out the complexity is to count the number of operations performed.
The outer for loop runs n times. Now count how many times the inner for loop runs for a given n: for n = 1, 2 it runs once; for n = 3, 4, ..., 8 it runs twice; and so on.
That means the inner for loop takes logarithmic time, log3(n), for each n.
So the total complexity is n * log(n).
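If you want to check that count empirically, here is a small C sketch (the helper name log3_floor and the test sizes are mine, not part of the question) that tallies the sum++ operations and compares the total with n * (floor(log3(n)) + 1):

#include <stdio.h>

/* Largest x with 3^x <= n, i.e. floor(log3(n)), computed with integers. */
static int log3_floor(int n) {
    int x = 0;
    for (long p = 3; p <= n; p *= 3)
        x++;
    return x;
}

int main(void) {
    int sizes[] = {1, 3, 9, 27, 81, 243};
    for (int k = 0; k < 6; k++) {
        int n = sizes[k];
        long count = 0;
        for (int i = 0; i < n; i++)          /* outer loop: n iterations    */
            for (int j = 1; j <= n; j *= 3)  /* inner loop: log3(n)+1 runs  */
                count++;                     /* stands in for sum++         */
        printf("n=%3d  actual=%5ld  n*(log3(n)+1)=%5ld\n",
               n, count, (long)n * (log3_floor(n) + 1));
    }
    return 0;
}

For every n in the list the two numbers agree, which is exactly the n * log3(n) growth described above.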
Upvotes: 1
Reputation: 222879
What you have been told is correct: the algorithm runs in O(n log n).
The third line, for (j = 1; j <= n; j *= 3), runs in O(log3(n)) because j is multiplied by 3 each time. To see it more clearly, solve this problem: how many times do you need to multiply 1 by 3 to reach n? That is 3^x = n, and the solution is x = log3(n).
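Put as code, counting how many times 1 can be multiplied by 3 without exceeding n gives exactly the number of inner-loop iterations; a minimal sketch (the function name inner_iterations is my own):

#include <stdio.h>

/* How many values does j take before j > n?  j = 1, 3, 9, ..., i.e.
   3^0 .. 3^x with 3^x <= n, so the answer is floor(log3(n)) + 1. */
static int inner_iterations(int n) {
    int steps = 0;
    for (long j = 1; j <= n; j *= 3)
        steps++;
    return steps;
}

int main(void) {
    for (int n = 1; n <= 243; n *= 3)
        printf("n=%3d  inner loop runs %d time(s)\n", n, inner_iterations(n));
    /* n=1 -> 1, n=3 -> 2, n=9 -> 3, n=27 -> 4, n=81 -> 5, n=243 -> 6 */
    return 0;
}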
Upvotes: 4
Reputation: 930
In the second loop you have j *= 3, which means you can divide n by 3 about log3(n) times before j exceeds n. That gives the inner loop its O(log n) complexity.
Since your first loop is O(n), you end up with O(n log n) overall.
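As a small illustration that the divide-by-3 view and the loop's multiply-by-3 view count the same thing, here is a sketch (n = 100 is just an arbitrary example value):

#include <stdio.h>

int main(void) {
    int n = 100;    /* arbitrary example value */

    /* How many times can n be divided by 3 before it drops below 1? */
    int divides = 0;
    for (int m = n; m >= 1; m /= 3)
        divides++;

    /* How many times does the loop multiply j by 3 before j exceeds n? */
    int multiplies = 0;
    for (long j = 1; j <= n; j *= 3)
        multiplies++;

    /* Both counters are 5 for n = 100: the same log3(n) factor either way. */
    printf("divide-by-3 steps: %d, multiply-by-3 steps: %d\n",
           divides, multiplies);
    return 0;
}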
Upvotes: 0