user3261177

Reputation: 11

time complexity for this algorithm with two for loops

Can you explain to me how to find the time complexity of this?

sum = 0;
for (k = 1; k <= n; k *= 2)
    for (j = 1; j < n; j++)
        sum++;

I tried taking n as the base and counting how many iterations happen for

n = 1, 2, 3, ...

but I am not able to generalize it. Please, someone explain it to me thoroughly. I have even checked other questions of a similar type, which say that k follows

1, 2, 4, 8, ..., n

so 2^k = n and the answer is

n log n

But I didn't understand that. I don't know how they get from 2^k = n to k = log n. Please explain step by step how to calculate it.

Thanks :)

Upvotes: 1

Views: 5566

Answers (4)

Karoly Horvath

Reputation: 96316

k grows exponentially, so it takes O(log n) steps to reach n. Check the definition of the logarithm. That's your outer loop.

The inner loop is a straightforward O(n).
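If you want to convince yourself, here is a minimal counting sketch (the sample n = 1000 and the printed comparison are arbitrary choices for illustration, not part of the algorithm itself):

#include <stdio.h>
#include <math.h>

int main(void) {
    int n = 1000;      /* sample size, chosen arbitrarily */
    int outer = 0;     /* outer-loop iterations */
    long total = 0;    /* total inner-loop increments */

    for (int k = 1; k <= n; k *= 2) {   /* k = 1, 2, 4, ...: runs floor(log2(n)) + 1 times */
        outer++;
        for (int j = 1; j < n; j++)     /* n - 1 iterations per pass */
            total++;
    }

    /* For n = 1000: outer = 10 (log2(1000) ~ 9.97), total = 10 * 999 = 9990 ~ n*log2(n) */
    printf("outer = %d, log2(n) = %.2f\n", outer, log2(n));
    printf("total = %ld, n*log2(n) = %.0f\n", total, n * log2(n));
    return 0;
}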

Upvotes: 2

Prince

Reputation: 20882

Given your problem structure:

sum = 0;
for (k = 1; k <= n; k *= 2)   // <-- runs floor(log2(n)) + 1 times = O(log(n))
   for (j = 1; j < n; j++)    // <-- runs n - 1 times per outer pass = O(n)
      sum++;

The outer loop doubles the value of k on every iteration (k *= 2), which is the same as halving the remaining distance to n every time:

k = 1 = 2^0  => iteration 1
k = 2 = 2^1  => iteration 2
k = 4 = 2^2  => iteration 3
k = 8 = 2^3  => iteration 4
.   .     .
.   .     .
.   .     .
k = 2^i      => iteration i + 1, and the loop stops once 2^i > n

To put it another way, you are discarding half of the remaining range every time, so a range of size n is covered in

log2(n) + 1 iterations = O(log(n)) time

Where does the log(n) come from?

Take the analogy of a binary tree in which every node branches into two children, each covering half of its parent's range, which can be expressed as the recurrence relation:

T(n) = 2T(n/2)

As the search space halves at every level, moving further from the root we must eventually reach the boundary condition where the search space equals 1. The search space at depth k is n/2^k (similar to what we saw above).

Setting the search space at depth k equal to 1, the equation becomes:

n/2^k = 1
=> n = 2^k

Taking the log of both sides:

=> log n = k log 2
=> k = log n / log 2

That is:
k = log n to base 2.

So on reaching a leaf node, the height of the binary tree we have traversed is log(n) to base 2. In our case we are dividing the search space by two at every step, traversing it in powers of two and reaching the last element in log(n) time.
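For instance, take n = 16: the search space shrinks as 16 => 8 => 4 => 2 => 1, which is 4 halvings, and indeed 16/2^4 = 1 and log(16) to base 2 = 4.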

The overall time complexity of the two loops together is therefore log(n) * n = O(n log(n)).

Upvotes: 1

waTeim

Reputation: 9244

Since the inner loop is

for(j=1;j<n;j++)

it will execute n - 1 times (effectively n) for each iteration of the outer loop. You simply need to know how many times the outer loop executes and multiply the two counts.

k will take on the following values

2^0, 2^1, 2^2, ..., 2^i where 2^i <= n

Take the log (base 2) of both sides of the inequality:

     2^i <= n
log(2^i) <= log(n)
i*log(2) <= log(n)
       i <= log(n)

The outer loop will execute exactly i + 1 times, so the complexity will be O(i*n); from the above, i <= log(n), so the complexity will be O(n*log(n)).
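For example, with n = 20 the outer loop sees k = 1, 2, 4, 8, 16, so i = 4 and indeed i <= log2(20) ~ 4.32. The body runs i + 1 = 5 times, giving roughly 5 * 20 = 100 ~ n*log(n) total operations.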

If instead, the inner loop is

for(j=1;j<k;j++)

Then the inner loop will execute a total of

2^0 + 2^1 + ... + 2^i 

times across all iterations of the outer loop. Note how this is a series and not a sequence; in fact, it's the well-known geometric series. The sum of this series is given by

a(1 - r^(i + 1))/(1 - r) 

where a = 1 and r = 2. If you substitute log(n) for i (it is log(n) from the previous analysis), you get

2^(log(n) + 1) - 1

which is

2*n - 1
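As a quick check, take n = 16, so i = log2(16) = 4: the series is 1 + 2 + 4 + 8 + 16 = 31 = 2*16 - 1, matching the formula. With the inner loop bounded by k instead of n, the whole nest is therefore only O(n).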

Upvotes: 0

Martin Dinov

Reputation: 8825

In order to get the full complexity of the two loops, you need to multiply the outer loop's complexity by the inner loop's. This is done by thinking about what happens as n gets very large (technically, as n tends to infinity): how many operations would you have to perform in total on the sum or, alternatively, on the loop variables?

In this case, the outer loop executes log(n) times, because k grows exponentially: k doubles in each iteration of the outer loop, so after i iterations k = 2^i, and the loop stops once 2^i reaches n. A note on multiplying the loop variable like this: as per your question in the comments, if you multiply k by 3 instead, you still get log(n), only with base 3 rather than base 2. Asymptotically, the base of the logarithm is irrelevant (it only changes a constant factor), so you can simply say it is log(n) and ignore the base. The outer loop therefore has O(log(n)) complexity. The inner loop is still linear (it runs through n iterations), so multiplying the two gives the O(n*log(n)) complexity.
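For example, with n = 1000: doubling k visits 1, 2, 4, ..., 512, which is 10 iterations (log2(1000) ~ 9.97), while tripling k visits 1, 3, 9, ..., 729, which is 7 iterations (log3(1000) ~ 6.29). Since log3(n) = log2(n)/log2(3), the two counts differ only by a constant factor, which is exactly why the base is dropped inside the O().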

In summary: O(n log n)

Upvotes: 1
