Reputation: 545
I'm reading a book on algorithm analysis and have found an algorithm whose time complexity I can't work out, although the book says it is O(nlogn).
Here is the algorithm:
sum1 = 0;
for (k = 1; k <= n; k *= 2)
    for (j = 1; j <= n; j++)
        sum1++;
Upvotes: 4
Views: 6520
Reputation: 23
To add to @qwerty's answer: if a is the largest exponent the outer loop reaches, then k takes the values 1, 2, 4, ..., 2^a, with 2^a <= n.
Taking log_2 of both sides: log_2(2^a) <= log_2(n), i.e. a <= log_2(n).
So the outer loop runs at most log_2(n) + 1 times, i.e. it is bounded by O(log n).
Upvotes: 0
Reputation: 521864
Perhaps the easiest way to convince yourself of the O(n*lgn) running time is to run the algorithm on a sheet of paper. Consider what happens when n is 64. Then the outer loop variable k would take the following values:
1 2 4 8 16 32 64
log_2(64) is 6, so the list above has 6 + 1 = 7 terms. You can continue this line of reasoning to conclude that the outer loop takes O(lgn) running time.
The inner loop, which is completely independent of the outer loop, is O(n). Multiplying these two terms together yields O(n*lgn).
Upvotes: 7
Reputation: 116
To add a bit of mathematical detail...
Let a be the number of times the outer loop for(k=1; k<=n; k*=2) runs. Since k doubles on every step (note the loop increment k*=2), after a iterations k has reached 2^a, and the loop stops once k exceeds n. So roughly n = 2^a. Solving for a by taking the base-2 log of both sides gives a = log_2(n).
Since the inner loop runs n times per outer iteration, the total is O(nlog_2(n)).
Upvotes: 4
Reputation: 15982
In your first loop for(k=1; k<=n; k*=2), the variable k reaches the value of n in about log n steps, since its value doubles on each step.
The second loop for(j=1; j<=n; j++) is just a linear loop, so it requires n steps.
Since the loops are nested, the total time is O(nlogn).
Upvotes: 6