hs2202

Reputation: 19

Calculating time complexity for simple programs

I am new to programming and I came across this problem in my text book.

I have to find the worst-case running time, using Theta notation, for this program:

1   i = 1, total = 0
2   while i < n/2 :
3       i = i*2
4       total = total + A[i]

What I understand is that we need to see how many times each line is executed.

The first line runs only once.

The while loop (lines 2 to 4) runs multiple times, and it stops once i >= n/2.

And i doubles in each iteration of the loop (line 3).

So will the total number of iterations be sqrt(n/2 * 1/i)?
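
To sanity-check my guess I also tried counting the iterations directly for a few values of n (just a rough sketch I wrote; count_iterations is a name I made up, and I left out the A[i] access since it doesn't affect the count):

def count_iterations(n):
    # Same loop as in the exercise, but only counting how many
    # times the body runs; the array access is irrelevant for the count.
    i, count = 1, 0
    while i < n / 2:
        i = i * 2
        count += 1
    return count

for n in [10, 100, 1000, 10000]:
    print(n, count_iterations(n))

The count grows very slowly (3, 6, 9, 13 for these inputs), so my sqrt guess doesn't look right, but I don't know how to derive the exact bound.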

Please help me out with this.

Upvotes: 0

Views: 77

Answers (3)

dfrib

Reputation: 73206

When we're studying the time complexity of an algorithm, we're interested in the asymptotic behaviour of the function. Hence, we can, without loss of generality, assume that n is somewhat large (used in n/2 - 1 ~= n/2 below).

Now, in the worst-case scenario, the while loop just barely squeezes in a final run; i.e. the loop is terminated after the iteration with i = n/2 - 1 has finished.

We can then study the number of iterations in the loop using sigma notation:

Since i takes the values 2^0, 2^1, 2^2, ..., and the body runs for every k with 2^k < n/2, the number of iterations is

sum_{k=0}^{ceil(log_2(n/2)) - 1} 1 = ceil(log_2(n/2)) ~= log_2(n) - 1

Hence, the algorithm runs in Θ(log_2 n).
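
For instance, with n = 1024 the loop body runs for i = 1, 2, 4, ..., 256 (every power of two below n/2 = 512), i.e. log_2(1024) - 1 = 9 iterations, in line with the bound above.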

Upvotes: 0

FriedSaucePots

Reputation: 1352

It will be log(n) (log base 2)


Explanation (this is going to be a little rough):

1) In each iteration you multiply i by 2, starting from i = 1. After x iterations of the loop, i = 2^x.

So the question is: how many loop iterations x occur, in terms of n? We know the loop terminates when i >= n/2, and we know that after x iterations i = 2^x. We can thus substitute 2^x for i to get the terminating condition 2^x >= n/2.

Now we have an equation that we can solve for x in terms of n:

2^x >= n/2 terminating condition

log(2^x) >= log(n/2) take log (base 2) of both sides

x >= log(n/2) left-hand side simplifies to x

When the number of iterations x is greater than or equal to log(n/2), the loop will terminate. So we expect the loop to run for ceil(log(n/2)) iterations. This gives us a big-O runtime of O(log(n/2)) = O(log(n)).
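
A quick way to check this (just a rough sketch; simulate and the sample values of n are mine) is to compare the actual iteration count with ceil(log(n/2)):

import math

def simulate(n):
    # Run the loop from the question and count how many times the body executes.
    i, iterations = 1, 0
    while i < n / 2:
        i = i * 2
        iterations += 1
    return iterations

for n in [4, 10, 64, 1000, 2**20]:
    predicted = max(0, math.ceil(math.log2(n / 2)))
    print(n, simulate(n), predicted)

For each of these n the simulated count matches the predicted ceil(log(n/2)).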


Just noticed this asks for big-Theta; I'll leave this here anyway.

Upvotes: 0

If we suppose that an assignment, an addition, a multiplication, or a condition each cost 1, the result is 3 + 5n/2: the loop contributes 5 per pass over n/2 passes, and the final condition check plus the initial assignments account for the remaining 3.

1   i = 1, total = 0          -> 2
2   while i < n/2 :           -> 1 + n
3       i = i*2               -> 2n
4       total = total + A[i]  -> 2n
5   the loop contribution is then divided by 2.
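
For illustration, here is roughly how that counting convention could be checked by instrumenting the program (just a sketch; count_operations and the per-line charges are my own, following the rule above that each assignment, arithmetic operation, or condition costs 1):

def count_operations(n):
    # Charge 1 unit per assignment, arithmetic operation, or condition check.
    ops = 2                     # line 1: i = 1, total = 0 (two assignments)
    i, total = 1, 0
    A = [0] * (n + 1)           # dummy array so A[i] is defined (not charged)
    while True:
        ops += 1                # line 2: one condition check
        if not (i < n / 2):
            break
        i = i * 2
        ops += 2                # line 3: one multiplication, one assignment
        total = total + A[i]
        ops += 2                # line 4: one addition, one assignment
    return ops

for n in [8, 32, 128]:
    print(n, count_operations(n))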

Sorry if my grammar has mistakes; I am not a native speaker.

Upvotes: 0
