lewicki

Reputation: 479

Calculating the running time of insertion sort

INSERTION_SORT (A)

FOR j ← 2 TO length[A]           // c1 * n
    key ← A[j]                   // c2 * (n-1)
    i ← j − 1                    // c3 * (n-1)
    WHILE i > 0 and A[i] > key   // c4 * Σ(j=2 to n) t_j
         A[i + 1] ← A[i]         // c5 * Σ(j=2 to n) (t_j − 1)
         i ← i − 1               // c6 * Σ(j=2 to n) (t_j − 1)
    A[i + 1] ← key               // c7 * (n-1)

c1, c2, c3 and c7 make total sense. What doesn't make sense is why, in the worst case:

c4 * Σ(j=2 to n) t_j    becomes    c4 * Σ(j=2 to n) j

Let's say we are calculating the worst case for an array of size 5. What the above says is that the time spent on line 4 is:

c4*(1+2+3+4)

When in reality it is:

c4*((1)+(1+2)+(1+2+3)+(1+2+3+4))

What am I missing? I know the book has to be right.

EDIT: I screwed up, it should be:

 c4*((1)+(1+1)+(1+1+1)+(1+1+1+1))
=c4*(1+2+3+4)

Upvotes: 0

Views: 1414

Answers (1)

rici

Reputation: 241931

But the time on line 4 is 2 + 3 + 4 + 5, not 1 + 2 + 3 + 4. The first time through the FOR loop (j = 2), the test runs twice (once with i = 1 and once with i = 0, where it fails and exits). The second time (j = 3), it runs three times (i = 2, i = 1, i = 0), and so on. So t_j = j in the worst case.
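One way to convince yourself is to instrument the loop and count the tests directly. Here is a minimal Python sketch (the function name and the translation from the book's 1-indexed pseudocode to 0-indexed lists are mine, not from the book):

```python
def insertion_sort_counting(A):
    """Insertion sort that counts how many times the WHILE condition
    (line 4 of the pseudocode) is evaluated for each value of j."""
    A = list(A)
    tests = []                      # tests[k] holds t_j for j = k + 2
    for j in range(1, len(A)):      # j = 2 .. n in the book's 1-indexed terms
        key = A[j]
        i = j - 1
        t = 0
        while True:
            t += 1                  # one evaluation of "i > 0 and A[i] > key"
            if not (i >= 0 and A[i] > key):
                break               # condition false: loop test fails and exits
            A[i + 1] = A[i]
            i -= 1
        A[i + 1] = key
        tests.append(t)
    return A, tests

# Worst case: a reverse-sorted array of size 5
sorted_A, tests = insertion_sort_counting([5, 4, 3, 2, 1])
print(sorted_A)    # [1, 2, 3, 4, 5]
print(tests)       # [2, 3, 4, 5]  -> t_j = j for each j
print(sum(tests))  # 14 = 2 + 3 + 4 + 5
```

On the reverse-sorted input the test counts come out as 2, 3, 4, 5, matching t_j = j rather than the 1 + 2 + 3 + 4 in the question.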

Upvotes: 1
