SOAP

Reputation: 179

Analysing running time, Big O

I am having problems with analysing the time complexity of an algorithm.

For example, take the following Haskell code, which sorts a list.

sort xs
  | isSorted xs = xs
  | otherwise   = sort (check xs)
  where
    isSorted xs = all (==True) (zipWith (<=) xs (drop 1 xs))
    check []  = []
    check [x] = [x]
    check (x:y:xs)
      | x <= y    = x : check (y:xs)
      | otherwise = y : check (x:xs)

So, with n being the length of the list and t_isSorted(n) the running-time function: drop takes constant time, t_drop(n) = c, while t_all(n) = n and t_zipWith(n) = n, so:

   t_isSorted(n) = c + n + n

For t_check:

   t_check(1) = c1

   t_check(n) = c2 + t_check(n-1),  where the constant c2 covers comparing and possibly swapping one element
        ...
   t_check(n) = i*c2 + t_check(n-i),  with i = n-1
              = (n-1)*c2 + t_check(1)
              = n*c2 - c2 + c1

And how exactly do I have to combine those to get t_sort(n)? I guess that in the worst case, sort xs has to run n-1 times.
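A quick way to sanity-check that guess is to instrument the loop and count how many times check actually runs. This is a sketch: sortPasses is a hypothetical helper, not part of the original code, but its isSorted and check are copied verbatim from the question.

```haskell
-- Sketch: count how many passes (calls to `check`) the sort performs.
-- `sortPasses` is a hypothetical instrumented variant of the original `sort`.
sortPasses :: Ord a => [a] -> ([a], Int)
sortPasses = go 0
  where
    go k xs
      | isSorted xs = (xs, k)               -- sorted after k passes
      | otherwise   = go (k + 1) (check xs)
    isSorted xs = all (==True) (zipWith (<=) xs (drop 1 xs))
    check []  = []
    check [x] = [x]
    check (x:y:xs)
      | x <= y    = x : check (y:xs)
      | otherwise = y : check (x:xs)

main :: IO ()
main = print (sortPasses [5,4,3,2,1])  -- a reversed list is the worst case
-- ([1,2,3,4,5],4)
```

For a reversed list of length n this reports n - 1 passes; each pass costs O(n), which is where the O(n^2) total comes from.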

Upvotes: 1

Views: 203

Answers (2)

kraskevich

Reputation: 18556

The time complexity is O(n^2).

You're right that one step takes O(n) time (for both the isSorted and the check function). The step is called no more than n times (in fact at most n - 1, though that doesn't matter for the asymptotic complexity): after the first call the largest element is guaranteed to be last, after the second call the second largest is in place, and in general we can prove that after k calls the last k elements are the largest and properly sorted. Since check swaps only adjacent elements, each swap removes exactly one inversion. As the number of inversions is O(n^2) in the worst case (namely, n * (n - 1) / 2), the time complexity is O(n^2).
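The inversion count used in that argument can be made concrete. Here is a small sketch (inversions is a hypothetical helper, not from the answer above) that counts out-of-order pairs; a reversed list attains the n * (n - 1) / 2 worst case:

```haskell
-- Sketch: count inversions, i.e. pairs where x appears before y but x > y.
-- `inversions` is a hypothetical helper for illustration only.
inversions :: Ord a => [a] -> Int
inversions xs = length [ () | (i, x) <- zip [0 ..] xs, y <- drop (i + 1) xs, x > y ]

main :: IO ()
main = do
  print (inversions [3, 2, 1])  -- 3 = 3 * 2 / 2, the worst case for n = 3
  print (inversions [1, 2, 3])  -- 0: a sorted list has no inversions
```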

Upvotes: 1

Drathier

Reputation: 14519

isSorted is indeed O(n), since it's dominated by zipWith, which in turn is O(n) because it makes a single linear pass over its argument.

check itself is O(n), since it calls itself only once per execution and always removes a constant number of elements from the list. The fastest comparison-based sorting algorithm (without knowing anything more about the list) runs in O(n*log(n)) (equivalently, O(log(n!))) time. There's a mathematical proof of this lower bound, and one pass of check is faster than that, so a single pass cannot possibly sort the whole list.

check only moves things one step; it's effectively a single pass of bubble sort.

Consider sorting this list: [3,2,1]

check [3,2,1] = 2:(check [3,1]) -- since 3 > 2
check [3,1] = 1:(check [3]) -- since 3 > 1
check [3] = [3] 

which would return the "sorted" list [2,1,3].

Then, as long as the list is not sorted, we loop. Since each pass might put only one element into its correct position (as happened with 3 in the example above), we might need O(n) loop iterations.
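Those iterations can be watched directly by applying check repeatedly. A sketch, reusing the check definition from the question:

```haskell
-- Sketch: apply one bubble pass (`check`, copied from the question) repeatedly
-- and print the intermediate lists until the sorted fixed point is reached.
check :: Ord a => [a] -> [a]
check []  = []
check [x] = [x]
check (x:y:xs)
  | x <= y    = x : check (y:xs)
  | otherwise = y : check (x:xs)

main :: IO ()
main = mapM_ print (take 3 (iterate check [3, 2, 1]))
-- [3,2,1]
-- [2,1,3]   -- only 3 reached its final position
-- [1,2,3]
```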

This gives a total time complexity of O(n) * O(n) = O(n^2).

Upvotes: 3
