Citut

Reputation: 907

Execution time of these loops?

I am studying for an exam and we have some practice problems. Solutions to the practice exam aren't given until two hours before the actual exam, so I have no way to see if I actually understand these concepts. The questions and my work are below.

Worst-case execution time of the following nested loops:

for(int i = 0; i < n - 1; i++)
{
    for(int j = 0; j < n - i - 1; j++)
    {
        // do some comparisons (this is just O(1))
    }
}

The outer loop has n-1 iterations, I'm not sure about the inner loop, because it is dependent on i. When i is 0, the inner loop has n-1 iterations, when i is 1 the inner loop has n-2 iterations. My answer for the worst case execution time of this algorithm was O(n^2), is this correct?
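One way to sanity-check this kind of answer (just a quick sketch, not something you'd do on the exam) is to count the inner-loop iterations directly and compare against the closed form n(n-1)/2; the class and method names here are made up for illustration:

```java
public class NestedLoopCount {
    // Count how many times the O(1) body runs for a given n.
    static long countIterations(int n) {
        long count = 0;
        for (int i = 0; i < n - 1; i++) {
            for (int j = 0; j < n - i - 1; j++) {
                count++; // stands in for the O(1) comparison
            }
        }
        return count;
    }

    public static void main(String[] args) {
        for (int n : new int[]{5, 10, 100}) {
            long actual = countIterations(n);
            long formula = (long) n * (n - 1) / 2; // (n-1) + (n-2) + ... + 1
            System.out.println("n=" + n + ": counted=" + actual
                    + ", n(n-1)/2=" + formula);
        }
    }
}
```

For every n the counted value matches n(n-1)/2, which is quadratic in n, supporting the O(n^2) answer.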

int n = arr.length;
bar(arr);          // function bar takes O(n^2)
while (n > 0)
{
    foo1(arr);     // function foo1 takes O(n)
    foo2(arr);     // function foo2 takes O(n log n)
    n = n - 2;
}

In this case, the while loop iterates n/2 times (n decreases by 2 every iteration). Because foo1 and foo2 are both in the loop, they collectively take O(n^2*log n). The function bar outside of the loop takes O(n^2), so collectively the three functions take O(n^2*log n + n^2) (not including the while loop). I'm not sure how to include the while loop in this case: would it be multiplied by the two inner functions or added?
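To check the n/2 claim, you can count the while-loop iterations on their own (a throwaway sketch; the counter stands in for one foo1 + foo2 pair per pass, and the names are invented):

```java
public class WhileLoopIterations {
    // Count how many times the while body executes for a starting n.
    static int countWhile(int n) {
        int count = 0;
        while (n > 0) {
            count++;   // one foo1(arr) + foo2(arr) pair per iteration
            n = n - 2;
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(countWhile(10)); // even n: exactly n/2
        System.out.println(countWhile(11)); // odd n: ceil(n/2)
    }
}
```

Either way the iteration count is about n/2, i.e. O(n), so whatever the loop body costs gets multiplied by that, not added.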

Upvotes: 0

Views: 297

Answers (2)

GalAbra

Reputation: 5148

  1. In your first question, notice that the outer loop runs O(n) times, which makes the total number of inner-loop iterations the sum of the following:

    (n-1) + (n-2) + ... + (1)
    

    Which is an arithmetic sequence with sum:

    S = (a1 + ak)*(k/2) = ((n-1) + 1)*((n-1)/2) = n(n-1)/2 = O(n^2)
    

    (here k = n-1 is the number of terms), and therefore the overall run-time is indeed O(n^2).

  2. By definition, T(n) can be described as O(f(n)) if there exists a constant c such that T(n) <= c * f(n) for all n beyond some point. Hence, the "faster-growing" term determines f(n).

    In your case:

    n^2*log(n) + n^2 = n^2 * (log(n) + 1) <= c * n^2 * log(n)
    

    (As n grows, 1/log(n) shrinks toward 0, so any constant c >= 1 + 1/log(2), e.g. c = 2, satisfies c >= 1 + 1/log(n) for every n >= 2.)
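This inequality can be spot-checked numerically with c = 2 and base-2 logarithms (a quick sketch; the class and method names are made up):

```java
public class DominanceCheck {
    // Does n^2 * (log2(n) + 1) <= c * n^2 * log2(n) hold for this n?
    static boolean dominated(long n, double c) {
        double log2n = Math.log(n) / Math.log(2);
        return (double) n * n * (log2n + 1) <= c * (double) n * n * log2n;
    }

    public static void main(String[] args) {
        for (long n = 2; n <= 1_000_000; n *= 10) {
            System.out.println("n=" + n + ": " + dominated(n, 2.0));
        }
    }
}
```

With c = 2 the inequality holds from n = 2 onward, which is exactly what the Big-O definition asks for; a constant like c = 1 fails for small n, showing the choice of c matters.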

Upvotes: 0

Tim Han

Reputation: 166

First one is correct: O(n^2). For the second one, technically speaking, the runtime of foo1 compared to foo2 is not significant as n grows very large; it's a simple limit comparison as n approaches infinity. The while loop contributes O(n^2*log n): as you said, it runs n/2 iterations, so n/2 * n log n gives n^2 log n, and just as before the constant 1/2 is not significant as n grows extremely large. The bar function's O(n^2) will also be insignificant compared to O(n^2*log n) as n grows very large. So the runtime of the second problem is O(n^2*log n).

The same goes for your final guess: if you compare the two terms, n^2*log n grows much faster than n^2 as n becomes very large, so you reach the same conclusion.
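You can see the gap widen by looking at the ratio of the two terms, (n^2 log n) / n^2 = log n, which grows without bound (a tiny illustrative sketch, names invented):

```java
public class GrowthRatio {
    // Ratio of n^2 * log2(n) to n^2, which is just log2(n).
    static double ratio(int n) {
        return Math.log(n) / Math.log(2);
    }

    public static void main(String[] args) {
        // As n grows, the ratio keeps climbing, so the n^2 term
        // from bar() never catches up to n^2 log n.
        for (int n : new int[]{16, 1024, 1 << 20}) {
            System.out.println("n=" + n + ": ratio=" + ratio(n));
        }
    }
}
```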

Upvotes: 1