DRock99

Reputation: 15

Determine runtime complexity of following algorithm

I have to determine the runtime complexity of the following algorithm (see picture). I know how to do so theoretically, but for this specific example I'm having a hard time, and I also have no sample solution to check whether I'm correct.

If someone could review my answer, correct it, and maybe even give an explanation of the correct solution, that would be greatly appreciated.

    Overall --> Θ(n^2)
    for(int i=42; i<n*Math.log(n); i++)Θ(n)  
        foo(i);                        Θ(n*log2(n))
        for(int j=1; j<4711; j=j*3)    Θ(n)
            bar(i*j);                  Θ(log2(n))

foo(int n)                             --> Θ(n*log2(n))
    if(n%2 == 0)                       Θ(1)
        for(int i=1; i<2*n; i=i+2)     Θ(n)
            bar(i);                    Θ(log2(n))
    else                               --> Θ(n*log2(n))
        for(int i=1; i<n; i++)         Θ(n)
            bar(n);                    Θ(log2(n))
bar(int m)                              --> Θ(log2(n))
     while(m>0)                         Θ(log2(n))
         m = m/4;                       Θ(1)

Upvotes: 0

Views: 193

Answers (1)

Muhteva

Reputation: 2832

  1. Time complexity of bar(int m) is O(log(m)). It is actually log_4(m), but that converts to base 2 as log_4(m) = log_2(m) / log_2(4), and log_2(4) is a constant. Therefore it is O(log(m)). Your analysis is correct.
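To see this concretely, here is a minimal sketch (my own, not from the post) that instruments bar with an iteration counter; every quadrupling of m adds exactly one loop iteration, which is the log_4(m) behaviour:

```java
// Hypothetical instrumented version of bar: returns how many times
// the while loop runs for a given m.
public class BarCount {
    public static int barIterations(int m) {
        int count = 0;
        while (m > 0) {
            m = m / 4;       // same update as in bar(int m)
            count++;
        }
        return count;        // floor(log_4(m)) + 1 for m >= 1
    }

    public static void main(String[] args) {
        // Multiplying m by 4 adds exactly one iteration.
        System.out.println(barIterations(4));     // 2 iterations
        System.out.println(barIterations(16));    // 3 iterations
        System.out.println(barIterations(64));    // 4 iterations
    }
}
```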

  2. Time complexity of foo(int n) is calculated like this: in the even branch, the loop starts at 1 and goes up to 2n in steps of 2, which takes n iterations. In the odd branch, the loop starts at 1 and goes up to n one by one, which also takes n iterations. The complexity of bar(n) is O(log(n)), but the complexity of bar(i) depends on i, not n: it is O(log(i)). So the total number of operations can be calculated like this:
    If n is even, the total number of operations is:
    log(1) + log(3) + ... + log(2n-1) = A
    Now, we know that A <= n*log(2n) = O(nlogn).
    We also know that A >= log(1) + log(2) + ... + log(n) = log(n!) = Θ(nlogn), (*)
    since each odd term 2k-1 is at least k.
    Therefore A is squeezed between Ω(nlogn) and O(nlogn), so A = Θ(nlogn).
    If n is odd, then it is:
    log(n) + log(n) + ... + log(n) = nlog(n).
    Your time complexity analysis for the second part is completely correct.
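A quick numeric check of the squeeze above (a sketch I added, not part of the original answer): summing log_2(i) over the odd arguments that foo's even branch passes to bar, then dividing by the upper bound n*log_2(2n), gives a ratio that stays between fixed constants as n grows, consistent with Θ(nlogn):

```java
// Sketch: compare the even-branch cost of foo, A = sum of log2(i)
// over i = 1, 3, ..., 2n-1, against the upper bound n * log2(2n).
public class FooBound {
    public static double sumLogOdd(int n) {
        double sum = 0;
        for (int i = 1; i < 2 * n; i += 2) {
            sum += Math.log(i) / Math.log(2);  // log2(i); log2(1) = 0
        }
        return sum;
    }

    public static void main(String[] args) {
        for (int n : new int[]{1_000, 10_000, 100_000}) {
            double bound = n * (Math.log(2.0 * n) / Math.log(2));
            // The ratio stays bounded away from 0 and below 1.
            System.out.printf("n=%d  A/bound=%.3f%n", n, sumLogOdd(n) / bound);
        }
    }
}
```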

  3. The outer loop calls foo(i) approximately nlog(n) times. In each outer iteration, the inner loop calls bar(i*j) 8 times (j = 1, 3, 9, ..., 2187 < 4711). Notice that as n increases, this count doesn't change, so we can ignore j: it is not a variable that depends on the input size. The time complexity of bar(i*j) is proportional to log(i*j) = log(i) + log(j) = O(log(i)).
    So overall, if i is even, the number of operations is:
    ilog(i) + 8 * log(i).
    It's the same when i is odd. In total:
    Sum of ilog(i) from i=42 to i=nlog(n):
    42log(42) + 43log(43) + ... + nlog(n)log(nlog(n)) <= nlog(n)*nlog(n)*log(nlog(n)) = O(n^2 * log(n)^3)
    The time complexity of the whole function is O(n^2 * log(n)^3).
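As a sanity check (my own sketch, assuming the pseudocode above translates directly to Java), we can run the whole algorithm with a global counter for bar's loop iterations. Doubling n should multiply the count by roughly 4 times some log factors, which matches O(n^2 * log(n)^3):

```java
// Sketch: the question's pseudocode translated to Java, with a counter
// for the total number of iterations of bar's while loop.
public class TotalCount {
    static long ops;

    static void bar(long m) {
        while (m > 0) {
            m = m / 4;
            ops++;
        }
    }

    static void foo(int n) {
        if (n % 2 == 0) {
            for (int i = 1; i < 2 * n; i = i + 2) bar(i);
        } else {
            for (int i = 1; i < n; i++) bar(n);
        }
    }

    public static long run(int n) {
        ops = 0;
        for (int i = 42; i < n * Math.log(n); i++) {
            foo(i);
            for (int j = 1; j < 4711; j = j * 3) bar((long) i * j);
        }
        return ops;
    }

    public static void main(String[] args) {
        long a = run(200);
        long b = run(400);
        // The growth factor sits between 4 (the n^2 part) and 8,
        // the extra coming from the log factors.
        System.out.println(a + " -> " + b + ", factor " + (double) b / a);
    }
}
```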

(*) If you wonder how we switched from log(n!) to nlog(n), refer to: Is log(n!) = Θ(n·log(n))?

Upvotes: 1
