Sam Vanspringel

Reputation: 29

Can someone calculate the time complexity in big Oh notation

Can someone help me understand the big-O complexity of this algorithm? I've been trying for an hour, but I don't understand the complexity of the recursion. I know the first 3 statements are O(1), and I thought the two recursive calls underneath were O(n/2) each, so O(n) together, but I'm not sure.

def sum(a):
    if len(a) == 0:
        return 0
    elif len(a) == 1:
        return a[0]
    m = len(a) // 2

    b = sum(a[:m])
    c = sum(a[m:])

    return b + c

Upvotes: 2

Views: 207

Answers (2)

meowgoesthedog

Reputation: 15035

According to Python's time complexity documentation, slicing a list is O(k) where k is the length of the slice.

The function calls itself recursively on the two slices, each with size n / 2 (ignoring off-by-one due to integer rounding). The recurrence relation is thus:

T(n) = 2 T(n / 2) + O(n)

... where O(n) comes from the two slicing operations a[:m] and a[m:].

According to the Master Theorem, this is O(n log n).


To make the algorithm O(n), as it should be, one could pass sub-array indices instead of slicing the array, or use numpy's array type, whose slices are O(1) views rather than copies.
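For example, a sketch of the index-based variant (the name `sum_range` and the `lo`/`hi` parameters are my own; the original just slices):

```python
def sum_range(a, lo=0, hi=None):
    """Recursively sum a[lo:hi] without creating slices, so each
    call does O(1) work and the whole thing is O(n)."""
    if hi is None:
        hi = len(a)
    n = hi - lo
    if n == 0:
        return 0
    if n == 1:
        return a[lo]
    m = lo + n // 2  # midpoint of the current range
    return sum_range(a, lo, m) + sum_range(a, m, hi)
```

The recurrence then becomes T(n) = 2 T(n / 2) + O(1), which solves to O(n) by the Master Theorem.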

Upvotes: 2

abc

Reputation: 11929

If you consider the number of times the function is called recursively, for a list of n ≥ 1 elements this number is 2n − 1. Each call performs a constant number of constant-time operations, so O(1) work per call.

To conclude the time complexity is O(n), linear to the length of the list.

You may try to define a recurrence relation in order to prove it formally.

Note
The question may be tricky because of the slicing operation used at each call. Its cost depends on implementation details (in Python it looks like it is O(k), with k the length of the slice, but in numpy it should be O(1), as pointed out in the comments) and can easily be avoided by passing indexes (e.g., start, end) rather than a sliced list.
Here I assume the operation takes constant time.
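A quick way to check the 2n − 1 call count is to instrument the recursion (a sketch; `sum_counted` and the mutable `calls` counter are my additions, not part of the original code):

```python
def sum_counted(a, calls):
    """Same recursion as in the question, but counting calls
    via a one-element list used as a mutable counter."""
    calls[0] += 1
    if len(a) == 0:
        return 0
    if len(a) == 1:
        return a[0]
    m = len(a) // 2
    return sum_counted(a[:m], calls) + sum_counted(a[m:], calls)

calls = [0]
total = sum_counted(list(range(8)), calls)
# For n = 8 elements the recursion makes 2*8 - 1 = 15 calls.
```

The call tree has n leaves (one per element) and n − 1 internal nodes (one per split), which is where 2n − 1 comes from.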

Upvotes: 3
