sprksh

Reputation: 2384

Time complexity of slicing a list in two in Python

A slicing operation like some_list[i:i+k] takes O(k) time, since it has to iterate from i to i+k while creating the new list, as per this.

I am confused about the time complexity of splitting a list with a pair of slices like chunk_1, chunk_2 = some_list[:chunk_size], some_list[chunk_size:].

Correspondingly, what should be the overall time complexity of this kind of operation:

while some_list:
    chunk, some_list = some_list[:chunk_size], some_list[chunk_size:]

I suppose that in this operation the cost of copying the elements into the new lists also adds to the overall cost.
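To see this concretely, here is a minimal sketch (the total_copied helper is hypothetical, purely for illustration) that tallies how many elements the loop copies, assuming each slice copies every element it includes:

def total_copied(n, c):
    """Count elements copied when chunking a list of length n by size c."""
    copied = 0
    remaining = n
    while remaining:
        chunk = min(c, remaining)
        copied += chunk              # some_list[:c] copies the chunk
        copied += remaining - chunk  # some_list[c:] copies the tail
        remaining -= chunk
    return copied

print(total_copied(1000, 10))   # 50500, roughly n**2 / (2 * c)
print(total_copied(10000, 10))  # 5005000

Each pass copies the entire remaining list, so the total is n + (n - c) + (n - 2c) + ... ≈ n^2 / (2c), which is quadratic in n.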

Is there a better way for breaking a large list in evenly sized chunks?

Update:

Did some profiling to check whether the while loop is an O(n^2) operation. Adding the results:

In [1]: def chunk(l, c):
   ...:     while l:
   ...:         l_c, l = l[:c], l[c:]
   ...:

In [2]: l = list(range(1000))

In [3]: %timeit chunk(l, 10)
134 µs ± 702 ns per loop (mean ± std. dev. of 7 runs, 10000 loops each)

In [4]: l = list(range(10000))

In [5]: %timeit chunk(l, 10)
16.1 ms ± 99.8 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)

In [6]: l = list(range(100000))

In [7]: %timeit chunk(l, 10)
1.91 s ± 14 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)

Each 10x increase in list size makes the loop roughly 120x slower, which is consistent with O(n^2). In terms of time complexity, can someone suggest a better way? The data in the list is not numerical, so I can't use NumPy.

Upvotes: 2

Views: 802

Answers (1)

Tasnuva Leeya

Reputation: 2795

You can use a generator. A generator will be much more efficient, since it yields one chunk at a time instead of re-copying the remaining list on every iteration:

def chunks(lst, n):
    """Yield successive n-sized chunks from lst."""
    for i in range(0, len(lst), n):
        yield lst[i:i + n]
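A quick usage sketch (the data list here is just a stand-in):

data = list(range(10))
for chunk in chunks(data, 3):
    print(chunk)
# [0, 1, 2]
# [3, 4, 5]
# [6, 7, 8]
# [9]

Since each element is copied only once across all chunks, chunking the whole list this way is O(n) overall.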

See the original answer here: How do you split a list into evenly sized chunks?

Upvotes: 1
