Reputation: 349
I have the following situation. Say I have a variable batch_size and a list called data. I want to pull batch_size elements out of data, so that when I hit the end I wrap around. In other words:
data = [1, 2, 3, 4, 5]
batch_size = 4
-> [1, 2, 3, 4], [5, 1, 2, 3], [4, 5, 1, 2], ...
Is there some nice idiomatic way of returning slices like this? The start index is always batch_size * batch modulo the length of data, but is there a simple way of "wrapping around" from the beginning if batch_size * (batch + 1) goes beyond the length of the list? I can of course patch together two slices in this case, but I was hoping that there's some really clean way of doing this. The only assumption I'm making is that batch_size < len(data).
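For reference, the "patch together two slices" version would look roughly like this sketch (the helper name wrap_slice is just something I'm using here for illustration):

def wrap_slice(data, start, size):
    # Take `size` elements starting at `start`, wrapping past the end.
    start = start % len(data)
    end = start + size
    if end <= len(data):
        return data[start:end]
    # Patch together the tail of the list and the needed prefix.
    return data[start:] + data[:end - len(data)]

data = [1, 2, 3, 4, 5]
batch_size = 4
print([wrap_slice(data, batch * batch_size, batch_size) for batch in range(3)])
# [[1, 2, 3, 4], [5, 1, 2, 3], [4, 5, 1, 2]]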
Upvotes: 1
Views: 244
Reputation: 9267
You can also use deque from the collections module and rotate the deque one step per batch, like in this example:
from collections import deque

def grouper(iterable, elements, rotations):
    if elements > len(iterable):
        return []
    b = deque(iterable)
    for _ in range(rotations):
        # Take the first `elements` items of the current arrangement.
        yield list(b)[:elements]
        # Rotate by one so the window shifts for the next batch.
        b.rotate(1)

data = [1, 2, 3, 4, 5]
elements = 4
rotations = 5
final = list(grouper(data, elements, rotations))
print(final)
Output:
[[1, 2, 3, 4], [5, 1, 2, 3], [4, 5, 1, 2], [3, 4, 5, 1], [2, 3, 4, 5]]
Upvotes: 2
Reputation: 54223
You could use itertools.cycle and the grouper recipe from the itertools documentation:
import itertools

def grouper(iterable, n, fillvalue=None):
    "Collect data into fixed-length chunks or blocks"
    # grouper('ABCDEFG', 3, 'x') --> ABC DEF Gxx
    args = [iter(iterable)] * n
    return itertools.zip_longest(*args, fillvalue=fillvalue)

data = [1, 2, 3, 4, 5]
batch_size = 4
how_many_groups = 5

# cycle(data) repeats the data endlessly, so the groups wrap around.
groups = grouper(itertools.cycle(data), batch_size)
chunks = [next(groups) for _ in range(how_many_groups)]
The result of chunks is then:
[(1, 2, 3, 4),
(5, 1, 2, 3),
(4, 5, 1, 2),
(3, 4, 5, 1),
(2, 3, 4, 5)]
So if you actually need those as lists, you'll have to convert them yourself ([list(next(groups)) for ...]).
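A minimal sketch of that conversion, reusing the names from the snippet above:

groups = grouper(itertools.cycle(data), batch_size)
chunks = [list(next(groups)) for _ in range(how_many_groups)]
# [[1, 2, 3, 4], [5, 1, 2, 3], [4, 5, 1, 2], [3, 4, 5, 1], [2, 3, 4, 5]]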
Upvotes: 2