Reputation: 8402
I would like to save the state of itertools.product() after my program quits. Is it possible to do this with pickling? My plan is to generate permutations, and if the process is interrupted (KeyboardInterrupt), resume from the same point the next time I run the program.
import itertools
import os
import pickle
import time

def trywith(itr):
    try:
        for word in itr:
            time.sleep(1)
            print("".join(word))
    except KeyboardInterrupt:
        # Save the partially consumed iterator before exiting
        f = open("/root/pickle.dat", "wb")
        pickle.dump(itr, f)
        f.close()

if os.path.exists("/root/pickle.dat"):
    # Resume from the saved iterator
    f = open("/root/pickle.dat", "rb")
    itr = pickle.load(f)
    f.close()
    trywith(itr)
else:
    try:
        itr = itertools.product('abcd', repeat=3)
        for word in itr:
            time.sleep(1)
            print("".join(word))
    except KeyboardInterrupt:
        f = open("/root/pickle.dat", "wb")
        pickle.dump(itr, f)
        f.close()
Upvotes: 6
Views: 575
Reputation: 10465
Like Raymond's, this one creates a new iterator given the pools and the start tuple, but instead of generating the full product of the pools and throwing some of it away with islice, I only produce what's needed. This is done by splitting the remaining product into parts and chaining them. For example, with
pools = 'ABCDEFG', 'abcde', '123456'
start = 'D', 'c', '4'
I basically create and chain these three parts:
product('D', 'c', '456')
product('D', 'de', '123456')
product('EFG', 'abcde', '123456')
My solution, followed by Raymond's and testing:
def restart_product_at(start_group, *pools):
    pools = tuple(map(tuple, pools))
    ps = list(zip(start_group))  # each entry is a 1-tuple: the start element for that pool
    def parts():
        inc = 0
        for i in reversed(range(len(ps))):
            # Restrict pool i to the suffix starting at (or after) its start element
            ps[i] = pools[i][pools[i].index(*ps[i]) + inc:]
            yield product(*ps)
            ps[i] = pools[i]  # restore the full pool for the next, larger part
            inc = 1           # later parts must start strictly after the start element
    return chain.from_iterable(parts())
# Raymond's
def restart_product_at_Raymond(start_group, *pools):
    n = 0  # Position of the start_group
    for element, pool in zip(start_group, pools):
        n *= len(pool)
        n += pool.index(element)
    p = product(*pools)          # New fresh iterator
    next(islice(p, n, n), None)  # Advance n steps ahead
    return p
# Testing
from itertools import *
pools = 'ABCDEFG', 'abcde', '123456'
start = 'D', 'c', '4'
expect = list(restart_product_at_Raymond(start, *pools))
result = list(restart_product_at(start, *pools))
print(result == expect)
print(*map(''.join, expect))
print(*map(''.join, result))
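For a quicker spot-check than eyeballing the full printouts, comparing just the head of the sequence works as well (a small sketch reusing the pools, start, and result defined above; the expected values in the comments follow from those inputs):
head = list(islice(restart_product_at(start, *pools), 5))
print(*map(''.join, head))  # Dc4 Dc5 Dc6 Dd1 Dd2
print(len(result))          # 105 groups remain from ('D', 'c', '4') onward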
Upvotes: 0
Reputation: 110591
From the "works for me" department:
In [31]: a = product("abcd", repeat=3)
In [32]: next(a), next(a)
Out[32]: (('a', 'a', 'a'), ('a', 'a', 'b'))
In [33]: b = pickle.loads(pickle.dumps(a))
In [34]: next(b)
Out[34]: ('a', 'a', 'c')
In [35]: next(a)
Out[35]: ('a', 'a', 'c')
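Building on that, a minimal save-and-resume sketch for the loop in the question (assuming a CPython version where pickling itertools iterators is still supported; the state-file path simply reuses the one from the question):
import itertools
import os
import pickle
import time

STATE_FILE = "/root/pickle.dat"

def get_iterator():
    # Resume a previously saved iterator if one exists, otherwise start fresh.
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE, "rb") as f:
            return pickle.load(f)
    return itertools.product("abcd", repeat=3)

itr = get_iterator()
try:
    for word in itr:
        time.sleep(1)
        print("".join(word))
except KeyboardInterrupt:
    # The word being processed at the moment of interruption has already been
    # consumed from the iterator, so it is skipped when the saved state resumes.
    with open(STATE_FILE, "wb") as f:
        pickle.dump(itr, f)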
Upvotes: 1
Reputation: 226624
Perhaps the easiest thing to do is to save the next group that would be generated. Then a future run can rebuild a new product instance that starts with that group:
from itertools import islice, product

def restart_product_at(start_group, *pools):
    n = 0  # Position of the start_group
    for element, pool in zip(start_group, pools):
        n *= len(pool)
        n += pool.index(element)
    p = product(*pools)          # New fresh iterator
    next(islice(p, n, n), None)  # Advance n steps ahead
    return p
For example:
>>> p = restart_product_at(('c', 'e', 'm'), 'abcd', 'efg', 'hijklm')
>>> next(p)
('c', 'e', 'm')
>>> next(p)
('c', 'f', 'h')
>>> next(p)
('c', 'f', 'i')
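One way to wire this into the save/resume loop from the question (a sketch, not part of the answer: the JSON state file is an assumption, restart_product_at is the function above, and the group being processed at interruption is repeated once on resume):
import json
import os
import time
from itertools import product

STATE_FILE = "next_group.json"    # illustrative path
POOLS = ("abcd", "abcd", "abcd")  # same pools on every run

if os.path.exists(STATE_FILE):
    with open(STATE_FILE) as f:
        start = tuple(json.load(f))
    p = restart_product_at(start, *POOLS)  # resume at the saved group
else:
    p = product(*POOLS)

for word in p:
    # Record the group being worked on; if interrupted, the next run
    # restarts at exactly this group (so it is printed again once).
    with open(STATE_FILE, "w") as f:
        json.dump(word, f)
    time.sleep(1)
    print("".join(word))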
Upvotes: 3