Reputation: 21914
Reading up on some of the newer stylistic conventions in Python and getting around to the shift from StopIteration to return in generators. My main question is exactly how this should work in a custom generator. I've got a class where I'm overriding the __next__ method directly, as there's some logic I have to add in to track generation.
Below is pretty close to what I'm doing and has all of the critical elements. Note that I'm not not just creating a generator that replicates a list; this is just a minimal example.
    class Test():
        items = <list>

        def __init__(self):
            self.index = 0

        def __next__(self):
            if self.index >= len(self.items):
                raise StopIteration
            value = self.items[self.index]
            self.index += 1
            return value

        def reset(self):
            self.index = 0
So in a case like this I'm iterating through a list until it's exhausted, and then downstream calls decide whether to reset the generator or to move on after it's exhausted. However, how do I enable something like this without using StopIteration, since it's being deprecated? The standard advice of using return to raise a StopIteration doesn't seem to apply here, and I'd really rather not change downstream code to check for a generator yielding Nones.
So what am I supposed to do here? Are StopIteration exceptions still acceptable in __next__?
Upvotes: 2
Views: 2978
Reputation: 11940
If you want a restartable generator, one option is to keep its data in a wrapping class and create a fresh generator each time:
    class Gen:
        def __init__(self):
            self.items = []

        def restart(self):
            for x in self.items:
                yield x

    g = Gen()
    for x in g.restart():
        pass
    for x in g.restart():
        pass
Upvotes: 0
Reputation: 1121644
StopIteration is not being deprecated; you merely misunderstood something about what generators are. You don't actually have a generator, you have an iterator. Generators are simply functions that use yield to create an iterator.
You are creating your own base iterator implementation without using generators, and iterators raise StopIteration from __next__ when they are done. Your code correctly does so here.
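For reference, a minimal sketch of such a hand-written iterator, with an __iter__ method added so it works directly in for loops (passing the items in through __init__ is my own choice here, not something from the question):

```python
class Test:
    """Hand-written iterator over a list, mirroring the question's class."""

    def __init__(self, items):
        self.items = items  # assumption: items is a plain list
        self.index = 0

    def __iter__(self):
        # Lets the object be used directly in for-loops and list()
        return self

    def __next__(self):
        if self.index >= len(self.items):
            raise StopIteration  # correct for a hand-written iterator
        value = self.items[self.index]
        self.index += 1
        return value

    def reset(self):
        self.index = 0

t = Test([1, 2, 3])
print(list(t))   # [1, 2, 3]
t.reset()        # downstream code can restart iteration
print(list(t))   # [1, 2, 3] again
```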
From the Generator Types section of the Python datamodel documentation:
Python’s generators provide a convenient way to implement the iterator protocol. If a container object’s __iter__() method is implemented as a generator, it will automatically return an iterator object (technically, a generator object) supplying the __iter__() and __next__() methods. More information about generators can be found in the documentation for the yield expression.
And from the same documentation, in the Iterator Types section:
iterator.__next__()
Return the next item from the container. If there are no further items, raise the StopIteration exception.
There is a deprecation in place, but it only concerns raising StopIteration inside generator functions. See PEP 479 - Change StopIteration handling inside generators:
This PEP proposes a change to generators: when StopIteration is raised inside a generator, it is replaced with RuntimeError. (More precisely, this happens when the exception is about to bubble out of the generator's stack frame.) Because the change is backwards incompatible, the feature is initially introduced using a __future__ statement.
Inside a generator, using return triggers the StopIteration exception. Raising StopIteration manually can create obscure bugs, because any consumer of the resulting iterator can't distinguish between correct use of the exception and an accidental StopIteration emanating from your generator. That makes such issues hard to debug, which is why their use inside a generator function is changing in a future Python version.
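A short sketch contrasting the two behaviours on Python 3.7 and later (the function names are illustrative):

```python
def count_to(n):
    # A normal generator: falling off the end (or an explicit return)
    # is what signals StopIteration to the consumer.
    i = 1
    while i <= n:
        yield i
        i += 1
    return  # ends the generator cleanly

print(list(count_to(3)))  # [1, 2, 3]

def broken():
    yield 1
    raise StopIteration  # on Python 3.7+ this is turned into RuntimeError

try:
    list(broken())
except RuntimeError as exc:
    print('RuntimeError:', exc)
```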
Side note: your implementation also needs an iterator.__iter__() method, which simply returns self, so that the object can be used in a for loop.
Upvotes: 6