Reputation: 26981
Let's say I have these parsers:
parsers = {
    ".foo": parse_foo,
    ".bar": parse_bar,
}
parse_foo and parse_bar are both generators that yield rows one by one. If I wish to create a single dispatch function, I would do this:
def parse(ext):
    yield from parsers[ext]()
The yield from syntax allows me to tunnel information easily up and down the generators.
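For example, with a toy parse_foo (made up here just for illustration), whatever I .send() into parse() arrives in the parser untouched:

def parse_foo():
    while True:
        sent = yield {"row": 1}   # values sent through parse() land here
        print("parser got:", sent)

gen = parse(".foo")
next(gen)            # prime the generator, get the first row
gen.send("hello")    # prints "parser got: hello" inside parse_foo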
Is there any way to maintain the tunneling while modifying the yield results?
Doing so while breaking the tunneling is easy:
def parse(ext):
    for result in parsers[ext]():
        # Add the extension to the result
        result.ext = ext
        yield result
But this way I can't use .send() or .throw() all the way through to the parser.
The only way I can think of is something ugly like wrapping the loop in try: ... except Exception: ... to pass exceptions up, while doing the same for .send(). It's ugly, messy and bug-prone.
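Roughly, I mean a manual re-implementation of the delegation along these lines (a sketch only; it already drops .close() handling and the subgenerator's return value, which is exactly why it feels bug-prone):

def parse(ext):
    gen = parsers[ext]()
    try:
        result = next(gen)
        while True:
            result.ext = ext
            try:
                # forward whatever the caller .send()s
                sent = yield result
            except Exception as e:
                # forward .throw() into the parser
                result = gen.throw(e)
            else:
                result = gen.send(sent)
    except StopIteration:
        return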
Upvotes: 9
Views: 1947
Reputation: 26981
Unfortunately there is no built-in that does this. You could implement it yourself using a class, but a package called cotoolz implements a coroutine-aware map() function (comap) that does exactly that.
Their comap is about 4 times slower than the built-in map(), but it is aware of the generator protocol and faster than an equivalent pure-Python implementation (it's written in C and requires a C99 compiler).
An example from their page:
>>> def my_coroutine():
...     yield (yield (yield 1))
>>> from cotoolz import comap
>>> cm = comap(lambda a: a + 1, my_coroutine())
>>> next(cm)
2
>>> cm.send(2)
3
>>> cm.send(3)
4
>>> cm.send(4)
Traceback (most recent call last):
...
StopIteration
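Applied to the question, something along these lines should keep .send() and .throw() flowing through while tagging each row (an untested sketch; it assumes comap only transforms the yielded values and passes sends and throws through unchanged):

from cotoolz import comap

def parse(ext):
    def add_ext(result):
        result.ext = ext          # tag the row with its extension
        return result
    # comap wraps the parser's coroutine protocol, so yield from
    # should still tunnel .send()/.throw() to the underlying parser
    yield from comap(add_ext, parsers[ext]())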
Upvotes: 0
Reputation: 29956
There is another way of doing this besides try ... yield ... except: implementing a new generator yourself. With the class below you can transform all the inputs and outputs of the underlying generator:
identity = lambda x: x

class map_generator:
    def __init__(self, generator, outfn=identity,
                 infn=identity, throwfn=identity):
        self.generator = generator
        self.outfn = outfn
        self.infn = infn
        self.throwfn = throwfn
        self.first = True

    def __iter__(self):
        return self

    def __next__(self):
        return self.send(None)

    def _transform(self, value):
        if self.first:
            self.first = False
            return value
        else:
            return self.infn(value)

    def send(self, value):
        return self.outfn(self.generator.send(self._transform(value)))

    def throw(self, value):
        return self.outfn(self.generator.throw(self.throwfn(value)))

    def next(self):  # for Python 2 support
        return self.__next__()
Usage:
def foo():
    for i in "123":
        print("sent to foo: ", (yield i))

def bar():
    dupe = lambda x: 2*x
    tripe = lambda x: 3*x
    yield from map_generator(foo(), dupe, tripe)

i = bar()
print("received from bar: ", i.send(None))
print("received from bar: ", i.send("B"))
print("received from bar: ", i.send("C"))
...
received from bar: 11
sent to foo: BBB
received from bar: 22
sent to foo: CCC
received from bar: 33
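Applied to the question's dispatcher, that could look like this (a sketch, assuming the parsed rows are objects you can set attributes on):

def parse(ext):
    def add_ext(result):
        result.ext = ext          # transform only what comes out
        return result
    # infn/throwfn default to identity, so .send() and .throw()
    # still reach the underlying parser unchanged
    yield from map_generator(parsers[ext](), outfn=add_ext)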
EDIT: You might want to inherit from collections.abc.Iterator, but it is not necessary in this use case.
Upvotes: 2
Reputation: 4155
Have parse_foo and parse_bar add the extensions:
def parse_foo(ext):
    # Existing code
    ...
    # Add an extension to the item(s)
    item.ext = ext
def parse(ext):
    yield from parsers[ext](ext)
Or just hardcode it in each function:
def parse_foo():
    # Existing code
    ...
    # Add an extension to the item(s)
    item.ext = ".foo"
Upvotes: 0