iterator idea
Paul Rubin
Thu Oct 26 03:03:48 EDT 2006
I wonder if Python would benefit from letting a generator jump to
another iterator, i.e. something like
    yield return *seq

would be the equivalent of

    for s in seq: yield s
    return
except that it would work by transferring control directly to seq's
iterator (like a goto) instead of calling it like a subroutine. The
idea is to solve one of the perennial iterator annoyances, the
inability to push items back onto an iterator. For example, that
annoyance is why itertools.takewhile consumes an extra element with no
reasonable way to retrieve it. The new feature would let us write
something like (obviously untested):
def takewhile_exact(pred, seq):
    """Return two iterators (head, tail).  head is the same as
    itertools.takewhile(pred, seq).  tail is the rest of the
    elements of seq, with none missing.  You can't use tail
    until you've iterated through head."""
    buf = []
    def head():
        for s in seq:
            if pred(s):
                yield s
            else:
                # save non-matching item so that tail can find it
                buf.append(s)
                return
    def tail():
        if not buf:
            raise IndexError("not yet ready")
        yield buf.pop()
        yield return *seq    # transfer control to seq
    return head(), tail()
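For contrast, here's a small runnable demonstration (my example, not
from the proposal) of the element-swallowing behaviour of the stock
itertools.takewhile:

```python
from itertools import takewhile

def is_even(n):
    return n % 2 == 0

it = iter([2, 4, 6, 7, 8, 9])
head = list(takewhile(is_even, it))  # pulls 2, 4, 6 -- and also 7, which fails the test
rest = list(it)                      # 7 has been consumed and is gone
print(head)  # [2, 4, 6]
print(rest)  # [8, 9]
```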
The reason tail() can't just iterate through seq is that you
might call takewhile_exact many times:
def is_even(n): return (n % 2 == 0)
def is_odd(n): return (n % 2 != 0)

# we want to do something with all the runs of even numbers
# in the sequence, and similarly with the odds.  seq is an
# infinite sequence.
while True:
    evens, seq = takewhile_exact(is_even, seq)
    do_something_with(evens)
    odds, seq = takewhile_exact(is_odd, seq)
    do_something_else_with(odds)
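The frame build-up is easy to see with today's generators (a minimal
sketch; passthrough is a made-up stand-in for the subroutine-style
hand-off):

```python
def passthrough(seq):
    # Subroutine-style delegation: every element must traverse this frame.
    for s in seq:
        yield s

# Each wrapping adds one generator frame that every next() call
# re-enters, so stack depth grows with the number of hand-offs.
it = iter([1, 2, 3])
for _ in range(50):
    it = passthrough(it)
print(list(it))  # [1, 2, 3] -- but delivered through 50 nested frames
```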
Without the "goto"-like transfer, we'd get deeper and deeper in nested
iterators and eventually overflow the call stack.
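For what it's worth, the flat behaviour can be approximated in today's
Python by keeping one shared iterator with an explicit pushback buffer
instead of wrapping generators.  (Python 3.3 later added yield from,
PEP 380, for this delegation pattern, though it keeps the intermediate
frames rather than doing a goto-style transfer.)  This is only a sketch
under my own naming -- Pushback and takewhile_exact_flat are
assumptions, not part of the proposal:

```python
class Pushback:
    """Iterator wrapper with a pushback buffer (hypothetical helper)."""
    def __init__(self, it):
        self.it = iter(it)
        self.buf = []

    def __iter__(self):
        return self

    def __next__(self):
        if self.buf:
            return self.buf.pop()
        return next(self.it)

    def push(self, item):
        self.buf.append(item)

def takewhile_exact_flat(pred, seq):
    # head() stops at the first failing element and pushes it back;
    # the "tail" is the same Pushback object, so repeated calls never nest.
    pb = seq if isinstance(seq, Pushback) else Pushback(seq)
    def head():
        for s in pb:
            if pred(s):
                yield s
            else:
                pb.push(s)  # put the boundary element back
                return
    return head(), pb

src = Pushback(range(6))
evens, src = takewhile_exact_flat(lambda n: n % 2 == 0, src)
print(list(evens))  # [0]
odds, src = takewhile_exact_flat(lambda n: n % 2 != 0, src)
print(list(odds))   # [1]
print(list(src))    # [2, 3, 4, 5]
```

As in the post's version, each head must be fully consumed before the
tail is touched, but the stack depth stays constant no matter how many
times the sequence is split.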