PEP 288: Generator Attributes

Raymond Hettinger vze4rx4y at verizon.net
Wed Dec 4 03:41:39 EST 2002


[Bengt Richter]
> I'd rather see generators extended in a backwards compatible way to allow you to
> write the minimal example thus:
>
>         def mygen(data):
>                while True:
>                    print data
>                    yield None
>
>         g = mygen.iter()       # mygen.__class__.iter() gets called with the function instance
>         g(1)                   # prints 1 (arg name data is rebound to 1, then g.next() effect is done)
>         g(2)                   # prints 2 (similarly)

So, the behavior of mygen() is radically different depending on whether it
is called with iter(mygen()) or mygen.iter().    It looks like the parameter
string is being asked to pull double duty.  How would you code an equivalent
to:

def logger(afile, author):
    while True:
        print >> afile, __self__.data, ':changed by', author
        yield None
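For comparison, the attribute-driven flow that PEP 288 has in mind can be approximated today with a small wrapper class. This is only a sketch with illustrative names (AttrGen is not the PEP's mechanism), and it collects lines in a list rather than printing to a file:

```python
# Sketch: emulate PEP 288-style generator attributes with a wrapper object.
# The wrapper is passed into the generator function, playing the role of
# the proposed __self__, so callers can set attributes between .next() calls.
class AttrGen:
    def __init__(self, genfunc, *args):
        self.gen = genfunc(self, *args)   # hand the wrapper in as "__self__"

    def next(self):
        return next(self.gen)

def logger(self, lines, author):
    while True:
        lines.append('%s :changed by %s' % (self.data, author))
        yield None

log = []
g = AttrGen(logger, log, 'guido')
g.data = 'size'
g.next()
g.data = 'color'
g.next()
# log is now ['size :changed by guido', 'color :changed by guido']
```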



> but also allowing something with varying arguments, e.g.,
>
>     def mygen(*args, **kwds):

Lesson 1 from a PEP author:  the more you include in
a proposal, the less likely you are to get any of it.


> I.e., in addition to having a .next() method for backwards compatibility,
> the generator object would have a def __call__(self, *args, **kwargs) method,
> to make the generator look like a callable proxy for the function.

Lesson 2:  the more clever  the proposal, the more likely
it is to be deemed unpythonic.


> When the generator's __call__ method was called, it would rebind the associated
> function's arg locals each time, as if calling with arg unpacking like
> mygen(*args, **kwargs) -- but then resuming like .next()

See Lesson 1.
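(As an aside: later Python versions grew gen.send(), via PEP 342, which delivers much of this "pass a value in, then resume like .next()" behavior without rebinding argument names. A rough sketch in modern Python:)

```python
# gen.send(value) resumes the generator and makes the paused yield
# expression evaluate to value -- the "rebind, then resume" idea.
def mygen():
    data = yield            # first value arrives via send()
    while True:
        data = yield 'saw %r' % (data,)

g = mygen()
next(g)                     # prime: advance to the first yield
print(g.send(1))            # saw 1
print(g.send(2))            # saw 2
```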

>
> Possibly g() could be treated as another spelling of g.next(), skipping
> arg name rebinding.

See Lesson 2

> A default argument should be allowable, and def mygen(x, y=123):

See Lesson 1


> .. would have
> x and y rebound as you would expect on the first call, at least. You could argue
> whether to use the original default for subsequent calls or allow a rebinding
> of a default arg name to act like a default for a subsequent call, but I think
> optional args should remain optional for all calls.

See Lesson 2 ;)


>
> Note that a generator could now be also used with map, reduce, and filter
> as functions as well as sequence sources. E.g., IWT you could write compile
> as a string filter.
>
> For backwards compatibility, mygen.iter() would have to set a flag or
> whatever so that the first g() call would not call mygen and get a generator,
> but would just bind the mygen args and "resume" like .next() from the very
> start instead.

I'm sure there's another lesson here too ;)


> Calls to g.next() could be intermixed and would just bypass the rebinding
> of the mygen arg names.

Really?


> I.e., maybe .iter() could be generalized as a factory method of the
> function class/type, which might mean that you could set up to capture the
> frame of an ordinary function and allow yields from nested calls -- which
> would open more multitasking possibilities using generators.

Yes!
See my cookbook recipe for an example.
It implements most of PEP 288 in pure Python using a factory function.
http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/164044
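The general shape of such a factory can be sketched as follows (illustrative only, written in modern Python; this is the spirit of the approach, not the recipe's actual code): the factory hands the generator an object that carries user-settable attributes.

```python
# Sketch of a factory that equips a generator with attributes: the wrapped
# generator function receives a plain object ("self") whose attributes the
# caller can set between iterations.
class _GenWithAttrs:
    pass

def enableAttributes(genfunc):
    def factory(*args, **kwds):
        self = _GenWithAttrs()
        self.it = genfunc(self, *args, **kwds)
        self.next = lambda: next(self.it)
        return self
    return factory

@enableAttributes
def logger(self, author):
    while True:
        yield '%s :changed by %s' % (self.data, author)

g = logger('tim')
g.data = 'width'
print(g.next())     # width :changed by tim
```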


> I.e., yield would look down the stack until it found the first call of a
> __call__ of a generator, and apparently return from that, while the
> generator kept the whole stack state from the yield down to itself for
> continuing when called again.
>
> Also, it seems like this would make a generator *relate to and control* an
> associated normal function instead of *being* a weirded-up function. And
> yield would be decoupled enough that you could probably write exec 'yield'
> and have it work from a dynamically determined place, so long as there was
> the frame from a call to a generator's __call__ somewhere on the stack.

See PEP 667, proposed mandatory drug testing for pythonistas ;)


Raymond Hettinger




