PEP 288: Generator Attributes

Bengt Richter bokr at oz.net
Mon Dec 9 21:51:51 EST 2002


On 8 Dec 2002 19:09:02 -0800, roccomoretti at netscape.net (Rocco Moretti) wrote:

>bokr at oz.net (Bengt Richter) wrote in message news:<asjh27$80v$0 at 216.39.172.122>...
>> I saw this in the python-dev summary:
>> 
>> ===================================
>> `PEP 288:  Generator Attributes`__
>> ===================================
>
>The attribute passing scheme seems to be complex and cumbersome. The
I'm not sure why you are quoting me saying I saw a reference to PEP 288 in
the python-dev summary ;-)

>process of set attributes -> call processing function -> repeat is an
>awkward way of achieving what you really want to do - continue the
>generator with given data. I'd agree with what Guido said on
I guess you noticed I was trying to propose an alternative to the attribute
methodology? (See more detail below.)

>Python-Dev: there is no advantage in this scheme over passing a
>mutable dummy object when the generator is created (you could even
>call it __self__ if you really wanted to ...).
As I showed in an example, pointing this out ;-)
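For concreteness, here is a minimal sketch of that mutable-dummy-object pattern (the names Channel and counter are mine, purely for illustration; next(gen) is the builtin spelling of gen.next()):

```python
class Channel:
    """Shared mailbox: the caller writes to it, the generator reads from it."""
    def __init__(self, value=None):
        self.value = value

def counter(chan):
    # Keeps a running total of whatever the caller most recently stored.
    total = 0
    while True:
        total += chan.value
        yield total

chan = Channel(1)
gen = counter(chan)
print(next(gen))    # 1
chan.value = 10     # "pass a value in" between next() calls
print(next(gen))    # 11
```

No new machinery is needed, but as the quoted text says, it is an asymmetric post-a-memo arrangement rather than a dialog, and it is one more entity to keep track of.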

>
>That said, I'm still not sure what the actual objections to the
>previous (rejected) alternative are. I think that the reason for the
>empty/ignored .next() is obvious if you understand how generators
>work, and I question why this is a show-stopper issue (you'll never
>see it if you never pass values to generators). I get the impression
Right, and you need it for compatibility with for x in foo(...)
even if you deprecate that spelling in favor of for x in foo.iter(...)
(which IMO expresses the semantics better ;-).

>that there are other reasons people have for being against using
>.next() (and generator value passing in general) that aren't mentioned
>in the PEP - it would be nice to get them down for posterity.
>
>I would also quibble with RH's assertion that the unused first
>.next(value) problem is "intractable." If we're open to using magic
>locals (as in the __self__ value proposed), you could just use a magic
>local to store the parameter passed to the generator (__genval__?,
>__nextval__?). If you also allowed yield to return the value as well
>(as in: invalue = yield outvalue) you could (in most cases) circumvent
IMO "invalue = yield outvalue" is ugly for several reasons, one of them
being the special-casing of the first invalue, but also the lack of a parameter
signature for the invalue. E.g., should it be treated as args in foo(*args),
or what? You could say that it should match the parameter list of foo,
but does that mean you should write inx, iny = yield outvalue for a generator
based on foo(x, y)? That seems silly if you could just use the original parameter
names and expect the latest values to be bound to them when you continue at
the line after the yield (which is what I proposed ;-).
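For reference, the "invalue = yield outvalue" spelling under discussion is essentially what PEP 342 later standardized as generator.send() in Python 2.5. A sketch, including the special-cased first call that the thread complains about:

```python
def echo(scale):
    # Yields the last received value times scale; the assignment target
    # of "yield" receives whatever send() passes in on resumption.
    value = 0
    while True:
        received = yield value * scale
        if received is not None:
            value = received

gen = echo(2)
next(gen)           # the unavoidable "priming" call; yields 0
print(gen.send(5))  # 10
print(gen.send(7))  # 14
```

Note that the first resumption must be a plain next() (or send(None)), which is exactly the special-cased first invalue objected to above.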

>Guido's complaint that "__self__ is butt-ugly."
At least used for that purpose. OTOH I think it would be nice for all objects
to be able to get references to themselves in some way without looking up a
global binding for themselves. A magic __self__ doesn't seem worse than a magic
__doc__ for that, but using it to pass parameters tacked onto a function's __self__
is "butt-ugly."

>
>As for GvR's issue of why generator passing is better than passing a
                           ^^^^^^^^^^^^^^^^^ ??
>dummy object at the creation of the generator, it is, in essence, the
>same argument for why true object orientation is better than emulating
>it in C: when presented simply, the concept flows more naturally.
>(Note the following argument flounders a bit for the current PEP.) The
>.next(value) passing method frames itself as a symmetric dialog between
>two functions (I give you data, you reply, I give you data ...).
>Whereas the use of a dummy object presents itself more as an
>asymmetric, type-up-a-memo-and-post-it-on-the-bulletin-board type of
>arrangement. Besides, it adds another entity you have to keep track
>of.
Yes, and IMHO my proposal below is comparatively butt-beautiful ;-)

If foogen.gi_frame.f_locals could be modified, I could just about write a class
to implement a new-style generator without modifying the old one, I think.

 >>> def foo(x):
 ...     yield 1
 ...     yield 2
 ...     yield x
 ...
 >>> foo
 <function foo at 0x007D96E0>
 >>> foogen = foo('x value')
 >>> foogen.next
 <method-wrapper object at 0x007D8FE0>
 >>> foogen.gi_frame.f_code.co_varnames
 ('x',)
 >>> foogen.gi_frame.f_code.co_nlocals
 1
 >>> foogen.gi_frame.f_code.co_argcount
 1
 >>> foogen.gi_frame.f_locals
 {'x': 'x value'}

All you'd need is a __call__ method in the generator class,
and a modified next() that would take care of the first bindings for the
ordinary case of for x in foo.iter(param):

# untested(!!), and for concept illustration mainly
class NewGen:
    def __init__(self, fun, *args, **kwargs):
        self.fun = fun
        self.gen = None
        self.iterargs = args
        self.iterkwargs = kwargs    # was missing; __call__ checks it below

    def __call__(self, *args, **kwargs):
        if self.gen is None:
            if self.iterargs != () or self.iterkwargs != {}:
                raise ValueError, 'Initial args already specified via .iter(...)'
            self.iterargs = args
            self.iterkwargs = kwargs
            return self.next()
        fc = self.gen.gi_frame.f_code
        for i in range(fc.co_argcount):
            self.gen.gi_frame.f_locals[fc.co_varnames[i]] = args[i] # read-only prevents this now
        # not sure how to handle kwargs
        return self.gen.next()       # the old next

    def next(self):
        if self.gen is None: # first call: args come from foo.iter(*args, **kwargs)
            self.gen = self.fun(*self.iterargs, **self.iterkwargs)   # make generator old way
        return self.gen.next()

    # Also, a new method in the function class would be added
    # so you can call foo.iter(parameters)
    def iter(self, *args, **kwargs):   # the self instance would be the function
        return NewGen(self, *args, **kwargs)  # defers handling of initial args for backwards
                                              # compatibility, depending on how called


Then, you could make a generator either the old way or the new way:

    foogen = foo.iter('first x value')  # style for use in for x in foo.iter(...)
    first_value = foogen.next()
or
    foogen = foo.iter()                 # style for explicit co-routine style calls from start
    first_value = foogen('first x value')

and then be able to continue with either

    next_value = foogen.next()

(which would re-use the previously bound value foogen.gi_frame.f_locals['x'])
or

    next_value = foogen('new_x_arg')

which would have the x parameter bound to 'new_x_arg' when it continues from the previous yield.

Old-style generators could coexist without change. (In fact, the above uses them.)
I probably overlooked something, but I hope the idea is getting clearer ;-)
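Since gi_frame.f_locals is read-only, the closest working approximation of the calling style above that I can sketch is to route the rebindable parameters through the wrapper object itself (ReGen and box are my names, not part of any PEP):

```python
class ReGen:
    """Callable generator wrapper: regen(args) rebinds, regen.next() reuses."""
    def __init__(self, fun, *args):
        self.args = args          # latest argument values
        self.gen = fun(self)      # the generator re-reads self.args each pass

    def __call__(self, *args):
        self.args = args          # rebind, as foogen('new_x_arg') would
        return next(self.gen)

    def next(self):
        return next(self.gen)     # reuse the previously bound values

def foo(box):
    while True:
        (x,) = box.args           # pick up the latest binding of x
        yield x.upper()

foogen = ReGen(foo, 'x value')
print(foogen.next())        # X VALUE
print(foogen('new_x_arg'))  # NEW_X_ARG
print(foogen.next())        # NEW_X_ARG (reuses the last binding)
```

This keeps the callable/next() duality of the proposal, at the cost of the generator having to pull its "parameters" from the wrapper instead of having them rebound in its frame.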

>
>sig.__Limerick__ = '''
>There once was a young man from Occam,
>Whose rhetorical theories would shock them.
>"When entities multiply,
>Your arguments die.
>So please use my razor to dock them."
>'''
>
>P.S. I believe further PEP division may be called for. Value passing
>and exception throwing in generators are synergistic, yes, but can be
>implemented independently if need be. The chance of passing at least
>one may be increased if split into separate PEPs, instead of having
>their separate destinies thrown into a common PEP number.

Perhaps. OTOH a two-legged stool will probably sell less well than a
three-legged one ;-)

Regards,
Bengt Richter


