Comments invited: PEP 279 Enhanced Generators

Cromwell, Jeremy jcromwell at ciena.com
Fri Feb 1 13:45:45 EST 2002


>-----Original Message-----
>From: Raymond Hettinger [mailto:othello at javanet.com]
>Subject: Comments invited: PEP 279 Enhanced Generators
>
>
>Your review and comments are invited for PEP 279:
>   http://python.sourceforge.net/peps/pep-0279.html
>
[snip good suggestions and change history]
>
>Raymond Hettinger

I like all of these.  It might be worth noting that these functions would
come in with 2.3, at the same time that generators become final (no longer
imported from __future__).

I'm a little confused by your example in the Specification for two-way
Generator Parameter Passing section.

From the PEP:

    Example of a Consumer Stream

        def filelike(packagename, appendOrOverwrite):
            cum = []
            if appendOrOverwrite == 'w+':
                cum.extend( packages[packagename] )
            try:
                while 1:
                    dat = yield None
                    cum.append(dat)
            except FlushStream:
                packages[packagename] = cum
        ostream = filelike('mydest','w')   # Analogous to file.open(name,flag)
        ostream.next()                     # Advance to the first yield
        ostream.next(firstdat)             # Analogous to file.write(dat)
        ostream.next(seconddat)
        ostream.throw( FlushStream )       # This feature proposed below

I don't see why the no-argument first call to .next() is necessary.  I know
you previously proposed changing the control flow so that the code up to the
first yield is executed at iterator construction, rather than the lazy way it
is executed now.  The current flow is this:
	process, yield output
Unless I'm wrong, you are proposing this: 
	accept input (.next arguments), process, yield output
It seems more natural to do this:
	process, yield output, accept input
even though from the caller's perspective, you see this:
	resultFromLastInput = myGen.next(inputToBeProcessed)
since internally, the iterator is doing this:
	inputToBeProcessed = yield resultFromLastInput
Assuming that throwing the exception causes execution to proceed to the
yield before the exception is raised there, this would allow the above
example to remain unchanged, except for the removal of the now-unnecessary
pump-priming call to ostream.next().
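
To make that concrete, here is how I'd expect the consumer example to read
under the flow I'm suggesting.  This is only a sketch in the PEP's proposed
syntax (it won't run under today's generators), and it assumes construction
executes the body up to the first yield, with each .next(arg) then
delivering its argument to that suspended yield:

        def filelike(packagename, appendOrOverwrite):
            cum = []
            if appendOrOverwrite == 'w+':
                cum.extend( packages[packagename] )
            try:
                while 1:
                    dat = yield None       # Suspend; input arrives on resume
                    cum.append(dat)
            except FlushStream:
                packages[packagename] = cum

        ostream = filelike('mydest','w')   # Runs up to the first yield
        ostream.next(firstdat)             # No pump-priming call needed
        ostream.next(seconddat)
        ostream.throw( FlushStream )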
(If you didn't understand my explanation, you obviously can't understand the
nine commands that constitute all our actions. ;)

Also, this example appears before the section that describes the .throw()
addition.



I also have a minor quibble with the Specification for Generator
Comprehensions section.

From the PEP:
    If a list comprehension starts with a 'yield' keyword, then
    express the remainder of the statement as generator.  For example:

        g = [yield (len(line),line) for line in file.readline() if len(line)>5]
        print g.next()
        print g.next()

    This would be implemented as if it had been written:

        def __temp_gen():
            for line in file.readline():
                if len(line) > 5:
                    yield (len(line), line)
        g = __temp_gen()
        print g.next()
        print g.next()
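
Note that the expansion itself already runs under 2.2's generators, so only
the surface syntax is new.  For example, with a throwaway list of my own
standing in for the file:

        from __future__ import generators

        lines = ['hi\n', 'a longer line\n']   # Stand-in data, not from the PEP
        def __temp_gen():
            for line in lines:
                if len(line) > 5:
                    yield (len(line), line)
        g = __temp_gen()
        print g.next()                        # Prints (14, 'a longer line\n')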

Currently, this code:
	g = [(len(line),line) for line in file.readline() if len(line)>5]
would leave the variable "line" (with its last value) in the locals
(and clobber an existing "line"), since it is implemented as if it had been
written:
	g = []
	for line in file.readline():
	    if len(line) > 5:
	        g.append((len(line), line))

When many read this, they assumed that it wouldn't _actually_ create a local
called "line".  They were wrong.  Will your code:
a) actually create a local called "__temp_gen"?
b) avoid creating a local called "line"?
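
For reference, here is the clobbering in today's interpreter (the strings
are made-up stand-ins, not from the PEP):

	>>> line = 'untouched'
	>>> g = [(len(line), line) for line in ['hi', 'a longer line'] if len(line) > 5]
	>>> g
	[(13, 'a longer line')]
	>>> line
	'a longer line'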


Otherwise, it's very impressive and I hope they're all implemented.

-Jeremy Cromwell



