Abstracting try..finally with generators (instead of macros)

Beni Cherniavsky cben at techunix.technion.ac.il
Sun Dec 15 12:41:11 EST 2002


[PEP 288's author cc:'ed due to the need to trigger a return rather than an
exception in a generator, extending the PEP - see below; besides that, this
contains a lot of motivation]

On 2002-12-15, I wrote:

> Hi,
>
> I believe I've just discovered a new cool use for generators, one that
> people thought they needed macros for. [snip]
>
> def withmyout():
>   saved = sys.stdout
>   sys.stdout = myout
>   yield None
>   sys.stdout = saved
>
> print 1
> for dummy in withmyout():
>   print 2
> print 3
>
> [snip]
>
> Generators with side effects would greatly benefit from being able to
> define a try..finally around the yield.  It is possible to define
> sensible semantics for it - to be executed when the last reference to
> the iterator is deleted (if `.next()` wasn't called).  I strongly vote
> for it.  Any chance for Python 2.3?
>
Correction: such semantics can't work in Jython because it doesn't
guarantee finalization at predictable times.  The best alternative I see
is augmenting the definition of the for loop to make an explicit
finalization attempt on iterators that support it (see below for how).  It
should only happen for iterators having the appropriate method (similarly
to how for falls back to __getitem__ when __iter__ is absent).  Maybe this
should be added to PEP 288?
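
To make the idea concrete, here is roughly what such an augmented for loop
could expand to (just my sketch; ``finalize`` is a made-up method name, not
an existing protocol):

def for_with_finalization(iterable, body):
    # Sketch of the proposed loop semantics.  ``body`` is a callable
    # standing in for the loop body; ``finalize`` is hypothetical.
    it = iter(iterable)
    try:
        while True:
            try:
                item = it.next()
            except StopIteration:
                break
            body(item)
    finally:
        if hasattr(it, 'finalize'):
            it.finalize()       # give such iterators a chance to clean up

Iterators without such a method (lists, today's generators) would be looped
over exactly as they are now.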

> Possible implementation: on `.__del__()`, resume the generator with an
> exception (`KillGenerator`?) and silently catch it if the generator
> doesn't.
>
Correction: an exception would be caught by a bare except.  From PEP 255:

  Note that return isn't always equivalent to raising StopIteration:  the
  difference lies in how enclosing try/except constructs are treated.

So we want instead to trigger, from the outside, the equivalent of a
``return`` statement rather than an exception.  I'm not sure whether this
will make any difference in real life, but it would be cleaner.
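
To make the quoted distinction concrete with plain functions (my example;
nothing generator-specific in it): a return passes straight through a bare
except but still triggers finally:

def with_return():
    try:
        try:
            return 'returned'       # not caught by the bare except...
        except:
            return 'caught'         # ...this line is never reached...
    finally:
        print 'finally runs'        # ...but the finally clause still runs

def with_exception():
    try:
        try:
            raise RuntimeError()
        except:
            return 'caught'         # a bare except does catch the exception
    finally:
        print 'finally runs'

print with_return()                 # prints 'finally runs', then 'returned'
print with_exception()              # prints 'finally runs', then 'caught'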

So I propose to extend the `.throw()` method of generators in PEP 288:
when called with no arguments (or maybe with `None`?), perform the
equivalent of a return statement (i.e. it's only caught by try..finally
but never by try..except).

After some thought, I'm not sure I see the need for passing all kinds of
exceptions.  There is something unclean about a yield statement being able
to emit any exception that can be confused with exceptions internal to the
generator (unless one codes really carefully).  Does anybody else see the
need for multiple exceptions?

The example in the PEP suggests that named exceptions lead to more
self-documenting code and, more importantly, that an except clause might
not get executed while a finally clause always will -- which might not be
what you want.  The particular example might be cleaner without exceptions
at all, by yielding the log object all the time and writing outside
(sketched below, after the helper example).  Here the exception is really
abused as a substitute for adding a method to the iterator produced by a
generator; it has nothing to do with clean-up (for which a return is more
appropriate).  I see no good way to really do this (keeping integration
with the generator function's namespace).  Since the effect of such methods
can depend on the generator's state (just like the effect of `.next()`
does), maybe exceptions are after all the best way to approximate added
methods...  Or maybe yielding an object with methods is a cleaner way:

import time

def logger():
    start = time.time()
    log = []
    helper = Helper(log)
    while True:
        log.append(time.time() - start)
        yield helper

class Helper:   # for efficiency this should go outside?
    def __init__(self, log):
        self.log = log
    def WriteLog(self):
        writelog(self.log)      # writelog() assumed, as in the PEP's example

g = logger()
for i in [10, 20, 40, 80, 160]:
    testsuite(i)                # testsuite() assumed, as in the PEP's example
    g_helper = g.next()
g_helper.WriteLog()
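
For comparison, my sketch of the "yielding the log object all the time and
writing outside" variant mentioned above (testsuite() and writelog() again
assumed, as in the PEP's example):

import time

def plain_logger():
    start = time.time()
    log = []
    while True:
        log.append(time.time() - start)
        yield log                   # hand out the log list itself each time

g = plain_logger()
for i in [10, 20, 40, 80, 160]:
    testsuite(i)
    log = g.next()
writelog(log)                       # all writing happens outside the generator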

Compare to my crazy idea from a previous post to make generator functions
subclassable:

def _base_logger():
    start = time.time()
    log = []
    while True:
        log.append(time.time() - start)
        yield log[-1]

class logger(_base_logger):
    def WriteLog(self):
        # Assuming the locals are accessible as attributes...
        writelog(self.log)

Better (but py-in-the-sky)...  The example is too simple to decide anyway.
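
For what it's worth, roughly what one has to write today to get the extra
method -- an ordinary class-based iterator whose "locals" live as
attributes (my sketch only; writelog() assumed as before):

import time

class BaseLogger:
    def __init__(self):
        self.start = time.time()
        self.log = []
    def __iter__(self):
        return self
    def next(self):
        self.log.append(time.time() - self.start)
        return self.log[-1]

class Logger(BaseLogger):
    def WriteLog(self):
        writelog(self.log)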

If multiple exceptions indeed are dropped, I propose a new name for the
method:  `.finalize()`.  This makes more sense when the intent is to
trigger only try..finally statements.  If they are kept, calling `.throw()`
without arguments is already defined above.  Maybe having both
`.finalize()` and `.throw()` is a good idea.

BTW, the PEP talks about::

     throw([expression, [expression, [expression]]])

but only lists calls with up to 2 arguments.

> Wart to watch out: explain that try..except in generators doesn't catch
> exceptions in the consumer while try..finally effectively does (would).
>
See above -- if it's not an exception but a return, this makes sense.

> This can be currently emulated (with somewhat different semantics) with
> something like the above wrappers.  That's slower and less powerful.
>
As said, they also can't work in Jython.  That's why a language extension
is needed all the more strongly.  This is an extra motivation for this part
of PEP 288.
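
For reference, the kind of emulation I mean, spelled out as a slightly more
explicit variant of the ``for dummy in withmyout():`` wrapper quoted at the
top (my own sketch; the StringIO object just stands in for `myout`).  The
cleanup line is reached only when the consumer resumes the generator one
more time -- that's the "somewhat different semantics":

import sys, StringIO

myout = StringIO.StringIO()         # stands in for some file-like object

def withmyout():
    saved = sys.stdout
    sys.stdout = myout
    yield None                      # the consumer's code runs here
    sys.stdout = saved              # cleanup; reached only if resumed again

g = withmyout()
g.next()                            # redirect stdout
try:
    print 2                         # goes to myout
finally:
    try:
        g.next()                    # resume so the cleanup line runs
    except StopIteration:
        pass
print 3                             # back on the real stdout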

----

See also (just now found):
http://sourceforge.net/mailarchive/forum.php?thread_id=386835&forum_id=2445
and
http://sourceforge.net/mailarchive/forum.php?thread_id=765404&forum_id=2445
so I'm not the first to invent this...  Surprisingly, Guido himself
suggested relying on __del__ in a reply in the latter thread.  Is there an
accepted approach of "use __del__ and ignore JPython", or was it simply
for lack of a better solution?

try-yield-finally y'rs,  Beni Cherniavsky <cben at tx.technion.ac.il>



