[Python-ideas] Cofunctions - Back to Basics

Nick Coghlan ncoghlan at gmail.com
Sat Oct 29 12:22:21 CEST 2011


On Sat, Oct 29, 2011 at 5:37 PM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> If the flag were moved into the stack frame instead, it would
> be possible to run any function in either "normal" or "coroutine"
> mode, depending on whether it was invoked via __call__ or
> __cocall__.
>
> So there would no longer be two kinds of function, no need for
> 'codef', and any pure-Python code could be used either way.
>
> This wouldn't automatically handle the problem of C code --
> existing C functions would run in "normal" mode and therefore
> wouldn't be able to yield. However, there is at least a clear
> way for C-implemented objects to participate, by providing
> a __cocall__ method that returns an iterator.

Ah, now we're getting somewhere :)

OK, so in this approach, *any* Python function could potentially be a
coroutine - it would depend on how it was invoked rather than any
inherent property of the function definition. An ordinary call would
be unchanged, while a cocall would set a flag on the new stack frame
to say that this is a coroutine. Yes, I think that's a good step
forward.

The behaviour of call() syntax in the eval loop would then depend on
whether or not the flag was set on the frame. This would add a tiny
amount of overhead to all function calls (to check the new flag), but
potentially *does* solve the language bifurcation problem (with some
additional machinery). However, I think we still have a potential
problem due to the overloading of a single communications channel
(i.e. using 'yield' both to suspend the entire coroutine and to
return values to the next layer out).

To illustrate that, I'll repeat the toy example I posted earlier in the thread:

    # The intervening function that we want to "just work" as part
    # of a coroutine
    def print_all(iterable):
        for item in iterable:
            print(item)

    # That means we need "iterable" to be able to:
    #    - return items to 'print_all' to be displayed
    #    - suspend the entire coroutine in order to request more data

    # Now, consider if our 'iterable' was an instance of the following generator
    def data_iterable(get_data, sentinel=None):
        while 1:
            x = get_data()
            if x is sentinel:
                break
            yield x

In coroutine mode, the for loop would implicitly invoke
iterable.__next__.__cocall__(). The __cocall__ implementations on
generator object methods are going to need a way to tell the
difference between requests from the generator body to yield a value
(which means the __cocall__ should halt and return the value)
and requests to suspend the coroutine emanating from the "get_data()"
call (which means the __cocall__ should yield the value provided by
the frame).

That's where I think 'coyield' could come in - it would tell the
generator's __cocall__ machinery to pass the suspension request up
the chain instead of returning the value to the caller of
next/send/throw. This could also be done via a function and a
special object type that the generator __cocall__ implementation
recognised.
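
Here's a rough pure-Python model of that last variant, using today's
generators to stand in for the proposed machinery. The names _Suspend
and coyield() are purely illustrative (nothing like them exists),
data_iterable is a variant of the earlier toy generator where the
suspension request stands in for the get_data() call, and print_all
has to be rewritten by hand as a generator to play the role the
generator __cocall__ machinery would play implicitly:

    class _Suspend:
        # Marker wrapping a value that should suspend the whole coroutine
        def __init__(self, value):
            self.value = value

    def coyield(value=None):
        # Stand-in for the proposed 'coyield': just builds the marker
        return _Suspend(value)

    def data_iterable(sentinel=None):
        # A bare yield hands an item to the consumer; yielding a _Suspend
        # asks for the whole coroutine to be suspended until data arrives
        while 1:
            x = yield coyield("need more data")
            if x is sentinel:
                break
            yield x

    def print_all(gen):
        # Plain values are consumed here (printed), _Suspend requests are
        # passed further out - i.e. the disambiguation the generator
        # __cocall__ implementation would have to perform
        reply = None
        while 1:
            try:
                item = gen.send(reply)
            except StopIteration:
                return
            if isinstance(item, _Suspend):
                reply = yield item.value   # propagate the suspension
            else:
                print(item)                # value meant for this layer
                reply = None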

The limitation would then just be that any operation invoked via
__call__ (typically a C function with no __cocall__ method) would
prevent suspension of the coroutine. The end result would actually
look a lot like Lua coroutines, but with an additional calling
protocol that allowed C extensions to participate if they wanted to.

I think eventually the PEP should move towards a more explanatory model:
- "coroutines are a tool for lightweight cooperative multitasking"
- "coroutines are cool for a variety of reasons (aka people didn't
create Twisted, Stackless, greenlets and gevent just for fun)"
- "here's how this PEP proposes this functionality should look (aka
Lua's coroutines look pretty nice)"
- "rather than saving and switching entire stack frames as a unit,
Python's generator model supports coroutines by requiring that every
frame on the stack know how to suspend itself for later resumption"
- "doing this explicitly can be quite clumsy, so this PEP proposes a
new protocol for invoking arbitrary functions in a way that sets them
up for cooperative multitasking rather than assuming they will run to
completion immediately"
- "this approach allows C extensions to play nicely with coroutines
(by implementing __cocall__ appropriately), but doesn't require low
level assembly code the way Stackless/greenlets do"
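
To make the "clumsy" point concrete: driving the sketch from earlier
in this message explicitly means every layer, right up to an
outermost scheduler, has to be written around the suspension protocol
by hand (again, all of these names are just placeholders):

    def scheduler(coroutine, data):
        # Outermost driver: answers each suspension request with the
        # next piece of data, and None once the data runs out
        data = iter(data)
        reply = None
        while 1:
            try:
                request = coroutine.send(reply)   # e.g. "need more data"
            except StopIteration:
                return
            reply = next(data, None)

    # scheduler(print_all(data_iterable()), [1, 2, 3]) prints 1, 2 and 3,
    # but only because print_all had to become a generator and the
    # scheduler knows how to answer the requests. The __cocall__ protocol
    # is about making that plumbing implicit instead.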

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia


