Trying to wrap my head around futures and coroutines

Oscar Benjamin oscar.j.benjamin at gmail.com
Wed Jan 15 07:58:26 EST 2014


On Mon, Jan 06, 2014 at 09:15:56PM -0600, Skip Montanaro wrote:
> From the couple responses I've seen, I must have not made myself
> clear. Let's skip specific hypothetical tasks. Using coroutines,
> futures, or other programming paradigms that have been introduced in
> recent versions of Python 3.x, can traditionally event-driven code be
> written in a more linear manner so that the overall algorithms
> implemented in the code are easier to follow? My code is not
> multi-threaded, so using threads and locking is not really part of the
> picture. In fact, I'm thinking about this now precisely because the
> first sentence of the asyncio documentation mentions single-threaded
> concurrent code: "This module provides infrastructure for writing
> single-threaded concurrent code using coroutines, multiplexing I/O
> access over sockets and other resources, running network clients and
> servers, and other related primitives."
> 
> I'm trying to understand if it's possible to use coroutines or objects
> like asyncio.Future to write more readable code, that today would be
> implemented using callbacks, GTK signals, etc.

Hi Skip,

I don't yet understand how asyncio works in complete examples (I'm not sure
that many people do yet), but I have a loose idea of it, so take the following
with a pinch of salt and expect someone else to correct me later. :)

With asyncio the idea is that you can run IO operations concurrently in a
single thread. Execution can switch between different tasks while each task
can be written as a linear-looking generator function without the need for
callbacks and locks. Switching is based on which task has IO data available:
the core switcher (the event loop) keeps track of a list of objects (open
files, sockets etc.) and resumes a task when something it is waiting on
becomes ready.
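
For instance, something roughly like this should interleave two coroutines in
a single thread, with asyncio.sleep() standing in for real IO (a made-up
example, untested):

import asyncio

@asyncio.coroutine
def ticker(name, delay):
    # Each "yield from" suspends this task and hands control back to the
    # event loop until the sleep (standing in for pending IO) completes.
    for i in range(3):
        yield from asyncio.sleep(delay)
        print(name, i)

loop = asyncio.get_event_loop()
# Both tasks make progress in the same thread, interleaved by the loop.
loop.run_until_complete(asyncio.gather(ticker('A', 0.1), ticker('B', 0.15)))
loop.close()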

From the perspective of the task's generator code, what happens is that you
yield to allow other code to execute while you wait for some IO, e.g.:

@asyncio.coroutine
def task_A():
    a1 = yield from futureA1()
    a2 = yield from coroutineA2(a1) # Indirectly yields from futures
    a3 = yield from futureA3(a2)
    return a3

At each 'yield from' expression you are waiting on some operation that takes
time. During that time other coroutine code is allowed to execute in the same
thread.
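
To actually run it you hand the coroutine to the event loop, which drives the
generator and switches to other tasks whenever it is suspended at a yield
from. A minimal sketch (futureA1 etc. being whatever awaitables your
application provides):

loop = asyncio.get_event_loop()
result = loop.run_until_complete(task_A())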

If task_B has a reference to the future that task_A is waiting on, then that
future can be cancelled with the Future.cancel() method. I think that you can
also cancel via a reference to the Task object itself. So I think you can do
something like:

@asyncio.coroutine
def task_A():
    # Cancel the other task (if any) and wait for it to finish cancelling
    if ref_taskB is not None:
        ref_taskB.cancel()
        yield from asyncio.wait([ref_taskB])
    try:
        # Linear looking code with no callbacks
        a1 = yield from futureA1()
        a2 = yield from coroutineA2(a1) # Indirectly yields from futures
        a3 = yield from futureA3(a2)
    except asyncio.CancelledError:
        stop_A()
        raise # Or return something...
    return a3
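
To have something you can call cancel() on, you wrap each coroutine in a
Task; cancel() then arranges for CancelledError to be raised inside the
generator at its next suspension point. A rough usage sketch, assuming
ref_taskB is a module-level name that task_A can see (using
loop.create_task(); the original asyncio release spells this asyncio.async()):

loop = asyncio.get_event_loop()
ref_taskB = loop.create_task(task_B())  # Task object that task_A can cancel
ref_taskA = loop.create_task(task_A())
loop.run_until_complete(asyncio.wait([ref_taskA, ref_taskB]))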

Then task_B() would have the reciprocal structure. The general form, with
more tasks than just A and B, would be to keep a reference to the current
task; then you could factor out the cancellation code with context managers,
decorators or something else, e.g. roughly as sketched below.
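
Here is a rough, untested sketch of the decorator version, with the current
task tracked in a module-level variable (all the names here are made up):

import asyncio
import functools

_current_task = None  # The task currently doing this kind of work, if any

def exclusive(coro_func):
    # Cancel whatever task previously started under this decorator, wait
    # for it to finish cancelling, then record ourselves as the current task.
    @asyncio.coroutine
    @functools.wraps(coro_func)
    def wrapper(*args, **kwargs):
        global _current_task
        if _current_task is not None:
            _current_task.cancel()
            yield from asyncio.wait([_current_task])
        _current_task = asyncio.Task.current_task()
        try:
            return (yield from coro_func(*args, **kwargs))
        finally:
            if _current_task is asyncio.Task.current_task():
                _current_task = None
    return wrapper

Then task_A() and task_B() would just be ordinary coroutines decorated with
@exclusive, with no cancellation boilerplate in their bodies.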


Oscar


