[Python-ideas] Making concurrent.futures.Futures awaitable

Guido van Rossum guido at python.org
Sat Aug 8 11:31:28 CEST 2015


+1 on the name change.
On Aug 8, 2015 10:12 AM, "Nick Coghlan" <ncoghlan at gmail.com> wrote:

> On 8 August 2015 at 03:08, Guido van Rossum <guido at python.org> wrote:
> > FWIW, I am against this (as Alex already knows), for the same reasons I
> > didn't like Nick's proposal. Fuzzing the difference between threads and
> > asyncio tasks is IMO asking for problems -- people will stop
> understanding
> > what they are doing and then be bitten when they least need it.
>
> I'm against concurrent.futures offering native asyncio support as well
> - that dependency already goes the other way, from asyncio down to
> concurrent.futures by way of the loop's pool executor.
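>
> For context, a rough sketch of how that dependency looks today - the
> event loop hands the blocking call off to a concurrent.futures
> executor (blocking_io is just a stand-in for whatever blocking call
> you actually have):
>
>     import asyncio
>     from concurrent.futures import ThreadPoolExecutor
>
>     def blocking_io():
>         return sum(range(10**6))    # stand-in for a blocking call
>
>     async def main(loop, pool):
>         # asyncio reaching "down" into concurrent.futures
>         result = await loop.run_in_executor(pool, blocking_io)
>         print(result)
>
>     loop = asyncio.get_event_loop()
>     with ThreadPoolExecutor(max_workers=2) as pool:
>         loop.run_until_complete(main(loop, pool))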
>
> The only aspect of my previous suggestions I'm still interested in is
> a name and signature change from "loop.run_in_executor(executor,
> callable)" to "loop.call_in_background(callable, *, executor=None)".
>
> Currently, the recommended way to implement a blocking call like
> Alex's example is this:
>
>     import asyncio
>
>     async def handler(self):
>         loop = asyncio.get_event_loop()
>         result = await loop.run_in_executor(
>             None, some_blocking_api.some_blocking_call)
>         await self.write(result)
>
> I now see four concrete problems with this specific method name and
> signature:
>
>     * we don't run functions, we call them
>     * we do run event loops, but this call doesn't start an event loop
> running
>     * "executor" only suggests "background call" to folks that already
> know how concurrent.futures works
>     * we require the explicit "None" boilerplate to say "use the
> default executor", rather than the more idiomatic approach of
> accepting an alternate executor as an optional keyword-only argument
> (see the sketch below)
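>
> To make that last point concrete, here's a rough sketch of what the
> current signature has you write when you do want a specific executor
> (crunch is just a placeholder callable):
>
>     import asyncio
>     from concurrent.futures import ThreadPoolExecutor
>
>     def crunch():
>         return sum(i * i for i in range(10**6))    # placeholder blocking call
>
>     async def handler(loop, pool):
>         # default executor: the "None" boilerplate
>         await loop.run_in_executor(None, crunch)
>         # explicit executor: passed positionally, not as a keyword
>         await loop.run_in_executor(pool, crunch)
>
>     loop = asyncio.get_event_loop()
>     with ThreadPoolExecutor(max_workers=2) as pool:
>         loop.run_until_complete(handler(loop, pool))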
>
> With the suggested change to the method name and signature, the same
> example would instead look like:
>
>     async def handler(self):
>         loop = asyncio.get_event_loop()
>         result = await loop.call_in_background(
>             some_blocking_api.some_blocking_call)
>         await self.write(result)
>
> That should make sense to anyone reading the handler, even if they
> know nothing about concurrent.futures - the precise mechanics of how
> the event loop goes about handing off the call to a background thread
> or process are something they can explore later; they don't need to
> know about that in order to reason locally about this specific handler.
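>
> For anyone who wants to experiment with those ergonomics today, the
> proposed signature can be approximated as a plain helper on top of
> run_in_executor (a hypothetical shim, not part of asyncio, and a free
> function rather than a loop method):
>
>     import asyncio
>
>     def call_in_background(callable, *, executor=None, loop=None):
>         # hypothetical shim: delegate to the existing machinery, where
>         # executor=None means "use the loop's default executor"
>         if loop is None:
>             loop = asyncio.get_event_loop()
>         return loop.run_in_executor(executor, callable)
>
>     async def handler(self):
>         result = await call_in_background(some_blocking_api.some_blocking_call)
>         await self.write(result)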
>
> It also means that event loops would be free to implement their
> *default* background call functionality using something other than
> concurrent.futures, and only switch to the latter if an executor was
> specified explicitly.
>
> There are still some open questions about whether it makes sense to
> allow callables to indicate whether or not they expect to be IO bound
> or CPU bound, and hence allow event loop implementations to opt to
> dispatch the latter to a process pool by default (I saw someone
> suggest that recently, and I find the idea intriguing), but I think
> that's a separate question from dispatching a given call for parallel
> execution, with the result being awaited via a particular event loop.
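>
> Purely to illustrate that intriguing idea (none of this is proposed
> API), a callable could carry a hint and a hypothetical dispatcher
> could send CPU-bound work to a process pool and everything else to
> threads:
>
>     import asyncio
>     from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor
>
>     def cpu_bound(func):
>         # hypothetical marker: tag a callable as CPU bound
>         func._cpu_bound = True
>         return func
>
>     @cpu_bound
>     def crunch_numbers():
>         return sum(i * i for i in range(10**6))
>
>     async def dispatch_in_background(func, *, loop=None):
>         if loop is None:
>             loop = asyncio.get_event_loop()
>         # CPU-bound callables go to a process pool, the rest to threads
>         pool_cls = (ProcessPoolExecutor
>                     if getattr(func, "_cpu_bound", False)
>                     else ThreadPoolExecutor)
>         with pool_cls() as pool:    # throwaway pool, fine for a sketch
>             return await loop.run_in_executor(pool, func)
>
>     if __name__ == "__main__":
>         loop = asyncio.get_event_loop()
>         print(loop.run_until_complete(dispatch_in_background(crunch_numbers)))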
>
> Cheers,
> Nick.
>
> --
> Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
>