[Python-Dev] Status of PEP 3145 - Asynchronous I/O for subprocess.popen

Terry Reedy tjreedy at udel.edu
Fri Mar 28 21:58:25 CET 2014


On 3/28/2014 6:20 AM, Victor Stinner wrote:

> Full example of asynchronous communication with a subprocess (the
> python interactive interpreter) using asyncio high-level API:

Thank you for writing this. As I explained in response to Josiah, Idle 
communicates with a python interpreter subprocess through a socket. 
Since making the socket connection is not dependable, I would like to 
replace the socket with pipes. http://bugs.python.org/issue18823

However, the code below creates a new subprocess for one command and one 
response, which can apparently be done now with subprocess.Popen.communicate. 
What I and others need is a continuing (non-blocking) conversation with one 
and only one subprocess (see my response to Josiah), and that is much more 
difficult. So this code does not do what he claims his code will do.
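
For comparison, the one-shot case can already be written roughly like 
this (a minimal untested sketch, not actual Idle code):

import subprocess
import sys

# One command, one response: Popen.communicate blocks until the
# subprocess exits and then returns everything it wrote at once.
proc = subprocess.Popen(
    [sys.executable, "-c", "print(1+1)"],
    stdout=subprocess.PIPE,
    universal_newlines=True)
output, _ = proc.communicate(timeout=1)
print(output.strip())   # '2'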

However it is done, I agree with the intent of the PEP to make it much 
easier to talk with a subprocess. Victor, if you can rewrite the code 
below with a run_forever loop that accepts new write-read task pairs and 
also makes each line immediately accessible as it is read, that would be 
really helpful. Post it on the issue above if you prefer.
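
To make concrete what I am asking for, here is a rough, untested sketch 
of the shape I have in mind; the names (Conversation, send_command, 
on_line) are made up for illustration:

import asyncio
import asyncio.subprocess
import sys

class Conversation:
    # One long-lived subprocess; commands are written one at a time and
    # each output line is handed to a callback as soon as it is read.
    def __init__(self, on_line, loop=None):
        self.on_line = on_line
        self.loop = loop or asyncio.get_event_loop()
        self.proc = None

    @asyncio.coroutine
    def start(self):
        self.proc = yield from asyncio.subprocess.create_subprocess_exec(
            sys.executable, "-u", "-i",
            stdin=asyncio.subprocess.PIPE,
            stdout=asyncio.subprocess.PIPE,
            stderr=asyncio.subprocess.STDOUT,
            loop=self.loop)
        self.reader = asyncio.Task(self._read_lines(), loop=self.loop)

    @asyncio.coroutine
    def _read_lines(self):
        # Deliver output line by line, as it arrives, not all at exit.
        while True:
            line = yield from self.proc.stdout.readline()
            if not line:
                break
            self.on_line(line.decode('ascii'))

    @asyncio.coroutine
    def send_command(self, command):
        # Write one command; stdin stays open so more commands can follow.
        self.proc.stdin.write(command.encode('ascii') + b'\n')
        yield from self.proc.stdin.drain()

The hard part, of course, is knowing when one response has ended; with 
the '-i' interpreter that means watching for the '>>> ' prompt, as your 
code does below.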

Another difference between what you wrote below and what Idle does today 
is that the shell, defined in idlelib/PyShell.py, does not talk to the 
subprocess interpreter directly but to a run supervisor, defined in 
idlelib/run.py, through an rpc protocol of ('cmd', 'arg string') pairs. 
To use the pipes, the supervisor would grab all input from stdin (instead 
of the socket) and exec user code as it does today, or it could be 
replaced by a supervisor class whose instance had a name like 
_idle_supervisor_3_4_0_, which would be extremely unlikely to clash with 
any name created by users.
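
A very rough sketch of what I mean by such a supervisor class (the names 
are invented, and I am assuming for illustration that the shell sends one 
repr()ed ('cmd', 'arg string') pair per line; the real rpc code is more 
involved):

import ast
import sys

class _IdleSupervisor:
    def __init__(self):
        # Separate namespace in which user code is exec'ed.
        self.user_namespace = {'__name__': '__main__'}

    def run(self):
        # All input now comes through the stdin pipe instead of the socket.
        for line in sys.stdin:
            cmd, arg = ast.literal_eval(line)
            if cmd == 'exec':
                exec(arg, self.user_namespace)
            elif cmd == 'exit':
                break

# Instance name chosen to be extremely unlikely to clash with user code.
_idle_supervisor_3_4_0_ = _IdleSupervisor()
_idle_supervisor_3_4_0_.run()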

> ---
> import asyncio.subprocess
> import time
> import sys
>
> @asyncio.coroutine
> def eval_python_async(command, encoding='ascii', loop=None):
>      proc = yield from asyncio.subprocess.create_subprocess_exec(
>                           sys.executable, "-u", "-i",
>                           stdin=asyncio.subprocess.PIPE,
>                           stdout=asyncio.subprocess.PIPE,
>                           stderr=asyncio.subprocess.STDOUT,
>                           loop=loop)
>
>      # wait for the prompt
>      buffer = bytearray()
>      while True:
>          data = yield from proc.stdout.read(100)
>          buffer.extend(data)
>          if buffer.endswith(b'>>> '):
>              break
>
>      proc.stdin.write(command.encode(encoding) + b"\n")
>      yield from proc.stdin.drain()
>      proc.stdin.close()
>
>      output = yield from proc.stdout.read()
>
>      output = output.decode(encoding)
>      output = output.rstrip()
>      if output.endswith('>>>'):
>          output = output[:-3].rstrip()
>      return output
>
> def eval_python(command, timeout=None):
>      loop = asyncio.get_event_loop()
>      task = asyncio.Task(eval_python_async(command, loop=loop), loop=loop)
>      return loop.run_until_complete(asyncio.wait_for(task, timeout))
>
> def test_sequential(nproc, command):
>      t0 = time.monotonic()
>      for index in range(nproc):
>          eval_python(command)
>      return time.monotonic() - t0
>
> def test_parallel(nproc, command):
>      loop = asyncio.get_event_loop()
>      tasks = [asyncio.Task(eval_python_async(command, loop=loop), loop=loop)
>               for index in range(nproc)]
>      t0 = time.monotonic()
>      loop.run_until_complete(asyncio.wait(tasks))
>      return time.monotonic() - t0
>
> print("1+1 = %r" % eval_python("1+1", timeout=1.0))
>
> slow_code = "import math; print(str(math.factorial(20000)).count('7'))"
>
> dt = test_sequential(10, slow_code)
> print("Run 10 tasks in sequence: %.1f sec" % dt)
>
> dt2 = test_parallel(10, slow_code)
> print("Run 10 tasks in parallel:  %.1f sec (speed=%.1f)" % (dt2, dt/dt2))
>
> # cleanup asyncio
> asyncio.get_event_loop().close()


-- 
Terry Jan Reedy


