popen pipe limit

Gabriel Genellina gagsl-py2 at yahoo.com.ar
Mon Apr 7 21:17:31 EDT 2008


On Mon, 07 Apr 2008 20:52:54 -0300, skunkwerk <skunkwerk at gmail.com> wrote:

> I'm getting errors when reading from/writing to pipes that are fairly
> large in size.  To bypass this, I wanted to redirect output to a file
> in the subprocess.Popen function, but couldn't get it to work (even
> after setting shell=True).  I tried adding ">", "temp.sql" after the
> password field but mysqldump gave me an error.
>
> the code:
> p1 = subprocess.Popen(["mysqldump", "--all-databases", "--user=user",
>                        "--password=password"], shell=True)
> p2 = subprocess.Popen(["gzip","-9"], stdin=p1.stdout)
> output = p2.communicate()[0]
> file=open('test.sql.gz','w')
> file.write(str(output))
> file.close()
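
By the way, the reason ">" didn't work: redirection is done by a shell
parsing a single command string; when Popen gets an argument list, ">" is
passed along as a literal argument and mysqldump complains about it.  If
you really wanted the redirect-to-file route, a minimal sketch (reusing
the temp.sql name from your message) would be:

import subprocess
p = subprocess.Popen(
    "mysqldump --all-databases --user=user --password=password > temp.sql",
    shell=True)
p.wait()

But chaining through a pipe, as below, lets you compress on the fly.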

You need a pipe to chain subprocesses:

import subprocess
p1 = subprocess.Popen(["mysqldump", "--all-databases", "--user=user",
                       "--password=password"], stdout=subprocess.PIPE)
ofile = open("test.sql.gz", "wb")
p2 = subprocess.Popen(["gzip","-9"], stdin=p1.stdout, stdout=ofile)
p1.wait()
p2.wait()
ofile.close()
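
One refinement worth considering (assuming a Unix-like system): after
starting gzip, close your own copy of p1.stdout, so mysqldump receives
SIGPIPE instead of blocking forever if gzip exits early:

p2 = subprocess.Popen(["gzip", "-9"], stdin=p1.stdout, stdout=ofile)
p1.stdout.close()   # parent no longer needs the read end of the pipe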

If you don't want the final file on disk:

p1 = subprocess.Popen(["mysqldump", "--all-databases", "--user=user",
                       "--password=password"], stdout=subprocess.PIPE)
p2 = subprocess.Popen(["gzip", "-9"], stdin=p1.stdout, stdout=subprocess.PIPE)
while True:
    chunk = p2.stdout.read(4192)
    if not chunk:
        break
    # do something with the chunk just read

p1.wait()
p2.wait()
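
Just to illustrate what "do something with the chunk" could be (the
checksum is purely my own example, not something from your post): you
might hash the compressed stream incrementally, so you never hold more
than one chunk in memory:

import hashlib

digest = hashlib.md5()
while True:
    chunk = p2.stdout.read(4192)
    if not chunk:
        break
    digest.update(chunk)   # consume each chunk as it arrives
print(digest.hexdigest())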

-- 
Gabriel Genellina



