popen pipe limit

skunkwerk skunkwerk at gmail.com
Wed Apr 9 20:54:39 EDT 2008


On Apr 7, 6:17 pm, "Gabriel Genellina" <gagsl-... at yahoo.com.ar> wrote:
> On Mon, 07 Apr 2008 20:52:54 -0300, skunkwerk <skunkw... at gmail.com>
> wrote:
>
> > I'm getting errors when reading from/writing to pipes that are fairly
> > large in size.  To bypass this, I wanted to redirect output to a file
> > in the subprocess.Popen function, but couldn't get it to work (even
> > after setting shell=True).  I tried adding ">","temp.sql" after the
> > password field but mysqldump gave me an error.
>
> > the code:
> > p1 = subprocess.Popen(["mysqldump","--all-databases","--user=user","--password=password"], shell=True)
> > p2 = subprocess.Popen(["gzip","-9"], stdin=p1.stdout)
> > output = p2.communicate()[0]
> > file=open('test.sql.gz','w')
> > file.write(str(output))
> > file.close()
>
> You need a pipe to chain subprocesses:
>
> import subprocess
> p1 = subprocess.Popen(["mysqldump","--all-databases","--user=user","--password=password"], stdout=subprocess.PIPE)
> ofile = open("test.sql.gz", "wb")
> p2 = subprocess.Popen(["gzip","-9"], stdin=p1.stdout, stdout=ofile)
> p1.wait()
> p2.wait()
> ofile.close()
>
> If you don't want the final file on disk:
>
> p1 = subprocess.Popen(["mysqldump","--all-databases","--user=user","--password=password"], stdout=subprocess.PIPE)
> p2 = subprocess.Popen(["gzip","-9"], stdin=p1.stdout, stdout=subprocess.PIPE)
> while True:
>    chunk = p2.stdout.read(4192)
>    if not chunk: break
>    # do something with read chunk
>
> p1.wait()
> p2.wait()
>
> --
> Gabriel Genellina

thanks Gabriel - tried the first one and it worked great!
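For anyone finding this thread later, here is a complete, self-contained sketch of the first approach. `echo` stands in for the real mysqldump command so the example runs anywhere; swap in your actual command and arguments. Closing p1.stdout in the parent (as the subprocess docs recommend) lets p1 receive SIGPIPE if p2 exits early.

```python
import subprocess

# Chain two processes with a pipe and send the second one's output
# straight to a file, so no large buffer builds up in Python.
# ["echo", "hello pipe"] is a stand-in for the mysqldump invocation.
p1 = subprocess.Popen(["echo", "hello pipe"], stdout=subprocess.PIPE)
with open("test.out.gz", "wb") as ofile:
    p2 = subprocess.Popen(["gzip", "-9"], stdin=p1.stdout, stdout=ofile)
    p1.stdout.close()  # parent drops its copy; p1 gets SIGPIPE if p2 dies
    p1.wait()
    p2.wait()
```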


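And a runnable sketch of the second, in-memory variant, again with `echo` standing in for mysqldump. The chunked read loop keeps memory bounded no matter how large the dump is:

```python
import subprocess
import gzip

# Streaming variant: read the compressed output in fixed-size chunks
# instead of writing it to a file.
p1 = subprocess.Popen(["echo", "hello pipe"], stdout=subprocess.PIPE)
p2 = subprocess.Popen(["gzip", "-9"], stdin=p1.stdout,
                      stdout=subprocess.PIPE)
p1.stdout.close()  # so p1 sees SIGPIPE if p2 exits early

chunks = []
while True:
    chunk = p2.stdout.read(4096)
    if not chunk:
        break
    chunks.append(chunk)  # do something with each chunk here

p1.wait()
p2.wait()

# Only to verify the round trip in this sketch:
data = gzip.decompress(b"".join(chunks))
```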
