corrupt download with urllib2
Ulli Horlacher
framstag at rus.uni-stuttgart.de
Tue Nov 10 10:51:05 EST 2015
Ulli Horlacher <framstag at rus.uni-stuttgart.de> wrote:
> Peter Otten <__peter__ at web.de> wrote:
> > - consider shutil.copyfileobj to limit memory usage when dealing with data
> > of arbitrary size.
> >
> > Putting it together:
> >
> > with open(sz, "wb") as szo:
> >     shutil.copyfileobj(u, szo)
>
> This writes the binary HTTP stream to the file, without handling it
> manually chunk by chunk?
I have a problem with it: There is no feedback for the user about the
progress of the transfer, which can last several hours.
For small files shutil.copyfileobj() is a good idea, but not for huge ones.
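One way around this is to reimplement the copy loop by hand and report progress after each chunk. A minimal sketch (the function name `copy_with_progress`, the chunk size, and the callback signature are my own illustration, not from the thread):

```python
import io

def copy_with_progress(src, dst, total=None, chunk_size=64 * 1024,
                       report=None):
    """Copy file-like src to dst chunk by chunk.

    total is the expected size in bytes (e.g. taken from the
    Content-Length header), or None if unknown.  report(done, total)
    is called after each chunk so the caller can print progress.
    """
    done = 0
    while True:
        chunk = src.read(chunk_size)
        if not chunk:  # an empty read signals end of stream
            break
        dst.write(chunk)
        done += len(chunk)
        if report is not None:
            report(done, total)
    return done

# Example with in-memory streams; with urllib2 the src would be the
# response object returned by urlopen() and dst the open output file.
src = io.BytesIO(b"x" * 200000)
dst = io.BytesIO()
copied = copy_with_progress(src, dst, total=200000)
```

The loop is essentially what shutil.copyfileobj() does internally; the only addition is the callback hook, which is where a percentage or byte counter for the user would go.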
--
Ullrich Horlacher          Server und Virtualisierung
Rechenzentrum IZUS/TIK     E-Mail: horlacher at tik.uni-stuttgart.de
Universitaet Stuttgart     Tel:    ++49-711-68565868
Allmandring 30a            Fax:    ++49-711-682357
70550 Stuttgart (Germany)  WWW:    http://www.tik.uni-stuttgart.de/