Memory errors with large zip files

Lorn efoda5446 at yahoo.com
Fri May 20 13:10:22 EDT 2005


Is there a limitation in Python's zipfile module on the size of a
file that can be extracted? I'm currently trying to extract 125 MB
zip files whose contents uncompress to more than 1 GB, and I'm
receiving memory errors. Indeed, my RAM gets maxed out during
extraction and then the script quits. Is there a way to spool to disk
on the fly, or is it necessary for Python to read the entire file
into memory before writing it out? The code below iterates through a
directory of zip files and extracts them (thanks John!); however, for
testing I've just been using one file:

import glob
import zipfile
from os.path import isfile

zipnames = [x for x in glob.glob('*.zip') if isfile(x)]
for zipname in zipnames:
    zf = zipfile.ZipFile(zipname, 'r')
    for zfilename in zf.namelist():
        newFile = open(zfilename, "wb")
        # zf.read() returns the entire decompressed member as a
        # single string, so the whole file is held in memory at once.
        newFile.write(zf.read(zfilename))
        newFile.close()
    zf.close()
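
For reference, one way to spool to disk on the fly is to read each
member in fixed-size chunks rather than all at once. This is a
minimal sketch, assuming a Python version where ZipFile.open()
returns a file-like object (it is not available in all versions);
the 1 MB chunk size is an arbitrary choice:

import glob
import shutil
import zipfile
from os.path import isfile

CHUNK = 1024 * 1024  # copy 1 MB at a time (arbitrary size)

for zipname in (x for x in glob.glob('*.zip') if isfile(x)):
    zf = zipfile.ZipFile(zipname, 'r')
    for zfilename in zf.namelist():
        # ZipFile.open() decompresses on demand, so only about
        # CHUNK bytes are held in memory at any one time.
        source = zf.open(zfilename)
        dest = open(zfilename, 'wb')
        shutil.copyfileobj(source, dest, CHUNK)
        dest.close()
        source.close()
    zf.close()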


Any suggestions or comments on how I might be able to work with zip
files of this size would be very helpful.

Best regards,
Lorn



