[Python-bugs-list] [Bug #124981] zlib decompress of sync-flushed data fails

noreply@sourceforge.net noreply@sourceforge.net
Thu, 7 Dec 2000 23:25:46 -0800


Bug #124981, was updated on 2000-Dec-07 23:25
Here is a current snapshot of the bug.

Project: Python
Category: Library
Status: Open
Resolution: None
Bug Group: None
Priority: 5
Submitted by: abo
Assigned to: Nobody
Summary: zlib decompress of sync-flushed data fails

Details: I'm not sure if this is just an undocumented limitation or a genuine bug. I'm using Python 1.5.2 on Windows NT.

A single decompress of a large amount (16K+) of compressed data that has been sync-flushed fails to produce all the data up to the sync-flush point. The data remains inside the decompressor until further compressed data or a final flush is issued. Note that the 'unused_data' attribute does not show that there is further data in the decompressor to process (it shows '').
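For illustration, here is a minimal sketch of that behaviour (it assumes only the compressobj/decompressobj calls used elsewhere in this report; the 32K size is just chosen to push the compressed stream past 16K, and per the description above a final flush of the decompressor is what releases the held-back data):

import zlib, random

# build ~32K of incompressible data so the compressed stream exceeds 16K
data = ''
for i in range(32*1024):
    data = data + chr(random.randint(0,255))

c = zlib.compressobj(9)
d = zlib.decompressobj()

out = d.decompress(c.compress(data) + c.flush(zlib.Z_SYNC_FLUSH))
# out comes back short, yet d.unused_data is still ''
print len(data), len(out), repr(d.unused_data)
# a final flush of the decompressor releases the rest
out = out + d.flush()
print len(data), len(out)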

A workaround is to decompress the data in smaller chunks. Note that compressing data in smaller chunks is not required, as the problem is in the decompressor, not the compressor.
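A rough sketch of that workaround (the helper name decompress_chunked and the 1K chunk size are just for illustration; it assumes nothing beyond the standard decompressobj API):

def decompress_chunked(d, compressed, chunksize=1024):
    # feed the compressed stream to the decompressor a slice at a time
    # instead of in one big call, so nothing gets stuck inside it
    out = ''
    for i in range(0, len(compressed), chunksize):
        out = out + d.decompress(compressed[i:i+chunksize])
    return out

With that helper, the failing line in the demonstration below would become t=decompress_chunked(d, c.compress(a)+c.flush(Z_SYNC_FLUSH)), and per the workaround described above the exception should no longer be triggered.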

The following code demonstrates the problem, and raises an exception when the compressed data reaches 17K:

from zlib import *
from random import *

# create compressor and decompressor
c=compressobj(9)
d=decompressobj()

# try data sizes of 1-63K
for l in range(1,64):
    # generate random data stream
    a=''
    for i in range(l*1024):
        a=a+chr(randint(0,255))
    # compress, sync-flush, and decompress
    t=d.decompress(c.compress(a)+c.flush(Z_SYNC_FLUSH))
    # if the decompressed data differs in length from the input data, barf
    if len(t) != len(a):
        print len(a),len(t),len(d.unused_data)
        raise error


For detailed info, follow this link:
http://sourceforge.net/bugs/?func=detailbug&bug_id=124981&group_id=5470