[issue11277] Crash with mmap and sparse files on Mac OS X

Steffen Daode Nurpmeso report at bugs.python.org
Tue May 3 14:14:40 CEST 2011


Steffen Daode Nurpmeso <sdaoden at googlemail.com> added the comment:

> Should we fix Python 2.7?
>  - backport issue #8651
>  - use PY_SSIZE_T_CLEAN in zlibmodule.c

I really thought about this overnight.
I'm a C programmer and thus:
- Produce no bugs
- If you've produced a bug, fix it at once
- If you've fixed a bug, scream out loud "BUGFIX!" -
  or at least incorporate the patch in the very next patch release

But I have no experience with maintaining a scripting language;
I have only been following this project for about three months now.
And if even a bug as well known and severe as #1202 survives at
least two minor releases (2.6 and 2.7) without being fixed, then
maybe no more effort should be put into 2.7 at all.

> 11277-27.1.diff contains "# Issue #10276 - check that inputs
> >=4GB are handled correctly.". I don't understand this comment
> because the test uses a buffer of 2 GB + 2 bytes.
> How is it possible to pass a buffer of 2 GB+2 bytes to crc32(),
> whereas it stores the size into an int. The maximum size is
> INT_MAX which is 2 GB-1 byte. It looks like the "i" format of
> PyArg_ParseTuple() doesn't check for integer overflow => issue
> #8651. This issue was fixed in 3.1, 3.2 and 3.3, but not in
> Python 2
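The truncation Victor describes can be illustrated at the Python level. This is a hedged sketch, not the C code in question: it only mimics what happens when a buffer length of 2 GB + 2 bytes is stored into a 32-bit C int (as the "s#" format does without PY_SSIZE_T_CLEAN) — the value silently wraps to a negative number instead of raising an error.

```python
import ctypes

# A length of 2 GB + 2 bytes does not fit in a signed 32-bit int.
big_len = 2**31 + 2

# ctypes.c_int is 32 bits wide on all common platforms; storing the
# length into it reproduces the silent wrap-around.
as_c_int = ctypes.c_int(big_len).value
print(as_c_int)  # wraps to -2147483646
```

With a wrapped (negative) length, zlib's checksum functions are handed nonsense rather than an OverflowError, which is exactly why issue #8651 matters for the 2.x branch.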

11277-27.2.diff uses INT_MAX and thus avoids this pitfall
entirely. It may still surface memory-mapping errors somewhere,
which I would of course try to fix wherever I can.

----------
Added file: http://bugs.python.org/file21869/11277-27.2.diff

_______________________________________
Python tracker <report at bugs.python.org>
<http://bugs.python.org/issue11277>
_______________________________________
-------------- next part --------------
diff --git a/Lib/test/test_zlib.py b/Lib/test/test_zlib.py
--- a/Lib/test/test_zlib.py
+++ b/Lib/test/test_zlib.py
@@ -1,10 +1,16 @@
 import unittest
-from test import test_support
+from test.test_support import TESTFN, run_unittest, import_module, unlink, requires
 import binascii
 import random
 from test.test_support import precisionbigmemtest, _1G
+import sys
 
-zlib = test_support.import_module('zlib')
+try:
+    import mmap
+except ImportError:
+    mmap = None
+
+zlib = import_module('zlib')
 
 
 class ChecksumTestCase(unittest.TestCase):
@@ -66,6 +72,34 @@
                          zlib.crc32('spam',  (2**31)))
 
 
+# Issue #11277: backport of the big-buffer checksum test to 2.7.
+# Be aware of issues #1202, #8650, #8651 and #10276
+class ChecksumBigBufferTestCase(unittest.TestCase):
+    int_max = 0x7FFFFFFF
+
+    @unittest.skipUnless(mmap, "mmap() is not available.")
+    def test_big_buffer(self):
+        if sys.platform[:3] == 'win' or sys.platform == 'darwin':
+            requires('largefile',
+                     'test requires %s bytes and a long time to run' %
+                     str(self.int_max))
+        try:
+            with open(TESTFN, "wb+") as f:
+                f.seek(self.int_max-4)
+                f.write("asdf")
+                f.flush()
+                m = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
+                try:
+                    self.assertEqual(zlib.crc32(m), 0x709418e7)
+                    self.assertEqual(zlib.adler32(m), -2072837729)
+                finally:
+                    m.close()
+        except (IOError, OverflowError):
+            raise unittest.SkipTest("filesystem doesn't have largefile support")
+        finally:
+            unlink(TESTFN)
+
+
 class ExceptionTestCase(unittest.TestCase):
     # make sure we generate some expected errors
     def test_badlevel(self):
@@ -546,8 +580,9 @@
 
 
 def test_main():
-    test_support.run_unittest(
+    run_unittest(
         ChecksumTestCase,
+        ChecksumBigBufferTestCase,
         ExceptionTestCase,
         CompressTestCase,
         CompressObjectTestCase
