upper memory limit

Jeff Epler jepler at unpythonic.net
Sun Aug 8 16:32:19 EDT 2004


Not only does your OS impose some per-process memory limit, but Python
also uses the C library's realloc() to grow certain objects, such as
lists.
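
For illustration, here is a small sketch (modern Python; sys.getsizeof
arrived well after this post) that makes those realloc() steps visible.
As a list grows, its allocated buffer size jumps at certain lengths:

    import sys

    # Watch the list's allocated size jump in steps as items are
    # appended.  Each jump roughly corresponds to CPython calling
    # realloc() on the list's item buffer.
    lst, prev = [], None
    for i in range(64):
        lst.append(i)
        size = sys.getsizeof(lst)
        if size != prev:
            print("len=%3d  allocated bytes=%d" % (len(lst), size))
            prev = size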

Memory fragmentation (after many allocate/free cycles) means that the
largest block that can be allocated is almost always less than the total
amount of free address space.
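
As a sketch of the effect (the outcome depends entirely on the platform
allocator, and a roomy 64-bit process will usually not show it):
allocate memory in small pieces, free every other piece, and then ask
for one block as large as everything that was freed.

    # Hypothetical demonstration; sizes chosen with a constrained
    # 32-bit process in mind.
    chunks = [bytearray(1 << 20) for _ in range(512)]  # 512 x 1 MiB
    del chunks[::2]      # free ~256 MiB, but scattered through the heap
    try:
        big = bytearray(256 << 20)   # one contiguous 256 MiB block
        print("got it in one piece")
    except MemoryError:
        print("plenty of free memory in total, "
              "but no single hole big enough")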

Second, realloc() may temporarily require more memory than the final
size, because it may relocate the block: during the copy, the old and
new blocks coexist, so even a reasonable implementation can momentarily
need new_size + old_size + overhead.  I've been told that the realloc()
implementation on Windows always relocates the block.
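
You can watch the relocation happen with ctypes (a sketch, assuming a
Unix-like system where ctypes can load the C library directly):

    import ctypes, ctypes.util

    libc = ctypes.CDLL(ctypes.util.find_library("c"))
    libc.malloc.restype = ctypes.c_void_p
    libc.malloc.argtypes = [ctypes.c_size_t]
    libc.realloc.restype = ctypes.c_void_p
    libc.realloc.argtypes = [ctypes.c_void_p, ctypes.c_size_t]
    libc.free.restype = None
    libc.free.argtypes = [ctypes.c_void_p]

    p = libc.malloc(1 << 20)        # 1 MiB block
    q = libc.realloc(p, 64 << 20)   # grow to 64 MiB; the block may move
    # While realloc() copies, the old 1 MiB block and the new 64 MiB
    # block both exist, so the peak need is old_size + new_size
    # + overhead.
    print("relocated" if q != p else "grew in place")
    libc.free(q)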

Below about 2 GB, Python doesn't pose any relevant constraints on
object size.  However, Python uses a C "int" to store the "number of
items in the variable part" of an object, which means that an object
whose items are 1 byte each (a byte string) can't be longer than 2
gigabytes, and an object whose items are 4 bytes each (a list) can't
contain more than about 2 billion items.  (But such a list would be an
8 GB object, bigger than the total address space of ILP32 machines like
x86.)
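
A quick back-of-the-envelope check of those numbers (Python of that era
stored the item count in a C int; today's CPython uses Py_ssize_t, and
sys.maxsize reports the corresponding limit):

    max_items = 2**31 - 1   # largest count a signed 32-bit int can hold
    print("max byte string: %d bytes (~2 GB)" % max_items)
    print("max list length: %d items" % max_items)
    print("item array for such a list: %.1f GiB of 4-byte pointers"
          % (max_items * 4 / 2.0**30))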

Jeff