Looking for info on Python's memory allocation

Sybren Stuvel sybrenUSE at YOURthirdtower.com.imagination
Tue Oct 11 02:48:30 EDT 2005


Steven D'Aprano enlightened us with:
> he says that every time you try to append to a list that is already
> full, Python doubles the size of the list. This wastes no more than
> 50% of the memory needed for that list, but has various advantages
> -- and I'm damned if I can remember exactly what those advantages
> were.

Allocating extra memory usually requires the following steps:

    - Allocate more memory
    - Copy all data from old to new memory
    - Deallocate old memory

That second step requires N actions if there are already N data items
in the list. If you grow the list by one 'block' of memory for each
item you add, you waste no memory, but every append first has to copy
all the old items. That means that after you've added N items, roughly
N**2 copy steps have been taken (1 + 2 + ... + N is about N**2 / 2).
That's very bad for performance.
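
Here's a minimal sketch (pure Python bookkeeping, not CPython's actual
allocator) that just counts how many element copies the grow-by-one
strategy performs; the function name and the 1000-item test are made
up for illustration:

    def copies_grow_by_one(n):
        """Count element copies when capacity grows by one each append."""
        capacity = 0
        size = 0
        copies = 0
        for _ in range(n):
            if size == capacity:
                capacity += 1      # make room for exactly one more item
                copies += size     # copy every existing item to the new block
            size += 1
        return copies

    print(copies_grow_by_one(1000))  # 499500 copies, i.e. about N**2 / 2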

If, on the other hand, you double the memory every time you run out,
you have to copy much less data: each item is copied only a few times
over the list's lifetime, and in the end it turns out you need only
roughly N copy steps in total to add N items to the list. That's a lot
better, isn't it?
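
The same sketch with doubling shows the difference (again just an
illustration; CPython's real list over-allocates by a more modest
factor than 2, but the idea is the same):

    def copies_doubling(n):
        """Count element copies when capacity doubles each time it runs out."""
        capacity = 0
        size = 0
        copies = 0
        for _ in range(n):
            if size == capacity:
                capacity = max(1, capacity * 2)  # double the allocation (starting from 1)
                copies += size                   # copy the existing items once
            size += 1
        return copies

    print(copies_doubling(1000))  # 1023 copies, i.e. roughly N in total

Since the total number of copies stays below 2*N, each append costs a
constant number of steps on average (amortised O(1)).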

Sybren
-- 
The problem with the world is stupidity. Not saying there should be a
capital punishment for stupidity, but why don't we just take the
safety labels off of everything and let the problem solve itself? 
                                             Frank Zappa


