Looking for info on Python's memory allocation

Lasse Vågsæther Karlsen lasse at vkarlsen.no
Tue Oct 11 05:22:39 EDT 2005


Sybren Stuvel wrote:
> Steven D'Aprano enlightened us with:
> 
>>he says that every time you try to append to a list that is already
>>full, Python doubles the size of the list. This wastes no more than
<snip>
> If, on the other hand, you double the memory every time you run out,
> you have to copy much less data, and in the end it turns out you need
> roughly N steps to add N items to the list. That's a lot better, isn't
> it?
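
(To convince myself of that argument I tried a toy sketch that just counts
element copies under a doubling strategy -- my own illustration, not what
CPython's allocator actually does, but it shows why N appends end up costing
roughly N copies in total:)

def copies_when_doubling(n):
    capacity = 1
    size = 0
    copies = 0
    for _ in xrange(n):
        if size == capacity:
            capacity *= 2        # double when full...
            copies += size       # ...which costs one copy per existing item
        size += 1
    return copies

print copies_when_doubling(1000000)   # roughly n copies in total, i.e. O(1) per append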

This raises a different question along the same lines.

If I have a generator or other iterable producing a vast number of 
items, and use it like this:

s = [k for k in iterable]

If I know beforehand how many items the iterable will yield, would a 
construct like this be faster and "use" less memory?

s = [0] * n                  # n = number of items, known beforehand
for i in xrange(n):          # (a generator has no len(), hence the separate n)
    s[i] = iterable.next()
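
For what it's worth, here is a rough sketch of how I would time the two
approaches (the function names are made up, it only measures speed rather
than memory, and it assumes the item count is known up front):

import timeit

def build_by_comprehension(n):
    it = iter(xrange(n))             # stand-in for the real iterable
    return [k for k in it]

def build_by_preallocation(n):
    it = iter(xrange(n))             # stand-in for the real iterable
    s = [0] * n                      # preallocate, assuming n is known
    for i in xrange(n):
        s[i] = it.next()
    return s

n = 1000000
for name in ("build_by_comprehension", "build_by_preallocation"):
    t = timeit.Timer("%s(%d)" % (name, n),
                     "from __main__ import %s" % name)
    print name, min(t.repeat(3, 1))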

-- 
Lasse Vågsæther Karlsen
http://usinglvkblog.blogspot.com/
mailto:lasse at vkarlsen.no
PGP KeyID: 0x2A42A1C2


