Memory pre-allocation for large lists

Jeremy Fincher tweedgeezer at hotmail.com
Wed Feb 18 08:21:26 EST 2004


aahz at pythoncraft.com (Aahz) wrote in message news:<c0v3il$bk3$1 at panix2.panix.com>...
> In article <ift530d51l8td0uoohm5t9ad96pq5n4e3i at 4ax.com>,
> Avi Kak  <kak at purdue.edu> wrote:
> >
> >Is it possible in Python to define an empty list of a predetermined
> >size?
> >
> >To illustrate my question further, I can do the following in Perl
> >
> >   my @arr = ();
> >   $#arr = 999;
> >
>that would cause @arr to become an empty array of size 1000.  This
> >array could be used to store 1000 elements without any further memory
> >allocation for the array itself.
> >
> >Can the same be done in Python?
> 
> Can, yes.  But why?  Python uses an allocation mechanism for lists that
> results in constant amortized time; there's really no reason to
> pre-allocate.
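
To answer the literal question first: the usual Python spelling of
that Perl idiom is just to build the list at its final length up
front.  A minimal sketch (the length and the None filler are
arbitrary choices):

   # Python counterpart of the Perl pre-sizing idiom quoted above.
   arr = [None] * 1000      # one allocation, 1000 placeholder slots
   arr[999] = 'last'        # every slot is assignable by index at once
   print len(arr)           # -> 1000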

Python's allocation scheme for lists may be constant amortized time,
but it's *very* conservative and thus has a significant constant
factor.  For example:

functor% python /usr/lib/python2.3/timeit.py 'L = [None]*1000' 'for _ in xrange(1000): pass'
1000 loops, best of 3: 248 usec per loop
functor% python /usr/lib/python2.3/timeit.py 'L = []' 'for _ in xrange(1000): L.append(None)'
1000 loops, best of 3: 1.88e+03 usec per loop

So there's about a 7.5x difference in speed there (248 vs. 1880 usec
per loop).  I doubt it matters in most programs, but it at least
lends some credibility to the idea that preallocation might be
useful.
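
The underlying mechanics: append() stays amortized constant time
because CPython over-allocates a bit whenever a list outgrows its
current buffer, but the growth increments are modest (the exact
pattern is an implementation detail that has changed between
versions).  So building one element at a time pays a method call per
element plus periodic reallocate-and-copy work, all of which
[None]*1000 avoids.  You can watch the reallocation points with a
sketch like the following.  Note that it assumes sys.getsizeof,
which only appeared in Python 2.6, so it won't run on the 2.3 used
above:

   import sys

   # Print each point at which a growing list actually reallocates.
   # Assumes sys.getsizeof (Python 2.6+); the specific byte counts
   # are CPython implementation details.
   L = []
   last = sys.getsizeof(L)
   for i in xrange(64):
       L.append(None)
       size = sys.getsizeof(L)
       if size != last:
           print 'len', len(L), '->', size, 'bytes'
           last = size

Each jump in the output covers a whole run of appends, which is where
the amortization comes from; preallocation just pays that bookkeeping
once.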

Jeremy


