confused about resizing array in Python

Dongsheng Ruan ruan at jcmills.com
Sat Feb 3 18:49:25 EST 2007


You mentioned "it doubles in size".

Are you saying that a new array of double the size is allocated and the
contents of the old list are copied there?

Then the old array is freed from memory?

It seems to be what is called amortized constant.

Say the list's capacity is 100. Until it is full, each append takes O(1)
time. But appending the 101st element costs O(100+1), since everything has
to be copied first, and from then on it is O(1) again. Is that what John
Machin meant in the previous post?

But on average it is O(1). I guess that is what "amortized constant" means,
isn't it?
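
To convince myself, I tried a little toy version of the doubling strategy.
This is only a sketch to illustrate the amortized argument (I believe the
real CPython list over-allocates by a smaller growth factor, so the exact
numbers differ):

# Toy dynamic array that doubles its capacity when full and counts
# how many element copies the resizes cost in total.
class ToyList:
    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.data = [None] * self.capacity
        self.copies = 0                 # elements copied during resizes

    def append(self, value):
        if self.size == self.capacity:
            # Allocate a new array twice as large and copy everything over.
            self.capacity *= 2
            new_data = [None] * self.capacity
            for i in range(self.size):
                new_data[i] = self.data[i]
            self.copies += self.size
            self.data = new_data
        self.data[self.size] = value
        self.size += 1

t = ToyList()
for i in range(1000):
    t.append(i)
print(t.copies)     # 1023 copies for 1000 appends, i.e. less than 2*n

So the occasional expensive append is paid for by the many cheap ones,
which is exactly the "amortized constant" idea.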

"Roel Schroeven" <rschroev_nospam_ml at fastmail.fm> wrote in message 
news:vc8xh.325172$Au6.6345787 at phobos.telenet-ops.be...
> Ruan schreef:
>> "Roel Schroeven" <rschroev_nospam_ml at fastmail.fm> wrote:
>>> Ruan schreef:
>>>> My confusion comes from the following piece of code:
>>>>
>>>> memo = {1:1, 2:1}
>>>>
>>>> def fib_memo(n):
>>>>     global memo
>>>>     if not n in memo:
>>>>         memo[n] = fib_memo(n-1) + fib_memo(n-2)
>>>>     return memo[n]
>>>>
>>>> I used to think that the time complexity for this code is O(n) due to
>>>> its use of memoization.
>>>>
>>>> However, I was told recently that in Python a dictionary is a special
>>>> kind of array, and that appending a new element to it or resizing it is
>>>> in fact implemented internally by creating another array, copying the
>>>> old one into it and then appending the new element.
>
>>> That's not correct. Python dictionaries are highly optimized and I
>>> believe the time complexity is amortized constant (i.e. O(1)) for both
>>> insertions and lookups.
>
>> Then how about Python's list?
>>
>> What is done exactly when list.append is executed?
>>
>> For a list, is another, larger list allocated and the contents of the
>> old list copied into it together with the newly appended element?
>
> I'm not sure, but I think each time the list needs to grow, it doubles in
> size. That leaves room to add a number of elements before the allocated
> space needs to grow again. It's a frequently used approach, since it is
> quite efficient and the allocated memory is never more than double the
> amount strictly needed for the elements of the list.
>
> You can always study the source code for all gory details of course.
>
> -- 
> If I have been able to see further, it was only because I stood
> on the shoulders of giants.  -- Isaac Newton
>
> Roel Schroeven 
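
PS: Regarding the doubling Roel mentioned, the over-allocation is easy to
observe on a real list with sys.getsizeof (just a quick sketch; the exact
sizes and growth pattern depend on the CPython version and platform):

import sys          # sys.getsizeof requires Python 2.6 or newer

# Print the allocated size of a list each time it changes while appending.
lst = []
last_size = sys.getsizeof(lst)
print("len=%3d  allocated bytes=%d" % (len(lst), last_size))
for i in range(64):
    lst.append(i)
    size = sys.getsizeof(lst)
    if size != last_size:
        print("len=%3d  allocated bytes=%d" % (len(lst), size))
        last_size = size

The allocated size jumps in steps, so most appends do no copying at all.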
