lists, performance..

holger krekel pyth at devel.trillke.net
Thu Oct 31 09:26:40 EST 2002


gabor wrote:
> i'm working on a graphics application (python + opengl)..
> 
> and i have some lists of vertices... but that's not important..
> 
> let's say i have loooooooong ( long = 500 to 5000 ) lists of relatively
> simple objects..
> 
> 2 questions:
> 
> 1. is a list implemented as a vector? i mean looking up the n-th element
> is done in O(1) (constant time)?

yes: a list is a contiguous array of object pointers under the hood,
so indexing the n-th element is O(1).
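
a quick way to convince yourself; the snippet below is just an
illustrative sketch, not part of the original exchange:

import time

big = list(range(1000000))

def time_lookup(idx, reps=1000000):
    # index the same position many times and time it
    t0 = time.time()
    for _ in range(reps):
        big[idx]
    return time.time() - t0

print(time_lookup(0))        # front of the list
print(time_lookup(999999))   # back of the list: roughly the same time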

> 2. i want to insert several lists into a big list.. better to say i want
> to concatenate them... but i don't really like the 'extend' method,
> because i want to do a deep-copy... for example:
> list1 = [ ['a','b','c'], [1,2,3,4] ,[5,6,7]]
> list2 = []
> list2.extend(list1)
> but now if i modify an element in list1, list2 gets modified too..
> for now i'm doing:
> 
> list1 = ...
> list2 = ...
> 
> for item in list2:
> 	list1.append(item)

the fastest method i know for your problem is:

>>> a,b=[1,2,3],[4,5,6]
>>> n=[]
>>> filter(n.append, a)
[]
>>> filter(n.append, b)
[]
>>> n
[1, 2, 3, 4, 5, 6]
>>>

It's very fast because 'list.append' and 'filter' both have C-level
implementations: filter calls n.append on each element purely for the
side effect, and since append returns None (which is false), filter's
own result is the empty list you see above.
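
note that this, like append/extend, only copies references: the nested
lists are still shared between list1 and list2.  if you really need
the deep copy you mentioned, the standard 'copy' module handles it; a
minimal sketch:

import copy

list1 = [['a','b','c'], [1,2,3,4], [5,6,7]]
list2 = []
list2.extend(copy.deepcopy(list1))   # nested lists are copied too

list1[0][0] = 'X'
print(list2[0][0])                   # still 'a'

deepcopy is naturally slower than copying references, so only pay for
it where you actually need independent copies.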
  
> but maybe this is too slow.. or isn't? ideally i'd like to have
> something like reserve in C++....
> so let's say i have a list whose len() is 30. now i want to insert
> 20 elements... wouldn't it be faster to somehow resize the list directly
> to 50, and then add the elements?
> i'm worried about how many times the list would resize itself to be able
> to contain the additional 20 elements if i added them one-by-one

time the above solution and see if it's fast enough for you.
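
for example, a rough timing sketch (the function names and sizes are
made up for illustration; note that CPython already over-allocates on
append, so growing a list one element at a time is amortized O(1)):

import time

def grow_by_append(items):
    out = []
    for item in items:
        out.append(item)
    return out

def grow_by_extend(items):
    out = []
    out.extend(items)
    return out

def grow_preallocated(items):
    # the closest thing to C++ reserve(): size the list up front
    out = [None] * len(items)
    for i, item in enumerate(items):
        out[i] = item
    return out

items = list(range(5000))
for fn in (grow_by_append, grow_by_extend, grow_preallocated):
    t0 = time.time()
    for _ in range(1000):
        fn(items)
    print("%s: %.3fs" % (fn.__name__, time.time() - t0))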

HTH, holger
