Performance of lists vs. list comprehensions

Steven D'Aprano steve at REMOVE-THIS-cybersource.com.au
Tue Jan 19 20:52:41 EST 2010


On Tue, 19 Jan 2010 11:26:43 -0500, Gerald Britton wrote:

> Interestingly, I scaled it up to a million list items with more or less
> the same results.

A million items is not a lot of data. Depending on the size of each 
object, that might be as little as 4 MB for the list itself:

>>> import sys
>>> L = ['' for _ in xrange(10**6)]
>>> sys.getsizeof(L)
4348732
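
Note that getsizeof counts only the list object itself (its internal 
array of pointers), not the objects it refers to. To account for the 
items as well, something like this (a rough sketch; it overcounts 
shared objects, and here every slot refers to the one cached empty 
string):

>>> sys.getsizeof(L) + sum(sys.getsizeof(x) for x in L)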

Try generating a billion items, or even a hundred million, and see how 
you go.
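
As a rough illustration of why scale matters, compare these two (an 
untested sketch; actual memory use depends on platform and Python 
version):

# The list comprehension builds all hundred million intermediate values
# in memory before sum() ever sees the first one...
total = sum([x * x for x in xrange(10**8)])  # may need several GB of RAM

# ...while the generator expression hands sum() one value at a time, so
# memory use stays roughly constant no matter how many items there are.
total = sum(x * x for x in xrange(10**8))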

This is a good lesson in the dangers of premature optimization. I can't 
count how many times I've written code using a generator expression 
passed to join, thinking that would surely be faster than using a list 
comprehension ("save building a temporary list first"). As it turns 
out, join builds a temporary sequence out of the generator anyway, so 
the expected saving never materialises and the generator version is 
often a little slower.
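
For anyone who wants to check this on their own machine, a sketch along 
these lines (exact timings will vary with Python version and hardware):

from timeit import Timer

setup = "data = ['spam'] * 10**5"

# join with a list comprehension: the temporary list is built explicitly.
lc = Timer("''.join([s for s in data])", setup)

# join with a generator expression: join still turns the values into a
# sequence internally, so no temporary list is really saved.
ge = Timer("''.join(s for s in data)", setup)

print(min(lc.repeat(3, 100)))  # best of 3 runs of 100 joins each
print(min(ge.repeat(3, 100)))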



-- 
Steven


