creating really big lists

Ricardo Aráoz ricaraoz at gmail.com
Sat Sep 8 12:24:40 EDT 2007


Dr Mephesto wrote:
> On Sep 8, 3:33 am, "Gabriel Genellina" <gagsl-... at yahoo.com.ar> wrote:
>> On Fri, 07 Sep 2007 16:16:46 -0300, Dr Mephesto <dnh... at googlemail.com>
>> wrote:
>>
>>> hey, that defaultdict thing looks pretty cool...
>>> what's the overhead like for using a dictionary in Python?
>> Dictionaries are heavily optimized in Python. Access time is O(1), and
>> adding/removing elements is amortized O(1) (that is, constant time except
>> when internal structures have to grow or shrink).
>>
>> --
>> Gabriel Genellina
> 
> well, I want to (maybe) have a dictionary where the value is a list of
> 5 lists. And I want to add a LOT of data to these lists. Tens of
> millions of pieces of data. Will this be a big problem? I can just try
> it out in practice on Monday too :)
> 
> thanks
> 
> 
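
A quick way to see the constant-time behaviour Gabriel describes is to time
lookups at a few different dict sizes; this is only a rough sketch, and the
sizes and probed keys are arbitrary:

import timeit

# If access is O(1), lookup time should stay roughly flat as the dict grows.
for n in (10**4, 10**5, 10**6):
    setup = "d = dict((i, None) for i in xrange(%d))" % n
    timer = timeit.Timer("d[%d]" % (n // 2), setup)
    print n, min(timer.repeat(3, 1000000))

As for actually filling the lists, it is just ordinary dict and list
operations: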

someKey = 'data'                              # placeholder key
myDict = {someKey: [[] for _ in xrange(5)]}   # value is a list of 5 lists

targetList = myDict[someKey]       # This takes normal dict access time
for j in xrange(5):
    for i in xrange(50000000):     # Add a LOT of data to targetList
        targetList[j].append(i)    # This takes normal list access time
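
If the defaultdict approach appeals to you, here is a minimal sketch of how
the structure itself might be set up so that a missing key automatically gets
its own list of 5 lists; the keys and values below are just placeholders:

from collections import defaultdict

def five_lists():
    return [[] for _ in xrange(5)]

data = defaultdict(five_lists)      # a missing key gets a fresh list of 5 lists

for key in ('a', 'b', 'c'):         # placeholder keys
    for j in xrange(5):
        data[key][j].append(42)     # list.append is amortized O(1)

With tens of millions of entries, the dict and list operations themselves are
unlikely to be the bottleneck; the memory needed to hold that many Python
objects is the more likely limit.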





