dict is really slow for big truck

Thomas G. Willis tom.willis at gmail.com
Tue Apr 28 09:16:50 EDT 2009


On Apr 28, 8:54 am, forrest yang <Gforrest.y... at gmail.com> wrote:
> I am trying to load a big file (about 9,000,000 lines) into a dict.
> The data looks something like:
> 1 2 3 4
> 2 2 3 4
> 3 4 5 6
>
> Code:
>
> table = {}
> for line in open(filename):
>     arr = line.strip().split('\t')
>     table[arr[0]] = arr
>
> but the dict gets really slow as I load more and more data into
> memory, even though the Mac I use has 16 GB of RAM.
> Is this caused by poor dict performance when it has to grow, or by
> some other reason?
> Can anyone suggest a better solution?

I think you need to upgrade the L2 cache on your CPU.
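
More seriously: if the whole table does not need to live in RAM, pushing
the rows into an on-disk store sidesteps the cost of growing a huge dict
entirely. Below is a minimal sketch using the stdlib sqlite3 module; the
file name 'data.txt' and the assumption that the first column is a unique
key are mine, not the OP's:

import sqlite3

conn = sqlite3.connect('data.db')
conn.execute(
    'CREATE TABLE IF NOT EXISTS rows (key TEXT PRIMARY KEY, line TEXT)')

# Stream the file; the generator never holds more than one line in memory.
with open('data.txt') as f:   # hypothetical input file
    conn.executemany(
        'INSERT OR REPLACE INTO rows VALUES (?, ?)',
        ((line.split('\t', 1)[0], line.rstrip('\n')) for line in f))
conn.commit()

# Look up a single record without keeping 9,000,000 entries in RAM:
row = conn.execute('SELECT line FROM rows WHERE key = ?', ('1',)).fetchone()
print(row)

Doing all the inserts inside one transaction keeps the load reasonably
fast, and each lookup afterwards costs one index probe instead of a
9,000,000-entry in-memory hash table.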


