dict is really slow for big truck
forrest yang
Gforrest.yang at gmail.com
Tue Apr 28 08:54:49 EDT 2009
I am trying to load a big file (about 9,000,000 lines) into a dict. The file looks something like:
1 2 3 4
2 2 3 4
3 4 5 6
code:

d = {}  # use a name other than the built-in "dict"
for line in open(file):
    arr = line.strip().split('\t')
    d[arr[0]] = arr
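For comparison, here is a hedged sketch of the same loop that may use less memory per entry: it stores tuples (immutable and slightly smaller than lists) and interns the key strings so duplicates share storage. The `load` helper name and the `sample` data are illustrative, not from the original post.

```python
import sys

def load(lines):
    # Hypothetical variant of the posted loop: tuple values instead of
    # lists, and interned keys to deduplicate string storage.
    table = {}
    for line in lines:
        fields = line.strip().split('\t')
        table[sys.intern(fields[0])] = tuple(fields)
    return table

sample = ["1\t2\t3\t4\n", "2\t2\t3\t4\n", "3\t4\t5\t6\n"]
t = load(sample)
print(t["2"])  # ('2', '2', '3', '4')
```

For a real file you would pass `open(file)` instead of the sample list.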
But the dict gets really slow as I load more data into memory; by the way, the Mac I use has 16 GB of memory. Is this caused by poor dict performance when it has to grow its memory, or is there some other reason?

Can anyone provide a better solution?
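One common answer for data sets this large is to keep the table on disk rather than in an in-memory dict. The following is a minimal sketch using the standard-library sqlite3 module; the table and column names are assumptions for illustration, and `:memory:` would be replaced with a file path for real data.

```python
import sqlite3

# Hypothetical on-disk alternative to the in-memory dict: key -> whole line.
conn = sqlite3.connect(":memory:")  # use a file path, e.g. "rows.db", for real data
conn.execute("CREATE TABLE rows (key TEXT PRIMARY KEY, val TEXT)")

lines = ["1\t2\t3\t4", "2\t2\t3\t4", "3\t4\t5\t6"]  # stands in for open(file)
with conn:  # one transaction for the whole batch keeps inserts fast
    conn.executemany(
        "INSERT OR REPLACE INTO rows VALUES (?, ?)",
        ((s.split('\t', 1)[0], s) for s in (line.strip() for line in lines)),
    )

row = conn.execute("SELECT val FROM rows WHERE key = ?", ("2",)).fetchone()
print(row[0])  # 2	2	3	4
```

Lookups then cost one indexed query instead of holding nine million list objects in RAM.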