Large Amount of Data

Jack nospam at invalid.com
Sat May 26 04:19:36 EDT 2007


I suppose I can, but it won't be very efficient. I could keep a smaller
hashtable, process the records whose keys are already in it, and set aside
the ones that aren't for another round of processing. But a chunked
hashtable on its own won't work that well, because when a key isn't in the
current chunk you don't know whether it exists in some other chunk. To make
this work I'd need a rule for partitioning the data into chunks so that all
the records for a given key end up in the same chunk. So this is more work
in general.
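
For example, the partitioning rule could be as simple as hashing each key to
pick a chunk file, so every record for a given key lands in the same chunk
and each chunk can then be loaded into an ordinary dict. A rough sketch,
assuming the data is plain "key value" lines in a text file (which may not
match the real format):

    from collections import defaultdict

    NUM_CHUNKS = 16  # pick so one chunk's keys fit in physical memory

    def partition(infile_name):
        # First pass: route each line to a chunk file chosen by hashing
        # its key, so all records for a given key land in the same chunk.
        chunk_files = [open('chunk_%d.txt' % i, 'w')
                       for i in range(NUM_CHUNKS)]
        try:
            for line in open(infile_name):
                if not line.strip():
                    continue
                key = line.split(None, 1)[0]
                chunk_files[hash(key) % NUM_CHUNKS].write(line)
        finally:
            for f in chunk_files:
                f.close()

    def process_chunk(chunk_name):
        # Second pass: one chunk at a time fits in a normal dictionary.
        # Here the "processing" is just summing an integer value per key.
        totals = defaultdict(int)
        for line in open(chunk_name):
            key, value = line.split()
            totals[key] += int(value)
        return totals

    def run(infile_name):
        partition(infile_name)
        for i in range(NUM_CHUNKS):
            totals = process_chunk('chunk_%d.txt' % i)
            # write out or merge the per-chunk results here

That's an extra pass over the data plus the temporary chunk files, which is
what I mean by more work.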

"kaens" <apatheticagnostic at gmail.com> wrote in message 
news:mailman.8201.1180141324.32031.python-list at python.org...
> On 5/25/07, Jack <nospam at invalid.com> wrote:
>> I need to process a large amount of data. The data structure fits well
>> in a dictionary but the amount is large - close to or more than the size
>> of physical memory. I wonder what will happen if I try to load the data
>> into a dictionary. Will Python use swap memory or will it fail?
>>
>> Thanks.
>>
>
> Could you process it in chunks, instead of reading in all the data at 
> once? 
