Cache a large list to disk

Paul Rubin
Mon May 17 21:56:25 EDT 2004


iamlevis3 at hotmail.com (Chris) writes:
> I have a set of routines, the first of which reads lots and lots of
> data from disparate regions of disk.  This read routine takes 40
> minutes on a P3-866 (with IDE drives).  This routine populates an
> array with a number of dictionaries, e.g.,
> 
> [{'el2': 0, 'el3': 0, 'el1': 0, 'el4': 0, 'el5': 0},
>  {'el2': 15, 'el3': 21, 'el1': 9, 'el4': 33, 'el5': 51},
>  {'el2': 35, 'el3': 49, 'el1': 21, 'el4': 77, 'el5': 119},
>  {'el2': 45, 'el3': 63, 'el1': 27, 'el4': 99, 'el5': 153}]
>         (not actually the data I'm reading)

The dict entries are the same for each list item?

> Now, if this were C I'd know how to do this in a pretty
> straightforward manner.  But being new to Python, I don't know how I
> can (hopefully easily) write this data to a file, and then read it out
> into memory on subsequent launches.
> 
> If anyone can provide some pointers, or even some sample code on how
> to accomplish this, it would be greatly appreciated.

I dunno what the question is.  You can open files, seek on them, etc.
in Python just like in C.  You can use the mmap module to map a file
into memory.  If you can accept losing some efficiency, the simplest
route is to write out the Python objects (dicts, lists, etc.) with the
pickle module (or cPickle, its faster C implementation).
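A minimal sketch of the pickle approach, as a caching wrapper around the slow read (the file name, helper names, and cache-check logic here are my own invention, not from Chris's code):

```python
import os
import pickle  # in Python 2, import cPickle as pickle for speed

CACHE_FILE = "records.pickle"  # hypothetical path for the on-disk cache

def load_records(build_records, cache_file=CACHE_FILE):
    """Return the list of dicts, loading from the cache file if it
    exists; otherwise call build_records() (the slow 40-minute read)
    and save the result for subsequent launches."""
    if os.path.exists(cache_file):
        with open(cache_file, "rb") as f:
            return pickle.load(f)
    records = build_records()
    with open(cache_file, "wb") as f:
        pickle.dump(records, f, pickle.HIGHEST_PROTOCOL)
    return records

# Stand-in for the slow disk read, using the sample data from the post:
def slow_read():
    return [{'el1': 0, 'el2': 0, 'el3': 0, 'el4': 0, 'el5': 0},
            {'el1': 9, 'el2': 15, 'el3': 21, 'el4': 33, 'el5': 51}]
```

The first call does the full read and writes the cache; later calls just unpickle the file, which should take seconds rather than 40 minutes.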
