Cache a large list to disk

Chris iamlevis3 at hotmail.com
Mon May 17 15:55:11 EDT 2004


I have a set of routines, the first of which reads lots and lots of
data from disparate regions of disk.  This read routine takes 40
minutes on a P3-866 (with IDE drives).  This routine populates an
array with a number of dictionaries, e.g.,

[{'el2': 0, 'el3': 0, 'el1': 0, 'el4': 0, 'el5': 0},
 {'el2': 15, 'el3': 21, 'el1': 9, 'el4': 33, 'el5': 51},
 {'el2': 35, 'el3': 49, 'el1': 21, 'el4': 77, 'el5': 119},
 {'el2': 45, 'el3': 63, 'el1': 27, 'el4': 99, 'el5': 153}]
        (not actually the data I'm reading)

This information is acted upon by subsequent routines.  These routines
change very often, but the data changes very infrequently (the
opposite pattern of what I'm used to).  This data changes once per
week, so I can safely cache this data to a big file on disk, and read
out of this big file -- rather than having to read about 10,000 files
-- when the program is loaded.
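Since the data only goes stale weekly, one way to decide whether the big cache file can be trusted is to compare its modification time against a maximum age. A small sketch (the file name and one-week threshold are just placeholders, not anything from the original post):

```python
import os
import time

ONE_WEEK = 7 * 24 * 3600  # maximum cache age, in seconds (assumed threshold)

def cache_is_fresh(path, max_age=ONE_WEEK):
    """Return True if the cache file exists and is younger than max_age seconds."""
    try:
        age = time.time() - os.path.getmtime(path)
    except OSError:   # cache file missing -> treat as stale
        return False
    return age < max_age
```

If `cache_is_fresh()` returns False, the program falls back to the slow 40-minute read and rewrites the cache.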

Now, if this were C I'd know how to do this in a pretty
straightforward manner.  But being new to Python, I don't know how I
can (hopefully easily) write this data to a file, and then read it out
into memory on subsequent launches.
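For a list of dictionaries like the one above, the standard `pickle` module does exactly this: serialize the structure to one file, then read it back in a single call on later launches. A minimal sketch (the cache file name and `build_fn` hook are illustrative assumptions, not part of the original program):

```python
import os
import pickle

CACHE_FILE = "cache.pkl"  # hypothetical path for the big on-disk cache

def load_or_build(build_fn, cache_file=CACHE_FILE):
    """Return the cached data if a cache file exists; otherwise build it
    with build_fn (the slow multi-file read) and cache the result."""
    if os.path.exists(cache_file):
        with open(cache_file, "rb") as f:
            return pickle.load(f)
    data = build_fn()  # the expensive 40-minute read of ~10,000 files
    with open(cache_file, "wb") as f:
        pickle.dump(data, f, protocol=pickle.HIGHEST_PROTOCOL)
    return data

# Usage with the sample list of dictionaries from above:
records = load_or_build(lambda: [
    {'el1': 0, 'el2': 0, 'el3': 0, 'el4': 0, 'el5': 0},
    {'el1': 9, 'el2': 15, 'el3': 21, 'el4': 33, 'el5': 51},
])
```

The first run pays the full read cost and writes `cache.pkl`; every later run loads the whole list of dictionaries back from that single file.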

If anyone can provide some pointers, or even some sample code on how
to accomplish this, it would be greatly appreciated.

Thanks in advance for any help.
-cjl


