Speed up loading and free memory...

GrelEns grelens at NOSPAMyahoo.NOTNEEDEDfr
Thu Mar 11 03:33:43 EST 2004


Hello,

I need advice on this: all my files are text files. One is an index (6 MB)
into another, bigger data file (20 MB). I cannot put the data into a database
and must work from the raw files.

I've tried to design an OO interface to these files, but it took 10 minutes
just to load and build all the wrapper objects (really simple ones, without
the full logic behind them), and it used 446 MB of memory! (I ran the 'top'
command while loading to check.) If I use tuples instead of objects it still
needs about 2 minutes and 240 MB...
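Roughly, the wrappers look like the sketch below (field names and the file
layout are invented for the example; the real classes have a few more
attributes but no heavy logic):

    # Minimal sketch of one wrapper class -- names are made up.
    class Record(object):
        # __slots__ drops the per-instance __dict__, which saves a lot of
        # memory when millions of instances are built.
        __slots__ = ('key', 'field1', 'field2')

        def __init__(self, key, field1, field2):
            self.key = key
            self.field1 = field1
            self.field2 = field2

    def load_all(path):
        """Build one Record per line -- this is the step that eats time and RAM."""
        records = []
        for line in open(path):
            parts = line.rstrip('\n').split('\t')
            records.append(Record(parts[0], parts[1], parts[2]))
        return records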

Since I do not need all the data in memory at once, only the index and then a
few data records it points to, do you have any clue how I can handle this?
Loading only the index takes less than 1 minute and only 60 MB of memory,
which is acceptable, but if I read the data directly from disk it is
desperately slow... A rough sketch of what I have in mind is below.
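Here is the "index only" approach I would like to make fast (file names, line
formats, and the assumption that the index stores byte offsets are mine): keep
just a key -> offset mapping in memory and seek() into the data file only when
a record is actually needed.

    def load_index(index_path):
        """Read the small index file into a dict of {key: offset_in_data_file}."""
        offsets = {}
        for line in open(index_path):
            key, offset = line.split('\t')
            offsets[key] = int(offset)
        return offsets

    class LazyData:
        """Open the big data file once; fetch single records on demand."""
        def __init__(self, data_path, offsets):
            self.f = open(data_path, 'rb')
            self.offsets = offsets

        def get(self, key):
            self.f.seek(self.offsets[key])   # jump straight to the record
            return self.f.readline()         # assumes one record per line

    # usage (hypothetical file names):
    # offsets = load_index('index.txt')
    # data = LazyData('data.txt', offsets)
    # print data.get('some_key')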

thanks a lot,




