[Tutor] Any way of monitoring a Python program's memory utilization?

Alan Gauld alan.gauld at btinternet.com
Thu Jul 17 16:36:48 CEST 2008


"Monika Jisswel" <monjissvel at googlemail.com> wrote...
> I see no problem, if you open very BIG files then your memory will
> get filled up & your system will halt,

Only if you were to foolishly read the whole file into memory at once.
Early computers had memories of only a few kilobytes but could
process files much larger than that - I recall using one with
only 16K of RAM and no hard drive (a server, not a home computer)
that was able to process data files of several megabytes. And
many PC databases are tens of gigabytes in size, and a database is
no more than a few very big files. Given that many PCs have a
hardware upper memory limit of 2GB, and 32-bit processors a limit
of 4GB, life would be very difficult if file size were restricted
by the available RAM.

You just have to make sure you read the file in small chunks
and process each chunk in turn (a technique usually referred to
as paging). In Python the chunk size is usually a line. There is
only a problem if you try to save all of the lines into a
collection of some sort.
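
For example, a minimal sketch (the filename 'big.log' and the
per-line work are just placeholders for whatever you actually
need to do):

    # Process an arbitrarily large file one line at a time;
    # only the current line is held in memory.
    total = 0
    with open('big.log') as f:
        for line in f:
            total += len(line)   # stand-in for real per-line work
    print(total)

Because the file object is iterated line by line, memory use stays
roughly constant no matter how large the file grows.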

>> So I can debug the problem.

The problem he is trying to debug is why the files are using up
his memory when he (presumably) was not expecting them to
fill it. To use the fridge analogy, why is the fridge still full after
I've eaten all the food?
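
If he wants to watch the usage from inside the program, the
standard library's resource module gives one rough way (a sketch;
Unix only, and note that the units of ru_maxrss differ by
platform):

    import resource

    # Peak resident set size of this process so far:
    # kilobytes on Linux, bytes on Mac OS X.
    usage = resource.getrusage(resource.RUSAGE_SELF)
    print("peak RSS: %d" % usage.ru_maxrss)

Printing that before and after the file-reading loop shows whether
the loop is what makes the process grow.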

Alan G. 



