Memory usage, in droves

Danyel Fisher danyelf at ics.uci.edu
Fri Jul 27 03:50:33 EDT 2001


I've been running a mail ... well, all sorts of stuff, including a digester,
that pulls a lot of data into memory. As a test, it has been digesting
the python mailing list every 24 hours or so. Phew! That is a big pile
of data. It's multi-threaded, fairly complex, and has a fair number of
little twiddly bits.

I just went away for two weeks. I came back to find it eating
something like 79% of my Linux box's memory -- something over 100 MB.

So I want to figure out the memory leaks--what objects am I not letting go?
What is the GC missing?
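One way to see what the collector is (or isn't) reclaiming is gc.set_debug(gc.DEBUG_SAVEALL), which makes every object the collector finds land in gc.garbage instead of being freed. A minimal sketch, with a made-up Digest class standing in for the real program's objects:

```python
import gc

class Digest:
    """Hypothetical stand-in for one of the digester's objects."""
    pass

# Build a reference cycle, then drop the last external references to it.
# Refcounting alone can't free the pair; only the cycle collector can.
a, b = Digest(), Digest()
a.partner, b.partner = b, a
del a, b

gc.set_debug(gc.DEBUG_SAVEALL)  # collected objects go to gc.garbage instead of being freed
gc.collect()

# Inspect what the collector swept up.
for obj in gc.garbage:
    print(type(obj).__name__)

gc.set_debug(0)  # turn debugging back off
```

Anything that shows up here repeatedly after each digest run is a good leak suspect; anything that *never* shows up but keeps growing is being held alive by a reference you've forgotten about.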

I can certainly search for the inefficiencies and the errors by hand, but it
seems to me that there ought to be a way to generate a table of the form
    <object name>    <object size>
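There's no built-in report of exactly that form, but a rough version can be assembled from gc.get_objects(), which returns every container object the collector tracks. This sketch groups them by type name; note that sys.getsizeof() (which only appeared in later Pythons) counts each object's own footprint and nothing it references, so the totals are a lower bound:

```python
import gc
import sys
from collections import defaultdict

def object_table(limit=20):
    """Tally every object the GC tracks, grouped by type name.

    Returns a list of (type_name, count, total_bytes) tuples,
    largest total first.
    """
    counts = defaultdict(int)
    sizes = defaultdict(int)
    for obj in gc.get_objects():
        name = type(obj).__name__
        counts[name] += 1
        sizes[name] += sys.getsizeof(obj)
    top = sorted(sizes, key=sizes.get, reverse=True)[:limit]
    return [(name, counts[name], sizes[name]) for name in top]

for name, count, total in object_table():
    print("%-24s %8d %12d" % (name, count, total))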

I like gc.set_debug() and the associated cookbook recipe; but I'm looking,
too, to learn what I'm simply forgetting about. Yes, weak references
are valuable, and so is careful use of explicit deletion. What I'm curious
about, though, is whether there's a way to get a sense of what is
floating around in memory. What the heck _am_ I doing with all
that memory?
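On the weak-reference point: a common source of "objects I'm not letting go" in a long-running digester is a cache that holds strong references forever. weakref.WeakValueDictionary avoids that, since its entries disappear once nothing else references the value. A sketch, with a hypothetical Message class:

```python
import weakref

class Message:
    """Hypothetical stand-in for a parsed mail message."""
    def __init__(self, subject):
        self.subject = subject

# Entries vanish automatically when the last strong reference dies,
# so the cache never pins messages in memory by itself.
cache = weakref.WeakValueDictionary()

msg = Message("Memory usage, in droves")
cache["msg-1"] = msg

print("msg-1" in cache)   # True while msg is alive
del msg                   # drop the only strong reference
print("msg-1" in cache)   # False: the cache let the message go
```

(In CPython the entry disappears immediately via refcounting; other implementations may free it later, at the next collection.)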

Danyel

More information about the Python-list mailing list