Memory usage per object

Pedro Garrett pedrogarrett at mediaone.net
Thu Jun 21 20:30:24 EDT 2001


I have a web traffic analyzer that loads large amounts of data from log
files.  The data structures I keep are all dictionaries and lists, and
all of the data items (and keys) are strings, ints, or long ints.

At the end of a run, I recurse through all my structures to add up the
total memory size and get about 11 MB, though Linux reports the script
taking up 80 MB.
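
For reference, here's a minimal sketch of the kind of recursive tally I
mean.  The per-object byte counts are guesses for the payload only (no
object headers, no dict/list slot overhead), which may well be exactly
where the undercount comes from:

import types

def tally(obj, seen=None):
    # Walk dicts and lists, counting each object only once by id().
    if seen is None:
        seen = {}
    if seen.has_key(id(obj)):
        return 0
    seen[id(obj)] = 1
    if type(obj) is types.DictType:
        total = 0
        for k, v in obj.items():
            total = total + tally(k, seen) + tally(v, seen)
        return total
    elif type(obj) is types.ListType:
        total = 0
        for item in obj:
            total = total + tally(item, seen)
        return total
    elif type(obj) is types.StringType:
        return len(obj)              # string payload only, no header
    elif type(obj) in (types.IntType, types.LongType):
        return 8                     # rough guess for the number itself
    return 0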

I'd like to find out where all the extra memory is going.  Does anybody
know exactly what the memory overhead of creating a dictionary or list is,
and what the overhead of each entry is, so that I can more accurately tally
how much memory my data structures are taking up?  Alternatively, is there
a module or function that will do this for me, or perhaps a memory profiler?
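
Lacking exact figures, the best I've come up with is a rough empirical
probe: grow a dictionary by a large number of entries and watch the process
size.  This is Linux-specific and assumes a "VmRSS:" line in
/proc/<pid>/status, so treat the numbers as ballpark only:

import os

def rss_kb():
    # Resident set size in kB, read from the Linux /proc filesystem.
    for line in open('/proc/%d/status' % os.getpid()).readlines():
        if line[:6] == 'VmRSS:':
            return int(line.split()[1])
    return 0

before = rss_kb()
d = {}
N = 200000
for i in xrange(N):
    d[i] = i                         # int key, int value
after = rss_kb()
print "approx bytes per dict entry:", (after - before) * 1024.0 / N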

I've used the gc module and it doesn't report any garbage, so I figure my
data structures are just much larger than I thought.  I'm only loading
data (I never delete any), so I don't think the problem is undeleted
references.
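
(For what it's worth, the check I did amounts to something like this:

import gc

# Force a full collection and see whether anything uncollectable shows up.
unreachable = gc.collect()
print "unreachable objects found:", unreachable
print "uncollectable garbage:", len(gc.garbage)

Both numbers come back zero for me.)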

I'm using Python 2.1 on a pretty-much-stock Red Hat 7.0 Linux i386 system.

Thanks in advance for any help,

--Pedro
