Memory leak/gc.get_objects()/Improved gc in version 2.5

Chris Mellon arkanes at gmail.com
Tue Oct 9 10:54:34 EDT 2007


On 10/8/07, crazy420fingers at gmail.com <crazy420fingers at gmail.com> wrote:
> I'm running a python program that simulates a wireless network
> protocol for a certain number of "frames" (measure of time).  I've
> observed the following:
>
> 1. The memory consumption of the program grows as the number of frames
> I simulate increases.
>
> To verify this, I've used two methods, which I invoke after every
> frame simulated:
>
> --  Parsing the /proc/<pid>/status file as in:
> http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/286222
> --  Using ps vg | grep python | awk '!/grep/ {print " ",$8}'  in an
> os.system() call.
>
> The memory usage vs. frame number graph shows some big "jumps" at
> certain points, and, after a large number of frames, shows a steady
> upward slope.
>

This would be expected if you're creating ever-larger numbers of
objects: Python uses memory pools, and as the number of simultaneously
live objects grows, the pools have to grow with it. It isn't expected
if the number of objects alive at any one time stays pretty much
constant, but the way you're trying to determine that is flawed (see
below).

> 2. I think I've verified that the objects I instantiate are actually
> freed-- I'm therefore assuming that this "leak" is "caused" by
> python's garbage collection mechanism. I count the number of objects I
> generate that are being tracked by gc as follows:
>
>     gc.collect()
>     objCount = {}
>     objList = gc.get_objects()
>     for obj in objList:
>         if getattr(obj, "__class__", None):
>             name = obj.__class__.__name__
>             if objCount.has_key(name):
>                 objCount[name] += 1
>             else:
>                 objCount[name] = 1
>
>     for name in objCount:
>         print name, " :", objCount[name]
>
>     del objList
>
> Running this snippet every hundred frames or so, shows that the number
> of objects managed by gc is not growing.
>
> I upgraded to Python 2.5. in an attempt to solve this problem. The
> only change in my observations from version 2.4 is that the absolute
> memory usage level seems to have dropped. However, I still see the
> jumps in memory usage at the same points in time.
>
> Can anybody explain why the memory usage shows significant jumps (~200
> kB or ~500 kb) over time (i.e. "frames") even though there is no
> apparent increase in the objects managed by gc? Note that I'm calling
> gc.collect() regularly.
>

You're misunderstanding the purpose of Python's GC. Python is
refcounted; the GC exists only to find and break reference cycles. If
you don't have reference cycles, the GC has nothing to do and you could
just turn it off.
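
A quick sketch of the difference (Node here is just a stand-in for your
own objects):

    import gc

    class Node(object):
        pass

    gc.disable()              # only turns off *automatic* cycle collection

    n = Node()
    del n                     # refcount hits zero -> freed on the spot

    a, b = Node(), Node()
    a.other = b               # build a reference cycle
    b.other = a
    del a, b                  # refcounts never reach zero...
    print gc.collect()        # ...so only an explicit collect reclaims them

    gc.enable()

The only garbage gc.collect() ever finds is cyclic garbage like the a/b
pair above; everything else is freed by refcounting as soon as the last
reference goes away.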

gc.get_objects() is a snapshot of the objects that exist at that
moment; it tells you nothing about the peak object count, which is what
correlates most directly with total memory use.
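
Something like this shows the problem (simulate_frame is a made-up
stand-in for one frame of your simulation):

    import gc

    def simulate_frame():
        scratch = [[] for i in xrange(50000)]  # temporaries used inside the frame
        peak = len(gc.get_objects())           # counted while they're still alive
        del scratch
        return peak

    peak_inside = simulate_frame()
    after_frame = len(gc.get_objects())        # what your per-frame check sees
    print "inside the frame:", peak_inside
    print "after the frame :", after_frame     # much smaller; the peak is invisible

Your per-frame count can stay flat forever, even though the peak inside
each frame (and with it the size of the pools) is what actually drives
the memory numbers you're graphing.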

> Thanks for your attention,
>
> Arvind
>


