garbage collector and slowdown (guillaume weymeskirch)
guillaume weymeskirch
l7gbq06tp0bjko1 at jetable.org
Sat Oct 4 06:24:07 EDT 2008
Hello everybody,
To test the Python 2.5 garbage collector, I wrote a trivial script that
allocates dummy objects of various sizes, then forgets them in a loop.
The garbage collector seems to work well, keeping the memory used bounded.
But I've noticed a near-linear slowdown of the execution: after a few
minutes, and several million allocated and freed objects, each
iteration takes more and more time to execute.
Has anybody else noticed this? Could Python suffer from some memory
fragmentation in long-running processes?
I get the same slowdown with Python 2.5.2 on a 64-bit Gentoo Linux
box and on a WinXP machine.
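As a quick check (just a sketch on my side, not part of the measurements
above), timing the same kind of workload with the collector switched off
should separate collector cost from allocator cost: if the slowdown
persists with gc disabled, the collector itself is probably not the
culprit. The workload() here is a hypothetical stand-in for the doit()
in the script further down:

import gc
import time

def workload():
    # Hypothetical stand-in for doit(): allocate and drop a batch of objects.
    junk = [[i] * 10 for i in xrange(30000)]

gc.enable()
t0 = time.time(); workload(); with_gc = time.time() - t0
gc.disable()
t0 = time.time(); workload(); without_gc = time.time() - t0
gc.enable()   # restore the default state
print "with gc: %.3fs  without gc: %.3fs" % (with_gc, without_gc)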
FYI, the script below demonstrates this: it prints the seconds taken
by each iteration.
--- cut here ---
import gc
import random
from timeit import Timer

class Reference(object):
    # Class attribute shared by all subclasses: counts the instances
    # created per class name.
    refcount = {}

    def __init__(self):
        name = self.__class__.__name__
        c = self.refcount.setdefault(name, 0)
        self.refcount[name] = c + 1

class ClassicSmall(Reference):
    # A small object with a handful of integer attributes.
    def __init__(self):
        super(ClassicSmall, self).__init__()
        self.a = 0
        self.b = 20000
        self.c = random.randint(0, 500)
        self.d = random.randint(0, 500)

class ClassicBig(Reference):
    # A bigger object holding a 1000-element list.
    def __init__(self):
        super(ClassicBig, self).__init__()
        self.mylist = range(1000)

class Tree(Reference):
    # A node with two children and a small string payload.
    def __init__(self, left, right):
        super(Tree, self).__init__()
        self.left = left
        self.right = right
        self.data = ''.join([chr(x) for x in range(65, 128)])

def doit():
    # Allocate 90000 objects; they all become garbage when doit() returns.
    smalls = []
    bigs = []
    trees = []
    for i in xrange(30000):
        smalls.append(ClassicSmall())
        bigs.append(ClassicBig())
        trees.append(Tree(1, 2))

if __name__ == '__main__':
    # timeit disables garbage collection by default, so the setup code
    # re-enables it; gc is imported explicitly there rather than relying
    # on timeit's own internals.
    t = Timer("doit()", "import gc; from __main__ import doit; gc.enable()")
    # 'min' and 'max' shadow the builtins here, which is harmless in
    # this script.
    min = 0.9e300
    max = 0.
    try:
        while True:
            d = t.timeit(1)
            if d < min:
                min = d
            if d > max:
                max = d
            print d
    except KeyboardInterrupt:
        # Stop with Ctrl-C and fall through to the summary.
        pass
    print Reference.refcount
    print "max=%f min=%f " % (max, min)
--- cut here ---