Python memory handling

frederic.pica at gmail.com
Thu May 31 07:40:04 EDT 2007


Greets,

I'm having trouble getting Python to free memory; how can I force it
to release memory back to the OS?
I've tried del and gc.collect() with no success.
Here is a code sample, parsing an XML file under Linux with Python 2.4
(same problem on Windows with 2.5, tested with the first example):
#Python interpreter memory usage : 1.1 Mb private, 1.4 Mb shared
#Using http://www.pixelbeat.org/scripts/ps_mem.py to get memory information
import cElementTree as ElementTree #meminfo: 2.3 Mb private, 1.6 Mb shared
import gc #no memory change

et=ElementTree.parse('primary.xml') #meminfo: 34.6 Mb private, 1.6 Mb shared
del et #no memory change
gc.collect() #no memory change

So how can I free the 32.3 Mb taken by ElementTree?
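For what it's worth, one pattern that keeps ElementTree's footprint small in the first place is iterparse with element clearing, so the whole tree never has to be resident at once. A sketch against a small in-memory document (the `package`/`name` tag names are just stand-ins, not taken from your primary.xml; written for the modern `xml.etree.ElementTree`, which absorbed cElementTree):

```python
import io
import xml.etree.ElementTree as ET  # successor of the cElementTree module

# Small stand-in document; a real primary.xml would be streamed the same way.
xml_bytes = b"<metadata>" + b"".join(
    b"<package><name>pkg%d</name></package>" % i for i in range(1000)
) + b"</metadata>"

names = []
# iterparse yields each element as its end tag is seen, so memory use is
# bounded by one subtree at a time rather than the whole document.
for event, elem in ET.iterparse(io.BytesIO(xml_bytes)):
    if elem.tag == "package":
        names.append(elem.find("name").text)
        elem.clear()  # drop the element's children once processed

print(len(names))    # 1000
print(names[-1])     # pkg999
```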

The same problem occurs with a simple file.readlines():
#Python interpreter memory usage : 1.1 Mb private, 1.4 Mb shared
import gc #no memory change
f=open('primary.xml') #no memory change
data=f.readlines() #meminfo: 12 Mb private, 1.4 Mb shared
del data #meminfo: 11.5 Mb private, 1.4 Mb shared
gc.collect() # no memory change
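A side note, in case it helps: iterating over the file object directly avoids ever building the big list that readlines() creates, so there is much less for the allocator to hold on to. A minimal sketch, using StringIO as a stand-in for your open file:

```python
import io

# Stand-in for open('primary.xml'); a real file object iterates the same way.
f = io.StringIO("a\nbb\nccc\n")

# The file object yields one line at a time, so only a single line is
# resident instead of the whole list that readlines() would build.
count = 0
longest = ""
for line in f:
    count += 1
    if len(line) > len(longest):
        longest = line

print(count)    # 3
print(longest)  # ccc
```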

But it works fine with file.read():
#Python interpreter memory usage : 1.1 Mb private, 1.4 Mb shared
import gc #no memory change
f=open('primary.xml') #no memory change
data=f.read() #meminfo: 7.3Mb private, 1.4 Mb shared
del data #meminfo: 1.1 Mb private, 1.4 Mb shared
gc.collect() # no memory change

So as far as I can see, Python maintains a memory pool for lists.
In my first example, if I reparse the XML file, the memory barely
grows (by 0.1 Mb, precisely), so I think I'm right about the memory
pool.

But is there a way to force Python to release this memory?
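One common workaround (a sketch under my own assumptions, not specific to your setup): do the memory-hungry work in a child process and pass back only a small summary, since everything a process allocated is returned to the OS when it exits. Here the child just counts lines in a temp file as a stand-in for the real parse:

```python
import os
import subprocess
import sys
import tempfile

# Create a small stand-in data file; primary.xml would play this role.
fd, path = tempfile.mkstemp(suffix=".txt")
with os.fdopen(fd, "w") as f:
    f.write("x\n" * 500)

# The child does the heavy allocation and prints a tiny result; all of
# its memory is reclaimed by the OS the moment it exits.
child_code = (
    "import sys\n"
    "data = open(sys.argv[1]).readlines()\n"
    "print(len(data))\n"
)
result = subprocess.run(
    [sys.executable, "-c", child_code, path],
    capture_output=True, text=True, check=True,
)
line_count = int(result.stdout.strip())
print(line_count)  # 500

os.remove(path)
```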

Regards,
FP




More information about the Python-list mailing list