pylab/matplotlib large plot memory management - bug? or tuning parameter needed?

bdb112 boyd.blackwell at gmail.com
Mon Oct 19 15:15:45 EDT 2009


Summary:

It is not straightforward to avoid memory leaks/consumption in pylab.
If we define

x = arange(1e6)  # adjust size to make the increment visible, yet fast enough to plot

then each repetition of

plot(x, hold=0)  # consumes increasing memory according to the Ubuntu system monitor

Details:
# versions: Ubuntu 9.04 standard (Python 2.6.2, IPython 0.9.1, matplotlib 0.98.5.2) - results reported here
# or win32: matplotlib 0.98.3, Python 2.5 (similar, although memory usage levels off and is reused after 3-4 plot()s)
First, turn off output caching in ipython because it consumes memory
(as it should)

ipython -pylab -cs 0

plot(x,hold=0)
plot(x,hold=0)
plot(x,hold=0)

# closing the window doesn't help much; neither does close() or any of
# the below individually
# antidote
plot(hold=0); gcf().clf(); close()  # first often doesn't help!
plot(hold=0); gcf().clf(); close()  # need all three!
plot(hold=0); gcf().clf(); close()
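For what it's worth, the loop below is a sketch of doing that cleanup explicitly after every plot, using the object-oriented API instead of the pylab hold= keyword (it assumes matplotlib is installed and uses the non-interactive Agg backend, so no GUI window is involved):

```python
import gc
import matplotlib
matplotlib.use('Agg')          # non-interactive backend: no window to close
import matplotlib.pyplot as plt
import numpy as np

x = np.arange(1e5)             # smaller array just to keep the sketch fast
for _ in range(3):
    fig, ax = plt.subplots()
    ax.plot(x)
    fig.clf()                  # drop the Axes (and their references to the data)
    plt.close(fig)             # deregister the figure from pyplot's figure manager
gc.collect()                   # break any remaining reference cycles

print(len(plt.get_fignums()))  # 0 -> pyplot is holding no live figures
```

The point of the explicit plt.close(fig) is that pyplot keeps a global registry of figures, so a figure stays alive even after its window is gone unless it is closed through pyplot.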

As stated above, the Windows version apparently starts more aggressive
garbage collection after 3-4 plots.
The Ubuntu version reaches my system memory limit (1 GB) without
reclaiming memory - i.e. memory usage just keeps growing until swap
space is used when the array is 2e6 elements, consuming 100 MB per plot.
For 1e6 elements, memory usage grows in about ten 50 MB steps, and then
some garbage collection seems to happen, although more can be freed
with the triple line above.
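A rough way to watch this growth from inside the interpreter, instead of via the system monitor, is to read the process's resident set size from /proc (Linux-only; the VmRSS field is a Linux convention) and print it after each plot():

```python
# Sketch: report this process's resident set size, Linux-only.
def rss_kb():
    """Return resident set size in kB from /proc/self/status."""
    with open('/proc/self/status') as f:
        for line in f:
            if line.startswith('VmRSS:'):
                return int(line.split()[1])  # the value is reported in kB
    return None

print(rss_kb())  # e.g. prints the current resident size in kB
```

Calling rss_kb() before and after each plot() would show exactly how many kB each plot adds, independent of what VMware reports to the guest.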

1/ I am running under VMware, so maybe VMware isn't reporting the
correct physical memory size to Ubuntu/Python - how can I check this?

2/ possible bug - why doesn't closing the plot window release all
memory it uses?  Especially when this approaches machine memory size.

3/ Are there python/matplotlib memory management tuning parameters I
can tweak?
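On the pure-Python side, the only general knobs I know of are in the stdlib gc module; they are not matplotlib-specific, so whether they help here is an open question. A sketch:

```python
import gc

print(gc.get_threshold())    # CPython's default is (700, 10, 10)
gc.set_threshold(100, 5, 5)  # trigger generation-0 collections more aggressively
n = gc.collect()             # force a full collection right now
print(n >= 0)                # n is the number of unreachable objects found
```

Lowering the thresholds makes the cyclic garbage collector run more often, which only helps if the plots are being kept alive by reference cycles rather than by live references inside matplotlib's figure registry.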



More information about the Python-list mailing list