slowdown with massive memory usage

Andrew MacIntyre andymac at bullseye.apana.org.au
Fri Jul 30 19:34:31 EDT 2004


On Sat, 30 Jul 2004, Hallvard B Furuseth wrote:

> I have a program which starts by reading a lot of data into various
> dicts.
>
> When I moved a function to create one such dict from near the beginning
> of the program to a later time, that function slowed down by a factor
> of 8-14: 38 sec at 15M memory usage, 570 sec at 144M, 330 sec at 200M.
> Is there anything I can do to fix that?
>
> When the program is running, the system has 18M free memory and is not
> doing any swapping.  `python -O' did not help.
>
> Python: 2.2.3 configured with --enable-ipv6 --enable-unicode=ucs4
> on i386-redhat-linux-gnu (Linux 2.4.21-15.0.2.ELsmp).
> I'm not sure if upgrading to Python 2.3 is an option at the moment;
> I'll check if necessary.
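
Before anything else, it may be worth confirming that the time really is
going into building the dict rather than into reading the data.  A rough
sketch of the sort of timing I mean (build_dict and data.txt here are
hypothetical stand-ins for your actual function and input):

    import time

    def build_dict(lines):
        # hypothetical stand-in for the function that got slower
        d = {}
        for line in lines:
            d[line.strip()] = len(line)
        return d

    lines = open("data.txt").readlines()   # read first, so I/O isn't timed
    t0 = time.time()
    d = build_dict(lines)
    print "built %d entries in %.1f seconds" % (len(d), time.time() - t0)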

Python 2.2 didn't use PyMalloc (Python's small-object allocator) by
default.  That leaves the interpreter at the mercy of the platform's
malloc()/realloc()/free(), and Python has hit rough spots with nearly
every platform's implementation of those routines - which is why
PyMalloc was written.
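
As a rough illustration of why the state of the heap matters: the snippet
below builds the same dict twice, once on a fresh heap and once after a
few million small objects have been allocated and kept alive.  The sizes
are made up, and whether you see any difference at all depends entirely
on the platform allocator - with a decent malloc (or with PyMalloc) the
two times should be close.

    import time

    def build(n):
        d = {}
        for i in xrange(n):
            d[str(i)] = i          # lots of small allocations
        return d

    t0 = time.time()
    build(500000)
    print "fresh heap:  %.1f seconds" % (time.time() - t0)

    ballast = [str(i) for i in xrange(2000000)]   # keep small objects alive

    t0 = time.time()
    build(500000)
    print "loaded heap: %.1f seconds" % (time.time() - t0)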

It isn't certain that this is your problem, but if you can rebuild your
Python interpreter with PyMalloc enabled (the configure option is
--with-pymalloc, I think), you can find out.
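
If you're not sure how a given interpreter was built, one rough way to
check from Python itself is to look for the define in pyconfig.h (a
sketch; I'm assuming the define is spelled WITH_PYMALLOC, which is how
it appears in the 2.2/2.3 sources as far as I recall):

    from distutils import sysconfig

    # Print any line mentioning WITH_PYMALLOC from the interpreter's
    # pyconfig.h; "#define WITH_PYMALLOC 1" means it was compiled in.
    config_h = sysconfig.get_config_h_filename()
    for line in open(config_h):
        if "WITH_PYMALLOC" in line:
            print line,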

Be warned that there were bugs in PyMalloc that weren't fixed until
Python 2.3 was being prepared (when PyMalloc became enabled by default);
as far as I recall, those fixes were never backported to the 2.2.x line.
So I wouldn't recommend running a PyMalloc-enabled 2.2.3 interpreter in
production without seriously testing all your production code against it.

--
Andrew I MacIntyre                     "These thoughts are mine alone..."
E-mail: andymac at bullseye.apana.org.au  (pref) | Snail: PO Box 370
        andymac at pcug.org.au             (alt) |        Belconnen  ACT  2616
Web:    http://www.andymac.org/               |        Australia


