Upper memory limit

Bengt Richter bokr at oz.net
Wed May 15 20:23:23 EDT 2002


On 15 May 2002 18:09:14 -0400, Kragen Sitaker <kragen at pobox.com> wrote:

>Siegfried Gonzi <siegfried.gonzi at kfunigraz.ac.at> writes:
>> I have a calculation (an external C function included with the help of
>> SWIG; an external Fortran function included with the help of F2PY; some
>> preprocessed Lisp files; a dozen binary satellite data files). One run
>> of the code takes about 1 hour. The execution time is not the problem!
>> Memory is not a problem either, but after 1 hour it consumes 200 MB of
>> RAM. If I start the calculation from the DOS shell I can see that
>> Python gives memory back to the OS (I was advised to start the
>> calculation from the DOS command line), but this is only the case for
>> 2 runs. If I start the calculation a third time, Python wreaks havoc.
>
>I have had a somewhat similar problem.  If I create a dict containing
>lots of small Numeric arrays, write the small arrays to a file, delete
>the dict, and load the file as a large Numeric array, I get a
>MemoryError.  If I exit and restart Python between deleting the dict
>and loading the file, it works fine.

Would you try doing whatever_dict.clear() before deleting it, and see if
that makes any difference?
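
Roughly what I have in mind, as a sketch of the order of operations only
(the dict name, the loop, and the plain lists standing in for your small
Numeric arrays are all made up for illustration):

    # Sketch only -- plain lists stand in for the small Numeric arrays,
    # and the file I/O is elided.
    work = {}
    for i in range(10000):
        work[i] = [0.0] * 1000        # many small "arrays"

    # ... write the small arrays out to a file here ...

    work.clear()    # explicitly drop every reference held by the dict
    del work        # then delete the (now empty) dict itself

    # ... then load the file back as one large Numeric array ...

The idea is just to see whether emptying the dict explicitly before
deleting it changes the memory behaviour you observe.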

Here is some discussion of a case where it did make a difference:

http://mail.python.org/pipermail/python-list/2002-February/088476.html

Regards,
Bengt Richter


