[SciPy-user] How to free memory allocated to array in weave.inline?

Angus McMorland amcmorl at gmail.com
Sat Aug 25 07:37:46 EDT 2007


Hi all,

I'm trying to debug a memory leak in my code, which I suspect comes
from some weave.inline code I have written. Unfortunately, I don't
know anything about debugging memory leaks, and only marginally more
about writing in C, so I would appreciate some advice from the list. I
needed to switch to C to speed up my code - and that bit worked,
giving roughly a 10x improvement over my numpy implementation.

The code generates a 3-D array, looping through the whole array and
assigning values based on each element's co-ordinate position. In the
final program this array generation has to be done many times (ideally
1000s), but when I do so the process's memory consumption grows with
every call.
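
For concreteness, the numpy version does something roughly like the
sketch below (a much simplified stand-in with made-up geometry; the
real spine_model_new fills the array from a more involved model):

import numpy as np

def build_numpy_sketch(n=200, radius=50.0):
    '''much simplified stand-in for spine_model_new: flag every voxel
    that lies within radius of the array centre'''
    z, y, x = np.ogrid[0:n, 0:n, 0:n]
    centre = (n - 1) / 2.0
    r2 = (x - centre)**2 + (y - centre)**2 + (z - centre)**2
    return r2 < radius**2   # boolean 3-D array, shape (n, n, n)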

I've narrowed down the problem to the following code snippet:

from scipy import weave

def build_mem():
    '''an implementation of spine_model_new using weave.inline'''
    code = '''
    npy_intp dims[3] = {200,200,200};
    PyArrayObject* ar = (PyArrayObject *)PyArray_ZEROS(3, &dims[0], NPY_BOOL, 0);
    return_val = PyArray_Return(ar);
    '''
    return weave.inline( code )

def loop_test():
    for i in xrange(150):
        sp = build_mem()

if __name__ == "__main__":
    sp = build_mem()

Running loop_test uses up some 1.5 GB of memory, which is roughly
consistent with the array size times the 150 iterations of the loop
(200x200x200 bools = 8,000,000 bytes per call, so about 1.2 GB in
total). Valgrind's memcheck seems to confirm that the arrays are never
freed (if I understand the output correctly); it reports the following
when run on the above module, which calls build_mem just once:

==14571== 8,000,000 bytes in 1 blocks are possibly lost in loss record 45 of 45
==14571==    at 0x40244B0: malloc (vg_replace_malloc.c:149)
==14571==    by 0x48FFCA0: (within
/usr/lib/python2.4/site-packages/numpy/core/multiarray.so)
==14571==    by 0x4901EF3: (within
/usr/lib/python2.4/site-packages/numpy/core/multiarray.so)
==14571==    by 0x606B9A5: compiled_func(_object*, _object*)
(sc_333f8aabfa93fdf594860ccb438bae6d0.cpp:661)
==14571==    by 0x805A506: PyObject_Call (in /usr/bin/python2.4)
==14571==    by 0x80B458B: PyEval_CallObjectWithKeywords (in /usr/bin/python2.4)
==14571==    by 0x80B0523: (within /usr/bin/python2.4)
==14571==    by 0x80B9C49: PyEval_EvalFrame (in /usr/bin/python2.4)
==14571==    by 0x80B99C3: PyEval_EvalFrame (in /usr/bin/python2.4)
==14571==    by 0x80BB0E4: PyEval_EvalCodeEx (in /usr/bin/python2.4)
==14571==
==14571== LEAK SUMMARY:
==14571==    definitely lost: 40 bytes in 1 blocks.
==14571==    indirectly lost: 24 bytes in 1 blocks.
==14571==      possibly lost: 8,028,594 bytes in 37 blocks.
==14571==    still reachable: 3,795,173 bytes in 2,906 blocks.
==14571==         suppressed: 0 bytes in 0 blocks.

So it looks very much like I'm not freeing the array correctly at the
C level. I've looked through the numpy book and the wiki
(http://www.scipy.org/Cookbook/C_Extensions/NumPy_arrays), but
couldn't work out a solution from either.

What do I need to do to free the array memory from each iteration of the loop?
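
In case it helps focus an answer: my best guess so far is that
assigning to return_val takes its own reference to the array, so that
the reference I get from PyArray_ZEROS also needs to be dropped inside
the inline code, along the lines of the version below. I haven't been
able to confirm whether that is right, though, or whether it would
instead free the array prematurely.

def build_mem_decref():
    '''the same as build_mem, but with an explicit Py_XDECREF once
    return_val holds the array (just a guess at the fix)'''
    code = '''
    npy_intp dims[3] = {200,200,200};
    PyArrayObject* ar = (PyArrayObject *)PyArray_ZEROS(3, &dims[0], NPY_BOOL, 0);
    return_val = PyArray_Return(ar);
    /* guess: return_val now holds its own reference, so release the
       one returned by PyArray_ZEROS */
    Py_XDECREF(ar);
    '''
    return weave.inline( code )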

Thanks,

Angus.
-- 
AJC McMorland, PhD Student
Physiology, University of Auckland


