[Numpy-discussion] Memory issue with memory-mapped array assignment
Thomas Robitaille
thomas.robitaille at gmail.com
Mon Mar 18 05:56:52 EDT 2013
Hi everyone,
I've come across a memory issue when trying to assign data to slices
of a Numpy memory-mapped array. The short story is that if I create a
memory mapped array and try to add data to subsets of the array many
times in a loop, the memory usage of my code grows over time,
suggesting there is some kind of memory leak.
More specifically, if I run the following script:
import random
import numpy as np
image = np.memmap('image.np', mode='w+', dtype=np.float32, shape=(10000, 10000))
print("Before assignment")
for i in range(1000):
    x = random.uniform(1000, 9000)
    y = random.uniform(1000, 9000)
    imin = int(x) - 128
    imax = int(x) + 128
    jmin = int(y) - 128
    jmax = int(y) + 128
    data = np.random.random((256, 256))
    image[imin:imax, jmin:jmax] = image[imin:imax, jmin:jmax] + data
    del x, y, imin, imax, jmin, jmax, data
the memory usage grows to ~300 MB after 1000 iterations (and
proportionally more if I increase the number of iterations).
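For what it's worth, here is a scaled-down sketch of the same loop with two changes I am only guessing might be relevant: using in-place `+=` (which avoids allocating a second temporary for the slice) and calling `flush()` periodically, on the assumption that the growth comes from dirty memory-mapped pages accumulating before being written back to disk. I haven't confirmed that either change fixes the issue:

```python
import os
import random
import tempfile

import numpy as np

# Scaled-down version of the loop above (1000x1000 array, 64x64 tiles),
# written to a temporary file so it can be run safely anywhere.
path = os.path.join(tempfile.mkdtemp(), 'image_small.np')
image = np.memmap(path, mode='w+', dtype=np.float32, shape=(1000, 1000))

for i in range(100):
    x = random.uniform(100, 900)
    y = random.uniform(100, 900)
    imin, imax = int(x) - 32, int(x) + 32
    jmin, jmax = int(y) - 32, int(y) + 32
    data = np.random.random((64, 64))
    # In-place addition, unlike the "a = a + b" form in the original loop
    image[imin:imax, jmin:jmax] += data
    if i % 10 == 0:
        image.flush()  # write dirty pages back to the file

image.flush()
print(image.sum() > 0)  # sanity check that data actually landed in the file
```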
I've written up a more detailed overview of the issue on Stack Overflow
(with memory profiling):
http://stackoverflow.com/questions/15473377/memory-issue-with-numpy-memory-mapped-array-assignment
Does anyone have any idea what is going on, and how I can avoid this issue?
Thanks!
Tom