[SciPy-User] down-sampling an array by averaging - vectorized form?
andrew giessel
andrew.giessel at gmail.com
Fri Feb 10 22:19:01 EST 2012
Hello all,
I'm looking to down-sample an image by averaging. I quickly hacked up the
following code, which does exactly what I want, but the double loop is slow
(the images I'm working with are ~2000x2000 pixels). Is there a nice way
to vectorize this? A quick profile showed that most of the time is spent
to vectorize this? A quick profile showed that most of the time is spend
averaging- perhaps there is a way to utilize np.sum or np.cumsum, divide
the whole array, and then take every so many pixels?
This method of down-sampling (spatial averaging) makes sense for the type
of data I'm using and yields good results, but I'm also open to
alternatives. Thanks in advance!
Andrew
######################
import numpy as np
def downsample(array, reduction):
    """Example call for 2-fold size reduction: newImage = downsample(image, 2)"""
    # integer-divide the shape and pass it as a tuple
    newArray = np.empty((array.shape[0] // reduction, array.shape[1] // reduction))
    for x in range(newArray.shape[0]):
        for y in range(newArray.shape[1]):
            # slice ends are exclusive, so each block covers the full
            # reduction x reduction window
            newArray[x, y] = np.mean(array[x*reduction:(x+1)*reduction,
                                           y*reduction:(y+1)*reduction])
    return newArray
######################
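[Editorial sketch, not part of the original post: one common way to vectorize
this kind of block averaging is to reshape the array so that each
reduction x reduction block gets its own pair of axes, then take the mean over
those axes. The function name downsample_vec is hypothetical; the trim step
simply discards trailing rows/columns that don't fill a complete block.]

######################
import numpy as np

def downsample_vec(array, reduction):
    """Block-average a 2D array by an integer factor, without Python loops."""
    # trim so both dimensions divide evenly by the reduction factor
    h = (array.shape[0] // reduction) * reduction
    w = (array.shape[1] // reduction) * reduction
    trimmed = array[:h, :w]
    # reshape to (blocks_y, reduction, blocks_x, reduction) and
    # average over the two block-interior axes
    blocks = trimmed.reshape(h // reduction, reduction,
                             w // reduction, reduction)
    return blocks.mean(axis=(1, 3))
######################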
--
Andrew Giessel, PhD
Department of Neurobiology, Harvard Medical School
220 Longwood Ave Boston, MA 02115
ph: 617.432.7971 email: andrew_giessel at hms.harvard.edu