[SciPy-User] down-sampling an array by averaging - vectorized form?

Tony Yu tsyu80 at gmail.com
Sat Feb 11 10:40:40 EST 2012


On Fri, Feb 10, 2012 at 10:11 PM, Andrew Giessel <andrew_giessel at hms.harvard.edu> wrote:

> Hello all,
>
> I'm looking to down-sample an image by averaging.  I quickly hacked up the
> following code, which does exactly what I want, but the double loop is slow
> (the images I'm working with are ~2000x2000 pixels).  Is there a nice way
> to vectorize this?  A quick profile showed that most of the time is spent
> averaging; perhaps there is a way to utilize np.sum or np.cumsum, divide
> the whole array, and then take every so many pixels?
>
> This method of down-sampling (spatial averaging) makes sense for the type
> of data I'm using and yields good results, but I'm also open to
> alternatives.  Thanks in advance!
>
> Andrew
>
> ######################
> import numpy as np
>
> def downsample(array, reduction):
>     """example call for 2fold size reduction:  newImage =
> downsample(image, 2)"""
>
>     newArray = np.empty(array.shape[0]/reduction, array.shape[1]/reduction)
>
>     for x in range(newArray.shape[0]):
>         for y in range(newArray.shape[1]):
>             newArray[x,y] =
> np.mean(array[x*reduction:((x+1)*reduction)-1, y*reduction:((y+1)*reduction)-1])
>
>     return newArray
> ######################
>
>
I think `scipy.ndimage.zoom` does what you want, just phrased the other way
around: a zoom factor below 1 shrinks the image, so your 2-fold size reduction
example would be

>>> from scipy import ndimage
>>> small_image = ndimage.zoom(image, 0.5)

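Note that `zoom` interpolates (cubic splines by default) rather than averaging
blocks, so the result won't be identical to your loop. If you specifically want
block averaging, a reshape trick vectorizes it nicely. Here is a minimal,
untested sketch along those lines; it assumes both image dimensions are exact
multiples of the reduction factor, and `block_mean` is just a placeholder name:

import numpy as np

def block_mean(array, reduction):
    """Down-sample by averaging non-overlapping reduction x reduction blocks.

    Assumes array.shape[0] and array.shape[1] are exact multiples of
    `reduction`.
    """
    h, w = array.shape
    # Split each axis into (number of blocks, block size), then average
    # over the two block-size axes.
    blocks = array.reshape(h // reduction, reduction, w // reduction, reduction)
    return blocks.mean(axis=(1, 3))

>>> small_image = block_mean(image, 2)
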
-Tony