[SciPy-User] variable smoothing kernel

Wolfgang Kerzendorf wkerzendorf at googlemail.com
Sat Mar 26 21:54:58 EDT 2011


Well, your time is my wavelength: the kernel should vary with wavelength.

I agree that implementing it in Cython would probably make it faster. I do 
suspect, however, that it won't be as fast as the normal fixed-kernel smoothing function.

I have learned that multiplying functions in Fourier space is the same 
as convolving them. I believe that is why the ndimage kernels are so 
incredibly fast.
I wanted to see if there is a similar shortcut for a variable kernel.
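
For the fixed-kernel case, the shortcut I mean is just the convolution
theorem. A quick sanity check with plain NumPy (illustrative data, not
my actual spectrum) looks like this:

import numpy as np

rng = np.random.default_rng(0)            # illustrative data, not a real spectrum
flux = rng.standard_normal(256)
x = np.arange(-16, 17)
kernel = np.exp(-0.5 * (x / 3.0) ** 2)    # fixed-width Gaussian
kernel /= kernel.sum()

direct = np.convolve(flux, kernel, mode='full')

n = flux.size + kernel.size - 1           # zero-pad to the full convolution length
via_fft = np.fft.irfft(np.fft.rfft(flux, n) * np.fft.rfft(kernel, n), n)

print(np.allclose(direct, via_fft))       # True up to floating-point error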

I have copied my previous attempts (which are written very simply and 
take a long time) into this pastebin: http://pastebin.com/KkcEATs7
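
For comparison, a loop-free way to do the variable-width smoothing with
plain NumPy might look roughly like the sketch below. This is not the
code in the pastebin; the function name and the linearly growing sigma
are just for illustration, and the dense weight matrix costs O(N^2)
memory and time:

import numpy as np

def variable_gaussian_smooth(flux, sigma):
    """Smooth flux with a Gaussian whose width sigma[i] depends on index i."""
    flux = np.asarray(flux, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    idx = np.arange(flux.size)
    # weights[i, j] = Gaussian centred on pixel i, width sigma[i], evaluated at j
    weights = np.exp(-0.5 * ((idx[None, :] - idx[:, None]) / sigma[:, None]) ** 2)
    weights /= weights.sum(axis=1, keepdims=True)   # normalise each row
    return weights @ flux

flux = np.random.randn(1000)
sigma = 1.0 + 0.01 * np.arange(flux.size)           # width grows linearly with index
smoothed = variable_gaussian_smooth(flux, sigma)

That still does the full O(N^2) work, so it is not the Fourier-style
shortcut I am after, but it does avoid the Python-level loop.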

Thanks for your help
      Wolfgang
On 27/03/11 12:09 PM, Nicolau Werneck wrote:
> If I understand correctly, you want a filter that varies over "time".
> Losing time-invariance makes it inherently more complicated to
> calculate than a normal linear time-invariant filter.
>
> I second Christopher's suggestion: try Cython, it's great for this
> kind of thing. Or perhaps scipy.weave.
>
> ++nic
>
> On Sat, Mar 26, 2011 at 9:52 AM, Wolfgang Kerzendorf
> <wkerzendorf at googlemail.com>  wrote:
>> Hello,
>>
>> I'm interested in a Gaussian smoothing where the kernel width depends
>> (linearly in this case) on the index it is operating on. I
>> implemented it myself (probably badly) and it takes forever, compared
>> to the Gaussian smoothing with a fixed kernel in ndimage.
>>
>> I could interpolate the array to be smoothed onto a log-spaced grid
>> and keep the kernel fixed, but that is complicated and I'd rather
>> avoid it (a rough sketch of this idea follows the quoted message).
>>
>> Is there a good way of doing that?
>>
>> Cheers
>>      Wolfgang
>
>
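
For completeness, the log-grid idea from my original message (quoted
above) would go roughly like the sketch below. It is only a sketch,
assuming the wavelength array is increasing and that the kernel width
scales as wavelength / resolution; the names are illustrative:

import numpy as np
from scipy.ndimage import gaussian_filter1d

def smooth_via_log_grid(wavelength, flux, resolution):
    """Smooth flux with sigma_lambda = wavelength / resolution (illustrative)."""
    # On a log-wavelength grid, a sigma proportional to wavelength is constant.
    log_wl = np.linspace(np.log(wavelength[0]), np.log(wavelength[-1]),
                         wavelength.size)
    flux_log = np.interp(log_wl, np.log(wavelength), flux)
    sigma_pix = (1.0 / resolution) / (log_wl[1] - log_wl[0])  # constant, in pixels
    smoothed_log = gaussian_filter1d(flux_log, sigma_pix)
    # Interpolate back onto the original wavelength grid.
    return np.interp(np.log(wavelength), log_wl, smoothed_log)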



