[SciPy-User] Autocorrelation function: Convolution vs FFT

josef.pktd at gmail.com
Tue Jun 22 15:46:23 EDT 2010


On Tue, Jun 22, 2010 at 1:49 PM, Skipper Seabold <jsseabold at gmail.com> wrote:
> I am trying to compute the autocorrelation via convolution and via fft
> and am far from an expert in DSP.  I'm wondering if someone can spot
> anything that might introduce numerical inaccuracies or if I'm stuck
> with the following two being slightly different.
>
> Generate some autocorrelated data:
>
> import numpy as np
> nobs = 150000
> x = np.zeros((nobs))
> for i in range(1,nobs):
>    x[i] = .85 * x[i-1] + np.random.randn()
>
> # compute ACF using convolution
>
> x0 = x - x.mean()
>
> # this takes a while for the big data
> acf1 = np.correlate(x0,x0,'full')[nobs-1:]/nobs
> acf1 /= acf1[0]
>
>
> # compute ACF using FFT
>
> Frf = np.fft.fft(x0, n=2*nobs) # zero-pad so the circular convolution doesn't wrap around
> Sf = Frf * Frf.conjugate()
> acf2 = np.fft.ifft(Sf)
> acf2 = acf2[1:nobs+1]/nobs
> acf2 /= acf2[0]
> acf2 = acf2.real
>
> np.linalg.norm(acf1-acf2, ord=2)
>
> They are pretty close, but I would expect them to be closer than this.
>
> np.max(acf1-acf2)
> 0.006581962491189159
>
> np.min(acf1-acf2)
> -0.0062705596399049799

I don't see anything wrong, but I don't remember these things by heart.
Why don't you use scipy.signal.fftconvolve, or borrow its source, which
is what I did for one version of the fft convolutions.
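
Something like this minimal sketch (assuming the same demeaned x0 and
nobs as in your example; I haven't checked it against your numbers):

from scipy import signal

# for a real series, correlation is convolution with the second
# argument reversed, so this matches np.correlate(x0, x0, 'full')
# but uses the fft under the hood
acf_fc = signal.fftconvolve(x0, x0[::-1], mode='full')[nobs - 1:] / nobs
acf_fc /= acf_fc[0]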

BTW: the best padding is to a power of 2. There is also the issue of
one-sided (past) versus two-sided (past and future) correlation, but I
don't remember whether it changes anything in this case.
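
As a rough sketch of what I mean by the padding (again assuming your
x0 and nobs, and untested):

import numpy as np

# pad to the next power of 2 >= 2*nobs - 1 so the circular convolution
# from the fft does not wrap around and the transform size is fast
nfft = 2 ** int(np.ceil(np.log2(2 * nobs - 1)))
Frf = np.fft.fft(x0, n=nfft)
acf_pad = np.fft.ifft(Frf * Frf.conjugate()).real[:nobs] / nobs
acf_pad /= acf_pad[0]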

(I'd have to look; I thought I tried or copied an fft-based acf from
somewhere, maybe nitime.)

Josef


>
> Skipper
