[SciPy-User] leastsq

josef.pktd at gmail.com
Thu Jun 27 10:37:35 EDT 2013


On Thu, Jun 27, 2013 at 10:13 AM, Frédéric Parrenin
<parrenin.ujf at gmail.com> wrote:
> Dear Matt,
>
> Yes, leastsq is probably what I need.
> As Josef suggested, I can decompose the observation covariance matrix using
> Cholesky to transform the model into one with independent observations.
>
> It is still not very clear how to obtain the analyzed (or posterior)
> covariance matrix around the solution.
> At first glance, cov_x is what we are looking for, but the doc specifies:
>
> Uses the fjac and ipvt optional outputs to construct an estimate of the
> jacobian around the solution. None if a singular matrix encountered
> (indicates very flat curvature in some direction). This matrix must be
> multiplied by the residual variance to get the covariance of the parameter
> estimates – see curve_fit.
>
> Isn't 'jacobian' an error in the documentation? I would have expected
> 'covariance'.

I always have problems with this part (I read what I want to hear, not
what is written).

As far as I understand, it uses the outer product of the Jacobian as an
estimator of the raw covariance. If the error function is a standard
least-squares problem, then this is also the Hessian (the matrix of
second derivatives of the objective function).

The raw covariance corresponds to inv(X'X) in a linear regression
problem (where X could be the transformed, whitened observations); the
Jacobian takes the place of X in the non-linear least-squares problem.
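
To make the analogy concrete, here is a minimal sketch (untested, with
made-up data): for a linear model the Jacobian of the residuals is just
-X, so cov_x from leastsq agrees with inv(X'X):

    import numpy as np
    from scipy.optimize import leastsq

    rng = np.random.default_rng(0)
    X = np.column_stack([np.ones(50), rng.normal(size=50)])
    y = X @ np.array([1.0, 2.0]) + 0.1 * rng.normal(size=50)

    # residuals of a linear model; the Jacobian w.r.t. b is -X
    popt, cov_x, infodict, mesg, ier = leastsq(
        lambda b: y - X @ b, [0.0, 0.0], full_output=True)

    raw_cov = np.linalg.inv(X.T @ X)   # inv(X'X) from the regression analogy
    # cov_x agrees with raw_cov up to floating-point noise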

Unless the equation is already prewhitened correctly with the variance
of the error (a different set of long threads on the mailing list and a
github issue), we need to multiply the raw covariance matrix by an
estimate of the error variance.
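
Continuing the sketch above, this is (as far as I can tell) the same
scaling step that curve_fit performs:

    # residuals at the solution, from leastsq's full output
    r = infodict['fvec']
    # residual variance estimate: RSS / (nobs - nparams)
    s_sq = (r ** 2).sum() / (len(r) - len(popt))
    pcov = cov_x * s_sq   # covariance of the parameter estimates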

As far as I remember, even if you use cholsigmainv as the
transformation (prewhitening), the calculation of the covariance is
exactly the same as in curve_fit, because everything already uses the
whitened terms.
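
For reference, a sketch of that prewhitening step (the covariance Sigma
below is a made-up example; cholsigmainv corresponds to inv(L) here):

    import numpy as np
    from scipy.linalg import cholesky, solve_triangular

    # hypothetical correlated observation covariance, Sigma = L L'
    n = 20
    Sigma = 0.1 * 0.5 ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    L = cholesky(Sigma, lower=True)

    def whiten(r):
        # solve L w = r; w then has iid unit-variance errors
        return solve_triangular(L, r, lower=True)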

So I think you could just copy the parts from curve_fit to get the
covariance for the more general correlated error case.
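
Putting the pieces together, an end-to-end sketch along those lines
(model, data, and Sigma all invented for the example):

    import numpy as np
    from scipy.linalg import cholesky, solve_triangular
    from scipy.optimize import leastsq

    rng = np.random.default_rng(1)
    t = np.linspace(0.0, 1.0, 20)
    y = 2.0 * np.exp(-1.5 * t) + 0.05 * rng.normal(size=20)

    # hypothetical correlated observation covariance
    idx = np.arange(20)
    Sigma = 0.05 ** 2 * 0.5 ** np.abs(np.subtract.outer(idx, idx))
    L = cholesky(Sigma, lower=True)

    def whitened_resid(p):
        r = y - p[0] * np.exp(p[1] * t)
        return solve_triangular(L, r, lower=True)

    popt, cov_x, infodict, mesg, ier = leastsq(
        whitened_resid, [1.0, -1.0], full_output=True)

    # same post-processing as in curve_fit, on the whitened residuals
    w = infodict['fvec']
    s_sq = (w ** 2).sum() / (len(w) - len(popt))
    pcov = cov_x * s_sq   # analyzed (posterior) covariance of the parameters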

Josef

>
> Best regards,
>
> Frédéric
>
>
>
>
>
> 2013/6/27 Matt Newville <newville at cars.uchicago.edu>
>>
>> Hi,
>>
>> I'm pretty baffled by these questions.   optimize.leastsq() does not
>> take a covariance matrix as input, but can give one as output.   It
>> can take functions used to compute the Jacobian... Perhaps that would
>> accomplish what you're trying to do?
>>
>> optimize.curve_fit() is a wrapper around leastsq() for the common case
>> of "fitting data" in which one has a set of observations at a set of
>> sampled "data points", and a set of variables used in a model for the
>> data.  Like leastsq(), it returns the covariance.  If curve_fit()
>> does what you need but seems sub-optimal, then leastsq() is probably
>> what you want to use.
>>
>> Hope that helps, but maybe I'm not understanding what you're trying to do.
>>
>> --Matt Newville
>> _______________________________________________
>> SciPy-User mailing list
>> SciPy-User at scipy.org
>> http://mail.scipy.org/mailman/listinfo/scipy-user
>
>
>
> _______________________________________________
> SciPy-User mailing list
> SciPy-User at scipy.org
> http://mail.scipy.org/mailman/listinfo/scipy-user
>


