[SciPy-User] calculating the jacobian for a least-squares problem

Andrew Nelson andyfaff at gmail.com
Mon Mar 26 19:57:00 EDT 2018


I would like to calculate the Jacobian for a least-squares problem, use it
to estimate the Hessian, and then obtain the covariance matrix from that
Hessian.

With my current approach the covariance matrix I obtain is sometimes not
positive semi-definite. I am using the covariance matrix to seed an MCMC
sampling process by supplying it to `np.random.multivariate_normal` to get
initial positions for the MCMC chain. I am using the following code:

```
import numpy as np
from scipy.optimize._numdiff import approx_derivative

# Jacobian of the residuals at x0, by finite differences
jac = approx_derivative(residuals_func, x0)
# Gauss-Newton approximation to the Hessian, J^T J
hess = np.matmul(jac.T, jac)
# covariance estimate from the inverse of the approximate Hessian
covar = np.linalg.inv(hess)
```
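
The seeding step itself is roughly the following (a minimal sketch;
`nwalkers` is just a placeholder for however many initial positions are
needed):

```
# covar and x0 are as in the snippet above; nwalkers is a placeholder value
nwalkers = 100
# draw one starting position per chain member from N(x0, covar);
# this is the call that complains when covar is not positive semi-definite
p0 = np.random.multivariate_normal(mean=x0, cov=covar, size=nwalkers)
# p0 has shape (nwalkers, len(x0))
```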

Note that x0 may not be at a minimum.

- would this be the usual way of estimating the Hessian, and is there
anything incorrect in the approach?
- what is the recommended (i.e. numerically stable) way of inverting the
Hessian in such a situation? (One SVD-based possibility is sketched after
this list.)
- does `optimize.leastsq` do anything different?
- if `x0` is not at a minimum, should the covariance matrix be expected to
be positive semi-definite anyway?
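
For concreteness, this is the sort of SVD-based alternative I had in mind
for the inversion step (just a sketch; the singular-value cutoff is an
arbitrary choice on my part):

```
# same jac as above; build (J^T J)^+ from the SVD of J instead of
# inverting J^T J directly
_, s, VT = np.linalg.svd(jac, full_matrices=False)
# drop singular values below an (arbitrarily chosen) cutoff
cutoff = np.finfo(float).eps * max(jac.shape) * s[0]
mask = s > cutoff
s, VT = s[mask], VT[mask]
covar = np.dot(VT.T / s**2, VT)  # equals pinv(jac.T @ jac)
```

This should stay positive semi-definite by construction, but I don't know
whether it is what is usually done.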