[SciPy-User] calculating the jacobian for a least-squares problem

Mark Alexander Mikofski mikofski at berkeley.edu
Wed Mar 28 11:21:30 EDT 2018


Hi Andrew,

The covariance can be propagated through the Jacobian as C = J * S * J^T,
where S is the covariance of the inputs, C is the covariance of the
outputs, J is the Jacobian with entries J_ij = dx_i/dy_j, and J^T is its
transpose.

Sorry if this is redundant or unhelpful.

Best,
Mark

On Tue, Mar 27, 2018, 11:38 PM Evgeni Burovski <evgeny.burovskiy at gmail.com>
wrote:

> Hi,
>
> Additionally to what Gregor said:
>
> - finite-difference estimation of the derivatives should really be a last
> resort; best is paper-and-pencil, or algorithmic differentiation (algopy et
> al.). If that is not possible, I'd try some higher-order finite differences,
> e.g. approx_derivative with method '3-point' or 'cs' (if that works); see
> the sketch below.
>
> - approx_derivative is actually more sophisticated than minpack. IIUC
> minpack only does the simplest two-point forward scheme,
> https://github.com/scipy/scipy/blob/master/scipy/optimize/minpack/fdjac2.f
>
> - linalg.inv(matrix) is generally better spelled as solve(matrix,
> identity_matrix)
>
> - in this case, it's indeed best to use QR or SVD. curve_fit does a
> pseudoinverse:
>
>
> https://github.com/scipy/scipy/blob/v1.0.0/scipy/optimize/minpack.py#L502-L790
>
> (IIRC this was written by Nikolay, and he cited Ceres or some other
> industry-class optimization software).
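>
> Putting the first and last points together, a rough sketch (residuals_func
> and x0 below are toy stand-ins, and the SVD-based pseudoinverse mirrors
> what curve_fit does at the link above):
>
> ```
> import numpy as np
> from scipy.optimize._numdiff import approx_derivative
>
> def residuals_func(x):
>     # toy residuals standing in for the real model
>     return np.array([x[0] + 2 * x[1] - 1.0,
>                      3 * x[0] - x[1] + 0.5,
>                      x[0] * x[1]])
>
> x0 = np.array([0.1, 0.2])
>
> # higher-order finite differences for the Jacobian
> jac = approx_derivative(residuals_func, x0, method='3-point')
>
> # covariance from the pseudoinverse of J^T J via an SVD of J,
> # instead of an explicit inv() of a possibly ill-conditioned matrix
> _, s, VT = np.linalg.svd(jac, full_matrices=False)
> threshold = np.finfo(float).eps * max(jac.shape) * s[0]
> s = s[s > threshold]
> VT = VT[:s.size]
> covar = np.dot(VT.T / s**2, VT)
> ```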
>
> Cheers,
>
> Evgeni
>
>
> On Tue, Mar 27, 2018, 12:19 PM Gregor Thalhammer <
> gregor.thalhammer at gmail.com> wrote:
>
>>
>>
>> > On 27.03.2018 at 01:57, Andrew Nelson <andyfaff at gmail.com> wrote:
>> >
>> > I would like to calculate the Jacobian for a least squares problem,
>> followed by a Hessian estimation, then the covariance matrix from that
>> Hessian.
>> >
>> > With my current approach, the covariance matrix is sometimes not
>> positive semi-definite. I am using the covariance matrix to seed an MCMC
>> sampling process by supplying it to `np.random.multivariate_normal` to get
>> initial positions for the MC chain. I am using the following code:
>> >
>> > ```
>> > import numpy as np
>> > from scipy.optimize._numdiff import approx_derivative
>> >
>> > jac = approx_derivative(residuals_func, x0)  # Jacobian of the residuals at x0
>> > hess = np.matmul(jac.T, jac)                 # Gauss-Newton approximation of the Hessian
>> > covar = np.linalg.inv(hess)                  # covariance estimate from the inverse Hessian
>> > ```
>> >
>> > Note that x0 may not be at a minimum.
>> >
>> > - would this be the usual way of estimating the Hessian, is there
>> anything incorrect with the approach?
>> Your straightforward approach is OK, especially since you don’t require
>> the highest precision. An alternative would be to use automatic
>> differentiation to calculate the derivatives accurately, e.g. using algopy,
>> Theano or TensorFlow.
>>
>> > - what is the recommended way (i.e. numerically stable) of inverting
>> the Hessian in such a situation?
>>
>> If your Hessian matrix is close to singular, you could gain some precision
>> by using the QR decomposition of the Jacobian. In general, to solve a
>> linear system it is recommended to avoid calculating the inverse matrix
>> explicitly.
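>>
>> A minimal sketch of the QR route (the Jacobian here is a made-up stand-in
>> for approx_derivative(residuals_func, x0)):
>>
>> ```
>> import numpy as np
>> from scipy.linalg import qr, solve_triangular
>>
>> jac = np.array([[1.0, 2.0],
>>                 [3.0, -1.0],
>>                 [0.5, 0.5]])
>>
>> # thin QR: J = Q R with R upper triangular, so (J^T J)^-1 = R^-1 R^-T
>> Q, R = qr(jac, mode='economic')
>>
>> # triangular solve instead of forming J^T J, which squares the condition number
>> R_inv = solve_triangular(R, np.eye(R.shape[0]))
>> covar = R_inv @ R_inv.T
>> ```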
>>
>> > - does `optimize.leastsq` do anything different?
>>
>> leastsq wraps the MINPACK library, which brings its own carefully tuned
>> numeric differentiation routines, and it uses QR decomposition.
>>
>> > - if `x0` is not at a minimum should the covariance matrix be expected
>> to be positive semi-definite anyway?
>> If x0 is not a minimum, then there is no guarantee. Even if x0 is a
>> minimum, this might be violated due to numerical errors.
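>>
>> One way to check (and, if needed, nudge) the matrix before feeding it to
>> multivariate_normal (not something leastsq does for you, just a possible
>> workaround):
>>
>> ```
>> import numpy as np
>>
>> # stand-in for the covariance computed above; slightly indefinite on purpose
>> covar = np.array([[1.0, 1.1],
>>                   [1.1, 1.0]])
>>
>> # symmetrize first (round-off can make it slightly asymmetric), then
>> # inspect the eigenvalue spectrum
>> covar_sym = 0.5 * (covar + covar.T)
>> eigvals = np.linalg.eigvalsh(covar_sym)
>>
>> if eigvals.min() < 0:
>>     # add just enough diagonal jitter to push the spectrum to >= 0
>>     covar_sym += (abs(eigvals.min()) + 1e-12) * np.eye(covar_sym.shape[0])
>> ```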
>>
>> best
>> Gregor
>>