[SciPy-User] Numerical estimation of Hessian matrix at minimum

josef.pktd at gmail.com josef.pktd at gmail.com
Thu Nov 25 14:27:10 EST 2010


2010/11/25 Eirik Gjerløw <eirikgje at student.matnat.uio.no>:
> Hello,
>
> I am looking for a function (written in python) that will essentially do
> the same thing as the function nlm (non-linear minimization) in R, when
> passed the argument Hessian=T. That is, I would like to numerically
> compute n values (n>1) which, when passed to a function, give its
> minimum, and the value of the Hessian matrix at that point.
>
> Does such a thing exist?

There are some packages that calculate numerical derivatives, e.g.
numdifftools, or do automatic differentiation (the package name
escapes me -- it came up in a recent thread). OpenOpt also has some
features for this.
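For the original question, the quickest route in scipy itself is
fmin_bfgs: with full_output=True it also returns Bopt, the BFGS
approximation to the *inverse* Hessian at the minimum (an
approximation built up during the iterations, not an exact numerical
Hessian). A quick sketch on the Rosenbrock function:

```python
import numpy as np
from scipy import optimize

def rosen(x):
    # Rosenbrock function, minimum at (1, 1)
    return (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2

# full_output=True returns (xopt, fopt, gopt, Bopt, func_calls,
# grad_calls, warnflag); Bopt approximates the inverse Hessian at xopt
xopt, fopt, gopt, Bopt, fc, gc, warnflag = optimize.fmin_bfgs(
    rosen, np.array([-1.2, 1.0]), full_output=True, disp=False)
```

xopt should be close to [1, 1]; how good Bopt is as an inverse-Hessian
estimate depends on the problem, so for covariance matrices a separate
finite-difference Hessian is usually more reliable.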

In statsmodels, we use a method that wraps several scipy optimizers:

http://bazaar.launchpad.net/~scipystats/statsmodels/devel/annotate/head%3A/scikits/statsmodels/model.py#L135

A subclass is supposed to provide the gradient and Hessian; if we
don't have them analytically, we fall back to numerical
differentiation, using our own numdiff.hess, which uses forward
differences.
It's slightly messy to look at, because it's fully integrated into our
Maximum Likelihood Estimation "framework".
Essentially, when we call fit() on a subclass of
GenericLikelihoodModel, the estimated parameters, the parameter
covariance matrix from the inverse Hessian, and similar statistics are
directly available through the superclass. (Maybe more than you want
to know.)
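To show the idea outside the framework: minimize a negative
log-likelihood, take a numerical Hessian at the optimum, and invert it
to get the parameter covariance. All names here are made up for the
example (this is not statsmodels code), and I use a simple
central-difference Hessian:

```python
import numpy as np
from scipy import optimize

rng = np.random.RandomState(0)
data = rng.normal(loc=2.0, scale=3.0, size=500)

def nloglike(params):
    # negative log-likelihood of N(mu, sigma), constants dropped
    mu, sigma = params
    if sigma <= 0:
        return 1e10  # keep the optimizer out of invalid territory
    return (len(data) * np.log(sigma)
            + np.sum((data - mu)**2) / (2.0 * sigma**2))

# MLE via Nelder-Mead
phat = optimize.fmin(nloglike, [0.0, 1.0], disp=False)

def hess_cd(f, x, h=1e-4):
    # central-difference Hessian, O(4 n^2) function evaluations
    n = len(x)
    H = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            xpp = x.copy(); xpp[i] += h; xpp[j] += h
            xpm = x.copy(); xpm[i] += h; xpm[j] -= h
            xmp = x.copy(); xmp[i] -= h; xmp[j] += h
            xmm = x.copy(); xmm[i] -= h; xmm[j] -= h
            H[i, j] = (f(xpp) - f(xpm) - f(xmp) + f(xmm)) / (4.0 * h * h)
    return H

# covariance of the MLE = inverse Hessian of the negative log-likelihood
cov = np.linalg.inv(hess_cd(nloglike, phat))
se = np.sqrt(np.diag(cov))  # standard errors of (mu, sigma)
```

The standard error of mu should come out near sigma/sqrt(n), which is
the textbook result for the mean of a normal sample.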

Josef

>
> Regards,
> Eirik Gjerløw
>
>
> _______________________________________________
> SciPy-User mailing list
> SciPy-User at scipy.org
> http://mail.scipy.org/mailman/listinfo/scipy-user
>
