ANNOUNCE: New algorithms added to optimize.py

Travis Oliphant olipt at mayo.edu
Thu Aug 31 15:20:22 EDT 2000


This is a heads up to those interested in Python-implemented optimization
algorithms.

I've updated my optimize.py module to version 0.3 by adding two new
unconstrained optimization algorithms for minimizing a function of many
variables: one implements a quasi-Newton algorithm (BFGS), and the other
implements a practical Newton's algorithm that uses conjugate gradients
to solve the Newton system (the Hessian-vector product is approximated
by finite differences on the gradient if not provided).
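
For the curious: when no Hessian information is supplied, the standard
trick is to approximate the Hessian-vector product with a forward finite
difference on the gradient. The helper below is a minimal sketch of that
idea; the function name and the choice of epsilon are mine, not
necessarily what optimize.py does internally.

def approx_fhess_p(x, p, fprime, epsilon=1e-7):
    # Approximate the Hessian-vector product H(x)*p with a forward
    # difference of the gradient along the direction p:
    #   H(x)*p  ~=  (fprime(x + eps*p) - fprime(x)) / eps
    xp = [xi + epsilon * pi for xi, pi in zip(x, p)]
    return [(g1 - g0) / epsilon
            for g1, g0 in zip(fprime(xp), fprime(x))]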

The license is very liberal.  It's available at

http://oliphant.netpedia.net/packages/optimize.py

Here are the docstrings:

>>> print optimize.fminBFGS.__doc__

xopt = fminBFGS(f, fprime, x0, args=(), avegtol=1e-5,
                       maxiter=None, fulloutput=0, printmessg=1)

    Optimize the function f, whose gradient is given by fprime, using
    the quasi-Newton method of Broyden, Fletcher, Goldfarb, and Shanno
    (BFGS).  See Wright and Nocedal, 'Numerical Optimization', 1999,
    pg. 198.
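
As a quick illustration, here is how one might call it on the classic
Rosenbrock test function (the test function, its gradient, and the
assumption that optimize.py wants Numeric arrays are mine, not part of
the module):

import optimize
from Numeric import array

def rosen(x):
    # f(x0, x1) = 100*(x1 - x0**2)**2 + (1 - x0)**2, minimum at (1, 1)
    return 100.0*(x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosen_grad(x):
    # analytic gradient of rosen
    return array([-400.0*x[0]*(x[1] - x[0]**2) - 2.0*(1.0 - x[0]),
                  200.0*(x[1] - x[0]**2)])

x0 = array([-1.2, 1.0])
xopt = optimize.fminBFGS(rosen, rosen_grad, x0)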



>>> print optimize.fminNCG.__doc__

xopt = fminNCG(f, fprime, x0, fhess_p=None, args=(), avextol=1e-5,
                       maxiter=None, fulloutput=0, printmessg=1)

    Optimize the function f, whose gradient is given by fprime, using
    the Newton-CG method.  fhess_p must compute the Hessian times an
    arbitrary vector.  If it is not given, finite differences on fprime
    are used to compute it.  See Wright and Nocedal, 'Numerical
    Optimization', 1999, pg. 140.
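
And a corresponding call for the Newton-CG routine, reusing rosen,
rosen_grad, and x0 from the BFGS example above. The docstring does not
spell out the calling convention for fhess_p, so the fhess_p(x, p)
signature below is a guess on my part; leave fhess_p out entirely to
fall back on the finite-difference approximation:

def rosen_hess_p(x, p):
    # Hessian of rosen at x, applied to an arbitrary vector p
    return array([(1200.0*x[0]**2 - 400.0*x[1] + 2.0)*p[0]
                      - 400.0*x[0]*p[1],
                  -400.0*x[0]*p[0] + 200.0*p[1]])

xopt = optimize.fminNCG(rosen, rosen_grad, x0, fhess_p=rosen_hess_p)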





