[Scipy-svn] r6221 - trunk/scipy/optimize
scipy-svn at scipy.org
Tue Feb 9 03:41:34 EST 2010
Author: dmitrey.kroshko
Date: 2010-02-09 02:41:34 -0600 (Tue, 09 Feb 2010)
New Revision: 6221
Modified:
trunk/scipy/optimize/anneal.py
trunk/scipy/optimize/cobyla.py
trunk/scipy/optimize/lbfgsb.py
trunk/scipy/optimize/minpack.py
trunk/scipy/optimize/nnls.py
trunk/scipy/optimize/optimize.py
trunk/scipy/optimize/slsqp.py
trunk/scipy/optimize/tnc.py
Log:
scikits.openopt references replaced by plain openopt
Modified: trunk/scipy/optimize/anneal.py
===================================================================
--- trunk/scipy/optimize/anneal.py 2010-02-08 15:18:27 UTC (rev 6220)
+++ trunk/scipy/optimize/anneal.py 2010-02-09 08:41:34 UTC (rev 6221)
@@ -217,6 +217,8 @@
fixed_point -- scalar fixed-point finder
+ OpenOpt -- a Python package with more optimization solvers
+
"""
x0 = asarray(x0)
lower = asarray(lower)
Modified: trunk/scipy/optimize/cobyla.py
===================================================================
--- trunk/scipy/optimize/cobyla.py 2010-02-08 15:18:27 UTC (rev 6220)
+++ trunk/scipy/optimize/cobyla.py 2010-02-09 08:41:34 UTC (rev 6221)
@@ -46,8 +46,6 @@
See also:
- scikits.openopt, which offers a unified syntax to call this and other solvers
-
fmin, fmin_powell, fmin_cg,
fmin_bfgs, fmin_ncg -- multivariate local optimizers
leastsq -- nonlinear least squares minimizer
@@ -65,6 +63,9 @@
fixed_point -- scalar fixed-point finder
+ OpenOpt -- a tool that offers a unified syntax to call this and
+ other solvers, with the possibility of automatic differentiation
+
"""
err = "cons must be a sequence of callable functions or a single"\
" callable function."
Modified: trunk/scipy/optimize/lbfgsb.py
===================================================================
--- trunk/scipy/optimize/lbfgsb.py 2010-02-08 15:18:27 UTC (rev 6220)
+++ trunk/scipy/optimize/lbfgsb.py 2010-02-09 08:41:34 UTC (rev 6221)
@@ -119,8 +119,6 @@
ACM Transactions on Mathematical Software, Vol 23, Num. 4, pp. 550 - 560.
See also:
- scikits.openopt, which offers a unified syntax to call this and other solvers
-
fmin, fmin_powell, fmin_cg,
fmin_bfgs, fmin_ncg -- multivariate local optimizers
leastsq -- nonlinear least squares minimizer
@@ -138,6 +136,9 @@
fixed_point -- scalar fixed-point finder
+ OpenOpt -- a tool that offers a unified syntax to call this and
+ other solvers, with the possibility of automatic differentiation
+
"""
n = len(x0)
Modified: trunk/scipy/optimize/minpack.py
===================================================================
--- trunk/scipy/optimize/minpack.py 2010-02-08 15:18:27 UTC (rev 6220)
+++ trunk/scipy/optimize/minpack.py 2010-02-09 08:41:34 UTC (rev 6221)
@@ -102,8 +102,6 @@
See Also
--------
- scikits.openopt : offers a unified syntax to call this and other solvers
-
fmin, fmin_powell, fmin_cg, fmin_bfgs, fmin_ncg : multivariate local optimizers
leastsq : nonlinear least squares minimizer
@@ -118,6 +116,9 @@
fixed_point : scalar and vector fixed-point finder
+ OpenOpt : a tool that offers a unified syntax to call this and
+ other solvers, with the possibility of automatic differentiation
+
"""
if not warning :
msg = "The warning keyword is deprecated. Use the warnings module."
@@ -263,7 +264,6 @@
See Also
--------
- scikits.openopt: offers a unified syntax to call this and other solvers
fmin, fmin_powell, fmin_cg, fmin_bfgs, fmin_ncg: multivariate local optimizers
fmin_l_bfgs_b, fmin_tnc, fmin_cobyla: constrained multivariate optimizers
anneal, brute: global optimizers
@@ -272,6 +272,9 @@
brentq, brenth, ridder, bisect, newton: one-dimensional root-finding
fixed_point: scalar and vector fixed-point finder
curve_fit: find parameters for a curve-fitting problem.
+ OpenOpt : a tool that offers a unified syntax to call this and
+ other solvers, with the possibility of automatic differentiation
+
"""
if not warning :
msg = "The warning keyword is deprecated. Use the warnings module."
Modified: trunk/scipy/optimize/nnls.py
===================================================================
--- trunk/scipy/optimize/nnls.py 2010-02-08 15:18:27 UTC (rev 6220)
+++ trunk/scipy/optimize/nnls.py 2010-02-09 08:41:34 UTC (rev 6221)
@@ -16,6 +16,8 @@
wrapper around NNLS.F code below nnls/ directory
+ Check OpenOpt for more LLSP (linear least squares) solvers
+
"""
A,b = map(asarray_chkfinite, (A,b))
Modified: trunk/scipy/optimize/optimize.py
===================================================================
--- trunk/scipy/optimize/optimize.py 2010-02-08 15:18:27 UTC (rev 6220)
+++ trunk/scipy/optimize/optimize.py 2010-02-09 08:41:34 UTC (rev 6221)
@@ -156,7 +156,8 @@
Uses a Nelder-Mead simplex algorithm to find the minimum of
function of one or more variables.
-
+ Check OpenOpt, a tool that offers a unified syntax to call
+ this and other solvers, with the possibility of automatic differentiation.
"""
fcalls, func = wrap_function(func, args)
x0 = asfarray(x0).flatten()
@@ -694,8 +695,8 @@
*See Also*:
- scikits.openopt : SciKit which offers a unified syntax to call
- this and other solvers.
+ OpenOpt : a tool that offers a unified syntax to call
+ this and other solvers, with the possibility of automatic differentiation.
"""
x0 = asarray(x0).squeeze()
@@ -862,7 +863,8 @@
using the nonlinear conjugate gradient algorithm of Polak and
Ribiere See Wright, and Nocedal 'Numerical Optimization',
1999, pg. 120-122.
-
+ Check OpenOpt, a tool that offers a unified syntax to call
+ this and other solvers, with the possibility of automatic differentiation.
"""
x0 = asarray(x0).flatten()
if maxiter is None:
@@ -1018,8 +1020,7 @@
If True, return a list of results at each iteration.
:Notes:
- 1. scikits.openopt offers a unified syntax to call this and other solvers.
- 2. Only one of `fhess_p` or `fhess` need to be given. If `fhess`
+ 1. Only one of `fhess_p` or `fhess` need to be given. If `fhess`
is provided, then `fhess_p` will be ignored. If neither `fhess`
nor `fhess_p` is provided, then the hessian product will be
approximated using finite differences on `fprime`. `fhess_p`
@@ -1027,6 +1028,8 @@
given, finite-differences on `fprime` are used to compute
it. See Wright, and Nocedal 'Numerical Optimization', 1999,
pg. 140.
+ 2. Check OpenOpt, a tool that offers a unified syntax to call
+ this and other solvers, with the possibility of automatic differentiation.
"""
x0 = asarray(x0).flatten()
@@ -1179,8 +1182,9 @@
Finds a local minimizer of the scalar function `func` in the
interval x1 < xopt < x2 using Brent's method. (See `brent`
for auto-bracketing).
+ Check OpenOpt, a tool that offers a unified syntax to call
+ this and other solvers, with the possibility of automatic differentiation.
-
"""
# Test bounds are of correct form
@@ -1722,7 +1726,8 @@
Uses a modification of Powell's method to find the minimum of
a function of N variables.
-
+ Check OpenOpt, a tool that offers a unified syntax to call
+ this and other solvers, with the possibility of automatic differentiation.
"""
# we need to use a mutable object here that we can update in the
# wrapper function
Modified: trunk/scipy/optimize/slsqp.py
===================================================================
--- trunk/scipy/optimize/slsqp.py 2010-02-08 15:18:27 UTC (rev 6220)
+++ trunk/scipy/optimize/slsqp.py 2010-02-09 08:41:34 UTC (rev 6221)
@@ -146,6 +146,11 @@
for examples see :ref:`in the tutorial <tutorial-sqlsp>`
+ See also
+ --------
+ OpenOpt - a tool that offers a unified syntax to call this
+ and other solvers, with the possibility of automatic differentiation.
+
"""
exit_modes = { -1 : "Gradient evaluation required (g & a)",
Modified: trunk/scipy/optimize/tnc.py
===================================================================
--- trunk/scipy/optimize/tnc.py 2010-02-08 15:18:27 UTC (rev 6220)
+++ trunk/scipy/optimize/tnc.py 2010-02-09 08:41:34 UTC (rev 6221)
@@ -164,8 +164,6 @@
Return code as defined in the RCSTRINGS dict.
:SeeAlso:
- - scikits.openopt, which offers a unified syntax to call this and other solvers
-
- fmin, fmin_powell, fmin_cg, fmin_bfgs, fmin_ncg :
multivariate local optimizers
@@ -184,6 +182,9 @@
- fixed_point : scalar fixed-point finder
+ - OpenOpt : a tool that offers a unified syntax to call this and
+ other solvers, with the possibility of automatic differentiation.
+
"""
x0 = asarray(x0, dtype=float).tolist()
n = len(x0)