[SciPy-dev] Derivative-based nonlinear optimization with linear inequality constraints

Matthieu Brucher matthieu.brucher at gmail.com
Tue May 22 04:21:28 EDT 2007


Hi,

I think part of the GSoC project on optimization handles constraints in
the framework I proposed, doesn't it?
Writing a solver in this framework would mainly mean writing a new
Optimizer that uses the same steps or line searches and handles all
convex constraints, I suppose.

Matthieu

2007/5/22, Bill Baxter <wbaxter at gmail.com>:
>
> On 5/22/07, dmitrey <openopt at ukr.net> wrote:
> > Bill Baxter wrote:
> > > There don't seem to be any optimizers in SciPy that have all of these
> > > attributes:
> > > 1) nonlinear objective function
> > > 2) uses analytical derivatives
> > > 3) allows (non-)linear (in)equality constraints
> > >
> > So your message differs very much from the topic "Derivative-based
> > nonlinear optimization with linear inequality constraints"
>
> Not sure what you mean.  I have gradient information and I am
> interested in using it to optimize a non-linear objective function
> subject to linear inequality constraints.  But at the same time I
> don't see any gradient-based solvers in scipy that can handle *any*
> constraints other than simple bounds, hence the (non-) and (in) in
> parentheses.
>
> > > Did I just miss it?  The only things I see are methods that can only
> > > handle simple bounds, or don't make use of derivative info.
> > >
> > It seems to me I have seen solver(s) in scipy.optimize that allow
> > handling nonlinear inequality constraints (with derivatives), but not
> > equality constraints.
>
> Here's what there is:
> fmin - simplex method, no derivs, no constraints
> fmin_powell - Powell's method, no derivs, no constraints
> fmin_cg - uses derivs, no constraints
> fmin_ncg - uses derivs (and Hessian), no constraints
> fmin_bfgs - uses derivs, no constraints
> fmin_l_bfgs_b - uses derivs, bounds only
> fmin_tnc - uses derivs, bounds only
> fmin_cobyla - no derivs, but non-linear constraints
>
> So again the question... did I miss anything?  It seems like there's a
> hole in the category of "uses derivs, supports constraints".
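
(A minimal sketch of calling one of the derivative-aware, bounds-only
routines from the list above, fmin_l_bfgs_b; the objective, gradient,
starting point and bounds below are illustrative placeholders, not
anything from the thread.)

    import numpy as np
    from scipy.optimize import fmin_l_bfgs_b

    # Illustrative quadratic objective and its analytical gradient.
    def f(x):
        return (x[0] - 1.0)**2 + 10.0 * (x[1] - 2.0)**2

    def grad(x):
        return np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] - 2.0)])

    x0 = np.array([0.0, 0.0])
    bounds = [(0.0, 5.0), (0.0, 5.0)]   # simple box bounds are all it takes

    x_opt, f_opt, info = fmin_l_bfgs_b(f, x0, fprime=grad, bounds=bounds)
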
>
> > Only a very small percentage of NLP solvers are able to handle
> > equality constraints. Also note that many SQP/rSQP solvers can handle
> > equality, but not inequality constraints, so the number of NLP solvers
> > capable of handling both eq & ineq is very small. Usually they are
> > based on the method of linearization. AFAIK there is the free IPopt
> > (but it has a GPL license) from the COIN-OR project, written in C++,
> > and some commercial ones, for example KNitro. There was a Ukrainian
> > academician, Pshenichniy, who spent dozens of years researching
> > linearization-based optimization methods, but he died some months ago.
> > BTW, in his book "Linearization method" (1983) he makes a very
> > important observation:
> > "many years of practice in solving optimization problems have made
> > specialists think that there can't be a universal NLP algorithm that
> > solves ALL problems well enough" (and he agrees with the statement, as
> > do I). I would explain the statement with my own arguments, but that
> > requires a lot of English text, so let me omit it here.
> > And some lines below Pshenichniy continues:
> > "Practical experience shows that there are sets of problems where an
> > algorithm that seems to be the most ineffective from the theoretical
> > point of view turns out to yield good results, due to the specifics of
> > the problem. Hence, a number of different algorithms is required."
> > An alternative approach (to constructing a single universal solver,
> > like some do) can be like TOMLAB (or like my OpenOpt): assign the
> > problem to a single object and try different solvers. And, of course,
> > observe the convergence pictures, because a single output <f_opt,
> > x_opt, time_elapsed> can be very uninformative in many cases.
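
(The "single problem object + several solvers" idea can be sketched
roughly as below; the Problem class and try_solvers helper are
hypothetical illustrations, not OpenOpt's or TOMLAB's actual API.)

    from scipy import optimize

    class Problem:
        # Hypothetical container: one problem definition, many solvers.
        def __init__(self, f, x0, fprime=None, bounds=None):
            self.f, self.x0, self.fprime, self.bounds = f, x0, fprime, bounds

    def try_solvers(p):
        # Run whichever scipy.optimize routines the problem data allows
        # and collect the results for side-by-side comparison.
        results = {}
        results['fmin'] = optimize.fmin(p.f, p.x0, disp=0)
        if p.fprime is not None:
            results['fmin_bfgs'] = optimize.fmin_bfgs(p.f, p.x0,
                                                      fprime=p.fprime, disp=0)
        if p.bounds is not None:
            results['fmin_l_bfgs_b'] = optimize.fmin_l_bfgs_b(
                p.f, p.x0, fprime=p.fprime,
                approx_grad=(p.fprime is None), bounds=p.bounds)[0]
        return results
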
>
> All very interesting, but I'm not sure what point you're trying to make.
> I'd be happy to try different solvers on my problem, and in fact I
> have tried a couple.  The BFGS-based ones seem to work pretty well if I
> start from a feasible point, but it makes me a little nervous to use
> them, since the values of my function are bogus outside the feasible
> region: I'm just setting f(x) to HUGE_NUM there.
> fmin_tnc did *not* work. It seems to take a step out of bounds and get
> lost.
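
(A minimal sketch of the HUGE_NUM trick described above; the objective,
gradient and the linear constraints A x <= b are illustrative
placeholders. Note that the gradient passed to the solver is that of the
unpenalized objective, which is exactly why this can misbehave once a
trial point leaves the feasible region.)

    import numpy as np
    from scipy.optimize import fmin_bfgs

    HUGE_NUM = 1e20

    # Illustrative linear inequality constraints A x <= b.
    A = np.array([[1.0, 1.0]])
    b = np.array([1.5])

    def f(x):
        return (x[0] - 1.0)**2 + (x[1] - 2.0)**2

    def grad(x):
        return np.array([2.0 * (x[0] - 1.0), 2.0 * (x[1] - 2.0)])

    def f_clipped(x):
        # Return a huge value outside the feasible region, as described.
        if np.any(np.dot(A, x) > b):
            return HUGE_NUM
        return f(x)

    x0 = np.array([0.2, 0.2])          # needs to start from a feasible point
    x_opt = fmin_bfgs(f_clipped, x0, fprime=grad, disp=0)
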
>
> --bb
> _______________________________________________
> Scipy-dev mailing list
> Scipy-dev at scipy.org
> http://projects.scipy.org/mailman/listinfo/scipy-dev
>

