[SciPy-user] Questions about scipy.optimize.fmin_cobyla

Dominique Orban dominique.orban at gmail.com
Mon Jul 16 14:14:20 EDT 2007


Anne Archibald wrote:
> On 16/07/07, fdu.xiaojf at gmail.com <fdu.xiaojf at gmail.com> wrote:
> 
>> 2) In the function f(X) there are terms of the form log(xi). Although I
>> have constrained the values of xi in the constraint functions, xi may
>> still become less than or equal to 0 during the minimization, at which
>> point a "math domain error" occurs.
>>
>> What should I do ?
>>
>> My current approach is to catch the "math domain error" when it occurs
>> and set the return value of f to a very large number. Will this work?
> 
> This problem happens because, while fmin_cobyla is able to handle
> constraints, it nevertheless evaluates the objective function at points
> outside the region those constraints define. I think this is a very
> serious bug, which limits the usefulness of fmin_cobyla. Unfortunately,
> I do not have a general solution.

This behavior of Cobyla is not a bug.

There are two kinds of methods for constrained problems in optimization:
feasible and infeasible methods. The iterates generated by a feasible method
all satisfy the constraints. In an infeasible method, intermediate iterates
may or may not satisfy the constraints but, if all goes well, they converge
to a point which is (approximately) feasible. Cobyla is an infeasible
method.
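
You can see this for yourself with a small sketch (the objective, constraint
and starting point below are made up for illustration). With a large initial
trust region, fmin_cobyla happily evaluates f at points that violate the
constraint:

    import numpy as np
    from scipy.optimize import fmin_cobyla

    def f(x):
        if x[0] < 0.1:
            print('infeasible evaluation at %s' % x)
        return (x[0] - 1.0)**2

    def con(x):
        return x[0] - 0.1            # feasible iff x[0] >= 0.1

    xopt = fmin_cobyla(f, [5.0], [con], rhobeg=10.0, rhoend=1e-6)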

The fact that your objective function is not defined everywhere (because of
the log terms) suggests that you should not be using an infeasible method.
Instead, you could look into feasible interior-point methods (beware: there
are also infeasible interior-point methods; in fact, most interior-point
methods are infeasible).
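
To fix ideas, here is a bare-bones primal log-barrier sketch. It is not TRIP
or any other production method, just an illustration of the principle: the
iterates stay strictly feasible because any trial point that violates a
constraint is rejected with an infinite barrier value.

    import numpy as np
    from scipy.optimize import fmin

    def barrier_solve(f, cons, x0, mu=1.0, shrink=0.1, n_outer=5):
        # Minimize f(x) subject to c_i(x) > 0 by minimizing the barrier
        # function f(x) - mu * sum(log(c_i(x))) for decreasing values of
        # mu.  The starting point x0 must be strictly feasible.
        x = np.asarray(x0, dtype=float)
        for _ in range(n_outer):
            def phi(x):
                c = np.array([ci(x) for ci in cons])
                if np.any(c <= 0.0):
                    return np.inf    # reject infeasible trial points
                return f(x) - mu * np.sum(np.log(c))
            x = fmin(phi, x, disp=0)
            mu *= shrink
        return x

    # Example: minimize x*log(x) subject to x >= 0.01.
    xstar = barrier_solve(lambda x: x[0] * np.log(x[0]),
                          [lambda x: x[0] - 0.01], [1.0])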

NLPy features one such method, albeit a simple one, under the name TRIP (for
'trust-region interior-point method'). For those who care about this sort of
thing, it is a purely primal method at the moment. I mostly implemented it
as a proof of concept in the NLPy framework, and plan to upgrade it in the
future. It does not handle equality constraints, but the inequalities can be
as nonlinear as you please, as long as they are twice differentiable. You
can give it a try and let me know what happens.

In addition to the constraints you already have, add new constraints stating 
that the arguments of the logs must be >= 0. Interior-point methods ensure that 
at each iteration, those quantities remain > 0 (strictly).
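
Concretely, if your objective contains, say, log(x[0]) and log(x[0] + x[1]),
the constraint list simply grows by one function per log argument (the first
constraint below is an invented example standing in for whatever you already
have):

    cons = [
        lambda x: 1.0 - x[0] - x[1],   # an existing constraint (example)
        lambda x: x[0],                # argument of log(x[0])
        lambda x: x[0] + x[1],         # argument of log(x[0] + x[1])
    ]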

If the constraints you end up with are *only bound constraints*, you can
also look into L-BFGS-B, a projected quasi-Newton method (which I believe is
interfaced in SciPy as fmin_l_bfgs_b), or TRON, a projected Newton method
(www.mcs.anl.gov/~more/tron; I don't think it is interfaced in SciPy). NLPy
also contains a gradient-projection method which I find surprisingly
efficient, but it too applies only to bound constraints.
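
For the purely bound-constrained case, a minimal L-BFGS-B sketch (the
objective is a placeholder; the small positive lower bound keeps the logs
defined, since L-BFGS-B keeps its iterates inside the bounds):

    import numpy as np
    from scipy.optimize import fmin_l_bfgs_b

    def f(x):
        return np.sum(x * np.log(x))   # placeholder objective

    bounds = [(1e-6, None)] * 3        # each xi >= 1e-6, no upper bound
    x, fval, info = fmin_l_bfgs_b(f, np.ones(3), approx_grad=True,
                                  bounds=bounds)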

Re-parametrizing your problem is, of course, another option: for instance,
writing xi = exp(yi) makes every xi positive by construction, so log(xi) =
yi is always defined and the positivity constraints disappear altogether.
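
A quick sketch of that substitution (again with a placeholder objective):

    import numpy as np
    from scipy.optimize import fmin

    def f_y(y):
        # After the substitution xi = exp(yi), the original terms
        # xi*log(xi) become exp(yi)*yi, so no domain error is possible.
        return np.sum(np.exp(y) * y)

    y_opt = fmin(f_y, np.zeros(3), disp=0)
    x_opt = np.exp(y_opt)              # map back to the original variables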

Good luck,
Dominique


