[SciPy-user] nonlinear optimisation with constraints

Ernest Adrogué eadrogue at gmx.net
Mon Jun 22 13:15:14 EDT 2009


22/06/09 @ 13:54 (+0200), thus spake Sebastian Walter:
> 2009/6/22 Ernest Adrogué <eadrogue at gmx.net>:
> > Hi Sebastian,
> >
> > 22/06/09 @ 09:57 (+0200), thus spake Sebastian Walter:
> >>
> >> are you sure you can't reformulate the problem?
> >
> > Another approach would be to try to solve the system of
> > equations resulting from equating the gradient to zero.
> > Such equations are defined for all x. I have already tried
> > that with fsolve(), but it only seems to find the obvious,
> > useless solution of x=0. I was going to try a
> > Newton-Raphson algorithm, but since this would require the
> > Hessian matrix to be calculated, I'm leaving this option
> > as a last resort :)
> 
> Ermmm, I don't quite get it.  You have an NLP with linear equality
> constraints and box constraints.
> Of course you could write down the Lagrangian for that and define an
> algorithm that satisfies the first- and second-order optimality
> conditions.
> But that is not going to be easy, even if you have the exact Hessian:
> you'll need some globalization strategy (linesearch, trust region, ...)
> to guarantee global convergence
> and implement something like projected gradients so you stay within
> the box constraints.
> 
> I guess it will be easier to use an existing algorithm...

Mmmm, yes, but the box constraints are merely to prevent the
algorithm from evaluating f(x) with values of x for which f(x)
is not defined. They're not "real" constraints, because I know
beforehand that all elements of x are > 0 at the maximum.

> And I just had a look at fmin_l_bfgs_b: how did you set the equality
> constraints for this algorithm. It seems to me that this is an
> unconstrained optimization algorithm which is worthless if you have a
> constrained NLP.

You're right. I included the equality constraint within the
function itself, so that the function I optimised with fmin_l_bfgs_b
had one parameter fewer and computed the "missing" parameter
internally as a function of the others.

The problem is that this dependent parameter was unaffected by
the box constraints and would eventually take values < 0.
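
Schematically, it looked something like this (with a toy
objective and sum(x) == 1 standing in for my actual function
and constraint):

import numpy as np
from scipy.optimize import fmin_l_bfgs_b

def f(x):
    # toy stand-in for the real objective; undefined for x <= 0
    return -np.sum(np.log(x))

def reduced_f(y):
    # fold in the equality constraint: the last component is
    # reconstructed from the free parameters
    x = np.append(y, 1.0 - np.sum(y))
    return f(x)

y0 = np.ones(4) * 0.1
# the bounds apply only to the free parameters; nothing stops the
# reconstructed component 1 - sum(y) from going negative, which is
# the failure mode described above
xopt, fval, info = fmin_l_bfgs_b(reduced_f, y0, approx_grad=True,
                                 bounds=[(1e-8, None)] * 4)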

Fortunately, Siddhardh Chandra has told me the solution, which
is to maximise f(|x|) instead of f(x), with the linear
constraint incorporated into the function, using a simple
unconstrained optimisation algorithm. His message hasn't made it
to the list though.
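
In code the idea is something like this (again with a toy
objective and sum(x) == 1 as the constraint; since fmin
minimises, the real problem would minimise -f):

import numpy as np
from scipy.optimize import fmin

def f(x):
    # toy stand-in for the real objective; undefined for x <= 0
    return -np.sum(np.log(x))

def g(y):
    # fold in the equality constraint, then take absolute values
    # so f never sees a negative argument; since all elements of x
    # are known to be > 0 at the optimum, the absolute values are
    # harmless there
    x = np.abs(np.append(y, 1.0 - np.sum(y)))
    return f(x)

y0 = np.ones(4) * 0.1
yopt = fmin(g, y0, maxfun=20000, maxiter=20000)
# recover the solution in the original variables
xopt = np.abs(np.append(yopt, 1.0 - np.sum(yopt)))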

I have just done this and it seems to work! After 10,410
function evaluations and 8,904 iterations, fmin found the
solution, and it looks sound at first sight.

Thanks for your help.

> remark:
> To compute the Hessian you can always use an AD tool. There are
> several available in Python.
> My biased favourite is pyadolc
> (http://github.com/b45ch1/pyadolc), which is slowly approaching
> version 1.0.

I will have a look.

Thanks again :)

Ernest


