[SciPy-dev] Derivative-based nonlinear optimization with linear inequality constraints

Bill Baxter wbaxter at gmail.com
Tue May 22 14:58:59 EDT 2007


On 5/22/07, Anne Archibald <peridot.faceted at gmail.com> wrote:
> On 22/05/07, Bill Baxter <wbaxter at gmail.com> wrote:
>
> > All very interesting, but I'm not sure what point you're trying to make.
> > I'd be happy to try different solvers on my problem, and in fact I
> > have tried a couple.  The BFGS solvers seem to work pretty well if I
> > start from a feasible point, but it makes me a little nervous using
> > them since the values of my function are bogus outside the feasible region.
> > I'm just setting f(x) to HUGE_NUM outside the feasible region.
> > fmin_tnc did *not* work. It seems to take a step out of bounds and get
> > lost.
>
> What happened when you tried using a derivative-less solver (which I
> guess is just fmin_cobyla)? Too slow? I've never actually gone to the
> trouble of implementing analytic derivatives even when I could have,
> but then I've never had particularly demanding minimizations to do.
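
To make the setup concrete before answering: here is roughly what the
two approaches look like side by side, the HUGE_NUM penalty wrapped
around fmin_bfgs and fmin_cobyla taking the inequalities directly.
The quadratic objective and the constraints A x >= b below are
made-up stand-ins for my real problem:

import numpy as np
from scipy.optimize import fmin_bfgs, fmin_cobyla

HUGE_NUM = 1e20

# Made-up problem data: minimize f subject to A x >= b
# (here just x >= 0 and y >= 0).
A = np.eye(2)
b = np.zeros(2)

def f(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2

def f_penalized(x):
    # Return a huge value outside the feasible region so the line
    # search backs off any step that leaves it.  This only behaves
    # sensibly if the starting point is feasible.
    if np.any(np.dot(A, x) < b):
        return HUGE_NUM
    return f(x)

x0 = np.array([2.0, 0.5])              # feasible starting point

xb = fmin_bfgs(f_penalized, x0)        # the penalty approach

# COBYLA needs no derivatives and takes the constraints as a list
# of functions that must be >= 0 at a feasible point.
cons = [lambda x, i=i: np.dot(A[i], x) - b[i] for i in range(len(b))]
xc = fmin_cobyla(f, x0, cons, rhoend=1e-7)

print("BFGS+penalty: %s   COBYLA: %s" % (xb, xc))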

I should do more testing, but when I tried BFGS with finite
differences it used about 10x as many function evaluations.  The
difference in computation between just the function and the
function+gradient is not that big, so using finite differences was
significantly slower.  I'll give cobyla and the standard fmin
simplex a try just to see, though.
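
What I mean is something like the following, with a toy quadratic
standing in for the real function.  The interesting output is the two
evaluation counts, not the minimizer:

import numpy as np
from scipy.optimize import fmin_bfgs

n_calls = [0]

def f(x):
    n_calls[0] += 1
    return (x[0] - 1.0) ** 2 + 10.0 * (x[1] - 2.5) ** 2

def fgrad(x):
    # Analytic gradient: barely more work than evaluating f itself.
    return np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] - 2.5)])

x0 = np.array([2.0, 0.5])

n_calls[0] = 0
fmin_bfgs(f, x0, fprime=fgrad, disp=0)
evals_analytic = n_calls[0]

n_calls[0] = 0
fmin_bfgs(f, x0, disp=0)               # gradient by finite differences
evals_fd = n_calls[0]

print("f evaluations, analytic gradient:   %d" % evals_analytic)
print("f evaluations, finite differences:  %d" % evals_fd)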

--bb


