[SciPy-Dev] minimize reporting successful but not searching (with default method)

Pauli Virtanen pav at iki.fi
Mon Mar 21 17:39:47 EDT 2016


Mon, 21 Mar 2016 23:54:13 +0800, David Mikolas wrote:
> It works fine if I specify Nelder-Mead, but my question is "why does
> minimize return successful when it isn't?"

Your function is piecewise constant ("staircase" pattern on a small 
scale).

The derivative at the initial point is zero.

The optimization method is a local optimizer.

Therefore: the initial point is a local optimum (in the sense understood 
by the optimizer, i.e., satisfying the KKT conditions).
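For instance, a minimal sketch (not your actual function, but one with 
the same character) that reproduces this with the default method:

    import numpy as np
    from scipy.optimize import minimize

    def staircase(x):
        # Quantized parabola: piecewise constant on a 1e-6 scale,
        # with its coarse minimum near x = 0.3.
        return np.round((x[0] - 0.3)**2, 6)

    res = minimize(staircase, x0=[1.0])   # default method: BFGS
    print(res.success, res.x)             # True, x unchanged at 1.0

The forward-difference gradient at x0 evaluates to exactly zero, so 
the gradient-norm stopping test passes on the first iteration.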

   ***

As your function is not differentiable, the KKT conditions don't mean 
much.

That it works with Nelder-Mead is sort of a coincidence --- it works 
because the "staircase" pattern of the function is on a scale smaller 
than the initial simplex size.

On the other hand, it fails for BFGS, because the default step size used 
for numerical differentiation happens to be smaller than the width of 
the "staircase" steps.

> and what can I do (if anything) to flag this behavior?

Know whether your function is differentiable or not, and which 
optimization methods are expected to work.

There's no cheap general way of detecting --- based on the numerical 
floating-point values output by a function --- whether a function is 
everywhere differentiable or continuous. This information must come from 
the author of the function, and optimization methods can only assume it.

You can however make sanity checks, e.g., restarting from different 
initial values to see whether you're stuck at a spurious local minimum.
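For instance, restarting the sketch above from several points:

    for x0 in (-2.0, 0.0, 1.0, 5.0):
        res = minimize(staircase, x0=[x0])
        print(x0, res.x, res.success)

Every run reports success at (or essentially at) its own starting 
point, which is a strong hint that the reported optima are artifacts.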

If the function is not differentiable, optimizers that use derivatives, 
even numerically approximated, can be unreliable.

If the non-differentiability is on a small scale, you can cheat and 
choose a numerical differentiation step size larger than that scale --- 
and hope for the best.
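With the default BFGS method the knob is the `eps` option, which sets 
the finite-difference step size. A sketch (the 1e-4 value assumes you 
know the staircase is on a ~1e-6 scale):

    res = minimize(staircase, x0=[1.0], method='BFGS',
                   options={'eps': 1e-4})   # step >> staircase width
    print(res.x)   # should now end up near the coarse minimum at 0.3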

-- 
Pauli Virtanen



