[SciPy-Dev] minimize reporting successful but not searching (with default method)

David Mikolas david.mikolas1 at gmail.com
Mon Mar 21 21:46:21 EDT 2016


Pauli, thank you for taking the time to explain clearly and thoroughly. I've
updated the question on Stack Overflow by removing words like "wrong" and
"fail", and I've added a supplementary answer with a plot of the staircase
behavior of the JulianDate method.

http://stackoverflow.com/a/36144582
http://i.stack.imgur.com/CUuia.png

That's 40 microseconds on a span of thousands of years.
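
For scale, a rough sketch of where that number comes from (this is just
float64 spacing at a Julian-date-sized value, not the actual JulianDate
code):

    import numpy as np

    jd = 2457469.5                  # a Julian date near March 2016
    step_days = np.spacing(jd)      # gap to the next representable float64
    print(step_days * 86400 * 1e6)  # ~40 microseconds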

On Tue, Mar 22, 2016 at 5:39 AM, Pauli Virtanen <pav at iki.fi> wrote:

> Mon, 21 Mar 2016 23:54:13 +0800, David Mikolas kirjoitti:
> > It works fine if I specify Nelder-Mead, but my question is "why does
> > minimize report success when it isn't?"
>
> Your function is piecewise constant ("staircase" pattern on a small
> scale).
>
> The derivative at the initial point is zero.
>
> The optimization method is a local optimizer.
>
> Therefore: the initial point is a local optimum (in the sense understood
> by the optimizer, i.e. satisfying the KKT conditions).
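> 
> For instance, a rough sketch with a toy staircase objective (not your
> actual JulianDate-based function):
> 
>     import numpy as np
>     from scipy.optimize import minimize
> 
>     # a smooth parabola whose output is rounded to 1e-6, so it is
>     # piecewise constant ("staircase") on a small scale
>     def f(x):
>         return np.round((x[0] - 3.0)**2, 6)
> 
>     res = minimize(f, x0=[0.0])       # default method (BFGS)
>     print(res.success, res.x)
>     # -> True, [ 0.]: the finite-difference probe (~1.5e-8) never leaves
>     #    the current "stair", so the estimated gradient is zero and the
>     #    first-order optimality test passes at the starting point.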
>
>    ***
>
> As your function is not differentiable, the KKT conditions don't mean
> much.
>
> Note that the fact that it works with Nelder-Mead is sort of a coincidence
> --- it works because the "staircase" pattern of the function is on a scale
> smaller than the initial simplex size.
>
> On the other hand, it fails for BFGS, because the default step size used
> for numerical differentiation happens to be smaller than the "staircase"
> step size.
>
> > and what can I do (if anything) to flag this behavior?
>
> Know whether your function is differentiable or not, and which
> optimization methods are expected to work.
>
> There's no cheap general way of detecting --- based on the numerical
> floating-point values output by a function --- whether a function is
> everywhere differentiable or continuous. This information must come from
> the author of the function, and optimization methods can only assume it.
>
> You can, however, make sanity checks, e.g. using different initial values
> to see whether you're stuck at a local minimum.
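> 
> A crude version of that check, again with the toy objective above:
> 
>     # restart from several initial guesses and compare the results
>     for x0 in (-10.0, 0.0, 7.0):
>         r = minimize(f, x0=[x0])      # default BFGS again
>         print(x0, r.x, r.success)
>     # every start reports success without moving at all --- a strong
>     # hint that the gradient estimate, not the problem, is at fault.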
>
> If the function is not differentiable, optimizers that use derivatives,
> even numerically approximated, can be unreliable.
>
> If the non-differentiability is on a small scale, you can cheat and choose
> a numerical differentiation step size large enough --- and hope for the
> best.
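> 
> For the toy staircase objective, that would look like (the value 1e-4 is
> only an example; it has to exceed the stair width):
> 
>     res_big = minimize(f, x0=[0.0], method='BFGS',
>                        options={'eps': 1e-4})
>     print(res_big.x, res_big.message)
>     # the optimizer now actually moves toward x = 3; near the flat
>     # bottom it may still stop with a precision-loss warning rather
>     # than a clean success, which is at least an honest signal.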
>
> --
> Pauli Virtanen
>