[SciPy-Dev] minimize reporting successful but not searching (with default method)

David Mikolas david.mikolas1 at gmail.com
Mon Mar 21 21:59:50 EDT 2016


Andrew,

Differential evolution is new to me and therefore by definition
interesting, thanks!

In this particular case I definitely want the local minimum (this eclipse),
and calls are expensive - there's a millisecond delay for each new instance
of the Julian Date arrays:

http://stackoverflow.com/q/35358401

But if I access the database directly, then this could be very interesting
to try to search for new events!
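
For what it's worth, here is a minimal sketch of what such a
differential_evolution search could look like; the separation function,
bounds, and dates below are invented stand-ins for the real ephemeris-based
objective:

    import numpy as np
    from scipy.optimize import differential_evolution

    # Stand-in objective: "separation" vs. Julian Date, with a staircase
    # character like the real function (all values here are invented).
    def separation(t):
        t = t[0]                     # differential_evolution passes a 1-D array
        return np.floor(1e4 * (t - 2457468.3)**2) / 1e4

    # Rough lower/upper bounds on the Julian Date of the event (assumed).
    bounds = [(2457460.0, 2457475.0)]

    result = differential_evolution(separation, bounds, seed=0)
    print(result.x, result.fun, result.success)

Since it is stochastic and needs many function evaluations, this only becomes
attractive once each call is cheap, e.g. after going to the database directly.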

On Tue, Mar 22, 2016 at 9:00 AM, Andrew Nelson <andyfaff at gmail.com> wrote:

> If you have a rough idea of lower/upper bounds for the parameters then you
> could use differential_evolution, it uses a stochastic rather than gradient
> approach. It will require many more function evaluations though.
>
>
> On 22 March 2016 at 08:39, Pauli Virtanen <pav at iki.fi> wrote:
>
>> Mon, 21 Mar 2016 23:54:13 +0800, David Mikolas wrote:
>> > It works fine if I specify Nelder-Mead, but my question is "why does
>> > minimize return successful when it isn't?"
>>
>> Your function is piecewise constant ("staircase" pattern on a small
>> scale).
>>
>> The derivative at the initial point is zero.
>>
>> The optimization method is a local optimizer.
>>
>> Therefore: the initial point is a local optimum (in the sense understood
>> by the optimizer, i.e. one satisfying the KKT conditions).
>>
>>    ***
>>
>> As your function is not differentiable, the KKT conditions don't mean
>> much.
>>
>> Note that the fact that it works with Nelder-Mead is sort of a coincidence:
>> it works because the "staircase" pattern of the function is on a scale
>> smaller than the initial simplex size.
>>
>> On the other hand, it fails for BFGS, because the default step size used
>> for numerical differentiation happens to be smaller than the width of the
>> "staircase" steps.
>>
>> > and what can I do (if anything) to flag this behavior?
>>
>> Know whether your function is differentiable or not, and which
>> optimization methods are expected to work.
>>
>> There's no cheap general way of detecting --- based on the numerical
>> floating-point values output by a function --- whether a function is
>> everywhere differentiable or continuous. This information must come from
>> the author of the function, and optimization methods can only assume it.
>>
>> You can, however, make sanity checks, e.g. starting the optimizer from
>> different initial values to see whether you're stuck at a local minimum.
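
A simple version of that check, reusing the toy staircase function from the
sketch above, is to restart the optimizer from several initial guesses and
compare where it ends up:

    import numpy as np
    from scipy.optimize import minimize

    def f(x):                        # toy staircase objective (invented)
        x = x[0]
        return np.floor(1e4 * (x - 3.0)**2) / 1e4

    # Wildly different answers, or no movement at all, are a warning sign.
    for start in [-5.0, 0.0, 4.0, 10.0, 20.0]:
        res = minimize(f, [start], method='Nelder-Mead')
        print(start, '->', res.x, res.fun)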
>>
>> If the function is not differentiable, optimizers that use derivatives,
>> even numerically approximated, can be unreliable.
>>
>> If the non-differentiability is on a small scale, you can cheat and choose
>> a numerical differentiation step size large enough --- and hope for the
>> best.
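
With minimize's default BFGS method that step can be set through the 'eps'
option; a sketch, with the value below chosen only for illustration:

    import numpy as np
    from scipy.optimize import minimize

    def f(x):                        # toy staircase objective (invented)
        x = x[0]
        return np.floor(1e4 * (x - 3.0)**2) / 1e4

    # A differentiation step clearly larger than the flat steps lets the
    # numerical gradient follow the underlying smooth trend.
    res = minimize(f, [10.0], method='BFGS', options={'eps': 1e-3})
    print(res.x, res.success)        # now moves toward x = 3 instead of stopping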
>>
>> --
>> Pauli Virtanen
>>
>
>
>
> --
> _____________________________________
> Dr. Andrew Nelson
>
>
> _____________________________________
>