[SciPy-Dev] Global Optimization Benchmarks

Andrea Gavana andrea.gavana at gmail.com
Sun Jul 26 10:05:20 EDT 2020


Dear SciPy developers & users,

     I have a couple of new derivative-free global optimization algorithms
I’ve been working on lately - plus some improvements to AMPGO and a few
more benchmark functions - and I’d like to rerun the benchmarks as I did
back in 2013 (!!!).

In doing so, I’d like to remove some of the least interesting or
worst-performing algorithms (Firefly, MLSL, Galileo, the original DE) and
replace them with the ones currently available in SciPy -
differential_evolution, SHGO and dual_annealing.

Everything seems fine and dandy, but it appears to me that SHGO does not
accept an initial point for the optimization process - which makes the
whole “run the optimization from 100 different starting points for each
benchmark” exercise a bit moot.
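
To make the issue concrete, here is a rough sketch on a toy objective
(rosen is just a stand-in for the actual benchmark functions, and the
bounds/starting point are made up): dual_annealing takes the starting
point directly via x0, differential_evolution can at least have the point
injected into its initial population via init, while shgo only takes the
bounds.

    import numpy as np
    from scipy.optimize import (rosen, dual_annealing,
                                differential_evolution, shgo)

    bounds = [(-5.0, 5.0)] * 2
    x0 = np.array([2.0, -3.0])   # one of the 100 starting points

    # dual_annealing accepts the starting point directly
    res_da = dual_annealing(rosen, bounds, x0=x0, seed=0)

    # differential_evolution has no x0 argument (as of SciPy 1.5), but the
    # initial population can be supplied via `init`, so the starting point
    # can be made one of its members
    rng = np.random.default_rng(0)
    pop = rng.uniform(-5.0, 5.0, size=(15, 2))
    pop[0] = x0
    res_de = differential_evolution(rosen, bounds, init=pop, seed=0)

    # shgo only takes the bounds - no obvious place to pass x0
    res_shgo = shgo(rosen, bounds)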

I am no expert on SHGO, so maybe there is an alternative way to “simulate”
changing the starting point of the optimization? Or maybe some other
approach that keeps the comparison consistent across optimizers?
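
The only crude workaround I have come up with so far is to vary the size
of SHGO’s initial sampling set from run to run, so that the 100 repeats
are at least distinct - even though that is clearly not the same thing as
changing a starting point. Roughly, reusing the toy setup above:

    # vary the number of initial sampling points instead of the starting
    # point; n and sampling_method are standard shgo arguments
    rng = np.random.default_rng(0)
    shgo_runs = [shgo(rosen, bounds, n=int(n_pts), sampling_method='sobol')
                 for n_pts in rng.integers(50, 200, size=100)]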

Any suggestion is more than welcome.

Andrea.