[SciPy-Dev] Adding a multistart algorithm to global optimizers in scipy.optimize

Anne Archibald peridot.faceted at gmail.com
Mon Jul 18 08:35:44 EDT 2016


On Mon, Jul 18, 2016 at 2:03 PM Fabian Rost <fabian.rost at tu-dresden.de>
wrote:

> Actually, a wrapper that starts many optimizations at random points would
> not only work for global optimizers but also for local optimizers. However,
> this wrapper would be limited to optimizers that use an initial guess
> (optimize.brute and differential_evolution would be excluded). Furthermore,
> this wrapper would turn any local optimizer into a global optimizer. In
> fact, the code I suggested here
> <http://nbviewer.jupyter.org/github/fabianrost84/multistart-minimize/blob/master/multistart-minimize.ipynb> is
> just a wrapper around optimize.minimize.
>
> I realized that the documentation of optimize.basinhopping suggests that
> "the algorithm can be run from a number of different random starting
> points". So a convenient wrapper to do this definitely would not hurt.
>
> I wonder what this wrapper should return. It could return just the global
> optimum. But for analysis it might also be good to get all the
> OptimizeResults. This could help to judge how many local optima there are
> and whether more or fewer initial guesses are needed to find the global optimum.
>
> Parallelization is certainly a good idea.
>
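To make this concrete, here is a minimal sketch of such a wrapper (the
name multistart_minimize, its signature, and the uniform sampling of
starting points are all illustrative choices, not a proposed API):

    import numpy as np
    from scipy import optimize

    def multistart_minimize(fun, bounds, n_starts=10, seed=None, **kwargs):
        # Draw starting points uniformly from `bounds`, a sequence of
        # (low, high) pairs, and run optimize.minimize from each one.
        rng = np.random.RandomState(seed)
        lo, hi = np.array(bounds, dtype=float).T
        results = [optimize.minimize(fun, rng.uniform(lo, hi), **kwargs)
                   for _ in range(n_starts)]
        # Keep every OptimizeResult: the full list lets the caller judge
        # how many distinct local optima there are and whether n_starts
        # was large enough.
        ok = [r for r in results if r.success] or results
        return min(ok, key=lambda r: r.fun), results

For example, multistart_minimize(optimize.rosen, [(-2, 2)] * 2,
n_starts=20, seed=0) returns the best of twenty local fits along with
all twenty OptimizeResults.
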
What to return is a question the other global optimizers have already had
to face. The basinhopping code just returns the global optimum, I believe,
but also provides a callback that allows users to see each local minimum.
They can also stop the global optimization early if they're happy:
basinhopping's callback can return True to stop the run, and raising an
exception works if nothing else will. This may require some
thought in the context of parallelization, since it's not so easy to send
results back from subprocesses. It might be necessary to "open the hood" of
basinhopping and run just the local optimizations in subprocesses.
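
To sketch that "open the hood" approach (everything below is
illustrative, and it assumes the objective is a picklable, module-level
function, as multiprocessing requires on some platforms):

    import multiprocessing
    import numpy as np
    from scipy import optimize

    def local_minimize(x0):
        # Module-level so the pool can pickle it; only the local
        # optimization runs in the subprocess.
        return optimize.minimize(optimize.rosen, x0, method="L-BFGS-B")

    if __name__ == "__main__":
        rng = np.random.RandomState(0)
        starts = rng.uniform(-2.0, 2.0, size=(16, 2))
        with multiprocessing.Pool() as pool:
            # The parent collects every OptimizeResult and does the
            # bookkeeping a callback would otherwise do, since callbacks
            # and exceptions don't cross process boundaries cleanly.
            results = pool.map(local_minimize, list(starts))
        best = min(results, key=lambda r: r.fun)
        print(best.x, best.fun)

Accepting a user-supplied map-like callable instead of hard-wiring
multiprocessing would be one way to let callers choose their own
execution model (threads, processes, MPI) without scipy taking a
position.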

Obviously not everyone is going to want parallelization, but global
optimization tends to be a "big" problem, and making it easy for people to
use parallelization, to the degree and in the way that they prefer, would
make the code in scipy much more useful. For really complex and difficult
optimizations, of course, people are going to want to switch to more
elaborate specialized codes. But if we make that less necessary, so much
the better.

Anne