[SciPy-Dev] Extending scipy.optimize.minimize with particle swarm

Andrea Gavana andrea.gavana at gmail.com
Fri May 15 11:40:51 EDT 2020


On Fri, 15 May 2020 at 17:21, Ralf Gommers <ralf.gommers at gmail.com> wrote:

>
>
> On Thu, May 14, 2020 at 3:55 AM Andrew Nelson <andyfaff at gmail.com> wrote:
>
>> As a global optimizer in the first instance it would probably be added as
>> a separate function (`particle_swarm`), rather than be added as a method to
>> minimize. I'm not familiar with particle swarm, are there various
>> types/flavours of the approach?
>>
>> The general route to adding new global optimizers is:
>>
>
> Thanks for this detailed reply Andrew. I think it's worth adding this to
> the new section of the developer guide once
> https://github.com/scipy/scipy/pull/12069 is merged.
>
> One thing to ask additionally here is: are particle swarm methods in
> scope? It would be good to compare this proposed method to for example
> what's currently available in PyGMO (https://esa.github.io/pygmo/).
>
> Cheers,
> Ralf
>

My apologies for pitching in, I’m usually just an observer. I’m sure some
of you remember the global optimization benchmarks I once posted - and I
believe they have been updated, adapted and integrated into scipy:

http://infinity77.net/global_optimization/multidimensional.html

A particle swarm algorithm was in fact among that set of optimizers,
specifically the one from:

http://www.norg.uminho.pt/aivaz/pswarm/

It turned out to perform reasonably well in the end. My impression is that
there might be scope for a particle swarm algorithm in scipy: I’ve used a
few variants of it in the past on problems where many other optimization
procedures failed. But then I’m just a lurker, so please take my opinion
with a grain of salt :-).

Andrea.



>
>
>> - is it a recognised solver approach that is proven in the scientific
>> literature?
>> - run it against the global benchmark suite, which would involve
>> modifying
>> https://github.com/scipy/scipy/blob/master/benchmarks/benchmarks/optimize.py (do
>> this before you do any modification of scipy library code). Running those
>> global benchmarks for the `particle_swarm` function would check that the
>> solver is performant enough. By performant we mean that the percentage
>> success rate against the problems is as good as the other global solvers.
>> Also of interest is the average number of function evaluations needed to
>> reach that success rate (time taken is probably dominated by the number
>> of function evaluations). Ideally the new function handles problems that
>> the other solvers can't.
>>
>> If it's performant then we can go further towards adding it to scipy
>> (these steps would be after a further discussion on this list):
>>
>> - code needs to be compatible with the scipy licence (i.e. no GPL/LGPL)
>> - the changes need a comprehensive test suite.
>> - I strongly suggest that you have a public `particle_swarm` function,
>> and a private ParticleSwarm class that does the solving behind the scenes.
>> If possible make the ParticleSwarm object an iterator with a __next__
>> method so that individual iteration steps can be taken. See
>> https://github.com/scipy/scipy/blob/master/scipy/optimize/_differentialevolution.py as
>> an example. A full solve could be done with a `solve` method.
>> - keep the API as close as possible to other optimizers, i.e. same
>> parameter names, same keyword names, parameter order is similar to other
>> minimizers.
>> - differential_evolution benefitted from offering a reduced set of
>> functionality when it was first added. Additional features came afterwards
>> and the delayed introduction of them meant that the API/design of the
>> solver benefitted from a lot of thought over a prolonged period.
>> - if `particle_swarm` uses random number generation then the solver
>> should be able to generate random numbers with either
>> `np.random.RandomState` or `np.random.Generator` (
>> https://github.com/scipy/scipy/blob/master/scipy/optimize/_differentialevolution.py#L560).
>> This means using only methods that are available on both types of objects.
>> - the random number generation needs to be reproducible (function should
>> have `seed` keyword).
>> - total number of function evaluations needs to be tracked.
>> - consider parallelisation aspects of your function. Can calls to the
>> objective function be done in parallel? (`workers` keyword).
>> - vectorisation is nice.
>> _______________________________________________
>> SciPy-Dev mailing list
>> SciPy-Dev at python.org
>> https://mail.python.org/mailman/listinfo/scipy-dev
>>