[SciPy-Dev] Consideration of differential evolution minimizer being added to scipy.optimize.

Robert Kern robert.kern at gmail.com
Tue Mar 4 06:33:41 EST 2014


On Tue, Mar 4, 2014 at 2:21 AM, Andrew Nelson <andyfaff at gmail.com> wrote:
> I have written some code implementing the differential evolution
> minimization algorithm, as invented by Storn and Price.  It's a
> stochastic technique, not gradient-based, but it's quite good at
> finding global minima of functions.
>
> (see http://www1.icsi.berkeley.edu/~storn/code.html,
> http://en.wikipedia.org/wiki/Differential_evolution)
>
> I'd like it to be considered for inclusion in scipy.optimize, and have
> tried to write it as such. Can anyone give advice on how to go about
> polishing the code, such that it's suitable for inclusion in
> scipy.optimize?
>
> https://github.com/andyfaff/DEsolver

Looks good! A couple of minor nits:

- Use PEP8-compliant names for methods, attributes and variables.

- You don't need double-underscore private methods (they just trigger
Python's name mangling); single underscores will do.

- Don't seed a new RandomState instance every time. Instead, take a
RandomState instance (defaulting to the global numpy.random if not
specified). This lets you integrate well with other code that may also
be using random numbers. For example, see what sklearn does:

https://github.com/scikit-learn/scikit-learn/blob/master/sklearn/utils/validation.py#L301

Actually, since this is such a common pattern, I may steal that
function to put into numpy.random so everyone can use it.
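To make that concrete, here is a rough, untested sketch of the pattern I
mean. The names (a check_random_state helper along the lines of the sklearn
function linked above, plus a placeholder DESolver class with a _mutate
method) are illustrative, not taken from Andrew's code:

import numpy as np


def check_random_state(seed):
    """Turn seed into a np.random.RandomState instance.

    None        -> the global RandomState behind the numpy.random module
    int         -> a fresh RandomState seeded with that int
    RandomState -> returned unchanged
    """
    if seed is None:
        # the singleton used by the numpy.random module-level functions
        return np.random.mtrand._rand
    if isinstance(seed, (int, np.integer)):
        return np.random.RandomState(seed)
    if isinstance(seed, np.random.RandomState):
        return seed
    raise ValueError("%r cannot be used to seed a RandomState" % seed)


class DESolver(object):
    # placeholder skeleton, only showing how the RandomState is threaded through
    def __init__(self, func, bounds, seed=None):
        self.func = func
        self.bounds = np.asarray(bounds, dtype=float)
        # store one RandomState up front; never re-seed inside the solver
        self.random_state = check_random_state(seed)

    def _mutate(self, candidate):
        # a single leading underscore is enough for a private helper; all
        # randomness is drawn from self.random_state, not numpy.random
        r = self.random_state.random_sample(candidate.shape)
        return candidate + 0.5 * r  # placeholder arithmetic, not real DE mutation

A caller can then pass seed=123 for a reproducible run, or hand in the
RandomState instance the rest of their program already uses so everything
draws from a single stream.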

-- 
Robert Kern


