[SciPy-dev] Proposal for more generic optimizers (posted before on scipy-user)

Matthieu Brucher matthieu.brucher at gmail.com
Thu Mar 8 02:59:52 EST 2007


Hi,

Here is a small proposal for the simple optimizer (I intend to make a
damped one if the structure I propose is acceptable).
What is in the package:
- Rosenbrock is the Rosenbrock function, with gradient and hessian methods;
it is the example
- Optimizer is the core optimizer, the skeleton
- StandardOptimizer is the standard optimizer (not very complicated, in
fact), with six optimization examples
- Criterions is a file with three simple convergence criteria: monotonicity,
relative error and absolute error. More complex ones can be created.
- GradientStep is a class that computes the gradient step of a function at a
specific point
- NewtonStep is the same as the latter, but with a Newton step
- NoAppendList is an empty list-like container, not derived from list,
though it could be if needed. The goal was to be able to save every set of
parameters, when desired, by passing a real list or another container to
Optimizer (a minimal sketch of this skeleton follows the list)
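
To make this concrete, here is a minimal sketch of the skeleton in Python.
The class names mirror the files above, but every signature here is my
assumption and entirely open to discussion:

    class NoAppendList(object):
        """Container that silently drops appends, when no history is wanted."""
        def append(self, item):
            pass

    class Optimizer(object):
        """Core skeleton: optimize() loops, iterate() is left to subclasses."""
        def __init__(self, function, step, criterion, step_size=1., record=None):
            self.function = function
            self.step = step                # a step object, see below
            self.criterion = criterion      # returns True when it is time to stop
            self.step_size = step_size      # the constant "step" multiplier
            # pass a real list here to keep every tested parameter set
            self.record = record if record is not None else NoAppendList()

        def iterate(self, parameters):
            raise NotImplementedError       # the "pure virtual" method

        def optimize(self, parameters):
            old = parameters
            while True:
                new = self.iterate(old)
                self.record.append(new)
                if self.criterion(old, new, self.function):
                    return new
                old = new

    class StandardOptimizer(Optimizer):
        """One plain step per iteration, scaled by step_size."""
        def iterate(self, parameters):
            return parameters - self.step_size * self.step(self.function, parameters)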

Some may wonder why the step is a class and not a function. It could be a
function, as I use functors, but I want a class so that state-based steps,
such as a Levenberg-Marquardt one, can be used as well.
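
To illustrate the difference, here is a rough sketch of a stateless step
next to a state-based one. GradientStep follows the file above;
DampedNewtonStep is only illustrative (a damped Newton update, not a full
Levenberg-Marquardt):

    import numpy

    class GradientStep(object):
        """Stateless step: the gradient direction, to be subtracted."""
        def __call__(self, function, parameters):
            return function.gradient(parameters)

    class DampedNewtonStep(object):
        """State-based step: the damping factor survives between calls."""
        def __init__(self, damping=1.):
            self.damping = damping

        def __call__(self, function, parameters):
            # solve (H + damping * I) d = g for the step direction d
            hessian = function.hessian(parameters)
            damped = hessian + self.damping * numpy.eye(hessian.shape[0])
            return numpy.linalg.solve(damped, function.gradient(parameters))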
Now, it is not very complicated, just a bunch of classes that are really
simple, but if this kind of structure is interesting, I'd like some comments
so that it can be made more Pythonic.
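
Reusing the sketches above, the Rosenbrock example could then read roughly
as follows. SimpleCriterion is borrowed from the C++ example quoted below;
the criteria actually shipped in Criterions may differ:

    import numpy

    class Rosenbrock(object):
        """The classic banana function, here in two dimensions."""
        def __call__(self, x):
            return (1. - x[0]) ** 2 + 100. * (x[1] - x[0] ** 2) ** 2

        def gradient(self, x):
            return numpy.array([
                -2. * (1. - x[0]) - 400. * x[0] * (x[1] - x[0] ** 2),
                200. * (x[1] - x[0] ** 2)])

    class SimpleCriterion(object):
        """Stop after a fixed number of iterations (stateful, like the steps)."""
        def __init__(self, max_iterations):
            self.max_iterations = max_iterations
            self.iteration = 0

        def __call__(self, old, new, function):
            self.iteration += 1
            return self.iteration >= self.max_iterations

    history = []
    optimizer = StandardOptimizer(Rosenbrock(), GradientStep(),
                                  SimpleCriterion(10000), step_size=0.0001,
                                  record=history)
    optimum = optimizer.optimize(numpy.array([-1.2, 1.]))
    # history now holds all 10000 tested parameter sets
    print(optimum)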

Matthieu

2007/2/27, Matthieu Brucher <matthieu.brucher at gmail.com>:
>
> Hi,
>
> I have been migrating toward Python for a few weeks, but at the moment I do
> not find in SciPy the tools I had to develop for my PhD. I can't, for
> instance, find an elegant way to save the set of parameters used in an
> optimization for the standard algorithms. What is more, I think they could
> be more generic.
>
> What I did in C++, and I'd like your opinion about porting it to Python,
> was to define a standard optimizer with no iteration loop (iterate was a
> pure virtual method called by an optimize method). This iteration loop was
> then defined for the standard optimizer or the damped optimizer. Each time,
> the tested parameters could be saved. Then, the step to be taken was an
> instance of a class that computed a gradient step, a Newton step, ..., and
> the same was done for the stopping criterion. The function was a class that
> defined value, gradient, hessian, ... as needed.
> For instance, a simplified instruction could have been:
>
>     Optimizer* optimizer = new StandardOptimizer</* some more template
>         parameters, not relevant in Python */>(function, GradientStep(),
>         SimpleCriterion(NbMaxIterations), step, saveParameters);
>     optimizer->optimize();
>     optimizer->getOptimalParameters();
>
> The "step" argument was a constant by which the computed step had to be
> multiplied, by default, it was 1.
>
> I know that this kind of writing is not as clear and lightweight as the
> current one, which is also used by Matlab. But perhaps this system can give
> the user more latitude. If people want, I can try making a real Python
> example...
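>
> A first rough transcription, with the naming only illustrative, could be:
>
>     optimizer = StandardOptimizer(function, GradientStep(),
>                                   SimpleCriterion(nb_max_iterations),
>                                   step, save_parameters)
>     optimizer.optimize()
>     optimal_parameters = optimizer.get_optimal_parameters()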
>
> Matthieu
>
> P.S.: sorry for the multiposting, I forgot that there were two MLs for
> SciPy :(
>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: optimizerProposal.tar.gz
Type: application/x-gzip
Size: 2496 bytes
Desc: not available
URL: <http://mail.python.org/pipermail/scipy-dev/attachments/20070308/18a24eb9/attachment.bin>

