[SciPy-dev] Proposal for more generic optimizers (posted before on scipy-user)

Michael McNeil Forbes mforbes at physics.ubc.ca
Wed Apr 18 02:04:22 EDT 2007


Okay, I think we are thinking similar things with different terminology:

I think you are saying that only one object should maintain state
(your "optimizer"); I was originally sharing the state between
objects, which I agree can cause problems.  If so, I agree, but it
seems to me that this object should be called a "partially optimized
function".  I think of an "optimizer" as something that modifies
state rather than something that maintains state.  I am thinking of
code like:

------
roughly_locate_minimum = Optimizer(criterion=extremelyWeakCriterion,
                                   step=slowRobustStep, ...)
find_precise_minimum = Optimizer(criterion=preciseCriterion,
                                 step=fasterStep, ...)

f = Rosenbrock(...)
x0 = ...

f_min = OptimizedFunction(f,x0)
f_min.optimize(optimizer=roughly_locate_minimum)
f_min.optimize(optimizer=find_precise_minimum)

# OR (this reads better to me, but the functions should return copies
# of f_min, so may not be desirable for performance reasons):
f_min = roughly_locate_minimum(f_min)
f_min = find_precise_minimum(f_min)

# Then one can query f_min for results:
print f_min.x   # Best current approximation to the optimum
print f_min.f   # Value of f at f_min.x
print f_min.err # Estimated error
print f_min.df  # Derivative information at f_min.x
# etc...
-----

The f_min object keeps track of all state, can be passed from one
optimizer to another, etc.  In my mind, it is simply an object that
has accumulated information about a function.  The motivating case I
have in mind is that f is extremely expensive to compute, so the
stateful object f_min accumulates more and more information as it
goes along.  Ultimately this information could be used in many ways,
for example:

- f_min could keep track of roughly how long it takes to compute
f(x), thus providing estimates of the time required to complete a
calculation.
- f_min could keep track of past values and use interpolation to
provide fast guesses, etc.  (A rough sketch of such a class follows
below.)
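
To make this concrete, here is a minimal sketch of what such an
OptimizedFunction class might look like.  Everything below (the
attribute names, the timing bookkeeping, the nearest-neighbour
stand-in for interpolation) is only my guess at an interface, not
settled design:

------
import time
import numpy as np

class OptimizedFunction(object):
    """Accumulates information about an expensive function f."""

    def __init__(self, f, x0):
        self._f = f            # the underlying (expensive) function
        self.history = []      # all (x, f(x)) pairs evaluated so far
        self.total_time = 0.0  # seconds spent inside f
        self.x = np.asarray(x0)
        self.f = self(self.x)  # best value found so far

    def __call__(self, x):
        # Time and record every evaluation, so the object can
        # estimate costs and reuse past values.
        start = time.time()
        fx = self._f(x)
        self.total_time += time.time() - start
        self.history.append((np.asarray(x), fx))
        return fx

    def mean_eval_time(self):
        # Rough per-call cost of f, for predicting how long an
        # optimization run will take.
        return self.total_time / max(len(self.history), 1)

    def cheap_guess(self, x):
        # Crude stand-in for interpolation: return the stored value
        # at the closest previously evaluated point.
        xk, fk = min(self.history,
                     key=lambda xf: np.linalg.norm(xf[0] - np.asarray(x)))
        return fk

    def optimize(self, optimizer):
        # Delegate to a stateless optimizer, which reads and updates
        # self.x and self.f in place.
        optimizer.optimize(self)
-----

With something like this, f_min.optimize(optimizer=roughly_locate_minimum)
just hands f_min to the optimizer, which mutates it and leaves the
accumulated history behind for the next optimizer.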

Does this mesh with your idea of an "optimizer"?  I think the two
designs are strictly equivalent, but the line of code
"optimizer.optimize()" tells me much less at a glance than
"f_min.optimize(optimizer=...)".

What would your ideal "user" code look like for the above use-case?

I will try to flesh out a more detailed structure for the  
OptimizedFunction class,
Michael.

On 16 Apr 2007, at 12:27 PM, Matthieu Brucher wrote:

> ...
> OK, that is not what a function is :)
> A function is the set of f, df, ddf, but without the options.  What
> you are exposing is the construction of an optimizer ;)
>
> Did you see the code in my different proposals?
> In fact, you have a function class - for instance, a Rosenbrock
> class - that defines several methods, like gradient, hessian, ...,
> without a real state.  (A real state is something other than, for
> instance, the number of dimensions of the Rosenbrock function or
> the points that need to be approximated; a real state is something
> that depends on the subsequent calls to the functor, the
> gradient, ...)  Keeping the function stateless means it can be
> reused efficiently.
> Then you use an instance of this class as the object to be optimized.
> You choose your step mode, with its parameters, like gradient,
> conjugate gradient, Newton, quasi-Newton, ...
> You choose your line search, with its own parameters -
> tolerance, ... - like section methods, interpolation methods, ...
> Finally, you choose your stopping criterion.
> Then you write something like:
>
> optimizer = StandardOptimizer(function=myFunction, step=myStep, ...)
> optimizer.optimize()
>
> That is a modular design, and that is why some basic functions must
> be provided, so that people who don't care about the underlying
> design really do not have to care.  Then if someone wants a
> specific, non-standard optimizer, one just has to select the wanted
> modules - for instance, a conjugate gradient with a golden-section
> line search and a relative-value criterion instead of a Fibonacci
> search and an absolute criterion.
>
> It can be more cumbersome at the start, but once some modules are
> made, assembling them will be easier, and tests will be more fun :)
> ...
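
To make the modular assembly above concrete, here is a small runnable
sketch in the spirit of what Matthieu describes.  None of the names
(StandardOptimizer, GradientStep, RelativeValueCriterion, ...) come
from an existing API; they are only guesses at what such swappable
modules might look like, with a plain gradient step standing in for
the fancier choices:

------
import numpy as np

class RelativeValueCriterion(object):
    """Stop when the relative change in f between iterates is small."""
    def __init__(self, ftol=1e-10, max_iter=10000):
        self.ftol = ftol
        self.max_iter = max_iter
    def __call__(self, iteration, f_old, f_new):
        if iteration >= self.max_iter:
            return True
        return abs(f_new - f_old) <= self.ftol * max(abs(f_old), 1.0)

class GradientStep(object):
    """Steepest-descent step with a fixed step length."""
    def __init__(self, alpha=1e-3):
        self.alpha = alpha
    def __call__(self, function, x):
        return x - self.alpha * function.gradient(x)

class StandardOptimizer(object):
    """Drives the loop; the step and criterion modules are swappable."""
    def __init__(self, function, step, criterion):
        self.function = function
        self.step = step
        self.criterion = criterion
    def optimize(self, x0):
        x = np.asarray(x0, dtype=float)
        f_old = self.function(x)
        iteration = 0
        while True:
            x = self.step(self.function, x)
            f_new = self.function(x)
            iteration += 1
            if self.criterion(iteration, f_old, f_new):
                return x
            f_old = f_new

class Rosenbrock(object):
    """Stateless function object: value and gradient, no iteration
    state, so it can be reused freely."""
    def __call__(self, x):
        return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    def gradient(self, x):
        return np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                         200*(x[1] - x[0]**2)])

optimizer = StandardOptimizer(function=Rosenbrock(),
                              step=GradientStep(alpha=1e-3),
                              criterion=RelativeValueCriterion())
x_min = optimizer.optimize([-1.0, 1.0])
-----

Swapping in a conjugate-gradient step or a golden-section line search
would then only mean passing different modules to StandardOptimizer.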


