[SciPy-dev] Proposal for more generic optimizers (posted before on scipy-user)

Alan G Isaac aisaac at american.edu
Sun Mar 11 16:07:11 EDT 2007


>> 4. I understand that you want an object that provides 
>>    the function, gradient, and hessian.  But when you 
>>    make a class for these, it is full of (effectively) 
>>    class functions, which suggests just using a module. 


On Sun, 11 Mar 2007, Matthieu Brucher apparently wrote: 
> It's not only a module, it is a real class, with a state. 
> For instance, an approximation function can need a set of 
> points that will be stored in the class, and a module is 
> not enough to describe it - a simple linear approximation 
> with a robust cost function for instance - 

This seems to be a different question?
One question is that of optimizer design:
should the optimizer take as an argument a single
object that provides a certain mix of services, or
should it instead take as arguments the functions
providing those services?
I am not sure; I am just exploring it.
I am used to optimizers that take a function,
a gradient procedure, and a Hessian procedure as
arguments.  I am just asking whether *requiring*
these to be bundled is the right thing to do.
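
To make the contrast concrete, here is a rough sketch
(plain Newton steps on a toy quadratic; the names are
made up for illustration and are not existing SciPy
functions):

    import numpy as np

    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])

    def f(x):     return 0.5 * np.dot(x, np.dot(A, x)) - np.dot(b, x)
    def grad(x):  return np.dot(A, x) - b
    def hess(x):  return A

    # Style A: unbundled -- each service is a separate argument.
    def newton_separate(f, grad, hess, x0, steps=5):
        x = np.asarray(x0, dtype=float)
        for _ in range(steps):
            x = x - np.linalg.solve(hess(x), grad(x))
        return x

    # Style B: bundled -- the optimizer requires one object
    # providing f, gradient, and hessian.
    class QuadraticProblem:
        def f(self, x):        return f(x)
        def gradient(self, x): return grad(x)
        def hessian(self, x):  return hess(x)

    def newton_bundled(problem, x0, steps=5):
        x = np.asarray(x0, dtype=float)
        for _ in range(steps):
            x = x - np.linalg.solve(problem.hessian(x),
                                    problem.gradient(x))
        return x

    print(newton_separate(f, grad, hess, [0.0, 0.0]))
    print(newton_bundled(QuadraticProblem(), [0.0, 0.0]))

The two loops are identical; the only difference is how
the services reach the optimizer.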

Taking the services as separate arguments would not mean
that I cannot pass the methods of some object.  (I think
this responds to your objection.)
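
For example (a hypothetical RobustLinearFit class, in the
spirit of your robust linear approximation; the state
lives in the object, but the optimizer only ever sees
plain callables):

    import numpy as np
    from scipy.optimize import fmin_bfgs

    class RobustLinearFit:
        """Stateful object: stores the data points and exposes
        a Huber-style robust cost and its gradient as methods."""
        def __init__(self, points, targets):
            self.points = np.asarray(points, dtype=float)
            self.targets = np.asarray(targets, dtype=float)
        def cost(self, w):
            r = np.dot(self.points, w) - self.targets
            return np.sum(np.where(np.abs(r) < 1.0,
                                   0.5 * r**2, np.abs(r) - 0.5))
        def gradient(self, w):
            r = np.dot(self.points, w) - self.targets
            return np.dot(self.points.T, np.clip(r, -1.0, 1.0))

    fit = RobustLinearFit([[1.0, 1.0], [2.0, 1.0], [3.0, 1.0]],
                          [1.1, 2.0, 2.9])

    # The bound methods carry the object's state with them,
    # so an optimizer with the unbundled signature can use
    # them directly:
    w = fmin_bfgs(fit.cost, np.zeros(2), fprime=fit.gradient)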

Note that requiring bundling imposes an interface 
requirement on the bundling object.  This is not true
if I just provide the functions/methods as arguments.
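
Concretely, if the optimizer insists on one object with,
say, .f, .gradient, and .hessian methods (an interface I
am inventing here just for illustration), then anyone
whose functions do not already come packaged that way
has to write an adapter first:

    class FunctionBundle:
        """Hypothetical adapter repackaging loose callables
        into the interface a bundled optimizer requires."""
        def __init__(self, f, gradient, hessian=None):
            self._f = f
            self._gradient = gradient
            self._hessian = hessian
        def f(self, x):        return self._f(x)
        def gradient(self, x): return self._gradient(x)
        def hessian(self, x):  return self._hessian(x)

    # bundle = FunctionBundle(fit.cost, fit.gradient)
    # newton_bundled(bundle, x0)   # vs newton_separate(fit.cost, fit.gradient, ...)

With unbundled arguments, no such wrapper is needed.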

> Perhaps a more precise example of the usefulness is needed?

Perhaps so.

Cheers,
Alan Isaac




