[SciPy-dev] Proposal for more generic optimizers (posted before on scipy-user)

Pierre GM pgmdevlist at gmail.com
Thu Mar 22 14:49:10 EDT 2007


On Thursday 22 March 2007 13:37:59 Matthieu Brucher wrote:
> > Right now you require an object that implements methods with
> > certain names, which is ok but I think not perfect.  Here is
> > a possibly crazy thought, just to explore.  How about making
> > all arguments optional, and allowing passing either such an
> > object or the needed components?  Finally, to accommodate
> > existing objects with a different name structure, allow
> > passing a dictionary of attribute names.
>
> Indeed, that could be a solution. The only question remaining is how to use
> the dictionary, perhaps creating a fake object with the correct interface.
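
(Just to make the quoted idea concrete, a "fake object" built from a
dictionary of attribute names could be as simple as the sketch below; the
class name and the expected method names are invented for illustration.)

# Hypothetical sketch only: wrap an existing object together with a mapping
# {expected_name: actual_name} so that callers can use the names they expect.
class InterfaceAdapter(object):
    def __init__(self, obj, name_map):
        self._obj = obj
        self._map = name_map

    def __getattr__(self, name):
        # delegate to the wrapped object, translating the name if needed
        return getattr(self._obj, self._map.get(name, name))

# e.g. adapter = InterfaceAdapter(my_model, {'gradient': 'grad'})
# adapter.gradient(x) would then call my_model.grad(x)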

Just 2c:
I just ported the loess package to numpy (the package is available in the 
scipy SVN sandbox as 'pyloess'). The loess part has a C-based structure that 
looks like what you're trying to reproduce.

Basically, the package implements a new class, loess. The attributes of a 
loess object are themselves objects: one for the inputs, one for the outputs, 
and several others controlling different aspects of the estimation.

The __init__ method of a loess object requires two mandatory arguments, the 
independent variables (x) and the observed response (y); these are used to 
initialize the inputs object. The __init__ method also accepts optional 
keyword parameters that modify the values of the different estimation 
parameters. However, one can also modify these attributes directly.
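
For illustration, a rough Python skeleton of that layout might look like the
following (the attribute and parameter names here are invented, not the
actual pyloess API):

import numpy as np

# Purely illustrative skeleton -- the real pyloess classes wrap a C structure
# and use different attribute and parameter names.
class Inputs(object):
    def __init__(self, x, y):
        self.x = np.asarray(x)   # independent variables
        self.y = np.asarray(y)   # observed response

class Control(object):
    def __init__(self, **options):
        # hypothetical estimation parameters with default values
        self.span = options.get('span', 0.75)
        self.degree = options.get('degree', 2)

class Outputs(object):
    def __init__(self, n):
        # allocated at instantiation, filled only when fit() is called
        self.fitted_values = np.empty(n)
        self.residuals = np.empty(n)

class Loess(object):
    def __init__(self, x, y, **options):
        self.inputs = Inputs(x, y)         # two mandatory arguments
        self.control = Control(**options)  # optional keyword parameters
        self.outputs = Outputs(len(self.inputs.y))

    def fit(self):
        # the real implementation calls into the C loess code here; this
        # stub only fills the arrays to show the flow
        self.outputs.fitted_values[:] = self.inputs.y
        self.outputs.residuals[:] = 0.0

With such a layout one can write Loess(x, y, span=0.5), or just as well set
l.control.span = 0.5 on an existing instance.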

The outputs section is created when a new loess object is instantiated, but 
with empty arrays. The arrays are filled when the .fit() method of the loess 
object is called. 
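Continuing the same hypothetical sketch, the workflow would be roughly:

x = np.linspace(0., 1., 50)
y = np.sin(2 * np.pi * x)

l = Loess(x, y, span=0.5)
print(l.outputs.fitted_values)   # allocated, but not meaningful yet
l.fit()
print(l.outputs.fitted_values)   # now filled by the fit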

Maybe you would like to check the tests/test_pyloess.py file to get a better 
idea. I'm currently updating the docs and writing a small example.


