[SciPy-user] optimization question

dmitrey openopt at ukr.net
Wed Jul 4 03:28:44 EDT 2007


Volker Lorrmann wrote:
> Hi Robert,
>
>  > All leastsq() really adds to fmin*() is that you return the residuals 
>  > instead of summing up the squares of the residuals. You can do that
>  > manually and use whichever minimizer you need.
>
> Can you give me a short example of how to do so? I've also asked
> Matthieu this, so if he answers, you probably don't need to. But two
> examples would be better than one ;)
>
>  > Well, that's a big problem. You have twice as many variables to fit
>  > as you have datapoints. There are possibly an infinite number of
>  > solutions that exactly(!) fit your data.
>
> I see; indeed I need as many datapoints (equations) as variables to
> solve the problem.
>
>  > Is it possible that you can reparameterize your problem? Perhaps
>  > c(.) and d(.) could be formulated as functions that depend on some
>  > small number of parameters besides x_i in order to smooth things
>  > out. You would then do least-squares to optimize those parameters.
>
> I think I can do that. There are some physical constraints on c(x_i)
> and d(x_i) which should make it possible to reparameterize c and d.
>
> volker
>
>   
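To illustrate Robert's suggestion above, here is a minimal sketch (the
linear model and the data are made up for the example). leastsq() takes
a function that returns the residual vector; fmin() takes a scalar
objective, so you sum the squared residuals yourself:

import numpy as np
from scipy.optimize import leastsq, fmin

# made-up data for a linear model y = a*x + b
x = np.linspace(0.0, 10.0, 20)
y = 2.0 * x + 1.0

def residuals(p, x, y):
    a, b = p
    return y - (a * x + b)

# leastsq() minimizes the residual vector directly
p_lsq, ier = leastsq(residuals, [1.0, 0.0], args=(x, y))

# fmin() needs a scalar objective, so sum the squares manually
def sum_of_squares(p, x, y):
    r = residuals(p, x, y)
    return np.dot(r, r)

p_fmin = fmin(sum_of_squares, [1.0, 0.0], args=(x, y))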
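And for the reparameterization idea, a sketch along the same lines (the
quadratic forms for c and d, and the model itself, are purely
illustrative): instead of fitting a free value of c(x_i) and d(x_i) at
every datapoint, express them through a few shared coefficients and fit
those, so the problem becomes overdetermined:

import numpy as np
from scipy.optimize import leastsq

# hypothetical: describe c(x) and d(x) by three coefficients each
def c(x, p):
    return p[0] + p[1] * x + p[2] * x**2

def d(x, p):
    return p[3] + p[4] * x + p[5] * x**2

def residuals(p, x, y):
    # placeholder model y_i = c(x_i) + d(x_i)*x_i; substitute your own
    return y - (c(x, p) + d(x, p) * x)

x = np.linspace(0.0, 5.0, 50)   # 50 datapoints, only 6 parameters
y = np.sin(x)                   # placeholder data
p_fit, ier = leastsq(residuals, np.zeros(6), args=(x, y))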
As one more solution: for a constrained NLP problem you can try the
"lincher" solver from the scikits.openopt toolkit (free, BSD license).
There is also the "ralg" solver, which is good for both NLP and
non-smooth problems, but it is currently suitable for unconstrained
problems only.
See an example of lincher usage here:
http://openopt.blogspot.com/2007/06/constrained-nlp-example-with-gradients.html
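
For reference, a minimal lincher call might look like the sketch below.
This is written from memory of the scikits.openopt NLP interface (and
the Python 2 syntax of that era), so treat the names as assumptions and
check them against the post above:

from scikits.openopt import NLP

# objective and one nonlinear inequality constraint c(x) <= 0
f = lambda x: (x[0] - 1.0)**2 + (x[1] - 2.0)**2
c = lambda x: x[0]**2 + x[1]**2 - 4.0

p = NLP(f, [0.0, 0.0], c=c)   # second argument is the start point x0
r = p.solve('lincher')
print r.xf, r.ff              # solution point and objective value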

> BTW, don't people on this mailing list ever sleep?
> It's 2:00 AM here in Birmingham, Alabama.

>     James Phillips
>     http://zunzun.com

Here in Ukraine it's 10:27 AM, time to work :)

Regards, Dmitrey.
