[SciPy-User] Using leastsq(), fmin(), anneal() to do a least squares fit

Matthieu Brucher matthieu.brucher at gmail.com
Wed Jun 30 15:11:08 EDT 2010


Perhaps your minimum is numerically unstable, or the cost function is
more or less constant around the global minimum? With 12 variables, you
may also have several local minima in which you can get trapped.
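One cheap way to check for local minima is to restart leastsq() from several random initial guesses and compare the results. A minimal sketch (the two-parameter residual function here is a hypothetical stand-in for your 12-variable toBeMinimized()):

```python
import numpy as np
from scipy.optimize import leastsq

def to_be_minimized(params):
    # Hypothetical stand-in residuals; replace with your own function
    # that takes the 12-parameter vector and returns the residual array.
    a, b = params
    x = np.linspace(0.0, 1.0, 50)
    return np.sin(5.0 * a) + (x * b - x**2) - 0.1

rng = np.random.default_rng(0)
best = None
for _ in range(20):
    # Random restart within a plausible box for the parameters.
    x0 = rng.uniform(-2.0, 2.0, size=2)
    p, ier = leastsq(to_be_minimized, x0)
    cost = np.sum(to_be_minimized(p) ** 2)
    if best is None or cost < best[1]:
        best = (p, cost)

print("best params:", best[0], "cost:", best[1])
```

If the restarts converge to clearly different parameter vectors with similar costs, that is strong evidence the spread you see comes from local minima rather than from a bug.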

If you want more advanced optimization tools, you may try OpenOpt or
scikits.optimization.
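Also note that your scalar wrapper is the right way to hand a least-squares problem to fmin(): leastsq() wants the residual vector, while fmin() wants the summed squares. On a well-behaved problem, both should land at the same minimum when started from the same point. A sketch with a hypothetical two-parameter linear fit (names are illustrative, not your actual code):

```python
import numpy as np
from scipy.optimize import fmin, leastsq

def residuals(params):
    # Hypothetical residuals: fit a line a*x + b to exp(-x) on [0, 1].
    a, b = params
    x = np.linspace(0.0, 1.0, 50)
    return a * x + b - np.exp(-x)

def scalar_cost(params):
    # The wrapper fmin() and anneal() need: sum of squared residuals.
    return np.sum(residuals(params) ** 2)

p_lsq, ier = leastsq(residuals, [0.0, 0.0])
p_fmin = fmin(scalar_cost, [0.0, 0.0], disp=False)
# From the same start, both should converge to (nearly) the same point.
print("leastsq:", p_lsq, "fmin:", p_fmin)
```

If leastsq() and fmin() disagree even on this kind of test from identical starting points, the cost surface is probably ill-conditioned or multimodal, and a global method (or the multi-start approach above) is worth trying.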

Matthieu

2010/6/30 Andreas <lists at hilboll.de>:
> Hi there,
>
> I have an optimization problem in 12 variables.
>
> I first wrote a function toBeMinimized(), which takes these 12
> variables as one array. Trying to solve this problem with leastsq(), I
> noticed that however I play around with the parameters, the function
> does not seem to find the global optimum.
>
> So I figured I'd try some other functions from scipy.optimize, starting
> with anneal(). I wrote a wrapper function around my original
> toBeMinimized(), doing nothing but call
> np.sum(toBeMinimized(params)**2). Now, however, the results I get from
> anneal vary widely, and don't seem to have anything in common with the
> results from leastsq().
>
> Basically the same happens when I use fmin() instead of anneal().
>
> I'm somewhat at a loss here. leastsq() seems to give the most consistent
> results, but they still vary too much to be useful for me.
>
> Any ideas?
>
> Thanks for your insight,
>
> Andreas.
> _______________________________________________
> SciPy-User mailing list
> SciPy-User at scipy.org
> http://mail.scipy.org/mailman/listinfo/scipy-user
>



-- 
Information System Engineer, Ph.D.
Blog: http://matt.eifelle.com
LinkedIn: http://www.linkedin.com/in/matthieubrucher
