[SciPy-User] scipy.optimize.leastsq question

ms devicerandom at gmail.com
Tue Jul 6 14:46:29 EDT 2010


On 06/07/10 19:19, Joshua Holbrook wrote:
> On Tue, Jul 6, 2010 at 10:02 AM, ms<devicerandom at gmail.com>  wrote:
>
>> On 29/06/10 09:59, Sebastian Walter wrote:
>>>>> Only use derivative free optimization methods if your problem is not
>>>>> continuous.
>>>>> If your problem is differentiable, you should compute the Jacobian
>>>>> yourself, e.g. with
>>>>>
>>>>> def myJacobian(x):
>>>>>       h = 10**-3
>>>>>       # do finite differences approximation
>>>>>       return ....
>>>>>
>>>>> and provide the Jacobian to
>>>>> scipy.optimize.leastsq(..., Dfun = myJacobian)
>>
>> Uh, I am a real newbie in this field, but I expected that the Jacobian was
>> needed only if there was an analytical expression for the derivatives; I
>> thought the leastsq routine calculated the finite difference
>> approximation by itself otherwise. So I never bothered providing an
>> "approximate" Jacobian. Or maybe I do not get what you mean by finite
>> differences.

> I say this not as someone intimately familiar with scipy.optimize, but as
> someone who has implemented a least squares-ish algorithm himself.
>
> You are almost certainly correct in that leastsq calculates an approximate
> Jacobian using a finite difference method on its own. However, if you can
> symbolically differentiate your problem without too much heartache, then
> supplying an exact Jacobian is probably preferable due to higher precision
> and fewer function evaluations (f(x) and f(x+h), differenced and normalized,
> vs. simply f'(x)).
>
> On the other hand: When I implemented my algorithm (nearly two years ago),
> my equations were pretty nasty. My derivatives just happened to be much much
> worse (as can be seen at
> http://modzer0.cs.uaf.edu/~jesusabdullah/gradients.html, at least for a
> little while), and at the time sympy honestly wasn't production-ready. So, I
> ended up using a finite difference method to calculate them (I believe I
> used scipy's derivative function), with which I did have to tweak step
> sizes.
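
(For what it's worth, my understanding of the exact-Jacobian route would be
something like the toy sketch below. The exponential model, the data, and the
parameters a and b are invented by me just for illustration; they are not from
this thread.)

import numpy as np
from scipy.optimize import leastsq

# Toy data for an exponential decay model y = a * exp(-b * x).
x = np.linspace(0, 4, 50)
y = 2.5 * np.exp(-1.3 * x)

def residuals(p, x, y):
    a, b = p
    return y - a * np.exp(-b * x)

def jac_exact(p, x, y):
    # Hand-derived Jacobian of the residuals. With the default
    # col_deriv=0, leastsq expects shape (m, n): one column per parameter.
    a, b = p
    e = np.exp(-b * x)
    # d(residual)/da = -exp(-b*x);  d(residual)/db = a*x*exp(-b*x)
    return np.column_stack([-e, a * x * e])

p_fit, ok = leastsq(residuals, [1.0, 1.0], args=(x, y), Dfun=jac_exact)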

Thank you. What you say is very similar to what I have always 
understood. But I was confused by the pseudocode that Sebastian 
Walter provided:

 >> On 29/06/10 09:59, Sebastian Walter wrote:
 >>>>> If your problem is differentiable, you should compute the Jacobian
 >>>>> yourself, e.g. with
 >>>>>
 >>>>> def myJacobian(x):
 >>>>>       h = 10**-3
 >>>>>       # do finite differences approximation
 >>>>>       return ....
 >>>>>
 >>>>> and provide the Jacobian to
 >>>>> scipy.optimize.leastsq(..., Dfun = myJacobian)

which explicitly says you can provide a finite-difference 
approximation yourself, so I am unsure what the point of doing so would be.
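
Filling in the pseudocode myself, I guess he means something like the
following forward-difference sketch (the residuals function, the toy data,
and the step size h are my assumptions, not his):

import numpy as np
from scipy.optimize import leastsq

def residuals(p, x, y):
    a, b = p
    return y - a * np.exp(-b * x)

def myJacobian(p, x, y, h=1e-3):
    # Forward-difference approximation: column j holds d(residuals)/d(p_j),
    # the (m, n) layout leastsq expects with the default col_deriv=0.
    p = np.asarray(p, dtype=float)
    r0 = residuals(p, x, y)
    J = np.empty((r0.size, p.size))
    for j in range(p.size):
        dp = np.zeros_like(p)
        dp[j] = h
        J[:, j] = (residuals(p + dp, x, y) - r0) / h
    return J

x = np.linspace(0, 4, 50)
y = 2.5 * np.exp(-1.3 * x)
p_fit, ok = leastsq(residuals, [1.0, 1.0], args=(x, y), Dfun=myJacobian)

If that is right, it seems to be the same finite differences leastsq would
compute internally, only with the step size under my control, which is
exactly what confuses me.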

thanks,
M.


