[SciPy-User] Sometimes fmin_l_bfgs_b tests NaN parameters and then fails to converge

Skipper Seabold jsseabold at gmail.com
Sat Jan 1 11:38:41 EST 2011


On Fri, Dec 31, 2010 at 7:39 PM, Yury V. Zaytsev <yury at shurup.com> wrote:
> On Fri, 2010-12-31 at 16:35 -0500, josef.pktd at gmail.com wrote:
>
>> But your function has a discontinuity, and I wouldn't expect a bfgs
>> method to produce anything useful since the method assumes smoothness,
>>  as far as I know.
>
> You are perfectly right about the discontinuity, but that was not the
> point. I was rather interested in whether anyone else is seeing the
> optimizer try out NaNs as function parameters, as in my case...
>
> I have this problem with a completely different (smooth and
> differentiable) function, the test script is just something I came up
> with without thinking too much to illustrate the problem.
>

I don't see the NaNs (on 64-bit), but I have run into what may be a
similar problem recently.  I switched from fmin_l_bfgs_b to fmin_tnc
and was able to fine-tune the step size in the line search (eta in
tnc).  From a brief look at the code, the step size for bfgs is
adaptive and determined internally, and I don't see a sensible way to
change it.  For tnc, I start with eta = 1e-8 and, whenever I get
return code 4, rerun with eta *= 10.  The return codes for tnc are in

>>> from scipy.optimize.tnc import RCSTRINGS
>>> RCSTRINGS
{-1: 'Infeasible (low > up)',
 0: 'Local minima reach (|pg| ~= 0)',
 1: 'Converged (|f_n-f_(n-1)| ~= 0)',
 2: 'Converged (|x_n-x_(n-1)| ~= 0)',
 3: 'Max. number of function evaluations reach',
 4: 'Linear search failed',
 5: 'All lower bounds are equal to the upper bounds',
 6: 'Unable to progress',
 7: 'User requested end of minimization'}
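A minimal sketch of that retry strategy, assuming a hypothetical smooth
objective (the quadratic below stands in for whatever you are actually
minimizing) and the old fmin_tnc interface; newer SciPy exposes the same
solver as minimize(method='TNC') with eta passed via options:

```python
import numpy as np
from scipy.optimize import fmin_tnc

# Hypothetical smooth objective and its gradient, for illustration only.
def objective(params):
    return float(np.sum((params - 1.0) ** 2))

def gradient(params):
    return 2.0 * (params - 1.0)

x0 = np.zeros(3)
eta = 1e-8
while True:
    # fmin_tnc returns (solution, number of function evaluations, return code)
    x, nfeval, rc = fmin_tnc(objective, x0, fprime=gradient, eta=eta, disp=0)
    if rc != 4 or eta >= 1.0:  # 4 == 'Linear search failed' (see RCSTRINGS)
        break
    eta *= 10  # line search failed: retry with a larger step size
```

The eta >= 1.0 guard is just a safety cap so the loop cannot run forever
if the line search keeps failing.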

Curious if this approach might work for you.

Skipper
