[SciPy-user] fmin won't optimize

Travis Oliphant oliphant at ee.byu.edu
Sat Nov 6 23:58:19 EST 2004


Yaroslav Bulatov wrote:

>Hi
>
>I'm fitting a Gibbs distribution to data using maximum likelihood. If I
>use fmin_powell, it works, but when I use fmin, it terminates after one
>iteration regardless of the starting point. The log-likelihood function
>is convex. Any idea why this would happen?
>
>The code below terminates with a negative log-likelihood of 4.158883 if
>I use fmin, but 3.296169 if "fmin" is replaced with "fmin_powell".
>
># Train simple 3 parameter Gibbs Distribution
>
>import math
>from scipy import *
>from scipy.optimize import *
>
>train_set=[(0,0),(0,1),(1,1)]
>#test_set=[(0,0),(0,1),(1,1)]
>
># Negative log-likelihood
>def neg_ll(lambdas):
>  unnormalized = lambda x: \
>    math.exp(lambdas[0]*x[0]+lambdas[1]*x[1]+lambdas[2]*x[0]*x[1])

Note, this line break needed to be fixed (the lambda rejoined onto one
logical line) for it to work.

>  Z = sum([unnormalized(x) for x in [(0,0),(0,1),(1,0),(1,1)]])
>  ll = 0
>  for x in train_set:
>    ll+=math.log(unnormalized(x))-math.log(Z)
>  return -ll
>
>def main():
>  x0 = [0,0,0]
>  xopt = fmin(neg_ll, x0)
>  print 'Before training: %f' %(neg_ll(x0),)
>  print 'After training: %f' %(neg_ll(xopt),)
>
>if __name__=='__main__': main()
>
>  
>
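
Reading the model off the code above, neg_ll is the negative
log-likelihood of a three-parameter Gibbs distribution on binary pairs
x = (x_1, x_2):

    p(x; \lambda) = \frac{1}{Z(\lambda)} \exp(\lambda_1 x_1 + \lambda_2 x_2 + \lambda_3 x_1 x_2),
    \qquad
    Z(\lambda) = \sum_{x \in \{0,1\}^2} \exp(\lambda_1 x_1 + \lambda_2 x_2 + \lambda_3 x_1 x_2),

and neg_ll returns -\sum_{x \in \mathrm{train\_set}} \log p(x; \lambda).
With lambdas = [0, 0, 0] every configuration has probability 1/4, so the
starting value is 3 \log 4 = 4.158883, which matches the fmin value
quoted in the message above.
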
I tried your code and it worked like this for me:

 >>> xopt = fmin_powell(neg_ll,x0)
Optimization terminated successfully.
         Current function value: 3.296169
         Iterations: 7
         Function evaluations: 168
 >>> print xopt
[-8.2385 -0.0122  8.254 ]

 >>> xopt = fmin(neg_ll,x0)
Optimization terminated successfully.
         Current function value: 3.295837
         Iterations: 269
         Function evaluations: 510
 >>> print xopt
[-34.2701  -0.      34.2701]
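
If fmin really does stop after a single iteration on your side, it is
worth checking which scipy version you have and passing the convergence
options explicitly so an early stop shows up in the termination report.
A minimal sketch (keyword names as in scipy.optimize.fmin; the tolerance
and limit values here are only examples):

  from scipy.optimize import fmin

  x0 = [0, 0, 0]
  # Explicit tolerances and iteration/function-evaluation limits;
  # disp=1 prints the termination report shown above.
  xopt = fmin(neg_ll, x0, xtol=1e-6, ftol=1e-6,
              maxiter=1000, maxfun=2000, disp=1)
  print 'After training: %f' % (neg_ll(xopt),)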




