questions about solving equations in scipy

Robert Kern robert.kern at gmail.com
Tue Jun 12 21:30:35 EDT 2007


fdu.xiaojf at gmail.com wrote:
> Hi all,
> 
> I have two questions about scipy.

You're likely to get a better response from the scipy mailing list. Here, you'll
primarily get me, and I have to rush out right now.

  http://www.scipy.org/Mailing_Lists

> 1) When I was trying to solve a single-variable equation using scipy, I
> found two methods: scipy.optimize.fsolve, which is designed to find the
> roots of a polynomial,

No, it finds the roots of a non-linear system of N functions in N variables. The
documentation makes no mention of polynomials.
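
For example, a minimal sketch (the system and the starting guess here are
made up purely for illustration):

  from scipy.optimize import fsolve

  # Two equations in two unknowns: x**2 + y**2 = 1 and x - y = 0.
  def equations(p):
      x, y = p
      return [x**2 + y**2 - 1.0, x - y]

  root = fsolve(equations, [1.0, 1.0])   # [1.0, 1.0] is the initial guess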

> and scipy.optimize.newton, which is used for Scalar
> function root finding according to the help().

There's also brentq, brenth, ridder, and bisect for this problem.

> I have tried both, and it seemed that both worked well, and fsolve ran
> faster.
> 
> My question is, which is the right choice?

Whichever one works faster and more robustly for your problem. fsolve is
implemented in FORTRAN, which sometimes helps. I do recommend looking at
brentq and brenth if you can provide bounds rather than just an initial guess.
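
For instance, if you know an interval [a, b] on which the function changes
sign, brentq can use that bracket directly, whereas newton only takes an
initial guess. A rough sketch with a made-up function:

  from scipy.optimize import brentq, newton

  def f(x):
      return x**3 - 2.0*x - 5.0

  # brentq needs a bracket [a, b] with f(a) and f(b) of opposite sign.
  root_bracketed = brentq(f, 2.0, 3.0)

  # newton only needs a starting guess (a derivative is optional).
  root_guessed = newton(f, 2.0)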

> 2) I have to solve a linear equation, with the constraint that all
> variables should be positive. Currently I can solve this problem by
> manually adjusting the solution in each iteration after getting the
> solution with scipy.linalg.solve().
> 
> Is there a smarter way?

I don't think that's a well-defined problem. Either the (unique) solution
satisfies the constraint or it doesn't. Are you sure you don't want to find the
minimum-error solution that obeys the constraint, instead?
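
If the latter is what you want, a non-negative least-squares routine is the
usual tool. A sketch, assuming scipy.optimize.nnls is available in your
SciPy version (the A and b below are placeholders):

  import numpy as np
  from scipy.optimize import nnls

  A = np.array([[1.0, 2.0],
                [3.0, 4.0],
                [5.0, 6.0]])
  b = np.array([1.0, 2.0, 2.0])

  # Minimize ||A*x - b||_2 subject to x >= 0.
  x, residual_norm = nnls(A, b)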

-- 
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma
 that is made terrible by our own mad attempt to interpret it as though it had
 an underlying truth."
  -- Umberto Eco