Mathematics in Python is not correct

Lou Pecora pecora at anvil.nrl.navy.mil
Sat May 10 11:06:56 EDT 2008


In article <mailman.882.1210361735.12834.python-list at python.org>,
 "Terry Reedy" <tjreedy at udel.edu> wrote:

> "Lou Pecora" <pecora at anvil.nrl.navy.mil> wrote in message 
> news:pecora-DFE713.11234209052008 at ra.nrl.navy.mil...
> | In article <mailman.825.1210293599.12834.python-list at python.org>,
> | "Terry Reedy" <tjreedy at udel.edu> wrote:
> |
> | > "Luis Zarrabeitia" <kyrie at uh.cu> wrote in message
> | > news:200805081914.06459.kyrie at uh.cu...
> | > | Btw, there seems to be a math problem in python with 
> exponentiation...
> | > | >>> 0**0
> | > | 1
> | > | That 0^0 should be a nan or exception, I guess, but not 1.
> | >
> | > a**b is 1 multiplied by a, b times.  1 multiplied by 0 no times is 1.
> | > But there are unenlighted people who agree with you ;-)
> | > Wikipedia has a discussion of this.
> | >
> | > tjr
> |
> | I like that argument better.  But...
> |
> | I've also heard the very similar a**b is a multiplied by a b-1 times.
> 
> Me too, in school, but *that* definition is incomplete: it excludes b=0 and 
> hence a**0 for all a.  It was the best people could do before 0 was known.
> But 0 was introduced to Europe just over 800 years ago ;-)

[cut some interesting examples]
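
Terry's definition above ("1 multiplied by a, b times") is easy to check
directly in Python.  Here is a quick sketch (the function name is just
mine, for illustration); for non-negative integer b it agrees with the
built-in ** operator, including 0**0 == 1:

# a**b as "1 multiplied by a, b times" (b a non-negative integer)
def power_by_iteration(a, b):
    result = 1
    for _ in range(b):
        result *= a
    return result

power_by_iteration(0, 0)   # -> 1, same as 0**0
power_by_iteration(2, 5)   # -> 32, same as 2**5
power_by_iteration(0, 3)   # -> 0, same as 0**3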

Yes, I was also thinking about the b=0 case, and then about the case b<0.  
If you solve the b<0 case, you also solve the b=0 case for a != 0.  Define

a**b = a multiplied by 1/a, |b-1| times, for b <= 0 (and a != 0).

Then for b=0 we get a*(1/a) = 1, and for b=-1 we get a*(1/a)*(1/a) = 1/a, 
as expected.
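
And a quick numerical check of that extension (again only a sketch, with
a function name of my own choosing):

# a**b for integer b: "1 multiplied by a, b times" when b > 0,
# "a multiplied by 1/a, |b-1| times" when b <= 0 (requires a != 0).
def power_with_negatives(a, b):
    if b > 0:
        result = 1
        for _ in range(b):
            result *= a
    else:
        result = a
        for _ in range(abs(b - 1)):
            result *= 1.0 / a
    return result

power_with_negatives(2, -3)   # -> 0.125, same as 2**-3
power_with_negatives(5, 0)    # -> 1.0, same as 5**0
power_with_negatives(2, 4)    # -> 16, same as 2**4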

Of course, I can avoid all this mathematical dancing around by using 
some of the other simpler definitions like the original one.  :-)

-- 
-- Lou Pecora


