[issue22477] GCD in Fractions

Mark Dickinson report at bugs.python.org
Wed Sep 24 09:58:12 CEST 2014


Mark Dickinson added the comment:

The current `gcd` definition is almost accidental, in that it just happens to be what's convenient for normalisation in the Fraction type.  If people are using it as a standalone implementation of gcd, independent of the fractions module, then defining the result to be always nonnegative is probably a little less surprising than the current behaviour.
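For context, the helper in question is essentially the classic Euclidean loop (a minimal sketch of that behaviour, not a verbatim copy of the module source):

    def gcd(a, b):
        # Classic Euclidean algorithm.  Because Python's % takes the
        # sign of its second operand, the result ends up with the same
        # sign as b (unless b == 0, in which case a is returned as-is).
        while b:
            a, b = b, a % b
        return a

    gcd(12, -8)   # -> -4, not 4
    gcd(-12, 8)   # -> 4

That sign-follows-the-denominator behaviour is exactly what Fraction normalisation wants, but it looks odd as a general-purpose gcd.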

BTW, I don't think there's a universally agreed definition for the extension of the gcd to negative numbers (and I certainly wouldn't take Rosen's book as authoritative: did you notice the bit where he talks about 35-bit machines being common?), so I don't regard the fractions module's definition as wrong, per se.  But I do agree that the behaviour you propose would be less surprising.
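For comparison, the proposed always-nonnegative convention is a one-line change to the sketch above (the name here is just for illustration):

    def gcd_nonnegative(a, b):
        # Same Euclidean loop, but normalise the sign at the end so the
        # result is always >= 0, matching the usual textbook convention.
        while b:
            a, b = b, a % b
        return abs(a)

    gcd_nonnegative(12, -8)   # -> 4
    gcd_nonnegative(0, 0)     # -> 0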

One other thought: if we're really intending for gcd to be used independently of the fractions module, perhaps it should be exposed as math.gcd.  (That would also give the opportunity for an optimised C version.)
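A sketch of how such a math.gcd might behave, assuming it adopted the nonnegative convention discussed above (an optimised C implementation could sit behind the same signature):

    import math

    # Hypothetical math.gcd, always returning a nonnegative result.
    math.gcd(12, -8)    # -> 4
    math.gcd(-12, -8)   # -> 4
    math.gcd(0, 0)      # -> 0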

----------

_______________________________________
Python tracker <report at bugs.python.org>
<http://bugs.python.org/issue22477>
_______________________________________
