How about adding rational fraction to Python?

Jeff Schwab jeff at schwabcenter.com
Sun Mar 2 18:21:31 EST 2008


Paul Rubin wrote:
> Jeff Schwab <jeff at schwabcenter.com> writes:
>> Better yet, how hard would it be to define an otherwise int-like type
>> that did not define a non-flooring division operator?  Are there any
>> real use cases for such a type?
> 
> User defined types in python are fairly heavyweight compared with the
> built-in types,

Yet they continue to form the basis of almost all non-trivial Python 
programs.  Anyway, it's a bit soon to be optimizing. :)


> and a type like that is just another thing for the
> user to have to remember.

How so?  A well-written function generally shouldn't depend on the
exact types of its arguments, anyway.  If someone has written a function
to find (e.g.) the median of a collection of numbers, their code should
already be prepared to accept values of user-defined numeric types.  If
I want to call such a function with my hand-rolled DivisionSafeInteger
type, it should just work, unless it is specifically documented to work
only with a particular subset of Python data types.
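
For instance, here is a minimal sketch of such a median function (the
function itself and the use of decimal.Decimal are mine, purely for
illustration).  Because it relies only on ordering and arithmetic, it
works as well with a library-provided or hand-rolled numeric type as
with plain ints:

     from decimal import Decimal

     def median(values):
         # Relies only on sorting, addition and division, so any
         # type supporting those operations will do.
         ordered = sorted(values)
         n = len(ordered)
         if n % 2:
             return ordered[n // 2]
         return (ordered[n // 2 - 1] + ordered[n // 2]) / 2

     print(median([1, 9, 4]))                    # plain ints -> 4
     print(median([Decimal("1.5"), Decimal("4"),
                   Decimal("2"), Decimal("3")])) # -> Decimal('2.5')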


> The C library has a bunch of different types like off_t (offset in a

off_t is a POSIX extension, not part of the standard C library.  With
gcc on GNU/Linux, glibc's headers define it as a typedef for __off_t,
which is tied through a chain of internal names (_G_off_t, _IO_off_t)
to an implementation-specific integer type.  Elsewhere, it may simply
be a typedef for long int.


> file) and size_t, so if you pass an off_t to a function that expects a
> size_t as that arg, the compiler notices the error.

On what compiler?  I've never seen a C compiler that would object to an
assignment or calculation mixing two native integer types.

     $ cat main.c
     #include <sys/types.h>
     int main() { off_t ot = 0; long li = 3L; ot = li; }
     $ make
     cc -ansi -pedantic -Wall -std=c99    main.c   -o main
     $


> But they are
> really just integers and they compile with no runtime overhead.

They do indeed have run-time overhead, as opposed to (e.g.) meta-types 
whose operations are performed at compile-time.  If you mean they have 
less overhead than types whose operations perform run-time checks, then 
yes, of course that's true.  You specifically stated (then snipped) that 
you "would be happier if int/int always threw an error."  The beauty of 
a language with such extensive support for user-defined types that can 
be used like built-in types is that you are free to define types that
meet your needs.  The weight of such hand-rolled solutions may lead to 
performance problems at first, but C-linkable extensions go a long way 
to help; hence numpy et al.
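
To make that concrete, here is a rough sketch of the sort of
DivisionSafeInteger I have in mind (the details are mine, and raising
on an inexact quotient is only one possible policy):

     class DivisionSafeInteger(int):
         """int subclass whose / refuses to lose information: it
         returns an exact integer result or raises an exception."""

         def __div__(self, other):            # Python 2 classic division
             quotient, remainder = divmod(self, other)
             if remainder:
                 raise ValueError("%r / %r is not exact"
                                  % (int(self), other))
             return DivisionSafeInteger(quotient)

         __truediv__ = __div__                # true division (Python 3, or
                                              # from __future__ import division)

         def __floordiv__(self, other):       # // keeps its usual meaning
             return DivisionSafeInteger(int(self) // other)

     print(DivisionSafeInteger(6) / 3)        # 2
     # DivisionSafeInteger(7) / 2             # raises ValueError

Whether an inexact quotient should raise, return a rational, or fall
back to a float is exactly the kind of policy such a type lets you
choose for yourself.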


> So, I think Python won't easily support lots of different types of
> integers, and we've got what we've got.

My understanding is that Python will easily support lots of different 
types of just about anything.  That's the point.  In theory at least, it 
supports programming in a way that lets the translator (compiler + 
interpreter) keep track of the exact types being used, so that the 
programmer doesn't have to.  The fact that the tracking is done 
dynamically, rather than statically, is a fundamental design decision 
that was made early in the language's development.
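
A trivial illustration of that dynamic tracking (the function is mine,
just for illustration):

     def describe(x):
         # The interpreter determines x's exact type at run time;
         # the function body never has to name it.
         return "%s: %s" % (type(x).__name__, x)

     print(describe(2))          # int: 2
     print(describe(2.5))        # float: 2.5
     print(describe(2 + 3j))     # complex: (2+3j)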


> There's an interesting talk linked from LTU about future languages:
> 
>   http://lambda-the-ultimate.org/node/1277

Thanks, but that just seems to have links to the slides.  Is there a 
written article, or a video of Mr. Sweeney's talk?


