What is considered an "advanced" topic in Python?

Laura Creighton lac at openend.se
Mon Jun 1 06:28:11 EDT 2015


In a message of Mon, 01 Jun 2015 19:45:31 +1000, Chris Angelico writes:
>On Mon, Jun 1, 2015 at 5:58 PM, Laura Creighton <lac at openend.se> wrote:
>> If you are giving a talk about Decimal -- and trying to stamp out the
>> inappropriate use of floats you have to first inform people that
>> what they learned as 'decimals' as children was not floating point,
>> despite the fact that we write them the same way.
>>
>> If I ever get the time machine, I am going back in time and demand that
>> floating point numbers be expressed as 12345:678 instead of 12345.678
>> because it would save so much trouble.  Never has the adage 'It's not
>> what you don't know, that bites you.  It's what you know that ain't so.'
>> been more apt.
>
>While I agree that there are problems with conflating float with "real
>number", I don't know that decimal.Decimal is actually going to solve
>that either; and using a colon as the decimal separator won't solve
>anything (the world already has two - "." in the US and "," in Europe
>- and adding a third with the same semantics of separating the
>greater-than-unit from the sub-unit digits won't instantly give new
>meaning to the way numbers are handled), so I would be whacking you
>over the head when you invent that time machine XKCD 716 style.

You have missed my point.  What I want is for floats never to be
written in '.' or ',' notation at all.  That way, when naive users
write their first program that deals with money and go looking in
their computer manual, they will come to the section on floating
point numbers and find something that looks like nothing they have
ever seen before.  So they will read the section carefully to see
whether this is what they want or need, and the section can nicely
say NEVER USE THIS FOR MONEY, and they will know they are in the
wrong place.
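
To make that concrete, here is the sort of interpreter session I
have in mind (just a sketch of the usual demonstration, using the
stdlib decimal module):

>>> 0.10 + 0.20                            # looks like school arithmetic, but isn't
0.30000000000000004
>>> sum(0.10 for _ in range(10)) == 1.00   # ten 10-cent items
False
>>> from decimal import Decimal
>>> Decimal('0.10') + Decimal('0.20')      # decimal arithmetic behaves as taught
Decimal('0.30')
>>> sum(Decimal('0.10') for _ in range(10))
Decimal('1.00')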

The problem is that they take one look at a floating point number,
think they know what it is from childhood, and use it.  They don't
know any better.  Computer languages have homonyms, just as English
does: the moles you have on your face, the moles that burrow in your
garden, and the moles you build out of stone as a pier or a
breakwater are all different things written the same way, and
numbers can work the same way in computers.  That notion sits deep,
deep, deep in the 'this cannot be possible' mindset of people who
are learning to use computers.
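
You can see the two moles side by side at the prompt (an
illustration only; Decimal(0.1) spells out the exact value the
binary float actually stores):

>>> 0.1                  # what gets printed, and what people think they have
0.1
>>> from decimal import Decimal
>>> Decimal(0.1)         # what the float really is underneath
Decimal('0.1000000000000000055511151231257827021181583404541015625')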

I argued that when Decimal went into Python it should be called
Money (with Decimal kept as an alias for those who aren't using it
for money), again so that the people who don't know any better are
more likely to end up in the right place, but I lost that argument,
too.
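
What I was arguing for amounted to nothing more elaborate than this
(a purely illustrative sketch; nothing like it was ever adopted):

# Hypothetical spelling of the proposal: the type is named Money,
# and Decimal stays available as an alias for non-monetary uses.
from decimal import Decimal
Money = Decimal

price = Money('19.99')
tax = Money('0.06') * price                      # e.g. a 6% sales tax
total = (price + tax).quantize(Money('0.01'))    # round to whole cents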

Laura



