for / while else doesn't make sense
Steven D'Aprano
steve+comp.lang.python at pearwood.info
Mon May 23 04:09:45 EDT 2016
On Monday 23 May 2016 16:09, Rustom Mody wrote:
> Steven is making wild and disingenuous statements; to wit:
>
> On Monday, May 23, 2016 at 3:39:19 AM UTC+5:30, Steven D'Aprano wrote:
>> I'm not defining the result. 4000+ years of mathematics defines the result.
>
> This is off by an order of magnitude.
> Decimal point started with Napier (improving on Stevin): 17th century
> OTOH it is plain numbers (ℕ) that have been in use for some 4 millennia.
Are you saying that the Egyptians, Babylonians and Greeks didn't know how to
work with fractions?
http://mathworld.wolfram.com/EgyptianFraction.html
http://nrich.maths.org/2515
Okay, it's not quite 4000 years ago. Sometimes my historical sense of the
distant past is a tad inaccurate. Shall we say 2000 years instead?
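For anyone curious what those Egyptian fractions actually look like: an Egyptian fraction writes a rational as a sum of distinct unit fractions (1/2, 1/4, 1/20, ...). Here's a minimal sketch of the classic greedy (Fibonacci) decomposition using Python's `fractions` module; the function name `egyptian` is just illustrative:

```python
from fractions import Fraction

def egyptian(frac):
    """Greedy (Fibonacci) decomposition into distinct unit fractions."""
    parts = []
    while frac > 0:
        # Largest unit fraction not exceeding frac is 1/ceil(1/frac);
        # -(-d // n) computes ceil(d / n) with integer arithmetic.
        n = -(-frac.denominator // frac.numerator)
        parts.append(Fraction(1, n))
        frac -= Fraction(1, n)
    return parts

print(egyptian(Fraction(4, 5)))
# [Fraction(1, 2), Fraction(1, 4), Fraction(1, 20)]
```

The greedy method always terminates, though it doesn't always give the shortest decomposition.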
>> If you get up off your chair and wander around and ask people other than C
>> programmers "What's one divide by two?", I am confident that virtually zero
>> percent will answer "zero".
>
> You forget that we (most of us?) went to school.
Er, why would I forget that? That's the point -- people have learned about
fractions. I didn't say "go off deep into the Amazonian rainforests, or into
the New Guinea highlands, and ask innumerate hunter-gatherers...".
But even innumerate hunter-gatherers will have an understanding that if you
have one yam which you wish to share between two people, they will each get
half. Not zero.
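This is exactly the distinction Python 3 draws between its two division operators: `/` is true division, which matches the everyday answer, while `//` is floor division, which gives the C-style truncated integer result. A minimal demonstration:

```python
# Python 3: / is true division, // is floor division.
print(1 / 2)    # 0.5 -- "half", the answer most people give
print(1 // 2)   # 0   -- the C-style integer result

# Floor division rounds toward negative infinity, unlike C's
# truncation toward zero:
print(-1 // 2)  # -1
```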
--
Steve