for / while else doesn't make sense

Rustom Mody rustompmody at gmail.com
Mon May 23 02:09:32 EDT 2016


On Monday, May 23, 2016 at 9:59:27 AM UTC+5:30, Ben Finney wrote:
> Jon Ribbens writes:
> 
> > OK, I'm bored of you now. You clearly are not willing to imagine
> > a world beyond your own preconceptions.
> 
> Steven has, in the message to which you responded, asked for you to
> *describe* this other world you assert exists.
> 
> More concretely: Steven is not denying someone might have different
> expectations. On the contrary, you've said your expectations differ, and
> Steven is *explicitly asking* you to specify those expectations.
> 
> And, instead of answering, you give this dismissal. Are your
> expectations so hard to describe?

Steven is making wild and disingenuous statements; to wit:

On Monday, May 23, 2016 at 3:39:19 AM UTC+5:30, Steven D'Aprano wrote:
> I'm not defining the result. 4000+ years of mathematics defines the result.

This is off by an order of magnitude.
The decimal point started with Napier (improving on Stevin): 17th century.
OTOH it is plain natural numbers (ℕ) that have been in use for some 4 millennia.

> 
> If you get up off your chair and wander around and ask people other than C
> programmers "What's one divide by two?", I am confident that virtually zero
> percent will answer "zero".

You forget that we (most of us?) went to school.
My recollections of it -- ok, maybe fogged by nearly 5 decades:

I first learnt something called 'long division'.
In that procedure you take two numbers, called the divisor and the dividend,
and GET TWO NUMBERS back: a quotient and a remainder.
[At that point I knew only of the numbers we would later call ℤ (or was it ℕ?
-- not sure); the decimal point would come later.]

Later (again I don't remember the order) we were taught
- short division
- decimal numbers
[I mention short division -- put numerator on top of denominator and cancel off
factors -- because the symmetry of numerator:denominator and quotient:remainder
is more apparent there than in long-division]

In any case, if learning primacy has any significance, pure integer division
is more basic than decimal-number division.
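
That schoolbook procedure is exactly the invariant
dividend = divisor × quotient + remainder (with 0 ≤ remainder < divisor).
A quick, purely illustrative check in Python 3 (the numbers 17 and 5 are
arbitrary):

>>> dividend, divisor = 17, 5
>>> quotient, remainder = divmod(dividend, divisor)   # schoolbook long division
>>> (quotient, remainder)
(3, 2)
>>> dividend == divisor * quotient + remainder        # the check at the end of the sum
True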

To recapitulate the situation:
Mathematics (mathematicians, if you prefer) has a strong attachment to
two nice properties of operators:

The first is obvious and unarguable -- totality.
The second does not have a standard term but is important enough -- I will
call it 'homogeneity'.  By this I mean an operator whose type has the form  t × t → t

It's nice to have totality because one can avoid the case analysis of applying
f(x) only when x ∈ domain(f).

It's nice to have homogeneity because homogeneous operators can be
nested/unnested/played with;
i.e. for ◼ :  t × t → t,
x ◼ y ◼ z makes sense this way x ◼ (y ◼ z) or this way (x ◼ y) ◼ z.
With a non-homogeneous ◼ these may not make sense.
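
To put that in Python 3 terms (an illustrative session, nothing more): integer
+ is homogeneous (int × int → int) and so nests freely, whereas divmod, whose
type is int × int → (int, int), does not:

>>> 1 + 2 + 3                  # + nests: either grouping gives the same int
6
>>> divmod(7, 3)               # a pair comes back, not an int
(2, 1)
>>> divmod(divmod(7, 3), 2)    # the pair cannot feed back in: not homogeneous
Traceback (most recent call last):
  ...
TypeError: unsupported operand type(s) for divmod(): 'tuple' and 'int'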


- Choosing ÷ to be total and homogeneous necessitates widening
  ℤ (or ℕ) to ℚ or ℝ (or something as messy).
- Choosing ÷ to be non-homogeneous means having to deal with quotients and
  remainders.
  One can't write  if (x/4 < 256)...
  One has to write
     quot, rem = x/4   # throw away rem
     if quot < 256: ...
  (Both choices are sketched in Python below.)
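
For concreteness, here is roughly how the two choices look side by side in
Python 3, which happens to offer both (the value x = 1000 is just an arbitrary
example):

>>> x = 1000
>>> x / 4                      # choice 1: total, homogeneous; widens int to float
250.0
>>> quot, rem = divmod(x, 4)   # choice 2: two results; rem gets thrown away here
>>> quot < 256
True
>>> x // 4 < 256               # floor division discards the remainder up front
True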


Haskell has (almost) what I learnt at school:

Prelude> let (q,r) = 7 `divMod` 3
Prelude> (q,r)
(2,1)

Replace the strange `divMod` with / and we are back to the behavior I first
learnt at school.
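
For what it's worth, Python spells the same pairing divmod, and also exposes
the two halves separately as // and %:

>>> q, r = divmod(7, 3)
>>> (q, r)
(2, 1)
>>> 7 // 3, 7 % 3
(2, 1)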


So with some over-simplification:
- the first choice leads to numerical analysis
- the second leads to number theory

To say that one is natural -- especially the one that chooses something other
than natural numbers! -- and the other is surprising is nonsense.


