Time we switched to unicode? (was Explanation of this Python language feature?)

Rustom Mody rustompmody at gmail.com
Tue Mar 25 08:09:20 EDT 2014


On Tuesday, March 25, 2014 5:16:14 PM UTC+5:30, Antoon Pardon wrote:
> On 25-03-14 12:14, Steven D'Aprano wrote:

> >> Would
> >> such a use already indicate I should use a mathematical front-end?
> >> When a programming language is borrowing concepts from mathematics, I
> >> see no reason not to borrow the symbols used too.
> > I'd like to sum the squares of the integers from n=1 to 10. In the old 
> > Python, I'd write sum(n**2 for n in range(1, 11)), but with the brave new 
> > world of maths symbols, I'd like to write this:
> > http://timmurphy.org/examples/summation_large.jpg
> > How do I enter that, and what text editor should I use?

> You have a point. Blindly following mathematical notation will not
> work, because mathematics often enough uses positional clues that
> will be very hard to incorporate in a programming language.

Two completely separate questions:

1. Symbols outside of US-104-keyboard/ASCII used for Python
   functions/constants
2. Non-linear math notation

It goes back not just to the first programming languages but to Turing's paper:
what a mathematician can do on 2-d paper can be done on a 1-d 'tape'.
IOW, conflating 1 and 2 is not even a poor strawman argument -- it's only 1 that
anyone is talking about.
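
To make point 1 concrete: Python 3 already accepts many non-ASCII letters in
identifiers, although it does not let you add new operator symbols such as ×.
A minimal sketch (assuming Python 3; the helper name circle_area is just for
illustration):

    import math

    π = math.pi                  # a Greek letter is a legal identifier

    def circle_area(r):
        """Area of a circle of radius r, written with π rather than math.pi."""
        return π * r ** 2

    print(circle_area(2))        # 12.566370614359172

    # By contrast, this would be a SyntaxError -- × is not an operator:
    #     area = 2 × 3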

The comparison with APL, which had/has at least some pretensions to being
mathematics and simultaneously a programming language, is more appropriate.

So adding to my earlier list:

> Yes APL is a good example to learn mistakes from
> - being before its time/technology
> - taking a good idea too far
> - assuming that 'I understand it clearly' implies that others do too

- not taking others' good ideas seriously

Structured programming constructs were hardly known in 1960 when APL was invented.
Ten years later they were all the rage. APL ignored them -- to its own detriment.


> > On Tue, 25 Mar 2014 11:38:38 +0100, Antoon Pardon wrote:
> >> On 25-03-14 10:54, Chris Angelico wrote:
> >>> On Tue, Mar 25, 2014 at 8:43 PM, Antoon Pardon
> >>>> I thought programs were read more than written. So if writing is made
> >>>> a bit more problematic but the result is more readable because we are
> >>>> able to use symbols that are already familiar from other contexts, I
> >>>> would say it is worth it.
> >>> It's a matter of extents. If code is read ten times for every time it's
> >>> written, making it twenty times harder to write and a little bit easier
> >>> to read is still a bad tradeoff.
> >>> Also: To what extent IS that symbol familiar from some other context?
> >>> Are you using Python as a programming language, or should you perhaps
> >>> be using a mathematical front-end? Not everything needs to perfectly
> >>> match what anyone from any other context will expect. This is, first
> >>> and foremost, a *programming* language.
> >> So? We do use + and -, so why shouldn't we use × for multiplication?
> > I can't find × on my keyboard!

> Then use an editor that allows you to configure it, so you can
> easily use it.

> That's the kind of advice that is often enough given here if
> some python feature is hard for the tools someone is using.
> So why should it be different now?

> But often enough languages tried to use the symbols that were
> available to them. Now that more are, I see little reason for
> avoiding their use.

I am reminded that when Unix first came out, some of both the early adoption
and the early pooh-poohing centered on the novelty/stupidity of using
lower case in programming.
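
Coming back to the "I can't find × on my keyboard" aside: most editors can be
taught a short abbreviation or digraph for it, and Python itself will hand you
the character by its Unicode name. A small sketch, standard library only:

    import unicodedata

    # Look the character up by its Unicode name...
    times = unicodedata.lookup('MULTIPLICATION SIGN')     # '×' (U+00D7)
    print(times, hex(ord(times)))                         # × 0xd7

    # ...or embed it directly in a literal with a named escape:
    print('3 \N{MULTIPLICATION SIGN} 4 = 12')             # 3 × 4 = 12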
