[Python-3000] symbols?

Adam DePrince adam.deprince at gmail.com
Fri Apr 14 07:29:34 CEST 2006


On Wed, 2006-04-12 at 05:15 -0700, Michael Chermside wrote:
> Kendall Clark writes:
> > The other choice, of course, is for the library or API to define some
> > variables bound to strings and then use them like constants, except
> > that they can get redefined:
> >
> >     @pylons.rest.restrict(GET, HEAD)
> >     ...
> >
> > A symbol has the advantage that it can't be assigned to, so can't
> > change like a variable, but expresses more clearly than a string the
> > intent to refer to a name or term, not string data
> 
> When I see a variable name in all-caps, I don't assign to it. I don't
> even need a tool like PyChecker to remind me that this is a constant
> because I've been familiar with the "all-caps == constant" convention
> from shortly after I got a computer with lower-case letters on it.
> The other programmers I work with seem to behave the same way. I must
> be unusually lucky in this regard, because I meet lots of people who
> are very concerned about the fact that it *is* possible to change
> these values. I can only surmise that they work with people who make
> a habit of modifying all-caps variables randomly just for fun.

One thing we should consider is how Euro-centric this convention is.
I'm not certain that the characters of every Eastern language can be
readily divided into upper and lower case.  IIRC, one of the goals of
P3K is to allow Python to compile Unicode source files, and once that
happens this convention will eventually stop working.
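
To put a concrete face on it (the non-Latin name below is invented, and
only a hypothetical Py3k that accepts Unicode identifiers would compile
it; today's Python is ASCII-only):

    # The ALL_CAPS convention presumes a script that has two cases.
    MAX_RETRIES = 5       # a Latin-script reader sees "constant" at a glance

    # Han characters have no upper- or lower-case forms, so there is no
    # way to "shout" this name to mark it as a constant:
    最大重试次数 = 5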

I remember asking a similar question about a similar topic when I was a
wee lad: why does C have enumerations when it already has #define?
Aside from answers like "because gdb can show you symbols instead of
numbers", the real motivation is compiler-encouraged discipline.
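
The same trade-off is easy to mock up in Python; everything below is
mine, invented purely for illustration:

    # "#define" style: bare strings, so a typo like 'GTE' sails through.
    GET, HEAD = 'GET', 'HEAD'

    # "enum" style: a dedicated type gives the callee something to check.
    class Method(object):
        def __init__(self, name):
            self.name = name

    GET_M, HEAD_M = Method('GET'), Method('HEAD')

    def restrict(*methods):
        for m in methods:
            if not isinstance(m, Method):
                raise TypeError('expected a Method, got %r' % (m,))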

The real question isn't about the merit of this new symbol data type,
but rather a philosophical question about the direction the language
should go.  On the spectrum of "linguistic liberty", we have B&D
languages like Java and C# on one side and the Rabelaisian utopia of
Perl on the other.  Python sits somewhere in the middle.  The question
being asked here is "Does using plain strings for symbols make us too
much like Perl, or would fixing it by introducing a symbol type make us
too much like Java?"

Personally, I rather like the direction of the symbols idea, but am
unsure if I like the :symbol form for the literal.  I give it a +0.5.
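
For what it's worth, the behaviour I find attractive can be sketched
today as an ordinary class, without any new literal syntax; the class
and names below are mine, not anything that has actually been proposed:

    class Symbol(object):
        """Interned name objects, compared by identity."""
        _interned = {}

        def __new__(cls, name):
            # Hand back the one canonical instance for each name.
            try:
                return cls._interned[name]
            except KeyError:
                sym = object.__new__(cls)
                sym._name = name
                cls._interned[name] = sym
                return sym

        def __repr__(self):
            return ':' + self._name

    GET, HEAD = Symbol('GET'), Symbol('HEAD')
    assert Symbol('GET') is GET    # interning makes 'is' comparison safe

None of that stops anyone from rebinding the module-level name GET
itself, of course, which is the same limitation the all-caps convention
already lives with.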

> I've found it even more curious that no one ever seems to be worried
> about the ability to write infinite loops ("while True: pass") in
> pretty much any piece of code. Of course, I suppose it might have
> something to do with the fact that the halting problem makes it
> theoretically impossible to prevent all infinite loops. But it's

Just to nitpick: modern computers are not Turing complete.  They are
finite state machines, albeit very large ones.  A proper Turing machine
has a mythical one-ended tape of infinite length.  But I can understand
the confusion; old open-reel tapes, when dropped with a copy of your
thesis on them, tend to feel a lot like Turing tapes.  There is a
correlation between the number of copies you had and the number of tape
ends you find.

Solving the halting problem for a Turing machine isn't hard at all; hop
on a plane to Greece and wander about the ruins of the oracle at
Delphi ...

On a serious note, nobody thinks of it as a dangerous flaw because it
is a provably inescapable part of the computing experience.  But as
vocal as our friend Alan was about the dangers of his self-referential,
one-ended tapes, he was rather quiet about the case-convention issue we
face here.  The two issues are not at all comparable.

Cheers - Adam DePrince


