declarations summary

Michael Tobis mt at 3planes.com
Sun Feb 6 23:34:22 EST 2005


Summary of my understanding of a recent interesting thread:

General usage has "declaration" meaning "statement which does not
generate executable bytecode but merely affects the compiler". My
assertion that decorator syntax is "declarative" is therefore formally
false.

The common assertion that "Python is 100% executable" is an
exaggeration, but not because of the introduction of decorator syntax.
The "global" statement, however, is not executable and is recognized as
a declaration.
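
One way to see this is to disassemble a small function: the "global"
line leaves no bytecode of its own; it only changes how the assignment
that follows is compiled (a quick check with the standard dis module):

import dis

def f():
    global x    # pure compiler directive: emits no bytecode of its own
    x = 1       # compiles to STORE_GLOBAL rather than STORE_FAST

dis.dis(f)      # the listing shows instructions only for the assignment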

Both "global" and "@", though have the characteristics that 1) they
don't fit comfortably into Python syntax and 2) they at least
conceptually affect the reference and not the referent. In suport of
the first point, unlike any other block, decorators can appear at only
particular places in the file. Meanwhile globals are counterintuitively
 indifferent to whether they are nested in code branches that don't
execute.

x = 1
def foo():
    if False:
        global x    # still applies to all of foo(), despite never executing
    x = 2           # rebinds the module-level x, not a local
foo()
print x

prints "1"

These are implemented in different ways and both seem to be somehow
unsatisfactory. They also seem to have something in common. However,
for me to call them both "declarations" was incorrect.

Pythonistas appear to be averse to any declarations (in the sense of
compiler directives) besides "global".

The resistance to "@" is a lost cause, even though a dramatically
superior
def foo(a,b,c) decoby bar:
syntax was proposed. I find this hard to justify, as a decorator is
clearly a modifier of the immediately following function definition
rather than a freestanding executable block.
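
For reference, here is the accepted syntax next to the explicit
rebinding it is roughly equivalent to (foo and bar are just
placeholders):

def bar(f):
    # a trivial decorator that returns its argument unchanged
    return f

@bar
def foo(a, b, c):
    return a + b + c

# ...which is roughly equivalent to:
def foo(a, b, c):
    return a + b + c
foo = bar(foo)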

If it were feasible, I'd even prefer
def foo(a,b,c) decoby [baz,quux]:
so that the list of decorators could be dynamically constructed at
define time.
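
For what it's worth, something close to that effect is already
expressible with the accepted syntax, at the cost of a small helper
(apply_decorators is a made-up name, not a standard function; baz and
quux stand in for whatever decorators are at hand):

def baz(f): return f       # stand-in decorators, just for illustration
def quux(f): return f

def apply_decorators(decorators):
    # build one decorator that applies the listed decorators
    # as though they were stacked @baz, @quux
    def wrap(f):
        for d in reversed(decorators):
            f = d(f)
        return f
    return wrap

@apply_decorators([baz, quux])
def foo(a, b, c):
    return a + b + c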

Anyway, it's this sort of reference-modifier that's at issue here,
whether or not the implementation is truly declarative.

Now that there's two, can there be more? Should there be?

It's not difficult for me to imagine at least two other things I'd want
to do with references: 1) make them immutable (making life easier for
the compiler), and 2) make them refer only to objects with certain
properties (strong duck-typing); a rough sketch of the second follows.
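
Here is what that second constraint might look like today, at the
level of instance attributes only (Checked and Sink are made-up names;
this guards attribute rebinding, not ordinary local or module names):

class Checked(object):
    # data descriptor: accepts only objects exposing the required attributes
    def __init__(self, name, required=()):
        self.name = '_' + name
        self.required = required
    def __get__(self, obj, owner):
        if obj is None:
            return self
        return getattr(obj, self.name)
    def __set__(self, obj, value):
        for attr in self.required:
            if not hasattr(value, attr):
                raise TypeError("%r has no %r attribute" % (value, attr))
        setattr(obj, self.name, value)

class Sink(object):
    # 'out' may only be rebound to file-like objects
    out = Checked('out', required=('write', 'flush'))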

Also there's the question of typo-driven bugs, where an attempted
rebinding of "epsilon" instead creates a reference called "epselon"
(the epselon bug). This is the bane of Fortran, and after generations it
was generally agreed that one could optionally require all references
to be declared (implicit none). Perl went through a similar process and
ended up with the "use strict" pragma. Experienced Pythonistas are oddly
resistant to even contemplating this change, though in the Fortran and
Perl worlds these constructs quickly became practically universal.
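
Python has no direct analogue of "implicit none" or "use strict" for
ordinary names, but a limited form of the same protection already
exists for instance attributes via __slots__ (a sketch; it does nothing
for misspelled locals or globals):

class Params(object):
    __slots__ = ('epsilon', 'delta')    # only these attribute names may be bound

p = Params()
p.epsilon = 1e-6    # fine
p.epselon = 1e-6    # raises AttributeError: no attribute 'epselon'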

I believe that enthusiasm for this construct in other languages is so
strong that it should be prominently addressed in Python introductions
and surveys, and that questions in this regard should be expected from
newbies and patiently answered. I also don't fully understand the
profound aversion to this idea.

I myself am going back and forth on all this. I despise the fact that
good Fortran 90 code often has more declarations than executable
statements. Ed Ream (LEO's author) pointed out to me the
Edward-Tufte-ness of Python: no wasted ink (except maybe the colons,
which I do sometimes find myself forgetting on long edits...). I don't
want Python to look like f90. What would be the point?

On the other hand, I think compiler writers are too attached to
cleverly divining the intention of the programmer. Much of the effort
of the compiler could be obviated by judicious use of declarations and
other hints in cases where they matter. Correct code could always be
generated without these hints, but code could be made more
runtime-efficient with them. People who run compute-bound codes tend
not to be beginners, and could put up with some clutter when tuning
their code.

In the end, I think the intuition of what's Pythonic is enormously
valuable, and I'm continuing to learn from discussions here every day.
As a result, I'm inclined to believe, on the grounds of authority, that
this sort of trick does not belong in Python.

That said it seems unlikely that any version of Python will ever be
competitive as a high-performance numerical computing language because
nothing can be known about bindings at compile time, as we are
constantly being reminded here. I'll be happy if Python knocks my
socks off yet again, but I can't see counting on it. So the successor
to Fortran (presuming it isn't C++, which I do presume) may be
influenced by Python, but probably it won't be Python.

mt



