variable declaration

Michael Tobis mt at 3planes.com
Tue Feb 1 11:05:34 EST 2005


Given the behavior, the documentation is gratifyingly correct.

Given that the syntax is legal, though, the behavior is not what one
would intuitively expect, and is therefore unPythonic: it (rather
dramatically) violates the principle of least surprise.
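
For anyone who missed the earlier messages, here is a minimal sketch
of the kind of surprise I mean (not necessarily the exact experiment;
the names are illustrative). The decision about whether a name is
local is made for the whole function body at once, not line by line:

    x = 0

    def f():
        print(x)   # intuition says this reads the module-level x...
        x = 1      # ...but an assignment anywhere in the body makes x
                   # local throughout, so the print above fails

    try:
        f()
    except UnboundLocalError as e:
        print(e)   # local variable 'x' referenced before assignment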

It's also understandable, to me at least, why it is difficult for the
language design to avoid this behavior.

This little discovery of mine sheds considerable light on the
awkwardness of what you guys won't deign to call "declarations". That
being the case, I can understand the resistance to "declarations" in
Python.

I had thought, until the current conversation and this experiment, that
the global statement, er, declaration was just another executable
statement, especially given all the stress on Python being purely
executable.
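
A quick way to convince oneself that it isn't, assuming I have the
semantics right: put the global statement somewhere that can never be
reached at runtime. It still takes effect, because it is processed
when the function is compiled, not when it is executed:

    x = 0

    def f():
        if False:
            global x   # never reached at runtime, yet it still applies
        x = 1          # so this rebinds the module-level x

    f()
    print(x)           # prints 1: global acts as a compile-time
                       # declaration, not a runtime statement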

I still see "global" and "@" as expressions of the same fundamental
problem, even though decorators are not implemented as declarations.
They both take effect in a non-intuitive sequence and they both affect
the reference rather than the referent.
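
To make the parallel concrete, here is a small sketch (the trace
decorator is just something I made up for illustration). The @ line is
written before the function body but applied after it, and what it
changes is the binding of the name, not the function object itself:

    def trace(func):
        # build a new callable; the original function object is
        # untouched -- only the *name* gets rebound to the wrapper
        def wrapper(*args, **kwargs):
            print("calling %s" % func.__name__)
            return func(*args, **kwargs)
        return wrapper

    @trace             # written first, applied last:
    def greet():       # behaves like greet = trace(greet) after the def
        print("hello")

    greet()            # prints "calling greet", then "hello"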

This points up the difficulty Python has in qualifying references
rather than referents.

Since the BDFL is contemplating some optional typing, does that also
imply qualifying references?

Maybe you wizard types can agree that there is a useful abstraction
that I'm talking about here, whether you wish to call it "declarations"
or not, and try to factor out some sort of consistent strategy for
dealing with it, perhaps in P3K. (I will try to be in a position to
help someday, but I have a long way to go.)

Language features that modify references rather than referents appear
to be problematic. Python clearly chafes at these. Yet there are at
least a few compelling reasons to want them. 
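
The distinction is the same one that shows up in argument passing,
which may make it clearer why the referent side is easy and the
reference side is hard:

    def mutate(lst):
        lst.append(1)   # acts on the referent: the caller sees this

    def rebind(lst):
        lst = [1]       # acts only on the local reference, so the
                        # change is invisible to the caller

    a = []
    mutate(a)
    rebind(a)
    print(a)            # prints [1] -- only the mutation survived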

--
mt



