variable declaration

Alex Martelli aleaxit at yahoo.com
Sat Feb 5 06:52:01 EST 2005


Alexander Zatvornitskiy
<Alexander_Zatvornitskiy at p131.f3.n5025.z2.fidonet.org> wrote:

> Hi, Alex!
> 
> 31 jan 2005 at 13:46, Alex Martelli wrote:
> 
> (sorry for the delay; my mail client didn't highlight your answer for me)
> 
>  AM> Since the lack of declarations is such a crucial design choice for
>  AM> Python, then, given that you're convinced it's a very bad thing, I
>  AM> suggest you give up Python in favor of other languages that give you
>  AM> what you crave.
> Well, I like Python. But, like every language I know, it has some bad
> sides which I don't like. One of them in Python is the lack of variable
> declarations; another (a problem it shares with C/C++) is:
> ===
> >>>print 1/2
> 0
> ===
> (I understand why it is so, but I don't like it anyway. Such behaviour
> can also cause some hard-to-find bugs)

You're conflating a fundamental, crucial language design choice, with a
rather accidental detail that's already acknowledged to be suboptimal
and is already being fixed (taking years to get fixed, of course,
because Python is always very careful to keep backwards compatibility).

Run Python with -Qnew to get the division behavior you probably want, or
with -Qwarn to get a warning for each use of integer division, so those
hard-to-find bugs become trivially easy to find.  Or import division
from the future, etc, etc.
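
For instance, at an interactive prompt (assuming Python 2.2 or later,
where the future-import for division exists):

>>> 1/2
0
>>> from __future__ import division
>>> 1/2
0.5
>>> 1//2          # the // operator keeps truncating, if that's wanted
0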

The fact that in Python there are ONLY statements, NO declarations, is a
completely different LEVEL of issue -- a totally deliberate design
choice taken in full awareness of all of its implications.  I do not see
how you could be happy using Python if you think it went wrong in such
absolutely crucial design choices.
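
A tiny sketch of what that implies, if it helps (plain Python, nothing
hypothetical about it): a name simply starts existing when a statement
binds it, and even 'def' is an ordinary statement executed when
reached, not a compile-time declaration:

x = 1                      # the assignment itself creates the name x

if x > 0:
    def f():               # which f exists is decided at runtime,
        return 'positive'  # by whichever branch actually executes
else:
    def f():
        return 'non-positive'

print f()                  # emits: positive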


>  AM> issue for you.  Therefore, using Python, for you, would mean you'd be
>  AM> fighting the language and detesting its most fundamental design
>  AM> choice: and why should you do that?  There are zillions of languages
>  AM> -- use another one.
> Thank you for the advice :)

You're welcome.

>  >> Pascal, or special syntax in C. It can cause very ugly errors, like
>  >> this:
>  >> 
>  >>   epsilon=0
>  >>   S=0
>  >>   while epsilon<10:
>  >>     S=S+epsilon
>  >>     epselon=epsilon+1
>  >>   print S
>  >> 
>  >> It will print zero, and it is not easy to find such a bug!
>  AM> Actually, this while loop never terminates and never prints anything,
> Oh, I didn't notice that :)

Hit control-C (or control-Break or whatever other key combination
interrupts a program on your machine) when the program is just hanging
there forever doing nothing, and Python will offer a traceback showing
exactly where the program was stuck.
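
For instance, interrupting the snippet above (saved, say, in a
hypothetical file buggy.py) would show something like:

  $ python buggy.py
  ^CTraceback (most recent call last):
    File "buggy.py", line 4, in ?
      S=S+epsilon
  KeyboardInterrupt

pointing straight at the loop that never lets epsilon change.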

In any case, your assertion that "it will print zero" is false.  You
either made it without any checking, or chose to deliberately lie (in a
rather stupid way, because it's such an easy lie to recognize as such).


> Fine! Let the interpreter never show us errors like division by zero,
> syntax errors, and so on. If a file is not found, the library doesn't
> need to say so. Just skip it!!! Because every test, even a simple one,
> will find such bugs. Once you have unit tests, the added value of
> <anything> is tiny, and their cost remains.

Another false assertion, and a particularly ill-considered one in ALL
respects.  Presence and absence of files, for example, is an
environmental issue, notoriously hard to verify precisely with unit
tests.  Therefore, asserting that "every test, even a simple one, will
find" bugs connected with a program's behavior when a file is missing
shows that you're either totally ignorant about unit tests (and yet
arrogant enough not to let that ignorance stop you from making
unqualified false assertions), or shamelessly lying.

Moreover, there IS no substantial cost connected with having the library
raise an exception as the way to point out that a file is missing, for
example.  It's a vastly superior approach to the old idea of "returning
error codes" and forcing the programmer to check for those at every
step.  If the alternative you propose is not to offer ANY indication of
whether a file is missing or present, then the cost of THAT alternative
would most obviously be grievous -- essentially making it impossible to
write correct programs, or forcing huge redundancy if the check for file
presence must always be performed before attempting I/O.
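
To make the contrast concrete, a minimal sketch (the filename is made
up; IOError is what Python 2's open raises when a file is missing):

try:
    f = open('settings.conf')          # hypothetical filename
except IOError, e:
    print 'cannot open settings:', e   # handled exactly where we choose
else:
    print f.read()
    f.close()

And if the programmer forgets the try/except entirely, the unhandled
exception still stops the program with a clear traceback, instead of
letting it barrel on with garbage the way an ignored error code would.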

In brief: you're not just wrong, you're so totally, incredibly, utterly
and irredeemably wrong that it's not even funny.


> And, one more question: do you think code like this:
> 
> var S=0
> var eps
> 
> for eps in xrange(10):
>   S=S+ups
> 
> is very bad? Please explain your answer :)

Yes: those idiotic 'var ' prefixes are a total and utter waste of
pixels and of programmer time.  Mandated redundancy, the very opposite
of the spirit of Python.
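
Incidentally, the typo in that particular snippet happens to be on the
reading side, so Python catches it at runtime with no declaration in
sight -- referencing a name that was never bound raises NameError:

>>> S = 0
>>> for eps in xrange(10):
...     S = S + ups
...
Traceback (most recent call last):
  File "<stdin>", line 2, in ?
NameError: name 'ups' is not defined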

> Uh! And you! And you!... And you must never even come close to any languages
> with variable declaration! Even to Visual Basic! :)

Wrong again.  I've made a good living for years as a C++ guru, I still
cover the role of C++ MVP for the Brainbench company, I'm (obviously)
totally fluent in C (otherwise I could hardly contribute to the
development of Python's C-coded infrastructure, now could I?), and as it
happens I have a decent command (a bit rusty for lack of recent use) of
dozens of other languages, including several Basic dialects and Visual
Basic in particular.

It should take you about 20 seconds with Google to find this out about
me, you know?  OK, 30 seconds if you're on a slow dialup modem line.

So, I guess you just *LIKE* being utterly and monumentally wrong, since
it would be so easy to avoid at least some of the bloopers you instead
prefer to keep making.

I *CHOOSE* Python, exactly because I have vast programming experience in
such a huge variety of languages, across all kinds of application areas,
methodologies, and sizes and levels of programming teams.  It's not
``perfect'', of course, being a human artifact, but it does implement
its main design ideas consistently and brilliantly, and gets the
inevitable compromises just about right.


Alex


