How coding in Python is bad for you

Chris Angelico rosuav at gmail.com
Mon Jan 23 21:38:46 EST 2017


On Tue, Jan 24, 2017 at 12:47 PM, BartC <bc at freeuk.com> wrote:
> On 24/01/2017 00:56, Chris Angelico wrote:
>>
>> On Tue, Jan 24, 2017 at 11:44 AM, BartC <bc at freeuk.com> wrote:
>
>
>>> With C++ or Java, it's possible to tell the indentation is wrong
>>> (because of the extra redundancy of having the indentation /and/ block
>>> delimiters). That's a bit harder in Python, making the source more
>>> fragile.
>>
>>
>> With C++ or Java, it's possible to tell that the braces are misplaced
>> (because of the extra redundancy). When that happens, the compiler
>> won't tell you; it'll just silently do the wrong thing. In Python,
>> that can't happen. Python is therefore more robust.
>>
>> What's to say which is correct?
>
> Take the same code with block
> delimiters, and take out that same indent:
>
> if 0 then
>     print ("one")
> print ("two")
> endif
> print ("three")
>
> It still compiles, it still runs, and still shows the correct "three" as
> output.

My point is that you *assume* that showing just "three" is the correct
behaviour. Why? Why do you automatically assume that the indentation
is wrong and the endif is correct? All you have is that the two
disagree.
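
To make the disagreement concrete, here's roughly the same code in
Python (my own transcription of the example, not Bart's exact code):

# Fully indented: both prints belong to the (false) if,
# so only "three" is printed.
if 0:
    print("one")
    print("two")
print("three")

# Now remove the indent from the second print. There is no endif to
# contradict the layout, so Python takes the indentation at face value
# and the output becomes "two" followed by "three".
if 0:
    print("one")
print("two")
print("three")

Neither reading is self-evidently the right one: the delimited
language trusts the endif, Python trusts the indentation.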

That's the sort of (presumably unintentional) bias that comes from
your particular history of programming languages. Some languages teach
you that "foo" and "Foo" are the same name. Others teach you that the
integer 1 and the string "1" aren't materially different. Still others
leave you believing that bytes and characters are perfectly
equivalent. All those assumptions are proven wrong when you switch to
a language that isn't like that, and the same goes for assumptions
about indentation.
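
In Python terms (a quick sketch of my own, not anything from Bart's
post):

foo = 1
Foo = 2                  # names are case-sensitive: foo and Foo differ
print(foo, Foo)          # -> 1 2

print(1 == "1")          # -> False: int 1 and string "1" are distinct
print(b"abc" == "abc")   # -> False: bytes and str never compare equal
                         #    in Python 3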

So what's to say which is more important? Surely it's nothing more
than your preconceived ideas, and could easily be different if you had
different experiences?

ChrisA


