python vs perl lines of code

Terry Hancock hancock at anansispaceworks.com
Thu May 18 20:28:34 EDT 2006


Michael Tobis wrote:

>John Bokma wrote:
>>"akameswaran at gmail.com" <akameswaran at gmail.com> wrote:
>>>Ok I'm going to end with a flamebait - but I would posit, ALL OTHER
>>>THINGS BEING EQUAL - that a smaller number of characters and lines in
>>>code is more maintainable than larger number of characters and lines in
>>>the code.
>>>
>>And I think that's why a lot of people posted very negative, in the hope
>>that people would not be tempted to make the above very dumb statement.
>Since it's too late to avoid such temptation, could you explain why you
>are willing to go so far as to call that statement "very dumb"?

The problem with the idea is that "simplicity" is not quite equal
to "a smaller number of characters".  For example, in each of the
following pairs, which spelling is "clearer":

ls    list
cp    copy
mv    move

Or to ask it another way, "Which is easier to *read*?"

Although I have long since come to take the former spellings for
granted, and they are undeniably shorter, the latter would've actually
been easier for people to remember.  The insistence on using the
longer, more mnemonic form is a major characteristic of Python.

The wisdom behind this is that Python's design takes into account
the observation that source code is read many times more than it
is written.  To that end, it sacrifices some brevity compared to other
languages (the classic example being Perl, which uses lots of arcane
symbols and tends toward shorter, more Unix-like naming).
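To make the trade-off concrete, here is a small sketch of the same
calculation written both ways.  The names and numbers are invented for
illustration; neither style is anyone's "official" code:

```python
# Illustrative only: the variable names and values here are made up.

# Terse style: short names, dense expression.  Fewer characters,
# but "p" and "t" carry no meaning on re-reading.
p = [9.99, 24.50, 3.25]
t = sum(x * 0.9 for x in p)

# Spelled-out style: standard English spellings, one idea per name.
raw_prices = [9.99, 24.50, 3.25]
DISCOUNT_RATE = 0.9
discounted_total = sum(price * DISCOUNT_RATE for price in raw_prices)

# Both compute exactly the same thing.
assert t == discounted_total
```

The second version costs a few dozen extra keystrokes once, and pays
for them every time someone has to read the code later.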

However, this turns out to be a big win, in that you don't waste so
much time trying to remember abbreviated names. There are, for
example, several ways of abbreviating a word like "quantity":

q
Q
quant
qtty
quanty
qnty
Qtty
QNT

etc.

But there's only one "standard spelling". If I only have to remember
that "Python prefers to use the standard spelling" as a rule of thumb,
then I will usually get the right answer (there are still significant
exceptions such as "dict" instead of "dictionary" and "def" instead of
"define" -- but they are fewer than in most languages).

Anyway, since humans tend to parse whole words as single symbols when
reading, rather than character by character, the choice between
symbols or abbreviations and whole standard-form words is not
believed to affect reading speed (I think it would be a strong claim
to say that this is proven, but it does seem likely to me).

Frankly, even when typing, I find it slightly easier to type "unmount"
than "umount" -- I always pause after the "u" to remember the funky
Unix spelling.  It's only a split-second delay, but those things add up,
and each one represents a possible bug, if the output is source code.

Even "dictionary" is hardly different from "dict" -- I've already 
learned to type common English morphemes without thinking about the
process, so it just doesn't make any difference from my PoV. That's
why I always find the hoops people will jump through to avoid typing
a couple of extra keystrokes to be silly (are they typing with their
noses?* ;-) ).  It would frankly be harder for me to remember the
extra mechanics than to just type the extra letters.

The claim is ironic -- even counter-intuitive -- considering this
conventional lore.  But the real point is that no one here can make
any reasonably objective assessment of whether your "data" is
meaningful unless you post examples.  That's what creates the
hostility, I think.

Cheers,
Terry


*With apologies to anyone who's actually using assistive tech to
type their code -- but I'm sure you already know how to use macros
to get what you want typed.

-- 
Terry Hancock (hancock at AnansiSpaceworks.com)
Anansi Spaceworks http://www.AnansiSpaceworks.com





