Perl is worse! (was: Python is Wierd!)

Alex Martelli alex at magenta.com
Sun Jul 30 05:28:35 EDT 2000


"Steve Lamb" <grey at despair.rpglink.com> wrote in message
news:slrn8o75j2.fjt.grey at teleute.rpglink.com...
> On Sun, 30 Jul 2000 01:48:32 GMT, Grant Edwards <nobody at nowhere.nohow>
> wrote:
> >I concur. In non-programming, life is strictly typed. Allowed
> >operations are determined by the type of the object. You can't
> >make a phone call on a waffle-iron.  Were I to _try_ to make a
> >phone call on a waffle-iron, it doesn't "automagically" convert
> >itself into a telephone.  Instead, I get a rather interesting
> >burn pattern on the side of my head.
>
>     You're making the mistake of trying to associate programming concepts
> to real world objects.

Why is that a mistake?  We're trying to MODEL "the real world", at
some remove (abstraction) to be sure.  Prima facie, a correspondence
between signifier and signified is A Good Thing (it helps rather
than hinders the modeling); while there are of course exceptions, the
burden of proof is clearly on the one trying to show that a certain
characteristic is best left unmodeled in a given context.

Much, perhaps most, of what philosophers have been trying to do over
the last 2,600 years or so is exactly about this, by the way.  While
the drastic criticisms of this endeavour in Wittgenstein's
"Philosophical Investigations" are well warranted, some grounding in
what went on before does help in making heads or tails of it.  I've
always been rather perplexed at the lack of philosophy courses in the
education of typical software engineers (as for me, fortunately, I
am old enough, and European enough, that I got a typical liberal
education in a Lyceum before moving on to Engineering at University:-).

For an interesting mix of these philosophical considerations with
some rigour and pragmatism, try modern semiotics, most particularly
Umberto Eco (of course I'm biased in his favour -- he teaches at
my Alma Mater [which happens to be THE original "Alma Mater"], albeit
in Arts rather than Engineering; plus, how many high-powered academics
do you know who are also best-selling novelists?-).


>     I'm sorry, but that is why I said data is data is data.

"Signifiers are signifiers", but they signify different
signifieds in any semiotic system (meaning-system).  Nor do
you deny this, except that you want to draw the line at
'structure' or 'behaviour', AND fail to accept that strings
in particular have both structure and behaviour, so you'd
like to get them 'merged' with numbers (whose behaviour in
fact differs).
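The differing behaviour is easy to see at a Python prompt; a minimal
sketch (the particular values are just illustrative):

```python
# Strings and numbers share some operator names, but behave differently,
# and Python refuses to guess a conversion between them.
assert 11 + 1 == 12           # numeric addition
assert "11" + "1" == "111"    # string concatenation, not arithmetic
assert "11" * 2 == "1111"     # repetition, not multiplication

try:
    "11 people are " + 11     # mixing the two raises TypeError...
except TypeError:
    mixed = str(11) + " people"   # ...unless you convert explicitly
assert mixed == "11 people"
```

So strings are not "merged" with numbers: the same operator symbol
names a different behaviour on each type, and crossing the boundary
requires an explicit conversion.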


>     To give you a counter-example: when I write, my pen doesn't toss up an
> exception when I write out "11 people are insane!" because it contains an
> integer in a string context!

As a tool for modeling, a pen significantly fails to possess any
semantic capability of its own.  All it does is lay ink on some
surface; every level of meaning imposed upon those ink-blobs (the
very fact that they are meant to be structured into sequences of
characters, the alphabet being used, and so on up) depends on other
layers of the modeling-system.  Your pen won't (can't) complain, nor
help you avoid the problem, if you use it to make totally random
splotches of ink.  Other tools impose more structure -- e.g., a
typewriter imposes a character-oriented structure and a given
alphabet of character-symbols; that makes it more suitable for its
intended tasks, while at the same time removing some flexibility (if
you need to prepare a Rorschach test, a pen or brush is more
appropriate than a typewriter; but for writing text, a typewriter
offers higher productivity if it's applicable in your given context).

It's not a meaningful criticism of a typewriter that it's not as
good as a pen if making random splotches of ink is your purpose.
It IS an important consideration to keep in mind if you often
need Rorschach-work, to help you pick the best tool for each task,
of course.


Alex

More information about the Python-list mailing list