AI and cognitive psychology rant

Alex Martelli aleaxit at yahoo.com
Tue Oct 14 17:13:45 EDT 2003


John J. Lee wrote:

> Alex Martelli <aleax at aleax.it> writes:
> [...]
>> > Whether a system is intelligent must be determined by the result.  When
>> > you feed the Deep Blue computer a chess configuration in which any
>> > average chess player would make a move that guarantees checkmate, but
>> > Deep Blue gives you a move that leads to stalemate, you know that it
>> > is not very intelligent (it did happen).
>> 
>> That may be your definition.  It seems to me that the definition given
>> by a body comprising a vast numbers of experts in the field, such as
>> the AAAI, might be considered to carry more authority than yours.
> 
> Argument from authority has rarely been less useful or interesting
> than in the case of AI, where both our abilities and understanding are
> so poor, and where the embarrassing mystery of consciousness lurks
> nearby...

It's a problem of terminology, no more, no less.  We disagree about
what it _means_ to say that a system is about AI, or not.  This is no
different from discussing what it means to say that a pasta sauce is
carbonara, or not.  Arguments from authority and popular usage are
perfectly valid, and indeed among the most valid, when discussing what
given phrases or words "MEAN" -- not "mean to you" or "mean to Joe Blow"
(which I couldn't care less about), but "mean in reasonable discourse".

And until previously-incorrect popular usage has, by sheer force of
numbers, totally blown away a previously-correct, eventually-pedantic
meaning (as, alas, inevitably happens sometimes, given the nature of
human language), I think it's indeed worth striving a bit to preserve the
"erudite" meaning (correct by authority and history of usage) against
the onslaught of the "popular" one (correct by sheer force of numbers).
Sometimes (e.g., the jury's still out on "hacker") the "originally correct"
meaning may be preserved or recovered.

One of the best ways to define what term X "means", when available, is
to see what X means _to people who self-identify as [...] X_ (where the
[...] may be changed to "being", or "practising", etc, depending on X's
grammatical and pragmatic role).  For example, I think that when X
is "Christian", "Muslim", "Buddhist", "Atheist", etc, it is best to 
ascertain the meaning of X with reference to the meaning ascribed to X by 
people who practise the respective doctrines: in my view of the world, it
better defines e.g. the term "Buddhist" to see how it's used, received, 
meant, by people who _identify as BEING Buddhist, as PRACTISING 
Buddhism_, rather than to hear the opinions of others who don't.  And,
here, too, Authority counts: the opinion of the Pope on the meaning of
"Catholicism" matters a lot, and so does that of the Dalai Lama on the
meaning of "Buddhism", etc.  If _I_ think that a key part of the meaning
of "Catholicism" is eating a lot of radishes, that's a relatively harmless
eccentricity of mine with little influence on the discourse of others; if
_the Pope_ should ever hold such a strange opinion, it would have an
inordinately vaster effect.

Not an effect limited to religions, by any means.  What Guido van Rossum 
thinks about the meaning of the word "Pythonic", for example, matters a lot 
more, to ascertain the meaning of that word, than the opinion on the same 
subject, if any, of Mr Joe Average of Squedunk Rd, Berkshire, NY.  And 
similarly for terms related to scientific, professional, and technological 
disciplines, such as "econometrics", "surveyor", AND "AI".

The poverty of our abilities and understanding, and assorted lurking 
mysteries, have as little to do with the best approach to ascertain the 
definition of "AI", as the poverty of our tastebuds, and the risk that the 
pasta is overcooked, have to do with the best approach to ascertain the
definition of "sugo alla carbonara".  Definitions and meanings are about
*human language*, in either case.


Alex




