AI and cognitive psychology rant (was Re: BIG successes of Lisp...)

Alex Martelli aleax at aleax.it
Tue Oct 14 08:11:40 EDT 2003


Stephen Horne wrote:
   ...
> Perhaps AI should be defined as 'any means of solving a problem which
> the observer does not understand' ;-)

Clarke's law...?-)

The AAAI defines AI as:

"the scientific understanding of the mechanisms underlying thought and
intelligent behavior and their embodiment in machines."

But just about any mathematical theory is an abstraction of "mechanisms
underlying thought": unless we want to yell "AI" about any program doing
computation (or, for that matter, memorizing information and fetching it
back again, also simplified versions of "mechanisms underlying thought"),
this had better be a TAD more nuanced.  I think a key quote on the AAAI
pages is from Grosz and Davis: "computer systems must have more than
processing power -- they must have intelligence".  This should rule out
from their definition of "intelligence" any "brute-force" mechanism
that IS just processing power.  Chess playing machines such as Deep
Blue, bridge playing programs such as GIB and Jack (between the two
of them, winners of the world computer bridge championship's last 5 or
6 editions, regularly grinding into the dust programs described by their
creators as "based on artificial intelligence techniques" such as
expert systems), dictation-taking programs such as those made by IBM
and Dragon Systems in the '80s (I don't know if the technology has
changed drastically since I left the field then, though I doubt it),
are based on brute-force techniques, and their excellent performance
comes strictly from processing power.  For example, IBM's speech
recognition technology descended directly from the field of signal
processing -- hidden Markov models, Viterbi algorithms, Bayes all over
the battlefield, and so on.  No "AI heritage" anywhere in sight...
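For the record, the Viterbi algorithm is just dynamic programming over the
states of a hidden Markov model -- "processing power", not magic.  Here's a
minimal sketch on a toy HMM (the states, observations, and probabilities
are all invented for illustration; real speech models are vastly bigger,
but the recursion is the same):

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return (probability, state sequence) of the most likely path."""
    # V[t][s] = probability of the best path ending in state s at time t
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    path = {s: [s] for s in states}
    for t in range(1, len(obs)):
        V.append({})
        new_path = {}
        for s in states:
            # best predecessor state for s at time t
            prob, prev = max(
                (V[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states
            )
            V[t][s] = prob
            new_path[s] = path[prev] + [s]
        path = new_path
    prob, last = max((V[-1][s], s) for s in states)
    return prob, path[last]

# Toy model (made-up numbers, nothing to do with speech or IBM's models):
states = ("Rainy", "Sunny")
start_p = {"Rainy": 0.6, "Sunny": 0.4}
trans_p = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
           "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit_p = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
          "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}

prob, seq = viterbi(("walk", "shop", "clean"),
                    states, start_p, trans_p, emit_p)
print(prob, seq)  # -> 0.01344 ['Sunny', 'Rainy', 'Rainy']
```

Each step just keeps, for every state, the single best path that could have
led there -- pure number-crunching, which was exactly my point.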


Alex




