Software bugs aren't inevitable

Terry Hancock hancock at anansispaceworks.com
Sun Sep 18 00:48:12 EDT 2005


On Saturday 17 September 2005 06:03 pm, Scott David Daniels wrote:
> Sybren Stuvel wrote:
> > ... Computers aren't happy. They couldn't care less about the
> > programming language.
> 
> This reminds me of a quote I love (and wish I could cite the
> originator):
> 
>      Don't anthropomorphize computers, they don't like that.

Actually, I wrote a long (and very off-topic) reply justifying the
philosophical point that whether or not computers can feel "happy"
is an article of faith, either way, and depends heavily on what
sense you are using for the word "happy", not to mention "feel"
or "be". I thought maybe it would be best to drop it, but since
this bone is still being worried, perhaps it's worth summarizing ...

Fascinating subject, actually. There is a widely held *faith* among
materialist atheists that the criterion for "being able to feel"
is an as-yet undefined, but definable, level or form of computing
complexity mythologized as a "true AI".  The need for this belief
is an artifact of the materialist atheist position -- or rather of
its disbelief in any form of spiritualism or animism.

I contend that there is no such thing as a "true AI" -- or, for
that matter, a "true natural intelligence" -- in the sense of there
being some sharp line between "sentient" and "non-sentient" matter.

Obviously there is *something* going on: a continuity of memory and
reasoning capacity that has something to do with the objective
capability to react in complex ways, and thus to express one's own
sense of being in a way that is more comprehensible to "others".
Thus far, however, there is no need for any sharp line -- just a
gradually increasing level of intelligent responsiveness.

But the actual state of sensation?  We know nothing about that,
and science cannot illuminate it, because it is not an objective
question. We by definition cannot know what someone or something
else "feels".  We can only know how he/she/it *reacts*.

It seems a simpler and more natural assumption to think that 
sensation *pre-exists* the reasoning power to express or remember
that sensation.  Indeed, it seems more sensible to regard it as
a fundamental property of matter (or energy or space -- it being
hard to define which bit of reality the soul adheres to, but
"matter" will do for sake of argument).

It is no more unreasonable, then, I would contend, to say that a
computer is "happy" when it acts on data it "understands".  If
"thought" is a "mechanism", then a "mechanism" can be "thought".

I do not "anthropomorphize" the machine in that I do not regard
thought as a uniquely Human capacity.  That I have some "theory
of mind" for a PC does not mean that I think it's a Human, nor
that I would be stupid enough to credit it with a Human mind's
capabilities.

So I personally find it completely sane and often more natural
to speak of what a computer "knows" or "understands" or in this
case, "is happy with".

But if you really insist on a reductionist, mechanistic explanation,
then my point is simply this: a computer, in order to act on ANY
program, must first be made to act on a prior program (a compiler or
an interpreter -- in addition to the BIOS and operating system which
must first run in order to initiate that program), which contains
instructions for converting the program into a format which the
computer is able to process directly.
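To make this concrete, here is a minimal sketch in present-day
Python (my own illustration, not something from the thread) that
makes the translation step visible: the interpreter's compiler -- a
prior program -- must first turn our source text into bytecode
before the machine can act on it.

    import dis

    source = "print(6 * 7)"   # the program, as the text a human wrote

    # The "prior program": Python's built-in compiler translates the
    # source text into a code object full of bytecode instructions.
    code = compile(source, "<example>", "exec")

    # Show the bytecode -- the form the machine actually acts on.
    # (The exact opcode names vary from one Python version to another.)
    dis.dis(code)

    # Only now can the interpreter execute it (it prints 42).
    exec(code)

The human-readable text never runs as such; only its translation
does.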

I personally found "the computer is happier with binary" to 
be a much more concise and understandable way to say that, but
clearly, some people on the list find it heretical to use any
statement which assigns "agent status" to computers or programs.

But getting back to the point -- the fact that the computer itself
would be "happier" with binary program instructions shows that there
is certainly NO objective, technical sense in which ANY computer
programming language is "inherently" superior.

Programming languages can ONLY be evaluated in "psychological"
terms.  The "technical" design of programming languages -- and
indeed of ALL systems for describing and characterizing cognition
-- is ultimately a "psychological" discipline. It obviously
depends on the function of the mind of the programmer, and the
ability of programmers' minds to process the information is the
*metric of success* of a programming language.

Given that programmers' minds are neither identical nor
unchanging, it pretty much goes without saying that the choice
of programming language, notation, or technique will be
subjective -- and also changeable.

I said this because an earlier poster had *dismissed* mere
"psychological" reasons as unimportant, claiming that
functional programming was superior on "technical" grounds.

I hope I have demonstrated that that statement is nonsensical --
ALL statements about programming languages or techniques are
ultimately dependent on "psychological reasons". If functional
programming reduces bugs, then it also does so for
*psychological* reasons.

--
Terry Hancock ( hancock at anansispaceworks.com )
Anansi Spaceworks  http://www.anansispaceworks.com



