Python vs. Lisp -- please explain
Peter Mayne
Peter.Mayne at hp.com
Tue Feb 21 07:25:02 EST 2006
Torsten Bronger wrote:
>
> My definition would be that an interpreted language has, in its
> typical implementation, an interpreting layer necessary for typical
> hardware. Of course, now we could discuss what is "typical";
> however, in practice one would know it, I think. In the case of
> Python: CPython and all important modern processors.
In a previous century, I used something called UCSD Pascal, which at the
time was a typical implementation of Pascal. It ran on (amongst other
things) an Apple ][, which at the time was typical hardware. It worked
by compiling Pascal source to bytecode (called p-code), and interpreting
the p-code. So, in practice, one would know that Pascal was an
interpreted language.
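(CPython works the same way as UCSD Pascal did: source is compiled to bytecode, and an interpreter loop executes the bytecode. You can see this for yourself with the standard `dis` module -- a minimal sketch:

```python
import dis

def add(a, b):
    return a + b

# CPython has already compiled this function to bytecode by the time
# it exists; dis prints the instructions the interpreter loop runs.
dis.dis(add)

# The raw bytecode is just a string of bytes attached to the code object.
print(add.__code__.co_code)
```

So by the "bytecode plus interpreter" test, CPython and the UCSD p-System are in exactly the same boat.)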
Later on, I used a typical implementation called VAX Pascal: a compiler
reduced Pascal source to VAX object code. In practice, Pascal was not an
interpreted language. Of course, more than one of the VAXen we had did
not implement the entire VAX instruction set, and some instructions were
emulated, or interpreted, if you will, by other VAX instructions. So, in
practice, some of the Pascal was interpreted.
And, as someone in this thread has pointed out, it is likely that your
important modern (x86) processor is not natively executing your x86
code, and indeed meets your definition of having "in its typical
implementation an interpreting layer necessary for typical hardware".
Another example: is Java the bytecode, which is compiled from Java the
language, interpreted or not? Even when the HotSpot JIT cuts in? Or when
a native Java processor is used? Or when your Java program is compiled
with GCJ (if GCJ does what I think it does)? Does this make Java an
interpreted language or not?
Personally, in practice I don't care, so don't ask me. Ponder on getting
angels to dance on the head of a pin before you worry about whether the
dance can be interpreted or not.
PJDM