why python is slower than java?

Alex Martelli aleaxit at yahoo.com
Sun Nov 7 09:39:52 EST 2004


Maurice LING <mauriceling at acm.org> wrote:

> > 
> > dude that "comparison" from twistedmatrix you reference is ANCIENT!!!
> 
> I am wondering the impact when IBM decided that the base memory to not
> exceed 64kb, in the late 1960s...

??? IBM in the '60s was developing the then-revolutionary concept of a
*family of computers* -- several models able to run the same codes with
different performance and price.  "360 degrees computing", whence came
the idea of calling that family "IBM 360".  Out of the 32 bits of a
register's width, only 24 bits were in fact used for addressing, so the
limit was *16 megabytes* of base memory -- 40 years ago, that not only
SEEMED huge, it WAS huge.  To the point that Motorola in the late
'70s/early '80s took exactly the same architectural decision regarding
*their* (less-revolutionary...) 16-bit microprocessor, the 68000: again
32-bit registers, but, again, 24-bit addresses.  The point being that,
about 15+ years later (10 doublings by Moore's Law!!!), IBM's crucial
architectural decision STILL seemed quite reasonable, albeit when
thinking of microcomputers rather than mainframes.
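The 24-bits-out-of-32 arithmetic is easy to check for yourself (a quick sketch of my own, not from the original post -- the point being that both the S/360 and the 68000 simply ignored the top 8 bits of a 32-bit address):

```python
# 24 bits of a 32-bit register used for addressing, as on the
# IBM 360 and the Motorola 68000: the top 8 bits are ignored.
ADDRESS_MASK = (1 << 24) - 1           # 0x00FFFFFF

max_memory = ADDRESS_MASK + 1          # 2**24 bytes
print(max_memory)                      # 16777216, i.e. 16 MB

# Any 32-bit value maps into that 16 MB space by masking:
physical = 0xDEADBEEF & ADDRESS_MASK
print(hex(physical))                   # 0xadbeef -- high byte discarded
```

Ignoring (rather than trapping on) those top 8 bits is exactly what later bit programmers who stashed flags up there, on both architectures, when the address space eventually grew past 24 bits.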

If you're thinking of IBM PC's, the key architectural decision there had
come from intel (8086 then 8088), again in the late '70s, and it _was_
way too restrictive -- just 1MB of addressability (in 64-k segments), vs
the unsegmented 16 MB of IBM earlier and Motorola at about the same
time.  IBM when designing the IBM PC shortly afterwards decided to pick
1/3 of that MB for I/O and special extension cards' memories rather than
general base RAM -- leaving 640K, not 64K!, for the latter -- and the
difference between 640K and 1MB never really had any big impact.
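The segment arithmetic behind that 1 MB ceiling can be sketched in a few lines (my own illustration, under the usual description of 8086 "real mode"):

```python
# Intel 8086/8088: a 20-bit physical address is formed from a 16-bit
# segment and a 16-bit offset as segment * 16 + offset.
def physical(segment, offset):
    return (segment << 4) + offset

one_mb = 1 << 20                       # 1048576: the whole address space
print(hex(physical(0xF000, 0xFFFF)))   # 0xfffff, the very top of that 1 MB

# IBM's PC design reserved the top 384K (0xA0000-0xFFFFF) for video
# memory, adapter ROMs and the BIOS, leaving 640K of base RAM below:
base_ram = 0xA0000
print(base_ram // 1024, (one_mb - base_ram) // 1024)   # 640 and 384
```

Note how the overlapping segments (each new segment starts only 16 bytes past the last) make the scheme both clever and painful -- quite unlike the flat 16 MB spaces of the S/360 and 68000.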

The machines with 64-K byte limits I know of were early 16-bit minis
(such as PDP-11, despite later kludges to push that limit up) and just
about all 8-bit micros (intel, Motorola, and just about all others).


> > it is comparing versions that are YEARS out of date and use!
> 
> Are the codebases of Python 1.5.2 and Java 1.1 totally replaced and 
> deprecated?

No, there's a lot of code from that era still working happily.

> Lisp compiler is the 1st compiler to be created (according to the 
> Red-Dragon book, I think) and almost all others are created by 

I'm sure Aho, Sethi and Ullman would never be so ignorant or
insensitive as to detract from Backus' (and IBM Research's) epoch-making
result in making the first production compiler -- exactly 50 years ago,
quite a bit earlier than LISP, for the language first known as Formula
Translation, a name which was soon shortened to ForTran.  Lisp has quite
some "first"s, but "first compiler" ain't among them.

> bootstrapping to LISP compiler. What are the implications of design 
> decisions made in LISP compiler then affecting our compilers today? I
> don't know. I repeat myself, I DO NOT KNOW.

I can't think of even a single one, although I fancy myself as quite
knowledgeable in the history of computing -- neither from Formula
Translation, nor from LISt Processor, not one design decision whose
implications are still affecting compilers being written today.  Same
goes for computers of the same era, the '50s.  If you move to the '60s
then some things do suggest themselves -- at least in computers: general
registers that are not part of the addressable memory, memory
addressable by the byte, 8-bit bytes with significant 16-bit and 32-bit
halfwords and fullwords too -- all parts of the IBM-360 legacy; not so
obvious in languages and compilers, maybe.  Maybe the concepts of
grammar, tokenizers and parsers -- all stuff that wasn't in evidence in
the '50s, at least not in ForTran or Lisp.  The ambitious concept of a
language good for all kinds of problems, commercial AND scientific, much
like IBM's "360 degrees computing" was breaking down the barriers
between scientific and commercial computers in the HW field.  The very
first emergence of "bytecodes", partly from trying to emulate yet-older
computers on those of the time, to keep running old applications on new
hardware where IBM's revolutionary concept of "a family of computers"
didn't apply.  Pretty thin stuff, IMHO... you have to get to the '70s to
find the emergence of a more solid legacy, I think.


Alex
