Python's "only one way to do it" philosophy isn't good?

Paul Rubin
Tue Jun 26 03:26:14 EDT 2007


Douglas Alan <doug at alum.mit.edu> writes:
> > In the Maclisp era functions like mapcar worked on lists, and
> > generated equally long lists in memory.
> 
> I'm aware, but there were various different mapping functions.  "map",
> as opposed to "mapcar" didn't return any values at all, and so you had
> to rely on side effects with it.

The thing is, there was no standard way in Maclisp to write something
like Python's "count" function and map over it.  This could be done in
Scheme with streams, of course.
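For comparison, here's roughly what that looks like in Python, where map
works over any iterator, including the infinite one from itertools.count
(a minimal sketch, nothing here is from the original thread):

```python
import itertools

# map over an infinite iterator; nothing is materialized until consumed
squares = map(lambda n: n * n, itertools.count())

# islice pulls just the first five values off the lazy stream
first_five = list(itertools.islice(squares, 5))
print(first_five)  # [0, 1, 4, 9, 16]
```

The point is that map doesn't care whether its argument is a list, a file,
or an endless stream; it only needs the iterator protocol.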

> Right -- I wrote "iterators", not "generators".

Python iterators (the __iter__ methods on classes) are written with
yield statements as often as not.
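For instance (Countdown is a made-up class, just to illustrate the idiom
of writing __iter__ as a generator):

```python
class Countdown:
    """Iterable whose __iter__ method is a generator, i.e. uses yield."""

    def __init__(self, start):
        self.start = start

    def __iter__(self):
        # Calling this returns a generator object, which is itself
        # an iterator -- no separate iterator class needed.
        n = self.start
        while n > 0:
            yield n
            n -= 1

print(list(Countdown(3)))  # [3, 2, 1]
```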

> > The point is that mapcar (as the name implies) advances down a list
> > using cdr, i.e. it only operates on lists, not general iterators or
> > streams or whatever.
> 
> Right, but each sequence type had its own corresponding mapping functions.

Precisely; I think that's what Alexander was trying to get across: Lisp
didn't have a uniform interface for traversing different types of sequence.

> they had standardized such things.  This would not be particularly
> difficult to do, other than getting everyone to agree on just what
> the interfaces should be.  But Lisp programmers are, of course, just
> as recalcitrant as Python programmers.

Python programmers tend to accept what the language gives them, use it,
and not try to subvert it too much.  I don't say that is good or bad.

> And in Python's case, the reference manual is just an incomplete
> description of the features offered by the implementation, and people
> revel in features that are not yet in the reference manual.

No, I don't think so, unless you count some things that are in accepted
PEPs and can therefore be considered part of the reference docs, even
though they haven't yet been merged into the manual.

> That's not ugly.  The fact that CPython has a reference-counting GC
> makes the lifetime of objects predictable, which means that, as in
> C++ and unlike in Java, you can use destructors to good effect.  This
> is one of the huge boons of C++.  The predictability of lifespan makes
> the language more expressive and powerful.  The move to deprecate
> relying on this feature in Python is a bad thing, if you ask me, and
> removes one of the advantages that Python had over Lisp.

No, that's wrong: C++ has no GC at all, reference counting or
otherwise, so its destructors only run when the object is manually
released or goes out of scope.  The compiler normally doesn't attempt
lifetime analysis, and it would probably be against the rules to free
an object as soon as it became inaccessible anyway.  Python (as of
2.5) handles this with the new "with" statement, which finally makes it
possible to escape from that losing GC-dependent idiom.  The "with"
statement covers most of the cases that C++ destructors normally handle.
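A sketch of that idiom (Resource and log are made up here; Resource
stands in for a file handle, lock, or whatever you'd have used a
destructor for):

```python
log = []

class Resource:
    """Made-up context manager standing in for a file, lock, etc."""

    def __enter__(self):
        log.append("acquired")
        return self

    def __exit__(self, exc_type, exc, tb):
        # Runs deterministically when the with-block exits, even if an
        # exception was raised -- like a C++ destructor at end of scope.
        log.append("released")
        return False  # don't swallow exceptions

with Resource():
    log.append("working")

print(log)  # ['acquired', 'working', 'released']
```

Cleanup no longer depends on when (or whether) the garbage collector
gets around to calling __del__.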

Python object lifetimes are in fact NOT predictable, because the ref
counting doesn't (and can't) detect cyclic structures.  Occasionally a
cyclic GC pass comes along and frees up cyclic garbage, so some
destructors don't get run until then.  Of course you can manually
organize your code so that objects with destructors don't land in
cyclic structures, but then you don't really have automatic GC any
more; you have (partially) manual storage management.  And the
refcounts are a performance pig in multithreaded code, because of how
often they have to be incremented and decremented.  That's why CPython
has the notorious GIL (a giant lock around the whole interpreter that
stops more than one interpreter thread from being active at a time):
putting locks on the refcounts (someone tried in the late '90s) to
allow multi-CPU parallelism slows the interpreter to a crawl.
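A quick demonstration of the cycle problem (Node is a made-up class;
gc is CPython's cyclic collector module):

```python
import gc

class Node:
    """Trivial made-up class used to build a reference cycle."""
    def __init__(self):
        self.other = None

gc.disable()  # rely on pure refcounting for the demonstration

a, b = Node(), Node()
a.other, b.other = b, a   # a reference cycle: refcounts never hit zero
del a, b                  # unreachable now, but refcounting can't tell

collected = gc.collect()  # only the cyclic collector reclaims them
print(collected)          # some positive number of unreachable objects

gc.enable()
```

Until that gc.collect() call (or an automatic collection pass), the two
Node objects just sit there, and any __del__ they defined wouldn't have
run.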

Meanwhile, 4-core x86 CPUs are shipping on the desktop, and network
servers not dependent on the complex x86 architecture are using
16-core MIPS processors (www.movidis.com).  Python takes a beating
all the time for its inability to use parallel CPUs, and it's only
going to get worse unless/until PyPy fixes the situation.  And that
means serious GC instead of ref counting.

> And it's not bygone -- it's just nichified.  Lisp is forever -- you'll see.

Lisp may always be around in some tiny niche, but its use as a
large-scale systems development language has stopped making sense.

If you want to see something really pathetic, hang out on
comp.lang.forth sometime.  It's just amazing how unaware the
inhabitants there are of how irrelevant their language has become.
Lisp isn't that far gone yet, but it's getting more and more like that.



More information about the Python-list mailing list