Python's "only one way to do it" philosophy isn't good?

Douglas Alan doug at alum.mit.edu
Wed Jun 27 01:45:44 EDT 2007


Paul Rubin <http://phr.cx@NOSPAM.invalid> writes:

> Douglas Alan <doug at alum.mit.edu> writes:

>> > In the Maclisp era functions like mapcar worked on lists, and
>> > generated equally long lists in memory.

>> I'm aware, but there were various different mapping functions.  "map",
>> as opposed to "mapcar", didn't return any values at all, and so you had
>> to rely on side effects with it.

> The thing is there was no standard way in Maclisp to write something
> like Python's "count" function and map over it.  This could be done in
> Scheme with streams, of course.
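
(For concreteness: the Python construct I take Paul to mean here is
itertools.count mapped over lazily -- roughly the toy snippet below,
which never materializes the infinite input in memory.)

    from itertools import count, imap, islice

    # count() yields 0, 1, 2, ... forever; imap applies the function
    # lazily, so no list the length of the input is ever built.
    squares = imap(lambda n: n * n, count())

    print list(islice(squares, 5))   # -> [0, 1, 4, 9, 16]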

I'm not sure that you can blame MacLisp for not being object-oriented.
The idea hadn't even been invented yet when MacLisp was implemented
(unless you count Simula).  If someone went to make an OO version of
MacLisp, I'm sure they'd get all this more or less right, and people
have certainly implemented dialects of Lisp that are consistently OO.

>> Right -- I wrote "iterators", not "generators".

> Python iterators (the __iter__ methods on classes) are written with
> yield statements as often as not.

I certainly agree that iterators can be implemented with generators,
but generators are a language feature that is impossible to provide
without deep language support, while iterators are just an OO
interface that any OO language can provide.  Though without a good
macro facility, the syntax for using them may not be so nice.
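
To make the distinction concrete, here is a toy countdown written both
ways (2.5-era syntax; in Python 3 the method is spelled __next__):

    class Countdown(object):
        """The iterator protocol by hand: just two methods, which any
        OO language can express."""
        def __init__(self, n):
            self.n = n
        def __iter__(self):
            return self
        def next(self):                       # __next__ in Python 3
            if self.n <= 0:
                raise StopIteration
            self.n -= 1
            return self.n + 1

    def countdown(n):
        "The same thing as a generator: needs 'yield' in the language."
        while n > 0:
            yield n
            n -= 1

    print list(Countdown(3)), list(countdown(3))   # [3, 2, 1] [3, 2, 1]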

>> That's not ugly.  The fact that CPython has a reference-counting GC
>> makes the lifetime of objects predictable, which means that like in
>> C++, and unlike in Java, you can use destructors to good effect.  This
>> is one of the huge boons of C++.  The predictability of lifespan makes
>> the language more expressive and powerful.  The move to deprecate
>> relying on this feature in Python is a bad thing, if you ask me, and
>> removes one of the advantages that Python had over Lisp.

> No that's wrong, C++ has no GC at all, reference counting or
> otherwise, so its destructors only run when the object is manually
> released or goes out of scope.

Right, but implementing generic reference-counted smart pointers is
about a page of code in C++, and nearly every large C++ application
I've seen uses such things.

> Python (as of 2.5) does that using the new "with" statement, which
> finally makes it possible to escape from that losing GC-dependent
> idiom.  The "with" statement handles most cases that C++ destructors
> normally handle.

Gee, that's back to the future with 1975 Lisp technology.  Destructors
are a much better model for dealing with such things (see, not *all*
good ideas come from Lisp -- a few come from C++), and I am dismayed
that Python is deprecating their use in favor of explicit resource
management.  Explicit resource management means needlessly verbose
code and more opportunity for resource leaks.

The C++ folks feel so strongly about this that they refuse to provide
"finally", and insist instead that you use destructors and RAII to do
resource deallocation.  Personally, I think that's taking things a bit
too far, but I'd rather it be that way than lose the usefulness of
destructors and have to use "with" or "finally" to explicitly
deallocate resources.
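
For concreteness, here is the difference with a made-up LogFile class
(2.5-era syntax; the future import is what enables "with" in 2.5):

    from __future__ import with_statement   # "with" needs this in 2.5

    class LogFile(object):
        """Cleanup in a destructor: under CPython's reference counting
        the file is closed the moment the last reference goes away."""
        def __init__(self, path):
            self.f = open(path, 'w')
        def write(self, msg):
            self.f.write(msg + '\n')
        def __del__(self):
            self.f.close()

    def log_once(path, msg):
        LogFile(path).write(msg)    # temporary dies here; file closed

    def log_once_with(path, msg):
        # The 2.5 alternative: every caller marks the cleanup point.
        with open(path, 'w') as f:
            f.write(msg + '\n')     # closed when the block exits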

> Python object lifetimes are in fact NOT predictable because the ref
> counting doesn't (and can't) pick up cyclic structure.

Right, but 99.9% of the time the programmer can tell immediately that
cycles aren't going to be an issue.

I love having a *real* garbage collector, but I've also dealt with C++
programs that are 100,000+ lines long, and I wrote plenty of Python
code before it had a real garbage collector, without ever having a
problem with cyclic data structures causing leaks.  Cycles really
aren't all that common, and when they do occur, it's usually not very
difficult to figure out where to add a few lines to a destructor to
break the cycle.
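
As a sketch of what I mean (a made-up parent/child pair): the
back-reference is what creates the cycle, and either breaking it in an
explicit cleanup step or making it a weak reference lets plain
reference counting reclaim everything:

    import weakref

    class Node(object):
        def __init__(self, parent=None):
            self.children = []
            if parent is not None:
                parent.children.append(self)
            # Weak back-reference: parent <-> child no longer forms a
            # cycle, so reference counting alone reclaims both nodes.
            self.parent = weakref.ref(parent) if parent is not None else None

    root = Node()
    child = Node(root)
    del root, child    # both freed immediately; no cycle collector needed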

> And the refcounts are a performance pig in multithreaded code,
> because of how often they have to be incremented and updated.

I'm willing to pay the performance penalty to have the advantage of
not having to use constructs like "with".

Also, I'm not convinced that it has to be a huge performance hit.
Some Lisp implementations had a "1, 2, 3, many" (or something like
that) reference counter for reclaiming short-lived objects.  This
bypassed the real GC and was considered a performance optimization.
(It was probably on a Lisp Machine, though, where they had special
hardware to help.)

> That's why CPython has the notorious GIL (a giant lock around the
> whole interpreter that stops more than one interpreter thread from
> being active at a time), because putting locks on the refcounts
> (someone tried in the late 90's) to allow multi-cpu parallelism
> slows the interpreter to a crawl.

All due to the ref-counter?  I find this really hard to believe.
People write multi-threaded code all the time in C++ and also use
smart pointers at the same time.  I'm sure they have to be a bit
careful, but they certainly don't require a GIL.

I *would* believe that getting rid of the GIL will require some
massive hacking on the Python interpreter, though, and when doing that
it may be significantly easier to switch to a real GC alone than to
keep two different kinds of automatic memory management.

I vote, though, for putting in that extra work -- compatibility with
Jython be damned.

> Lisp may always be around in some tiny niche but its use as a
> large-scale systems development language has stopped making sense.

It still makes perfect sense for AI research.  I'm not sure that
Lisp's market share counts as "tiny".  It's certainly not huge, at
only 0.669% according to the TIOBE metric, but that's still the 15th
most popular language and ahead of Cobol, Fortran, Matlab, IDL, R, and
many other languages that are still in wide use.  (Cobol is probably
still around for legacy reasons, but that's not true for the other
languages I mentioned.)

> If you want to see something really pathetic, hang out on
> comp.lang.forth sometime.  It's just amazing how unaware the
> inhabitants there are of how irrelevant their language has become.
> Lisp isn't that far gone yet, but it's getting more and more like that.

Forth, eh.  To each his own, but I'd be willing to bet that most
Forth hackers don't believe that Forth is going to make a huge
resurgence and take over the world.  And it still has something of a
place as the core of PostScript and maybe in some embedded systems.

Re Lisp, though, there used to be a joke (which turned out to be
false), which went, "I don't know what the most popular programming
language will be in 20 years, but it will be called 'Fortran'".  In
reality, I don't know what the most popular language will be called 20
years from now, but it will *be* Lisp.

|>oug


