Python's "only one way to do it" philosophy isn't good?

Paul Rubin
Wed Jun 27 04:51:58 EDT 2007


Douglas Alan <doug at alum.mit.edu> writes:
> > The thing is there was no standard way in Maclisp to write something
> > like Python's "count" function and map over it.  This could be done in
> > Scheme with streams, of course.
> 
> I'm not sure that you can blame MacLisp for not being object-oriented.
> The idea hadn't even been invented yet when MacLisp was implemented
> (unless you count Simula).  If someone went to make an OO version of
> MacLisp, I'm sure they'd get all this more or less right, and people
> have certainly implemented dialects of Lisp that are consistently OO.

count() has nothing to do with OO; it's just the infinite stream
1, 2, 3, ..., which can be implemented as a Scheme closure in the
obvious way.
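In Python it's the same thing -- here's a minimal sketch of my own (the
name make_counter is made up; the standard library's itertools.count
gives you the same stream):

    def make_counter(start=1):
        n = [start]            # mutable cell so the inner function can update it
        def next_value():
            value = n[0]
            n[0] += 1
            return value
        return next_value

    c = make_counter()
    c()   # => 1
    c()   # => 2
    c()   # => 3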

> Right, but implementing generic reference-counted smart pointers is
> about a page of code in C++, and nearly every large C++ application
> I've seen uses such things.

That's because C++ has no GC.

> Gee, that's back to the future with 1975 Lisp technology.  Destructors
> are a much better model for dealing with such things (see not *all*
> good ideas come from Lisp -- a few come from C++) and I am dismayed
> that Python is deprecating their use in favor of explicit resource
> management.  Explicit resource management means needlessly verbose
> code and more opportunity for resource leaks.

And relying on refcounts to free a resource at a particular time is
precisely explicit resource management.  What if something makes a
copy of the pointer that you didn't keep track of?  The whole idea of
so-called smart pointers is to not have to keep track of them after
all.  Anything like a destructor is not in the spirit of GC at all.
The idea of GC is to be invisible, so the language semantics can be
defined as if all objects stay around forever.  GC should only reclaim
something if there is no way to know that it is gone.  For stuff like
file handles, you can tell whether they are gone or not, for example
by trying to unmount the file system.  Therefore they should not be
managed by GC.
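In Python the explicit style is short anyway; a try/finally sketch
(the file name is just an example, and the new 'with' statement does
the same job):

    # Make the handle's lifetime explicit instead of hoping the
    # refcounter closes it at some convenient moment.
    f = open("data.txt")      # "data.txt" is a made-up example file
    try:
        data = f.read()
    finally:
        f.close()             # the handle is provably gone past this point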

> Also, I'm not convinced that it has to be a huge performance hit.
> Some Lisp implementations had a 1,2,3, many (or something like that)
> reference-counter for reclaiming short-lived objects.  This bypassed
> the real GC and was considered a performance optimization.  (It was
> probably on a Lisp Machine, though, where they had special hardware to
> help.)

That is a common technique and it's usually done with just one bit.
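Roughly: the object carries a single "shared" bit instead of a full
count.  A toy sketch of the idea (names are made up, and this is NOT
how CPython works):

    # An object is either unshared or shared; the first extra reference
    # sets the bit, and only provably-unshared objects get reclaimed
    # eagerly -- everything else waits for the real GC.
    class Cell(object):
        def __init__(self, payload):
            self.payload = payload
            self.shared = False      # the single "refcount" bit

    def copy_ref(cell):
        cell.shared = True           # a second reference exists now
        return cell

    def drop_ref(cell, reclaim):
        if not cell.shared:
            reclaim(cell)            # sole reference dropped: free immediately
        # otherwise leave it for the tracing collector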

> > because putting locks on the refcounts (someone tried in the late
> > 90's) to allow multi-cpu parallelism slows the interpreter to a crawl.
> 
> All due to the ref-counter?  I find this really hard to believe.
> People write multi-threaded code all the time in C++ and also use
> smart pointers at the same time.  I'm sure they have to be a bit
> careful, but they certainly don't require a GIL.

Heap allocation in C++ is a relatively heavyweight process that's used
sort of sparingly.  C++ code normally uses a combination of static
allocation (fixed objects and globals), stack allocation (automatic
variables), immediate values (fixnums), and (when necessary) heap
allocation.  In CPython, *everything* is on the heap, even small
integers, so saying "x = 3" has to bump the refcount for the integer 3.
There is a LOT more refcount bashing in CPython than there would be
in something like a Boost application.
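You can watch this happen with sys.getrefcount; a rough illustration
(CPython only, and the exact numbers depend on interpreter state):

    import sys
    before = sys.getrefcount(3)   # the shared small-int object is already busy
    x = 3                         # binding a name adds one more reference
    after = sys.getrefcount(3)
    print(before, after)          # typically after == before + 1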

> > Lisp may always be around in some tiny niche but its use as a
> > large-scale systems development language has stopped making sense.
> 
> It still makes perfect sense for AI research.  

I think most AI research is being done in other languages nowadays.
There are some legacy Lisp systems around and some die-hards but
I have the impression newer stuff is being done in ML or Haskell.

I personally use Emacs Lisp every day and I think Hedgehog Lisp (a
tiny functional Lisp dialect intended for embedded platforms like cell
phones--the runtime is just 20 kbytes) is a very cool piece of code.
But using CL for new, large system development just seems crazy today.

> Re Lisp, though, there used to be a joke (which turned out to be
> false), which went, "I don't know what the most popular programming
> language will be in 20 years, but it will be called 'Fortran'".  In
> reality, I don't know what the most popular language will be called 20
> years from now, but it will *be* Lisp.

Well, they say APL is a perfect crystal--if you add anything to it, it
becomes flawed; while Lisp is a ball of mud--you can throw in anything
you want and it's still Lisp.  However, I don't believe for an instant
that large system development in 2027 will be done in anything like CL.

See:

  http://www.cs.princeton.edu/~dpw/popl/06/Tim-POPL.ppt
  http://www.st.cs.uni-sb.de/edu/seminare/2005/advanced-fp/docs/sweeny.pdf

(both are the same presentation; the links are the original PowerPoint
version and a PDF conversion) for a game developer discussing his
experiences with a 500 KLOC C++ program, describing where he thinks
things are going.  I find it compelling.  Mostly he wants a
dependently typed, pure functional language with explicit escapes to
impurity in places, with a "lenient" evaluation strategy, i.e. sort of
a softcore version of Haskell with less polymorphism but with
dependent types for stuff like sized arrays.  This is in order to get
concurrency and parallelism automatically without dealing with threads
and locks, which of course are even worse than malloc/free in terms of
programmer insanity and pitfalls.

Everyone is running dual-core x86s on their desktops and 4-core CPUs
are starting to appear, but the x86 is so complex that 4 cores per
package seems to be the maximum at the moment.  Meanwhile, some
non-x86 designs are sporting as many as 16 cores on a chip
(www.movidis.com) in general-purpose computers, to say nothing of
specialized ones like the Cell processor.  We have run out of
megahertz and we are going multicore.  And if we don't start adopting
languages that support parallelism sanely, we'll be left in the dust
by those that do, just as Python (or Lisp or Java or whatever) has
left non-GC languages in the dust.


