Will Python 3.0 remove the global interpreter lock (GIL)

Steven D'Aprano steve at REMOVE-THIS-cybersource.com.au
Wed Sep 19 23:54:59 EDT 2007


On Wed, 19 Sep 2007 19:14:39 -0700, Paul Rubin wrote:

> We get CPU speed increases now through parallelism, not MHz.  Intel and
> AMD both have 4-core CPUs now and Intel has a 16-core chip coming. 
> Python is at a serious disadvantage compared with other languages if the
> other languages keep up with developments and Python does not.

I think what you mean to say is that Python _will be_ at a serious 
disadvantage if other languages keep up and Python doesn't. Python can't 
be at a disadvantage _now_ because of what happens in the future.

That said, with the rapid take-up of multi-core CPUs the future is 
*really close*, so I welcome the earlier comment from Terry Reedy that 
Guido has said he is willing to make changes to the CPython internals to 
support multiple processors, and that people have begun to investigate 
practical methods of removing the GIL (as opposed to just bitching about 
it for the sake of bitching).
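
For anyone who hasn't watched it happen, here's a rough sketch of the 
effect people are complaining about (the loop sizes are arbitrary, and 
it's an illustration, not a benchmark): on a multi-core box you might 
hope the two-thread version would finish in roughly half the time, but 
in CPython it doesn't, because the GIL only lets one thread execute 
bytecode at a time.

# Rough sketch: a CPU-bound job split across two threads gains nothing,
# because the GIL serialises Python bytecode execution.
import time
from threading import Thread

def count(n):
    while n:
        n -= 1

# two threads, half the work each
start = time.time()
threads = [Thread(target=count, args=(5000000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("two threads: %.2f seconds" % (time.time() - start))

# one thread, all the work
start = time.time()
count(10000000)
print("one thread:  %.2f seconds" % (time.time() - start))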


> The platitude that performance doesn't matter

Who on earth says that? I've never heard anyone say that.

What I've heard people say is that _machine_ performance isn't the only 
thing that needs to be maximized, or even the most important thing. 
Otherwise we'd all be writing hand-optimized assembly language, and there 
would be a waiting line of about five years to get access to the few 
programmers capable of writing it.


> that programmer time is more valuable than machine time

Programmer time is more valuable than machine time in many cases, 
especially when tasks are easily parallelisable across many machines. 
That's why your "comparatively wimpy site" preferred to throw extra web 
servers at the job of serving webpages rather than investing in smarter, 
harder-working programmers to pull the last skerricks of performance out 
of the hardware it already had.
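
The standard dodge, and part of why throwing extra boxes at that kind of 
job works so well, is to farm the work out to separate processes rather 
than threads: each process gets its own interpreter and its own GIL. A 
rough sketch, Unix-only and with made-up chunks, using nothing fancier 
than os.fork:

# Rough sketch: each chunk of work goes to its own process, so the GIL
# in one worker never blocks another.  Unix-only (os.fork).
import os

def work(chunk):
    return sum(chunk)   # stand-in for whatever the real job is

chunks = [range(i*1000, (i+1)*1000) for i in range(4)]

pids = []
for chunk in chunks:
    pid = os.fork()
    if pid == 0:            # child: do one chunk, then exit
        work(chunk)
        os._exit(0)
    pids.append(pid)        # parent: remember the child

for pid in pids:
    os.waitpid(pid, 0)      # wait for all the workers to finish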


> etc. is at best an excuse for laziness.

What are you doing about solving the problem? Apart from standing on the 
side-lines calling out "Get yer lazy behinds movin', yer lazy bums!!!" at 
the people who aren't even convinced there is a problem that needs 
solving?


> And more and more often, in the
> application areas where Python is deployed, it's just plain wrong.  Take
> web servers: a big site like Google has something like a half million of
> them.  Even the comparatively wimpy site where I work has a couple
> thousand.  If each server uses 150 watts of power (plus air
> conditioning), then if making the software 2x faster lets us shut down
> 1000 of them, 

What on earth makes you think that would be anything more than a 
temporary, VERY temporary, shutdown? My prediction is that the last of 
the machines wouldn't have even been unplugged before management decided 
that running twice as fast, or servicing twice as many people at the same 
speed, is more important than saving on the electricity bill, and they'd 
be plugged back in.


> the savings in electricity bills alone is larger than my
> salary.  Of course that doesn't include environmental benefits, hardware
> and hosting costs, the costs and headaches of administering that many
> boxes, etc.  For a lot of Python users, significant speedups are a huge
> win.
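
His arithmetic looks about right, for what it's worth. A quick 
back-of-the-envelope check, assuming ten US cents per kilowatt-hour and 
round-the-clock running (my assumptions, not his figures):

# Back-of-the-envelope only; the $0.10/kWh rate and 24x7 duty cycle are
# assumptions, not Paul's figures.
servers = 1000
watts_each = 150
hours_per_year = 24 * 365
dollars_per_kwh = 0.10

kwh_per_year = servers * watts_each * hours_per_year / 1000.0
print("yearly electricity saving: $%.0f" % (kwh_per_year * dollars_per_kwh))
# roughly $130,000 a year, before air conditioning and hosting costs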

Oh, I wouldn't say "No thanks!" to a Python speed-up. My newest PC has a 
dual-core CPU (nothing cutting-edge for me...) and while Python is faster 
on it than it was on my old PC, it isn't twice as fast.

But Python speed-ups don't come for free. For instance, I'd *really* 
object if Python ran twice as fast for users with a quad-core CPU, but 
half as fast for users like me with only a dual-core CPU.

I'd also object if the cost of Python running twice as fast was for the 
startup time to quadruple, because I already run a lot of small scripts 
where the time to launch the interpreter is a significant fraction of the 
total run time. If I wanted something like Java, which runs fast once it 
has started but takes a LONG time to actually start, I know where to find 
it.
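
Putting a number on that is easy enough. A rough sketch; the repeat count 
is arbitrary and the figure will vary wildly with platform and disk cache:

# Rough sketch: average wall-clock time to start the interpreter and
# do nothing at all.
import subprocess
import sys
import time

def startup_time(runs=20):
    start = time.time()
    for _ in range(runs):
        subprocess.call([sys.executable, "-c", "pass"])
    return (time.time() - start) / runs

print("average interpreter startup: %.3f seconds" % startup_time())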

I'd also object if the cost of Python running twice as fast was for Guido 
and the rest of the Python-dev team to present me with their wages bill 
for six months of development. I'm grateful that somebody is paying their 
wages, but if I had to pay for it myself it wouldn't be done. It simply 
isn't that important to me (and even if it were, I couldn't afford it).

Now there's a thought... given that Google:

(1) has lots of money;
(2) uses Python a lot;
(3) already employs both Guido and (I think...) Alex Martelli and 
possibly other Python gurus;
(4) is not shy about investing in Open Source projects;
(5) and, most importantly, uses technologies that need to run across 
multiple processors and multiple machines

one wonders whether Google's opinion of where core Python development 
needs to go is the same as yours.



-- 
Steven.


