regarding threading

Kevin Cazabon kevin at cazabon.com
Tue Oct 14 16:34:50 EDT 2003


Actually, with Python even on a dual-processor machine,
multi-threading will get you NO speed increase.  This is because even
though you have multiple threads, only ONE of them is running at a
time (whichever one holds the Global Interpreter Lock, or GIL).  Python
switches between threads every so often (every 100 byte codes is the
default if I remember correctly, but it can be changed).
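[Editorial note: in current Python (3.2 and later) the bytecode-count
check interval described above was replaced by a time-based switch
interval.  A minimal sketch of inspecting and changing it:]

```python
import sys

# The old bytecode-count knob (sys.setcheckinterval) is gone in modern
# Python; thread switching is now governed by a time-based interval.
print(sys.getswitchinterval())   # default is 0.005 seconds (5 ms)
sys.setswitchinterval(0.001)     # request more frequent thread switching
```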

The exception is if you write a C extension module... you can
explicitly release the GIL and reacquire it before returning to Python.
That allows another Python thread to run at the same time as your C
module.
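[Editorial note: you can see this effect from pure Python, because some
C-level calls in the standard library already release the GIL while
they block - time.sleep is one.  A small sketch (the 0.2 s figure is
just an example):]

```python
import threading
import time

def wait():
    # time.sleep is a C-level call that releases the GIL while blocking,
    # so other Python threads keep running in the meantime.
    time.sleep(0.2)

start = time.time()
threads = [threading.Thread(target=wait) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.time() - start
# The two 0.2 s sleeps overlap: total wall time is about 0.2 s, not 0.4 s.
```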

Some Python extension modules implement this (I've been working with
Fredrik Lundh to get this into PIL), but most don't... it's a personal
gripe of mine, but I understand the necessity for the time being.  The
GIL makes Python pretty "thread safe" even without locks on shared
objects, but in my opinion that should be up to the programmer to deal
with (or die by) on their own.

Hopefully some day we'll get to a Python version that can internally
handle threads properly.

Kevin Cazabon.

"Diez B. Roggisch" <nospam-deets at web.de> wrote in message news:<bmgp8g$m2831$1 at ID-111250.news.uni-berlin.de>...
> akash shetty wrote:
> 
> > but this takes an awful amt of time.(abt 7 mins)
> > is there anyway to speed this up.
> > is use of threading feasible and what code do i
> > thread( since all i do is process the database).there
> > are no other concurrent tasks. so do i divide the
> > database into parts and multithread the searching on
> > these parts concurrently. is this feasible. or shud i
> > be using some kind of multiprocessing running the
> > parts(files) as diff processes.
> 
> Multiple threads/processes won't buy you anything unless you have a
> multiprocessor machine. In fact, they'll slow things down, as context
> switches (which are considerably slower between processes than between
> threads) also take their share of time.
> 
> Threads only buy you performance on single-processor machines if you have to
> deal with asynchronous events like network packets or user interaction.
> 
> For speeding up your search - if you search brute-force, you could try to go
> for something like a shift-and algorithm. 
> 
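[Editorial note: for the curious, the shift-and algorithm mentioned
above can be sketched in pure Python like this - it does exact matching
with one bitmask update per character of the text:]

```python
def shift_and(text, pattern):
    """Return the index of the first occurrence of pattern in text, or -1.

    Classic shift-and (bitap) matching: bit i of the state word D is set
    when pattern[:i+1] matches the text ending at the current position.
    """
    m = len(pattern)
    if m == 0:
        return 0
    # Precompute one bitmask per character: bit i set if pattern[i] == c.
    masks = {}
    for i, c in enumerate(pattern):
        masks[c] = masks.get(c, 0) | (1 << i)
    goal = 1 << (m - 1)   # bit that means "whole pattern matched"
    d = 0
    for j, c in enumerate(text):
        d = ((d << 1) | 1) & masks.get(c, 0)
        if d & goal:
            return j - m + 1
    return -1
```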
> And it might help to use C and memory-map parts of the file - but I have to
> admit that I have no experience in that field.
> 
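[Editorial note: Python can memory-map files directly through the
stdlib mmap module, no C required.  A small sketch - the file contents
and the "needle" key are invented:]

```python
import mmap
import os
import tempfile

# Create a throwaway file to stand in for the database file.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(b"some large database record ... needle ... more data")

# Map the file read-only and search it without reading it all into a
# Python string; the OS pages the data in on demand.
with open(path, "rb") as f:
    with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
        pos = mm.find(b"needle")
os.remove(path)
print(pos)
```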
> Diez
