[SciPy-user] shared memory machines

Gael Varoquaux gael.varoquaux at normalesup.org
Fri Feb 6 10:24:29 EST 2009


On Fri, Feb 06, 2009 at 04:15:05PM +0100, Sturla Molden wrote:
> The question remains: should we base this on Cython or C (there is very 
> little coding to do), or some third party extension, e.g. Philip 
> Semanchuk's POSIX IPC and Mark Hammond's pywin32? I am thinking that at 
> least for POSIX IPC, GPL is a severe limitation. Also we need some 
> automatic clean up, which can only be accomplished with an extension 
> object (that is, __dealloc__ in Cython will always be called, as opposed 
> to __del__ in Python). In Pywin32 there is a PyHANDLE object that 
> automatically calls CloseHandle when it is collected. But I don't think 
> Semanchuk's POSIX IPC module will do the same. And avoiding dependencies 
> on huge projects like pywin32 is also good.

I am all for avoiding external dependencies (especially GPL ones).
multiprocessing is in the standard library, so I would like to be able to
do shared-memory parallel computing with only numpy and the standard
library. Actually, I can see a near future where some algorithms in scipy
could have the option of using multiple cores (I am thinking of e.g.
non-parametric statistics).
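As a rough sketch of what "numpy plus the standard library" can look like, here is a minimal example using multiprocessing.shared_memory. Note that this module only landed in Python 3.8, long after this thread; the numpy wrapping step is shown as a comment so the example itself stays stdlib-only:

```python
# Sketch only: stdlib shared memory (multiprocessing.shared_memory,
# Python 3.8+). Another process could attach to the same segment by
# name with SharedMemory(name=shm.name).
from multiprocessing import shared_memory

shm = shared_memory.SharedMemory(create=True, size=16)
try:
    shm.buf[0] = 42            # write through the shared buffer
    # numpy could wrap the same buffer without copying, e.g.:
    #   np.ndarray((16,), dtype=np.uint8, buffer=shm.buf)
    value = shm.buf[0]         # value == 42
finally:
    shm.close()                # detach this process's mapping
    shm.unlink()               # free the segment (creator's job)
```

The explicit close/unlink in the finally block is exactly the cleanup burden the thread is discussing: nothing in pure Python guarantees it happens if the user forgets.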

The __dealloc__ argument is a very good one for going with Cython. In
addition, I really like the feel of Cython code. And am I wrong in
thinking that it would also make the transition to Python 3 easier?
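A small pure-Python sketch of the cleanup problem: weakref.finalize (standard library, Python 3.4+, so also later than this thread) ties a callback to an object's lifetime and is guaranteed to run by interpreter shutdown at the latest, which is closer to Cython's __dealloc__ guarantee than __del__ is. The SharedSegment wrapper below is hypothetical, standing in for an object that owns an OS shared-memory handle:

```python
# Sketch: weakref.finalize as a stdlib approximation of Cython's
# __dealloc__ guarantee, without relying on __del__.
import weakref

closed = []  # records which segments were cleaned up

class SharedSegment:
    """Hypothetical wrapper around an OS shared-memory handle."""
    def __init__(self, name):
        self.name = name
        # Tie cleanup to this object's lifetime; the callback must not
        # reference self, or it would keep the object alive.
        self._finalizer = weakref.finalize(self, closed.append, name)

    def close(self):
        self._finalizer()      # idempotent: the callback runs at most once

seg = SharedSegment("scratch")
del seg                        # on CPython, collection is prompt here
assert closed == ["scratch"]   # the finalizer has fired
```

This only approximates __dealloc__: an extension object's __dealloc__ is part of deallocation itself, whereas a finalizer is a registered callback, and promptness of collection is a CPython refcounting detail rather than a language guarantee.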

Gaël


