multiprocessing

elsa kerensaelise at hotmail.com
Thu Apr 7 00:06:25 EDT 2011


Hi guys,

I want to try out a pool of worker processes, but I'm not sure whether
what I want to do is possible. Basically, I want a global object that
is updated during the execution of a function, and I want to be able
to run that function several times on parallel processes. The order in
which the function runs doesn't matter, and the function doesn't
depend on the object's current value, but I do want the processes to
take turns 'nicely' when updating the object, so there are no
collisions. Here is an extremely simplified and trivial example of
what I have in mind:

from multiprocessing import Pool
import random

myDict={}

def update(value):
    global myDict
    index=random.random()
    # use .get() so a key that hasn't been seen yet starts at 0
    myDict[index]=myDict.get(index,0)+value

total=1000

p=Pool(4)
p.map(update,range(total))
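For what it's worth, one way something like this can be sketched is with a `multiprocessing.Manager`, which hosts the dict in a separate process so all workers see the same object; a manager lock makes each read-modify-write atomic. This is only an illustration (Python 3 syntax; the names `shared`, `lock`, and the `value % 10` keying are my own, not from the post):

```python
from multiprocessing import Pool, Manager
from functools import partial

def update(shared, lock, value):
    # The manager process owns the dict; the lock makes the
    # read-modify-write below atomic across workers.
    key = value % 10
    with lock:
        shared[key] = shared.get(key, 0) + value

if __name__ == "__main__":
    with Manager() as manager:
        shared = manager.dict()
        lock = manager.Lock()
        with Pool(4) as pool:
            pool.map(partial(update, shared, lock), range(1000))
        print(sum(shared.values()))  # 499500 == sum(range(1000))
```

The `partial` is just one way to hand the shared proxy and lock to each call; manager proxies pickle cleanly, so they can be passed as ordinary arguments.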


Afterwards, I would also like to be able to use several processes to
read the global object (but not modify it). Again, order doesn't
matter:

p1=Pool(4)

def getValues(index):
    global myDict
    print myDict[index]

p1.map(getValues,myDict.keys())
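For the read-only phase, one option is to hand each worker a plain snapshot of the dict, since nothing mutates it and reads need no lock. A sketch under that assumption (`get_value` and `snapshot` are illustrative names; Python 3 syntax):

```python
from multiprocessing import Pool
from functools import partial

def get_value(snapshot, index):
    # Pure read: no lock needed, since nothing mutates the snapshot.
    return snapshot[index]

if __name__ == "__main__":
    snapshot = {0: 10, 1: 20, 2: 30}  # e.g. a copy taken after the update phase
    with Pool(2) as pool:
        values = pool.map(partial(get_value, snapshot), sorted(snapshot))
    print(values)  # [10, 20, 30]
```

Returning the values from `map` (rather than printing inside the workers) also keeps the results in the parent process in a defined order.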

Is there a way to do this?

Thanks,

Elsa.



