Running a python farm

Ian McConnell ian at emit.demon.co.ukx
Mon Oct 27 09:28:05 EST 2003


What's the pythonic way of sending out a set of requests in parallel?

My program throws an image at the server and then waits for the result. I'm
currently using this bit of socket code to send an image to a server on
another machine.


import socket
import cPickle

def oneclient(host, port, array):
    # Send one pickled image to the server and block until the result
    # comes back.
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.connect((host, port))
    sock.sendall(cPickle.dumps(array, 0))
    # Half-close the sending side (1 == "no more writes") so the
    # server's read() sees EOF and knows the whole image has arrived.
    sock.shutdown(1)
    rfile = sock.makefile('rb')
    # The server closes the connection when it is done, so read()
    # returns the complete pickled result.
    return cPickle.loads(rfile.read())
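
For reference, the post doesn't show the receiving end, but the server this
client assumes might look roughly like the sketch below, using the standard
SocketServer module. The port number and the process() function are purely
placeholders of mine for whatever the real servers do.

import SocketServer
import cPickle

class ImageHandler(SocketServer.StreamRequestHandler):
    def handle(self):
        # The client half-closes its sending side, so read() returns
        # once the whole pickled image has arrived.
        array = cPickle.loads(self.rfile.read())
        result = process(array)   # hypothetical image-processing step
        self.wfile.write(cPickle.dumps(result, 0))
        # The connection is closed when handle() returns, which is what
        # tells the client the result is complete.

server = SocketServer.TCPServer(('', 9999), ImageHandler)
server.serve_forever()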


Processing an image can take several minutes and I have loads of images to
process, but I also have several servers available, so I'd like to save some
time by distributing the images around the servers. So for 'n' servers,
throw 'n' images at them. Then as each server finishes, give it another
image to work on until I've done all the images.
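
One simple way to get that "keep every server busy" behaviour is a
thread-per-server worker pool driven by a shared queue, reusing oneclient()
above. This is only a rough sketch: farm(), hosts (a list of server host
names) and images are names I've made up, results come back in completion
order rather than input order, and there is no error handling.

import threading
import Queue

def farm(hosts, port, images):
    # Put every image on a shared queue; one worker thread per server
    # pulls the next image as soon as its server is free.
    tasks = Queue.Queue()
    for image in images:
        tasks.put(image)

    results = []
    results_lock = threading.Lock()

    def worker(host):
        while True:
            try:
                image = tasks.get_nowait()
            except Queue.Empty:
                return   # no images left for this server
            result = oneclient(host, port, image)
            results_lock.acquire()
            try:
                results.append(result)
            finally:
                results_lock.release()

    threads = [threading.Thread(target=worker, args=(host,))
               for host in hosts]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results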

What should I be looking for to get a pythonic solution? I'm stuck on
terminology: threads? Load balancing? Processor farms? Any pointers or
suggestions welcome.



I don't have to do this very often, so I would prefer a simple solution over
raw performance.


Thanks.


-- 
 "Thinks: I can't think of a thinks. End of thinks routine": Blue Bottle

** Aunty Spam says: Remove the trailing x from the To: field to reply **
