A question about subprocess

Dan Stromberg dstromberglists at gmail.com
Wed Oct 3 20:24:32 EDT 2007


You don't necessarily need the subprocess module to do this, though you
could use it.

I've done this sort of thing in the past with fork and exec.

To serialize the jobs on a given machine, the easiest approach is to send
all of that machine's commands at once as a single string, like "command1;
command2; command3".
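
Since you asked about subprocess specifically, here's a minimal sketch of
that route.  The hostnames and job names are made-up examples, and a local
shell stands in for ssh so the sketch runs anywhere; for real use you'd
pass ["ssh", host, chained(cmds)] to Popen instead:

```python
import subprocess

def chained(commands):
    """Join one machine's jobs so the shell runs them one after another."""
    return "; ".join(commands)

# Hypothetical per-machine job lists.
jobs = {
    "machineA": ["echo A1", "echo A2", "echo A3"],
    "machineB": ["echo B1", "echo B2"],
}

# One process per machine: the machines run in parallel, but each
# machine's jobs run in series inside its single shell.  For real use
# this would be subprocess.Popen(["ssh", host, chained(cmds)]); a
# local shell is used here so the sketch is self-contained.
procs = [subprocess.Popen(["sh", "-c", chained(cmds)])
         for host, cmds in jobs.items()]
for p in procs:
    p.wait()  # block until that machine's whole series is done
```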

You can use os.waitpid() or similar to check whether a machine's series
of jobs has finished.
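
Roughly, the fork/exec/waitpid pattern looks like this.  Again the
hostnames are made up and a local shell stands in for ssh; in practice
the child would exec ["ssh", host, chained_jobs]:

```python
import os

def spawn(argv):
    """Fork a child and exec argv in it; return the child's pid."""
    pid = os.fork()
    if pid == 0:
        os.execvp(argv[0], argv)  # child: never returns on success
    return pid

# One child per machine, all started in parallel; each child runs that
# machine's whole chained series.
machines = {
    "machineA": "echo A1; echo A2; echo A3",
    "machineB": "echo B1; echo B2",
}
pids = {}
for host, chained_jobs in machines.items():
    pids[spawn(["sh", "-c", chained_jobs])] = host

# os.waitpid() blocks until a particular child -- i.e. one machine's
# whole series -- has finished.
for pid, host in pids.items():
    _, status = os.waitpid(pid, 0)
    print("%s done, exit status %d" % (host, os.WEXITSTATUS(status)))
```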

An example of something similar can be found at
http://stromberg.dnsalias.org/~strombrg/loop.html

(If you look at the code, be kind.  I wrote it long ago :)

There's a benefit to saving the output from each machine into a single
file for that machine.  If you think some machines will produce the same
output, and you don't want to see it over and over, you can analyze the
files with something like
http://stromberg.dnsalias.org/~strombrg/equivalence-classes.html .


On Wed, 03 Oct 2007 16:46:20 +0000, JD wrote:

> Hi,
> 
> I want send my jobs over a whole bunch of machines (using ssh). The
> jobs will need to be run in the following pattern:
> 
> (Machine A)   (Machine B)   (Machine C)
> 
> Job A1        Job B1        Job C1
> Job A2        Job B2        etc
> Job A3        etc
> etc
> 
> Jobs runing on machine A, B, C should be in parallel, however, for
> each machine, jobs should run one after another.
> 
> How can I do it with the subprocess?
> 
> 
> Thanks,
> 
> JD
