A question about subprocess

JD Jiandong.Ge at gmail.com
Fri Oct 5 14:44:27 EDT 2007


Thanks very much for all the answers.

JD

On Oct 3, 6:24 pm, Dan Stromberg <dstrombergli... at gmail.com> wrote:
> You don't necessarily need the subprocess module to do this, though you
> could use it.
>
> I've done this sort of thing in the past with fork and exec.
>
> To serialize the jobs on the machines, the easiest thing is to just send
> the commands all at once to a given machine, like "command1; command2;
> command3".
>
> You can use waitpid or similar to check if a series of jobs has finished
> on a particular machine.
>
> An example of something similar can be found at http://stromberg.dnsalias.org/~strombrg/loop.html
>
> (If you look at the code, be kind.  I wrote it long ago :)
>
> There's a benefit to saving the output from each machine into a single
> file for that machine.  If you think some machines will produce the same
> output, and you don't want to see it over and over, you can analyze the
> files with something like http://stromberg.dnsalias.org/~strombrg/equivalence-classes.html.
>
>  On Wed, 03 Oct 2007 16:46:20 +0000, JD wrote:
>
> > Hi,
>
> > I want send my jobs over a whole bunch of machines (using ssh). The
> > jobs will need to be run in the following pattern:
>
> > (Machine A)   (Machine B)   (Machine C)
> > Job A1        Job B1        Job C1
> > Job A2        Job B2        etc
> > Job A3        etc
> > etc
>
> > Jobs running on machines A, B, and C should run in parallel; however, for
> > each machine, jobs should run one after another.
>
> > How can I do it with the subprocess module?
>
> > Thanks,
>
> > JD
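
For anyone finding this thread in the archives, here is a rough sketch of
the approach Dan describes above, using the subprocess module: start one
ssh per machine with subprocess.Popen, join that machine's jobs with ';'
so they run one after another, and save each machine's output to its own
file. The host names, job commands, and log-file names below are just
placeholders.

import subprocess

# Placeholder hosts and per-host job lists.  Each machine's jobs run one
# after another; the machines themselves run in parallel.
jobs = {
    "machineA": ["jobA1", "jobA2", "jobA3"],
    "machineB": ["jobB1", "jobB2"],
    "machineC": ["jobC1"],
}

procs = {}
logs = {}
for host, commands in jobs.items():
    # Join the jobs with ';' so a single ssh session runs them in order,
    # and capture everything that machine prints in its own log file.
    cmd = "; ".join(commands)
    logs[host] = open(host + ".log", "w")
    procs[host] = subprocess.Popen(["ssh", host, cmd],
                                   stdout=logs[host],
                                   stderr=subprocess.STDOUT)

# Block until every machine's job sequence has finished.
for host, proc in procs.items():
    status = proc.wait()
    logs[host].close()
    print("%s finished with exit status %d" % (host, status))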




