On Oct 3, 9:46 am, JD <[EMAIL PROTECTED]> wrote:
> Hi,
>
> I want to send my jobs over a whole bunch of machines (using ssh). The
> jobs will need to be run in the following pattern:
>
> (Machine A)    (Machine B)    (Machine C)
>
> Job A1         Job B1         Job C1
> Job A2         Job B2         etc
> Job A3         etc
> etc
>
> Jobs running on machines A, B, and C should run in parallel; however, on
> each machine, the jobs should run one after another.
>
> How can I do this with subprocess?
>
> Thanks,
>
> JD
subprocess is not network-aware. What you can do is write a simple Python script, say run_jobs.py, which takes a command-line argument (say A, B, or C) and fires a sequence of subprocesses to execute a series of jobs. Because each subprocess call blocks until its job finishes, this enforces the serialization condition, e.g. A2 starting only after A1 completes.

Now you can write a load-distributor kind of script which uses ssh to log in to the various machines and run run_jobs.py with the appropriate argument. (Here I assume all machines have access to run_jobs.py -- say it resides on a shared mounted file system.) E.g. in the outer script:

    ssh machine-A run_jobs.py A
    ssh machine-B run_jobs.py B
    ssh machine-C run_jobs.py C
    ...

You will want to fire all of these at once so that they execute in parallel.

Karthik
--
http://mail.python.org/mailman/listinfo/python-list
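A minimal sketch of both pieces, runnable on one box: the machine names and job commands below are hypothetical placeholders (the real jobs would replace the `sys.executable -c` stand-ins, and on the real cluster each thread would invoke `ssh machine-X run_jobs.py X` via subprocess instead). Serial ordering per machine comes from the fact that `subprocess.call` blocks until the child exits; parallelism across machines comes from one thread per machine.

```python
import subprocess
import sys
import threading

# Hypothetical job table: each machine maps to an ordered list of commands.
# These echo-style stand-ins would be the real job commands in practice.
JOBS = {
    "A": [[sys.executable, "-c", "print('job A1')"],
          [sys.executable, "-c", "print('job A2')"],
          [sys.executable, "-c", "print('job A3')"]],
    "B": [[sys.executable, "-c", "print('job B1')"],
          [sys.executable, "-c", "print('job B2')"]],
    "C": [[sys.executable, "-c", "print('job C1')"]],
}

def run_jobs(machine):
    """Run one machine's jobs serially.

    subprocess.call blocks until each job exits, which is exactly what
    enforces the A1-before-A2 ordering. Returns the jobs' exit codes.
    """
    return [subprocess.call(cmd) for cmd in JOBS[machine]]

def dispatch():
    """Start every machine's serial runner at once, one thread each.

    On the real cluster each thread body would instead be something like
        subprocess.call(["ssh", "machine-A", "run_jobs.py", "A"])
    so the three ssh sessions run in parallel while each session's jobs
    stay sequential.
    """
    threads = [threading.Thread(target=run_jobs, args=(m,))
               for m in sorted(JOBS)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

if __name__ == "__main__":
    dispatch()
```

Since `run_jobs` only returns after its last job exits, chaining it under ssh gives you the per-machine serialization for free; the outer script only has to launch the ssh processes without waiting between them.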