Flufferson

Reputation: 23

Using python to start parallel ssh rendering jobs

I'm writing a script in Python to ssh into a few computers (about ten) and have them start rendering 3D images from Blender. It works fine except that the next computer's renders won't start until the previous ones are finished. Is there a way to start the commands and have them all run concurrently on their own machines?

what my code looks like:

import os
path = '/home/me'
comp1 = ['sneffels','1','2'] #computer name, start frame, end frame
comp2 = ['bierstadt','3','4']
comp3 = ['diente','5','6']

os.system("ssh igp@" + str(comp1[0]) + " blender -b "+ str(path) +" -s " + str(comp1[1]) + " -e " + str(comp1[2]) + " -a")

os.system("ssh igp@" + str(comp2[0]) + " blender -b "+ str(path) +" -s " + str(comp2[1]) + " -e " + str(comp2[2]) + " -a")

os.system("ssh igp@" + str(comp3[0]) + " blender -b "+ str(path) +" -s " + str(comp3[1]) + " -e " + str(comp3[2]) + " -a")

Upvotes: 2

Views: 2727

Answers (6)

Gilad Sharaby

Reputation: 998

I would recommend using a tool like pssh.
It runs the ssh commands in parallel across multiple threads, so it finishes quickly.
You can read more about it in the pssh documentation.

Upvotes: 0

sotapme

Reputation: 4903

What's wrong with just putting + " & " at the end of each os.system(...) call?

You don't seem too bothered if each blender works or not.

From the look of it you could just do it in a shell script.

You could install something like beanstalkd and have your 10 servers each running a worker that pulls jobs off the shared queue. Then have your job dispatcher put on jobs that mention the filename, start frame, and end frame.

One benefit is that when a worker finishes, it can put its status back on the queue.

Otherwise you'll have issues knowing whether one of the subprocesses failed etc.
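A stdlib-only analogue of that worker pattern, with an in-process queue.Queue standing in for beanstalkd and threads standing in for the ten servers, might look like this sketch (the job tuples and the "done" status are made up for illustration):

```python
import queue
import threading

# Stand-in for beanstalkd: a shared queue of (filename, start, end) jobs,
# plus a second queue where workers report their status.
jobs = queue.Queue()
results = queue.Queue()

def worker():
    while True:
        job = jobs.get()
        if job is None:  # sentinel: no more work for this worker
            break
        filename, start, end = job
        # A real worker would run blender over ssh here; this sketch
        # just reports success so the pattern is runnable locally.
        results.put((filename, start, end, "done"))

threads = [threading.Thread(target=worker) for _ in range(3)]
for t in threads:
    t.start()

# The dispatcher puts on jobs naming the file and frame range...
for job in [("scene.blend", 1, 2), ("scene.blend", 3, 4)]:
    jobs.put(job)
# ...then one sentinel per worker so they all shut down cleanly.
for _ in threads:
    jobs.put(None)
for t in threads:
    t.join()
```

With a real queue server the dispatcher and the workers would be separate processes on separate machines, but the put/get/report shape is the same.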

Upvotes: 0

djungelorm

Reputation: 51

subprocess.Popen should do the trick.

Check out this previous answer: How can I run an external command asynchronously from Python?
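A minimal sketch of the Popen approach, using the host names and frame ranges from the question; "echo" stands in for the real ssh command so it runs locally:

```python
import subprocess

jobs = [("sneffels", "1", "2"), ("bierstadt", "3", "4"), ("diente", "5", "6")]

# Popen returns immediately, so all three commands run concurrently.
# Swap "echo" for "ssh" (plus the blender arguments) for the real thing.
procs = [subprocess.Popen(["echo", host, start, end])
         for host, start, end in jobs]

# Wait for every job to finish and collect the exit codes.
codes = [p.wait() for p in procs]
```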

Upvotes: 0

Savir

Reputation: 18438

You may want to thread your calls. I prepared a little example that just echoes something (you can change it to ssh). I hope it's clear enough for you to get the idea.

#!/usr/bin/env python

import threading
import os
import random

ips = ["192.168.1.25", "192.168.1.26", "192.168.1.27"]

def run(ip, otherParameter):
    os.system("echo %s with other parameter %s" % (ip, otherParameter))

if __name__ == "__main__":
    for ip in ips:
        thread = threading.Thread(target=run, args=[ip, random.randint(0, 10)])
        # start() runs the target in a new thread; calling run() directly
        # would execute it in the main thread, one call at a time.
        thread.start()

Also, instead of os.system, you should take a look at the subprocess module, or even better, for running ssh commands specifically, take a look at the paramiko module.

Upvotes: 0

abarnert

Reputation: 366103

The problem is that os.system doesn't return until the program is done, and ssh isn't done until the command you gave it is done.

This is one of many reasons not to use os.system—as the documentation explicitly says:

The subprocess module provides more powerful facilities for spawning new processes and retrieving their results; using that module is preferable to using this function. See the Replacing Older Functions with the subprocess Module section in the subprocess documentation for some helpful recipes.

In subprocess, you can create a bunch of subprocesses, and then join them all after they've all been kicked off. For example:

import subprocess

p1 = subprocess.Popen("ssh igp@" + str(comp1[0]) + " blender -b "+ str(path) +" -s " + str(comp1[1]) + " -e " + str(comp1[2]) + " -a", shell=True)
p2 = subprocess.Popen("ssh igp@" + str(comp2[0]) + " blender -b "+ str(path) +" -s " + str(comp2[1]) + " -e " + str(comp2[2]) + " -a", shell=True)
p3 = subprocess.Popen("ssh igp@" + str(comp3[0]) + " blender -b "+ str(path) +" -s " + str(comp3[1]) + " -e " + str(comp3[2]) + " -a", shell=True)
p1.wait()
p2.wait()
p3.wait()

That probably isn't the best way to do this. Read the subprocess docs to understand why passing a list of arguments is usually better than shell=True with a single string, and to learn other ways of managing your subprocesses. But meanwhile, this is probably the simplest change from what you already have.
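As a sketch of that list-of-arguments form, looping over the machines instead of repeating the call three times ("echo" is substituted for "ssh" here so it runs without remote machines; the host names and path are the ones from the question):

```python
import subprocess

path = "/home/me"
computers = [("sneffels", "1", "2"), ("bierstadt", "3", "4"), ("diente", "5", "6")]

procs = []
for host, start, end in computers:
    # The list form avoids shell=True and its quoting pitfalls.
    # For the real command, replace "echo" with "ssh" and pass the
    # remote command line as the second argument.
    cmd = ["echo", "igp@" + host, "blender", "-b", path,
           "-s", start, "-e", end, "-a"]
    procs.append(subprocess.Popen(cmd))

exit_codes = [p.wait() for p in procs]
```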

Another alternative is to not shell out to the ssh command in the first place, but instead use something like paramiko to spawn the remote processes from within Python.

Upvotes: 2

JonathanV

Reputation: 2504

You could try using the threading package. A simple example that might help you can be found on the Salty Crane blog. It should allow you to run all your processes at the same time.
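A related standard-library option is concurrent.futures, which hides the thread bookkeeping; a minimal sketch (with "echo" standing in for the ssh command so it runs locally) might look like:

```python
from concurrent.futures import ThreadPoolExecutor
import subprocess

def render(host, start, end):
    # "echo" stands in for the real ssh/blender command line.
    return subprocess.call(["echo", host, start, end])

jobs = [("sneffels", "1", "2"), ("bierstadt", "3", "4"), ("diente", "5", "6")]

# The pool starts all jobs at once; map() gathers their exit codes
# and the with-block waits for everything to finish.
with ThreadPoolExecutor(max_workers=len(jobs)) as pool:
    results = list(pool.map(lambda job: render(*job), jobs))
```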

Upvotes: 0
