Reputation: 86
I'm using an automated SSH script to copy, run, and log hardware tests on a few computers via SSH, and everything works fine except one thing. The test file is supposed to run indefinitely, collecting data every 30 minutes and writing it to a file until killed. For lack of a better example:
NOTE: Neither of these files is the actual code; I don't have it in front of me to copy.
file.py:
#!/usr/bin/env python
import os
import time

idleUsage = []
sleepTime = 1800
while True:
    # mpstat | awk prints the 9th column (%idle); read everything and
    # take the last non-empty line, which holds the data value
    holder = os.popen('mpstat | awk \'{printf("%s\n", $9)}\'').read()
    lines = [l for l in holder.split('\n') if l]
    idleUsage.append(100.0 - float(lines[-1]))
    # open in append mode so earlier samples survive each iteration
    with open("output.log", 'a') as f:
        f.write("%f\n" % idleUsage[-1])
    time.sleep(sleepTime)
automatic-ssh.sh:
#!/bin/bash
autossh uname1 password1 ip1 command <----gets stuck after ssh runs
autossh uname2 password2 ip2 command
autossh uname3 password3 ip3 command
Without fail it gets stuck running the command. I've tried 'command &' as well as putting an ampersand at the end of the entire line, but it still hangs. Anyone out there have some advice?
Upvotes: 0
Views: 934
Reputation: 3867
So, your shell script connects to a remote machine via ssh and runs an endless python command, and you want that ssh connection to go into the background?
#!/bin/sh
ssh thingie 1 > out.1 &
ssh thingie 2 > out.2 &
ssh thingie 3 > out.3 &
wait
That'll kick off three ssh commands in the background, logging to individual files, and then the script will wait until they all exit (wait, if not given a pid as an argument, waits for all children to exit). If you kill the script, the child ssh processes should terminate as well. I'm not sure if that's what you're asking or not, but maybe it helps something? :)
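The background-and-wait pattern can be exercised locally without any remote hosts; the sleep commands below are hypothetical stand-ins for the long-running ssh sessions, but the shape is the same:

```shell
#!/bin/sh
# Three background jobs, each redirected to its own file, then wait
# until every one of them has exited before continuing.
(sleep 0.2; echo first)  > /tmp/demo.1 &
(sleep 0.1; echo second) > /tmp/demo.2 &
(sleep 0.1; echo third)  > /tmp/demo.3 &
wait   # blocks until all background children finish
cat /tmp/demo.1 /tmp/demo.2 /tmp/demo.3
```

If the real issue is that ssh itself hangs because the remote command never returns, detaching the remote side (for example, ssh user@host 'nohup command > remote.log 2>&1 &') lets the ssh session exit while the test keeps running.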
Upvotes: 0
Reputation: 8767
Not sure of your current context but I would recommend using subprocess:
from subprocess import Popen, PIPE

p1 = Popen(["sar"], stdout=PIPE)
p2 = Popen(["grep", "kb"], stdin=p1.stdout, stdout=PIPE)
p1.stdout.close()  # Allow p1 to receive a SIGPIPE if p2 exits.
output = p2.communicate()[0]
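Here's a self-contained sketch of the same Popen pipeline that runs anywhere; printf and a made-up sample string stand in for sar (which may not be installed), but the piping mechanics are identical:

```python
from subprocess import Popen, PIPE

# Hypothetical sar-like output; printf just echoes it for grep to filter.
sample = "kbmemfree 100\ncpu 5\nkbmemused 200\n"
p1 = Popen(["printf", "%s", sample], stdout=PIPE)
p2 = Popen(["grep", "kb"], stdin=p1.stdout, stdout=PIPE)
p1.stdout.close()  # Allow p1 to receive a SIGPIPE if p2 exits.
output = p2.communicate()[0].decode()
print(output)  # only the lines containing "kb" survive the grep
```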
Upvotes: 1