Reputation: 3073
I've been playing with the subprocess
module to iteratively send
each line in an input file to a process created by the following command.
ssh -t -A $host 'remote_command'
The remote_command
reads a line from its STDIN, does some processing on it, and repeats the
cycle until STDIN is closed or reaches EOF.
To achieve this, what I'd been doing was:
process = subprocess.Popen("ssh -t -A $host 'remote_command'",
                           shell=True,
                           stdin=subprocess.PIPE)
for line in file('/tmp/foo'):
    process.stdin.write(line)
    process.stdin.flush()
process.stdin.close()
But what I discovered is that this approach is not robust: it is often the
case that remote_command
finishes prematurely without processing the entire input (though sometimes
the same code succeeds without a problem).
The situation is the same when I employ another, albeit very similar, approach:
process = subprocess.Popen("ssh -t -A $host 'remote_command'",
                           shell=True,
                           stdin=file('/tmp/foo'))
So the question is: in Python, how can I make sure that every line in the input file is sent to, received by, and processed to the end by the remote machine?
Upvotes: 2
Views: 590
Reputation: 28360
I would say your best bet is to use the reply pipe to capture the results of the remote command, and to ensure that you reach a prompt between lines and after the last line. Incidentally, I have sometimes found that a dummy command such as ls -l at the end of a remote session helps to ensure that processing has finished before the connection is dropped.
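A lockstep version of this idea can be sketched as follows. Here cat stands in for the ssh invocation (it echoes each line back, so it behaves like a remote command that replies once per input line); a real remote_command would need to emit one reply line, or a recognizable prompt, per line it processes:

```python
import subprocess

# "cat" stands in for: ssh -t -A $host 'remote_command'
proc = subprocess.Popen(["cat"], stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE)
replies = []
for line in [b"alpha\n", b"beta\n"]:
    proc.stdin.write(line)
    proc.stdin.flush()
    # Block until the other side has replied before sending the next line.
    replies.append(proc.stdout.readline())
proc.stdin.close()
proc.wait()
```

Because each write is followed by a blocking readline, you know every line was actually received and answered before the pipe is closed.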
Upvotes: 0
Reputation: 41940
If this...
process = subprocess.Popen("ssh -t -A $host 'remote_command'",
                           shell=True,
                           stdin=subprocess.PIPE)
for line in file('/tmp/foo'):
    process.stdin.write(line)
    process.stdin.flush()
process.stdin.close()
...is your entire program, it won't (necessarily) work.
Although the final call to process.stdin.close()
will ensure that all the data has been sent to the ssh
process before your program terminates, it won't ensure that the ssh
process has sent all the data across the network, so there may well be some outstanding data for it to send.
Unfortunately, since the ssh
process is a child of your program, when your program terminates the ssh
process will receive a SIGHUP
, which will immediately kill it, potentially before it has finished sending all its data.
As long as the remote_command
terminates when it hits EOF, it's not a problem, and you can either ask the ssh
process to ignore the SIGHUP
, and continue running in the background with...
process = subprocess.Popen("nohup ssh -t -A $host 'remote_command'", ...)
...or ask your program to wait for the ssh
process to finish, by adding...
process.wait()
...to the end of your program.
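Putting that together, the corrected program looks something like this (a sketch, with cat standing in for the ssh command and an in-memory list in place of file('/tmp/foo'), so it can run locally):

```python
import subprocess

# "cat" stands in here for: ssh -t -A $host 'remote_command'
process = subprocess.Popen(["cat"], stdin=subprocess.PIPE,
                           stdout=subprocess.DEVNULL)
for line in ["first\n", "second\n"]:   # in place of file('/tmp/foo')
    process.stdin.write(line.encode())
    process.stdin.flush()
process.stdin.close()
process.wait()   # don't exit until the child has consumed everything and quit
```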
Update
Upon further examination, it looks like a process only gets a SIGHUP
if its controlling tty terminates, not its parent process.
It may be something to do with the -t
option, which allocates a new controlling tty on the remote host, and that tty exiting before the subprocess it spawns has finished.
In which case, you might need...
process = subprocess.Popen("ssh -t -A $host 'nohup remote_command'", ...)
...or try it without the -t
option.
Upvotes: 2
Reputation: 3145
You can't do much more than what you have already done to make sure that all input gets sent to your child process. Your second example is better than the first, in my opinion. What you can do is inspect the return code of your child process:
return_code = process.wait()
Your remote command should probably return 0 on successful completion and something nonzero if an error occurred.
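For example (a sketch, with sh -c 'exit 3' standing in for a remote command that fails):

```python
import subprocess

# A stand-in for the ssh command; here it simply fails with status 3.
p = subprocess.Popen(["sh", "-c", "exit 3"], stdin=subprocess.PIPE)
p.stdin.close()
return_code = p.wait()
if return_code != 0:
    print("remote command failed with status", return_code)
```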
Upvotes: 1
Reputation: 28036
Instead of wrapping around a subprocess, you'd likely be better off using something like paramiko.
But in either case, if your connection gets terminated before you've sent all your data, you can catch that exception, and you'll know that you need to retry. If the process dies prematurely, you should be able to read its exit code.
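With plain subprocess, detecting a consumer that dies mid-send might look like this (a sketch; head -n 1 plays the part of a remote command that exits early, and the same BrokenPipeError is what you'd catch when an ssh connection terminates before all data is written):

```python
import subprocess

def send_lines(cmd, lines):
    """Feed lines to cmd's stdin; report whether all were accepted."""
    proc = subprocess.Popen(cmd, stdin=subprocess.PIPE,
                            stdout=subprocess.DEVNULL)
    sent_all = True
    try:
        for line in lines:
            proc.stdin.write(line)
        proc.stdin.close()
    except BrokenPipeError:
        sent_all = False   # the consumer died before reading everything
    proc.wait()
    return sent_all, proc.returncode

# "head -n 1" exits after one line, so a long feed triggers BrokenPipeError:
ok, status = send_lines(["head", "-n", "1"],
                        (b"line\n" for _ in range(200000)))
```

When sent_all comes back False, you know some input never reached the other side and can decide whether to retry.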
Upvotes: 0