Matei

Reputation: 152

Controlling a "daemon-like" linux script from Python

I am trying to run a "daemon-like" (non-terminating) Linux script from Python and read its output.

I have played around with the subprocess library, but all the functions that handle output are blocking (they wait for the process to terminate). In my particular case, however, the script never terminates.
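For example, something along these lines (the script name is just a placeholder) blocks forever:

import subprocess

# check_output() waits for the child to exit, which never happens here
output = subprocess.check_output(['./script.sh'])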

Is there a way to run the script in the background, kill it after some time, and then obtain its output?

Upvotes: 1

Views: 161

Answers (2)

Matei

Reputation: 152

Unfortunately, disabling buffering in Click was not an option (as far as I could figure out). To address the issue with scripts buffering their output, I opted for:

Writing a wrapper script (which I call finite_script.sh) that runs my program (script.sh) for a finite amount of time and redirects stderr to stdout:

#!/bin/bash
# Run the given command in the background, merging stderr into stdout
"$1" 2>&1 &
# Let it run for a while, then kill the background job by its PID
sleep 5
kill -KILL $!

... and running the (terminating) finite_script.sh script in Python:

import subprocess

p = subprocess.Popen(['finite_script.sh', 'script.sh'], stdout=subprocess.PIPE)
(output, err) = p.communicate()
print(output)

The solution is not really elegant, but it works in my particular setting (where script.sh is actually a Click script).
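For what it's worth, if GNU coreutils' timeout command is available, the wrapper script could probably be replaced by calling timeout directly (a sketch, not tested in my setting):

import subprocess

# Run script.sh for at most 5 seconds, then SIGKILL it; merge stderr into stdout
p = subprocess.Popen(['timeout', '--signal=KILL', '5', './script.sh'],
                     stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
(output, err) = p.communicate()
print(output)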

Thank you for the previous answer!

Upvotes: 0

math

Reputation: 2881

You should look at Popen objects in subprocess; they are entirely appropriate for what you want to do:

import subprocess, time, signal

# Start the script without waiting for it; -u disables Python's output buffering
p = subprocess.Popen(['python', '-u', 'myscript.py'], stdout=subprocess.PIPE)
time.sleep(5)
# Ask the process to terminate, then collect whatever it wrote to stdout
p.send_signal(signal.SIGTERM)
output = p.communicate()[0]

print('Process has exited with code %d' % p.wait())
print('Output is %s' % str(output))

If you send a signal to stop the process, some data may be lost in the process's output buffer. This is why you may want to ensure the script is not buffering its output (use the "-u" option as above if your script is also a Python script).
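If you control the script's source, an alternative to "-u" is to flush stdout explicitly; a minimal sketch of such a script (reusing the myscript.py name from above):

import sys, time

# Emit a line every second and flush immediately so nothing sits in the buffer
while True:
    print('still running')
    sys.stdout.flush()
    time.sleep(1)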

Upvotes: 1
