Reputation: 4317
In Python, what is the simplest way to execute a local Linux command stored in a string, catch any exceptions it raises, and log both the command's output and any caught errors to a common log file?
logfile = "/dev/log"
cmd = "ls"
# try:
#     execute cmd, sending its output to >> logfile
# except:
#     send any caught error to >> logfile
Upvotes: 7
Views: 3668
Reputation: 115001
Using the subprocess module is the correct way to do it:
import subprocess

logfile = open("/dev/log", "w")
# communicate() returns a (stdout, stderr) tuple once the command has finished
output, error = subprocess.Popen(
    ["ls"], stdout=subprocess.PIPE,
    stderr=subprocess.PIPE).communicate()
logfile.write(output)
logfile.close()
EDIT: subprocess expects the command as a list, so to run "ls -l" you need to do this:
output, error = subprocess.Popen(
    ["ls", "-l"], stdout=subprocess.PIPE,
    stderr=subprocess.PIPE).communicate()
To generalize it a little bit:
command = "ls -la"
output, error = subprocess.Popen(
    command.split(' '), stdout=subprocess.PIPE,
    stderr=subprocess.PIPE).communicate()
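Note that splitting on spaces breaks commands whose arguments are quoted or contain spaces themselves; shlex.split understands shell quoting rules. A small sketch of the same call (the quoted directory name is just an illustration):

import shlex
import subprocess

command = 'ls -la "My Documents"'   # hypothetical command with a quoted argument
# shlex.split() yields ['ls', '-la', 'My Documents'], unlike command.split(' ')
output, error = subprocess.Popen(
    shlex.split(command), stdout=subprocess.PIPE,
    stderr=subprocess.PIPE).communicate()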
Alternatively you can do this; the output will go directly to the logfile, so the output variable will be None in this case:
import subprocess

logfile = open("/dev/log", "w")
# with stdout=logfile the output goes straight to the file,
# so communicate() returns None for it
output, error = subprocess.Popen(
    ["ls"], stdout=logfile,
    stderr=subprocess.PIPE).communicate()
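To cover the exception handling the question asks for, you can wrap the call in try/except and append both the command's output and any caught error to the same file. A minimal sketch, assuming an ordinary log file path rather than /dev/log:

import subprocess

logfile = open("logfile.log", "a")   # append mode, so repeated runs share one log
cmd = ["ls", "-l"]
try:
    # universal_newlines=True makes communicate() return text instead of bytes
    process = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                               stderr=subprocess.PIPE,
                               universal_newlines=True)
    output, error = process.communicate()
    logfile.write(output)
    if error:
        logfile.write(error)
except OSError as exc:
    # raised, for example, when the command itself cannot be found
    logfile.write("command failed: %s\n" % exc)
finally:
    logfile.close()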
Upvotes: 16
Reputation: 44749
Check out the commands module.
import commands

f = open('logfile.log', 'w')
try:
    exe = 'ls'
    # getoutput() runs the command in a shell and returns its output as a string
    content = commands.getoutput(exe)
    f.write(content)
except Exception as text:
    # write the error message instead of the command output
    f.write(str(text))
f.close()
Specifying Exception as the exception class after except tells Python to catch any exception that derives from Exception, which covers essentially every error this code can raise.
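Keep in mind that the commands module only exists in Python 2; it was removed in Python 3, where subprocess.getoutput is the closest replacement. A sketch of the same idea for Python 3:

import subprocess

f = open('logfile.log', 'w')
try:
    exe = 'ls'
    # subprocess.getoutput() mirrors commands.getoutput(): it runs the command
    # in a shell and returns its stdout and stderr as a single string
    content = subprocess.getoutput(exe)
    f.write(content)
except Exception as text:
    f.write(str(text))
f.close()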
Upvotes: -3
Reputation: 4468
subprocess is the best module for this.
You can run your commands in different ways: concurrently in the background, or one at a time, waiting for each command to finish. The full docs are well worth reading:
http://docs.python.org/library/subprocess.html
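For example, subprocess.call blocks until the command finishes, while subprocess.Popen returns immediately so you can do other work and collect the result later. A minimal sketch of both, logging to an assumed logfile.log:

import subprocess

log = open('logfile.log', 'a')

# Blocking: call() waits for the command to finish and returns its exit code
subprocess.call(['ls', '-l'], stdout=log, stderr=log)

# Non-blocking: Popen() starts the command and returns right away;
# wait() collects the exit status once you are ready for it
proc = subprocess.Popen(['ls', '-l'], stdout=log, stderr=log)
proc.wait()

log.close()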
Upvotes: 0