Reputation: 319
I am writing a research tool and I have recently switched from using "print" statements to using the logger functionality built into Python. This, I reasoned, would allow me to give the user the option of dumping the output to a file, besides dumping it to the screen.
So far so good. The part of my code that is in Python uses "logger.info" and "logger.error" to dump to both the screen and a file. "logger" is the module-wide logger. This part works like a charm.
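For reference, the logger setup looks roughly like this (a simplified sketch; the handler details and the file name here are illustrative, not my exact code):
import logging

# Module-wide logger that writes to both the screen and a user-chosen file.
logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)
logger.addHandler(logging.StreamHandler())            # to the screen
logger.addHandler(logging.FileHandler('output.log'))  # to the user's file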
However, at several points, I use "subprocess.call" to run an executable through the shell. So, throughout the code, I have lines like
proc = subprocess.call(command)
The output from this command would print to the screen, as always, but it would not dump to the file that the user specified.
One possible option would be to redirect the output to an open file object (f below):
proc = subprocess.call(command, stdout=f, stderr=subprocess.STDOUT)
But that would only dump to the file and not to the screen.
Basically, my question boils down to this: is there a way I can leverage my existing logger, without having to construct another handler for files specifically for subprocess.call? (Perhaps by redirecting output to the logger?) Or is this impossible, given the current setup? If the latter, how can I improve the setup?
(Oh, also, it would be great if the logging were in 'real time', so that messages from the executable are logged as they are received.)
Thanks for any help! :)
Upvotes: 5
Views: 2149
Reputation: 99510
unutbu's comment is good; you should take a look at Lennart's answer.
What you need is something like the functionality of tee, but the subprocess module works at the level of OS handles. That means data written by the subprocess can't be seen by your Python code via, say, a file-like object you write that logs and prints whatever is written to it.
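For example (an illustrative snippet, not from your code), passing an in-memory file-like object fails outright, because subprocess asks it for a real file descriptor rather than calling its write() method:
import io
import subprocess

buf = io.StringIO()
try:
    # subprocess calls buf.fileno() to get an OS handle for the child's stdout
    subprocess.call(['echo', 'hello'], stdout=buf)
except io.UnsupportedOperation as exc:
    print('No OS-level handle available:', exc)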
As well as using Lennart's answer, you can do this sort of thing using a third-party library like sarge (disclosure: I'm its maintainer). It works for more than logging. Suppose you have a program that generates output, such as:
# echotest.py
import time

for i in range(10):
    print('Message %d' % (i + 1))
    time.sleep(0.5)  # pause briefly so output arrives over time, not all at once
and you want to capture it in your script, log it and print it to screen:
# subptest.py
from sarge import capture_stdout
import logging

logging.basicConfig(filename='subptest.log', filemode='w',
                    level=logging.INFO)

# async_ was spelled async in older sarge releases, before 'async'
# became a Python keyword
p = capture_stdout('python echotest.py', async_=True)

while True:
    line = p.stdout.readline()
    line = line.decode('utf-8').strip()  # sarge captures raw bytes
    # depending on how the child process generates output,
    # sometimes you won't see anything for a bit. Hence only print and log
    # if you get something
    if line:
        print(line)
        logging.info(line)
    # Check to see when we can stop - after the child is done.
    # The return code will be set to the value of the child's exit code,
    # so it won't be None any more.
    rc = p.commands[0].process.poll()
    # if no more output and subprocess is done, break
    if not line and rc is not None:
        break
If you run the above script, the following gets printed to the console:
$ python subptest.py
Message 1
Message 2
Message 3
Message 4
Message 5
Message 6
Message 7
Message 8
Message 9
Message 10
And when we check the log file, we see:
$ cat subptest.log
INFO:root:Message 1
INFO:root:Message 2
INFO:root:Message 3
INFO:root:Message 4
INFO:root:Message 5
INFO:root:Message 6
INFO:root:Message 7
INFO:root:Message 8
INFO:root:Message 9
INFO:root:Message 10
Upvotes: 3
Reputation: 366103
Instead of piping stdout to a file, you can pipe it to a PIPE, and then read from that pipe and write to the logger. Something like this:
proc = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
for line in proc.stdout:
    logging.info(line.decode().rstrip())  # the pipe yields bytes with a trailing newline
However, there's an even simpler answer: you have to use a file-like object that is backed by a real file handle, but you can create one on top of a pipe that passes each line it receives to logging. You could write this object yourself, but, as @unutbu says, someone's already done it in this question. So:
with StreamLogger(logging.INFO) as out:
    proc = subprocess.call(command, stdout=out, stderr=subprocess.STDOUT)
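If the linked implementation isn't to hand, the idea behind such a class looks roughly like this (a minimal, hypothetical sketch, not the code from the linked question): expose the write end of a real OS pipe via fileno() for the child to use, and forward whatever arrives on the read end to logging from a background thread.
import logging
import os
import threading

class StreamLogger:
    """Sketch only: gives subprocess a real file descriptor and forwards
    each line written to it into the logging module."""

    def __init__(self, level=logging.INFO):
        self.level = level
        self._read_fd, self._write_fd = os.pipe()
        self._thread = threading.Thread(target=self._pump, daemon=True)

    def _pump(self):
        # Background reader: log each line until the write end is closed.
        with os.fdopen(self._read_fd) as reader:
            for line in reader:
                logging.log(self.level, line.rstrip('\n'))

    def fileno(self):
        # subprocess.call() uses this descriptor as the child's stdout.
        return self._write_fd

    def __enter__(self):
        self._thread.start()
        return self

    def __exit__(self, *exc):
        os.close(self._write_fd)   # the child has exited; signal EOF to the reader
        self._thread.join()
Because the child writes to an OS-level descriptor, this works no matter what the child program is written in; the logging happens in the parent once the lines come back through the pipe.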
Of course you can also temporarily wrap stdout to write to the logger and just pass the output through, e.g., using this confusingly identically-named class:
with StreamLogger('stdout'):
    proc = subprocess.call(command, stderr=subprocess.STDOUT)
Upvotes: 3