Waschbaer

Reputation: 486

Send subprocess.Popen stdout, stderr to logging module

I want to start an application with Python's subprocess.Popen and send the application's stdout and stderr to the logging module, so that every time the application writes a line to stdout or stderr it is logged like logging.info("This is what's in stdout").

Since the application is a daemon, it does not (and should not) terminate.

Is there an easy way to achieve this, or do I have to check the process output periodically from a separate thread?

Upvotes: 4

Views: 6069

Answers (3)

jfs

Reputation: 414149

A thread is a simple and portable way to consume the output (not tested):

#!/usr/bin/env python
import logging
from subprocess import Popen, PIPE, STDOUT
from threading import Thread

def consume_lines(pipe, consume):
    with pipe:
        for line in iter(pipe.readline, b''): #NOTE: workaround read-ahead bug
            consume(line)

logging.basicConfig(level=logging.INFO, format='%(asctime)s %(message)s')
consume = lambda line: logging.info("this is what's in the output: %r", line)

# `command` is the daemon's command line, defined elsewhere (e.g. a list of arguments)
process = Popen(command, stdout=PIPE, stderr=STDOUT, bufsize=1)
Thread(target=consume_lines, args=[process.stdout, consume]).start()
process.wait()

Upvotes: 4

bradrf

Reputation: 51

Here's what I took from j-f-sebastian's answer as a reusable class:

import subprocess
from threading import Thread

class BackgroundPopen(subprocess.Popen):
    @staticmethod
    def prefix_handler(prefix, io):
        return lambda line: io.write(prefix + line)

    @staticmethod
    def _proxy_lines(pipe, handler):
        with pipe:
            for line in pipe:
                handler(line)

    def __init__(self, out_handler, err_handler, *args, **kwargs):
        kwargs['stdout'] = subprocess.PIPE
        kwargs['stderr'] = subprocess.PIPE
        super(BackgroundPopen, self).__init__(*args, **kwargs)
        Thread(target=self._proxy_lines, args=[self.stdout, out_handler]).start()
        Thread(target=self._proxy_lines, args=[self.stderr, err_handler]).start()

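For example, wiring the handlers to the logging module might look like this (a rough sketch, not part of the original answer; 'some_daemon' is a placeholder for the real command):

import logging

logging.basicConfig(level=logging.INFO, format='%(asctime)s %(message)s')

# Each handler receives one line of the child's output and forwards it to logging.
proc = BackgroundPopen(lambda line: logging.info('stdout: %r', line),
                       lambda line: logging.error('stderr: %r', line),
                       ['some_daemon', '--foreground'])
proc.wait()
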
Upvotes: 5

Vinay Sajip

Reputation: 99335

In my experience, a separate thread is the way to go. That's how my sarge library does it - it uses threads under the hood to capture output from the child process. In fact, I generally use two threads (one for stdout and one for stderr) unless I am merging the two output streams in the subprocess.Popen call.

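A minimal sketch of that two-thread approach using only the standard library (this is not sarge's actual internals, and 'some_daemon' is a placeholder command):

import logging
import subprocess
from threading import Thread

logging.basicConfig(level=logging.INFO, format='%(asctime)s %(levelname)s %(message)s')

def pump(pipe, level):
    # Forward each line the child writes on this pipe to the logging module.
    with pipe:
        for line in iter(pipe.readline, b''):
            logging.log(level, '%r', line)

# One thread per stream: stdout logged at INFO, stderr at ERROR.
process = subprocess.Popen(['some_daemon'],
                           stdout=subprocess.PIPE, stderr=subprocess.PIPE)
Thread(target=pump, args=[process.stdout, logging.INFO]).start()
Thread(target=pump, args=[process.stderr, logging.ERROR]).start()
process.wait()
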
Upvotes: 3
