Reputation: 263
I am trying to read from a process that produces long, time-consuming output, and I want to catch its output as it is produced. But something like the following seems to buffer the command's output, so I end up getting all the output lines at once:
import subprocess

p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, bufsize=0)
for line in p.stdout:
    print line
I am trying this on Mac OS X 10.5.
Upvotes: 26
Views: 10142
Reputation: 2743
The file iterator is doing some internal buffering on its own. Try this:
line = p.stdout.readline()
while line:
    print line
    line = p.stdout.readline()
You also need to make sure the process you are running is actually flushing its output buffers frequently.
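For example, here is a complete sketch of that loop, with ping standing in only as an example of a command that produces output slowly (keeping bufsize=0 from the question):

import subprocess

# 'ping -c 5' prints roughly one line per second, so it is easy to see
# whether the output arrives incrementally or all at once.
cmd = 'ping -c 5 127.0.0.1'
p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, bufsize=0)

line = p.stdout.readline()
while line:
    print line.rstrip()
    line = p.stdout.readline()
p.wait()

If the child is a Python script you control, calling sys.stdout.flush() after each print (or running it with python -u) is the simplest way to keep it from holding lines back.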
Upvotes: 31
Reputation: 2378
This was actually a bug that's fixed in Python 2.6: http://bugs.python.org/issue3907
Upvotes: 3
Reputation: 881575
Usually, every program will do more buffering on its input and/or output channels than you appear to desire... unless it's fooled into believing said channel's actually a terminal!
For that "fooling in a good cause" purpose, use pexpect -- it works just fine on a Mac (life is harder on Windows, though there are solutions that might help even there - fortunately we don't need to dwell on those as you use a Mac instead).
Upvotes: 6