Bart B

Reputation: 669

Perl - output from external process directly to stdout (avoid buffering)

I have a Perl script that has to wrap a PHP script that produces a lot of output, and takes about half an hour to run.

At the moment I'm shelling out with:

print `$command`;

This works in the sense that the PHP script is called and it does its job, but no output is rendered by Perl until the PHP script finishes half an hour later.

Is there a way I could shell out so that the output from PHP is printed by Perl as soon as it receives it?

Upvotes: 2

Views: 386

Answers (3)

ephemient

Reputation: 204668

Are you really doing print `$command`?

If you are only running a command and not capturing any of its output, simply use system $command. It will write to stdout directly without passing through Perl.
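
For example, a minimal sketch along those lines (the command string here is only a placeholder, not the asker's actual script):

# system() lets the child inherit Perl's STDOUT/STDERR, so its output
# appears as soon as the child prints it instead of being captured.
my $command = 'php long-running-script.php';   # placeholder

system($command) == 0
    or warn "Command exited with status $?";

Because nothing is captured, there is nothing for Perl to buffer; the PHP process writes straight to the terminal.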

Upvotes: 2

Joel Berger

Reputation: 20280

You might want to investigate Capture::Tiny. IIRC something like this should work:

use strict;
use warnings;

use Capture::Tiny qw/tee/;

my $command = '...';    # the shell command being wrapped

# tee passes the command's output through to STDOUT/STDERR as it runs,
# while also capturing it
my ($stdout, $stderr, @result) = tee { system $command };

Actually, just using system might be good enough, YMMV.

Upvotes: 0

Matthew Walton

Reputation: 9959

The problem is that Perl won't finish reading until the PHP script terminates, and only once it has finished reading will it print anything. The backticks operator blocks until the child process exits; there's no magic that turns it into an incremental read/write loop.

So you need to write one. Try a piped open:

open my $fh, '-|', $command or die "Unable to open: $!";
while (<$fh>) {    # read a line as soon as the PHP script emits one
    print;         # and print it straight away
}
close $fh;

This will read each line as the PHP script writes it and output it immediately. If the PHP script doesn't produce output in convenient lines and you want to handle it in smaller chunks, you'll need to look into using read to pull data from the file handle, and enable autoflush on STDOUT ($| = 1) so what you write isn't buffered either.
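
A rough sketch of that unbuffered variant (the chunk size and command string are illustrative placeholders):

# Read fixed-size chunks rather than whole lines, and autoflush STDOUT
# so each chunk is printed the moment it arrives from the child.
$| = 1;                                        # autoflush STDOUT
my $command = 'php long-running-script.php';   # placeholder
open my $fh, '-|', $command or die "Unable to open: $!";
while (read($fh, my $buf, 1024)) {
    print $buf;
}
close $fh;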

See also http://perldoc.perl.org/perlipc.html#Using-open()-for-IPC

Upvotes: 8
