Reputation: 25029
I'm trying to write a simple PHP script that takes data from stdin, processes it, then writes it to stdout. I know PHP is probably not the best language for this kind of thing, but there is existing functionality that I need.
I've tried
<?php
$file = file_get_contents("php://stdin", "r");
echo $file;
?>
but it doesn't work, and I get no error messages. I'm invoking it like this:
echo -e "\ndata\n" | php script.php | cat
The script I'm trying to build will actually be part of a larger pipeline.
Any clues as to why this is not working?
PS: I'm not very experienced with PHP.
Upvotes: 4
Views: 19954
Reputation: 489
If your script is part of a pipe, you generally don't want to slurp all of the input at once; read and process it one line at a time, as is standard for *nix tools.
The shebang at the top of the file lets you execute the script directly (after chmod +x), instead of having to invoke php on the command line.
Save the following to test.php and run
cat test.php | ./test.php
to see the results.
#!/usr/bin/env php
<?php
$handle = fopen('php://stdin', 'r');
$count = 0;
// fgets() returns false at EOF, so the loop stops cleanly after the last line
while (($buffer = fgets($handle)) !== false) {
    echo $count++, ": ", $buffer;
}
fclose($handle);
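As an aside, the PHP CLI already defines a STDIN constant, so the fopen()/fclose() pair can be dropped entirely; a minimal sketch of the same loop, assuming the CLI SAPI:
#!/usr/bin/env php
<?php
// STDIN is predefined by the PHP CLI, so no fopen()/fclose() is needed.
$count = 0;
while (($buffer = fgets(STDIN)) !== false) {
    echo $count++, ": ", $buffer;
}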
Upvotes: 6
Reputation: 25029
Right, got it working.
<?php
$input_stream = fopen("php://stdin","r");
$text="";
while ($line = fgets($input_stream, 4096)) { // 4096-byte lines should be ok for most purposes
    $text .= $line;
}
fclose($input_stream);
print($text);
?>
from a recipe at PHPBuilder.
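For reference, the same read-everything-then-print behaviour can be written in one call; a minimal sketch, again assuming the CLI SAPI where the STDIN constant is predefined:
<?php
// Slurp all of standard input in one call, then write it back out.
$text = stream_get_contents(STDIN);
print($text);
?>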
Upvotes: 0
Reputation: 10526
To place a PHP script in a pipe you can use:
xargs -d "\n" ./myscript.php --foo
With many input lines, xargs will invoke ./myscript.php several times, but each invocation always gets --foo.
e.g.:
./myscript.php:
#!/usr/bin/env php
<?php
// $argv holds the script name followed by every argument xargs passes in
foreach ($argv as $key => $value) {
    echo "\n" . $key . ":" . $value;
}
?>
head -n 1000 /path/file | xargs -d "\n" ./myscript.php --foo | less
will call the script twice, echoing something like this to stdout/less:
0:./myscript.php
1:--foo
2:[file-line1]
3:[file-line2]
...
800:[file-line799]
0:./myscript.php
1:--foo
2:[file-line800]
...
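If the script should treat only the piped lines as data, one option (a sketch, assuming --foo is always the first argument, as in the xargs call above) is to drop the script name and the flag before looping:
<?php
// $argv[0] is the script name and $argv[1] is the --foo flag from the
// xargs call above; everything after that is one input line per element.
$lines = array_slice($argv, 2);
foreach ($lines as $i => $line) {
    echo "\n" . $i . ":" . $line;
}
?>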
Upvotes: 3