Reputation: 21
I have a PHP script that scrapes data from a website using file_get_contents('http://remote_site.com/page.html'). The only issue I'm running into is that it prints the data only after all the data is scraped and processed. Is there a way to print or echo the data as the script is scraping?
Upvotes: 1
Views: 509
Reputation: 14489
If you want to work with (and flush) the output buffer as you're reading a remote file, I believe you will need to switch from file_get_contents to the f- family of functions (fopen, fgets, etc.) so you can process and flush the output as you scrape. file_get_contents() does not support the offset parameter for remote files, so you have to wait until the file has been read in full before you can handle the result.
You'll have to check that allow_url_fopen
is enabled in your php.ini file, but you should be able to write something like this (modified from the documentation):
$file = fopen("http://www.example.com/", "r");
if (!$file) {
    echo "<p>Unable to open remote file.</p>\n";
    exit;
}
ob_start();
while (!feof($file)) {
    $line = fgets($file, 1024);
    $buffer = $line; // you can manipulate what goes into the buffer here
    echo $buffer;
    ob_flush(); // flush PHP's own output buffer
    flush();    // then flush the web server's buffer to the client
}
fclose($file);
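As a side note, you can check the allow_url_fopen requirement from inside the script rather than by inspecting php.ini manually. A minimal sketch (note that allow_url_fopen cannot be turned on at runtime with ini_set(), so if it's off you'll need to change php.ini or fall back to cURL):

```php
<?php
// Sketch: verify prerequisites before attempting a remote fopen().
if (!ini_get('allow_url_fopen')) {
    die("allow_url_fopen is disabled; enable it in php.ini or use cURL instead.\n");
}

// Nested output buffers can also swallow your flushes: if framework code has
// already started buffers, ob_get_level() will be higher than you expect, and
// each level needs flushing before output reaches the client.
echo "Output buffering nesting level: " . ob_get_level() . "\n";
```

If ob_get_level() reports extra buffers you didn't open, flushing only your own buffer won't push output to the browser, which is a common reason this technique appears not to work.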
You may need to play around with this, as I haven't tested it, but I think that's the approach you'll want to take.
Upvotes: 0