Reputation: 2283
I've written a script in PHP that parses URLs for certain info and echoes that info to the web page. The problem is that it quits after about 200 URLs (I need ~200,000). No error messages or anything. What's happening? Is there a timeout on the server side of things, or is it a browser issue? How should I work around this?
Upvotes: 2
Views: 235
Reputation: 1894
For echoing large amounts of data you can use flush() and ob_flush(); together they push each chunk of output to the browser as soon as it is ready.
<?php
echo "\nStarted\n";
for ($i = 0; $i < 10; $i++) {
    sleep(1);
    // do something to get data
    echo "Data part number $i\n";
    ob_flush(); // flush PHP's own output buffer to the server
    flush();    // ask the server to send it on to the browser
}
echo "Finished\n";
?>
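Note that flushed output can still be held back by other buffers along the way, for example zlib.output_compression in php.ini or compression/buffering in Apache or Nginx, so flushing from PHP alone is not always enough.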
Upvotes: 0
Reputation: 21531
Yes, there is a PHP timeout (max_execution_time), and there can be an Apache one too.
The best thing to do is run the script from the PHP CLI; the CLI has no execution time limit by default, so it won't time out.
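If the script must keep running regardless, here is a minimal sketch of lifting the limit explicitly (the file name urls.txt and the parsing step are assumptions, not from the question):

<?php
// Sketch: remove PHP's max_execution_time cap for a long-running job.
// Note this does not affect an Apache/proxy timeout, so the CLI is
// still the safer route.
set_time_limit(0);
ignore_user_abort(true); // keep running even if the browser disconnects

// urls.txt is a hypothetical input file, one URL per line
$urls = file('urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
foreach ($urls as $url) {
    // ... fetch and parse $url here ...
}
?>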
Doing an operation on 200,000+ URLs in one go sounds like too much. Consider breaking them into smaller jobs, both for performance and so your requests don't look like spam to the target servers.
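A rough sketch of the batching idea (the chunk size of 1,000 and the file names are illustrative assumptions):

<?php
// Split the URL list into smaller jobs that separate CLI runs
// (or cron jobs) can work through independently.
$urls = file('urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

foreach (array_chunk($urls, 1000) as $jobNumber => $chunk) {
    // write each batch of 1,000 URLs to its own job file
    file_put_contents("job_$jobNumber.txt", implode("\n", $chunk));
}
?>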
Upvotes: 3