Reputation: 23
I want to use this example:
https://stackoverflow.com/a/5830599
Pretty much for the very reason mentioned on this page.
I'm trying to serve larger files (100-200 MB in general) and need to 'output' the data in chunks instead of having curl_exec() read the whole response into memory. My web host only allows me 64 MB of memory, so I can't read that much at once.
Any suggestions here? Thanks in advance.
Upvotes: 2
Views: 1507
Reputation: 163488
This is pretty easy. All you need to do is provide cURL with a callback to handle data as it comes in.
// Called by cURL every time a chunk of the response body arrives
function onResponseBodyData($ch, $data)
{
    echo $data;            // pass the chunk straight through to the client
    return strlen($data);  // tell cURL how many bytes were handled
}

curl_setopt($ch, CURLOPT_WRITEFUNCTION, 'onResponseBodyData');
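For context, here is a minimal sketch of how that callback might be wired into a full transfer when proxying a large remote file to the browser. The URL, filename, and headers are placeholders for illustration, not details from the original question:
// Hypothetical upstream file; substitute the real URL you are proxying.
$remoteUrl = 'http://example.com/big-file.zip';

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="big-file.zip"');

$ch = curl_init($remoteUrl);
curl_setopt($ch, CURLOPT_WRITEFUNCTION, 'onResponseBodyData');
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_exec($ch);    // each chunk of the response goes through the callback
curl_close($ch);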
Returning the length of the data from your callback is important; it tells cURL how much of the data you handled. If you return anything other than the length of the data passed in (such as 0), the request is aborted.
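To illustrate that abort behavior (the $maxBytes cap below is an invented example, not something from the question), a callback can deliberately return 0 to stop the transfer once a size limit is exceeded; curl_exec() should then fail with a write error:
$maxBytes = 200 * 1024 * 1024;  // arbitrary 200 MB cap for illustration
$received = 0;

curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $data) use ($maxBytes, &$received) {
    $received += strlen($data);
    if ($received > $maxBytes) {
        return 0;          // not the chunk length, so cURL aborts the request
    }
    echo $data;
    return strlen($data);  // chunk fully handled, keep streaming
});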
Now, make sure you don't have output buffering turned on, and configure your server to not buffer the entire response before sending. It will work out of the box on most configurations.
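If output buffering does happen to be on, one way to clear it from the PHP side before starting the transfer is a sketch like this (assuming no other code depends on those buffers):
// Flush and close any active output buffers so each echo'd chunk is sent
// to the client immediately instead of accumulating in memory.
while (ob_get_level() > 0) {
    ob_end_flush();
}
ob_implicit_flush(true);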
You can find more examples here: http://curl.haxx.se/libcurl/php/examples/callbacks.html
Upvotes: 1