Reputation: 17348
I am in the early stages of building a PHP application, part of which involves using file_get_contents()
to fetch large files from a remote server and transfer them to a user. Let's say, for example, that the file being fetched is 200 MB.
I just want to make sure I know what my options and limitations are before I get much further.
Thank you for your time.
Upvotes: 2
Views: 1788
Reputation: 7426
If not a timeout, you may well run into memory issues, depending on how your PHP is configured. You can adjust many of these settings through code without much difficulty.
http://php.net/manual/en/function.ini-set.php
ini_set('memory_limit', '256M');
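As a minimal sketch of doing this at runtime and checking whether the change actually took effect (the 256M value is arbitrary, and some hosts lock this directive down so it cannot be changed from code):
// ini_set() returns the old value on success, or false if the
// directive could not be changed at runtime.
$old = ini_set('memory_limit', '256M');
if ($old === false) {
    error_log('Could not raise memory_limit; still at ' . ini_get('memory_limit'));
}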
Upvotes: 2
Reputation: 174957
Yes, you can use set_time_limit(0) or the max_execution_time directive to lift the time limit imposed by PHP.
You can also open a stream to the file and transfer it to the user in chunks, so the entire file never has to be held in memory at once. Read about fopen().
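For example, a rough sketch of chunked streaming (the URL and filename below are placeholders, and this assumes allow_url_fopen is enabled in your php.ini):
set_time_limit(0); // lift the execution time limit for this request

$url = 'http://example.com/big-file.zip'; // placeholder remote URL

$in = fopen($url, 'rb');
if ($in === false) {
    http_response_code(502);
    exit('Could not open the remote file.');
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="big-file.zip"');

// Forward the file 8 KB at a time so it never has to fit in memory.
while (!feof($in)) {
    echo fread($in, 8192);
    flush(); // push each chunk to the client immediately
}

fclose($in);
If you prefer, stream_copy_to_stream() with a handle to php://output can replace the read/echo loop.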
Upvotes: 2