kitenski

Reputation: 639

Getting a 2GB file inside PHP?

I need to download a very large file via PHP; the last time I did it manually over HTTP it was 2.2 GB in size and took a few hours to download. I would like to automate the download somehow.

Previously I have used

file_put_contents($filename, file_get_contents($url));

Will this be OK for such a large file? After downloading I will want to untar the file and then analyse the various files inside the tarball.

Upvotes: 1

Views: 510

Answers (2)

Álvaro González

Reputation: 146450

file_get_contents() is handy for small files but it's totally unsuitable for large ones: since it loads the entire file into memory, you'd need about 2 GB of RAM for each script instance!

You should resort to plain old fopen() + fread() instead.
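A minimal sketch of that approach (the URL, destination path, and function name are placeholders, not part of the original answer):

```php
<?php
// Streaming copy with fopen() + fread(): memory use stays at roughly
// $chunkSize bytes no matter how large the remote file is.
function download_stream($url, $filename, $chunkSize = 8192)
{
    $in  = fopen($url, 'rb');      // open the source as a read stream
    $out = fopen($filename, 'wb'); // open the destination for writing
    if ($in === false || $out === false) {
        return false;
    }
    while (!feof($in)) {
        fwrite($out, fread($in, $chunkSize)); // copy one chunk at a time
    }
    fclose($in);
    fclose($out);
    return true;
}

// Example call (hypothetical URL):
// download_stream('http://example.com/data.tar', '/tmp/data.tar');
```

Note that fopen() accepts HTTP URLs only when allow_url_fopen is enabled in php.ini.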

Also, don't rule out a third-party download tool like wget (installed by default on many Linux systems) with a cron task to run it. That's possibly the easiest way to automate a daily download.
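For example, a crontab entry along these lines would fetch the file every night at 03:00 (the URL and path are placeholders):

```
0 3 * * * wget -q -O /path/to/archive.tar http://example.com/archive.tar
```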

Upvotes: 5

Guillaume Lebourgeois

Reputation: 3873

You will have to adapt your php.ini to handle a file that large: raise the memory limit, and make sure the script's maximum execution time allows a download that takes hours.
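If you keep the file_get_contents() approach, the directives involved would look something like this (values are purely illustrative; the memory limit only matters when the whole file passes through memory):

```ini
memory_limit = 2560M
max_execution_time = 0   ; 0 = no time limit (already the default on the CLI)
```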

Upvotes: 0
