Reputation: 338
I am trying to use the function below to zip up a file that is 551 MB, but there is not enough memory for it to run. I have used it to zip up other files and it works fine, so I think it has to do with the size of the file.
function Zip($source, $destination)
{
    global $latest_filename;
    if (!extension_loaded('zip') || !file_exists($source)) {
        return false;
    }
    $zip = new ZipArchive();
    if (!$zip->open($destination, ZIPARCHIVE::CREATE | ZIPARCHIVE::OVERWRITE)) {
        return false;
    }
    $source = str_replace('\\', '/', realpath($source));
    if (is_dir($source) === true) {
        $files = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($source), RecursiveIteratorIterator::SELF_FIRST);
        foreach ($files as $file) {
            $file = str_replace('\\', '/', realpath($file));
            if (is_dir($file) === true) {
                $zip->addEmptyDir(str_replace($source . '/', '', $file . '/'));
            } else if (is_file($file) === true) {
                $zip->addFromString(str_replace($source . '/', '', $file), file_get_contents($file));
            }
        }
    } else if (is_file($source) === true) {
        $zip->addFromString(basename($source), file_get_contents($source));
    }
    return $zip->close();
}
Here is the error I receive:
Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 577311064 bytes)
Thanks for any help with this.
Upvotes: 4
Views: 6501
Reputation: 338
Thank you for all of your responses to this question. What I decided to do was to create a new zipping function that does not use file_get_contents(), as that was what was using up all of the memory.
Here is my new function:
function zip2($zipname)
{
    $zip = new ZipArchive();
    $zip->open($zipname . ".zip", ZipArchive::CREATE);
    $files = scandir($zipname);
    unset($files[0], $files[1]); // drop the "." and ".." entries returned by scandir()
    foreach ($files as $file) {
        $zip->addFile($zipname . "/{$file}");
    }
    $zip->close();
}
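As a rough usage sketch (the directory name here is hypothetical and doubles as the archive name), this would archive the files in ./backup into ./backup.zip, with the entries keeping the backup/ prefix:
    // Hypothetical usage: archives the files in ./backup into ./backup.zip
    zip2("backup");
The key difference from the original function is that addFile() only registers the path; the file data is streamed from disk when close() writes the archive, so the 551 MB file never has to be held in memory. Note that, unlike the original Zip(), this version only picks up files in the top level of the directory.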
Thanks
Colin
Upvotes: 8
Reputation: 48387
Upping your memory limit can sometimes be the right way to solve a problem, but it's not going to scale. Certainly you should not be changing the memory limit in php.ini to solve a problem for a single script!
If you're already at a 512 MB limit then you're near the limits of what the system is capable of providing.
Looking at your script, there's nothing obviously wrong with your approach - presumably either the zip file is being built in memory or something is leaking. It would be fairly easy to test which is the case. A leak might be fixed by upgrading PHP, but it might not.
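One rough way to check, sketched below, is to log memory_get_usage() inside the loop of your Zip() function and watch whether the figure climbs in step with the bytes added:
    // Sketch: drop this at the end of the foreach loop in Zip() to watch memory grow.
    // If usage climbs roughly in step with the data added, the archive is being
    // assembled in memory rather than streamed to disk.
    error_log(sprintf(
        "after %s: %.1f MB used, %.1f MB peak",
        $file,
        memory_get_usage(true) / 1048576,
        memory_get_peak_usage(true) / 1048576
    ));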
The quickest route to a solution is to replace the code with:
function Zip($source, $destination)
{
if (!is_readable($source) || ! is_writeable(dirname($dest)) ||
(file_exists($dest) && !is_file($dest))) {
// really you should capture some more specific information
// in your excaption handling
return false;
}
$output='';
$returnv=true;
exec("zip -r $destination $source", $output, $returnv);
return !$returnv;
}
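This assumes the zip command-line tool is installed and on the web server's PATH; because the archive is written by a separate process, PHP's memory_limit no longer constrains how large the source file can be. A hypothetical call (the paths are placeholders) might look like:
    // Hypothetical paths - adjust to wherever the 551 MB file actually lives.
    if (Zip('/var/www/exports/large-dump.sql', '/var/www/exports/large-dump.zip')) {
        echo "Archive created";
    }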
Upvotes: 2
Reputation: 4889
If you can't increase the RAM allocated to the script, you may want to see whether less memory is used if you call exec() and run the operating system's zip tool directly. Still, I wouldn't expect to see less memory used than the zip file's size. You could also look into adding the contents in multiple batches, reading the source directory and dividing the files into chunks that match your available memory.
Upvotes: 0
Reputation: 757
Gee, I wonder how you figured it had to do with the file size. :^)
Now seriously, php.ini has a directive (memory_limit) you can modify to allow more memory to the process. Just open it, search for the current 512M value, and increase it.
If you don't have admin access, try slipping ini_set('memory_limit', '1024M'); into your code, but no promises.
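As a rough sketch of where that override would go, assuming the host allows memory_limit to be changed at runtime and using the Zip() function from the question (the paths are placeholders):
    // Hypothetical sketch: raise the limit for this script only, before the heavy work starts.
    ini_set('memory_limit', '1024M');
    Zip('/path/to/source', '/path/to/archive.zip');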
Upvotes: -1