Reputation: 515
100 MB file --> 10 ZIP calls (10 MB zipped per call) --> 1 ZIP file
I need to initiate 10 calls to add a single 100 MB file to a ZIP file (say, zipping 10 MB per call).
The problem is that our system has memory and time limits (it will not process more than 10 to 15 MB per call).
So zipping a big file across many calls is the basic idea.
I am ready to provide more data if required.
Upvotes: 4
Views: 2561
Reputation: 1214
Have you ever tried out PECL Zip before?
I just zipped two files with the following code without any memory limit problems. The time limit may need to be reset. My environment: memory_limit of 3 MB and max_execution_time of 20 seconds.
<?php
set_time_limit(0);
$zip = new ZipArchive();
$zip->open('./test.zip', ZipArchive::CREATE);
$zip->addFile('./testa'); // 1.3 GB
$zip->addFile('./testb'); // 700mb
$zip->close();
Note: set_time_limit() will not work on PHP < 5.4 with safe_mode=on
Another approach could be to create the zip in a background process. This avoids possible memory_limit issues.
Here is an example: http://pastebin.com/7jBaPedb
Usage:
try {
    $t = new zip('/bin', '/tmp/test.zip');
    $t->zip();
    if ($t->waitForFinish(100)) {
        echo "Success :)";
    } else {
        echo $t->getOutput();
    }
} catch (Exception $e) {
    echo $e->getMessage();
}
Instead of waiting until the process has ended, you could write the PID to a database and serve the file once it has finished...
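A minimal sketch of that PID idea, assuming a Unix-like host with the posix extension loaded and a `zip` binary on the PATH (the function names here are made up for illustration):

```php
<?php
// Hypothetical sketch: start zipping in the background, keep the PID,
// and on a later request test whether the process is still alive.
// Requires a Unix-like OS, the posix extension, and a `zip` binary.

function startBackgroundZip(string $source, string $target): int {
    // `echo $!` prints the PID of the command that was just backgrounded
    $cmd = sprintf('zip -r %s %s > /dev/null 2>&1 & echo $!',
                   escapeshellarg($target), escapeshellarg($source));
    return (int) trim(shell_exec($cmd));
}

function isRunning(int $pid): bool {
    // signal 0 performs no action; it only checks that the PID exists
    return posix_kill($pid, 0);
}

// First request: $pid = startBackgroundZip('/bin', '/tmp/test.zip');
// store $pid in the database, and on later requests serve the zip
// once isRunning($pid) returns false.
```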
Upvotes: 3
Reputation: 14173
Reading your question, I first started to create a chunked zip packer to do just what you asked. It would generate an array of links to a web page, which you had to open in sequence to create a zip file. While the idea worked, I quickly realised it's not really needed.
A memory limit is only a problem when the packer tries to open the entire file at once and then zip it. Luckily, a few smart people have already figured out that it's easier to do it in chunks.
Asbjorn Grandt is one of those people: he created this zip class, which is very easy to use and does what you need.
First I created a very large file: 500 MB, filled with various letters. It is far too big to handle at once, which results in fatal memory limit errors.
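The chunked idea can be sketched with PHP 7's incremental deflate API. This produces gzip rather than the zip format, but it illustrates why only one chunk ever needs to fit in memory; the file names are placeholders for the sketch:

```php
<?php
// Sketch of chunk-wise compression: only one fixed-size chunk is ever
// held in memory, regardless of how large the input file is.
// Create a small sample input so the sketch is self-contained.
file_put_contents('sample.txt', str_repeat('abc', 200000));

$in  = fopen('sample.txt', 'rb');
$out = fopen('sample.txt.gz', 'wb');
$ctx = deflate_init(ZLIB_ENCODING_GZIP);

while (!feof($in)) {
    $data = fread($in, 1024 * 1024); // 1 MB at a time
    fwrite($out, deflate_add($ctx, $data, ZLIB_NO_FLUSH));
}
fwrite($out, deflate_add($ctx, '', ZLIB_FINISH)); // flush the stream
fclose($in);
fclose($out);
```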
<?php
// create a 500 MB test file, one 1 MB block of a random letter at a time
$fh = fopen("largefile.txt", 'wb');
$size = 500;
while ($size--) {
    $l = chr(rand(97, 122));
    fwrite($fh, str_repeat($l, 1024 * 1024));
}
fclose($fh);
?>
And to use the zip class we would do:
<?php
include('zip.php');
$zip = new Zip();
$zip->setZipFile("largefile.zip");
// the first argument is the name as it will appear inside the zip and
// the second is the file location; here they are the same, but you
// could easily rename the file inside the zip
$zip->addLargeFile("largefile.txt", "largefile.txt");
$zip->finalize();
?>
Now the large zip is created in just a few seconds on my server, and the result is a 550 KB file.
Now, if for some weird reason you still need to do this across several web requests, let me know. I still have the code I started with to do just that.
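If it does come to that, a multi-request approach could, hypothetically, persist a byte offset between calls so each request stays under the memory budget. The function and file names below are invented for the sketch:

```php
<?php
// Hypothetical sketch: each web request processes one chunk of the
// input and records how far it got, so the next request can resume.
function processNextChunk(string $input, string $stateFile, int $chunkSize): bool {
    $offset = is_file($stateFile) ? (int) file_get_contents($stateFile) : 0;

    $in = fopen($input, 'rb');
    fseek($in, $offset);
    $chunk = fread($in, $chunkSize);
    $done  = feof($in);
    fclose($in);

    // ...hand $chunk to the compressor here...

    if ($done) {
        unlink($stateFile);  // job finished, clean up the state
        return true;
    }
    file_put_contents($stateFile, $offset + strlen($chunk));
    return false;            // call again for the next chunk
}
```

Each call stays well within the 10-15 MB budget; a cron job or the client simply keeps calling until the function reports it is done.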
Upvotes: 2