frustratedtech

Reputation: 433

php memory limit and reading/writing temp files

Using the function below, I am pulling rows from tables, base64-encoding the values, and writing them out in CSV format. I am wondering if there is an easier way to prevent high memory usage; I don't want to have to rely on ini_set. I believe the memory consumption comes from reading the temp file back in and gzipping it. I'd love to be able to work within a 64 MB memory limit. Any ideas? Thanks!

function exportcsv($tables) {
    foreach ($tables as $k => $v) {
        // Buffer the CSV in a temporary stream (held in memory, spilling to disk past 2 MB)
        $fh = fopen("php://temp", 'w');
        $sql = mysql_query("SELECT * FROM $v");
        while ($row = mysql_fetch_row($sql)) {
            $line = array();
            foreach ($row as $key => $vv) {
                $line[] = base64_encode($vv);
            }
            fputcsv($fh, $line, chr(9));
        }
        rewind($fh);
        // Read the entire buffer back into a string, then gzip that string in memory
        $data = stream_get_contents($fh);
        $gzdata = gzencode($data, 6);
        $fp = fopen('sql/'.$v.'.csv.gz', 'w');
        fwrite($fp, $gzdata);
        fclose($fp);
        fclose($fh);
    }
}

Upvotes: 2

Views: 3039

Answers (4)

goat

Reputation: 31813

Untested, but hopefully you get the idea:

function exportcsv($tables) {
    foreach ($tables as $k => $v) {
        $fh = fopen('compress.zlib://sql/' .$v. '.csv.gz', 'w');
        $sql = mysql_unbuffered_query("SELECT * FROM $v");
        while ($row = mysql_fetch_row($sql)) {
            fputcsv($fh, array_map('base64_encode', $row), chr(9));
        }
        fclose($fh);
        mysql_free_result($sql);
    }
}

Edit: the points of interest are the use of mysql_unbuffered_query() and PHP's compression stream wrapper. Regular mysql_query() buffers the entire result set into memory, and writing through the compress.zlib:// stream removes the need to buffer the data yet again as a string in PHP memory before writing it to a file.
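
To see the stream wrapper in isolation: anything written through a compress.zlib:// handle is gzipped on the fly, so a full uncompressed copy of the file never exists in PHP memory. A minimal sketch (the output path is just an example):

// Output is compressed incrementally as it is written;
// nothing is buffered as one large string first
$fh = fopen('compress.zlib://sql/example.csv.gz', 'w');
fwrite($fh, "first line\n");
fwrite($fh, "second line\n");
fclose($fh);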

Upvotes: 2

nyxthulhu

Reputation: 9752

You could insert some flushing there. At the moment the entire output is held in memory and only written out at the end, but you can flush it manually as you go with

fflush($fh);
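
For example, flushing every so many rows inside the original loop (the counter and the interval of 1000 rows are arbitrary choices):

$i = 0;
while ($row = mysql_fetch_row($sql)) {
    $line = array();
    foreach ($row as $key => $vv) {
        $line[] = base64_encode($vv);
    }
    fputcsv($fh, $line, chr(9));

    // Push buffered output out of PHP's stream buffer every 1000 rows
    if (++$i % 1000 == 0) {
        fflush($fh);
    }
}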

Also, instead of gzipping the entire file at the end, you could gzip it line by line using

$gz = gzopen('sql/' . $v . '.csv.gz', 'w9');
gzwrite($gz, $content);
gzclose($gz);

(Note that gzopen() takes a filename rather than an existing stream handle.) This writes packed data line by line instead of building the entire file and then gzipping it.
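
Applied to the original function, that might look roughly like the untested sketch below; since fputcsv() needs a regular stream rather than a gzip handle, the tab-delimited line is built by hand (safe here because base64 output never contains tabs or newlines):

function exportcsv($tables) {
    foreach ($tables as $k => $v) {
        // Open the gzip output directly; each row is compressed as it is written
        $gz = gzopen('sql/' . $v . '.csv.gz', 'w9');
        $sql = mysql_query("SELECT * FROM $v");
        while ($row = mysql_fetch_row($sql)) {
            // Build the tab-delimited line manually and append a newline
            gzwrite($gz, implode(chr(9), array_map('base64_encode', $row)) . "\n");
        }
        gzclose($gz);
    }
}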

Upvotes: 1

ben

Reputation: 1936

I found this suggestion for compressing in chunks on http://johnibanez.com/node/21

It looks like it wouldn't be hard to modify for your purposes.

function gzcompressfile($source, $level = false) {
    $dest = $source . '.gz';
    $mode = 'wb' . $level;
    $error = false;

    if ($fp_out = gzopen($dest, $mode)) {
        if ($fp_in = fopen($source, 'rb')) {
            // Compress in 512 KB chunks so the whole file is never held in memory
            while (!feof($fp_in)) {
                gzwrite($fp_out, fread($fp_in, 1024 * 512));
            }
            fclose($fp_in);
        } else {
            $error = true;
        }
        gzclose($fp_out);
    } else {
        $error = true;
    }

    if ($error) {
        return false;
    } else {
        return $dest;
    }
}
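
In the context of the original function, one way to use it (a sketch only) is to write the plain CSV straight to disk instead of php://temp, compress it in chunks, and then remove the uncompressed copy:

// Write the uncompressed CSV straight to disk
$csvfile = 'sql/' . $v . '.csv';
$fh = fopen($csvfile, 'w');
// ... row loop with fputcsv() as in the question ...
fclose($fh);

// Compress in 512 KB chunks, then drop the uncompressed copy
if (gzcompressfile($csvfile)) {
    unlink($csvfile);
}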

Upvotes: 0

Marc B

Reputation: 360672

Pulling the whole file into memory via stream_get_contents() is probably what's killing you. Not only are you having to hold the base64 data (which is usually about 33% larger than its raw content), you've got the CSV overhead to deal with as well. If memory is a problem, consider simply calling a command-line gzip app instead of gzipping inside of PHP, something like:

... database loop here ...
exec('gzip yourfile.csv');
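
Fleshed out a little, inside the per-table loop from the question, that could look like the sketch below; it assumes a gzip binary is available on the PATH, uses escapeshellarg() to guard the filename, and relies on gzip's default behaviour of replacing the .csv with a .csv.gz:

$csvfile = 'sql/' . $v . '.csv';
$fh = fopen($csvfile, 'w');
while ($row = mysql_fetch_row($sql)) {
    fputcsv($fh, array_map('base64_encode', $row), chr(9));
}
fclose($fh);

// Hand compression off to the external gzip utility
exec('gzip ' . escapeshellarg($csvfile));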

And you can probably optimize things a little bit better inside the DB loop, and encode in-place, rather than building a new array for each row:

while ($row = mysql_fetch_row($result)) {
    foreach ($row as $key => $val) {
        $row[$key] = base64_encode($val);
    }
    fputcsv($fh, $row, chr(9));
}

Not that this will reduce memory usage much - it's only a single row of data, so unless you're dealing with huge record fields, it won't have much effect.

Upvotes: 2
