CairoCoder

Reputation: 3207

Handling fetchAll from a huge MySQL query without a memory limit error

I am trying to fetch a huge amount of data from one MySQL table to export it as an XLSX file.

I used the fetchAll() function, but I got:

Fatal error: Out of memory

Here's my code:

<?php
require_once 'classes/Spout/Autoloader/autoload.php';
use Box\Spout\Writer\WriterFactory;
use Box\Spout\Common\Type;

$query  = "SELECT *
           FROM full_report";

$header = array('DATA','STORE','FROM','TO','DATE','YEAR','MONTH','ITEM','SIZE','DEPT','SUBDEPT','DESC1','DESC2','GENDER','ATTR','VEND','SEASON','INVO#','TRANS#','QTY','MSRP','RTP','COST','T.RTP','T.COST','PAYMENT','STATUS');

$mDb->query($query);
$result = $mDb->fetchAll(); // Here where I get the error!

$fileName = "fullReport-" . date('m-d-Y-H-i-s') . ".xlsx";
$path     = "_uploads/" . $fileName;

$writer = WriterFactory::create(Type::XLSX); // for XLSX files
$writer->openToFile($path); // write data to a file or to a PHP stream
$writer->openToBrowser($path); // stream data directly to the browser
$writer->addRow($header);

$valuex = array();
foreach ($result as $value)
{
    unset($value['id']);
    unset($value[0]);
    $valuex[] = array_values($value);
}
$writer->addRows($valuex);
$writer->close();

Any suggestions?

Upvotes: 1

Views: 1892

Answers (2)

user7081607

Reputation:

I suggest you use SELECT * FROM your_table_name INTO OUTFILE 'folder path / yourfilename.extension'; customise the query to your requirements. Just study MySQL's INTO OUTFILE feature; for huge amounts of data it is the best solution. Make sure the target folder has permission 777 and that no file with that name already exists, because the query creates the file itself.
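
A rough sketch of that approach run from PHP, assuming a plain PDO connection (the $pdo object, the credentials, the /tmp/full_report.csv path and the delimiter options are placeholders for illustration; the MySQL account also needs the FILE privilege, and secure_file_priv must allow writing to that directory):

<?php
// Hypothetical PDO connection -- the question uses its own $mDb wrapper instead.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'pass');

// The file is written by the MySQL server itself, so no rows ever pass through PHP.
// The target file must not already exist, or the query fails.
$sql = "SELECT *
        INTO OUTFILE '/tmp/full_report.csv'
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'
        FROM full_report";

$pdo->exec($sql);

Note that INTO OUTFILE produces a CSV/text file on the database server host, not an XLSX file on the web server, so it only fits if that trade-off is acceptable.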

Upvotes: 0

Adrien

Reputation: 1947

fetchAll is the problem. It fetches all the matching rows from the table and loads everything into memory. That works when there aren't too many rows to fetch, but it causes Out of Memory errors when the rows to be stored exceed the available memory.

To fix this problem, you should fetch your data in chunks. You can use the fetch method instead, together with a cursor; both are well documented in the PHP manual. You can also take a look at this repo: https://github.com/adrilo/spout-pdo-example. It gives an example of using MySQL and Spout together in a scalable way.
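
A minimal sketch of one way to do the row-by-row fetching with PDO and Spout (the $pdo connection is a stand-in for whatever $mDb wraps, and MYSQL_ATTR_USE_BUFFERED_QUERY is specific to the pdo_mysql driver; the linked repo shows its own, more complete version):

<?php
require_once 'classes/Spout/Autoloader/autoload.php';
use Box\Spout\Writer\WriterFactory;
use Box\Spout\Common\Type;

// Hypothetical PDO connection -- adapt to your own credentials.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'pass');
// Tell the mysql driver not to buffer the whole result set in PHP memory.
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$stmt = $pdo->query("SELECT * FROM full_report");

$header = array('DATA','STORE','FROM','TO','DATE','YEAR','MONTH','ITEM','SIZE','DEPT','SUBDEPT','DESC1','DESC2','GENDER','ATTR','VEND','SEASON','INVO#','TRANS#','QTY','MSRP','RTP','COST','T.RTP','T.COST','PAYMENT','STATUS');

$writer = WriterFactory::create(Type::XLSX);
$writer->openToBrowser("fullReport-" . date('m-d-Y-H-i-s') . ".xlsx");
$writer->addRow($header);

// fetch() pulls one row at a time, so only the current row is held in memory.
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    unset($row['id']);
    $writer->addRow(array_values($row));
}

$writer->close();

Because Spout streams each row to the output as it is added, neither the result set nor the spreadsheet ever needs to fit in memory at once.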

Upvotes: 2
