David Fritsch

Reputation: 3751

Output 1,000s of records to text file

So I was hoping to be able to get by with a simple solution to read records from a database and save them to a text file that the user downloads. I have been doing this on the fly and for under 20,000 records, this works great. Over 20,000 records and I'm loading too much data into memory and PHP hits a fatal error.

My thought was to just grab everything in chunks. So I grab XX number of rows and echo them to the file and then loop to get the next XX rows until I'm done.

I am just echoing the results right now though, not building the file and then sending it for download, which I'm guessing I'll have to do.

Succinctly, the issue at this point is: with up to 20,000 rows, the file builds and downloads perfectly. With more than that, I get an empty file.

The code:

header('Content-type: application/txt');
header('Content-Disposition: attachment; filename="export.'.$file_type.'"');
header('Expires: 0');
header('Cache-Control: must-revalidate');

// I do other things to check for records before, hence the do-while loop
$this->items = $model->getItems();

do {
    foreach ($this->items as $k => $item) {
        $i=0;
        $tables = count($this->data['column']);
        foreach ($this->data['column'] as $table => $fields) {
            $columns = count($fields);
            $j = 0;
            foreach ($fields as $field => $junk) {
                if ($quote_output) {
                    echo '"'.ucwords(str_replace(array('"'), array('\"'), $item->$field)).'"';
                } else {
                    echo ''.$item->$field.'';
                }
                $j++;
                if ($j<$columns) {
                    echo $delim;
                }
            }
            $i++;
            if ($i<$tables) {
                echo $delim;
            }
        }
        echo "\n";
    }
} while($this->items = $this->_model->getItems());

Upvotes: 0

Views: 331

Answers (3)

user1864610

Reputation:

Actually, this might be a simple fix. If PHP is running out of memory, it's probably because the output buffer is filling up before the file is sent. If so, simply call flush() at regular intervals.

This will flush after each line:

do {
    foreach (...) {
        // assemble your output line here
        echo "\n";
        flush();
    }
} while ($this->items = $this->_model->getItems());

Flushing after each line might prove too slow, in which case add a counter and flush after every hundred, or whatever works best.
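A rough sketch of the counter variant, mirroring the question's loop structure ($this->items and $this->_model are the question's own names; the batch size of 100 is arbitrary):

```php
<?php
$count = 0;
do {
    foreach ($this->items as $k => $item) {
        // assemble and echo the output line here, then:
        echo "\n";
        if (++$count % 100 === 0) {
            flush(); // push buffered output to the client every 100 lines
        }
    }
} while ($this->items = $this->_model->getItems());
flush(); // send whatever remains after the last partial batch
```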

Upvotes: 0

Alexis Wilke

Reputation: 20828

Very large tables won't work that way.

You have to output the data as you read it from the database. If you need the data sorted, use the database's ORDER BY for that purpose.

So more or less

// assuming you use a var such as $query to handle the DB
while(!$query->eof())
{
  $fields = $query->read_next();
  echo $fields; // with your formatting, maybe call a function...
}

The empty result is normal: if memory is exhausted before any echo happens, nothing gets sent to the browser.

Note also that PHP has a time limit (a watchdog) that you may need to tweak. The default is defined in your php.ini. You can set it to zero if you expect the tables to grow very large.
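For example, the limit can be lifted for this one script rather than globally in php.ini (set_time_limit(0) is equivalent to max_execution_time = 0, i.e. no limit):

```php
<?php
// Remove the execution time limit for this script only,
// so a long-running export isn't killed by the watchdog.
set_time_limit(0);
```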

Upvotes: 1

Rafa Jaques

Reputation: 135

You should replace your str_replace() with addslashes(). This will probably free some memory.
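A quick illustration. Note that addslashes() escapes slightly more than the question's str_replace() call: double quotes, single quotes, backslashes, and NUL bytes.

```php
<?php
$value = 'He said "hi"';
echo addslashes($value); // He said \"hi\"
```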

Then I suggest saving to a file, using PHP's file functions to do so: fopen() or file_put_contents().
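A rough sketch of that approach, following the question's loop structure ($this->items, $this->_model, and the export.csv file name are placeholders from the question; $line stands in for the assembled row):

```php
<?php
// Open once, write each chunk as it is fetched, close when done.
$fh = fopen('export.csv', 'w');
if ($fh === false) {
    die('Unable to open export file');
}
do {
    foreach ($this->items as $item) {
        // build $line exactly as in the question's inner loops, then:
        fwrite($fh, $line . "\n");
    }
} while ($this->items = $this->_model->getItems());
fclose($fh);
// Afterwards, send the finished file to the browser, e.g. with readfile('export.csv').
```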

I hope that helps!

Upvotes: 0
