Happy Coder

Reputation: 4682

Write a large amount of data to a CSV file using PHP

I need to write 109,392 rows to a CSV file, and currently I am doing it like this:

    //open raw memory as file, no need for temp files
    //(w+ so the stream can be read back for fpassthru later)
    $tempMemory = fopen('php://temp', 'w+');

    //default php csv handler
    foreach ($inputArray as $line) {
        fputcsv($tempMemory, $line, $delimiter);
    }

    //rewind the "file" with the csv lines
    rewind($tempMemory);

    //modify headers to mark the response as a downloadable csv file
    header('Content-Type: text/csv');
    header('Content-Disposition: attachment; filename="' . $outputFileName . '"');

    //send file to browser for download
    fpassthru($tempMemory);

The problem is that it is very slow and not all of the rows are written: $inputArray contains 109,392 entries, but only about 50,000 rows end up in the CSV file that is produced. What is the best way to do this?

Upvotes: 2

Views: 6683

Answers (4)

Danny_ds

Reputation: 11406

As it stands, you are generating (or trying to generate) the whole file in memory first, and only then starting to send the data to the browser.

Since the resulting in-memory file could be large, you should at least check the return value of fputcsv(). It may start returning FALSE at some point, which could be why you are not getting all the rows.
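
For example, a minimal version of that check (reusing the variables from the question) might look like this:

    //stop as soon as fputcsv() reports a failed write,
    //instead of silently losing the remaining rows
    foreach ($inputArray as $line) {
        if (fputcsv($tempMemory, $line, $delimiter) === false) {
            error_log('fputcsv() failed, csv output is incomplete');
            break;
        }
    }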

Anyway, this is not the right way to do this (just imagine 10 users sending the same request to your server at the same time).

I don't know PHP specifically, but the ideal way to generate such a file on demand and send it would be:

  • First send the HTTP headers, including one for chunked transfer encoding. That way you don't need to know the size of the file up front.

  • While there is data left in $inputArray, fill a buffer with rows (say 64 KB at most) and send that buffer to the browser as the next chunk. You could optionally compress the data first.

That way the browser will start receiving the data immediately, and you won't need much more than 64 KB of memory on the server to generate the data.
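
In PHP, a minimal sketch of this streaming approach could look like the following (assuming the same $inputArray, $delimiter and $outputFileName as in the question; PHP and the web server typically use chunked transfer encoding on their own when no Content-Length header is set):

    header('Content-Type: text/csv');
    header('Content-Disposition: attachment; filename="' . $outputFileName . '"');

    //write straight to the output stream instead of building the file in memory
    $out = fopen('php://output', 'w');

    $count = 0;
    foreach ($inputArray as $line) {
        if (fputcsv($out, $line, $delimiter) === false) {
            break; //write failed, e.g. the client disconnected
        }
        //flush periodically so the browser starts receiving data right away
        //and server memory stays bounded (if output buffering is enabled,
        //ob_flush() may be needed as well)
        if (++$count % 1000 === 0) {
            flush();
        }
    }

    fclose($out);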

Upvotes: 3

Danack

Reputation: 25701

You can write directly to the output stream, rather than writing to a 'temp' memory file first.

    //modify headers to mark the response as a downloadable csv file
    header('Content-Type: text/csv');
    header('Content-Disposition: attachment; filename="' . $outputFileName . '"');

    //STDOUT is only defined under the CLI SAPI; in a web context,
    //open the output stream explicitly
    $stdout = fopen('php://output', 'w');

    foreach ($inputArray as $line) {
        fputcsv($stdout, $line, $delimiter);
    }

This should be a lot quicker. However, 109,392 rows is still a lot of rows; generating this much data would be much better done as a background task that writes the whole file, after which the generated file can be served from disk.
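
A rough sketch of that two-step approach, where the export path and filename are assumptions:

    //background job: generate the CSV on disk once
    $path = '/var/www/exports/report.csv'; //assumed export location

    $file = fopen($path, 'w');
    foreach ($inputArray as $line) {
        fputcsv($file, $line, $delimiter);
    }
    fclose($file);

    //later, in the download script: serve the finished file
    header('Content-Type: text/csv');
    header('Content-Disposition: attachment; filename="report.csv"');
    header('Content-Length: ' . filesize($path));
    readfile($path);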

Upvotes: 1

H.Dave

Reputation: 21

Try generating the CSV file directly from MySQL and storing it on the server (note that INTO OUTFILE writes the file on the host where the MySQL server runs), then let the user download it. For example:

    SELECT order_id, product_name, qty
    FROM orders
    INTO OUTFILE '/tmp/orders.csv'
    FIELDS TERMINATED BY ','
    ENCLOSED BY '"'
    LINES TERMINATED BY '\n';

Upvotes: 1

Armen

Reputation: 4202

My suggestion is not to write to PHP memory directly, but instead to write to a file in a public directory in your project root. After all the rows have been written, redirect the request to that file's URL, e.g. domain.com/download_files/generated.csv. That should also solve the issue with the missing lines.
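
A minimal sketch of this approach, where the download_files directory is assumed to exist, be writable, and be publicly served:

    //write the CSV to a publicly served directory
    $publicPath = __DIR__ . '/download_files/generated.csv';

    $file = fopen($publicPath, 'w');
    foreach ($inputArray as $line) {
        fputcsv($file, $line, $delimiter);
    }
    fclose($file);

    //redirect the browser to the finished file
    header('Location: /download_files/generated.csv');
    exit;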

Upvotes: 1
