PL Doucet

Reputation: 11

Reading a big csv file in php, can't read all the file

So I'm trying to use PHP to read a large CSV file (about 64,000 entries) and load it all into a big array.

Using fopen() and fgetcsv(), I managed to get most of the file read, but it suddenly stops at entry 51829 for no apparent reason.

I checked my array against the CSV and the data gets imported correctly: line 51829 in the CSV and in the array are the same, and so on.

Does anyone have an idea why I can't read the whole file?

Here's my code ^^ Thanks in advance

$this->load->model('Import_model', 'import');
$this->import->clear_student_group();



// LOAD THE CSV
$row = 1;
$data = array();
$file = "Eleve_Groupe_Mat.csv";
$lines = count(file($file));

if ($fp = fopen('Eleve_Groupe_Mat.csv', 'r')) {
    $rownumber = 0;
    while (!feof($fp)) {
        $rownumber++;
        $row = fgetcsv($fp);
        $datarow = explode(";", $row[0]);
        for($i = 0; $i <= 7; $i++) {
            $dataset[$rownumber][$i] = $datarow[$i];
        }
    }
    fclose($fp);
}
$this->import->insert_student_group($dataset);

Upvotes: 1

Views: 3035

Answers (3)

Adam Fowler

Reputation: 1751

I think fopen() has restrictions on reading files. Try using file_get_contents().
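A minimal sketch of that suggestion, assuming the same semicolon-delimited Eleve_Groupe_Mat.csv as in the question (note that this still pulls the entire file into memory at once):

// Read the whole file into a string, then split it into rows.
$contents = file_get_contents('Eleve_Groupe_Mat.csv');
if ($contents === false) {
    die('Could not read the file');
}

$dataset = array();
foreach (explode("\n", $contents) as $line) {
    $line = trim($line);
    if ($line === '') {
        continue;
    }
    // str_getcsv() parses a single CSV line; ';' is the delimiter used in the question.
    $dataset[] = str_getcsv($line, ';');
}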

Upvotes: 1

GordonM

Reputation: 31780

You're exhausting all the memory PHP has available to it. A file that big can't fit into memory, especially not in PHP, which stores a lot of additional data with every variable created.

You should read in a limited number of lines, say 100 or so, process them, and discard them. Then read the next 100 lines and repeat until you've processed the entire file.
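A rough sketch of that batched approach, inside the same controller as in the question, assuming (hypothetically) that the insert_student_group() model method accepts a batch of rows:

$batchSize = 100;
$batch = array();

if ($fp = fopen('Eleve_Groupe_Mat.csv', 'r')) {
    // Pass ';' so fgetcsv() splits the columns itself.
    while (($row = fgetcsv($fp, 0, ';')) !== false) {
        $batch[] = $row;
        if (count($batch) >= $batchSize) {
            // Process and discard this chunk before reading the next one.
            $this->import->insert_student_group($batch);
            $batch = array();
        }
    }
    // Process whatever is left over after the loop.
    if (!empty($batch)) {
        $this->import->insert_student_group($batch);
    }
    fclose($fp);
}

Because each chunk is handed off and then overwritten, memory use stays roughly constant no matter how long the file is.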

Upvotes: 1

Asaph

Reputation: 162851

Your script is probably running out of memory. Check your error logs or turn on error reporting to confirm. To make your script work, you can try increasing the memory_limit, either in php.ini or with ini_set(). But a much better approach is not to store the CSV data in an array at all, and instead process each line as you read it. This keeps the memory footprint low and removes the need to raise the memory_limit.
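As a sketch of the line-by-line approach, again assuming a ';' delimiter and (hypothetically) that insert_student_group() can be called with a single-row array:

// Only if you really must keep the current all-at-once approach:
// ini_set('memory_limit', '256M');

if ($fp = fopen('Eleve_Groupe_Mat.csv', 'r')) {
    while (($row = fgetcsv($fp, 0, ';')) !== false) {
        // Handle one row at a time instead of accumulating a big array;
        // $row is overwritten on the next iteration, so memory use stays flat.
        $this->import->insert_student_group(array($row));
    }
    fclose($fp);
}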

Upvotes: 1
