BlackAlpha

Reputation: 376

Bug in CSV insert loop using PHP PDO

I'm trying to insert data (160,000+ rows) using INSERT INTO and PHP PDO, but I have a bug.

When I run the PHP script, more rows end up in my database than there are lines in my CSV file.

Can someone tell me whether my loop is incorrect, or what else might be wrong?

Here is the code I have:

$bdd = new PDO('mysql:host=<myhost>;dbname=<mydb>', '<user>', '<pswd>');

// I clean the table first
$req = $bdd->prepare("TRUNCATE TABLE lbppan_ticket_reglements;");
$req->execute();

// I read and import the CSV file line by line
$handle = fopen('<pathToMyCsvFile>', "r");

while (($data = fgetcsv($handle, 0, ',')) !== FALSE) {
    $reqImport =
        "INSERT INTO lbppan_ticket_reglements
        (<my31Columns>)
        VALUES
        ('$data[0]','$data[1]','$data[2]','$data[3]','$data[4]','$data[5]','$data[6]','$data[7]','$data[8]',
         '$data[9]','$data[10]','$data[11]','$data[12]','$data[13]','$data[14]','$data[15]','$data[16]',
         '$data[17]','$data[18]','$data[19]','$data[20]','$data[21]','$data[22]','$data[23]','$data[24]',
         '$data[25]','$data[26]','$data[27]','$data[28]','$data[29]','$data[30]')";

    $req = $bdd->prepare($reqImport);
    $req->execute();
}

fclose($handle);

The script partly works, since data does end up in the table, but I don't know why it misbehaves and inserts extra rows. I think that maybe, due to the file size (18 MB), the script crashes and restarts, inserting the same rows again.

I can't use LOAD DATA on the server I'm using.

Thanks for your help.

Upvotes: 0

Views: 106

Answers (2)

MonkeyZeus

Reputation: 20737

Assuming $data[0] is the unique identifier, you can try this to spot the offending row(s):

$i = 0;

while (($data = fgetcsv($handle, 0, ',')) !== FALSE) {
    echo 'Row #'.++$i.' - '.$data[0]."\n"; // newline so each row prints on its own line
}

Since you are not using prepared statements, it is very possible that one of the $data array items is causing a double insert or some other unknown issue.
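For reference, here is a minimal sketch of a parameterized version, assuming the same connection ($bdd), file handle ($handle) and 31-column table as in the question, with <my31Columns> kept as the question's placeholder:

// Prepare once with 31 "?" placeholders, then bind each CSV row on execute.
// PDO escapes the bound values, so a quote or comma inside a field can no
// longer break the generated SQL.
$placeholders = rtrim(str_repeat('?,', 31), ',');
$req = $bdd->prepare("INSERT INTO lbppan_ticket_reglements (<my31Columns>) VALUES ($placeholders)");

while (($data = fgetcsv($handle, 0, ',')) !== FALSE) {
    // Bind exactly 31 values per row (rows with fewer fields will error out).
    $req->execute(array_slice($data, 0, 31));
}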

Upvotes: 0

Ne Ma

Reputation: 1709

This is not an answer as such, but it is too much to fit into a comment.

Start by upping the maximum execution time.
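For example, one common way is set_time_limit (a sketch; it has no effect if your host disallows changing the limit at runtime):

// Remove PHP's max_execution_time cap for this import script.
set_time_limit(0);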

If that does not solve your issue, start working your way through the code line by line and handle every exception you can think of. For example, you are truncating the table, but you say you have loads more data after execution: could the TRUNCATE be failing?

try {
    $req = $bdd->prepare("TRUNCATE TABLE lbppan_ticket_reglements;");
    $req->execute();
} catch (\Exception $e) {
    exit($e->getMessage()); // Die immediately for ease of reading
}

Not the most graceful of try/catches, but it will let you easily spot a problem. You can also apply this to the INSERT query that follows...

try {
    $req = $bdd->prepare($reqImport);
    $req->execute();
} catch (\Exception $e) {
    exit($e->getMessage());
}
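Note that for either of these catch blocks to fire, PDO has to be configured to throw exceptions, which is not the default on older PHP versions; enable it right after opening the connection:

// Make PDO throw exceptions on SQL errors instead of failing silently.
// (PDO::ERRMODE_EXCEPTION only became the default in PHP 8.0.)
$bdd->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);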

Also stick in some diagnostics: are you actually inserting 160k rows? You could echo out $i on each loop and see if you can spot any breaks or abnormalities.

$i  = 0;
while (($data = fgetcsv($handle, 0, ',')) !== FALSE) {
    // ... your stuff
    $i++;
}

echo "Rows inserted " . $i . "\n\n"; 

Going beyond that, you can have the loop print out the SQL content for you to look at manually; perhaps it's doing something weird and fruity.
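For example, inside the existing while loop, using the $reqImport string the question already builds:

// Print each generated statement; a stray quote or comma inside a CSV
// field will show up as visibly malformed SQL.
echo $reqImport . "\n";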

Hope that helps.

Upvotes: 1
