Jayyrus

Reputation: 13051

Best way to merge and store multiple CSV files using MySQL and PHP

I'm working with a company that gives me a CSV file every day with about 25,000 rows. The difference between one day's CSV and the previous day's is that in the newer one, some rows (far fewer than the total) have been deleted and others added, so the two files have about 24,900 rows in common.

I have to store all the rows over time, so every day I have to update my table in the DB with the current CSV.

I was thinking of something like:

<?php
   $fh = fopen($actual_csv, 'r');
   $contents = fread($fh, filesize($actual_csv));
   fclose($fh);
   $fileLines = explode("\n", $contents);
   // start at 1 to skip the header row; stop before the trailing empty line
   for ($i = 1; $i < count($fileLines) - 1; $i++) {
       $fieldList = explode(';', $fileLines[$i]);
       // $fieldList[0] is my unique id
       if (mysql_num_rows(mysql_query("SELECT * FROM table_where_i_store_all WHERE id='$fieldList[0]'")) <= 0) {
           mysql_query("INSERT INTO table_where_i_store_all (column names..) VALUES ('$fieldList[0],........')"); // there are many columns, so I don't write them all
       }
   }
?>

I don't think this is very efficient or fast. Is there a better way? Thanks!

Upvotes: 3

Views: 1494

Answers (2)

Antti Rytsölä

Reputation: 1545

First make sure there is a unique index on id, so MySQL can detect duplicates:

ALTER TABLE table_where_i_store_all ADD UNIQUE( id );

$fileLines = explode("\n", $contents);
$linemax = count( $fileLines )-1;

if( $linemax < 1 ) // empty file?
  return;

$SQL = "INSERT IGNORE INTO table_where_i_store_all (column_names) VALUES ";

for ($i = 1; $i < $linemax; $i++) {
   $fieldList = explode(';', $fileLines[$i]);
   // $fieldList[0] is my unique id

   $SQL .= "('$fieldList[0],........'),";
}
$SQL = rtrim($SQL, ',');  // remove the trailing comma
$res = mysql_query($SQL);
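One caveat: the field values above are concatenated into the SQL string unescaped, so a stray quote in the CSV will break the statement (or allow SQL injection). A minimal sketch of the same batched INSERT IGNORE with escaping added; the column names are placeholders, as in the original:

```php
<?php
// assumes an open mysql connection and $fileLines / $linemax as above
$rows = array();
for ($i = 1; $i < $linemax; $i++) {
    $fieldList = explode(';', $fileLines[$i]);
    // escape every field so quotes in the CSV cannot break the statement
    $escaped = array_map('mysql_real_escape_string', $fieldList);
    $rows[] = "('" . implode("','", $escaped) . "')";
}
$SQL = "INSERT IGNORE INTO table_where_i_store_all (column_names) VALUES "
     . implode(',', $rows);
$res = mysql_query($SQL);
?>
```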

Upvotes: 1

Maxim Krizhanovsky

Reputation: 26699

Create a unique index on the id field (maybe you've done it already) and use INSERT IGNORE or INSERT ... ON DUPLICATE KEY UPDATE.
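For reference, the two statements behave differently on a duplicate key; a sketch with hypothetical columns id and name:

```sql
-- INSERT IGNORE: silently skip rows whose id already exists
INSERT IGNORE INTO table_where_i_store_all (id, name) VALUES (1, 'foo');

-- ON DUPLICATE KEY UPDATE: insert, or refresh the existing row if the id exists
INSERT INTO table_where_i_store_all (id, name) VALUES (1, 'foo')
    ON DUPLICATE KEY UPDATE name = VALUES(name);
```

The first fits the question (old rows never change); the second is useful if a row's other columns can change between daily files.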

Upvotes: 3
