Reputation: 1010
I have a CSV file that I am importing into a DB. It does this correctly, but obviously I don't want repeated records. Any ideas how I can insert only unique ones? I don't want to throw them all into the DB and then run a DELETE
query, because that's not time-effective.
if (($handle = fopen($csvfile, "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ',')) !== FALSE) {
        $count++;
        // Skip the first row (the CSV header)
        if ($count > 1) {
            mysqli_query($con, "INSERT INTO $table2 (id, cat) VALUES ('".$data[0]."', '".$data[1]."')");
            mysqli_query($con, "INSERT INTO $table1 (id, cat) VALUES ('".$data[2]."', '".$data[3]."')");
            mysqli_query($con, "INSERT INTO $table3 (id, word) VALUES ('".$data[4]."', '".$data[5]."')");
            mysqli_query($con, "INSERT INTO $table4 (id, cl, co, imp) VALUES ('".$data[0]."', '".$data[6]."', '".$data[7]."', '".$data[8]."')");
        }
    }
}
Upvotes: 0
Views: 106
Reputation: 1210
You can either use the unique-key approach, or you can use
WHERE NOT EXISTS
on the specific column you use to detect the duplicate. Since MySQL does not accept a WHERE clause on INSERT ... VALUES, it has to take the INSERT ... SELECT form, roughly:
INSERT INTO table_name (columns) SELECT values WHERE NOT EXISTS (select statement)
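A minimal sketch of that pattern, assuming a table named words with id and word columns (made-up names mirroring the question) and that word is the column you deduplicate on:
-- Hypothetical table/column names; adapt to your schema.
INSERT INTO words (id, word)
SELECT 42, 'example'
FROM DUAL
WHERE NOT EXISTS (SELECT 1 FROM words WHERE word = 'example');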
Upvotes: 0
Reputation: 11984
One way is, after inserting all the records, to add a unique index and drop the duplicates with
ALTER IGNORE TABLE your_table
ADD UNIQUE INDEX dup_idx (column_1, column_2, column_3);
Another way is to add the unique index before insertion, and then use an INSERT IGNORE query:
INSERT IGNORE INTO table (column1,column2) VALUES ('value1','value2');
This will suppress the error that would otherwise occur when inserting a duplicate entry; since the unique index is in place, the duplicate row is simply skipped.
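A rough end-to-end sketch of that approach, again with made-up table and column names (a words table deduplicated on word):
-- Hypothetical names; adapt to your schema. Create the unique index first.
ALTER TABLE words ADD UNIQUE INDEX uniq_word (word);
-- Duplicate words are now silently skipped instead of raising an error.
INSERT IGNORE INTO words (id, word) VALUES (42, 'example');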
Upvotes: 3