Reputation: 7234
I have a table in my database whose primary key is a string that I set myself. Before adding around 1000 "items", I frequently have to check whether each one already exists in that table, because there must not be any duplicates. That works out to 2 queries per item, or 2000 total, which adds 1-2 seconds of loading time.
If I try to insert the new row anyway, without checking for duplicates, the row doesn't get inserted (which is what I want), but MySQL returns an error, which crashes my service.
My questions:
Upvotes: 1
Views: 94
Reputation: 74076
You could use the IGNORE keyword to have duplicates dropped from your inserts:
INSERT IGNORE INTO yourTable VALUES (...)
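A self-contained sketch of this behavior, using Python's sqlite3 for illustration (SQLite spells the same idea INSERT OR IGNORE; MySQL's INSERT IGNORE behaves analogously for duplicate primary keys):

```python
import sqlite3

# In-memory SQLite database standing in for the MySQL table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id TEXT PRIMARY KEY, name TEXT)")

rows = [("a1", "first"), ("a2", "second"), ("a1", "duplicate")]
for row in rows:
    # Rows whose primary key already exists are silently skipped,
    # no error is raised and the service keeps running.
    conn.execute("INSERT OR IGNORE INTO items VALUES (?, ?)", row)

count = conn.execute("SELECT COUNT(*) FROM items").fetchone()[0]
print(count)  # 2 -- the duplicate "a1" row was dropped
```

This also removes the need for the existence check, so each item costs one statement instead of two.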
Upvotes: 5
Reputation: 157919
MySQL returns an error, which crashes my service.
It's actually your own code that crashes your service; a MySQL error by itself cannot crash anything. Catch the error instead of letting it propagate.
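A minimal sketch of catching the duplicate-key error rather than letting it take down the service, again using sqlite3 so the example is self-contained (a MySQL driver raises its own, similarly named IntegrityError):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id TEXT PRIMARY KEY)")
conn.execute("INSERT INTO items VALUES ('a1')")

try:
    # Second insert with the same primary key violates the constraint.
    conn.execute("INSERT INTO items VALUES ('a1')")
except sqlite3.IntegrityError:
    # Handle the duplicate here instead of crashing; the table
    # still contains exactly one 'a1' row.
    print("duplicate skipped")
```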
Is there a better way to prevent inserting duplicates
INSERT IGNORE
Upvotes: 5