Reputation: 6270
Here's the situation:
Over time I will have a MySQL database with around 40-50 thousand records. The records are pulled from a text file (through a PHP script, after a button on the web page is pressed) that is updated (appended to) 2-3 times a day by another department (not in my hands). I have to insert these new records into the database (when the button on my web page is pressed) and ignore the ones that are already there. For that purpose I am using the "INSERT IGNORE" statement.
My question is: over time, as the records grow (to 40-50 thousand), will the "INSERT IGNORE" statement take a long time to execute? (My browser timeout is 5 seconds and I can't go beyond that.) There would never be more than 300 unique records to insert at any one time, and my table has just 2 columns (if that matters). The system is working perfectly now, but I want to be prepared for the future. I have no way to test this, so I'd appreciate it if someone could tell me based on his/her experience.
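For illustration, the pattern in question can be sketched with Python's built-in sqlite3 module (SQLite's `INSERT OR IGNORE` is the analogue of MySQL's `INSERT IGNORE`; the table name, columns, and data are invented, and the duplicate check relies on a unique/primary key just as it would in MySQL):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (name TEXT PRIMARY KEY, value TEXT)")

# First import from the text file: all three rows are new.
rows = [("alpha", "1"), ("beta", "2"), ("gamma", "3")]
conn.executemany("INSERT OR IGNORE INTO records VALUES (?, ?)", rows)

# Second import: the appended file still contains the old rows;
# duplicates on the primary key are silently skipped.
rows = [("alpha", "1"), ("beta", "2"), ("gamma", "3"), ("delta", "4")]
conn.executemany("INSERT OR IGNORE INTO records VALUES (?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM records").fetchone()[0]
print(count)  # 4 -- only the one genuinely new row was added
```

The key point is that the duplicate test is an index lookup on the unique key, not a table scan, so its cost grows very slowly with table size.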
Upvotes: 1
Views: 413
Reputation: 57758
A table with 50k records shouldn't give you performance issues with something like that. Now if you had 50 million, I might change my tune.
As Moshe L indicated, there are a couple of ways to go about this. Here's a link to an article that measures the performance of ON DUPLICATE KEY vs. INSERT IGNORE on a database of 9 million records. According to that, you're going down the right path.
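For comparison, the `ON DUPLICATE KEY UPDATE` approach mentioned above updates the existing row instead of skipping it. A minimal sketch, again using sqlite3 (SQLite spells the same idea `ON CONFLICT ... DO UPDATE`, available since SQLite 3.24; the counter schema here is invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE counters (name TEXT PRIMARY KEY, hits INTEGER)")

# Insert a new counter, or bump the existing one on key collision.
upsert = """
    INSERT INTO counters (name, hits) VALUES (?, 1)
    ON CONFLICT(name) DO UPDATE SET hits = hits + 1
"""
for name in ["home", "about", "home", "home"]:
    conn.execute(upsert, (name,))

home_hits = conn.execute(
    "SELECT hits FROM counters WHERE name = 'home'"
).fetchone()[0]
print(home_hits)  # 3
```

For the questioner's case, where duplicates should simply be dropped, `INSERT IGNORE` is the simpler fit; the upsert form matters when the incoming row may carry newer data.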
Upvotes: 2
Reputation: 1905
I have a heavy name/value system that uses INSERT IGNORE and also INSERT ... ON DUPLICATE KEY UPDATE.
When the table became large, I "split" it into today's info (heavily updated) and an archive.
I don't know your system's design, but in my system (a simple name/counter table) I keep one week of detailed information, and the archive is grouped (aggregated) so it stays small with fewer rows.
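The roll-up idea above can be sketched as a grouped INSERT-SELECT followed by a delete: detail rows past a cutoff collapse into one summed row per name. A sqlite3 sketch with an invented schema and cutoff date:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE detail (name TEXT, day TEXT, counter INTEGER)")
conn.execute("CREATE TABLE archive (name TEXT PRIMARY KEY, counter INTEGER)")

conn.executemany("INSERT INTO detail VALUES (?, ?, ?)", [
    ("pageA", "2012-01-01", 5),
    ("pageA", "2012-01-02", 7),
    ("pageB", "2012-01-01", 2),
])

# Collapse old detail rows into one summed row per name, then drop them.
cutoff = "2012-01-08"
conn.execute("""
    INSERT INTO archive
    SELECT name, SUM(counter) FROM detail WHERE day < ? GROUP BY name
""", (cutoff,))
conn.execute("DELETE FROM detail WHERE day < ?", (cutoff,))

archived = conn.execute(
    "SELECT counter FROM archive WHERE name = 'pageA'"
).fetchone()[0]
print(archived)  # 12
```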
Good luck!
Upvotes: 1