Reputation: 11
I'm working on a PHP web-scraping project, and my aim is to store the scraped data in a MySQL database. The table has 9 columns with a unique key index covering 3 of them, and there are more than 5k records.
Upvotes: 1
Views: 402
Reputation: 11602
Never create a duplicate table; that is a SQL anti-pattern, and it makes it harder to work with your data.
PDO and prepared statements may give you a small boost, but don't expect wonders from them.
Multiple INSERT IGNORE statements may also give you a small boost, but again, don't expect wonders.
You should generate a multi-row insert query like so:
INSERT INTO database.table (columns) VALUES (values), (values), (values);
Keep in mind that the statement must stay under MySQL's max_allowed_packet size. This way the index only has to be updated once per batch.
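A minimal PHP sketch of building such a batched statement with PDO, chunking the rows so each statement stays well under max_allowed_packet (the connection details, the table and column names, the $records input array, and the chunk size of 500 are assumptions, not taken from the question):

<?php
// Assumed connection; adjust host, database, and credentials to your setup.
$pdo = new PDO('mysql:host=localhost;dbname=scraper;charset=utf8mb4', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// $records is assumed to be an array of rows: ['name' => ..., 'value' => ..., 'id' => ...]
$chunkSize = 500; // keeps each statement comfortably under max_allowed_packet

foreach (array_chunk($records, $chunkSize) as $chunk) {
    // One "(?, ?, ?)" placeholder group per row in the chunk.
    $placeholders = implode(',', array_fill(0, count($chunk), '(?, ?, ?)'));
    $sql = "INSERT IGNORE INTO table_name (name, value, id) VALUES $placeholders";

    // Flatten the chunk into a single flat parameter list.
    $params = [];
    foreach ($chunk as $row) {
        array_push($params, $row['name'], $row['value'], $row['id']);
    }

    $pdo->prepare($sql)->execute($params);
}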
Upvotes: 1
Reputation: 552
You could use "INSERT IGNORE" in your query. That way the record will not be inserted if any unique constraints are violated.
Example:
INSERT IGNORE INTO table_name SET name = 'foo', value = 'bar', id = 12345;
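For reference, a minimal sketch of the same statement as a PDO prepared statement (the $pdo connection is assumed to already exist, and the literal values are placeholders):

$stmt = $pdo->prepare(
    'INSERT IGNORE INTO table_name SET name = :name, value = :value, id = :id'
);
// Rows that would violate the unique index are silently skipped.
$stmt->execute(['name' => 'foo', 'value' => 'bar', 'id' => 12345]);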
Upvotes: 0
Reputation: 83
You can track your updates with a trigger: run the update on the table, and write a trigger for that table to follow the changes.
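A rough sketch of that idea: a trigger that records every update to the scraped table in a separate log table (the table_name_log table, its columns, and the $pdo connection are all assumptions, not part of this answer):

$pdo->exec(
    'CREATE TRIGGER track_updates
     AFTER UPDATE ON table_name
     FOR EACH ROW
     INSERT INTO table_name_log (row_id, changed_at) VALUES (OLD.id, NOW())'
);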
Upvotes: 0
Reputation:
You could create a duplicate of the table that you currently have, except with no indexes on any field, and store the data in this table.
Then use events to move the data from the temp table into the main table. Once the data has been moved to the main table, delete it from the temp table.
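A rough sketch of that setup, assuming hypothetical staging_table and main_table names and a one-minute schedule; note the event scheduler must be enabled, and a production version would need to guard against rows that arrive between the move and the delete:

// Staging table with the same columns as the main table but no indexes,
// so bulk inserts are cheap.
$pdo->exec(
    'CREATE TABLE staging_table (
        name  VARCHAR(255),
        value VARCHAR(255),
        id    INT
    )'
);

// Requires the event scheduler to be on: SET GLOBAL event_scheduler = ON;
$pdo->exec(
    "CREATE EVENT move_staged_rows
     ON SCHEDULE EVERY 1 MINUTE
     DO BEGIN
         INSERT IGNORE INTO main_table (name, value, id)
             SELECT name, value, id FROM staging_table;
         DELETE FROM staging_table;
     END"
);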
Upvotes: 0