Khalid Khan

Reputation: 11

How to increase the performance of MySQL database insertion

I'm working on a PHP project involving web scraping, and my aim is to store the scraped data in a MySQL database. I'm using a unique key index spanning 3 of the 9 columns in the table, and there are more than 5k records.

  1. Should I check for unique data at the program level, e.g. by putting values in arrays and comparing them before inserting into the database?
  2. Is there any way to speed up my database insertion?

Upvotes: 1

Views: 402

Answers (5)

Raymond Nijland

Reputation: 11602

Never ever create a duplicate table; this is an SQL anti-pattern and it makes it more difficult to work with your data.

PDO and prepared statements may give you a little boost, but don't expect wonders from them.

A multi-row INSERT IGNORE may also give you a little boost, but don't expect wonders from it either.

You should generate a multi-row insert query like so:

INSERT INTO database.table (columns) VALUES (values),(values),(values)

Keep in mind to stay under MySQL's max packet size. This way the index file only has to be updated once.
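A minimal sketch of building such a multi-row statement, shown here in Python with an in-memory SQLite database for illustration (the table and column names are hypothetical; with MySQL you would do the same string construction in PHP):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE scraped (name TEXT, value TEXT)")

rows = [("foo", "1"), ("bar", "2"), ("baz", "3")]

# Build one multi-row statement: INSERT INTO ... VALUES (?,?),(?,?),(?,?)
placeholders = ",".join(["(?,?)"] * len(rows))
sql = f"INSERT INTO scraped (name, value) VALUES {placeholders}"

# Flatten the row tuples into a single parameter list
params = [v for row in rows for v in row]
conn.execute(sql, params)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM scraped").fetchone()[0]
print(count)  # 3
```

In practice you would chunk the rows so that each generated statement stays under the server's max packet size.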

Upvotes: 1

Tim Van Dijck

Reputation: 552

You could use "INSERT IGNORE" in your query. That way the record will not be inserted if any unique constraints are violated.

Example:

INSERT IGNORE INTO table_name SET name = 'foo', value = 'bar', id = 12345;
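To illustrate the effect, here is a sketch in Python using an in-memory SQLite database, whose INSERT OR IGNORE is the equivalent of MySQL's INSERT IGNORE (the table and values are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, name TEXT)")

# The second statement violates the primary key and is silently skipped
conn.execute("INSERT OR IGNORE INTO t (id, name) VALUES (12345, 'foo')")
conn.execute("INSERT OR IGNORE INTO t (id, name) VALUES (12345, 'duplicate')")

rows = conn.execute("SELECT id, name FROM t").fetchall()
print(rows)  # [(12345, 'foo')]
```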

Upvotes: 0

Amrish

Reputation: 119

Use PDO or the mysqli_* functions with prepared statements to speed up insertion into the database.
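The gain from prepared statements is that the statement is parsed once and executed many times. A rough illustration using Python's DB-API with in-memory SQLite (in the actual PHP project this would be PDO's prepare/execute; table and column names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (name TEXT, value TEXT)")

rows = [("a", "1"), ("b", "2"), ("c", "3")]

# executemany reuses one parameterized statement for every row,
# analogous to preparing once and executing repeatedly with PDO
conn.executemany("INSERT INTO items (name, value) VALUES (?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM items").fetchone()[0]
print(count)  # 3
```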

Upvotes: 0

user2583040

Reputation: 83

You can handle your updates with a trigger: create an update table and write a trigger for that table.
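A sketch of the trigger idea, using Python with an in-memory SQLite database for illustration (the staging/main table names and the deduplication step are assumptions, not part of the original answer):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE staging (id INTEGER, name TEXT);
CREATE TABLE main (id INTEGER PRIMARY KEY, name TEXT);

-- Copy each staged row into the main table, ignoring duplicates
CREATE TRIGGER move_row AFTER INSERT ON staging
BEGIN
    INSERT OR IGNORE INTO main (id, name) VALUES (NEW.id, NEW.name);
END;
""")

conn.execute("INSERT INTO staging VALUES (1, 'foo')")
conn.execute("INSERT INTO staging VALUES (1, 'dup')")

count = conn.execute("SELECT COUNT(*) FROM main").fetchone()[0]
print(count)  # 1
```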

Upvotes: 0

user2417483

Reputation:

You could create a duplicate of the table that you currently have, except with no indexes on any field. Store the scraped data in this table.

Then use events to move the data from the temp table into the main table. Once the data has been moved to the main table, delete it from the temp table.
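A sketch of the move step, using Python with an in-memory SQLite database for illustration (table names are hypothetical; in MySQL the move would run as a scheduled event rather than inline):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Staging table has no indexes, so the bulk inserts are cheap
conn.execute("CREATE TABLE temp_t (id INTEGER, name TEXT)")
conn.execute("CREATE TABLE main_t (id INTEGER PRIMARY KEY, name TEXT)")

conn.executemany("INSERT INTO temp_t VALUES (?, ?)",
                 [(1, "a"), (2, "b"), (2, "b")])

# Periodically move rows into the indexed table, then clear the staging table
conn.execute("INSERT OR IGNORE INTO main_t SELECT id, name FROM temp_t")
conn.execute("DELETE FROM temp_t")
conn.commit()

moved = conn.execute("SELECT COUNT(*) FROM main_t").fetchone()[0]
left = conn.execute("SELECT COUNT(*) FROM temp_t").fetchone()[0]
print(moved, left)  # 2 0
```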

Upvotes: 0
