enzo

Reputation: 459

Inserting massively from a csv file

I wanted to ask how it would be best to proceed in the case that I will explain below:

I insert more than 200,000 rows into a MySQL database via PHP, taking the lines from a CSV file. My question is this:

Should I parse the CSV file and insert each row as I read it, or should I parse the whole file first, load everything into an array, and then insert the rows into the database with a foreach loop?

I ask because inserting the data into the database turns out to be really slow. The table is InnoDB.

I hope I was clear.

N.B. I cannot use "LOAD DATA LOCAL INFILE" because the fields in the .csv file have been mapped beforehand.

Thank you

Upvotes: 0

Views: 96

Answers (1)

Mario A

Reputation: 3363

The two options you suggest will be about equally fast; the one with the array may be slightly slower because of the additional memory usage.

But I suppose the bottleneck when executing a lot of insert statements is the database communication. So a third option that might be faster is to reduce the number of INSERT statements by concatenating all of the insert data into one statement, e.g.:

INSERT INTO mytable VALUES(1,2,3),(4,5,6),(7,8,9), ... (10000, 10001, 10002);

As far as I know, a MySQL statement has a maximum length (limited by max_allowed_packet), so it may be necessary to split the statement into multiple chunks.
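For illustration, here is a minimal PHP sketch of that chunked, multi-row approach using PDO prepared statements. The table name, column names, file name, connection details and chunk size are only assumptions; adapt them to your own mapping of the CSV fields.

<?php
// Minimal sketch: read a CSV and insert it in multi-row chunks via PDO.
// Assumes a table "mytable" with three columns (a, b, c) and a file "data.csv".
$pdo = new PDO('mysql:host=localhost;dbname=test;charset=utf8mb4', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

$chunkSize = 1000; // rows per INSERT; keep each statement below max_allowed_packet
$rows = [];

$handle = fopen('data.csv', 'r');
while (($line = fgetcsv($handle)) !== false) {
    // Apply your own field mapping here before collecting the row.
    $rows[] = $line;

    if (count($rows) >= $chunkSize) {
        insertChunk($pdo, $rows);
        $rows = [];
    }
}
fclose($handle);

if ($rows) {
    insertChunk($pdo, $rows); // flush the remaining rows
}

function insertChunk(PDO $pdo, array $rows): void
{
    // Build one multi-row INSERT: VALUES (?,?,?),(?,?,?),...
    $placeholders = implode(',', array_fill(0, count($rows), '(?,?,?)'));
    $stmt = $pdo->prepare("INSERT INTO mytable (a, b, c) VALUES $placeholders");

    // Flatten the rows into a single positional parameter list.
    $params = [];
    foreach ($rows as $row) {
        foreach ($row as $value) {
            $params[] = $value;
        }
    }
    $stmt->execute($params);
}

Using prepared statements here also avoids building the SQL string by hand from untrusted CSV values; only the placeholder list changes with the chunk size.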

This answer makes the performance benefit of consolidating insert statements quite clear: https://stackoverflow.com/a/1793209/1696795

Upvotes: 1
