Reputation: 8670
I am using PHPSpreadsheet to take spreadsheets a user uploads, add a column with certain values, save the file as CSV, and then import the CSV file with the following query:
LOAD DATA LOCAL INFILE '{$file}'
INTO TABLE {$table}
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
Alternatively I can do something like:
foreach ($rows as $row) {
    // INSERT $row INTO table
}
The spreadsheets will all have the same columns/data-types.
What would be the most efficient way to do this? Going from Xlsx -> CSV -> MySQL import seems like I am adding extra steps.
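For context, the Xlsx -> CSV step looks roughly like this (a sketch; the paths and the added-column logic are placeholders):

require 'vendor/autoload.php';

use PhpOffice\PhpSpreadsheet\IOFactory;
use PhpOffice\PhpSpreadsheet\Writer\Csv;

// Load the uploaded workbook and add the extra column.
$spreadsheet = IOFactory::load($uploadedXlsxPath);
$sheet = $spreadsheet->getActiveSheet();
// ... write the new column's values into $sheet here ...

// Save as CSV with settings matching the LOAD DATA query above.
$writer = new Csv($spreadsheet);
$writer->setEnclosure('"');
$writer->setLineEnding("\r\n");
$writer->save($csvPath);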
Upvotes: 0
Views: 815
Reputation: 211730
MySQL's direct CSV import is usually the fastest option, but it is not without limitations. One is that the import is all-or-nothing: the entire insert operation on an InnoDB table takes place atomically for performance reasons, so none of the rows are visible until the load is fully committed, and you have no way to gauge progress while it runs. Some imports can take hours, even days.
Another is that the file must be present on the server. The LOCAL option is a quirky feature of the mysql command-line tool and probably doesn't work in your database driver unless it is specifically emulated.
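If your driver does support it, it usually has to be enabled explicitly. A rough sketch with PDO (this assumes the server side also has local_infile enabled; verify before relying on it):

$pdo = new PDO("mysql:host=localhost;dbname=mydb;charset=utf8mb4", $user, $pass, [
    // Must be set at connect time; it has no effect after construction.
    PDO::MYSQL_ATTR_LOCAL_INFILE => true,
]);

$sql = "LOAD DATA LOCAL INFILE " . $pdo->quote($file) . "
        INTO TABLE {$table}
        FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
        LINES TERMINATED BY '\\r\\n'
        IGNORE 1 LINES";
$pdo->exec($sql);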
Inserting row-by-row with a CSV parser is almost always slower. If you must do it that way, be sure to prepare the INSERT statement once and re-use it in the loop, or do a "multi-INSERT" with as many rows per statement as your maximum statement size (max_allowed_packet) allows.
Upvotes: 2