Reputation: 368
There is a huge two-dimensional array which contains 500k one-dimensional sub-arrays, and every sub-array contains 5 elements. Now it is my job to insert all the data into an SQLite database.
function insert_data($array) {
    global $db;
    $dbh = new PDO("sqlite:{$db}");

    // Prepare the statement once, then execute it once per row
    $sql = "INSERT INTO quote (f1,f2,f3,f4,f5) VALUES (?,?,?,?,?)";
    $query = $dbh->prepare($sql);

    foreach ($array as $item) {
        $query->execute(array_values($item));
    }

    $dbh = null;
}
I want to optimize the data insert process: right now execute is called 500k times. How can I make it run just once?
Upvotes: 0
Views: 346
Reputation: 19552
If you are using a newer version of SQLite (3.7.11+), it supports multi-row inserts:
INSERT INTO quote (f1,f2,f3,f4,f5) VALUES
(?,?,?,?,?),
(?,?,?,?,?),
(?,?,?,?,?);
You can use this to chunk your array into groups and do batch inserts that way; a sketch follows below. As Axalix pointed out, you should also wrap the whole operation in a transaction.
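A minimal sketch of that approach, assuming a PDO handle $dbh already connected to the database and the quote table from the question; the chunk size of 100 rows (500 bound values) is an arbitrary choice kept well under SQLite's default host-parameter limit of 999 in older versions:

function insert_data_batched(PDO $dbh, array $rows, $chunkSize = 100) {
    $dbh->beginTransaction();

    foreach (array_chunk($rows, $chunkSize) as $chunk) {
        // Build "(?,?,?,?,?),(?,?,?,?,?),..." for this chunk
        $placeholders = implode(',', array_fill(0, count($chunk), '(?,?,?,?,?)'));
        $stmt = $dbh->prepare("INSERT INTO quote (f1,f2,f3,f4,f5) VALUES $placeholders");

        // Flatten the chunk's rows into one flat list of bound values
        $values = array();
        foreach ($chunk as $row) {
            foreach (array_values($row) as $v) {
                $values[] = $v;
            }
        }
        $stmt->execute($values);
    }

    $dbh->commit();
}

With 100 rows per statement this reduces 500k execute calls to about 5k, and the surrounding transaction avoids one fsync per row.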
Upvotes: 0
Reputation: 2871
The idea is to avoid an implicit transaction for every single insert, which is what SQLite does in autocommit mode and which is very slow indeed. So just start and commit a transaction, say, every 10k records.
$dbh->beginTransaction();
$counter = 0;

foreach ($array as $item) {
    $query->execute(array_values($item));

    // Commit every 10,000 rows and start a fresh transaction
    if (++$counter % 10000 == 0) {
        $dbh->commit();
        $dbh->beginTransaction();
    }
}

$dbh->commit();
Another solution: dump the array into a CSV file and then just import it, as in the sketch below.
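A minimal sketch of that idea, assuming the sqlite3 command-line shell is available; the /tmp/quotes.csv path and mydb.sqlite file name are placeholders:

// Dump the array to CSV from PHP
$fh = fopen('/tmp/quotes.csv', 'w');
foreach ($array as $item) {
    fputcsv($fh, array_values($item));
}
fclose($fh);

Then bulk-load the file with the shell's .import command (run manually or via exec()):

sqlite3 mydb.sqlite ".mode csv" ".import /tmp/quotes.csv quote"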
Upvotes: 3