Reputation: 12445
I have approx 80k records that I need to run an insert/update script on several times per day.
INSERT INTO `my_rankings` (`id`, `rank`) VALUES (1,100),(2,99),(3,102)...(80000,3)
ON DUPLICATE KEY UPDATE `rank` = values(`rank`);
These records are currently in an array format:
$rankings = [
['id' => 1, 'rank' => 100],
['id' => 2, 'rank' => 99],
['id' => 3, 'rank' => 102],
...
['id' => 80000, 'rank' => 3],
];
Is there a nice, performant way to run the above update query?
I have looked at Eloquent's Model::updateOrCreate(...), but I don't think it can be used for bulk inserts/updates.
I want to avoid looping over my $rankings array with foreach and inserting/updating records one at a time, because the script would take far too long.
I have posted an answer to this question below (https://stackoverflow.com/a/34815725/1239122), but it is far from elegant.
Upvotes: 4
Views: 7131
Reputation: 12445
I have a solution. It's not elegant, but it is very fast: 1.6s for 80k records. Any better solutions would be much appreciated.
$allResults = [
    ['id' => 1, 'rank' => 100],
    ['id' => 2, 'rank' => 99],
    ['id' => 3, 'rank' => 102],
    ...
    ['id' => 80000, 'rank' => 3],
];

$rankings = [];
foreach ($allResults as $result) {
    $rankings[] = implode(', ', ['"' . $result['id'] . '"', $result['rank']]);
}

$rankings = Collection::make($rankings);
$rankings->chunk(500)->each(function ($chunk) {
    $rankingString = '';
    foreach ($chunk as $ranking) {
        $rankingString .= '(' . $ranking . '), ';
    }
    $rankingString = rtrim($rankingString, ', ');
    try {
        \DB::insert("INSERT INTO my_rankings (`id`, `rank`) VALUES $rankingString ON DUPLICATE KEY UPDATE `rank` = VALUES(`rank`)");
    } catch (\Exception $e) {
        print_r([$e->getMessage()]);
    }
});
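Side note: if you are on Laravel 8 or newer, the query builder ships an upsert() method that generates the same INSERT ... ON DUPLICATE KEY UPDATE statement and uses parameter binding, so you avoid the manual string building above. A minimal sketch, assuming the same my_rankings table and $allResults array:

```php
use Illuminate\Support\Facades\DB;

collect($allResults)->chunk(500)->each(function ($chunk) {
    DB::table('my_rankings')->upsert(
        $chunk->all(), // rows to insert
        ['id'],        // column(s) that uniquely identify a row
        ['rank']       // column(s) to update when the row already exists
    );
});
```

Chunking is still worthwhile here to keep each statement (and its bound parameters) a reasonable size.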
Upvotes: 5
Reputation: 4285
Use database transactions.
DB::beginTransaction();
foreach ($bajillionRecords as $record) {
$record->update();
}
DB::commit();
The updates all run inside a single transaction and are committed together when you call commit(). You could also do this with a closure:
DB::transaction(function() { /* */ });
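For completeness, the closure form runs the callback inside a transaction, commits when it returns, and rolls back automatically if it throws. A sketch, assuming $bajillionRecords is a collection of Eloquent models:

```php
use Illuminate\Support\Facades\DB;

DB::transaction(function () use ($bajillionRecords) {
    // Any exception thrown in here rolls the whole batch back.
    foreach ($bajillionRecords as $record) {
        $record->update();
    }
});
```

Note this still issues one UPDATE per record, so it mainly buys atomicity; for raw speed on 80k rows, a bulk upsert as in the accepted answer will be much faster.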
Upvotes: 3