Reputation: 4089
I am working on a project where I need to fetch at least 100k rows from a SQL table (each row is a 6-digit code, indexed on the table), export them to a CSV file, and then run an update query to change their status.
The export itself takes only seconds, but the per-row update queries make the whole process take a long time. I am looking for an efficient way to do this task.
Here's my code:
$filename = 'sample.csv';
$handle = fopen($filename, 'w+');

$collection = Code::select('code')->where('status', 0)->take($request->quantity)->get();

foreach ($collection->chunk(500) as $codes) {
    foreach ($codes as $code) {
        fputcsv($handle, [
            $code->code,
        ]);

        // this update query is making the whole process take a long time
        Code::where('code', $code->code)->update(['status' => 1]);
    }
}

fclose($handle);
I am looking for a better way to update those rows. Any suggestions?
Upvotes: 2
Views: 1907
Reputation: 406
Can't you just do a bulk update after the file has been written?
$filename = 'sample.csv';
$handle = fopen($filename, 'w+');

$query = Code::select('code')->where('status', 0)->take($request->quantity);

foreach ($query->get()->chunk(500) as $codes) {
    foreach ($codes as $code) {
        fputcsv($handle, [
            $code->code,
        ]);
    }
}

$query->update(['status' => 1]); // Update all of them in one go
fclose($handle);
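If you want to be sure that only the codes actually written to the CSV get their status flipped (re-running the status = 0 / take() condition for the update could match different rows if the table changes in between), a rough alternative sketch is to pluck the codes first and update them in whereIn chunks. This is untested and reuses the Code model and $request->quantity from your example:

$filename = 'sample.csv';
$handle = fopen($filename, 'w+');

// Pull only the codes we are going to export
$codes = Code::select('code')
    ->where('status', 0)
    ->take($request->quantity)
    ->pluck('code');

foreach ($codes as $code) {
    fputcsv($handle, [$code]);
}

fclose($handle);

// Update exactly the exported codes, in batches to keep each IN () list small
foreach ($codes->chunk(500) as $chunk) {
    Code::whereIn('code', $chunk->all())->update(['status' => 1]);
}

You could also wrap the chunked updates in DB::transaction() if you want the status changes to be all-or-nothing should one batch fail.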
Upvotes: 3