Reputation: 383
How can I import 70k+ rows from a CSV file into the database in Laravel (PHP) while keeping memory usage and execution time low?
This is the relevant part of the code I am using to import:
// Load the entire CSV into memory with PhpSpreadsheet
$reader = new \PhpOffice\PhpSpreadsheet\Reader\Csv();
$reader->setInputEncoding('UTF-8');
$spreadsheet = $reader->load($localpath);
$spreadsheet = $spreadsheet->getActiveSheet()->toArray();

$collection = LazyCollection::make($spreadsheet);

// Use the trimmed first row as the column headers
if ($key == null) {
    $key = array_map('trim', $collection->first());
    $key_count = count($key);
}

// Skip the header row and build an associative array per valid data row
$collection = $collection->except(0);
foreach ($collection as $row) {
    if (count($row) == $key_count && !containsOnlyNull($row)) {
        $array[] = array_combine($key, $row);
    }
}
Upvotes: 0
Views: 1679
Reputation: 383
I was able to import this file with minimal memory usage and time by using a LazyCollection. Here is a sample of how I did it:
LazyCollection::make(function () use (&$csv_data, $key_count) {
    // Yield rows one at a time so the whole file is never held in memory
    foreach ($csv_data as $line) {
        if (count($line) == $key_count && !containsOnlyNull($line)) {
            yield $line;
        }
    }
})
->chunk(1000)
->each(function ($lines) use (&$import_info, $user_id, $country, $csv_fields, $key, $total, $products, $allProducts, $sync_id, $import_id) {
    // process each chunk of 1,000 rows here
});
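The body of the each() callback is omitted above. As a rough sketch of what it could contain, assuming the goal is to bulk-insert the rows into a hypothetical products table and that $key holds the trimmed CSV header row:

use Illuminate\Support\Facades\DB;

// ...continuing the pipeline above
->each(function ($lines) use ($key) {
    // Rebuild associative rows keyed by the CSV headers
    $rows = [];
    foreach ($lines as $line) {
        $rows[] = array_combine($key, $line);
    }

    // One bulk INSERT per 1,000-row chunk keeps queries and memory bounded
    DB::table('products')->insert($rows);
});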
Upvotes: 1
Reputation: 2951
You can use laravel-excel with chunk reading for that: https://docs.laravel-excel.com/3.1/imports/chunk-reading.html
From the docs: to mitigate the increase in memory usage, you can use the WithChunkReading concern. This will read the spreadsheet in chunks and keep the memory usage under control.
This is an example of code you could write:
namespace App\Imports;

use App\User;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class UsersImport implements ToModel, WithChunkReading
{
    // Map each spreadsheet row to a User model
    public function model(array $row)
    {
        return new User([
            'name' => $row[0],
        ]);
    }

    // Read the spreadsheet in chunks of 1,000 rows
    public function chunkSize(): int
    {
        return 1000;
    }
}
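To run the import you would then call the Excel facade, for example with the CSV path from the question (a minimal sketch; the path variable is just a placeholder):

use App\Imports\UsersImport;
use Maatwebsite\Excel\Facades\Excel;

// Reads the file in 1,000-row chunks, as defined by chunkSize()
Excel::import(new UsersImport, $localpath);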
Upvotes: 3