pmiranda

Reputation: 8420

Laravel-Excel 2, Insert data faster

This is the main code of a method that I use to import an Excel file (using Maatwebsite Laravel-Excel 2) into my database:

$data = Excel::selectSheetsByIndex(0)->load($file, function($reader) {})->get()->toArray();
DB::beginTransaction();
try {
    foreach ($data as $key => $value) {
        $med = trim($value["med"]);
        $serial = trim($value["nro.seriemedidor"]);

        DB::table('medidores')->insert([
            "med" => $med,
            "serial_number" => $serial
        ]);
    }
    DB::commit();
} catch (\Exception $e) {
    DB::rollback();
    return redirect()->route('myroute')->withErrors("Some error message");
}

This works fine when I have "few" data (say, fewer than 5,000 rows in the Excel file). But I need to work with a large Excel file that has about 1.4 million rows, split across more than one sheet. How could I make my method faster? Any tips?
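
Since the rows are split across several sheets and the snippet above only reads sheet index 0, a rough sketch of looping over the sheet indexes (assuming, just for illustration, a workbook with three sheets) could look like this:

foreach ([0, 1, 2] as $sheetIndex) {
    // load one sheet at a time with the same Laravel-Excel 2 calls as above
    $data = Excel::selectSheetsByIndex($sheetIndex)->load($file, function($reader) {})->get()->toArray();

    foreach ($data as $key => $value) {
        DB::table('medidores')->insert([
            "med" => trim($value["med"]),
            "serial_number" => trim($value["nro.seriemedidor"]),
        ]);
    }
}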

EDIT: Adding the code from the link shared in one of the comments on the answer:

$data = Excel::selectSheetsByIndex(0)->load($file, function($reader) {})->get()->toArray();
DB::beginTransaction();
try {
    $bulk_data = [];
    foreach ($data as $key => $value) {
        $med = trim($value["med"]);
        $serial = trim($value["nro.seriemedidor"]);
        $bulk_data[] = ["med" => $med, "serial_number" => $serial];
    }
    $collection = collect($bulk_data); // turn the data into a collection
    $chunks = $collection->chunk(100); // split it into chunks of 100 rows
    // loop through the chunks, one INSERT per chunk:
    foreach ($chunks as $chunk) {
        DB::table('medidores')->insert($chunk->toArray());
    }
    DB::commit();
} catch (\Exception $e) {
    DB::rollback();
    return redirect()->route('myroute')->withErrors("Some error message");
}

Chunking the inserts was what worked for me.
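
For reference, the reading side can also be chunked so that the whole 1.4-million-row file never sits in memory at once. A minimal sketch, assuming the filter('chunk') importer described in the Laravel-Excel 2.x docs and the same medidores columns as above:

// read the file in blocks of 500 rows instead of calling get()->toArray() on everything
Excel::filter('chunk')->load($file)->chunk(500, function($results) {
    $bulk_data = [];

    foreach ($results->toArray() as $row) {
        $bulk_data[] = [
            "med" => trim($row["med"]),
            "serial_number" => trim($row["nro.seriemedidor"]),
        ];
    }

    // one INSERT per 500-row block
    DB::table('medidores')->insert($bulk_data);
});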

Upvotes: 2

Views: 1965

Answers (1)

TheSalamov

Reputation: 917

Yes, you can. Instead of performing X (rows per sheet, i.e. one database request each) * N (number of sheets) requests, try a simple bulk insert: it only costs you one loop over the data and saves you roughly X * N database requests. Here's an example:

$data = Excel::selectSheetsByIndex(0)->load($file, function($reader) {})->get()->toArray();
DB::beginTransaction();
try {
  $bulk_data = [];
  foreach ($data as $key => $value) {
      $med = trim($value["med"]);
      $serial = trim($value["nro.seriemedidor"]);
      $bulk_data[] = ["med" => $med, "serial_number" => $serial];
  }
  DB::table('medidores')->insert($bulk_data);
  DB::commit();
} catch (\Exception $e) {
  DB::rollback();
  return redirect()->route('myroute')->withErrors("Some error message");
}

You can refer to this answer for more explanation about DB requests: https://stackoverflow.com/a/1793209/8008456
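
One caveat: with around 1.4 million rows, a single insert() call may hit MySQL's max_allowed_packet or the driver's placeholder limit, so splitting the bulk array into batches is a reasonable safeguard (the batch size of 1000 below is just an assumption to tune):

// insert in batches so no single INSERT statement grows too large
foreach (array_chunk($bulk_data, 1000) as $batch) {
    DB::table('medidores')->insert($batch);
}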

Upvotes: 2
