Vivek Solanki

Reputation: 136

How to optimize the import process for very large datasets in Laravel?

I have functionality to import a very large number of records, around 5 million.

I also have to create entries in related tables during the same import process.

I am already making bulk insert queries for the new entries and processing everything in chunks.

What other ways are there to speed up the process?

Upvotes: 9

Views: 9247

Answers (3)

bhavinG

Reputation: 19

Use:

  • Process the data in chunks
  • Use Laravel queues
  • Use Laravel Excel: https://docs.laravel-excel.com/3.1/imports/

Example of binding rows to the User model:

namespace App\Imports;

use App\User;
use Illuminate\Support\Facades\Hash;
use Maatwebsite\Excel\Concerns\Importable;
use Maatwebsite\Excel\Concerns\ToModel;

class UsersImport implements ToModel
{
    use Importable;

    // Map each spreadsheet row to a User model instance
    public function model(array $row)
    {
        return new User([
            'name'     => $row[0],
            'email'    => $row[1],
            'password' => Hash::make($row[2]),
        ]);
    }
}

In the controller:

(new UsersImport)->import('users.xlsx', 'local', \Maatwebsite\Excel\Excel::XLSX);
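
To actually apply the chunking and queueing advice, the same import class can implement the package's WithChunkReading, WithBatchInserts, and ShouldQueue concerns. A minimal sketch, assuming Laravel Excel 3.1; the chunk and batch sizes of 1000 are arbitrary starting points to tune for your data:

namespace App\Imports;

use App\User;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Support\Facades\Hash;
use Maatwebsite\Excel\Concerns\Importable;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithBatchInserts;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class UsersImport implements ToModel, WithBatchInserts, WithChunkReading, ShouldQueue
{
    use Importable;

    public function model(array $row)
    {
        return new User([
            'name'     => $row[0],
            'email'    => $row[1],
            'password' => Hash::make($row[2]),
        ]);
    }

    // Combine up to 1000 models into a single multi-row INSERT
    public function batchSize(): int
    {
        return 1000;
    }

    // Read the spreadsheet 1000 rows at a time to keep memory usage flat
    public function chunkSize(): int
    {
        return 1000;
    }
}

With ShouldQueue in place, the same (new UsersImport)->import(...) call no longer blocks the request: each chunk is pushed onto the queue and handled by your workers.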

Upvotes: 0

PtrTon

Reputation: 3835

To summarize, for people who don't want to look through all the comments separately:

  • Use the DB facade Laravel offers instead of Eloquent models for improved performance (see the sketch after this list).
  • Run the import process in batches rather than importing all data at once.
  • Disable query logging to save memory.
  • Run the import in a background job, not during a web request.
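
To illustrate the first two points, here is a minimal sketch of chunked bulk inserts through the DB facade; the users table, its columns, the $rows iterator, and the batch size of 1000 are all hypothetical:

use Illuminate\Support\Facades\DB;

$batch = [];

foreach ($rows as $row) {           // $rows: any iterator over parsed input lines
    $batch[] = [
        'name'  => $row[0],
        'email' => $row[1],
    ];

    if (count($batch) === 1000) {
        DB::table('users')->insert($batch); // one INSERT with 1000 value sets
        $batch = [];
    }
}

if (!empty($batch)) {
    DB::table('users')->insert($batch);     // flush the remainder
}

A single multi-row INSERT avoids the per-row model hydration, event dispatching, and query round trips that saving individual Eloquent models incurs.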

Besides the points already made you could consider:

  • First run a job which reads your input xlsx/csv file in batches of 50 lines (or some other sensible number). Make sure you do not keep all lines in memory. Then create a new job for each batch of lines, so you'll have a 2-step import (see the sketch after this list).
  • When creating separate jobs for each batch, you can run those jobs concurrently (= multiple queue workers at the same time).
  • If you have users waiting for the import to be finished, make sure you show a progress bar or at least an animated loader of some kind. Although this does not speed up the process, it will give them an indication work is being done.
  • You could also leverage database transactions for running multiple database queries at once (credit to Philippe Thiers for this one).
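
Combining the batching, concurrency, and transaction points, here is a rough sketch of the two-step import; the ImportUserBatch job, the users table, and its columns are made-up names for illustration:

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Support\Facades\DB;

// Step 2: one queued job per batch; several queue workers can run these concurrently
class ImportUserBatch implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    /** @var array */
    private $lines;

    public function __construct(array $lines)
    {
        $this->lines = $lines;
    }

    public function handle()
    {
        // A transaction commits the whole batch of inserts at once
        DB::transaction(function () {
            DB::table('users')->insert($this->lines);
        });
    }
}

// Step 1 (e.g. in a console command or a first job): stream the file line by
// line and dispatch one ImportUserBatch per 50 lines, never holding the whole
// file in memory.
$handle = fopen(storage_path('app/users.csv'), 'r');
$batch  = [];

while (($row = fgetcsv($handle)) !== false) {
    $batch[] = ['name' => $row[0], 'email' => $row[1]];

    if (count($batch) === 50) {
        ImportUserBatch::dispatch($batch);
        $batch = [];
    }
}

if (!empty($batch)) {
    ImportUserBatch::dispatch($batch);
}

fclose($handle);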

Upvotes: 5

manu

Reputation: 351

(copied from laracasts) This will probably help too:

DB::connection()->disableQueryLog();

"By default, Laravel keeps a log in memory of all queries that have been run for the current request. However, in some cases, such as when inserting a large number of rows, this can cause the application to use excess memory."

Upvotes: 4
