Reputation: 136
I have functionality that imports a very large number of records, around 5 million.
During the import I also have to create entries in several related tables at the same time.
I am already batching the new rows into bulk insert queries and processing everything in chunks.
What other ways are there to speed up the process?
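For reference, a minimal sketch of the chunked bulk-insert approach described above; the users table, the chunk size of 1000, and the already-parsed $rows array are all assumptions for illustration:

use Illuminate\Support\Facades\DB;

// $rows is assumed to be an array of associative arrays matching the users table columns.
foreach (array_chunk($rows, 1000) as $chunk) {
    // One multi-row INSERT per 1000 records instead of 1000 single-row queries.
    DB::table('users')->insert($chunk);
}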
Upvotes: 9
Views: 9247
Reputation: 19
Use:
- Processing the data in chunks
- Laravel queues
- Laravel Excel: https://docs.laravel-excel.com/3.1/imports/

Example of binding each row to the User model:
namespace App\Imports;

use App\User;
use Illuminate\Support\Facades\Hash;
use Maatwebsite\Excel\Concerns\Importable;
use Maatwebsite\Excel\Concerns\ToModel;

class UsersImport implements ToModel
{
    use Importable;

    // Maps one spreadsheet row to a User model instance.
    public function model(array $row)
    {
        return new User([
            'name'     => $row[0],
            'email'    => $row[1],
            'password' => Hash::make($row[2]),
        ]);
    }
}
In the controller:
(new UsersImport)->import('users.xlsx', 'local', \Maatwebsite\Excel\Excel::XLSX);
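The class above still reads and inserts everything in a single pass. To actually apply the chunking and queueing suggestions, Laravel Excel 3.1 offers the WithChunkReading, WithBatchInserts and ShouldQueue concerns; a rough sketch (the chunk and batch sizes of 1000 are arbitrary assumptions):

namespace App\Imports;

use App\User;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Support\Facades\Hash;
use Maatwebsite\Excel\Concerns\Importable;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithBatchInserts;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class UsersImport implements ToModel, WithBatchInserts, WithChunkReading, ShouldQueue
{
    use Importable;

    public function model(array $row)
    {
        return new User([
            'name'     => $row[0],
            'email'    => $row[1],
            'password' => Hash::make($row[2]),
        ]);
    }

    // Insert rows in batches of 1000 instead of one query per row.
    public function batchSize(): int
    {
        return 1000;
    }

    // Read the spreadsheet 1000 rows at a time to keep memory usage flat.
    public function chunkSize(): int
    {
        return 1000;
    }
}

With ShouldQueue in place, something like (new UsersImport)->queue('users.xlsx') should push each chunk onto the queue as its own job.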
Upvotes: 0
Reputation: 3835
So, to summarize for people who don't want to look through all the comments separately:
Besides the points already made, you could consider:
Upvotes: 5
Reputation: 351
This will probably help too (copied from Laracasts):
DB::connection()->disableQueryLog();
"By default, Laravel keeps a log in memory of all queries that have been run for the current request. However, in some cases, such as when inserting a large number of rows, this can cause the application to use excess memory."
Upvotes: 4