Donal.Lynch.Msc

Reputation: 3605

Artisan migrate fails for (very) large number of data inserts

So I've been using Laravel's database migrations in a recent project and everything works perfectly, except for a "cities" table that has about 3.8 million rows. The following works as expected:

DB::table('cities')->insert([
    'name' => 'Dublin'
]);

But when I add the additional 3.8 million rows to the insertion array above, the artisan migrate command just fails or times out.

Am I missing something here or is there a better way to do it?

The file size of the cities migration is 365 MB, which actually crashes PhpStorm (out-of-memory errors). I'm wondering if there's a way to split a large DB migration into smaller files?

PHP 7.2, Laravel 5.7, Docker/Laradock.

Upvotes: 0

Views: 421

Answers (3)

user10128333

Reputation: 156

A job is preferable in this case; it can insert the data in chunks, one batch per chunk. And, as addi2113 has explained, you should use a seeder if this is for a testing environment.
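A minimal sketch of the chunked-seeder approach, assuming the city names come from a CSV file (the file path and the single "name" column are assumptions, not from the question):

// database/seeds/CitiesTableSeeder.php (hypothetical seeder)
use Illuminate\Database\Seeder;
use Illuminate\Support\Facades\DB;

class CitiesTableSeeder extends Seeder
{
    public function run()
    {
        $handle = fopen(database_path('seeds/data/cities.csv'), 'r');

        $batch = [];
        while (($row = fgetcsv($handle)) !== false) {
            $batch[] = ['name' => $row[0]];

            // Flush every 1,000 rows so neither the PHP array
            // nor the single INSERT statement grows unbounded.
            if (count($batch) === 1000) {
                DB::table('cities')->insert($batch);
                $batch = [];
            }
        }

        // Insert the final partial batch, if any.
        if (!empty($batch)) {
            DB::table('cities')->insert($batch);
        }

        fclose($handle);
    }
}

You would then run it with php artisan db:seed --class=CitiesTableSeeder instead of embedding the rows in the migration.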

Upvotes: 1

Farid shahidi

Reputation: 352

First of all, you can split the records across two (or more) seeders.

You also have to raise the memory limit in your php.ini settings.

See: How to assign more memory to docker container
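For reference, a hedged example of the php.ini change (the 2G value is an assumption; size it to your dataset), plus the equivalent one-off override for a single artisan run:

; php.ini (CLI) — raise the memory ceiling for the migration process
memory_limit = 2G

# or, without editing php.ini, override it for one run:
php -d memory_limit=2G artisan migrate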

Upvotes: 1

addi2113

Reputation: 154

I would consider doing it in a job and running it on a Redis queue.

So just write a simple command that dispatches the job. I'd also suggest writing the data in chunks, e.g. 1,000 rows at a time :)
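A minimal sketch of that setup under Laravel 5.7 (the job name and the 1,000-row chunk size are assumptions):

// app/Jobs/InsertCityChunk.php (hypothetical job)
namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Support\Facades\DB;

class InsertCityChunk implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    private $rows;

    public function __construct(array $rows)
    {
        $this->rows = $rows;
    }

    public function handle()
    {
        // One bounded insert per job; the Redis worker drains
        // the queue one chunk at a time.
        DB::table('cities')->insert($this->rows);
    }
}

The command then just splits the source data and dispatches:

// inside the command's handle() method
foreach (array_chunk($cities, 1000) as $chunk) {
    InsertCityChunk::dispatch($chunk);
}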

Upvotes: 1
