Reputation: 104
I have looked for this issue but couldn't find a solution that fits my need.
I created tables for State, City & Locality with 37, 7431 & 91853 records respectively.
Seeding was taking a long time when I used create, so I changed my code to use insert in the seeder instead. Then I learned about chunking from a Laravel Daily video.
Chunking works fine in the CitySeeder, but I am getting an issue in the LocalitySeeder.
This is the code in my Seeder:
<?php

namespace Database\Seeders;

use App\Models\Locality;
use Illuminate\Database\Seeder;

class LocalitySeeder extends Seeder
{
    public function run()
    {
        $input = [
            [ 'name' => 'Adilabad', 'city_id' => 5487, 'created_at' => now()->toDateTimeString(), 'updated_at' => now()->toDateTimeString() ],
            // ...
            [ 'name' => 'Nalgonda', 'city_id' => 5476, 'created_at' => now()->toDateTimeString(), 'updated_at' => now()->toDateTimeString() ],
        ];

        $chunks = array_chunk($input, 5000, true);

        foreach ($chunks as $key => $data) {
            Locality::insert($data);
        }
    }
}
Thanks in Advance.
Upvotes: 1
Views: 716
Reputation: 3490
You're trying to load an array of close to 92K entries ($input), and then copying that into a chunked array with array_chunk. The memory limits you've configured (or that were set during installation) do not allow you to use that much RAM. One way to fix this, as Daniel pointed out, is to increase your memory limit, but another might be to read this data from a seed file. I'll try to illustrate the seed-file approach below, but you may need to adapt it a little.
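If you just want to try the memory-limit route first, a minimal sketch would be to raise it at the top of the run() method; the '512M' value is only a placeholder to adjust for your dataset:
ini_set('memory_limit', '512M'); // placeholder value; raise only as much as you actually need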
In localities.json, note that each line is an individual JSON object:
{"name": "Adilabad", "city_id": 5487}
[...]
{"name": "Nalgonda", "city_id": 5476}
Then in the seeder class, you can implement the run method as follows:
$now = now()->toDateTimeString();
$handle = fopen(__DIR__."/localities.json", "r"); // fopen() needs a mode argument
while (($line = fgets($handle)) !== false) {
    $data = json_decode($line, true);
    $data["created_at"] = $now;
    $data["updated_at"] = $now;
    Locality::insert($data);
}
fclose($handle);
EDIT: With regard to the time this will take, it will always be slow with this much data, but a seeder isn't meant to be run often anyway.
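If the one-row-at-a-time inserts are the bottleneck, a possible middle ground, sketched below under the assumption that a batch of 1000 rows fits comfortably in memory, is to buffer the decoded lines and insert them in batches:
$now = now()->toDateTimeString();
$handle = fopen(__DIR__."/localities.json", "r");
$batch = [];

while (($line = fgets($handle)) !== false) {
    $data = json_decode($line, true);

    // Skip blank or malformed lines.
    if (!is_array($data)) {
        continue;
    }

    $data["created_at"] = $now;
    $data["updated_at"] = $now;
    $batch[] = $data;

    // Flush every 1000 rows so the buffer never grows unbounded.
    if (count($batch) === 1000) {
        Locality::insert($batch);
        $batch = [];
    }
}

// Insert whatever is left over.
if (!empty($batch)) {
    Locality::insert($batch);
}

fclose($handle);
This keeps memory usage bounded by the batch size while cutting the number of queries down considerably compared to inserting one row per line.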
Upvotes: 1