Reputation: 76
I have a Laravel application with about 52 million records in a single `serials` table, created as below.
Schema::create('serials', function (Blueprint $table) {
    $table->id();
    $table->bigInteger('pinNumber');
    $table->bigInteger('serialNumber');
    $table->boolean('checked')->default(0);
    $table->boolean('status')->default(0);
    $table->string('lotNumber')->nullable();
    $table->integer('checkCode');
    $table->index(['serialNumber', 'pinNumber']);
    $table->softDeletes();
    $table->timestamps();
});
I am planning to grow to about 100 million records, but the web app is already extremely slow with the current 52 million. Insertions (auto-generated serials) work fine, but the count queries below take much longer than expected.
$totalSerials = Serial::max('id');
$totalDownload = Lot::sum('count');
$approvedCodes = Serial::where('checked', true)->count();
Please advise on the best way to handle big data with Laravel. My server has 8 GB of RAM and a 160 GB SSD.
Upvotes: 2
Views: 3545
Reputation: 5010
I suggest using a cache server such as Redis or Memcached and storing the cached result instead of recalculating it on every request, like:
$totalDownload = Cache::remember('total_download', 60 * 60 /* seconds */, function () {
    return Lot::sum('count');
});
Upvotes: 1
Reputation: 954
For data on the order of 100 million rows, and in your case where you may not actually need a relational database, it's worth considering a NoSQL DBMS such as MongoDB.
That said, indexing can also help the DBMS return query results much faster.
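For example, since one of the slow queries filters on `checked`, an index on that column may let the count use an index scan instead of a full table scan (though the benefit of indexing a low-cardinality boolean depends on the data distribution). A hypothetical migration sketch:

```php
<?php

use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

// Hypothetical migration: adds an index on `checked` to speed up
// Serial::where('checked', true)->count().
return new class extends Migration {
    public function up(): void
    {
        Schema::table('serials', function (Blueprint $table) {
            $table->index('checked');
        });
    }

    public function down(): void
    {
        Schema::table('serials', function (Blueprint $table) {
            $table->dropIndex(['checked']);
        });
    }
};
```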
Mostly in these situations you should improve your database's performance instead of relying on Laravel. I suggest partitioning your table on the created_at column, for example monthly partitioning if the date intervals are not too close together. This will also improve your database's performance.
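A sketch of what monthly range partitioning might look like, assuming MySQL and raw DDL run through the query builder. Note that MySQL requires every unique key, including the primary key, to contain the partitioning column, so the primary key has to be extended first; the partition names and date boundaries below are illustrative only:

```php
<?php

use Illuminate\Support\Facades\DB;

// Hypothetical sketch, assuming MySQL: partition `serials` by month of
// created_at. The primary key must include the partitioning column.
DB::statement('ALTER TABLE serials DROP PRIMARY KEY, ADD PRIMARY KEY (id, created_at)');

DB::statement("
    ALTER TABLE serials
    PARTITION BY RANGE (TO_DAYS(created_at)) (
        PARTITION p2023_01 VALUES LESS THAN (TO_DAYS('2023-02-01')),
        PARTITION p2023_02 VALUES LESS THAN (TO_DAYS('2023-03-01')),
        PARTITION pmax     VALUES LESS THAN MAXVALUE
    )
");
```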
Another idea: if you don't need results like Lot::sum('count')
or Serial::where('checked', true)->count()
instantly, store them in a separate summary table and update its values each night.
This last idea is not usually recommended for an operational database; it comes from BI processes and data warehousing, but maybe you should consider something like it here.
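A minimal sketch of that nightly approach using Laravel's scheduler, assuming a hypothetical `report_counters` summary table with `name` and `value` columns (the table, key names, and schedule time are all my own choices, not from the question):

```php
<?php

// In app/Console/Kernel.php -- recompute the expensive aggregates once
// a night and store them in a small `report_counters` summary table,
// so page loads read precomputed values instead of scanning 52M rows.

use App\Models\Lot;
use App\Models\Serial;
use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Support\Facades\DB;

protected function schedule(Schedule $schedule): void
{
    $schedule->call(function () {
        $counters = [
            'total_serials'  => Serial::max('id'),
            'total_download' => Lot::sum('count'),
            'approved_codes' => Serial::where('checked', true)->count(),
        ];

        foreach ($counters as $name => $value) {
            DB::table('report_counters')->updateOrInsert(
                ['name' => $name],
                ['value' => $value, 'updated_at' => now()]
            );
        }
    })->dailyAt('02:00');
}
```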
Upvotes: 2