Nikhil Bhatia

Reputation: 344

Is Laravel Eloquent with chunking slower than Laravel's normal query builder with chunking when working with millions of records?

I am really new to developing web applications, so I am working with Laravel and learning as I go; please excuse my really simple questions. It would be really helpful if someone could answer them and help me learn about this.

Let me start by describing my table structure: Table Structure

I am using MySQL's InnoDB. This table currently stores around 13k records and is meant to store millions or even billions of records in the future.

Now, to describe my problem:

When I fetch all records from this table (i.e. 13k records) with Laravel's normal query builder using chunk (the following code shows how I fetch them), I get results within 1.39 seconds. That is fine, though it should really be even more efficient (these results may be due to the limited RAM on my system; I haven't tested on high-end machines). If I am making mistakes here as well, please let me know.

DB::table($tableName->dataTableName)->orderBy('id')->chunk(100, function($data) {
    foreach ($data as $record) {
        echo "DateTime: " .$record->DateTime. " id: ".$record->meter_id;
    }
});

When I do the same with Eloquent using chunks (the following code shows how I fetch records with Eloquent), I do not get complete results even after 60 seconds, which is totally unacceptable.

Data::orderBy('id')->chunk(100, function($data) {
    foreach ($data as $record) {
        echo "DateTime: " .$record->DateTime. " id: ".$record->meter_id
    }
});

I use other models in the Data model to get the table name. The following is the content of the Data model:

<?php namespace App;

use Illuminate\Database\Eloquent\Model;
use App\dataTableMaster as DataTableMaster;
use App\Company;

class Data extends Model {

    /**
     * The database table used by the model.
     *
     * @var string
     */
    protected $table = 'data';

    // The following constructor assigns the table name depending on which client has logged in.
    public function __construct(){
        $user = \Auth::user();
        $associatedIdOfUser = $user->asso_id;
        $associatedCompanyObjectOfUser = Company::where('id',$associatedIdOfUser)->first();
        $companyRoot = $associatedCompanyObjectOfUser->getRoot();
        $tableObject = DataTableMaster::where('company_id',$companyRoot->id)->orderBy('created_at','desc')->first();
        $this->table = $tableObject->dataTableName;
    }

}

Now, my questions are:

  1. Is Eloquent really slower than the normal query builder because Eloquent creates objects from the results while the query builder does not?
  2. If Eloquent is not much slower than the query builder, i.e. Eloquent is also efficient at fetching thousands or millions of records, then:
    1. What mistakes am I making here that are causing these problems? Is it the table structure, the way I fetch records, or something else? Please explain.
    2. Is it advisable to work with such a large table using Eloquent?
  3. If Eloquent really is slower and not advisable when working with millions of records, then how can we work with tables this large? Is the query builder the only way? And why is Eloquent slower: because it creates objects, or for some other reason?

Upvotes: 3

Views: 5486

Answers (1)

Coloured Panda

Reputation: 3467

Using Eloquent is roughly 3 times slower than using the query builder directly (benchmarks). An ORM has to map data to objects, so it will always be slower; there is no way around that.
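If you want to measure the overhead on your own data, a rough sketch like the one below (not a rigorous benchmark; the table and the Data model are taken from the question, and the numbers will depend on your schema and hardware) makes the hydration cost visible:

// Query builder: the callback receives plain stdClass rows.
$start = microtime(true);
DB::table('data')->orderBy('id')->chunk(1000, function($rows) {
    // no-op: we only care about fetch time without model hydration
});
echo "Query builder: " . round(microtime(true) - $start, 2) . "s\n";

// Eloquent: every row is hydrated into an App\Data model instance.
$start = microtime(true);
Data::orderBy('id')->chunk(1000, function($models) {
    // no-op: the hydration itself is the cost being measured
});
echo "Eloquent: " . round(microtime(true) - $start, 2) . "s\n";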

If you plan on dealing with millions of records, forget about an ORM. It doesn't exist for you anymore. You'll have to write custom queries.

Chunking by 100 is not enough. I've found it safe to go up to 5000. You may increase or decrease this number by monitoring your memory usage.
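For example, you can watch memory per chunk while tuning the size (just a sketch using PHP's built-in memory_get_usage()):

DB::table('data')->select('DateTime', 'meter_id')->chunk(5000, function($data) {
    // process the chunk here...
    echo "Memory after chunk: " . round(memory_get_usage(true) / 1048576, 1) . " MB\n";
});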

Keep in mind that SQL is faster than PHP, so do as much as you can in SQL. And remember to select only the fields you need - it will bring memory usage down.
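For instance, if what you ultimately need is an aggregate per meter, let MySQL compute it instead of looping over every row in PHP (a hypothetical example - adjust the column names to your schema):

$totals = DB::table('data')
    ->select('meter_id', DB::raw('COUNT(*) as readings'))
    ->groupBy('meter_id')
    ->get();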

ORDER BY is slow on large data sets because MySQL may have to sort on disk (a filesort) rather than in memory.
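Ordering by the primary key (id) is usually fine because it is already indexed. If you ever need to order or filter by another column such as DateTime, an index helps MySQL avoid a filesort - a migration sketch (the column name is assumed from the question):

Schema::table('data', function (Blueprint $table) {
    $table->index('DateTime');
});

Putting the advice together, the fetch itself would look something like this: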

DB::table('data')->select('DateTime', 'meter_id')->chunk(5000, function($data) {
    foreach ($data as $record) {
        echo "DateTime: " .$record->DateTime. " id: ".$record->meter_id;
    }
});

Depending on your setup, echo can slow things down by a lot.
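If you do need the output, buffering it and writing once per chunk (or to a file) avoids thousands of individual echo calls - a sketch using Laravel's storage_path() helper:

DB::table('data')->select('DateTime', 'meter_id')->chunk(5000, function($data) {
    $lines = [];
    foreach ($data as $record) {
        $lines[] = "DateTime: " . $record->DateTime . " id: " . $record->meter_id;
    }
    file_put_contents(storage_path('export.txt'), implode("\n", $lines) . "\n", FILE_APPEND);
});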

Upvotes: 5
