Reputation: 2176
I am using Laravel (v5.6) pagination. I have two tables: table1 (~1.6 million rows) and table2 (~0.8 million rows). I am joining them on a condition. My query is as follows:
$results = $connection->table('table1 as db1')
    ->select('db1.id as kuchikomi_id', 'db1.comment as splitted_kuchikomi', 'db1.kuchikomi_sentiment', 'db1.status as kuchikomi_status', 'db2.id', 'db2.hotel_id', 'db2.comment', 'db2.post_date', 'db2.ota_name', 'db2.negative_comment', 'db2.positive_comment', 'db2.language')
    ->join('table2 as db2', 'db1.kuchikomi_id', '=', 'db2.id')
    ->where('db1.kuchikomi_language', $language)
    ->whereIn('db1.id', $splitted_id)
    ->paginate(100);
Here $splitted_id is a collection of ~0.6 million IDs. When I execute this query, it runs for more than 12 minutes and then terminates because the maximum execution time is exceeded.
table1 indices -
My question is: is a join operation suitable for tables of this size? If yes, how can I improve the execution time? I need some help to clear up these doubts. Thank you.
Upvotes: 2
Views: 180
Reputation: 8739
Since the where statements are the cause of the performance issue, you could try creating a composite index on db1.kuchikomi_language and db1.id by executing this in a migration:
$table->index(['kuchikomi_language', 'id']);
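A minimal migration sketch for that index (the class and table names here are assumptions based on the question; adjust them to your schema):

```php
<?php

use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

// Hypothetical migration class name; run with `php artisan migrate`.
class AddKuchikomiLanguageIdIndexToTable1 extends Migration
{
    public function up()
    {
        Schema::table('table1', function (Blueprint $table) {
            // Composite index: the leading column matches the equality
            // filter on kuchikomi_language, so the database can use it
            // for both the WHERE clause and the IN list on id.
            $table->index(['kuchikomi_language', 'id']);
        });
    }

    public function down()
    {
        Schema::table('table1', function (Blueprint $table) {
            $table->dropIndex(['kuchikomi_language', 'id']);
        });
    }
}
```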
That will speed up the where('db1.kuchikomi_language', $language) statement.
After that, you could try to use a filter instead of the whereIn('db1.id', $splitted_id) statement, if this part of the query takes a long time to execute. You can analyse this by using the Laravel Debugbar.
That will give you the possibility to cache the results before and/or after the filtering of the split IDs, depending on how long the filtering takes.
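A rough sketch of that idea, filtering in PHP against a hash set instead of sending 0.6 million IDs in a whereIn, with the unfiltered result cached (the cache key, TTL, and column selection are illustrative assumptions, not part of the original answer):

```php
<?php

use Illuminate\Support\Facades\Cache;

// Build an O(1) lookup set from the ID collection.
$idSet = array_flip($splitted_id->all());

// Cache the joined rows for this language; key and 600s TTL are assumptions.
$base = Cache::remember("kuchikomi_rows_{$language}", 600, function () use ($connection, $language) {
    return $connection->table('table1 as db1')
        ->select('db1.id as kuchikomi_id', 'db2.*')
        ->join('table2 as db2', 'db1.kuchikomi_id', '=', 'db2.id')
        ->where('db1.kuchikomi_language', $language)
        ->get();
});

// Filter in PHP; isset() on the flipped array avoids a linear scan per row.
$filtered = $base->filter(function ($row) use ($idSet) {
    return isset($idSet[$row->kuchikomi_id]);
});
```

Whether this beats the whereIn depends on how many rows match the language filter; the Debugbar timings should tell you which side dominates.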
The pagination is probably not the cause of the performance issue, but it is good to analyse this as well.
Upvotes: 2