Reputation: 2208
I have searched a lot but have not found an answer to the following question. I have a table with lots of records (on the order of 100M) and I want to run the following query on it:
Entity.objects.filter(creation_time__gte=some_date).order_by('id').all()[0]
and my model looks something like this:
class Entity(models.Model):
    creation_time = models.DateTimeField(null=True, blank=True)
    # Other fields
Upvotes: 1
Views: 2844
Reputation: 53774
No matter which database you run this against and which index you use on the creation_time field, this query is likely to be slow. Why? Because your filter is an inequality rather than an equality, and you are combining it with a sort; in the worst case you could be sorting 99 million records.
One approach you could use is to create a non-serial primary key that is derived from the creation time, sort of like what is described here: https://stackoverflow.com/a/37605582/267540
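A minimal sketch of that idea, assuming a 64-bit integer primary key whose default packs the current timestamp into its high bits (the time_derived_id helper below is hypothetical and not part of the linked answer):

import random
import time

from django.db import models


def time_derived_id():
    # Hypothetical helper: current Unix timestamp (in ms) in the high bits,
    # random low bits so ids generated in the same millisecond rarely collide.
    # Ordering by id then follows creation time.
    return (int(time.time() * 1000) << 20) | random.getrandbits(20)


class Entity(models.Model):
    id = models.BigIntegerField(primary_key=True, default=time_derived_id)
    creation_time = models.DateTimeField(null=True, blank=True)
    # Other fields

With ids laid out this way, order_by('id') and a range filter on id can both be served by the primary key index.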
You could create a BTREE index on creation_time and then run a query that filters further with an upper bound:
Entity.objects.filter(
    creation_time__gte=some_date
).filter(
    creation_time__lt=some_other_date
).order_by('id')[0]
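If you go that route, the index itself can be declared on the model. A minimal sketch, assuming PostgreSQL, where models.Index creates a BTREE index by default (the index name is made up for the example):

from django.db import models


class Entity(models.Model):
    creation_time = models.DateTimeField(null=True, blank=True)
    # Other fields

    class Meta:
        indexes = [
            # The default index type on PostgreSQL is BTREE, which supports
            # the range filter on creation_time.
            models.Index(fields=['creation_time'], name='entity_creation_idx'),
        ]

The upper bound keeps the range small, so the database only has to sort the rows that fall between the two dates instead of everything after some_date.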
And lastly, you could run a query that uses 'only' to fetch nothing but the primary key; that would be an index-only query for PostgreSQL and could lead to a slight speed-up.
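A sketch of that last suggestion, assuming some_date and some_other_date are the same placeholders as above:

first = (
    Entity.objects
    .filter(creation_time__gte=some_date, creation_time__lt=some_other_date)
    .order_by('id')
    .only('id')   # SELECT only the primary key column
    .first()
)

Using .first() instead of [0] also avoids an IndexError when no row matches.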
Upvotes: 0