Reputation: 3021
I have a collection with more than 100k documents. A sample document looks like this:
{
"created_at" : 1545039649,
"priority" : 3,
"id" : 68,
"name" : "document68"
}
db.mycol.find().sort({created_at:1})
and
db.mycol.find().sort({priority:1})
both result in an error:
Error: error: {
"ok" : 0,
"errmsg" : "Executor error during find command: OperationFailed: Sort operation used more than the maximum 33554432 bytes of RAM. Add an index, or specify a smaller limit.",
"code" : 96,
"codeName" : "OperationFailed"
}
So I indexed those fields:
db.mycol.createIndex({'priority':1})
db.mycol.createIndex({'created_at':1}, {sparse:true})
I added a sparse index to created_at because it is a mandatory field.
Now
db.mycol.find().sort({priority:1})
returns results. But
db.mycol.find().sort({created_at:1})
still results in the same error.
Upvotes: 0
Views: 384
Reputation: 262474
The sparse index can only be used when you filter by created_at: {$exists: true}.
The reason is that all the other records are not part of the index, but they are still supposed to appear in the result (probably at the end), so MongoDB cannot use the sparse index for an unfiltered sort.
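For example, a query that explicitly excludes the missing documents can be satisfied by the sparse index (a sketch using the collection and field names from the question):

```js
// Only documents that actually contain created_at are in the sparse index,
// so this filtered sort can walk the index instead of sorting in RAM:
db.mycol.find({ created_at: { $exists: true } }).sort({ created_at: 1 })
```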
Maybe you don't need to make the index sparse at all. A sparse index only makes sense when most of the records do not have the field; otherwise you don't save much index storage anyway. created_at sounds like a field most records would have.
You said: "Added sparse index to created_at as it is a mandatory field."
Actually, it is the other way around: You only want a sparse index when the field is optional (and quite rare).
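If created_at really is present on every document, replacing the sparse index with a regular one lets the unfiltered sort use it. A sketch in the mongo shell, assuming the default index name MongoDB generates for {created_at: 1}:

```js
// Drop the sparse index ("created_at_1" is the auto-generated name)
// and rebuild it as a regular index:
db.mycol.dropIndex("created_at_1")
db.mycol.createIndex({ created_at: 1 })

// The full-collection sort can now use the index,
// avoiding the 32 MB in-memory sort limit:
db.mycol.find().sort({ created_at: 1 })
```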
Upvotes: 1