Reputation: 1837
I have searched for similar questions and applied the suggested solutions, but they do not work in my case. I am querying a Mongoose collection that contains 60k documents, and I need all 60k of them to apply combinatorics, so I can't use limits. I could reduce the data volume by querying multiple times on a certain property, but that would also be costly in terms of performance. I don't see what else to try. Can someone help me out?
I am using this simple code for now:
StagingData.find({})
  .lean()
  .exec(function (err, results) {
    console.log(results); // I don't get any output
  });
When I use:
let data = await StagingData.find({}).lean() //it takes forever
What should I do?
Upvotes: 0
Views: 148
Reputation: 2436
You might want to apply indexing first, precompute some values as a separate operation, use parallel processing, etc. For this you may want to move to a different technology, maybe Elasticsearch or Spark, depending on your code.
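The precomputation idea can be sketched in plain Node. This is a hypothetical example (the `category` and `value` field names are assumptions, not from your schema): group the documents by a key once, so the combinatorics step works on small per-group arrays instead of rescanning all 60k documents each time.

```javascript
// Hypothetical sketch: group documents by a key once, as a separate
// precomputation step, so later work operates on small per-group arrays.
function groupByKey(docs, key) {
  const groups = new Map();
  for (const doc of docs) {
    const k = doc[key];
    if (!groups.has(k)) groups.set(k, []);
    groups.get(k).push(doc);
  }
  return groups;
}

// Stand-in documents (field names are assumptions):
const docs = [
  { category: 'a', value: 1 },
  { category: 'b', value: 2 },
  { category: 'a', value: 3 },
];
const byCategory = groupByKey(docs, 'category');
console.log(byCategory.get('a').length); // → 2
```

If the groups are stable, you could persist them (or the derived values) back to MongoDB so the expensive pass runs once rather than on every request.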
You may also want to identify the bottleneck in your process: memory or CPU. Try experimenting with a smaller set of documents and see how quickly you get results; from that you can infer how long the whole dataset will take. You may also try breaking your operation into smaller chunks and measuring the cost of each step.
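The experiment above can be sketched as a small timing harness in plain Node (no database needed): run your processing step on growing slices of a stand-in dataset and extrapolate. The `value` field and the summing loop are placeholders for your real combinatorics.

```javascript
// Hypothetical benchmark sketch: time the processing step on growing
// slices of a dataset to estimate the cost of the full 60k-document run.
function timeProcessing(docs) {
  const start = process.hrtime.bigint();
  let acc = 0;
  for (const d of docs) acc += d.value; // stand-in for the real combinatorics
  const end = process.hrtime.bigint();
  return Number(end - start) / 1e6; // elapsed milliseconds
}

const dataset = Array.from({ length: 60000 }, (_, i) => ({ value: i }));
for (const n of [1000, 10000, 60000]) {
  const ms = timeProcessing(dataset.slice(0, n));
  console.log(`${n} docs: ${ms.toFixed(2)} ms`);
}
```

If the time grows much faster than linearly with `n`, the combinatorics (not Mongoose) is the bottleneck, and chunking the query won't save you on its own.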
Upvotes: 1