Reputation: 1527
I am trying to query a large set of users with a specific property (give me all users who are enrolled into a service). The number is expected to be > 10,000, and each user object is huge. In the function below, I can see 'reached here' logged to the command line, showing that the function is at least called, but I never get access to allUsers; the string 'all users length' isn't even logged. My guess is that maybe Mongoose has a read timeout for large sets of data. Users is a Mongoose collection. If anyone has an idea of what's going on, or a better way to query large sets of data, all suggestions are welcome. Thanks
function get_userIds_ready_for_fulfillment(Users) {
    logger.info('reached here');
    Users.find({ "isEnrolled": true }, (error, allUsers) => {
        if (error) {
            return logger.error('Fullfiment_job_Error', 'error querying all users');
        }
        logger.info('all users length', allUsers.length);
    });
}
Upvotes: 1
Views: 3006
Reputation: 15366
You can consider

Users.find({ "isEnrolled": true }).select({ myField: 1 }).exec((err, allUsers) => { ... })

Docs. Selecting only the fields you need keeps each returned document small. Also look at skip and limit. Skipping is an expensive operation for the database (discussed here), but it is better than what you have now. It is also possible with this approach to miss data if something is inserted while you're paging. (count is also sort of an aggregation operator not listed in that section.)
Upvotes: 4