Reputation: 1643
I want to do the following:
I have more than 10,000 users in my db.
I need to send all of them an event that will change a value in their document.
For example:
user: {
    money: 10,
    skill: 5,
    mood: 1
}
This is what I want to avoid, because it's a memory and CPU hell:
User.find({}).exec(function(err, users) {
    users.forEach(function(user) {
        if (user.money < 10) {
            user.money += 5 * (some other params or something);
        }
        user.save();
    });
});
Also, I need to extract the id of each user whose money is lower than 10 and send him a push, so I can't use just "update", "inc" or "set".
This code crashes my server. How can I make it better? Should I use async? If yes, how?
Upvotes: 0
Views: 206
Reputation: 311865
You can do that with a single update using the $inc operator and the {multi: true} option so that it's applied to all matching docs:
User.update(
    {money: {$lt: 10}},
    {$inc: {money: 5 * (some other params or something)}},
    {multi: true},
    function(err, num) { ... });
If your updates are such that each doc needs special handling based on its content, you can use a streaming approach to limit the number of docs in memory at any one time:
var stream = User.find({}).stream();
stream.on('data', function(user) {
    stream.pause();
    if (user.money < 10) {
        user.money += 5 * (some other params or something);
    }
    // More document-specific updates
    ...
    user.save(function(err, doc) {
        // The changes to this doc are complete, move on to the next one.
        stream.resume();
    });
}).on('error', function(err) {
    console.error(err);
}).on('close', function() {
    console.log('All done!');
});
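The same stream can also collect the ids that need a push as it goes; a minimal sketch, again assuming a hypothetical sendPush(ids) helper and a flat +5 increment:
var idsToPush = [];
var stream = User.find({}).stream();

stream.on('data', function(user) {
    stream.pause();
    if (user.money < 10) {
        user.money += 5;
        idsToPush.push(user._id); // remember who needs a push
    }
    user.save(function(err) {
        stream.resume();
    });
}).on('close', function() {
    sendPush(idsToPush); // hypothetical push helper
});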
Upvotes: 1
Reputation: 1213
What about a Query Stream? In your example you are loading all the users into process memory and then doing the operations; with a stream you can fetch a single doc at a time and process it.
Upvotes: 0