Reputation: 417
I have the following documents:
[{
  "_id": 1,
  "name": "john",
  "position": 1
},
{
  "_id": 2,
  "name": "bob",
  "position": 2
},
{
  "_id": 3,
  "name": "tom",
  "position": 3
}]
In the UI a user can change the position of items (e.g., moving Bob to the first position, so John gets position 2 and Tom keeps position 3). Is there any way to update all positions in all documents at once?
Upvotes: 23
Views: 32867
Reputation: 751
As mentioned in other answers, MongoDB now supports aggregation pipelines in update operations such as updateOne and updateMany. This allows us to modify a set of documents with a distinct value for each one.
Aggregation pipeline in update
If you use the older update() method instead of updateMany(), you need to set multi: true in the options.
@Shubham's answer is correct and works for the author's case, but it might be inefficient because it uses $arrayElemAt and $filter: for every document MongoDB iterates over, it filters the same list again to find the matching value.
An alternative is to use a hash map keyed by the document _id, so we can leverage $getField. This makes the aggregation simpler and more efficient while still solving the author's problem.
HashMap<ID, newPosition>
let user_position_hash = {
2: 1,
1: 2,
}
aggregate
let agg = {
  "$set": {
    "position": {
      "$let": {
        "vars": {
          "new_position": {
            "$getField": {
              "field": { "$toString": "$_id" },
              "input": user_position_hash
            }
          }
        },
        "in": "$$new_position"
      }
    }
  }
};
Note: The answer given by @Derick, stating that it is not possible to update multiple documents with different values, predates pipeline-style updates and is no longer accurate; it can lead to misunderstandings.
Note 2: At first I could only get this working in Rust; in Node it did not work for me and I could not tell why.
Note 3: I later managed to do it in Node as well; just pass the pipeline as an array in the update parameter of updateMany:
db.collection('users').updateMany(query, [agg]);
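A slightly fuller sketch of that call (a non-authoritative example, assuming db is a connected Node.js driver database handle, the users collection from the question, and the user_position_hash and agg defined above; $getField requires MongoDB 5.0+):
// Only the documents whose _id appears in the hash need to match
const query = { _id: { $in: Object.keys(user_position_hash).map(Number) } };
// The pipeline must be wrapped in an array, otherwise the driver treats it
// as a plain update document and rejects the $let / $getField expressions
const result = await db.collection('users').updateMany(query, [agg]);
console.log(result.modifiedCount);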
Note 4: I added some quick performance tests at the end using Gatling, nothing too rigorous: test repo
Rust Code
// Build a map of user id -> new session id (each user gets a distinct value)
let users_to_update: HashMap<String, Bson> = users
    .clone()
    .into_iter()
    .map(|data| {
        let session_id = Bson::String(Uuid::new_v4().to_string());
        (data.id, session_id)
    })
    .collect::<HashMap<String, Bson>>();
let users_hash_bson: Bson = convert_hashmap_to_bson(users_to_update);

// Pipeline-style update: look up each document's new value in the hash by its _id
let agg: Vec<Document> = vec![doc! {
    "$set": {
        "session_id": {
            "$let": {
                "vars": {
                    "new_session_id": {
                        "$getField": {
                            "field": { "$toString": "$_id" },
                            "input": users_hash_bson
                        }
                    }
                },
                "in": "$$new_session_id"
            }
        }
    }
}];

let result = self
    .user_repository
    .update_many(
        doc! { "_id": { "$in": object_ids } },
        agg,
        None,
    )
    .await;
Gatling Tests
Results:
As expected, issuing an update per document inside a for loop performed worst by far. The best results came from bulkWrite and from the aggregate-with-hash approach, which were very close (within the margin of error). The aggregate approach with $filter came in behind those two, since filtering the whole array for every document is both costly and unnecessary.
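For reference, a minimal sketch of the kind of per-document loop compared above (assuming the same Node.js driver setup and an updates array of { _id, position } pairs, names of my own choosing); each iteration is a separate round trip to the server:
// Slowest variant: one updateOne (and one network round trip) per document
for (const { _id, position } of updates) {
  await db.collection('users').updateOne({ _id }, { $set: { position } });
}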
Upvotes: 0
Reputation: 499
Suppose that after updating the positions your array looks like this:
const objectToUpdate = [{
  "_id": 1,
  "name": "john",
  "position": 2
},
{
  "_id": 2,
  "name": "bob",
  "position": 1
},
{
  "_id": 3,
  "name": "tom",
  "position": 3
}].map(eachObj => {
  return {
    updateOne: {
      filter: { _id: eachObj._id },
      // bulkWrite's updateOne requires update operators, hence $set
      update: { $set: { name: eachObj.name, position: eachObj.position } }
    }
  }
})

YourModelName.bulkWrite(objectToUpdate,
  { ordered: false }
).then((result) => {
  console.log(result);
}).catch(err => {
  console.log(err.result.result.writeErrors[0].err.op.q);
})
This will update all positions with different values.
Note: I used ordered: false here for better performance.
Upvotes: 1
Reputation: 1426
Since MongoDB 4.2 you can do this by using an aggregation pipeline in update with the $set operator.
Many approaches are now possible thanks to the operators available in the aggregation pipeline; here is one of them:
exports.updateDisplayOrder = async keyValPairArr => {
  try {
    let data = await ContestModel.collection.update(
      { _id: { $in: keyValPairArr.map(o => o.id) } },
      [{
        $set: {
          displayOrder: {
            $let: {
              vars: {
                obj: {
                  $arrayElemAt: [
                    { $filter: { input: keyValPairArr, as: "kvpa", cond: { $eq: ["$$kvpa.id", "$_id"] } } },
                    0
                  ]
                }
              },
              in: "$$obj.displayOrder"
            }
          }
        }
      }],
      { runValidators: true, multi: true }
    )
    return data;
  } catch (error) {
    throw error;
  }
}
An example key-value pair array is: [{"id":"5e7643d436963c21f14582ee","displayOrder":9}, {"id":"5e7643e736963c21f14582ef","displayOrder":4}]
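For example, a call with that array might look like this (the import path is illustrative and depends on your project layout):
const { updateDisplayOrder } = require('./contestService'); // path is illustrative
await updateDisplayOrder([
  { id: "5e7643d436963c21f14582ee", displayOrder: 9 },
  { id: "5e7643e736963c21f14582ef", displayOrder: 4 }
]);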
Upvotes: 8
Reputation: 404
Since MongoDB 4.2, update can accept an aggregation pipeline as its second argument, allowing modification of multiple documents based on their own data.
Excerpt from documentation:
Modify a Field Using the Values of the Other Fields in the Document
Create a members collection with the following documents:
db.members.insertMany([
  { "_id" : 1, "member" : "abc123", "status" : "A", "points" : 2, "misc1" : "note to self: confirm status", "misc2" : "Need to activate", "lastUpdate" : ISODate("2019-01-01T00:00:00Z") },
  { "_id" : 2, "member" : "xyz123", "status" : "A", "points" : 60, "misc1" : "reminder: ping me at 100pts", "misc2" : "Some random comment", "lastUpdate" : ISODate("2019-01-01T00:00:00Z") }
])
Assume that instead of separate misc1 and misc2 fields, you want to gather these into a new comments field. The following update operation uses an aggregation pipeline to:
- add the new comments field and set the lastUpdate field.
- remove the misc1 and misc2 fields for all documents in the collection.
db.members.update(
  { },
  [
    { $set: { status: "Modified", comments: [ "$misc1", "$misc2" ], lastUpdate: "$$NOW" } },
    { $unset: [ "misc1", "misc2" ] }
  ],
  { multi: true }
)
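Applied to the original question, the same pipeline-style update can assign a different position to each document in one call. A minimal sketch using $switch (my own adaptation of the technique, assuming the documents live in a users collection):
db.users.updateMany(
  { _id: { $in: [1, 2] } },
  [{
    $set: {
      position: {
        $switch: {
          branches: [
            { case: { $eq: ["$_id", 2] }, then: 1 },  // bob moves to position 1
            { case: { $eq: ["$_id", 1] }, then: 2 }   // john moves to position 2
          ],
          default: "$position"                        // everyone else keeps their position
        }
      }
    }
  }]
)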
Upvotes: 1
Reputation: 3019
You can use db.collection.bulkWrite() to perform multiple operations in bulk; it has been available since MongoDB 3.2.
The operations can also be executed unordered to increase performance, as shown in the sketch below.
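A minimal sketch for the question's documents (assuming they live in a users collection), sending one updateOne per document but in a single bulkWrite round trip:
db.users.bulkWrite([
  { updateOne: { filter: { _id: 2 }, update: { $set: { position: 1 } } } },  // bob -> 1
  { updateOne: { filter: { _id: 1 }, update: { $set: { position: 2 } } } },  // john -> 2
  { updateOne: { filter: { _id: 3 }, update: { $set: { position: 3 } } } }   // tom stays 3
], { ordered: false })  // unordered: the server may apply the operations in any order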
Upvotes: 8
Reputation: 36784
You cannot update two documents at once with a single MongoDB query; you will always have to do that in two queries. You can of course set a field to the same value, or increment it by the same number, across many documents, but you cannot perform two distinct updates in MongoDB with the same query.
Upvotes: 34