Reputation: 1052
input_values = [{"001":"john"},{"002":"Josh"}] (consider there are many dicts in this)
The collection to be updated contains documents like:
{
    "_id": ObjectId("asasas87897s89as"),
    "name_id": "name1",
    "name": ""
}
I need to match the key of each dict in input_values against the collection's name_id field and update the name value.
The code I have tried:
for item in input_values:
    for key, value in item.items():
        self.client[collection].update({"name_id": key},
                                       {"$set": {"name": value}},
                                       upsert=False, multi=True)
This updates one record at a time, but I need to process 500 records at a time. I am using pymongo.
Upvotes: 0
Views: 245
Reputation: 360
You need to use the bulk write approach. From the MongoDB documentation:
The number of operations in each group cannot exceed the value of the maxWriteBatchSize of the database. As of MongoDB 3.6, this value is 100,000. This value is shown in the isMaster.maxWriteBatchSize field.
This limit prevents issues with oversized error messages. If a group exceeds this limit, the client driver divides the group into smaller groups with counts less than or equal to the value of the limit. For example, with the maxWriteBatchSize value of 100,000, if the queue consists of 200,000 operations, the driver creates 2 groups, each with 100,000 operations.
With the pymongo library, you aggregate the update operations into a list and execute them in a single bulk write. Here is an example to show the idea of bulk operations in pymongo (matching on the name_id field from the question):
from pymongo import MongoClient, UpdateOne

client = MongoClient()
collection = client.mydb.mycollection  # substitute your own database and collection names

input_values = [{"001": "john"}, {"002": "Josh"}]

updates = []
for item in input_values:
    # each dict holds a single name_id -> name pair
    key, value = next(iter(item.items()))
    updates.append(UpdateOne({"name_id": key}, {"$set": {"name": value}}))

# one round trip to the server; the driver splits the list into
# batches of at most maxWriteBatchSize operations automatically
collection.bulk_write(updates)
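If you want to cap the batch size yourself, for example at the 500 records mentioned in the question, you can slice the list before writing. This is a minimal sketch reusing the collection and updates variables from above; the batch_size value is illustrative, and ordered=False is optional:

batch_size = 500  # illustrative cap; the driver also splits oversized lists on its own

for start in range(0, len(updates), batch_size):
    chunk = updates[start:start + batch_size]
    # ordered=False lets the remaining operations in a chunk run
    # even if an earlier one fails
    collection.bulk_write(chunk, ordered=False)

With an unordered list of operations, MongoDB can also execute the writes in parallel, which usually helps throughput for independent updates like these.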
Upvotes: 2