Reputation: 95
I am using MongoDB with Mongoose for our Node.js API, where we need to do a sort of seed for collections whose data source is a JSON file. I am using Model.bulkWrite,
which internally uses MongoDB's bulkWrite (https://docs.mongodb.com/manual/core/bulk-write-operations).
Code below:
await Model.bulkWrite(docs.map(doc => (
  doc._id // illustrative condition: pick one operation per document
    ? { updateOne: { filter: { _id: doc._id }, update: { $set: doc } } } // update document
    : { insertOne: { document: doc } } // insert document
)));
This works fine for our current use case with just a few hundred documents, but we are worried about how it will scale and perform when the number of documents grows a lot. For example, will there be any issues once the number of documents is in the tens of thousands? I just want to confirm whether we are on the right path or whether there is room for improvement.
Upvotes: 1
Views: 3634
Reputation: 1318
bulkWrite in MongoDB currently has a maximum limit of 100,000 write operations in a single batch. From the docs:
The number of operations in each group cannot exceed the value of the maxWriteBatchSize of the database. As of MongoDB 3.6, this value is 100,000. This value is shown in the isMaster.maxWriteBatchSize field.
This limit prevents issues with oversized error messages. If a group exceeds this limit, the client driver divides the group into smaller groups with counts less than or equal to the value of the limit. For example, with the maxWriteBatchSize value of 100,000, if the queue consists of 200,000 operations, the driver creates 2 groups, each with 100,000 operations.
So, you won't face any performance issues until you exceed this limit.
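If the seed ever grows far past that limit, or you simply want to cap how many operations sit in driver memory at once, you can also split the array yourself. A minimal sketch, assuming your operations are already built into an array named ops (BATCH_SIZE is an arbitrary illustrative value; ordered: false is a standard bulkWrite option that lets MongoDB continue past individual write failures):

const BATCH_SIZE = 10000; // illustrative, tune for your workload

for (let i = 0; i < ops.length; i += BATCH_SIZE) {
  const batch = ops.slice(i, i + BATCH_SIZE);
  // ordered: false means one failed write does not abort the rest of the batch
  await Model.bulkWrite(batch, { ordered: false });
}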
Upvotes: 1