aedm

Reputation: 6614

Severe performance drop with MongoDB Change Streams

I want to get real-time updates about MongoDB database changes in Node.js.

A single MongoDB change stream sends update notifications almost instantly. But when I open multiple (10+) streams, there are massive delays (up to several minutes) between database writes and notification arrival.

This is how I set up a change stream:

let cursor = collection.watch([
  {$match: {"fullDocument.room": roomId}},
]);
cursor.stream().on("data", doc => {...});

I tried an alternative way to set up a stream, but it's just as slow:

let cursor = collection.aggregate([
  {$changeStream: {}},
  {$match: {"fullDocument.room": roomId}},
]);
cursor.forEach(doc => {...});

Some additional details:

- An automated process inserts tiny documents into the collection while collecting performance data.
- Both setups produce the same issue.

What could be going on here?

Upvotes: 45

Views: 15187

Answers (1)

aedm

Reputation: 6614

The default connection pool size in the Node.js MongoDB driver is 5. Each open change stream cursor occupies a connection from the pool while it waits for events, so the pool must be at least as large as the number of concurrent cursors; otherwise notifications queue up waiting for a free connection, which produces exactly these multi-minute delays.

In version 3.x of the Node.js MongoDB driver, use 'poolSize':

const mongoConnection = await MongoClient.connect(URL, {poolSize: 100});

In version 4.x of the Node.js MongoDB driver, use 'minPoolSize' and 'maxPoolSize':

const mongoConnection = await MongoClient.connect(URL, {minPoolSize: 100, maxPoolSize: 1000});
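As a rough sizing rule (my own heuristic, not something from the driver docs): budget one connection per concurrent change stream, plus a little headroom for ordinary queries. A minimal sketch, where the helper name and the headroom default are illustrative, not driver API:

```javascript
// Heuristic: one pooled connection per open change stream,
// plus spare connections for regular find/insert traffic.
function poolSizeFor(numStreams, headroom = 5) {
  return numStreams + headroom;
}

// e.g. 50 rooms, each watched by its own change stream:
const maxPoolSize = poolSizeFor(50); // 55

// Then pass it to the v4.x driver (assumes URL is defined):
// const client = await MongoClient.connect(URL, { maxPoolSize });
```

If the number of rooms is unbounded, consider multiplexing instead: open a single change stream on the collection and dispatch events by `fullDocument.room` in your own code, so the connection count stays constant.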

(Thanks to MongoDB Inc. for investigating this issue.)

Upvotes: 58
