Reputation: 1
I'm working on a Node.js application where I need to fetch large collections from MongoDB (potentially several GB of data), convert the data to JSON in chunks, and upload it to an AWS S3 bucket. However, during the data aggregation process I often encounter the following error, which causes my script to stop processing the current collection and move on to the next:
Error fetching data: PoolClearedOnNetworkError: Connection to [IP:PORT] interrupted due to server monitor timeout
at ConnectionPool.interruptInUseConnections (path-to/node_modules/mongodb/lib/cmap/connection_pool.js:272:36)
[Symbol(errorLabels)]: Set(1) { 'PoolRequestedRetry' },
[Symbol(errorLabels)]: Set(2) { 'ResetPool', 'InterruptInUseConnections' },
[Symbol(beforeHandshake)]: false,
[cause]: undefined
So far, I have used a batch-processing approach with an aggregation pipeline, using $skip and $limit stages to control how much data is fetched per batch.
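Concretely, my batching looks roughly like the sketch below (collection name and batch size are placeholders; the actual driver loop is shown in comments since it needs a live connection):

```javascript
const BATCH_SIZE = 1000; // hypothetical batch size

// Build the aggregation pipeline for one batch; $skip grows each iteration.
function batchPipeline(batchIndex, batchSize) {
  return [
    { $skip: batchIndex * batchSize },
    { $limit: batchSize },
  ];
}

// Usage with the official `mongodb` driver (assumes a connected client):
//
// const { MongoClient } = require('mongodb');
// const client = new MongoClient(process.env.MONGO_URI);
// const coll = client.db('mydb').collection('mycollection');
// for (let i = 0; ; i++) {
//   const docs = await coll.aggregate(batchPipeline(i, BATCH_SIZE)).toArray();
//   if (docs.length === 0) break;
//   // ...convert `docs` to JSON and upload the chunk to S3...
// }
```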
Any help would be appreciated.
Upvotes: 0
Views: 92