Reputation: 33
We run a small NodeJS app that manages subscriptions for our mobile apps. Its backend is a MongoDB instance with only 100MB of memory. Currently the data size is around 120MB. It's all hosted on a PaaS called Nodechef.
After running for about a week the Mongo server hit 98MB/100MB in memory usage. Not knowing what would happen, we forced a restart and it dropped back to 70MB or so. It's slowly creeping back up.
A few questions:
Is it normal behavior for Mongo's memory usage to keep growing like this up to the max?
What happens when it hits max? Will it reboot or crash, or do some kind of garbage collection?
Is restarting weekly a pretty normal fix for this type of issue?
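For reference, here is a minimal sketch of how these numbers can be read from the server itself, assuming the standard Node.js MongoDB driver (the connection URI is a placeholder; the fields are standard serverStatus output):

```javascript
// Sketch: read MongoDB's own view of its memory and open cursors
// via the Node.js driver (uri is a placeholder).
const { MongoClient } = require('mongodb');

async function printMemoryStats(uri) {
  const client = new MongoClient(uri);
  try {
    await client.connect();
    const status = await client.db('admin').command({ serverStatus: 1 });
    console.log('resident MB:', status.mem.resident);                 // RAM actually in use
    console.log('virtual MB:', status.mem.virtual);                   // virtual address space
    console.log('open cursors:', status.metrics.cursor.open.total);   // possible leak indicator
  } finally {
    await client.close();
  }
}
```

mem.resident and mem.virtual are reported in megabytes, and a steadily climbing open-cursor count would suggest a cursor leak rather than normal cache growth.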
Upvotes: 0
Views: 203
Reputation: 33
This is a Nodechef-specific answer based on how they handle this; other PaaS providers may handle it differently:
"When it hits 125%, SWAP included, it will auto restart itself. It does a graceful shutdown so there should not be any problems there.
In regards to if this is normal, depends, i have seen cases where the app does not close cursors properly causing a cursor leak on the database server which results in memory continuously increasing. Another issue could also be memory fragmentation on the server itself. As long as the restarts are not happening hourly, you are more than fine. Taking a couple week to hit its peak is ok."
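For what it's worth, the cursor-leak case mentioned above usually comes down to cursors that are never fully iterated or closed in the application. A minimal sketch of the pattern, assuming the standard Node.js MongoDB driver (database, collection, and field names are placeholders):

```javascript
// Sketch of cursor handling that avoids leaving cursors open server-side.
const { MongoClient } = require('mongodb');

async function listActiveSubscriptions(uri) {
  const client = new MongoClient(uri);
  try {
    await client.connect();
    const subs = client.db('billing').collection('subscriptions');

    // toArray() fully drains the cursor, so the server can release it.
    const active = await subs.find({ status: 'active' }).toArray();

    // Manual iteration: if you stop early, close the cursor explicitly,
    // otherwise it can linger on the server until it times out.
    const cursor = subs.find({});
    try {
      let doc;
      while ((doc = await cursor.next()) !== null) {
        if (doc.flagged) break;
      }
    } finally {
      await cursor.close();
    }

    return active;
  } finally {
    await client.close();
  }
}
```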
Upvotes: 0
Reputation: 14490
According to this, you can try setting hostInfo.system.memLimitMB,
but I am surprised MongoDB runs at all with only 100 MB of available memory (if that figure is accurate).
If the MongoDB process runs out of memory (i.e. requests a memory allocation that is denied), it is likely to terminate immediately.
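If you have shell access, you can check what limit MongoDB actually sees and how much the storage-engine cache is holding; a small mongosh sketch using the standard hostInfo and serverStatus output (assuming the WiredTiger storage engine):

```javascript
// mongosh sketch: what memory limit MongoDB sees, and current cache usage.
const info = db.hostInfo();
print('memLimitMB:', info.system.memLimitMB); // container/cgroup limit, if any
print('memSizeMB:', info.system.memSizeMB);   // physical memory visible to the process

const status = db.serverStatus();
// Bytes currently held in the WiredTiger cache (WiredTiger engines only):
print('cache bytes:', status.wiredTiger.cache['bytes currently in the cache']);
```

The cache itself is normally capped with storage.wiredTiger.engineConfig.cacheSizeGB (or the --wiredTigerCacheSizeGB startup option), if your host exposes the mongod configuration at all.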
Upvotes: 1