Jack

Reputation: 111

Regarding Memory Consumption of CouchDB

I did some evaluations of CouchDB recently. I found that memory consumption is quite high for view construction (map & reduce) as well as for importing larger JSON documents into CouchDB. I evaluated the view construction function on an Ubuntu system (4 cores, Intel® Xeon® CPU E3-1240 v5 @ 3.50GHz). Here are the results:

  1. four hundred 100 KB datasets consumed around 683 MB of memory;
  2. one 80 MB dataset consumed around 2.5 GB of memory;
  3. four 80 MB datasets consumed around 10 GB of memory.

It seems that memory consumption is many times the size of the original JSON dataset (roughly 17–32× in the tests above). Extrapolating, a 1 GB dataset would run CouchDB out of memory on this machine. Does anyone know why memory consumption is so high? Many thanks!
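
For reference, here is a minimal sketch of the kind of workload I mean, in Python against the HTTP API (the database name, credentials, and map function are placeholders, not my actual test code):

    import requests

    COUCH = "http://admin:password@localhost:5984"  # placeholder credentials
    DB = "memtest"                                  # placeholder database name

    # Create the database (a 412 response just means it already exists).
    requests.put(f"{COUCH}/{DB}")

    # Bulk-import JSON documents via _bulk_docs.
    docs = [{"_id": f"doc-{i}", "payload": "x" * 1024} for i in range(1000)]
    requests.post(f"{COUCH}/{DB}/_bulk_docs", json={"docs": docs})

    # A design document with a trivial map/reduce view; memory grows
    # while this index is being built, whatever the map logic is.
    ddoc = {
        "_id": "_design/test",
        "views": {
            "by_id": {
                "map": "function (doc) { emit(doc._id, 1); }",
                "reduce": "_count",
            }
        },
    }
    requests.put(f"{COUCH}/{DB}/_design/test", json=ddoc)

    # Querying the view triggers construction of the index.
    print(requests.get(f"{COUCH}/{DB}/_design/test/_view/by_id",
                       params={"limit": 1}).json())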

Upvotes: 10

Views: 2787

Answers (2)

Emir Cangır

Reputation: 23

I know this is a late answer, but I'll leave it here in case someone benefits. It comes down to response caching: CouchDB caches responses so it can return results faster. You can handle the issue by setting the caching limits.

See the configuration reference: https://docs.couchdb.org/en/latest/config/couchdb.html
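
For example, you can read and change a limit at runtime through the _config endpoint. A minimal Python sketch, assuming admin credentials on localhost; the key shown (query_server_config/os_process_limit, which caps the number of JavaScript view-builder processes) is just one documented limit, so substitute whichever setting applies to your case:

    import requests

    COUCH = "http://admin:password@localhost:5984"  # placeholder admin credentials

    # Read the current value of a config key on the local node.
    url = f"{COUCH}/_node/_local/_config/query_server_config/os_process_limit"
    print(requests.get(url).text)

    # Set a new value; CouchDB expects the body to be a JSON-encoded string.
    # Fewer concurrent couchjs processes bound one source of memory growth
    # during view construction.
    requests.put(url, json="25")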

Upvotes: 1

MitchB

Reputation: 41

I don't know why the memory usage is so high, but I know it's consistent behavior in CouchDB, and you can't really get around it as long as you have large documents. I eventually split out the data that I wanted to build views on and kept the full documents in a separate database for later extraction.
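
Roughly, the split looks like the sketch below (Python, with placeholder database names and projected fields; the point is that the view builder only ever sees the slim documents):

    import requests

    COUCH = "http://admin:password@localhost:5984"  # placeholder credentials

    def store_split(doc):
        """Write a slim projection for views and the full doc separately."""
        # Only the fields the map functions need (placeholders here), so
        # the view builder never touches the large payload.
        slim = {"type": doc.get("type"), "created": doc.get("created")}
        requests.put(f"{COUCH}/views_db/{doc['_id']}", json=slim)
        # Keep the complete document under the same _id for later lookup.
        requests.put(f"{COUCH}/full_db/{doc['_id']}", json=doc)

    # Later: query views against views_db, then fetch the full document:
    #   full = requests.get(f"{COUCH}/full_db/{some_id}").json()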

Upvotes: 1
