Reputation: 339
I fired up a Zilla instance of heroku postgres which is advertised as having 17GB of memory cache.
When I run show all; I see:

effective_cache_size | 12240000kB

Does this mean the cache is 12 GB and not 17 GB? Or am I missing something? Queries run much more slowly once my dataset grows past roughly the 12 GB mark.
Upvotes: 0
Views: 782
Reputation: 13306
There is a limit on the available memory on the underlying hardware (17 GB for a Zilla). That memory cannot all be devoted to caching the "hot" dataset, however: many other aspects of normal Postgres operation also need memory, as you may imagine. Each connection spawns a backend process; queries that perform joins, sorts, or aggregates such as count, sum, and max use working memory; and background processes such as autovacuum claim a share as well.
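For reference, the reported setting is in kB, so a quick conversion shows how it compares to the advertised total (a sketch; the 17 GB figure is taken from the question, and I'm treating it as GiB for the ratio):

```python
# Convert the reported effective_cache_size (12240000 kB) to GiB
# and compare it to the advertised 17 GB of cache.
kb = 12_240_000
gib = kb / (1024 * 1024)          # kB -> GiB
ratio = gib / 17                  # fraction of the advertised total

print(f"effective_cache_size ~= {gib:.2f} GiB")   # about 11.67 GiB
print(f"that is ~{ratio:.0%} of 17 GB")           # roughly 69%
```

So the ~12 GB value is deliberately set below the machine total, leaving headroom for the per-backend and background memory uses described above.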
Upvotes: 1