Reputation: 1175
I'm using a Guava cache in my application and was wondering what the default behavior is if `maximumSize` is not set. I understand the behavior when `maximumSize` is set, as it's explained in https://github.com/google/guava/wiki/CachesExplained#size-based-eviction.
But what happens when `maximumSize` is not set and the JVM runs out of heap space? I assume the garbage collector will run and free up space, which means entries will be dropped from the cache?
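For concreteness, the cache is built roughly like this (the class name and key/value types are just placeholders):

```java
import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

public class CacheSetup {
  // No maximumSize() here, so no size-based eviction is configured;
  // the cache simply grows as entries are added.
  private final Cache<String, byte[]> cache = CacheBuilder.newBuilder().build();

  void store(String key, byte[] value) {
    cache.put(key, value);
  }
}
```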
Upvotes: 2
Views: 4130
Reputation: 48794
Under the covers a `Cache` is just a fancy `Map`, so it has similar space limitations. Like `Map` it can't contain more than `Integer.MAX_VALUE` entries (since that's the return type of `Map.size()`), so your theoretical upper-bound cache size is ~2 billion elements. You might also be interested in Guava's awesome element-cost analysis, which details the exact number of bytes used by different data structures.
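As a point of comparison, the size-based eviction described on the wiki page linked in the question caps that entry count explicitly. A minimal sketch (the limit of 1000 and the key/value types are arbitrary):

```java
import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

public class BoundedCacheSketch {
  public static void main(String[] args) {
    // With maximumSize set, Guava evicts entries (roughly least-recently-used)
    // as the count approaches the limit, so size() stays near 1000.
    Cache<Integer, String> cache = CacheBuilder.newBuilder()
        .maximumSize(1000)
        .build();

    for (int i = 0; i < 10_000; i++) {
      cache.put(i, "value-" + i);
    }
    System.out.println("entries after 10k puts: " + cache.size());
  }
}
```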
Of course, in practice the real concern usually isn't the number of elements in the cache (its size), but the amount of memory consumed by the objects being cached. This is independent of the cache's size - a single cached object could be large enough to consume all your heap.
By default `Cache` doesn't do anything special in this case, and the JVM fails with an `OutOfMemoryError`. Most of the time this is what you want - silently dropping elements from the cache could break your program's assumptions.
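To make that concrete, here's a sketch that fills an unbounded cache until the heap is exhausted (the 1 MB value size is arbitrary; run with a small heap such as -Xmx64m to see it quickly):

```java
import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

public class UnboundedCacheOom {
  public static void main(String[] args) {
    // No maximumSize and no soft/weak references: nothing is ever evicted.
    Cache<Integer, byte[]> cache = CacheBuilder.newBuilder().build();

    int i = 0;
    while (true) {
      // Every value stays strongly reachable through the cache, so the heap
      // fills up and this loop eventually throws OutOfMemoryError.
      cache.put(i++, new byte[1024 * 1024]);
    }
  }
}
```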
If you really do want to drop entries as you approach out-of-memory conditions, you can use soft references via `CacheBuilder.softValues()`. The JVM will attempt to garbage collect soft references when it's at risk of running out of free heap space. I would encourage you to use this option only as a last resort - the JVM has to do extra work to handle soft references, and needing them is often a hint that you should be solving the problem a different way.
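A minimal sketch of that option (the key/value types and the 10 MB payload are just placeholders):

```java
import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

public class SoftValuesCacheSketch {
  public static void main(String[] args) {
    // softValues() wraps cached values in SoftReferences, so the GC may
    // discard them under memory pressure; getIfPresent() then returns null
    // and the lookup is treated as a miss.
    Cache<String, byte[]> cache = CacheBuilder.newBuilder()
        .softValues()
        .build();

    cache.put("blob", new byte[10 * 1024 * 1024]);

    byte[] value = cache.getIfPresent("blob");
    if (value == null) {
      // The value was reclaimed; reload or recompute it as needed.
      System.out.println("entry was collected");
    }
  }
}
```

Note that `softValues()` also causes cached values to be compared by identity (`==`) rather than `equals()`, which is one more reason to reach for it only when you really need it.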
Upvotes: 4