fortm

Reputation: 4198

HashMap for storing big Cache

Consider some Objects that need to be reused a number of times in a class. Individually, each Object can be large, and the number of such Objects can be very large as well. A simple example of such Objects would be records in a database.

Storing such data in a HashMap instead of querying the database again every few lines can help performance. But memory-wise it is highly demanding.

How could a HashMap contain a lot of data without keeping it all in memory at once? Ideally it would load Objects on demand.

Upvotes: 0

Views: 3241

Answers (2)

VGR

Reputation: 44292

Usually, when implementing a cache that has the potential to be large, you want to use SoftReferences. Typically it looks like this:

import java.lang.ref.Reference;
import java.lang.ref.SoftReference;

private final Map<KeyType, Reference<MyLargeObject>> cache =
    new HashMap<>();    // Or LinkedHashMap, as per Quoi's suggestion

public MyLargeObject getCachedValue(KeyType key) {
    Reference<MyLargeObject> ref = cache.get(key);
    return (ref != null ? ref.get() : null);
}

public void addToCache(KeyType key, MyLargeObject value) {
    cache.put(key, new SoftReference<>(value));
}

A SoftReference holds an object but will allow that object to be garbage collected if memory becomes tight. If the object does get garbage collected, SoftReference.get() returns null.
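To illustrate that contract, here is a minimal sketch; the `byte[]` payload is arbitrary, and the manual `clear()` call merely stands in for the garbage collector reclaiming the referent under memory pressure:

```java
import java.lang.ref.SoftReference;

public class SoftRefDemo {
    public static void main(String[] args) {
        // Wrap a large payload in a SoftReference.
        SoftReference<byte[]> ref = new SoftReference<>(new byte[1024 * 1024]);

        // While memory is plentiful, get() returns the referent.
        System.out.println("before clear: " + (ref.get() != null));

        // Simulate the collector reclaiming the referent; afterwards
        // get() returns null and the caller must reload the value
        // (e.g. re-query the database) and re-add it to the cache.
        ref.clear();
        System.out.println("after clear: " + (ref.get() == null));
    }
}
```

A cache built on this pattern must therefore always handle a `null` from `getCachedValue` by recomputing or re-fetching the value.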

Upvotes: 1

Subhrajyoti Majumder

Reputation: 41200

You could use an LRU-based map for the cache, where the cache size is fixed and the least-recently-used Objects are evicted, so only the most-recently-used Objects remain in memory.

It is easy to get such a map in Java: LinkedHashMap.

final int MAX_ENTRIES = 100;
// The third constructor argument (true) selects access order,
// which is what makes this an LRU cache rather than insertion order.
Map<KeyType, MyLargeObject> cache =
        new LinkedHashMap<KeyType, MyLargeObject>(MAX_ENTRIES + 1, 0.75f, true) {
    // Returns true if this map should remove its eldest entry
    @Override
    protected boolean removeEldestEntry(Map.Entry<KeyType, MyLargeObject> eldest) {
        return size() > MAX_ENTRIES;
    }
};

You can also make your Map synchronized:

Map<KeyType, MyLargeObject> m = Collections.synchronizedMap(cache);
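A quick demonstration of the eviction behavior; the three-entry cap and the String keys are just for illustration:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LruDemo {
    public static void main(String[] args) {
        final int MAX_ENTRIES = 3;
        // true = access order, so get() refreshes an entry's position.
        Map<String, String> cache =
                new LinkedHashMap<String, String>(MAX_ENTRIES + 1, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<String, String> eldest) {
                return size() > MAX_ENTRIES;
            }
        };

        cache.put("a", "1");
        cache.put("b", "2");
        cache.put("c", "3");
        cache.get("a");      // touch "a" so it becomes recently used
        cache.put("d", "4"); // evicts "b", the least-recently-used entry

        System.out.println(cache.keySet()); // [c, a, d]
    }
}
```

Note that the eviction happens automatically inside `put`; the caller never removes entries explicitly.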

Upvotes: 3
