Reputation: 1484
What I need is a fairly complex data structure with the following requirements:
1. It should support concurrent reads/writes without excessive locking (like java.util.concurrent.ConcurrentHashMap does).
2. It should have a capacity limit and block once the limit is reached (just like BlockingQueue implementations do).
3. It should have an efficient search mechanism, like Map/HashSet do: given an object's ID, I need to be able to find it without a sequential scan.
4. It should be possible to evict elements on timeout, for instance: if an entry was put into this structure more than X minutes ago, it should be automatically removed.
Of course, there's always a chance to implement it on my own, but I'd prefer to find something existing, optimized and well-tested.
The closest thing I've found is Guava's Cache, but it seems to be missing #2 (blocking once the capacity limit is reached). Any ideas on known implementations of this?
Upvotes: 0
Views: 837
Reputation: 9023
You could write a simple BlockingCache that wraps an existing Guava Cache and checks capacity on put operations, so the put would look something like this:
public V put(K key, V value) throws InterruptedException
{
    // Poll until eviction frees a slot. Note the check-then-act race:
    // under concurrent puts the size can briefly exceed the capacity.
    while (size() >= capacity) Thread.sleep(100);
    innerCache.put(key, value); // Guava's Cache.put returns void
    return value;
}
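If the polling loop is undesirable, the four requirements can also be approximated with JDK primitives only: a ConcurrentHashMap for concurrent keyed access, a Semaphore for the blocking capacity bound, and a ScheduledExecutorService for timed eviction. A minimal sketch, assuming fixed capacity and a fixed TTL (the class and method names are illustrative, not from any library):

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

// Hypothetical sketch: a bounded, concurrent map with timed eviction.
class BoundedExpiringMap<K, V> {
    private final ConcurrentHashMap<K, V> map = new ConcurrentHashMap<>(); // #1, #3
    private final Semaphore capacity;                                      // #2
    private final ScheduledExecutorService reaper =
            Executors.newSingleThreadScheduledExecutor();                  // #4
    private final long ttlMillis;

    BoundedExpiringMap(int maxSize, long ttlMillis) {
        this.capacity = new Semaphore(maxSize);
        this.ttlMillis = ttlMillis;
    }

    // Blocks (without polling) until a slot is free, then inserts
    // and schedules eviction after the TTL.
    void put(K key, V value) {
        capacity.acquireUninterruptibly();
        if (map.put(key, value) != null) {
            capacity.release(); // replaced an existing entry: no net growth
        }
        reaper.schedule(() -> remove(key), ttlMillis, TimeUnit.MILLISECONDS);
    }

    V get(K key) { return map.get(key); } // O(1) lookup by ID

    V remove(K key) {
        V old = map.remove(key);
        if (old != null) capacity.release(); // free the slot
        return old;
    }

    int size() { return map.size(); }

    void shutdown() { reaper.shutdownNow(); }
}
```

One caveat of this sketch: re-putting an existing key does not reset its eviction timer (the earlier scheduled removal still fires), which is a reason to prefer wrapping Guava's expireAfterWrite cache as above.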
Upvotes: 1