Reputation: 1749
Introduction
I've implemented a simple Cache with an LRU policy, backed by an ArrayDeque and using generics:
import java.io.Serializable;
import java.util.ArrayDeque;
import java.util.Iterator;

public class Cache<T> extends ArrayDeque<T> implements Serializable {

    private static final long serialVersionUID = 1L;
    private final int MAX_SIZE;

    public Cache(int maxSize) {
        MAX_SIZE = maxSize;
    }

    // Evicts the least recently used element (the tail) when full,
    // then inserts the new element at the head.
    public void store(T e) {
        if (super.size() >= MAX_SIZE) {
            this.pollLast();
        }
        this.addFirst(e);
    }

    // Looks up an element equal to e; on a hit, moves it to the head
    // (most recently used position) and returns it, otherwise returns null.
    public T fetch(T e) {
        Iterator<T> it = this.iterator();
        while (it.hasNext()) {
            T current = it.next();
            if (current.equals(e)) {
                this.remove(current);
                this.addFirst(current);
                return current;
            }
        }
        return null;
    }
}
Problem
When I instantiate the class and push an element,
Cache<CachedClass> cache = new Cache<CachedClass>(10);
cache.store(new CachedClass());
at this point the queue does not contain anything.
Why is this happening?
Observation
By the way, the CachedClass overrides the .equals() method.
Tests
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertTrue;

import org.junit.Test;

public class CacheTest {

    @Test
    public void testStore() {
        Cache<Integer> cache = new Cache<Integer>(3);
        cache.store(1);
        assertTrue(cache.contains(1));
        cache.store(2);
        cache.store(3);
        cache.store(4);
        assertEquals(cache.size(), 3);
    }

    @Test
    public void testFetch() {
        Cache<Context> cache = new Cache<Context>(2);
        Context c1 = new Context(1);
        Context c2 = new Context(2);
        cache.store(c1);
        cache.store(c2);
        assertEquals((Context) cache.peekFirst(), (new Context(2)));
        Context c = cache.fetch(c1);
        assertTrue(c == c1);
        assertEquals(cache.size(), 2);
        assertEquals((Context) cache.peekFirst(), (new Context(1)));
    }
}
EDIT It now passes both tests successfully. (Originally it passed the first test but failed the second with an AssertionError on assertTrue(cache.peekFirst() == 1).)
Upvotes: 1
Views: 567
Reputation: 9086
The Javadoc for LinkedHashMap says
"This kind of map is well-suited to building LRU caches."
You really need a good reason to ignore this. My guess is that performance will be indistinguishable between the two implementations on puts and much better on gets with the Map, but hey, why don't you run your own benchmark?
Finally, your implementation (as well as the one provided by LinkedHashMap) is not thread-safe. If this is an issue for you, the synchronisation logic will add performance overhead.
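If thread safety does become a requirement, a minimal sketch of the simplest approach is to wrap an access-ordered LinkedHashMap in Collections.synchronizedMap. The synchronizedLruCache helper name below is mine, not from the question; this only gives coarse-grained locking, and iteration would still need external synchronisation:
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

public class SynchronizedLruExample {

    // Sketch only: an access-ordered LinkedHashMap that evicts the eldest
    // entry once maxSize is exceeded, wrapped in a synchronized view.
    static <K, V> Map<K, V> synchronizedLruCache(final int maxSize) {
        Map<K, V> lru = new LinkedHashMap<K, V>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return size() > maxSize;
            }
        };
        return Collections.synchronizedMap(lru);
    }

    public static void main(String[] args) {
        Map<Integer, String> cache = synchronizedLruCache(2);
        cache.put(1, "one");
        cache.put(2, "two");
        cache.put(3, "three");                    // evicts key 1 (eldest)
        System.out.println(cache.containsKey(1)); // prints false
    }
}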
Upvotes: 1
Reputation: 533530
I use a LinkedHashMap as an LRU cache.
public static <K, V> Map<K, V> lruCache(final int maxSize) {
    // accessOrder = true makes iteration order least-recently-accessed first,
    // so the eldest entry is the LRU candidate for eviction.
    return new LinkedHashMap<K, V>(maxSize * 4 / 3, 0.75f, true) {
        @Override
        protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
            return size() > maxSize;
        }
    };
}
An important difference is that a key is associated with a value. If you implement a cache as a set, all you can do is confirm whether an object you already have is in the cache, which is not as useful IMHO.
Tests
@Test
public void testStore() {
    Map<Integer, String> cache = lruCache(3);
    cache.put(1, "one");
    assertEquals("one", cache.get(1));
    cache.put(2, "two");
    cache.put(3, "three");
    cache.put(4, "four");
    assertEquals(cache.size(), 3);
    assertEquals(null, cache.get(1));
}

@Test
public void testFetch() {
    Map<Integer, String> cache = lruCache(3);
    cache.put(1, "one");
    cache.put(2, "two");
    assertEquals((Integer) 1, cache.keySet().iterator().next());
    String s = cache.get(1);
    assertEquals("one", s);
    assertEquals(cache.size(), 2);
    assertEquals((Integer) 2, cache.keySet().iterator().next());
}
Upvotes: 0
Reputation: 37813
I do this in a main method:
Cache<Integer> cache = new Cache<Integer>(10);
cache.store(new Integer(0));
System.out.println(cache.size()); // prints 1
Integer foo = cache.fetch(new Integer(0));
System.out.println(foo == null); // prints false
and it prints 1, so there is one element in your Cache. Works fine.
For fetching, your CachedClass must override equals(). Your code works perfectly as intended, without a change.
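The question does not show CachedClass, but a minimal sketch of the kind of value-based equals() (with a matching hashCode()) that fetch() relies on could look like this; the key field is purely an assumption for illustration:
import java.util.Objects;

// Hypothetical sketch only: the question does not show CachedClass.
// fetch() finds elements via equals(), so value-based equality is required.
public class CachedClass {

    private final String key; // assumed field, for illustration

    public CachedClass(String key) {
        this.key = key;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof CachedClass)) return false;
        return Objects.equals(key, ((CachedClass) o).key);
    }

    @Override
    public int hashCode() {
        return Objects.hash(key);
    }
}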
Upvotes: 0