LRU stands for Least Recently Used.
The idea behind such a cache is that it evicts the least recently used entries once the number of entries put into it exceeds the maximum capacity.
Clearly, some kind of map is a must for any cache to enable fast look-ups.
Along with that, we need another data structure that can identify the least recently used entries and remove them when required.
Whenever an entry is put or accessed, it must move to the end of some list, and if that list grows beyond the given limit,
the entry at its beginning (the least recently used one) should be removed.
A map and a list: both come together in a LinkedHashMap.
To use a LinkedHashMap as an LRU cache, we can do the following:
import java.util.LinkedHashMap;

public class LRU<K, V> extends LinkedHashMap<K, V>
{
    private static final int MAX_CACHE_SIZE = 100; // pick a limit that suits your use case

    @Override
    public V put(K key, V value)
    {
        V existing = super.get(key);
        if (existing != null)
        {
            super.remove(key); // Remove the current key so that the re-insert below moves it in the list
        }
        super.put(key, value); // Re-inserting places the key at the end of the list
        // The least recently used entries are now at the beginning of the list
        // and the most recently used entries are at the end.
        // If the size limit is exceeded, remove the first entry, which is the
        // least recently used one by virtue of the operations above.
        if (super.size() > MAX_CACHE_SIZE)
        {
            K firstKeyInList = super.entrySet().iterator().next().getKey();
            super.remove(firstKeyInList);
        }
        return existing;
    }

    @Override
    public V get(Object key)
    {
        V existing = super.get(key);
        if (existing == null)
        {
            return null;
        }
        // Make sure a get also moves the accessed entry to the end of the list,
        // leaving the least recently used entries at the beginning.
        super.remove(key);
        @SuppressWarnings("unchecked")
        K typedKey = (K) key; // Map.get takes Object, so cast back to K before re-inserting
        super.put(typedKey, existing);
        return existing;
    }
}
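
As a quick sanity check, here is a hypothetical usage sketch. The keys, values, and the capacity are made up for illustration; it assumes MAX_CACHE_SIZE in the class above has been set to 2 for this run.

public class LRUDemo
{
    public static void main(String[] args)
    {
        LRU<String, Integer> cache = new LRU<>();
        cache.put("a", 1);
        cache.put("b", 2);
        cache.get("a");                     // "a" becomes the most recently used entry
        cache.put("c", 3);                  // capacity exceeded, so "b" (least recently used) is evicted
        System.out.println(cache.keySet()); // prints [a, c]
    }
}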
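
It is worth noting that LinkedHashMap also has built-in hooks for exactly this pattern: constructing it with accessOrder set to true makes every get move the accessed entry to the end automatically, and overriding removeEldestEntry takes care of eviction on put. Below is a minimal sketch of that alternative; the class name AccessOrderLRU and the constant are my own, and 16 / 0.75f are just the usual default capacity and load factor.

import java.util.LinkedHashMap;
import java.util.Map;

public class AccessOrderLRU<K, V> extends LinkedHashMap<K, V>
{
    private static final int MAX_CACHE_SIZE = 100; // same hypothetical limit as above

    public AccessOrderLRU()
    {
        // accessOrder = true keeps iteration order by access rather than insertion,
        // so get() re-orders entries without any extra code.
        super(16, 0.75f, true);
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest)
    {
        // Called by put(); returning true evicts the least recently used entry.
        return size() > MAX_CACHE_SIZE;
    }
}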