Monday 20 June 2022

What are the different cache eviction strategies?


A cache is a hardware or software component that stores data so that future requests can be served faster. Since a cache has limited capacity by nature, we need to specify an eviction strategy that removes items from the cache to make room for new ones.

 

What is a cache eviction strategy?

A cache eviction strategy is an algorithm that selects the items to remove from the cache to make room for new ones.

 

The following are the widely used cache eviction policies:

a. Random Replacement

b. First In First Out (FIFO)

c. Last In First Out (LIFO) or First In Last Out (FILO)

d. Least recently used (LRU)

e. Most recently used (MRU)

f. Least frequently used (LFU)

 

To demo these strategies, I am going to use the sample data below.

 


 

Random replacement policy

Replace a randomly selected candidate. When the cache is full and you want to add a new entry, the cache replaces a random existing entry with the new one.
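A minimal sketch of this policy in Java is shown below, assuming a simple fixed-capacity map; the class and method names are illustrative and not taken from any particular library.

import java.util.HashMap;
import java.util.Map;
import java.util.Random;

// Illustrative fixed-capacity cache that evicts a randomly chosen entry when full.
class RandomReplacementCache<K, V> {
    private final int capacity;
    private final Map<K, V> entries = new HashMap<>();
    private final Random random = new Random();

    RandomReplacementCache(int capacity) {
        this.capacity = capacity;
    }

    V get(K key) {
        return entries.get(key);
    }

    void put(K key, V value) {
        if (!entries.containsKey(key) && entries.size() >= capacity) {
            // Pick any existing key at random and evict it to make room.
            int index = random.nextInt(entries.size());
            K victim = entries.keySet().stream().skip(index).findFirst().orElseThrow();
            entries.remove(victim);
        }
        entries.put(key, value);
    }
}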

 

First In First Out (FIFO) policy

Delete the entries in the order they were added to the cache. It works like a FIFO queue.

 

In the above example, when you try to add a new entry to a cache that is already full, this eviction policy removes the entry with the key k1.
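A minimal FIFO cache sketch in Java is shown below. It relies on LinkedHashMap, which keeps insertion order by default, and overrides removeEldestEntry so that the first-inserted entry is evicted once the cache grows past an assumed capacity parameter.

import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative FIFO cache: LinkedHashMap keeps insertion order by default,
// so the eldest entry is the one that was added first.
class FifoCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    FifoCache(int capacity) {
        super(16, 0.75f, false); // accessOrder = false -> insertion order
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity; // evict the first-inserted entry once over capacity
    }
}

For example, with a capacity of 3, putting k1, k2, k3 and then k4 leaves k2, k3 and k4 in the cache, because k1 was inserted first.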

 

 


Last In First Out (LIFO) or First In Last Out (FILO) policy

This policy acts like a stack. When you try to add a new entry to a cache that is already full, this eviction policy removes the entry that was most recently added to the cache. In the above example, the entry with the key k9 is deleted.
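A minimal LIFO/FILO sketch in Java is shown below, assuming insertion order is tracked with a stack-like Deque; the names are illustrative.

import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.Map;

// Illustrative LIFO cache: tracks insertion order with a stack (Deque)
// and evicts the most recently inserted key when the cache is full.
class LifoCache<K, V> {
    private final int capacity;
    private final Map<K, V> entries = new HashMap<>();
    private final Deque<K> insertionStack = new ArrayDeque<>();

    LifoCache(int capacity) {
        this.capacity = capacity;
    }

    V get(K key) {
        return entries.get(key);
    }

    void put(K key, V value) {
        if (!entries.containsKey(key) && entries.size() >= capacity) {
            // The key on top of the stack was added last: evict it.
            K victim = insertionStack.pop();
            entries.remove(victim);
        }
        if (!entries.containsKey(key)) {
            insertionStack.push(key); // record insertion order for new keys only
        }
        entries.put(key, value);
    }
}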




Least recently used (LRU) policy

This policy deletes the least recently used items first. The algorithm maintains a property that captures the last access time of each entry in the cache.

 

In the above example, when you try to add a new entry to a cache that is already full, this eviction policy removes the entry with the key k4.
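A minimal LRU sketch in Java is shown below. LinkedHashMap supports access order directly: with accessOrder = true, every get or put moves the entry to the end, so the eldest entry is the least recently used one.

import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative LRU cache: accessOrder = true makes LinkedHashMap reorder entries
// on every access, so the eldest entry is the least recently used.
class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    LruCache(int capacity) {
        super(16, 0.75f, true); // accessOrder = true -> access order
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity; // evict the least recently used entry once over capacity
    }
}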

 

 


Most recently used (MRU) policy

This policy deletes the most recently used items first. Similar to the LRU policy, the algorithm maintains a property that captures the last access time of each entry in the cache.

 

In the above example, when you try to add a new entry to a cache that is already full, this eviction policy removes the entry with the key k1 (the most recently accessed entry).
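A minimal MRU sketch in Java is shown below, assuming the cache only remembers the single most recently touched key and evicts it when a new entry has to be admitted; the names are illustrative.

import java.util.HashMap;
import java.util.Map;

// Illustrative MRU cache: remembers the most recently accessed key
// and evicts it when a new entry must be added to a full cache.
class MruCache<K, V> {
    private final int capacity;
    private final Map<K, V> entries = new HashMap<>();
    private K mostRecentlyUsed;

    MruCache(int capacity) {
        this.capacity = capacity;
    }

    V get(K key) {
        V value = entries.get(key);
        if (value != null) {
            mostRecentlyUsed = key; // reading an entry makes it the eviction candidate
        }
        return value;
    }

    void put(K key, V value) {
        if (!entries.containsKey(key) && entries.size() >= capacity && mostRecentlyUsed != null) {
            entries.remove(mostRecentlyUsed); // evict the entry that was touched last
        }
        entries.put(key, value);
        mostRecentlyUsed = key; // the new entry is now the most recently used
    }
}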




Least frequently used (LFU) policy

This policy deletes the least frequently used items first. The algorithm maintains an access count for every entry in the cache and performs eviction based on that count.

 

In the above example, when you try to add a new entry to a cache that is already full, this eviction policy removes the entry with the key k6 (the least frequently accessed entry).
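A minimal LFU sketch in Java is shown below, assuming a per-key access counter and a linear scan to find the least frequently used key; production implementations typically use frequency lists or heaps instead, and the names here are illustrative.

import java.util.HashMap;
import java.util.Map;

// Illustrative LFU cache: keeps an access count per key and evicts the key
// with the smallest count when the cache is full.
class LfuCache<K, V> {
    private final int capacity;
    private final Map<K, V> entries = new HashMap<>();
    private final Map<K, Integer> accessCounts = new HashMap<>();

    LfuCache(int capacity) {
        this.capacity = capacity;
    }

    V get(K key) {
        V value = entries.get(key);
        if (value != null) {
            accessCounts.merge(key, 1, Integer::sum); // count this access
        }
        return value;
    }

    void put(K key, V value) {
        if (!entries.containsKey(key) && entries.size() >= capacity) {
            // Linear scan for the least frequently used key; O(n) but easy to follow.
            K victim = null;
            int minCount = Integer.MAX_VALUE;
            for (Map.Entry<K, Integer> e : accessCounts.entrySet()) {
                if (e.getValue() < minCount) {
                    minCount = e.getValue();
                    victim = e.getKey();
                }
            }
            entries.remove(victim);
            accessCounts.remove(victim);
        }
        entries.put(key, value);
        accessCounts.merge(key, 1, Integer::sum);
    }
}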

 

 


There are many other policies like Least frequent recently used (LFRU), LFU with dynamic aging, etc. You can find more about these policies in this wiki article.

 


