LRU Cache: Least Recently Used Eviction Policy
Concept Overview
A Least Recently Used (LRU) cache is a data structure that manages limited memory by keeping recently accessed items and discarding those that have gone unused the longest. When the cache is at maximum capacity and a new item must be added, the LRU policy evicts the item whose last access lies furthest in the past. This makes it highly effective in scenarios where recently accessed data is likely to be accessed again soon (temporal locality).
Mathematical Definition
Let a cache C have a fixed capacity N. The state of the cache at time t can be represented as an ordered sequence of elements:

C_t = (x_1, x_2, ..., x_k), with k ≤ N

where x_1 is the most recently used element and x_k the least recently used. Accessing an element moves it to the front of the sequence; inserting a new element when k = N first evicts x_N, the element at the back.
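The ordered-sequence definition can be traced with a minimal sketch. This is not a full O(1) implementation, just an illustration of the eviction order using Python's `collections.OrderedDict`, where the right end of the dict plays the role of the most-recently-used position:

```python
from collections import OrderedDict

# Minimal LRU trace: capacity N = 3; the right end of the
# OrderedDict is the most recently used (MRU) position.
N = 3
cache = OrderedDict()

def access(key):
    """Touch `key`: promote it to MRU, evicting the LRU item if full."""
    if key in cache:
        cache.move_to_end(key)          # hit: move to the MRU end
    else:
        if len(cache) >= N:
            cache.popitem(last=False)   # evict the LRU item (left end)
        cache[key] = True               # insert new key at the MRU end

for k in "A", "B", "C", "A", "D":       # accessing D evicts B
    access(k)

print(list(cache))  # LRU -> MRU order: ['C', 'A', 'D']
```

After the sequence A, B, C, A, D, the re-access of A promotes it, so B becomes the least recently used element and is evicted when D arrives.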
Key Concepts
Temporal Locality
The LRU algorithm is built on the principle of temporal locality, which states that if a memory location is accessed, it is highly likely to be accessed again soon. By keeping recently used items, the cache maximizes its hit rate.
Efficient Implementation
To achieve O(1) time complexity for both access (get) and update (put) operations, an LRU cache is typically implemented using a combination of two data structures:
- Hash Map: Stores keys and references to the corresponding nodes, providing O(1) access time.
- Doubly Linked List: Maintains the order of elements from Most Recently Used (head) to Least Recently Used (tail), allowing O(1) removal and insertion of nodes.
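The two structures above can be combined as follows. This is a sketch of the classic design, not any particular library's implementation; the class and method names (`LRUCache`, `get`, `put`) are illustrative. Sentinel head and tail nodes keep the link surgery branch-free:

```python
class Node:
    """Doubly linked list node holding one cache entry."""
    __slots__ = ("key", "value", "prev", "next")
    def __init__(self, key=None, value=None):
        self.key, self.value = key, value
        self.prev = self.next = None

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.map = {}                    # key -> Node: O(1) lookup
        self.head = Node()               # sentinel: head.next is the MRU node
        self.tail = Node()               # sentinel: tail.prev is the LRU node
        self.head.next, self.tail.prev = self.tail, self.head

    def _unlink(self, node):
        node.prev.next, node.next.prev = node.next, node.prev

    def _push_front(self, node):
        node.prev, node.next = self.head, self.head.next
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        if key not in self.map:
            return -1                    # cache miss
        node = self.map[key]
        self._unlink(node)               # promote to MRU in O(1)
        self._push_front(node)
        return node.value

    def put(self, key, value):
        if key in self.map:
            self._unlink(self.map.pop(key))
        elif len(self.map) >= self.capacity:
            lru = self.tail.prev         # evict the least recently used node
            self._unlink(lru)
            del self.map[lru.key]
        node = Node(key, value)
        self._push_front(node)
        self.map[key] = node
```

Storing the key inside each node is what makes eviction O(1): when the tail node is unlinked, its key is immediately available for removal from the hash map, with no reverse lookup needed.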
Cache Hit vs. Cache Miss
A cache hit occurs when the requested data is found in the cache, resulting in fast access. A cache miss happens when the data is not in the cache, requiring a slower fetch from the main memory or persistent storage, after which the data is inserted into the cache.
Historical Context
The concept of caching dates back to the early days of computing, introduced by Maurice Wilkes in 1965. As processor speeds rapidly outpaced memory access speeds, the need for intermediate, faster storage layers became critical. LRU emerged as one of the most effective and widely adopted replacement policies, balancing implementation complexity with high practical performance for virtual memory paging and hardware caches.
Real-world Applications
- Operating Systems: Page replacement algorithms in virtual memory management to decide which pages to swap out to disk.
- Web Browsers: Storing recently visited web pages, images, and resources to speed up backward navigation and page reloads.
- Database Systems: Buffer pools in relational databases (like PostgreSQL and MySQL) use variations of LRU to keep heavily queried data in RAM.
- Distributed Caches: In-memory data stores like Redis and Memcached use LRU policies to manage limited memory capacity for high-traffic web applications.
Related Concepts
- Linked List Operations — the underlying doubly linked list mechanism used in O(1) LRU implementations.
- Stack & Queue — understanding FIFO (First-In, First-Out), a simpler alternative cache policy.
Experience it interactively
Adjust parameters, observe in real time, and build deep intuition with Riano’s interactive LRU Cache module.
Try LRU Cache on Riano →