Design LRU Cache

Implement a Least Recently Used cache

Overview

Design and implement a Least Recently Used (LRU) cache. The cache should support get and put operations in O(1) time complexity.

When the cache reaches its capacity, it should invalidate the least recently used item before inserting a new item.

This is a very common interview question that tests understanding of data structures and algorithms.

Key Concepts

HashMap for O(1) lookup

Doubly Linked List for O(1) removal and insertion

Combination of both data structures

Head = most recently used, Tail = least recently used

Update order on every access

Code Example

java
import java.util.HashMap;
import java.util.Map;

class LRUCache {
    class Node {
        int key;
        int value;
        Node prev;
        Node next;

        Node(int key, int value) {
            this.key = key;
            this.value = value;
        }
    }

    private Map<Integer, Node> cache;
    private int capacity;
    private Node head; // Most recently used
    private Node tail; // Least recently used

    public LRUCache(int capacity) {
        this.capacity = capacity;
        this.cache = new HashMap<>();

        // Dummy head and tail
        head = new Node(0, 0);
        tail = new Node(0, 0);
        head.next = tail;
        tail.prev = head;
    }

    public int get(int key) {
        if (!cache.containsKey(key)) {
            return -1;
        }

        Node node = cache.get(key);
        // Move to front (most recently used)
        remove(node);
        insertAtFront(node);

        return node.value;
    }

    public void put(int key, int value) {
        if (cache.containsKey(key)) {
            // Update existing
            Node node = cache.get(key);
            node.value = value;
            remove(node);
            insertAtFront(node);
        } else {
            // Insert new
            if (cache.size() >= capacity) {
                // Remove LRU (tail)
                Node lru = tail.prev;
                remove(lru);
                cache.remove(lru.key);
            }

            Node newNode = new Node(key, value);
            cache.put(key, newNode);
            insertAtFront(newNode);
        }
    }

    private void remove(Node node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    private void insertAtFront(Node node) {
        node.next = head.next;
        node.prev = head;
        head.next.prev = node;
        head.next = node;
    }
}

// Usage
public class Main {
    public static void main(String[] args) {
        LRUCache cache = new LRUCache(2);

        cache.put(1, 1);
        cache.put(2, 2);
        System.out.println(cache.get(1)); // returns 1
        cache.put(3, 3); // evicts key 2
        System.out.println(cache.get(2)); // returns -1 (not found)
        cache.put(4, 4); // evicts key 1
        System.out.println(cache.get(1)); // returns -1 (not found)
        System.out.println(cache.get(3)); // returns 3
        System.out.println(cache.get(4)); // returns 4
    }
}

Uses a HashMap for O(1) lookup and a doubly linked list for O(1) insertion and deletion. The head is the MRU end; the tail is the LRU end.

When to Use

  • Browser cache

  • Database query cache

  • CDN cache

  • Operating system page cache

  • Any system needing fast access to recently used data

Advantages

  • O(1) time complexity for get and put

  • Automatically evicts the least recently used item

  • Fixed memory usage (bounded by capacity)

  • Simple to understand and implement

Disadvantages

  • Uses extra space for linked list pointers

  • Not suitable for all caching scenarios (LFU might be better)

  • Requires careful implementation to avoid bugs

Real-World Examples

  • Browser caching recent web pages
  • Mobile apps caching recent images
  • Database caching frequently accessed queries
  • Operating systems caching recently used files

Best Practices

  1. Use dummy head and tail nodes to simplify edge cases

  2. Always update order on get operations

  3. Test edge cases: capacity 1, getting non-existent keys, updating existing keys

  4. Consider thread-safety for production use (synchronized or ConcurrentHashMap)

  5. Document time and space complexity
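The thread-safety point above can be sketched with a coarse-grained lock: synchronize get and put on the cache instance so concurrent callers cannot corrupt the internal order. This is a minimal sketch, not a production design; the class name SynchronizedLruCache is illustrative, and it delegates to LinkedHashMap in access-order mode rather than the hand-rolled list above.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative name; a single intrinsic lock guards every operation.
class SynchronizedLruCache {
    private final Map<Integer, Integer> map;

    SynchronizedLruCache(int capacity) {
        // accessOrder = true keeps entries ordered by most recent access
        map = new LinkedHashMap<Integer, Integer>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
                return size() > capacity; // evict LRU entry once over capacity
            }
        };
    }

    public synchronized int get(int key) {
        return map.getOrDefault(key, -1);
    }

    public synchronized void put(int key, int value) {
        map.put(key, value);
    }
}
```

A single lock serializes all access, which is simple and correct but can become a bottleneck under heavy contention; striped locks or a library cache are common next steps.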

💡 Interview Tips

  • Explain why both HashMap and LinkedList are needed

  • Draw diagram showing head (MRU) and tail (LRU)

  • Discuss time complexity: O(1) for both operations

  • Space complexity: O(capacity)

  • Mention Java's LinkedHashMap as built-in solution

  • Be ready to extend: TTL, thread-safety, LFU variant

  • Test with examples: write down step-by-step state changes
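The LinkedHashMap tip above refers to its access-order mode: constructed with accessOrder = true, it reorders entries on every get, and overriding removeEldestEntry turns it into an LRU cache in a few lines. A minimal sketch (the class name LinkedHashLru is illustrative):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative name; extends LinkedHashMap in access-order mode.
class LinkedHashLru<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    LinkedHashLru(int capacity) {
        super(16, 0.75f, true); // accessOrder = true: iteration order = access order
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Called after each put; returning true evicts the LRU entry
        return size() > capacity;
    }
}
```

This is handy to mention in an interview as the built-in answer, but interviewers usually still expect the HashMap + doubly linked list version shown above.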
