146. LRU Cache

Medium

Design a data structure that follows the constraints of a Least Recently Used (LRU) cache.

Implement the LRUCache class:

  • LRUCache(int capacity) Initialize the LRU cache with positive size capacity.

  • int get(int key) Return the value of the key if the key exists, otherwise return -1.

  • void put(int key, int value) Update the value of the key if the key exists. Otherwise, add the key-value pair to the cache. If the number of keys exceeds the capacity from this operation, evict the least recently used key.

The functions get and put must each run in O(1) average time complexity.

Example 1

Input: ["LRUCache", "put", "put", "get", "put", "get", "put", "get", "get", "get"] [[2], [1, 1], [2, 2], [1], [3, 3], [2], [4, 4], [1], [3], [4]] Output: [null, null, null, 1, null, -1, null, -1, 3, 4] Explanation: LRUCache lRUCache = new LRUCache(2); lRUCache.put(1, 1); // cache is {1=1} lRUCache.put(2, 2); // cache is {1=1, 2=2} lRUCache.get(1); // return 1 lRUCache.put(3, 3); // LRU key was 2, evicts key 2, cache is {1=1, 3=3} lRUCache.get(2); // returns -1 (not found) lRUCache.put(4, 4); // LRU key was 1, evicts key 1, cache is {4=4, 3=3} lRUCache.get(1); // return -1 (not found) lRUCache.get(3); // return 3 lRUCache.get(4); // return 4

Constraints

  • 1 <= capacity <= 3000

  • 0 <= key <= 10^4

  • 0 <= value <= 10^5

  • At most 2 * 10^5 calls will be made to get and put.

Solution

My Solution

Optimal Solution

The optimal solution for an LRU Cache uses a combination of:

  1. HashMap for O(1) key-value lookups

  2. Doubly Linked List for O(1) insertion/deletion

  3. Pointer manipulation for updating access order

Approach Analysis

The LRU Cache implementation uses two main data structures (a runnable sketch follows this breakdown):

  1. HashMap (map[int]*Node):

    • Provides O(1) access to cache entries

    • Maps keys to doubly-linked list nodes

    • Enables quick lookups for get/put operations

  2. Doubly Linked List:

    • Maintains access order of elements

    • Most recently used items at front

    • Least recently used items at back

    • Allows O(1) removal/insertion of nodes
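
A runnable sketch of this design in Go (the map[int]*Node above suggests Go; apart from moveToFront and removeLRU, which this writeup names below, the identifiers are illustrative choices rather than a fixed API):

```go
package main

import "fmt"

// Node is a doubly-linked-list entry. The key is stored in the node so
// the map entry can be deleted when the node is evicted from the tail.
type Node struct {
    key, value int
    prev, next *Node
}

type LRUCache struct {
    capacity int
    cache    map[int]*Node
    head     *Node // sentinel: head.next is the most recently used node
    tail     *Node // sentinel: tail.prev is the least recently used node
}

func Constructor(capacity int) LRUCache {
    head, tail := &Node{}, &Node{}
    head.next, tail.prev = tail, head
    return LRUCache{capacity: capacity, cache: make(map[int]*Node, capacity), head: head, tail: tail}
}

// remove unlinks a node; the sentinels guarantee prev/next are non-nil.
func (c *LRUCache) remove(n *Node) {
    n.prev.next = n.next
    n.next.prev = n.prev
}

// addToFront links a node directly after the head sentinel.
func (c *LRUCache) addToFront(n *Node) {
    n.prev, n.next = c.head, c.head.next
    c.head.next.prev = n
    c.head.next = n
}

// moveToFront marks an existing node as most recently used.
func (c *LRUCache) moveToFront(n *Node) {
    c.remove(n)
    c.addToFront(n)
}

// removeLRU evicts the node just before the tail sentinel.
func (c *LRUCache) removeLRU() {
    lru := c.tail.prev
    c.remove(lru)
    delete(c.cache, lru.key)
}

func (c *LRUCache) Get(key int) int {
    n, ok := c.cache[key]
    if !ok {
        return -1
    }
    c.moveToFront(n)
    return n.value
}

func (c *LRUCache) Put(key, value int) {
    if n, ok := c.cache[key]; ok {
        n.value = value
        c.moveToFront(n)
        return
    }
    if len(c.cache) == c.capacity {
        c.removeLRU()
    }
    n := &Node{key: key, value: value}
    c.cache[key] = n
    c.addToFront(n)
}

func main() {
    // Replays Example 1 from the problem statement.
    c := Constructor(2)
    c.Put(1, 1)
    c.Put(2, 2)
    fmt.Println(c.Get(1)) // 1
    c.Put(3, 3)           // evicts key 2
    fmt.Println(c.Get(2)) // -1
    c.Put(4, 4)           // evicts key 1
    fmt.Println(c.Get(1)) // -1
    fmt.Println(c.Get(3)) // 3
    fmt.Println(c.Get(4)) // 4
}
```

The main function replays Example 1; running it prints 1, -1, -1, 3, 4, matching the expected output.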

Complexity Analysis

Time Complexity:

  • Constructor: O(1)

  • Get: O(1)

  • Put: O(1)

  • All internal operations (moveToFront, removeLRU): O(1)

Space Complexity:

  • O(capacity) for both HashMap and Linked List

  • Each entry requires:

    • HashMap entry: O(1)

    • Linked List node: O(1)

    • Total per entry: O(1)

Why Solution Works

  1. HashMap + Linked List Synergy:

    • HashMap provides instant access to nodes

    • Linked List maintains order without requiring shifts

    • Each covers the other's weakness: the map has no ordering, the list has no O(1) key lookup

  2. Sentinel Nodes (Head/Tail):

    • Eliminate edge cases in node manipulation (contrasted in the sketch after this list)

    • Simplify insertion/deletion logic

    • No need to check for null pointers

  3. Encapsulated Operations:

    • moveToFront handles the "recently used" aspect

    • removeLRU maintains capacity constraint

    • Clean separation of concerns
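
To make the sentinel argument concrete, here is a small self-contained Go sketch contrasting unlinking with and without sentinels (bareList is a hypothetical no-sentinel variant, not part of the solution above):

```go
package main

import "fmt"

// Same node shape as the solution sketch above.
type Node struct {
    key, value int
    prev, next *Node
}

// Hypothetical no-sentinel layout: front/back point at real nodes (or nil).
type bareList struct {
    front, back *Node
}

// Without sentinels, every unlink must special-case both ends of the list:
func (l *bareList) remove(n *Node) {
    if n.prev != nil {
        n.prev.next = n.next
    } else {
        l.front = n.next // n was the first node
    }
    if n.next != nil {
        n.next.prev = n.prev
    } else {
        l.back = n.prev // n was the last node
    }
}

// With head/tail sentinels, prev and next are always non-nil, so the
// same unlink is two unconditional pointer writes:
func removeWithSentinels(n *Node) {
    n.prev.next = n.next
    n.next.prev = n.prev
}

func main() {
    // Sentinel demo: head <-> n <-> tail, then unlink n.
    head, tail := &Node{}, &Node{}
    n := &Node{key: 1, value: 1, prev: head, next: tail}
    head.next, tail.prev = n, n
    removeWithSentinels(n)
    fmt.Println(head.next == tail) // true: list is empty again
}
```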

When to Use

Use an LRU Cache when:

  1. You need O(1) access time for cache operations

  2. You want to evict the least recently used items once capacity is reached

  3. Memory usage must stay within a fixed capacity

  4. Recent access patterns are good predictors of future accesses

Common applications:

  • Database query caching

  • Web browser page caching

  • File system caching

  • Memory management in operating systems

Common Patterns & Applications

  1. Caching Pattern:

    • Fast access to recent items

    • Automatic eviction of stale items

    • Fixed memory footprint

  2. Doubly Linked List Pattern:

    • O(1) removal from any position

    • O(1) insertion at any position

    • Bidirectional traversal capability

  3. HashMap + Linked List Pattern:

    • Common in cache implementations

    • Combines fast lookup with ordered access

    • Used in many system designs
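
As a side note on the HashMap + Linked List pattern: Go's standard library already ships the linked-list half as container/list, so the same cache can be sketched without a hand-rolled node type (ListLRU and NewListLRU are illustrative names, not a standard API):

```go
package main

import (
    "container/list"
    "fmt"
)

// pair is the payload stored in each list element; the key is kept so
// the map entry can be removed on eviction.
type pair struct{ key, value int }

type ListLRU struct {
    capacity int
    order    *list.List            // front = most recently used
    items    map[int]*list.Element // key -> element in order
}

func NewListLRU(capacity int) *ListLRU {
    return &ListLRU{capacity: capacity, order: list.New(), items: make(map[int]*list.Element, capacity)}
}

func (c *ListLRU) Get(key int) int {
    el, ok := c.items[key]
    if !ok {
        return -1
    }
    c.order.MoveToFront(el)
    return el.Value.(*pair).value
}

func (c *ListLRU) Put(key, value int) {
    if el, ok := c.items[key]; ok {
        el.Value.(*pair).value = value
        c.order.MoveToFront(el)
        return
    }
    if c.order.Len() == c.capacity {
        back := c.order.Back() // least recently used
        delete(c.items, back.Value.(*pair).key)
        c.order.Remove(back)
    }
    c.items[key] = c.order.PushFront(&pair{key, value})
}

func main() {
    c := NewListLRU(2)
    c.Put(1, 1)
    c.Put(2, 2)
    c.Put(3, 3)           // evicts key 1
    fmt.Println(c.Get(1)) // -1
    fmt.Println(c.Get(3)) // 3
}
```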

Interview Tips

  1. Implementation Strategy:

    • Start with data structure definitions

    • Implement basic operations first

    • Add edge case handling

    • Consider thread safety if asked

  2. Key Points to Mention:

    • Why you chose the data structures

    • Time/space complexity analysis

    • How you handle edge cases

    • Possible optimizations

  3. Common Follow-up Questions:

    • How to make it thread-safe? (see the sketch after this list)

    • How to implement LFU Cache instead?

    • How to handle variable entry sizes?

    • How to implement distributed LRU Cache?

  4. Gotchas to Watch For:

    • Memory leaks in evicted entries

    • Proper linking/unlinking of nodes

    • HashMap and List consistency

    • Capacity edge cases
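
For the thread-safety follow-up, a minimal sketch, assuming it sits in the same file as the LRUCache from the solution section above (SafeLRUCache is an illustrative name): serialize every operation through one mutex.

```go
// In the same file as the LRUCache sketch above; add "sync" to the imports.

// SafeLRUCache guards all access with a single mutex — the simplest
// correct answer to the thread-safety follow-up.
type SafeLRUCache struct {
    mu    sync.Mutex
    inner LRUCache
}

func NewSafeLRUCache(capacity int) *SafeLRUCache {
    return &SafeLRUCache{inner: Constructor(capacity)}
}

func (c *SafeLRUCache) Get(key int) int {
    c.mu.Lock()
    defer c.mu.Unlock()
    return c.inner.Get(key)
}

func (c *SafeLRUCache) Put(key, value int) {
    c.mu.Lock()
    defer c.mu.Unlock()
    c.inner.Put(key, value)
}
```

Note that sync.RWMutex would not help here: even Get mutates shared state (it calls moveToFront), so reads cannot safely share a lock. Sharding the key space across several independently locked caches is the usual next step for higher concurrency.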

Leetcode: https://leetcode.com/problems/lru-cache/
