Implement LRU Cache

💡 Strategies for Solving This Problem

Understanding LRU Cache

A Least Recently Used (LRU) cache is a data structure that removes the least recently accessed item when the cache reaches its capacity. It's fundamental in operating systems (page replacement), databases, and web browsers.

Core Requirements

  • get(key): Return the value if the key exists, -1 otherwise, and mark the key as most recently used. Must run in O(1).
  • put(key, value): Insert or update the value and mark the key as most recently used. If the cache is at capacity, evict the least recently used item first. Must run in O(1).
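
Before building anything by hand, a minimal sketch of that contract using Python's collections.OrderedDict can make the expected behavior concrete (the class name OrderedDictLRU and the sample values are just for illustration):

  from collections import OrderedDict

  class OrderedDictLRU:
      """Illustrates the get/put contract; the from-scratch design follows."""

      def __init__(self, capacity: int):
          self.capacity = capacity
          self.data = OrderedDict()

      def get(self, key: int) -> int:
          if key not in self.data:
              return -1
          self.data.move_to_end(key)        # mark as most recently used
          return self.data[key]

      def put(self, key: int, value: int) -> None:
          if key in self.data:
              self.data.move_to_end(key)
          self.data[key] = value
          if len(self.data) > self.capacity:
              self.data.popitem(last=False)  # evict the least recently used

  cache = OrderedDictLRU(2)
  cache.put(1, 1)
  cache.put(2, 2)
  print(cache.get(1))   # 1
  cache.put(3, 3)       # capacity exceeded, evicts key 2
  print(cache.get(2))   # -1

OrderedDict is often accepted as a shortcut answer, but interviewers usually want the hash map + doubly linked list construction described below, which is essentially what OrderedDict maintains under the hood.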

Key Challenge

How to achieve O(1) for both operations?

Naive Approaches (too slow):

  • Array + timestamp: get O(1), but put is O(n) when evicting (must scan all entries to find the LRU item)
  • Sorted list (kept in recency order): get O(n), put O(n)
  • Hash map only: O(1) lookup, but no way to track recency
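
To see why the first approach misses the O(1) target, here is a rough sketch of a timestamp-based cache (the name NaiveLRU is made up for illustration). Lookups are fine, but eviction has to scan every entry:

  import itertools

  class NaiveLRU:
      """Dict of key -> (value, timestamp); eviction scans all entries."""

      def __init__(self, capacity: int):
          self.capacity = capacity
          self.data = {}                 # key -> (value, timestamp)
          self.clock = itertools.count()

      def get(self, key: int) -> int:
          if key not in self.data:
              return -1
          value, _ = self.data[key]
          self.data[key] = (value, next(self.clock))   # refresh the timestamp
          return value

      def put(self, key: int, value: int) -> None:
          if key not in self.data and len(self.data) >= self.capacity:
              # O(n): scan every entry to find the oldest timestamp
              lru_key = min(self.data, key=lambda k: self.data[k][1])
              del self.data[lru_key]
          self.data[key] = (value, next(self.clock))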

Optimal Solution: Hash Map + Doubly Linked List

Why Doubly Linked List?

  • Can remove a node in O(1) given a direct reference to it
  • Can add to the front or back in O(1)
  • Head = most recently used, Tail = least recently used

Why Hash Map?

  • O(1) lookup by key
  • Maps key → node reference in linked list

Data Structure Design

Hash Map: key → Node

Doubly Linked List:
  Head (most recent) → [node] ↔ [node] ↔ ... ↔ [node] ← Tail (least recent)

  Each node: {key, value, prev, next}. The node stores its key so that, when the tail node is evicted, the matching hash-map entry can be deleted as well.
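
A sketch of that design in Python (names such as Node and cache_map are illustrative). Sentinel head and tail nodes are a common trick that keeps insertions and removals free of empty-list edge cases:

  class Node:
      """Doubly linked list node. Storing the key lets eviction also
      delete the matching hash-map entry."""
      def __init__(self, key: int = 0, value: int = 0):
          self.key = key
          self.value = value
          self.prev = None
          self.next = None

  # Sentinel head/tail: head.next is the most recent, tail.prev the least recent.
  head, tail = Node(), Node()
  head.next, tail.prev = tail, head

  cache_map = {}   # key -> Node reference inside the list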

Operations

get(key):

  1. Look up the key in the hash map (O(1)); if it's missing, return -1
  2. Otherwise, move the node to the front of the list (most recently used)
  3. Return the node's value

put(key, value):

  1. If the key exists: update its value and move the node to the front
  2. If the key is new:
    • If at capacity: remove the tail node (the LRU item) and delete its key from the hash map
    • Create a new node
    • Add it to the front of the list
    • Add it to the hash map
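
Putting the pieces together, here is one possible end-to-end sketch that follows these steps (LeetCode-style interface; the helper names _remove and _add_to_front are illustrative):

  class Node:
      def __init__(self, key: int = 0, value: int = 0):
          self.key = key
          self.value = value
          self.prev = None
          self.next = None

  class LRUCache:
      def __init__(self, capacity: int):
          self.capacity = capacity
          self.map = {}                    # key -> Node
          self.head = Node()               # sentinel: head.next is most recent
          self.tail = Node()               # sentinel: tail.prev is least recent
          self.head.next = self.tail
          self.tail.prev = self.head

      def _remove(self, node: Node) -> None:
          # Unlink the node from the list in O(1)
          node.prev.next = node.next
          node.next.prev = node.prev

      def _add_to_front(self, node: Node) -> None:
          # Insert the node right after the head sentinel in O(1)
          node.next = self.head.next
          node.prev = self.head
          self.head.next.prev = node
          self.head.next = node

      def get(self, key: int) -> int:
          if key not in self.map:
              return -1
          node = self.map[key]
          self._remove(node)               # moving to the front marks it
          self._add_to_front(node)         # as most recently used
          return node.value

      def put(self, key: int, value: int) -> None:
          if key in self.map:              # existing key: update and refresh
              node = self.map[key]
              node.value = value
              self._remove(node)
              self._add_to_front(node)
              return
          if len(self.map) >= self.capacity:
              lru = self.tail.prev         # least recently used node
              self._remove(lru)
              del self.map[lru.key]        # evict from both structures
          node = Node(key, value)
          self.map[key] = node
          self._add_to_front(node)

Every call touches a constant number of pointers plus one hash-map operation, which is where the O(1) bound for both get and put comes from.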

Interview Variations

  • LFU Cache: Evict least frequently used instead
  • TTL Cache: Items expire after time limit
  • 2Q Cache: Two-queue system for better hit rate
  • Write-through vs Write-back: When to persist to storage

Asked at: Amazon, Google, Facebook, Microsoft, Uber, Stripe
