LRUCache

krishanthan4 edited this page Dec 16, 2025 · 1 revision

LRU Cache

Problem Description

Design and implement a Least Recently Used (LRU) cache that supports get and put operations in O(1) time complexity.

What is an LRU Cache?

An LRU cache is a cache that uses a Least Recently Used eviction policy: when the cache reaches its capacity, the item that has gone unused the longest is removed. Items are ordered by usage - the most recently used items are kept, while the least recently used item is evicted first.

Characteristics

  • Time Complexity: O(1) for both get and put operations
  • Space Complexity: O(capacity)
  • Key Operations: get, put
  • Eviction Policy: Remove least recently used item when at capacity

How It Works

  1. When an item is accessed (get or put), it becomes the most recently used
  2. When the cache is full and a new item is added:
    • Remove the least recently used item
    • Add the new item as most recently used
  3. Maintain order of usage for efficient eviction
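A short trace makes the ordering concrete. This is an illustrative example (capacity 2, values A/B/C chosen arbitrarily), with the most recently used key listed first:

```
capacity = 2

put(1, A)    →  order: [1]
put(2, B)    →  order: [2, 1]
get(1) = A   →  order: [1, 2]   (access refreshes key 1)
put(3, C)    →  order: [3, 1]   (cache full: key 2 was LRU, evicted)
get(2) = -1                     (key 2 is no longer present)
```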

Core Operations

  1. get(key):
    • Return value if key exists (and mark as recently used)
    • Return -1 if key doesn't exist
  2. put(key, value):
    • If key exists: update value and mark as recently used
    • If key doesn't exist:
      • If at capacity: remove the LRU item
      • Add new key-value pair as most recently used

Your Task

Implement the following in LRUCache.java:

  1. LRUCache(int capacity) - Constructor
  2. get(int key) - Get value and update usage
  3. put(int key, int value) - Put value and handle eviction
  4. Helper methods for managing doubly linked list

Data Structure Design

Recommended Approach: HashMap + Doubly Linked List

  • HashMap: O(1) access to nodes by key
  • Doubly Linked List: O(1) removal and insertion
    • Head: Most recently used
    • Tail: Least recently used

Test Cases to Consider

  • Basic put and get operations
  • Eviction when at capacity
  • Updating existing keys
  • Access order (get makes item recent)
  • Put on existing key updates it
  • Edge case: capacity = 1
  • Multiple evictions

Hints

  • Use a HashMap to store key -> Node mapping
  • Use a doubly linked list to maintain usage order
  • Head of list = most recently used
  • Tail of list = least recently used
  • When accessing: move node to head
  • When evicting: remove node from tail

Implementation Strategy

Node structure:
- key
- value  
- prev pointer
- next pointer

Main structure:
- HashMap<Integer, Node>
- head pointer (most recent)
- tail pointer (least recent)
- capacity

Common Operations Pattern

Move to front:

  1. Remove node from current position
  2. Add node at head

Remove from tail:

  1. Get tail node
  2. Remove from HashMap
  3. Update tail pointer
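The structures and patterns above can be sketched as one class. This is a minimal sketch, not the required solution: it assumes capacity >= 1 and uses sentinel head/tail nodes to avoid null checks; the helper names (unlink, addAfterHead, moveToFront, removeTail) are illustrative.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the HashMap + doubly linked list design (assumes capacity >= 1).
// Sentinel head/tail nodes keep pointer updates uniform at the list ends.
class LRUCache {
    private static class Node {
        int key, value;
        Node prev, next;
        Node(int key, int value) { this.key = key; this.value = value; }
    }

    private final int capacity;
    private final Map<Integer, Node> map = new HashMap<>();
    private final Node head = new Node(0, 0); // most recently used side
    private final Node tail = new Node(0, 0); // least recently used side

    LRUCache(int capacity) {
        this.capacity = capacity;
        head.next = tail;
        tail.prev = head;
    }

    public int get(int key) {
        Node node = map.get(key);
        if (node == null) return -1;
        moveToFront(node);              // accessing marks it most recently used
        return node.value;
    }

    public void put(int key, int value) {
        Node node = map.get(key);
        if (node != null) {             // existing key: update and refresh order
            node.value = value;
            moveToFront(node);
            return;
        }
        if (map.size() == capacity) removeTail(); // evict LRU item first
        Node fresh = new Node(key, value);
        map.put(key, fresh);
        addAfterHead(fresh);
    }

    // Unlink a node from its current position in the list.
    private void unlink(Node node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    // Insert a node right after the head sentinel (most recent position).
    private void addAfterHead(Node node) {
        node.next = head.next;
        node.prev = head;
        head.next.prev = node;
        head.next = node;
    }

    private void moveToFront(Node node) {
        unlink(node);
        addAfterHead(node);
    }

    // Remove the least recently used node (just before the tail sentinel).
    private void removeTail() {
        Node lru = tail.prev;
        unlink(lru);
        map.remove(lru.key);            // forgetting this step leaks entries
    }
}
```

Storing the key inside each node is what makes removeTail possible: without it, the evicted node could not be removed from the HashMap in O(1).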

Applications

  • Web browser cache
  • Database query cache
  • Operating system page replacement
  • CDN caching
  • API rate limiting

Follow-up Questions

  1. How would you implement an LFU (Least Frequently Used) cache?
  2. How would you add a time-to-live (TTL) feature?
  3. How would you make this thread-safe?
  4. How would you implement this with only a HashMap (without a doubly linked list)?
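On question 4, one common shortcut in Java is worth knowing as a prototype (it is not a from-scratch answer, since LinkedHashMap maintains its own internal linked list): constructing a java.util.LinkedHashMap with accessOrder = true makes iteration order track access order, and overriding removeEldestEntry triggers eviction. The class name LinkedLRUCache and the getValue helper below are illustrative.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Prototype LRU cache on top of java.util.LinkedHashMap.
// accessOrder = true reorders entries on each get/put;
// removeEldestEntry evicts once size exceeds capacity.
class LinkedLRUCache extends LinkedHashMap<Integer, Integer> {
    private final int capacity;

    LinkedLRUCache(int capacity) {
        super(capacity, 0.75f, true); // true = access order, not insertion order
        this.capacity = capacity;
    }

    // get() on a LinkedHashMap with accessOrder = true refreshes the entry.
    public int getValue(int key) {
        Integer v = get(key);
        return v == null ? -1 : v;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
        return size() > capacity; // evict the least recently used entry
    }
}
```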

Common Pitfalls

  • Forgetting to update usage order on get
  • Not handling capacity = 1 edge case
  • Memory leaks (not removing from HashMap when evicting)
  • Incorrect pointer updates in doubly linked list
  • Not checking if key exists before evicting

Variants

  • LFU Cache: Evict least frequently used
  • FIFO Cache: Evict first inserted
  • TTL Cache: Items expire after time
  • 2Q Cache: Two queues with different policies

Related Problems

  • Design a cache with expiration
  • Implement HashMap
  • Design a file system cache
  • Implement browser history

Complexity Analysis

| Operation | Time | Space       |
|-----------|------|-------------|
| get()     | O(1) | -           |
| put()     | O(1) | -           |
| Overall   | -    | O(capacity) |