Problem Statement
Design and implement a Least Recently Used (LRU) cache that efficiently stores and retrieves cached data, automatically evicting the least recently used items when capacity is reached.
Constraints and Assumptions
- Caching results of web queries
- Inputs are valid (no validation needed)
- The cache fits in memory
- Need efficient O(1) access and update operations
Design Overview
The LRU cache implementation uses three classes:
- Node: Represents a cached item in a doubly-linked list
- LinkedList: Maintains items in order of recency
- Cache: The main LRU cache with hash map for O(1) lookups
The LRU cache combines a hash map and a doubly-linked list to achieve O(1) time complexity for both get and set operations. The hash map provides fast lookups, while the linked list maintains the order of access.
Implementation
Node Class
Represents a node in the doubly-linked list:
LinkedList Class
Manages the order of cached items by recency:
- move_to_front(): Move an accessed node to the front (most recently used)
- append_to_front(): Add a new node at the front
- remove_from_tail(): Remove the least recently used node
Cache Class
The main LRU cache implementation:
Key Design Patterns
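The classes above can be sketched as follows in Python. This is a minimal illustration of the design, not a definitive implementation: the `query` field on Node (needed so eviction can clean up the hash map) and the `set(results, query)` signature are assumptions not spelled out in the text.

```python
class Node(object):
    """A cached item; stores its query key so eviction can update the lookup map."""

    def __init__(self, query, results):
        self.query = query
        self.results = results
        self.prev = None
        self.next = None


class LinkedList(object):
    """Doubly-linked list ordered by recency: head = most recent, tail = least."""

    def __init__(self):
        self.head = None
        self.tail = None

    def append_to_front(self, node):
        """Add a new node at the front (most recently used)."""
        node.prev = None
        node.next = self.head
        if self.head is not None:
            self.head.prev = node
        self.head = node
        if self.tail is None:
            self.tail = node

    def move_to_front(self, node):
        """Move an accessed node to the front."""
        if node is self.head:
            return
        if node is self.tail:
            self.tail = node.prev
        if node.prev is not None:
            node.prev.next = node.next
        if node.next is not None:
            node.next.prev = node.prev
        self.append_to_front(node)

    def remove_from_tail(self):
        """Unlink and return the least recently used node, or None if empty."""
        node = self.tail
        if node is None:
            return None
        if node.prev is not None:
            node.prev.next = None
        else:
            self.head = None
        self.tail = node.prev
        return node


class Cache(object):
    """LRU cache: `lookup` gives O(1) access, the linked list O(1) reordering."""

    def __init__(self, MAX_SIZE):
        self.MAX_SIZE = MAX_SIZE
        self.size = 0
        self.lookup = {}  # query -> Node
        self.linked_list = LinkedList()

    def get(self, query):
        """Return cached results (marking them most recently used), or None."""
        node = self.lookup.get(query)
        if node is None:
            return None
        self.linked_list.move_to_front(node)
        return node.results

    def set(self, results, query):
        """Insert or update an entry, evicting the LRU entry when at capacity."""
        node = self.lookup.get(query)
        if node is not None:
            node.results = results
            self.linked_list.move_to_front(node)
            return
        if self.size == self.MAX_SIZE:
            evicted = self.linked_list.remove_from_tail()
            del self.lookup[evicted.query]
            self.size -= 1
        new_node = Node(query, results)
        self.linked_list.append_to_front(new_node)
        self.lookup[query] = new_node
        self.size += 1
```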
Hybrid Data Structure
The LRU cache combines two data structures for optimal performance:
- Hash Map (lookup): Provides O(1) access to nodes by query key
- Doubly-Linked List: Maintains items in order of recency with O(1) insertion/deletion
Access Pattern Updates
Every time an item is accessed (via get() or set()), it moves to the front of the list:
- Get operation: Retrieve value and move to front
- Set operation: Insert at front or update and move to front
- Eviction: Remove from tail when at capacity
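These three behaviors can also be demonstrated compactly with Python's collections.OrderedDict, a common shortcut for LRU semantics. The SimpleLRU class here is illustrative only, not part of the design above; move_to_end plays the role of move_to_front.

```python
from collections import OrderedDict


class SimpleLRU:
    """LRU semantics via OrderedDict: last position = most recently used."""

    def __init__(self, max_size):
        self.max_size = max_size
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)  # get: refresh recency
        return self.data[key]

    def set(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)  # update: refresh recency
        elif len(self.data) == self.max_size:
            self.data.popitem(last=False)  # evict the least recently used item
        self.data[key] = value
```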
Complexity Analysis
| Operation | Time Complexity | Space Complexity |
|---|---|---|
| get() | O(1) | O(1) |
| set() | O(1) | O(1) |
| Space | - | O(n) |
Both get and set operations achieve O(1) time complexity because the hash map provides constant-time lookups, and the doubly-linked list allows constant-time insertion, deletion, and reordering of nodes.
Design Considerations
Advantages
- Fast access: O(1) lookup and update operations
- Automatic eviction: Removes least recently used items when at capacity
- Efficient memory usage: Only stores items up to MAX_SIZE
- Recency tracking: Always knows which items are most/least recently used
Use Cases
- Web caching: Cache results of expensive database queries or API calls
- Browser cache: Store recently accessed web pages
- CDN: Cache frequently accessed content
- CPU cache: Hardware-level caching strategies
Potential Improvements
- TTL (Time To Live): Add expiration times for cached items
- Cache statistics: Track hit/miss rates and performance metrics
- Weighted LRU: Give different priorities to different types of queries
- Thread safety: Add locks for concurrent access
- Persistence: Save cache to disk for recovery after restart
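As a reference point for the statistics idea, Python's standard library already ships an LRU cache with hit/miss counters: functools.lru_cache exposes them via cache_info(). The query_db function below is a hypothetical stand-in for an expensive call.

```python
from functools import lru_cache


@lru_cache(maxsize=2)
def query_db(query):
    # Stand-in for an expensive database or API call
    return "results for %s" % query


query_db("q1")  # miss
query_db("q2")  # miss
query_db("q1")  # hit
query_db("q3")  # miss; evicts "q2" (least recently used)
stats = query_db.cache_info()  # CacheInfo(hits=1, misses=3, maxsize=2, currsize=2)
```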
