# System & Data Structure Design

Design problems in DSA interviews test your ability to translate requirements into a functional, efficient, and maintainable class structure. Unlike standard algorithmic problems, the focus here is on **State Management** and **API Design**.

### Core Principles

1. **Encapsulation:** Keep data private and expose functionality through well-defined methods.
2. **Trade-offs:** Every design choice has a cost. Is it better to have $O(1)$ read and $O(N)$ write, or vice versa?
3. **State Consistency:** Ensure that your internal data structures (e.g., a Map and a List) stay in sync after every operation.

### Common Design Patterns

#### 1. HashMap + Doubly Linked List (DLL)

The "Gold Standard" for $O(1)$ caching (LRU/LFU).

```text
[Head] <-> [Node A] <-> [Node B] <-> [Node C] <-> [Tail]
   ^          ^            ^            ^           ^
 (MRU)      (Data)       (Data)       (Data)      (LRU)
```

- **HashMap:** Provides $O(1)$ lookups for keys to their corresponding nodes.
- **DLL:** Provides $O(1)$ addition/removal of nodes at both ends, maintaining the order of access.

#### 2. Amortized Analysis (Rebalancing)

Commonly used in **Queue using Stacks** or **Dynamic Arrays**.

- Instead of doing heavy work on every call, we batch it. Pushing to a stack is $O(1)$, and "flipping" elements to another stack happens only when necessary, averaging $O(1)$ per operation.

#### 3. Ring Buffers (Circular Arrays)

Used for fixed-size memory management (e.g., **Circular Queue**, **Hit Counter**).

```text
[0]  [1]  [2]  [3]  [4]  [5]
 ^         ^              ^
Head     (Data)         Tail
(Pops)               (Next Push)
```

- Use `(index + 1) % capacity` to wrap around the array.

#### 4. Concurrency & Thread Safety

For "Hard" design problems (e.g., **Bounded Blocking Queue**).

- Use **Mutexes** (Locks) to prevent data races.
- Use **Condition Variables** (`wait`/`notify`) to manage producer-consumer logic efficiently without busy-waiting.

### How to Approach a Design Problem

1. **Identify the API:** What methods do you need to implement? (`get`, `put`, `push`, etc.)
2. **Define the State:** What variables represent the current state? (Size, Capacity, Pointers).
3. **Choose the Data Structures:** Select the combination that minimizes time complexity for the most frequent operations.
4. **Dry Run:** Trace the state changes through a sequence of operations based on your chosen structure.
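As a quick illustration of the amortized pattern above, here is a minimal Python sketch of a queue built from two stacks (the class and method names are illustrative, not from a specific library):

```python
class QueueViaStacks:
    """Amortized O(1) FIFO queue built from two LIFO stacks."""

    def __init__(self):
        self._in = []   # receives every push
        self._out = []  # serves pops in FIFO order

    def push(self, x):
        self._in.append(x)  # always O(1)

    def pop(self):
        if not self._out:
            # "Flip" only when the out-stack is empty. Each element is
            # moved at most once, so pops average O(1) per operation.
            while self._in:
                self._out.append(self._in.pop())
        return self._out.pop()
```

Each element is pushed twice and popped twice at most, which is how the worst-case $O(N)$ flip averages out to $O(1)$ per call.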
# LFU Cache
Design a data structure that follows the constraints of a Least Frequently Used (LFU) cache.
### Requirements
- `get(key)`: Return the value if the key exists and update its frequency; otherwise return `-1`.
- `put(key, value)`: Insert or update the value. If the cache is at capacity, evict the least frequently used item. If there's a frequency tie, evict the least recently used.
## Level I: Brute Force Scan

### Intuition
Store entries in a List or Map. For every `put` at capacity, iterate through the entire collection to find the item with the minimum frequency and the oldest access time. This is $O(N)$ per operation but extremely simple.
### Detailed Dry Run

Cache (capacity 2): `{(A, f:2, t:1), (B, f:1, t:2)}`. `put(C)` evicts B because its frequency (1) is lower than A's (2).
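A minimal Python sketch of the brute-force approach (class and field names are illustrative); a logical tick counter stands in for access time so the LRU tie-break is deterministic:

```python
class BruteForceLFU:
    """Level I: O(N) linear scan on every eviction."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}  # key -> [value, freq, last_access_tick]
        self.tick = 0    # logical clock for LRU tie-breaking

    def get(self, key):
        if key not in self.store:
            return -1
        self.tick += 1
        entry = self.store[key]
        entry[1] += 1          # bump frequency
        entry[2] = self.tick   # refresh recency
        return entry[0]

    def put(self, key, value):
        if self.capacity == 0:
            return
        self.tick += 1
        if key in self.store:
            entry = self.store[key]
            entry[0] = value
            entry[1] += 1
            entry[2] = self.tick
            return
        if len(self.store) >= self.capacity:
            # O(N) scan: lowest frequency wins; oldest tick breaks ties
            victim = min(self.store,
                         key=lambda k: (self.store[k][1], self.store[k][2]))
            del self.store[victim]
        self.store[key] = [value, 1, self.tick]
```

Running the dry run above: after `put(A)`, `put(B)`, `get(A)`, a `put(C)` scans both entries and removes B.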
## Level II: Priority Queue ($O(\log N)$)

### Intuition
Use a PriorityQueue to store entries sorted by frequency, and then by access time (tie-breaker). Because a key's frequency changes on every access, either update its entry in place or push a fresh entry and lazily discard stale ones on pop. While push and pop are $O(\log N)$, it is simpler to implement than the DLL version.
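A Python sketch of the lazy-deletion variant (class and field names are illustrative): every access pushes a fresh `(freq, tick, key)` snapshot, and eviction pops until it finds a snapshot that still matches the live entry:

```python
import heapq

class HeapLFU:
    """Level II: min-heap keyed by (freq, tick); stale snapshots are
    skipped lazily at eviction time."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = {}   # key -> (value, freq, tick)
        self.heap = []   # (freq, tick, key) snapshots, possibly stale
        self.tick = 0

    def _touch(self, key, value, freq):
        self.tick += 1
        self.data[key] = (value, freq, self.tick)
        heapq.heappush(self.heap, (freq, self.tick, key))  # O(log N)

    def get(self, key):
        if key not in self.data:
            return -1
        value, freq, _ = self.data[key]
        self._touch(key, value, freq + 1)
        return value

    def put(self, key, value):
        if self.capacity == 0:
            return
        if key in self.data:
            self._touch(key, value, self.data[key][1] + 1)
            return
        while len(self.data) >= self.capacity:
            freq, tick, k = heapq.heappop(self.heap)
            # Evict only if this snapshot still matches the live entry;
            # otherwise it is stale and we simply discard it.
            if k in self.data and self.data[k][1:] == (freq, tick):
                del self.data[k]
        self._touch(key, value, 1)
```

The trade-off: the heap can grow beyond `capacity` with stale snapshots, but the logic stays short and every operation remains $O(\log N)$ in the heap size.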
## Level III: Map Frequency to Doubly Linked List

### Intuition
Maintain a `minFreq` variable. Use one map for `key -> node` and another for `freq -> DLL` of nodes. When a key is accessed, move its node from the `count` DLL to the `count + 1` DLL. If the `count` DLL becomes empty and `count == minFreq`, increment `minFreq`. Inserting a brand-new key resets `minFreq` to 1. Eviction removes the node at `freqMap[minFreq].tail.prev`, i.e., the least recently used node in the lowest-frequency list. Every operation is $O(1)$.
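A Python sketch of this design, where `collections.OrderedDict` stands in for each hand-rolled DLL (its insertion order gives the LRU tie-break within a frequency bucket); names like `key_to_vf` are illustrative:

```python
from collections import defaultdict, OrderedDict

class LFUCache:
    """Level III: O(1) get/put via key map + frequency buckets."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.key_to_vf = {}  # key -> (value, freq)
        # freq -> keys in LRU order (OrderedDict plays the DLL's role)
        self.freq_to_keys = defaultdict(OrderedDict)
        self.min_freq = 0

    def _bump(self, key):
        """Move key from its freq bucket to the freq+1 bucket."""
        value, freq = self.key_to_vf[key]
        del self.freq_to_keys[freq][key]
        if not self.freq_to_keys[freq]:
            del self.freq_to_keys[freq]
            if self.min_freq == freq:
                self.min_freq += 1
        self.freq_to_keys[freq + 1][key] = None
        self.key_to_vf[key] = (value, freq + 1)

    def get(self, key):
        if key not in self.key_to_vf:
            return -1
        self._bump(key)
        return self.key_to_vf[key][0]

    def put(self, key, value):
        if self.capacity == 0:
            return
        if key in self.key_to_vf:
            self._bump(key)
            self.key_to_vf[key] = (value, self.key_to_vf[key][1])
            return
        if len(self.key_to_vf) >= self.capacity:
            # Evict the LRU key among the lowest-frequency bucket
            victim, _ = self.freq_to_keys[self.min_freq].popitem(last=False)
            if not self.freq_to_keys[self.min_freq]:
                del self.freq_to_keys[self.min_freq]
            del self.key_to_vf[victim]
        self.key_to_vf[key] = (value, 1)
        self.freq_to_keys[1][key] = None
        self.min_freq = 1  # a new key always starts at frequency 1
```

In an interview where $O(1)$ list surgery must be shown explicitly, replace each `OrderedDict` with the head/tail-sentinel DLL described above; the bookkeeping around `minFreq` is identical.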