Complexity Analysis Guide
Understanding algorithmic complexity is critical for coding interviews. This guide teaches you how to analyze and communicate the efficiency of your solutions.

What is Big O Notation?
Big O describes how an algorithm’s runtime or space requirements grow as input size increases. It focuses on the dominant term and ignores constants.

Key Principles
- Drop Constants: O(2n) → O(n)
- Drop Lower Terms: O(n² + n) → O(n²)
- Different Variables: O(n + m) stays as O(n + m)
- Worst Case: Big O typically describes worst-case behavior
Common Time Complexities (Best to Worst)
| Big O | Name | Example Operations |
|---|---|---|
| O(1) | Constant | Array access, hash lookup |
| O(log n) | Logarithmic | Binary search, balanced BST operations |
| O(n) | Linear | Array traversal, linear search |
| O(n log n) | Linearithmic | Merge sort, heap sort |
| O(n²) | Quadratic | Nested loops, bubble sort |
| O(2ⁿ) | Exponential | Naive recursive Fibonacci, subset generation |
| O(n!) | Factorial | Permutations |
O(1) - Constant Time
Definition: Time doesn’t depend on input size.

Examples
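A minimal sketch in Python of operations that run in constant time (the function name is illustrative, not from a specific problem):

```python
def constant_time_examples(arr, target):
    """Each operation here runs in O(1), independent of len(arr)."""
    first = arr[0]                  # direct array index access
    seen = {target: True}           # hash map insert (average case)
    found = target in seen          # hash map lookup (average case)
    n = len(arr)
    triangular = n * (n + 1) // 2   # closed-form sum of 1..n, no loop needed
    return first, found, triangular
```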
When You See O(1)
- Direct array index access: arr[i]
- Hash map operations: insert, lookup, delete (average case)
- Mathematical formulas: sum of 1 to n = n(n+1)/2
- Stack/queue push/pop operations
O(log n) - Logarithmic Time
Definition: The remaining work is halved with each step.

Why Log n?
If you divide n by 2 repeatedly until reaching 1:

- n → n/2 → n/4 → … → 1
- Number of steps = log₂(n)
Examples
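As a sketch, the classic O(log n) algorithm is binary search on a sorted array, where each comparison discards half the remaining range:

```python
def binary_search(arr, target):
    """Return the index of target in sorted arr, or -1 if absent.
    The search range halves each iteration: O(log n) time, O(1) space."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1    # discard the left half
        else:
            hi = mid - 1    # discard the right half
    return -1
```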
When You See O(log n)
- Binary search on sorted array
- Balanced BST operations: search, insert, delete
- Heap operations: push, pop
- Divide and conquer that processes one half
O(n) - Linear Time
Definition: Time grows linearly with input size.

Examples
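A simple O(n) sketch: building a hash map from an array in a single pass (the function name is illustrative):

```python
def build_frequency_map(arr):
    """One pass over the array: O(n) time, O(n) space for the map.
    Each dict update is O(1) on average, so the loop dominates."""
    freq = {}
    for x in arr:                        # n iterations
        freq[x] = freq.get(x, 0) + 1    # O(1) average per update
    return freq
```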
When You See O(n)
- Single loop through array/list
- Tree/graph traversal (DFS/BFS)
- Building hash map from array
- String concatenation in a loop (careful: each concatenation copies the string in O(n), so the loop as a whole can be O(n²))
O(n log n) - Linearithmic Time
Definition: A combination of linear and logarithmic growth, typical of efficient sorting.

Why n log n?
Merge Sort: Divide the array log n times; merging takes O(n) at each level.

- Levels: log n
- Work per level: n
- Total: n × log n
Examples
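The merge sort breakdown above can be sketched directly: log n levels of recursive splitting, with O(n) merge work at each level.

```python
def merge_sort(arr):
    """O(n log n): log n levels of halving, O(n) merge work per level."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])     # recurse on each half
    right = merge_sort(arr[mid:])
    # Merge two sorted halves in O(n).
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```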
When You See O(n log n)
- Efficient comparison sorts: merge sort, heap sort, quicksort (average)
- Building segment tree or other divide-and-conquer structures
- Sorting + linear processing
O(n²) - Quadratic Time
Definition: Time grows quadratically, often from nested loops.

Examples
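A sketch of the brute-force pair pattern: nested loops over the same array check n(n−1)/2 pairs, giving O(n²) time (the function name is illustrative):

```python
def has_pair_with_sum(arr, target):
    """Brute force: check every pair (i, j) with i < j.
    n*(n-1)/2 comparisons in the worst case -> O(n^2) time, O(1) space."""
    n = len(arr)
    for i in range(n):
        for j in range(i + 1, n):    # inner loop shrinks, but still O(n^2) total
            if arr[i] + arr[j] == target:
                return True
    return False
```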
When You See O(n²)
- Nested loops iterating over same collection
- Brute force pair/triplet finding
- Simple sorting algorithms: bubble, insertion, selection sort
- Checking all substrings of length 2+
O(2ⁿ) - Exponential Time
Definition: Time doubles with each additional input element.

Examples
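Subset generation shows the doubling directly: each element is either included or excluded, so one more element doubles the number of subsets.

```python
def all_subsets(arr):
    """Generate all 2^n subsets iteratively: O(2^n) time and space.
    Each pass doubles the list by appending copies that include x."""
    subsets = [[]]
    for x in arr:
        subsets += [s + [x] for s in subsets]   # doubles subsets each pass
    return subsets
```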
When You See O(2ⁿ)
- Generating all subsets/combinations
- Naive recursion with multiple branches per call (e.g., Fibonacci without memoization)
- Problems requiring exhaustive search
Space Complexity
Space complexity measures memory usage beyond input storage.

What Counts?
- Yes: Auxiliary data structures (hash maps, arrays, stacks)
- Yes: Recursion call stack
- No: Input array (unless you’re modifying it)
- No: Output array (unless explicitly asked)
Examples
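As a sketch, two ways to process an array that differ only in space: an in-place loop uses O(1) auxiliary space, while a recursive version pays O(n) for the call stack (function names are illustrative):

```python
def reverse_in_place(arr):
    """O(1) auxiliary space: swaps within the input, no extra structures."""
    lo, hi = 0, len(arr) - 1
    while lo < hi:
        arr[lo], arr[hi] = arr[hi], arr[lo]
        lo += 1
        hi -= 1
    return arr

def sum_recursive(arr):
    """O(n) space: the recursion goes n frames deep on the call stack."""
    if not arr:
        return 0
    return arr[0] + sum_recursive(arr[1:])
```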
How to Analyze Complexity
Step-by-Step Process
1. Identify input size: What is n? (array length, string length, tree nodes, etc.)
2. Count operations in loops:
   - Single loop → O(n)
   - Nested loops → Multiply iterations
   - Sequential loops → Add complexities
3. Analyze recursion:
   - Count recursive calls
   - Measure recursion depth
   - Use recurrence relations: T(n) = aT(n/b) + O(nᶜ)
4. Consider data structure operations:
   - Hash map insert/lookup: O(1) average
   - Sorted array binary search: O(log n)
   - Heap push/pop: O(log n)
5. Take the dominant term:
   - O(n² + n log n + n) → O(n²)
   - O(3n + 100) → O(n)
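The steps above applied to a small illustrative function (the problem is hypothetical): sequential phases add, so the dominant term wins.

```python
def max_gap_sorted(nums):
    """Step 1: n = len(nums).
    Step 2: sorted() is O(n log n); the scan over adjacent pairs is O(n).
    Step 5: O(n log n) + O(n) -> O(n log n) time; sorted() copies -> O(n) space."""
    if len(nums) < 2:
        return 0
    s = sorted(nums)                                        # O(n log n)
    return max(s[i + 1] - s[i] for i in range(len(s) - 1))  # O(n)
```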
Real Example Analysis
Problem: #037 - Minimum Window Substring

Common Complexity Patterns
Pattern Recognition Table
| Code Pattern | Time | Space |
|---|---|---|
| Single loop | O(n) | O(1) |
| Nested loops (independent) | O(n×m) | O(1) |
| Nested loops (same array) | O(n²) | O(1) |
| Binary search | O(log n) | O(1) |
| Sorting + linear | O(n log n) | O(1) or O(n) |
| Hash map build + lookup | O(n) | O(n) |
| DFS/BFS on tree | O(n) | O(h) for DFS (height), O(w) for BFS (width) |
| DFS/BFS on graph | O(V+E) | O(V) |
| Sliding window | O(n) | O(k) |
| Two pointers | O(n) | O(1) |
| Heap of size k | O(n log k) | O(k) |
| DP 2D table | O(n×m) | O(n×m) or O(m) |
| Backtracking (subsets) | O(2ⁿ) | O(n) |
Interview Communication Tips
How to Explain Complexity
Good explanation structure:

1. State the complexity clearly:
   - “The time complexity is O(n log n) and space is O(1).”
2. Explain the dominant operation:
   - “We sort the array which takes O(n log n), then iterate once in O(n), so sorting dominates.”
3. Define variables:
   - “Where n is the number of intervals and m is the maximum number of overlaps.”
4. Mention trade-offs:
   - “We use O(n) extra space with a hash map to achieve O(n) time instead of O(n²) with nested loops.”
Example Script
“For this Two Sum solution, the time complexity is O(n) where n is the length of the input array. We iterate through the array once, and for each element, we do a hash map lookup which is O(1) on average, giving us O(n) overall. The space complexity is O(n) because in the worst case, we might store all n elements in the hash map before finding the answer. This is optimal because we need to examine each element at least once, so we can’t do better than O(n) time.”
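A sketch of a Two Sum implementation matching the script above, one pass with a hash map:

```python
def two_sum(nums, target):
    """O(n) time: one pass, each lookup O(1) on average.
    O(n) space: the map may hold all n elements before a match is found."""
    seen = {}  # value -> index of where it was seen
    for i, x in enumerate(nums):
        if target - x in seen:          # complement already seen?
            return [seen[target - x], i]
        seen[x] = i
    return []
```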
Practice Problems by Complexity
O(1) and O(log n)
- #010 - Find Peak Element - O(log n)
- #042 - Find First and Last Position - O(log n)
O(n)
- #017 - Two Sum - O(n) time, O(n) space
- #029 - Valid Palindrome - O(n) time, O(1) space
- #037 - Minimum Window Substring - O(m+n)
O(n log n)
- #016 - Merge Intervals - O(n log n)
- #015 - Top K Frequent Elements - O(n log k)
O(n²)
- #065 - 3Sum - O(n²) time, O(1) space
Next Steps
- Practice: Analyze complexity for every solution you write
- Compare: Understand trade-offs between different approaches
- Optimize: Learn when O(n) space is worth O(n²) → O(n) time savings