Sorting Algorithms - Complete Placement Interview Question Bank
Last Updated: March 2026
Introduction: Why Sorting Matters in Coding Interviews
Sorting is one of the most fundamental algorithmic concepts in computer science. It appears directly or indirectly in 50%+ of coding interviews because it demonstrates:
- Algorithmic thinking: Understanding trade-offs between time and space
- Complexity analysis: Knowing when O(n log n) is achievable vs O(n²)
- Problem transformation: Many problems become trivial after sorting
- Implementation skill: Writing bug-free sorting logic under pressure
Sorting is crucial for:
- Database query optimization
- Search algorithms (binary search requires sorted data)
- Data analysis and statistics
- Scheduling and resource allocation
Key Concepts and Theory
1. Sorting Algorithm Categories
COMPARISON-BASED SORTS (Ω(n log n) lower bound):
├── Simple: Bubble, Selection, Insertion (O(n²))
├── Efficient: Merge, Quick, Heap (O(n log n))
└── Hybrid: TimSort, IntroSort
NON-COMPARISON SORTS (O(n) possible):
├── Counting Sort
├── Radix Sort
└── Bucket Sort
IN-PLACE vs NOT IN-PLACE:
├── In-place: Bubble, Selection, Insertion, Heap (O(1) extra); Quick (O(log n) stack)
└── Not in-place: Merge Sort (O(n) extra)
STABLE vs UNSTABLE:
├── Stable: Bubble, Insertion, Merge, Counting, Radix
└── Unstable: Selection, Quick, Heap
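Stability is easy to see concretely. Python's built-in `sorted()` is stable (it is Timsort under the hood), so records with equal keys keep their original relative order:

```python
# Records: (subject, score). Two pairs tie on score.
records = [("math", 90), ("cs", 85), ("physics", 90), ("bio", 85)]

# A stable sort by score keeps "cs" before "bio" and "math" before
# "physics", because that is their relative order in the input.
by_score = sorted(records, key=lambda r: r[1])
print(by_score)
# → [('cs', 85), ('bio', 85), ('math', 90), ('physics', 90)]
```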
2. Algorithm Complexities Summary
| Algorithm | Best | Average | Worst | Space | Stable |
|---|---|---|---|---|---|
| Bubble Sort | O(n) | O(n²) | O(n²) | O(1) | Yes |
| Selection Sort | O(n²) | O(n²) | O(n²) | O(1) | No |
| Insertion Sort | O(n) | O(n²) | O(n²) | O(1) | Yes |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes |
| Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) | No |
| Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) | No |
| Counting Sort | O(n+k) | O(n+k) | O(n+k) | O(k) | Yes |
| Radix Sort | O(nk) | O(nk) | O(nk) | O(n+k) | Yes |
| Timsort | O(n) | O(n log n) | O(n log n) | O(n) | Yes |
Time/Space Complexity Cheat Sheet
┌─────────────────────────────────────────────────────────────┐
│ SORTING ALGORITHMS AT A GLANCE │
├─────────────────────────────────────────────────────────────┤
│ When to use what: │
│ │
│ Small arrays (n < 50) │ Insertion Sort (cache-friendly)│
│ General purpose │ Quick Sort (fastest in practice)│
│ Worst-case guarantee │ Merge Sort or Heap Sort │
│ Linked lists │ Merge Sort │
│ External sorting │ Merge Sort │
│ Limited range integers │ Counting Sort (O(n)) │
│ Large integers/strings │ Radix Sort │
│ Nearly sorted data │ Insertion Sort or Tim Sort │
│ Stable sort needed │ Merge Sort or Insertion Sort │
└─────────────────────────────────────────────────────────────┘
Complete Algorithm Implementations
Bubble Sort
def bubble_sort(arr):
    """
    Repeatedly swap adjacent elements if in wrong order.
    Time: O(n²) | Space: O(1) | Stable: Yes
    """
    n = len(arr)
    arr = arr.copy()
    for i in range(n):
        swapped = False
        # Last i elements are already sorted
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        # Optimization: stop if no swaps
        if not swapped:
            break
    return arr
Selection Sort
def selection_sort(arr):
    """
    Find minimum element and place at beginning.
    Time: O(n²) | Space: O(1) | Stable: No
    """
    n = len(arr)
    arr = arr.copy()
    for i in range(n):
        min_idx = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr
Insertion Sort
def insertion_sort(arr):
    """
    Build sorted array one element at a time.
    Time: O(n²) worst, O(n) best | Space: O(1) | Stable: Yes
    """
    arr = arr.copy()
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # Shift elements greater than key
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key
    return arr
Merge Sort
def merge_sort(arr):
    """
    Divide and conquer: split, sort, merge.
    Time: O(n log n) | Space: O(n) | Stable: Yes
    """
    if len(arr) <= 1:
        return arr
    # Split
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge
    return merge(left, right)

def merge(left, right):
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result
# Index-based version that sorts arr in place
# (note: the merge step still allocates O(n) temporary arrays)
def merge_sort_inplace(arr, left=0, right=None):
    if right is None:
        right = len(arr) - 1
    if left < right:
        mid = (left + right) // 2
        merge_sort_inplace(arr, left, mid)
        merge_sort_inplace(arr, mid + 1, right)
        merge_inplace(arr, left, mid, right)

def merge_inplace(arr, left, mid, right):
    # Create temp arrays
    left_arr = arr[left:mid + 1]
    right_arr = arr[mid + 1:right + 1]
    i = j = 0
    k = left
    while i < len(left_arr) and j < len(right_arr):
        if left_arr[i] <= right_arr[j]:
            arr[k] = left_arr[i]
            i += 1
        else:
            arr[k] = right_arr[j]
            j += 1
        k += 1
    while i < len(left_arr):
        arr[k] = left_arr[i]
        i += 1
        k += 1
    while j < len(right_arr):
        arr[k] = right_arr[j]
        j += 1
        k += 1
Quick Sort
def quick_sort(arr):
    """
    Divide and conquer: partition around pivot, recurse.
    Time: O(n log n) avg, O(n²) worst | Space: O(log n) | Stable: No
    """
    arr = arr.copy()
    _quick_sort_helper(arr, 0, len(arr) - 1)
    return arr

def _quick_sort_helper(arr, low, high):
    if low < high:
        # Partition and get pivot index
        pivot_idx = partition(arr, low, high)
        # Recursively sort subarrays
        _quick_sort_helper(arr, low, pivot_idx - 1)
        _quick_sort_helper(arr, pivot_idx + 1, high)

def partition(arr, low, high):
    """Lomuto partition scheme"""
    pivot = arr[high]
    i = low - 1  # Index of smaller element
    for j in range(low, high):
        if arr[j] <= pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[high] = arr[high], arr[i + 1]
    return i + 1
# Hoare partition (fewer swaps on average). Note: it returns a split
# point, not the pivot's final index, so the caller must recurse on
# [low, j] and [j + 1, high] instead of excluding the pivot position.
def partition_hoare(arr, low, high):
    pivot = arr[(low + high) // 2]
    i = low - 1
    j = high + 1
    while True:
        i += 1
        while arr[i] < pivot:
            i += 1
        j -= 1
        while arr[j] > pivot:
            j -= 1
        if i >= j:
            return j
        arr[i], arr[j] = arr[j], arr[i]
# Randomized QuickSort (avoids the adversarial worst case)
import random

def quick_sort_randomized(arr):
    arr = arr.copy()
    _quick_sort_random(arr, 0, len(arr) - 1)
    return arr

def _quick_sort_random(arr, low, high):
    if low < high:
        pivot_idx = partition_random(arr, low, high)
        _quick_sort_random(arr, low, pivot_idx - 1)
        _quick_sort_random(arr, pivot_idx + 1, high)

def partition_random(arr, low, high):
    rand_idx = random.randint(low, high)
    arr[rand_idx], arr[high] = arr[high], arr[rand_idx]
    return partition(arr, low, high)
Heap Sort
def heap_sort(arr):
    """
    Build max heap, repeatedly extract maximum.
    Time: O(n log n) | Space: O(1) | Stable: No
    """
    arr = arr.copy()
    n = len(arr)
    # Build max heap (bottom-up)
    for i in range(n // 2 - 1, -1, -1):
        heapify(arr, n, i)
    # Extract elements one by one
    for i in range(n - 1, 0, -1):
        arr[0], arr[i] = arr[i], arr[0]  # Move max to end
        heapify(arr, i, 0)  # Heapify reduced heap
    return arr

def heapify(arr, n, i):
    """Maintain max heap property"""
    largest = i
    left = 2 * i + 1
    right = 2 * i + 2
    if left < n and arr[left] > arr[largest]:
        largest = left
    if right < n and arr[right] > arr[largest]:
        largest = right
    if largest != i:
        arr[i], arr[largest] = arr[largest], arr[i]
        heapify(arr, n, largest)
Counting Sort
def counting_sort(arr):
    """
    Count occurrences, construct sorted array.
    Time: O(n + k) | Space: O(k) | Stable: Yes
    k = range of input values
    """
    if not arr:
        return []
    # Find range
    min_val = min(arr)
    max_val = max(arr)
    # Count occurrences
    count = [0] * (max_val - min_val + 1)
    for num in arr:
        count[num - min_val] += 1
    # Construct result
    result = []
    for i, cnt in enumerate(count):
        result.extend([i + min_val] * cnt)
    return result
# Stable counting sort with key function
def counting_sort_stable(arr, key=None):
    """
    Stable version using position tracking.
    """
    if not arr:
        return []
    # Apply key function
    keys = [key(x) for x in arr] if key else arr
    min_val = min(keys)
    max_val = max(keys)
    # Count occurrences of each key
    count = [0] * (max_val - min_val + 1)
    for k in keys:
        count[k - min_val] += 1
    # Cumulative count gives final positions
    for i in range(1, len(count)):
        count[i] += count[i - 1]
    # Build output right-to-left to preserve stability
    output = [None] * len(arr)
    for i in range(len(arr) - 1, -1, -1):
        idx = keys[i] - min_val
        count[idx] -= 1
        output[count[idx]] = arr[i]
    return output
Radix Sort
def radix_sort(arr):
    """
    Sort by each digit position (LSD to MSD).
    Time: O(d × (n + k)) | Space: O(n + k) | Stable: Yes
    d = digits, k = digit range (usually 10)
    Assumes non-negative integers; negatives need separate handling.
    """
    if not arr:
        return []
    arr = arr.copy()
    max_num = max(arr)
    exp = 1  # Current digit position
    while max_num // exp > 0:
        counting_sort_by_digit(arr, exp)
        exp *= 10
    return arr
def counting_sort_by_digit(arr, exp):
    """Sort by digit at exp position"""
    n = len(arr)
    output = [0] * n
    count = [0] * 10  # Digits 0-9
    # Count occurrences of each digit
    for num in arr:
        digit = (num // exp) % 10
        count[digit] += 1
    # Cumulative count
    for i in range(1, 10):
        count[i] += count[i - 1]
    # Build output (stable)
    for i in range(n - 1, -1, -1):
        digit = (arr[i] // exp) % 10
        count[digit] -= 1
        output[count[digit]] = arr[i]
    # Copy back
    for i in range(n):
        arr[i] = output[i]
Practice Questions with Solutions
Question 1: Sort Colors (Dutch National Flag) (MEDIUM ⭐⭐)
Problem: Sort array of 0s, 1s, and 2s in-place in one pass.
Solution:
def sort_colors(nums):
    """
    Three-way partitioning - O(n) time, O(1) space
    Maintain three regions:
    [0, low) = 0s, [low, mid) = 1s, (high, end] = 2s
    """
    low = mid = 0
    high = len(nums) - 1
    while mid <= high:
        if nums[mid] == 0:
            nums[low], nums[mid] = nums[mid], nums[low]
            low += 1
            mid += 1
        elif nums[mid] == 1:
            mid += 1
        else:  # nums[mid] == 2
            nums[mid], nums[high] = nums[high], nums[mid]
            high -= 1
            # Don't increment mid; need to check the swapped-in element
Question 2: Merge Intervals (MEDIUM ⭐⭐)
Problem: Merge all overlapping intervals.
Solution:
def merge(intervals):
    """
    Sort + single pass - O(n log n) time, O(n) space
    """
    if not intervals:
        return []
    # Sort by start time
    intervals.sort(key=lambda x: x[0])
    merged = [intervals[0]]
    for current in intervals[1:]:
        last = merged[-1]
        if current[0] <= last[1]:  # Overlapping
            last[1] = max(last[1], current[1])
        else:
            merged.append(current)
    return merged
Question 3: Kth Largest Element (MEDIUM ⭐⭐)
Problem: Find kth largest element in unsorted array.
Solution:
import random

def find_kth_largest(nums, k):
    """
    QuickSelect - O(n) avg, O(n²) worst time, O(1) space
    """
    return quickselect(nums, 0, len(nums) - 1, len(nums) - k)

def quickselect(nums, left, right, k):
    """
    Returns kth smallest (0-indexed)
    """
    if left == right:
        return nums[left]
    # Random pivot
    pivot_idx = random.randint(left, right)
    pivot_idx = partition(nums, left, right, pivot_idx)
    if k == pivot_idx:
        return nums[k]
    elif k < pivot_idx:
        return quickselect(nums, left, pivot_idx - 1, k)
    else:
        return quickselect(nums, pivot_idx + 1, right, k)

def partition(nums, left, right, pivot_idx):
    pivot_val = nums[pivot_idx]
    nums[pivot_idx], nums[right] = nums[right], nums[pivot_idx]
    store_idx = left
    for i in range(left, right):
        if nums[i] < pivot_val:
            nums[store_idx], nums[i] = nums[i], nums[store_idx]
            store_idx += 1
    nums[right], nums[store_idx] = nums[store_idx], nums[right]
    return store_idx

# Min heap approach - O(n log k) time, O(k) space
import heapq

def find_kth_largest_heap(nums, k):
    min_heap = nums[:k]
    heapq.heapify(min_heap)
    for num in nums[k:]:
        if num > min_heap[0]:
            heapq.heapreplace(min_heap, num)
    return min_heap[0]
Question 4: Top K Frequent Elements (MEDIUM ⭐⭐)
Problem: Return k most frequent elements.
Solution:
from collections import Counter
import heapq

def top_k_frequent(nums, k):
    """
    Bucket sort approach - O(n) time, O(n) space
    """
    # Count frequencies
    freq = Counter(nums)
    # Bucket by frequency:
    # buckets[i] = list of elements with frequency i
    buckets = [[] for _ in range(len(nums) + 1)]
    for num, count in freq.items():
        buckets[count].append(num)
    # Collect top k, highest frequency first
    result = []
    for i in range(len(buckets) - 1, 0, -1):
        for num in buckets[i]:
            result.append(num)
            if len(result) == k:
                return result
    return result

# Min heap approach - O(n log k) time
def top_k_frequent_heap(nums, k):
    freq = Counter(nums)
    return heapq.nlargest(k, freq.keys(), key=freq.get)
Question 5: Find the Duplicate Number (MEDIUM ⭐⭐)
Problem: Find duplicate in array where nums[i] is in [1, n], array has n+1 elements.
Solution:
def find_duplicate(nums):
    """
    Floyd's Cycle Detection - O(n) time, O(1) space
    Treat the array as a linked list: index -> value acts like node -> next.
    The duplicate creates a cycle.
    """
    # Phase 1: find meeting point
    slow = fast = nums[0]
    while True:
        slow = nums[slow]
        fast = nums[nums[fast]]
        if slow == fast:
            break
    # Phase 2: find cycle start (the duplicate)
    slow = nums[0]
    while slow != fast:
        slow = nums[slow]
        fast = nums[fast]
    return slow

# Binary search approach - O(n log n) time, O(1) space
def find_duplicate_binary_search(nums):
    """
    Count numbers <= mid to determine which half has the duplicate
    """
    left, right = 1, len(nums) - 1
    while left < right:
        mid = (left + right) // 2
        count = sum(1 for num in nums if num <= mid)
        if count > mid:
            right = mid
        else:
            left = mid + 1
    return left
Question 6: Meeting Rooms II (MEDIUM ⭐⭐)
Problem: Find minimum number of conference rooms required.
Solution:
import heapq

def min_meeting_rooms(intervals):
    """
    Min heap approach - O(n log n) time, O(n) space
    """
    if not intervals:
        return 0
    # Sort by start time
    intervals.sort(key=lambda x: x[0])
    # Min heap of end times:
    # heap top is the room that becomes free earliest
    rooms = [intervals[0][1]]  # End time of first meeting
    for start, end in intervals[1:]:
        # If the earliest-ending room is free, reuse it
        if rooms[0] <= start:
            heapq.heappop(rooms)
        # Allocate room (new or reused)
        heapq.heappush(rooms, end)
    return len(rooms)

# Chronological ordering - O(n log n) time, O(n) space
def min_meeting_rooms_chronological(intervals):
    if not intervals:
        return 0
    starts = sorted(i[0] for i in intervals)
    ends = sorted(i[1] for i in intervals)
    rooms = 0
    end_ptr = 0
    for start in starts:
        if start >= ends[end_ptr]:
            # Reuse a room
            end_ptr += 1
        else:
            # Need a new room
            rooms += 1
    return rooms
Question 7: Sort List (MEDIUM ⭐⭐)
Problem: Sort linked list in O(n log n) time, O(1) space.
Solution:
class ListNode:
    def __init__(self, val=0, next=None):
        self.val = val
        self.next = next

def sort_list(head):
    """
    Merge Sort - O(n log n) time, O(log n) stack space
    (bottom-up merge sort can achieve O(1) space)
    """
    if not head or not head.next:
        return head

    # Find middle and split the list there
    def find_middle(head):
        slow = fast = head
        prev = None
        while fast and fast.next:
            prev = slow
            slow = slow.next
            fast = fast.next.next
        prev.next = None
        return slow

    # Merge two sorted lists
    def merge(l1, l2):
        dummy = ListNode(0)
        curr = dummy
        while l1 and l2:
            if l1.val <= l2.val:
                curr.next = l1
                l1 = l1.next
            else:
                curr.next = l2
                l2 = l2.next
            curr = curr.next
        curr.next = l1 or l2
        return dummy.next

    mid = find_middle(head)
    left = sort_list(head)
    right = sort_list(mid)
    return merge(left, right)
Question 8: Wiggle Sort II (MEDIUM ⭐⭐)
Problem: Reorder such that nums[0] < nums[1] > nums[2] < nums[3]...
Solution:
def wiggle_sort(nums):
    """
    Sort and rearrange - O(n log n) time, O(n) space
    """
    nums.sort()
    n = len(nums)
    temp = nums[:]
    # Conceptually split temp into a small half and a large half.
    # Fill odd indices from the end of the large half,
    # then even indices from the end of the small half.
    j = n - 1
    for i in range(1, n, 2):  # Odd indices
        nums[i] = temp[j]
        j -= 1
    for i in range(0, n, 2):  # Even indices
        nums[i] = temp[j]
        j -= 1

# An O(n)-time version exists using median of medians
# plus virtual index mapping (complex).
Question 9: Maximum Gap (HARD ⭐⭐⭐)
Problem: Find maximum difference between successive elements in sorted form.
Solution:
def maximum_gap(nums):
    """
    Bucket sort concept - O(n) time, O(n) space
    Key insight: the max gap is at least (max - min) / (n - 1)
    """
    if len(nums) < 2:
        return 0
    n = len(nums)
    min_val, max_val = min(nums), max(nums)
    if min_val == max_val:
        return 0
    # Bucket size ensures the max gap won't be inside a bucket
    bucket_size = max(1, (max_val - min_val) // (n - 1))
    bucket_count = (max_val - min_val) // bucket_size + 1
    # Track min and max of each bucket
    buckets_min = [float('inf')] * bucket_count
    buckets_max = [float('-inf')] * bucket_count
    for num in nums:
        idx = (num - min_val) // bucket_size
        buckets_min[idx] = min(buckets_min[idx], num)
        buckets_max[idx] = max(buckets_max[idx], num)
    # Find max gap between adjacent non-empty buckets
    max_gap = 0
    prev_max = min_val
    for i in range(bucket_count):
        if buckets_min[i] == float('inf'):
            continue  # Empty bucket
        max_gap = max(max_gap, buckets_min[i] - prev_max)
        prev_max = buckets_max[i]
    return max_gap
Question 10: Count of Smaller Numbers After Self (HARD ⭐⭐⭐)
Problem: For each element, count how many smaller elements are to its right.
Solution:
def count_smaller(nums):
    """
    Modified Merge Sort - O(n log n) time, O(n) space
    Count inversions during the merge step
    """
    n = len(nums)
    result = [0] * n
    # Store (original_index, value) pairs
    indexed_nums = [(i, nums[i]) for i in range(n)]

    def merge_sort(arr):
        if len(arr) <= 1:
            return arr
        mid = len(arr) // 2
        left = merge_sort(arr[:mid])
        right = merge_sort(arr[mid:])
        return merge(left, right)

    def merge(left, right):
        merged = []
        i = j = 0
        # j counts elements from right (all smaller) merged so far
        while i < len(left) and j < len(right):
            if left[i][1] <= right[j][1]:
                # right[0:j] are all smaller than left[i]
                result[left[i][0]] += j
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        # Remaining left elements: all of right was smaller
        while i < len(left):
            result[left[i][0]] += j
            merged.append(left[i])
            i += 1
        # Remaining right elements
        while j < len(right):
            merged.append(right[j])
            j += 1
        return merged

    merge_sort(indexed_nums)
    return result
Common Interview Follow-up Questions
1. "Which sorting algorithm does Python/Java use?"
   - Python: Timsort (hybrid merge + insertion sort)
   - Java: Dual-Pivot Quicksort for primitives, Timsort for objects
   - C++: std::sort is typically Introsort (quick + heap + insertion)
2. "When is Bubble Sort actually useful?"
   - Almost sorted data (O(n) with the early-exit optimization)
   - Educational purposes
   - Detecting that very few swaps are needed
3. "How would you sort 1 billion integers?"
   - External merge sort (data doesn't fit in memory)
   - Distribute into chunks, sort each, k-way merge
   - Consider distributed sorting (MapReduce)
4. "Stable vs Unstable - when does it matter?"
   - Sorting by multiple keys (sort by name, then by age)
   - Maintaining original order for equal elements
   - Database operations with secondary indices
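The multi-key case is worth demonstrating: with a stable sort you can sort by the secondary key first, then by the primary key, and ties on the primary key keep the secondary order. A sketch using Python's stable `list.sort`:

```python
people = [("Bob", 30), ("Alice", 25), ("Bob", 25), ("Alice", 30)]

# Pass 1: sort by the secondary key (age).
people.sort(key=lambda p: p[1])
# Pass 2: stable sort by the primary key (name);
# equal names keep their age order from pass 1.
people.sort(key=lambda p: p[0])
print(people)
# → [('Alice', 25), ('Alice', 30), ('Bob', 25), ('Bob', 30)]
```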
5. "Implement a sorting algorithm without comparisons"
   - Counting sort (integer keys)
   - Radix sort (integer/string keys)
   - Bucket sort (uniform distribution)
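Bucket Sort appears in the taxonomy above but is not implemented in this bank; here is a minimal sketch for floats roughly uniform in [0, 1) (the function name and bucket count are illustrative choices):

```python
def bucket_sort(arr, num_buckets=10):
    """
    Distribute values into buckets, sort each bucket, concatenate.
    Time: O(n) average for uniform input | Space: O(n)
    Assumes values in [0, 1).
    """
    if not arr:
        return []
    buckets = [[] for _ in range(num_buckets)]
    for x in arr:
        # Map a value in [0, 1) to its bucket; min() guards the upper edge.
        buckets[min(int(x * num_buckets), num_buckets - 1)].append(x)
    result = []
    for bucket in buckets:
        result.extend(sorted(bucket))  # insertion sort is also common here
    return result
```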
Companies That Frequently Ask Sorting
| Company | Frequency | Common Questions |
|---|---|---|
| Google | ⭐⭐⭐⭐⭐ | Kth Largest, Wiggle Sort, Count Inversions |
| Amazon | ⭐⭐⭐⭐⭐ | Merge Intervals, Meeting Rooms, Top K |
| Microsoft | ⭐⭐⭐⭐ | Sort Colors, Sort List, Find Duplicate |
| Meta | ⭐⭐⭐⭐ | K Closest Points, Reorganize String |
| Apple | ⭐⭐⭐ | Maximum Gap, Skyline Problem |
| Uber | ⭐⭐⭐ | Sort by frequency, Custom sort comparator |
| Netflix | ⭐⭐⭐ | External sorting, Stream processing |
| Adobe | ⭐⭐⭐ | Dutch National Flag variations |
Preparation Tips for Sorting Problems
1. Know Your Complexities: Memorize best/average/worst cases for all major algorithms.
2. Understand Stability: Know which sorts are stable and why it matters.
3. Master QuickSelect: For "find kth" problems, QuickSelect is often optimal.
4. Recognize Patterns:
   - "Find kth" → QuickSelect or heap
   - "Merge intervals" → Sort + linear scan
   - "Sort colors" → Dutch National Flag
   - "Almost sorted" → Insertion sort
5. Practice Writing From Scratch: Don't rely on sorted() in interviews unless allowed.
6. Consider Constraints: Small range → Counting sort. Linked list → Merge sort.
7. Know Language Defaults: Be ready to explain what your language uses internally.
Frequently Asked Questions (FAQ)
Q1: Why is the lower bound for comparison sorting Ω(n log n)?
A: A decision tree for sorting n elements has at least n! leaves. A binary tree of height h has at most 2^h leaves. Therefore: 2^h ≥ n! → h ≥ log(n!) ≈ n log n (by Stirling's approximation).
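The approximation in the last step can be made exact without Stirling, since the top half of the factorial's factors are each at least n/2:

```latex
h \;\ge\; \log_2(n!) \;=\; \sum_{i=1}^{n} \log_2 i
  \;\ge\; \sum_{i=\lceil n/2 \rceil}^{n} \log_2 \frac{n}{2}
  \;\ge\; \frac{n}{2} \log_2 \frac{n}{2} \;=\; \Omega(n \log n)
```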
Q2: What's the difference between Quicksort and QuickSelect?
A: Quicksort recursively sorts both partitions. QuickSelect only recurses into the partition containing the kth element, giving O(n) average time to find a single order statistic.
Q3: When should I use Heapsort over Quicksort?
A: Heapsort guarantees O(n log n) worst-case time and O(1) space. Use when:
- Worst-case performance is critical
- Memory is limited (embedded systems)
- Deterministic behavior is required
Q4: Can Radix Sort be used for floating-point numbers?
A: Yes, by treating the bit representation as integers, or by separating into integer and fractional parts. IEEE 754 format needs careful handling of sign bit.
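One common version of the bit trick, sketched below (`float_radix_key` is an illustrative helper, not a standard-library function): reinterpret the 64-bit pattern as an unsigned integer and apply a monotone bit transform, after which an integer radix sort orders the floats correctly.

```python
import struct

def float_radix_key(x):
    """Map a finite float to a 64-bit unsigned int with the same ordering."""
    # Reinterpret the double's IEEE 754 bit pattern as an unsigned integer.
    bits = struct.unpack('<Q', struct.pack('<d', x))[0]
    if bits & (1 << 63):
        # Negative: flip all bits, so larger magnitudes map to smaller keys.
        return bits ^ ((1 << 64) - 1)
    # Non-negative: set the sign bit, so positives sort above all negatives.
    return bits | (1 << 63)
```

Sorting by this key agrees with numeric order for finite values; NaNs still need special-casing.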
Q5: How do I sort very large files that don't fit in memory?
A: Use external merge sort:
- Read chunks that fit in memory
- Sort each chunk and write to disk
- Perform k-way merge on sorted chunks
- Can be parallelized across multiple machines
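A toy sketch of those steps using only the standard library (`external_sort` and the chunk size are illustrative; real systems tune buffer sizes and parallelize the merge):

```python
import heapq
import tempfile

def _write_sorted_chunk(chunk):
    """Sort one in-memory chunk and spill it to a temp file."""
    f = tempfile.TemporaryFile(mode='w+')
    chunk.sort()
    f.writelines(f"{x}\n" for x in chunk)
    f.seek(0)
    return f

def external_sort(lines, chunk_size=100_000):
    """External merge sort over an iterable of integer lines."""
    chunk_files, chunk = [], []
    for line in lines:
        chunk.append(int(line))
        if len(chunk) >= chunk_size:
            chunk_files.append(_write_sorted_chunk(chunk))
            chunk = []
    if chunk:
        chunk_files.append(_write_sorted_chunk(chunk))
    # k-way merge: heapq.merge streams from the sorted runs lazily.
    runs = [(int(line) for line in f) for f in chunk_files]
    return list(heapq.merge(*runs))
```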
Quick Reference: Sorting Algorithm Selection
┌────────────────────────────────────────────────────────────────┐
│ SCENARIO │ ALGORITHM │
├────────────────────────────────────────────────────────────────┤
│ General purpose │ Quicksort / Timsort │
│ Worst-case O(n log n) │ Mergesort / Heapsort │
│ Small arrays (n < 50) │ Insertion Sort │
│ Nearly sorted │ Insertion Sort / Timsort │
│ Linked lists │ Merge Sort │
│ Limited integer range │ Counting Sort │
│ Large integers/strings │ Radix Sort │
│ k largest/smallest │ QuickSelect / Min-Max Heap │
│ Top k frequent │ Bucket Sort + Heap │
│ External sorting │ External Merge Sort │
└────────────────────────────────────────────────────────────────┘
Sort out your interview preparation! 📊