Why Sorting Matters
Google’s email: "Know at least one or two O(n log n) sorting algorithms (e.g., QuickSort, Merge Sort). Avoid Bubble Sort."
Sorting is a prerequisite for many algorithms: binary search, two-pointer, and merge patterns all require sorted data.
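As a quick illustration of the point above, here is a minimal Python sketch showing binary search on data that must first be sorted (the list contents are made up for the example):

```python
import bisect

nums = sorted([7, 2, 9, 4, 4])   # [2, 4, 4, 7, 9]
# bisect performs binary search -- it only works on sorted data
idx = bisect.bisect_left(nums, 7)
print(idx, nums[idx])            # index of 7 in the sorted list
```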
Sorting Algorithms Comparison
| Algorithm | Time (avg) | Time (worst) | Space | Stable? |
|---|---|---|---|---|
| Merge Sort | O(n log n) | O(n log n) | O(n) | ✅ Yes |
| Quick Sort | O(n log n) | O(n²) | O(log n) | ❌ No |
| Heap Sort | O(n log n) | O(n log n) | O(1) | ❌ No |
| Tim Sort (Python) | O(n log n) | O(n log n) | O(n) | ✅ Yes |
| Counting Sort | O(n + k) | O(n + k) | O(k) | ✅ Yes |
| Bubble Sort | O(n²) | O(n²) | O(1) | ✅ Yes — never use this |
Stable means equal elements keep their original relative order. This matters when sorting by multiple keys.
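A short Python sketch of why stability matters for multi-key sorting (the records are made up for the example). Python's `sorted()` is Timsort and guaranteed stable, so you can sort by the secondary key first, then by the primary key:

```python
records = [("carol", 2), ("bob", 1), ("alice", 2), ("dave", 1)]

# Pass 1: secondary key (name). Pass 2: primary key (count).
by_name = sorted(records, key=lambda r: r[0])
by_count = sorted(by_name, key=lambda r: r[1])

# Because the second sort is stable, records with equal counts
# keep the alphabetical order established by the first sort:
print(by_count)  # [('bob', 1), ('dave', 1), ('alice', 2), ('carol', 2)]
```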
Merge Sort — The "Divide and Conquer" Sort
The KEY sorting algorithm to know, and a canonical example of the divide-and-conquer paradigm.
Strategy: Split the array in half recursively until single elements, then MERGE sorted halves back together.
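The split-then-merge strategy above can be sketched in Python roughly as follows (a minimal version, not tuned for performance):

```python
def merge_sort(arr):
    """Divide and conquer: O(n log n) time, O(n) extra space."""
    if len(arr) <= 1:          # base case: single element is sorted
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    return merge(left, right)

def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:   # <= keeps the sort stable
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    out.extend(left[i:])          # one of these is already empty
    out.extend(right[j:])
    return out

print(merge_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```

Note the `<=` in the merge step: it is what makes merge sort stable, since ties are resolved in favor of the left (earlier) half.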
Quick Sort — The "Partition" Sort
Average case is O(n log n) and usually faster than merge sort in practice (better constants, in-place partitioning), but the worst case is O(n²) — e.g., already-sorted input with a naive first-element pivot.
Strategy: Pick a PIVOT, partition elements into "less than pivot" and "greater than pivot", recurse on each partition.
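A minimal in-place Python sketch of the pivot-and-partition strategy, using Lomuto partitioning with a random pivot (one common pivot-selection strategy that guards against the sorted-input worst case):

```python
import random

def quick_sort(arr, lo=0, hi=None):
    """In-place quicksort: partition, then recurse on each side."""
    if hi is None:
        hi = len(arr) - 1
    if lo < hi:
        p = partition(arr, lo, hi)
        quick_sort(arr, lo, p - 1)   # elements less than the pivot
        quick_sort(arr, p + 1, hi)   # elements greater than or equal
    return arr

def partition(arr, lo, hi):
    """Lomuto partition: place pivot at its final sorted position."""
    # Random pivot makes the O(n^2) worst case unlikely in practice.
    r = random.randint(lo, hi)
    arr[r], arr[hi] = arr[hi], arr[r]
    pivot = arr[hi]
    i = lo
    for j in range(lo, hi):
        if arr[j] < pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[i], arr[hi] = arr[hi], arr[i]  # move pivot into place
    return i

print(quick_sort([3, 6, 1, 5, 2, 4]))  # [1, 2, 3, 4, 5, 6]
```

Swapping the randomly chosen pivot to the end lets the rest of the partition loop stay identical to the textbook Lomuto scheme.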
Exercises
- Merge Sort — full implementation with detailed explanation
- Quick Sort — with pivot selection strategies
- Kth Largest Element — QuickSelect algorithm