Sorting Algorithms Interview Guide: Quicksort, Mergesort, Counting Sort

Why Sorting Algorithms Matter in Interviews

Sorting knowledge signals deep CS understanding. Interviewers use sorting questions to evaluate time/space complexity analysis, understanding of divide-and-conquer, judgment about when to choose different algorithms, and creative application of sorting (e.g., as a preprocessing step). You will rarely be asked to implement a sorting algorithm from scratch, but you must be able to explain the trade-offs and know that O(n log n) is the comparison sort lower bound.

Quicksort

Average O(n log n), worst O(n²) time; O(log n) space for the recursion stack. In-place partitioning: choose a pivot and rearrange the array so that every element smaller than the pivot lies to its left and every larger element to its right, then recurse on both sides of the pivot.

Partition (Lomuto scheme): set pivot = nums[right]. Use store_index = left. Iterate i from left to right-1: if nums[i] <= pivot, swap nums[i] with nums[store_index], increment store_index. Swap pivot with nums[store_index]. Pivot is now at store_index (its final position).
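A minimal sketch of quicksort with the Lomuto partition described above, with a random pivot swapped to the end before partitioning (the mitigation discussed below):

```python
import random

def quicksort(nums, left=0, right=None):
    """In-place quicksort using the Lomuto partition with a random pivot."""
    if right is None:
        right = len(nums) - 1
    if left >= right:
        return
    # Random pivot choice guards against the sorted-input O(n^2) worst case.
    pivot_idx = random.randint(left, right)
    nums[pivot_idx], nums[right] = nums[right], nums[pivot_idx]
    pivot = nums[right]
    store = left
    for i in range(left, right):
        if nums[i] <= pivot:
            nums[i], nums[store] = nums[store], nums[i]
            store += 1
    nums[store], nums[right] = nums[right], nums[store]  # pivot to final spot
    quicksort(nums, left, store - 1)
    quicksort(nums, store + 1, right)
```

Note that after the final swap, nums[store] is in its permanently sorted position, so the recursion can exclude it.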

Worst case O(n²): sorted input with always-first-element pivot. Mitigation: random pivot selection, or choose median of first/middle/last elements (median-of-3).

Why used in practice: excellent cache performance (in-place, sequential memory access), low constant factor. Python’s sorted() uses Timsort (merge sort + insertion sort); Java Arrays.sort() for primitives uses dual-pivot quicksort.

Mergesort

O(n log n) time always (best/average/worst); O(n) space for merge buffer. Stable sort (equal elements maintain relative order). Divide array in half, recursively sort each half, merge the two sorted halves.

When to prefer mergesort: when stability matters (sort by secondary key after primary key), when sorting linked lists (merge step is O(n) without random access), when guaranteed O(n log n) is required (can’t afford quicksort’s O(n²) worst case for sensitive applications like database query planners).
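A minimal top-down mergesort sketch matching the description above; the `<=` in the merge step is what makes it stable:

```python
def mergesort(nums):
    """Stable top-down mergesort; returns a new sorted list.
    O(n log n) time in all cases, O(n) extra space for the merge buffer."""
    if len(nums) <= 1:
        return nums[:]
    mid = len(nums) // 2
    left = mergesort(nums[:mid])
    right = mergesort(nums[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:   # <= preserves the order of equal elements
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])       # at most one of these two is non-empty
    merged.extend(right[j:])
    return merged
```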

Heapsort

O(n log n) time; O(1) space (in-place). Not stable. Build a max-heap from the array (O(n) via Floyd’s algorithm, starting from the last non-leaf). Repeatedly extract the max (swap root with last element, reduce heap size by 1, heapify down). Produces a sorted array in ascending order. Less cache-friendly than quicksort (accesses scattered memory locations during heap operations) → slower in practice despite same asymptotic complexity.
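A sketch of the steps above: Floyd's bottom-up heap construction followed by repeated max extraction, all in place:

```python
def heapsort(nums):
    """In-place heapsort: build a max-heap, then repeatedly move the max to the end."""
    n = len(nums)

    def sift_down(root, end):
        # Push nums[root] down until the max-heap property holds within nums[:end].
        while True:
            child = 2 * root + 1
            if child >= end:
                return
            if child + 1 < end and nums[child + 1] > nums[child]:
                child += 1            # pick the larger child
            if nums[root] >= nums[child]:
                return
            nums[root], nums[child] = nums[child], nums[root]
            root = child

    # Floyd's O(n) heap construction: sift down from the last non-leaf node.
    for i in range(n // 2 - 1, -1, -1):
        sift_down(i, n)
    # Extract max: swap root with last heap element, shrink heap, restore heap.
    for end in range(n - 1, 0, -1):
        nums[0], nums[end] = nums[end], nums[0]
        sift_down(0, end)
```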

Counting Sort

O(n + k) time and space, where k is the range of values. For values in range [0, k]: create a count array of size k+1. Increment count[value] for each element. Compute prefix sums of count (determines final position). Build output array: for each element, place at count[element]-1, decrement count[element]. Stable sort. Only applicable when k is small relative to n (e.g., sort by age 0-120, sort by single character, sort bytes 0-255).
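The count / prefix-sum / placement steps above can be sketched as follows; the backwards placement pass is what makes the sort stable:

```python
def counting_sort(nums, k):
    """Stable counting sort for integers in the range [0, k]. O(n + k) time and space."""
    count = [0] * (k + 1)
    for v in nums:
        count[v] += 1
    # Prefix sums: count[v] becomes the number of elements <= v,
    # i.e. one past the final output slot of the last copy of v.
    for v in range(1, k + 1):
        count[v] += count[v - 1]
    out = [0] * len(nums)
    # Walk the input backwards so equal elements keep their relative order.
    for v in reversed(nums):
        count[v] -= 1
        out[count[v]] = v
    return out
```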

Radix Sort

O(d × (n + k)) time where d = number of digits and k = base. Sort by least significant digit first (LSD radix sort), using a stable sort (counting sort) for each digit. After d passes, array is fully sorted. For 32-bit integers with base 256: d=4 passes, k=256. Total: 4 × (n + 256) = O(n) for large n — linear time for integers. Requires stable inner sort to maintain correct relative ordering across digit passes.
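A sketch of LSD radix sort for non-negative integers with base 256 (one byte per pass), using a stable counting pass per digit as described above:

```python
def radix_sort(nums):
    """LSD radix sort for non-negative integers, base 256 (one byte per pass)."""
    out = list(nums)
    if not out:
        return out
    shift = 0
    while max(out) >> shift:               # one pass per base-256 digit
        count = [0] * 256
        for v in out:
            count[(v >> shift) & 0xFF] += 1
        for d in range(1, 256):            # prefix sums, as in counting sort
            count[d] += count[d - 1]
        nxt = [0] * len(out)
        for v in reversed(out):            # backwards pass keeps it stable
            d = (v >> shift) & 0xFF
            count[d] -= 1
            nxt[count[d]] = v
        out = nxt
        shift += 8
    return out
```

Handling negative integers would need an extra step (e.g., offsetting values or a final sign-splitting pass), omitted here for brevity.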

Timsort (Python’s sorted())

Timsort is a hybrid of merge sort and insertion sort used in Python, Java (for objects), and many other languages. It exploits natural runs (already-sorted subsequences) in real data. A minimum run length (minrun) is chosen between 32 and 64 elements; any natural run shorter than minrun is extended to that length with insertion sort (fast for small arrays, with good cache behavior). Runs are then merged as in merge sort. Best case: O(n) for already-sorted or reverse-sorted data (one long natural run; descending runs are reversed in place). Worst case: O(n log n). Stable. The practical winner for general-purpose sorting.

Comparison Sort Lower Bound

Any comparison-based sorting algorithm requires at least Ω(n log n) comparisons in the worst case. Proof sketch: there are n! possible orderings of n elements. Each comparison eliminates at most half the remaining orderings. To distinguish between n! orderings: log₂(n!) comparisons needed. By Stirling’s approximation: log₂(n!) = Ω(n log n). This is why quicksort, mergesort, and heapsort are all optimal comparison sorts. Non-comparison sorts (counting, radix, bucket) bypass this bound by using value information, not just comparisons.
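The bound can be checked numerically. A small sketch computing ceil(log₂(n!)), the information-theoretic minimum number of worst-case comparisons (using lgamma, since lgamma(n+1) = ln(n!)):

```python
import math

def min_worst_case_comparisons(n):
    """ceil(log2(n!)): the information-theoretic floor on worst-case comparisons
    needed by any comparison sort of n distinct elements."""
    return math.ceil(math.lgamma(n + 1) / math.log(2))
```

For example, n = 4 gives ceil(log₂(24)) = 5, matching the known result that sorting 4 elements requires 5 comparisons in the worst case.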

Sorting in Interview Problem Solving

Sorting as preprocessing unlocks O(n log n) solutions to O(n²) problems:

  • Two Sum (sorted array): two pointers from both ends → O(n) after sort
  • 3Sum: sort, fix one element, two pointers for the other two → O(n²)
  • Meeting Rooms: sort by start time → O(n log n) + O(n) linear scan
  • Merge Intervals: sort by start → merge overlapping in one pass
  • Largest Number (LC 179): custom comparator: compare a+b vs b+a as strings
  • H-Index (LC 274): sort descending, find largest i where citations[i] >= i+1
  • Closest K Points to Origin: sort by distance² or use a max-heap of size k
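As one illustration of the pattern, the first bullet (Two Sum on a sorted array) can be sketched with two pointers converging from both ends:

```python
def two_sum_sorted(nums, target):
    """Two pointers on a sorted array: O(n) scan after the O(n log n) sort.
    Returns a pair of indices whose values sum to target, or None."""
    lo, hi = 0, len(nums) - 1
    while lo < hi:
        s = nums[lo] + nums[hi]
        if s == target:
            return lo, hi
        if s < target:
            lo += 1   # sum too small: need a larger left value
        else:
            hi -= 1   # sum too large: need a smaller right value
    return None
```

3Sum applies the same scan inside a loop that fixes the first element, giving O(n²) overall.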

Quick Reference

Algorithm      Best        Average     Worst       Space     Stable
Quicksort      O(n log n)  O(n log n)  O(n²)       O(log n)  No
Mergesort      O(n log n)  O(n log n)  O(n log n)  O(n)      Yes
Heapsort       O(n log n)  O(n log n)  O(n log n)  O(1)      No
Timsort        O(n)        O(n log n)  O(n log n)  O(n)      Yes
Counting sort  O(n+k)      O(n+k)      O(n+k)      O(n+k)    Yes

