The Pattern
Divide and conquer splits a problem into independent subproblems, solves each recursively, and combines the results. Three steps: Divide (split input), Conquer (recurse), Combine (merge results). Key property: subproblems are independent. Master theorem determines time complexity.
Master Theorem
For recurrences T(n) = aT(n/b) + O(n^d):
- d > log_b(a): O(n^d) — combine step dominates
- d == log_b(a): O(n^d * log n) — balanced
- d < log_b(a): O(n^log_b(a)) — recursion dominates
Merge sort: a=2, b=2, d=1 → d == log_2(2) → O(n log n). Binary search: a=1, b=2, d=0 → O(log n).
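The three cases can be sketched in a few lines of Python (a minimal illustration; the function name and output strings are my own, not from the text):

```python
import math

def master_theorem(a, b, d):
    """Classify T(n) = a*T(n/b) + O(n^d) by comparing d with log_b(a)."""
    crit = math.log(a, b)              # critical exponent log_b(a)
    if math.isclose(d, crit):
        return f"O(n^{d} * log n)"     # balanced case
    if d > crit:
        return f"O(n^{d})"             # combine step dominates
    return f"O(n^{crit:g})"            # recursion dominates

# Merge sort: a=2, b=2, d=1 -> balanced -> O(n^1 * log n) = O(n log n)
# Binary search: a=1, b=2, d=0 -> O(n^0 * log n) = O(log n)
```

Checking `math.isclose` before the strict comparison avoids misclassifying the balanced case due to floating-point noise in `log_b(a)`.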
Merge Sort
Divide array in half, sort each half, merge. Stable, O(n log n) always, O(n) space.
def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    return merge(left, right)

def merge(a, b):
    result = []
    i = j = 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            result.append(a[i]); i += 1
        else:
            result.append(b[j]); j += 1
    return result + a[i:] + b[j:]
Merge sort enables counting inversions (LC 315, 493): during the merge step, when a right-side element is chosen before a left-side element, add len(a) - i to the inversion count.
Quick Select (Kth Largest, LC 215)
Find the kth largest element in average O(n) by partitioning around a pivot and recursing into only the relevant half.
import random

def findKthLargest(nums, k):
    def partition(lo, hi):
        pivot_idx = random.randint(lo, hi)
        nums[pivot_idx], nums[hi] = nums[hi], nums[pivot_idx]
        pivot = nums[hi]
        store = lo
        for i in range(lo, hi):
            if nums[i] >= pivot:
                nums[i], nums[store] = nums[store], nums[i]
                store += 1
        nums[store], nums[hi] = nums[hi], nums[store]
        return store

    lo, hi = 0, len(nums) - 1
    while lo <= hi:
        p = partition(lo, hi)
        if p == k - 1: return nums[p]
        elif p < k - 1: lo = p + 1
        else: hi = p - 1
Randomize pivot selection to avoid O(n^2) worst case on sorted input.
Count Inversions (LC 315, 493)
An inversion is a pair (i,j) where i < j but arr[i] > arr[j]. Use merge sort: when a right-side element is merged before the remaining left-side elements, add len(a) - i (the number of left-side elements still pending) to the count.
def count_inversions(arr):
    if len(arr) <= 1:
        return arr, 0
    mid = len(arr) // 2
    left, lc = count_inversions(arr[:mid])
    right, rc = count_inversions(arr[mid:])
    merged, mc = merge_count(left, right)
    return merged, lc + rc + mc

def merge_count(a, b):
    result, count = [], 0
    i = j = 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            result.append(a[i]); i += 1
        else:
            count += len(a) - i  # b[j] jumps ahead of every remaining left element
            result.append(b[j]); j += 1
    return result + a[i:] + b[j:], count
Median of Two Sorted Arrays (LC 4)
Achieve O(log(min(m,n))) by binary searching on the smaller array's partition point. For each candidate partition of the smaller array, the required partition of the larger array is forced (the two left halves together must hold half the combined elements). Check whether the partition is valid (left max <= right min on both sides); each step eliminates half of the smaller array's search range.
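The partition search described above can be sketched as follows (a minimal version for LC 4; the variable names and the ±infinity sentinels for empty sides are my own choices):

```python
def findMedianSortedArrays(A, B):
    # Ensure A is the smaller array so the search is O(log(min(m, n))).
    if len(A) > len(B):
        A, B = B, A
    m, n = len(A), len(B)
    half = (m + n + 1) // 2
    lo, hi = 0, m
    while lo <= hi:
        i = (lo + hi) // 2          # elements taken from A's left side
        j = half - i                # forced: elements from B's left side
        A_left = A[i - 1] if i > 0 else float('-inf')
        A_right = A[i] if i < m else float('inf')
        B_left = B[j - 1] if j > 0 else float('-inf')
        B_right = B[j] if j < n else float('inf')
        if A_left <= B_right and B_left <= A_right:
            # Valid partition: left halves are all <= right halves.
            if (m + n) % 2:
                return max(A_left, B_left)
            return (max(A_left, B_left) + min(A_right, B_right)) / 2
        elif A_left > B_right:
            hi = i - 1              # too many elements from A on the left
        else:
            lo = i + 1              # too few elements from A on the left
```

The sentinels let the validity check work uniformly even when a partition puts zero elements on one side of an array.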
When to Use Divide and Conquer
- Sorting problems where a merge step adds information: count inversions, sort linked list (LC 148)
- Selection: kth largest (quickselect), median of medians
- Geometric: closest pair of points, skyline problem (LC 218)
- Mathematical: matrix exponentiation (Fibonacci in O(log n)), fast power
- When subproblems are truly independent (vs DP where they overlap)
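As one concrete instance of the mathematical bullet above, here is a hedged sketch of fast power (exponentiation by squaring); the same halve-the-exponent idea underlies matrix-exponentiation Fibonacci:

```python
def fast_pow(x, n):
    """Compute x^n in O(log n) multiplications by repeated squaring (LC 50)."""
    if n < 0:
        x, n = 1 / x, -n         # x^(-n) == (1/x)^n
    result = 1.0
    while n:
        if n & 1:                # odd exponent: peel off one factor of x
            result *= x
        x *= x                   # square the base
        n >>= 1                  # halve the exponent
    return result
```

Each iteration halves n, so the loop runs O(log n) times: the divide step is "split the exponent," and the combine step is a single multiplication.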