
Randomized algorithms and tail bounds (Chernoff, Hoeffding)


Overview

Randomized algorithms use randomness to achieve simplicity, speed, or better expected guarantees. Concentration inequalities like Chernoff and Hoeffding bounds quantify how sums of independent random variables concentrate around their means, enabling high-probability performance analyses and amplification.
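For reference, the standard two-sided forms of these inequalities (as stated in most textbooks) can be written as follows:

```latex
% Hoeffding: X_1,\dots,X_n independent with X_i \in [a_i, b_i], S_n = \sum_i X_i
\Pr\bigl[\,|S_n - \mathbb{E}[S_n]| \ge t\,\bigr]
  \;\le\; 2\exp\!\left(-\frac{2t^2}{\sum_{i=1}^{n}(b_i - a_i)^2}\right)

% Chernoff (multiplicative form): X_i \in \{0,1\} independent,
% \mu = \mathbb{E}[S_n], \; 0 < \delta \le 1
\Pr\bigl[S_n \ge (1+\delta)\mu\bigr] \le \exp\!\bigl(-\delta^2\mu/3\bigr),
\qquad
\Pr\bigl[S_n \le (1-\delta)\mu\bigr] \le \exp\!\bigl(-\delta^2\mu/2\bigr)
```

Both say the same thing at different granularities: Hoeffding needs only boundedness and gives additive deviations, while Chernoff exploits the Bernoulli structure to give multiplicative deviations around the mean.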


Details

  • Randomized paradigms: Las Vegas (always correct, randomized running time) vs. Monte Carlo (bounded running time, small error probability).
  • Amplification: repeat and take majority/median to drive error probability down exponentially.
  • Hoeffding’s inequality: sums of bounded independent random variables concentrate around their mean with sub-Gaussian tails.
  • Chernoff bounds: sums of independent Bernoulli variables (or, more generally, variables with controlled moment generating functions) admit multiplicative tail bounds decaying exponentially in the mean.
  • Applications: randomized quicksort with expected O(n log n) comparisons; hashing and load balancing; sampling and estimation; reservoir sampling.
  • Median-of-means for robust mean estimation; pairwise independence vs. full independence; Chernoff-type bounds under limited independence (high-level).
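As a concrete sketch of amplification, suppose we have a Monte Carlo subroutine that returns the correct answer with probability 0.75 (the subroutine and the value 0.75 are hypothetical, chosen for illustration). Majority voting over k independent runs drives the error probability down exponentially in k, and Hoeffding's inequality quantifies the rate:

```python
import math
import random

def noisy_answer(p_correct=0.75):
    """Hypothetical Monte Carlo subroutine: returns the correct bit (1)
    with probability p_correct, otherwise the wrong bit (0)."""
    return 1 if random.random() < p_correct else 0

def amplified(k, p_correct=0.75):
    """Run the subroutine k times and return the majority vote."""
    votes = sum(noisy_answer(p_correct) for _ in range(k))
    return 1 if votes > k / 2 else 0

def hoeffding_error_bound(k, p_correct):
    """Hoeffding upper bound on the majority vote being wrong: the
    empirical mean must fall below 1/2, a deviation of p_correct - 1/2,
    so P(error) <= exp(-2 k (p_correct - 1/2)^2)."""
    return math.exp(-2 * k * (p_correct - 0.5) ** 2)
```

For example, with p_correct = 0.75 the bound at k = 100 is exp(-12.5), already below 10^-5, even though each individual run errs a quarter of the time.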
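The median-of-means estimator mentioned above can be sketched in a few lines (a minimal illustration, not tuned for production use): split the samples into k groups, average within each group, and take the median of the group means. The median step makes the estimate robust, since a constant fraction of bad group means cannot move it far.

```python
import statistics

def median_of_means(samples, k):
    """Estimate the mean of `samples` by splitting them into k
    equal-size groups, averaging each group, and returning the
    median of the k group means. A Hoeffding/Chernoff argument
    shows the failure probability decays exponentially in k."""
    n = len(samples) // k * k   # drop a remainder so groups are equal
    group_size = n // k
    means = [
        sum(samples[i * group_size:(i + 1) * group_size]) / group_size
        for i in range(k)
    ]
    return statistics.median(means)
```

A typical choice is k on the order of log(1/δ) groups for target failure probability δ; each group mean is then accurate with constant probability, and the median boosts this to probability 1 - δ.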

Exercises

  1. Analyze the expected number of comparisons in randomized quicksort and show it is 2n ln n + O(n).
  2. Apply a Chernoff bound to show that in m balls thrown into n bins uniformly at random, the maximum load is O((m/n) + log n/log log n) with high probability (outline).
  3. Design a median-of-means estimator for sub-Gaussian variables and prove a high-probability error bound using Hoeffding.