Course topic: Algorithm Design and Analysis > 11 Randomized Algorithms > 11.4 Chernoff Bounds > Chernoff Bounds
The Chernoff bound is a powerful tool that is widely used in the analysis of randomized algorithms.
Theorem. Let X_1, …, X_n be independent 0-1 random variables, and let X = X_1 + … + X_n. For any μ ≥ E[X] and any δ > 0, we have

Pr[X > (1+δ)μ] < [e^δ / (1+δ)^(1+δ)]^μ

This theorem bounds the probability that a sum of random variables exceeds its expectation.
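As an illustration (my own sketch, not part of the original lecture), the following Python snippet compares this upper-tail bound against an empirical frequency for a sum of independent Bernoulli variables; the parameter choices n = 100, p = 0.1, δ = 1 are arbitrary:

```python
import math
import random

def chernoff_upper(mu, delta):
    """Upper-tail Chernoff bound: Pr[X > (1+delta)*mu] < [e^delta / (1+delta)^(1+delta)]^mu."""
    return (math.exp(delta) / (1 + delta) ** (1 + delta)) ** mu

def empirical_tail(n, p, delta, trials=20000, seed=0):
    """Estimate Pr[X > (1+delta)*mu] for X a sum of n independent Bernoulli(p) variables."""
    rng = random.Random(seed)
    mu = n * p  # here mu = E[X] exactly
    threshold = (1 + delta) * mu
    hits = sum(
        1 for _ in range(trials)
        if sum(rng.random() < p for _ in range(n)) > threshold
    )
    return hits / trials

n, p, delta = 100, 0.1, 1.0          # mu = 10, threshold = 20
bound = chernoff_upper(n * p, delta)  # (e/4)^10, roughly 0.021
freq = empirical_tail(n, p, delta)
print(f"Chernoff bound: {bound:.5f}, empirical frequency: {freq:.5f}")
```

The empirical frequency should come out well below the bound, since the Chernoff bound is an upper estimate that is loose for small deviations.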
The next theorem bounds the probability that the sum falls below its expectation.
Theorem. Let X_1, …, X_n be independent 0-1 random variables, and let X = X_1 + … + X_n. For any μ ≤ E[X] and any 0 < δ < 1, we have

Pr[X < (1−δ)μ] < e^(−δ²μ/2)

Load balancing is a familiar problem; let us revisit it with a randomized algorithm. In the load-balancing problem, there are m jobs to be processed on n identical processors, and we want to distribute the jobs as evenly as possible so as to balance the load across processors. With a central controller, the jobs can be assigned almost perfectly evenly, so that each processor handles at most ⌈m/n⌉ jobs. In practice there is often no central controller, and each job is independently assigned to a processor chosen uniformly at random. The question is: how many jobs does the most heavily loaded processor receive?

This is a randomized algorithm, and the quantity we want to analyze is a random variable. Consider the case m = n. Let X_i be the number of jobs assigned to processor i, and let Y_ij = 1 if job j is assigned to processor i and Y_ij = 0 otherwise. Job j is assigned to processor i with probability 1/n, so E[Y_ij] = 1/n. Since X_i = Σ_j Y_ij, we have μ = E[X_i] = 1.

Applying the Chernoff bound with δ = c − 1, we get

Pr[X_i > c] < e^(c−1) / c^c < (e/c)^c

Let γ(n) be the value x satisfying x^x = n, and set c = e·γ(n). Then

Pr[X_i > c] < (e/c)^c = γ(n)^(−e·γ(n)) = (γ(n)^γ(n))^(−e) = n^(−e) ≤ 1/n²

By the union bound, the probability that some X_i exceeds c is at most n · (1/n²) = 1/n, where c = e·γ(n) = Θ(log n / log log n). In other words, with probability at least 1 − 1/n, every processor handles at most Θ(log n / log log n) jobs.
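The analysis above can be checked experimentally. The following sketch (my own illustration, not from the lecture) throws n jobs onto n processors uniformly at random and compares the worst observed maximum load against the bound c = e·γ(n); the helper gamma solves x^x = n numerically by binary search:

```python
import math
import random
from collections import Counter

def max_load(n, seed=0):
    """Assign n jobs to n processors uniformly at random; return the maximum load."""
    rng = random.Random(seed)
    loads = Counter(rng.randrange(n) for _ in range(n))
    return max(loads.values())

def gamma(n):
    """Solve x^x = n for x, i.e. x*log(x) = log(n), by binary search on x >= 1."""
    target = math.log(n)
    lo, hi = 1.0, max(2.0, target)  # x*log(x) is increasing for x >= 1
    for _ in range(100):
        mid = (lo + hi) / 2
        if mid * math.log(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

n = 10**5
c = math.e * gamma(n)  # the high-probability bound e*gamma(n) = Theta(log n / loglog n)
observed = max(max_load(n, seed=s) for s in range(20))
print(f"bound c = {c:.2f}, worst max load over 20 trials = {observed}")
```

The theorem says each individual trial stays below c with probability at least 1 − 1/n, so for n = 10^5 even the worst of 20 trials should sit comfortably under the bound.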