Chernoff Bounds


The Chernoff bound is a powerful tool that is widely used in the analysis of randomized algorithms.

Theorem. Suppose X_1, …, X_n are independent 0-1 random variables, and let X = X_1 + … + X_n. Then for any μ ≥ E[X] and any δ > 0, we have

    Pr[X > (1+δ)μ] < [ e^δ / (1+δ)^(1+δ) ]^μ

This theorem bounds the probability that the sum of the random variables exceeds its expected value. The next theorem bounds the probability that the sum falls below its expected value.

Theorem. Suppose X_1, …, X_n are independent 0-1 random variables, and let X = X_1 + … + X_n. Then for any μ ≤ E[X] and any 0 < δ < 1, we have

    Pr[X < (1-δ)μ] < e^(-δ²μ/2)
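
To see the two bounds in action, here is a small Monte Carlo sketch (my own illustration, not part of the lecture). X is a sum of n independent Bernoulli(p) variables, so E[X] = np; the values of n, p, δ, and the number of trials are arbitrary choices for the example.

```python
import math
import random

# Illustration only: compare empirical tail probabilities of a sum of
# independent 0-1 random variables with the two Chernoff bounds above.
# All parameters are arbitrary choices for the sake of the example.
n, p, delta, trials = 200, 0.1, 0.5, 20_000
mu = n * p  # here mu = E[X] exactly, so both theorems apply

upper = lower = 0
for _ in range(trials):
    x = sum(1 for _ in range(n) if random.random() < p)
    upper += x > (1 + delta) * mu
    lower += x < (1 - delta) * mu

# Upper tail: Pr[X > (1+d)mu] < [e^d / (1+d)^(1+d)]^mu
upper_bound = (math.exp(delta) / (1 + delta) ** (1 + delta)) ** mu
# Lower tail: Pr[X < (1-d)mu] < e^(-d^2 * mu / 2)
lower_bound = math.exp(-delta ** 2 * mu / 2)

print(f"upper tail: empirical {upper / trials:.4f}, bound {upper_bound:.4f}")
print(f"lower tail: empirical {lower / trials:.4f}, bound {lower_bound:.4f}")
```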

Load balancing is a problem we are already familiar with. Let us now look at a randomized algorithm applied to it.

The load balancing problem is as follows: there are m jobs that need to be processed on n identical processors, and we would like to assign the jobs as evenly as possible so as to balance the load across the processors.

If there is a central controller, we can assign the jobs almost evenly, so that each processor handles at most ⌈m/n⌉ jobs.
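
For comparison, here is a minimal sketch (my own, not from the lecture) of what such a centralized assignment could look like: handing out the jobs round-robin gives every processor at most ⌈m/n⌉ of them.

```python
import math

def round_robin(m, n):
    """Centrally assign jobs 0..m-1 to processors 0..n-1 in round-robin order."""
    load = [0] * n
    for j in range(m):
        load[j % n] += 1
    return load

m, n = 10, 4                              # example sizes, chosen arbitrarily
load = round_robin(m, n)
print(load, max(load), math.ceil(m / n))  # the maximum load equals ceil(m/n) = 3
```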

In reality, however, there is often no central controller: each job is independently assigned to a processor chosen uniformly at random. The question is: how many jobs will the most heavily loaded processor end up handling?

This is a randomized algorithm, and the quantity we want to examine is a random variable.

First, let the random variable X_i be the number of jobs assigned to processor i. Let Y_ij = 1 if job j is assigned to processor i, and Y_ij = 0 otherwise. Since job j is assigned to processor i with probability 1/n, we have E[Y_ij] = 1/n. Because X_i = Σ_j Y_ij, linearity of expectation gives E[X_i] = m/n; in the case m = n analyzed here, we can take μ = E[X_i] = 1.

Applying the Chernoff bound with δ = c − 1 (so that (1+δ)μ = c), we obtain

    Pr[X_i > c] < e^(c−1) / c^c < (e/c)^c

Now let γ(n) denote the value x satisfying x^x = n, and set c = eγ(n). Then

    Pr[X_i > c] < (e/c)^c = (1/γ(n))^(eγ(n)) < (1/γ(n))^(2γ(n)) = 1/n²
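
As a numerical sanity check (my own sketch, not from the lecture), γ(n) can be computed by solving x^x = n with bisection; setting c = eγ(n), one can then confirm that (1/γ(n))^(eγ(n)) is indeed below 1/n².

```python
import math

def gamma(n, iters=100):
    """Solve x**x = n for x >= 1 by bisection (the gamma(n) of the analysis)."""
    lo, hi = 1.0, math.log(n) + 2    # the solution lies below log(n) + 2 for n >= 2
    for _ in range(iters):
        mid = (lo + hi) / 2
        if mid ** mid < n:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

for n in (10, 100, 10_000, 10**6):
    g = gamma(n)
    c = math.e * g                       # threshold c = e * gamma(n)
    tail = (1 / g) ** (math.e * g)       # (1/gamma(n))^(e*gamma(n))
    print(f"n={n:>7}  gamma={g:.3f}  c={c:.2f}  tail={tail:.2e}  1/n^2={1 / n**2:.2e}")
```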

By the union bound, the probability that there exists some X_i with X_i > c is at most n · (1/n²) = 1/n, where c = eγ(n) = Θ(log n / log log n).

In other words, with probability at least 1 − 1/n, the number of jobs handled by every processor is at most c = Θ(log n / log log n).
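
Finally, a short simulation sketch (again my own, with an arbitrarily chosen n and number of repetitions): throw n jobs onto n processors independently and uniformly at random, and compare the maximum load that actually occurs with the threshold c = eγ(n) from the analysis.

```python
import math
import random

def max_load(n):
    """Assign n jobs to n processors independently and uniformly at random,
    and return the load of the busiest processor."""
    load = [0] * n
    for _ in range(n):
        load[random.randrange(n)] += 1
    return max(load)

def gamma(n, iters=100):
    """Solve x**x = n by bisection, as in the previous sketch."""
    lo, hi = 1.0, math.log(n) + 2
    for _ in range(iters):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if mid ** mid < n else (lo, mid)
    return (lo + hi) / 2

n = 10_000                           # example size, chosen arbitrarily
c = math.e * gamma(n)                # the high-probability threshold from the analysis
runs = [max_load(n) for _ in range(20)]
print("maximum loads over 20 runs:", runs)
print(f"threshold c = e*gamma(n) = {c:.2f}")
```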
