Kolmogorov's inequality
In probability theory, Kolmogorov's inequality is a so-called "maximal inequality" that gives a bound on the probability that the partial sums of a finite collection of independent random variables exceed some specified bound. The inequality is named after the Russian mathematician Andrey Kolmogorov.

Statement of the inequality
Let $X_1, \dots, X_n : \Omega \to \mathbf{R}$ be independent random variables defined on a common probability space $(\Omega, F, \Pr)$, with expected value $\mathrm{E}[X_k] = 0$ and variance $\mathrm{Var}[X_k] < +\infty$ for $k = 1, \dots, n$. Then, for each $\lambda > 0$,

\[
\Pr\left( \max_{1 \leq k \leq n} |S_k| \geq \lambda \right)
\leq \frac{1}{\lambda^2} \operatorname{Var}[S_n]
\equiv \frac{1}{\lambda^2} \sum_{k=1}^n \operatorname{Var}[X_k],
\]

where $S_k = X_1 + \cdots + X_k$.
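The bound can be illustrated numerically. The following Monte Carlo sketch (not part of the original article; all function and variable names are illustrative) estimates the left-hand side for a simple $\pm 1$ random walk and compares it with the right-hand side $\operatorname{Var}[S_n]/\lambda^2 = n/\lambda^2$:

```python
import random

def kolmogorov_check(n=20, lam=10.0, trials=20000, seed=1):
    """Estimate Pr(max_k |S_k| >= lam) and compare it with the
    Kolmogorov bound Var[S_n] / lam^2.

    The X_i are i.i.d. uniform on {-1, +1}, so E[X_i] = 0 and
    Var[X_i] = 1, giving Var[S_n] = n.
    """
    rng = random.Random(seed)
    exceed = 0
    for _ in range(trials):
        s = 0
        max_abs = 0
        for _ in range(n):
            s += rng.choice((-1, 1))          # one step X_i of the walk
            max_abs = max(max_abs, abs(s))    # track max_k |S_k|
        if max_abs >= lam:
            exceed += 1
    empirical = exceed / trials   # estimated left-hand side
    bound = n / lam ** 2          # right-hand side: Var[S_n] / lambda^2
    return empirical, bound
```

With these defaults the bound is $n/\lambda^2 = 0.2$, while the empirical frequency is considerably smaller; as with Chebyshev-type bounds, the inequality need not be tight.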
Proof
The following argument is due to Kareem Amin and employs discrete martingales. As argued in the discussion of Doob's martingale inequality, the sequence $S_1, S_2, \dots, S_n$ is a martingale. Without loss of generality, we can assume that $S_0 = 0$ and $S_i \geq 0$ for all $i$. Define $(Z_i)_{i=0}^n$ as follows. Let $Z_0 = 0$, and

\[
Z_{i+1} = \begin{cases}
S_{i+1} & \text{if } \displaystyle\max_{1 \leq j \leq i} S_j < \lambda \\
Z_i & \text{otherwise}
\end{cases}
\]

for all $i$. Then $(Z_i)_{i=0}^n$ is also a martingale. Since $\mathrm{E}[S_i] = \mathrm{E}[S_{i-1}]$ for all $i$ and $\mathrm{E}[\mathrm{E}[X \mid Y]] = \mathrm{E}[X]$ by the law of total expectation,

\begin{align}
\sum_{i=1}^n \mathrm{E}[(S_i - S_{i-1})^2]
&= \sum_{i=1}^n \mathrm{E}\left[ S_i^2 - 2 S_i S_{i-1} + S_{i-1}^2 \right] \\
&= \sum_{i=1}^n \mathrm{E}\left[ S_i^2 - 2\,\mathrm{E}[S_i S_{i-1} \mid S_{i-1}] + \mathrm{E}[S_{i-1}^2 \mid S_{i-1}] \right] \\
&= \sum_{i=1}^n \mathrm{E}\left[ S_i^2 - 2\,\mathrm{E}[S_{i-1}^2 \mid S_{i-1}] + \mathrm{E}[S_{i-1}^2 \mid S_{i-1}] \right] \\
&= \mathrm{E}[S_n^2] - \mathrm{E}[S_0^2] = \mathrm{E}[S_n^2].
\end{align}

The same is true for $(Z_i)_{i=0}^n$. Thus

\begin{align}
\Pr\left( \max_{1 \leq i \leq n} S_i \geq \lambda \right)
&= \Pr[Z_n \geq \lambda] \\
&\leq \frac{1}{\lambda^2} \mathrm{E}[Z_n^2]
 = \frac{1}{\lambda^2} \sum_{i=1}^n \mathrm{E}[(Z_i - Z_{i-1})^2] \\
&\leq \frac{1}{\lambda^2} \sum_{i=1}^n \mathrm{E}[(S_i - S_{i-1})^2]
 = \frac{1}{\lambda^2} \mathrm{E}[S_n^2]
 = \frac{1}{\lambda^2} \mathrm{Var}[S_n]
\end{align}

by Chebyshev's inequality.
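The freezing construction of $(Z_i)$ above can also be checked numerically. The sketch below (illustrative only, not part of the proof; names are assumptions) builds $Z$ from a sample path of partial sums, confirms on each path that the events $\{Z_n \geq \lambda\}$ and $\{\max_i S_i \geq \lambda\}$ coincide, and compares the empirical values of $\mathrm{E}[Z_n^2]$ and $\mathrm{E}[S_n^2]$:

```python
import random

def stopped_copy(S, lam):
    """Given partial sums S = [S_1, ..., S_n], build Z as in the proof:
    Z_{i+1} = S_{i+1} while max_{j <= i} S_j < lam, frozen afterwards."""
    Z, frozen, running_max = [], None, float("-inf")
    for s in S:
        if frozen is None:
            Z.append(s)
            running_max = max(running_max, s)
            if running_max >= lam:
                frozen = s  # S first reached lam at this step; Z stays here
        else:
            Z.append(frozen)
    return Z

def compare(n=30, lam=4, trials=5000, seed=2):
    """Empirical E[Z_n^2] vs E[S_n^2] over +/-1 random-walk paths."""
    rng = random.Random(seed)
    ez2 = es2 = 0.0
    for _ in range(trials):
        S, s = [], 0
        for _ in range(n):
            s += rng.choice((-1, 1))
            S.append(s)
        Z = stopped_copy(S, lam)
        # On every path, Z_n >= lam exactly when the running max reached lam.
        assert (Z[-1] >= lam) == (max(S) >= lam)
        ez2 += Z[-1] ** 2
        es2 += S[-1] ** 2
    return ez2 / trials, es2 / trials
```

Because $Z$ discards the fluctuations of $S$ after the level $\lambda$ is first reached, the empirical second moment of $Z_n$ comes out no larger than that of $S_n$, matching the step $\mathrm{E}[(Z_i - Z_{i-1})^2] \leq \mathrm{E}[(S_i - S_{i-1})^2]$ used in the proof.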
See also

* Chebyshev's inequality
* Doob's martingale inequality
* Etemadi's inequality
* Landau–Kolmogorov inequality
* Markov's inequality