Kolmogorov's inequality

In probability theory, Kolmogorov's inequality is a so-called "maximal inequality": it gives a bound on the probability that the partial sums of a finite collection of independent random variables exceed some specified bound. The inequality is named after the Russian mathematician Andrey Kolmogorov.

Statement of the inequality

Let "X"1, ..., "X""n" : Ω → R be independent random variables defined on a common probability space (Ω, "F", Pr), with expected value E ["X""k"] = 0 and variance Var ["X""k"] < +∞ for "k" = 1, ..., "n". Then, for each λ > 0,

:Pr left(max_{1leq kleq n} | S_k |geqlambda ight)leq frac{1}{lambda^2} operatorname{Var} [S_n] equiv frac{1}{lambda^2}sum_{k=1}^n operatorname{Var} [X_k] ,

where "S""k" = "X"1 + ... + "X""k".

Proof

The following argument is due to Kareem Amin and employs discrete martingales. As argued in the discussion of Doob's martingale inequality, the sequence S_1, S_2, ..., S_n is a martingale. Set S_0 = 0, and define (Z_i)_{i=0}^n as follows: let Z_0 = 0 and

:Z_{i+1} = \begin{cases} S_{i+1} & \text{if } \displaystyle\max_{1 \leq j \leq i} |S_j| < \lambda \\ Z_i & \text{otherwise} \end{cases}

for all i. Then (Z_i)_{i=0}^n is also a martingale: it follows S until the first time some |S_j| reaches λ, and is frozen at that value afterwards. In particular, max_{1 ≤ i ≤ n} |S_i| ≥ λ holds exactly when |Z_n| ≥ λ.

Since E[S_i] = E[S_{i-1}] for all i and E[E[X|Y]] = E[X] by the law of total expectation,

:\begin{align}
\sum_{i=1}^n \mathrm{E}[(S_i - S_{i-1})^2] &= \sum_{i=1}^n \mathrm{E}[S_i^2 - 2 S_i S_{i-1} + S_{i-1}^2] \\
&= \sum_{i=1}^n \mathrm{E}\left[S_i^2 - 2\,\mathrm{E}[S_i S_{i-1} \mid S_{i-1}] + \mathrm{E}[S_{i-1}^2 \mid S_{i-1}]\right] \\
&= \sum_{i=1}^n \mathrm{E}\left[S_i^2 - 2\,\mathrm{E}[S_{i-1}^2 \mid S_{i-1}] + \mathrm{E}[S_{i-1}^2 \mid S_{i-1}]\right] \\
&= \mathrm{E}[S_n^2] - \mathrm{E}[S_0^2] = \mathrm{E}[S_n^2],
\end{align}

where the third line uses the martingale property E[S_i S_{i-1} | S_{i-1}] = S_{i-1} E[S_i | S_{i-1}] = S_{i-1}^2. The same identity holds for (Z_i)_{i=0}^n, so E[Z_n^2] = \sum_{i=1}^n \mathrm{E}[(Z_i - Z_{i-1})^2]. Moreover, each increment Z_i − Z_{i−1} is either equal to S_i − S_{i−1} or zero, hence E[(Z_i − Z_{i−1})^2] ≤ E[(S_i − S_{i−1})^2]. Thus, by Chebyshev's inequality,

:\begin{align}
\Pr\left(\max_{1 \leq i \leq n} |S_i| \geq \lambda\right) &= \Pr(|Z_n| \geq \lambda) \\
&\leq \frac{1}{\lambda^2} \mathrm{E}[Z_n^2] = \frac{1}{\lambda^2} \sum_{i=1}^n \mathrm{E}[(Z_i - Z_{i-1})^2] \\
&\leq \frac{1}{\lambda^2} \sum_{i=1}^n \mathrm{E}[(S_i - S_{i-1})^2] = \frac{1}{\lambda^2} \mathrm{E}[S_n^2] = \frac{1}{\lambda^2} \operatorname{Var}[S_n].
\end{align}
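The stopped sequence (Z_i) in the proof can be made concrete. The helper below (`stopped_sums` is a hypothetical name, not from the source) builds both (S_i) and (Z_i) from a list of increments, freezing Z once some |S_j| has reached λ; on any input one can check that max_i |S_i| ≥ λ exactly when |Z_n| ≥ λ, and that each Z-increment is either the corresponding S-increment or zero:

```python
def stopped_sums(xs, lam):
    """Given increments xs = [X_1, ..., X_n], return the partial sums
    (S_0, ..., S_n) and the stopped sequence (Z_0, ..., Z_n), where Z
    follows S until the first index with |S_j| >= lam and then freezes."""
    s_vals, z_vals = [0.0], [0.0]   # S_0 = Z_0 = 0
    crossed = False                  # has some |S_j| already reached lam?
    for x in xs:
        s_vals.append(s_vals[-1] + x)
        if crossed:
            z_vals.append(z_vals[-1])        # Z_{i+1} = Z_i
        else:
            z_vals.append(s_vals[-1])        # Z_{i+1} = S_{i+1}
            if abs(s_vals[-1]) >= lam:
                crossed = True               # freeze from the next step on
    return s_vals, z_vals
```

For example, with increments [1, 1, 1, −1, 1, 1] and λ = 3, the sums S are 0, 1, 2, 3, 2, 3, 4; Z follows S up to the first crossing S_3 = 3 and stays at 3 thereafter.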

See also

* Chebyshev's inequality
* Doob's martingale inequality
* Etemadi's inequality
* Landau-Kolmogorov inequality
* Markov's inequality


----


Wikimedia Foundation. 2010.
