Multidimensional Chebyshev's inequality

In probability theory, the multidimensional Chebyshev's inequality is a generalization of Chebyshev's inequality, which bounds the probability that a random variable differs from its expected value by more than a specified amount. The multidimensional version bounds the corresponding probability for a random vector, measured in the metric induced by its covariance matrix.

Let X be an N-dimensional random vector with expected value \mu=\mathbb{E} \left[ X \right] and covariance matrix

V=\mathbb{E} \left[ \left(X - \mu \right) \left( X - \mu \right)^T \right]. \,

If V is an invertible matrix (equivalently, a strictly positive-definite matrix), then for any real number t > 0:


\mathrm{Pr}\left( \sqrt{\left( X-\mu\right)^T \, V^{-1} \, \left( X-\mu\right) } > t \right) \le \frac{N}{t^2}

where N = \mathrm{trace}(V^{-1} V) is the dimension of the vector X.
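As a numerical sanity check, the bound can be tested by Monte Carlo simulation. The sketch below uses NumPy; the Gaussian distribution and the particular covariance matrix are arbitrary illustrative choices, since the inequality holds for any distribution with an invertible covariance matrix.

```python
import numpy as np

# Monte Carlo check of the multidimensional Chebyshev bound
#   Pr( sqrt((X - mu)^T V^{-1} (X - mu)) > t ) <= N / t^2.
rng = np.random.default_rng(0)
N, n_samples, t = 3, 100_000, 3.0

mu = np.zeros(N)
V = np.array([[2.0, 0.5, 0.0],
              [0.5, 1.0, 0.3],
              [0.0, 0.3, 1.5]])   # an arbitrary positive-definite covariance
X = rng.multivariate_normal(mu, V, size=n_samples)

# Mahalanobis distance of each sample from the mean
d = np.sqrt(np.einsum('ij,jk,ik->i', X - mu, np.linalg.inv(V), X - mu))

empirical = np.mean(d > t)   # empirical tail probability
bound = N / t**2             # Chebyshev bound
print(empirical, bound)
```

For a Gaussian, the empirical tail probability is far below the bound, which reflects that Chebyshev-type inequalities are distribution-free and therefore conservative.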

Proof

Since V is positive-definite, so is V − 1. Define the random variable


y = \left( X-\mu\right)^T \, V^{-1} \, \left( X-\mu\right) .

Since V^{-1} is positive-definite, y is nonnegative, so Markov's inequality applies:


\mathrm{Pr}\left( \sqrt{\left( X-\mu\right)^T \, V^{-1} \, \left( X-\mu\right) } > t\right) = \mathrm{Pr}\left( \sqrt{y} > t\right) =\mathrm{Pr}\left( y > t^2 \right) \le \frac{\mathbb{E}[y]}{t^2} .

Finally,

\mathbb{E}[y] = \mathbb{E}\left[ \left( X-\mu\right)^T \, V^{-1} \, \left( X-\mu\right) \right]
=\mathbb{E}\left[ \mathrm{trace} \left(  V^{-1} \, \left( X-\mu\right) \left( X-\mu\right)^T \right) \right]
= \mathrm{trace} \left(  V^{-1} \, \mathbb{E}\left[ \left( X-\mu\right) \left( X-\mu\right)^T \right] \right)
= \mathrm{trace} (  V^{-1} V ) = \mathrm{trace}( I_N ) = N ,

where the second equality uses the fact that a scalar equals its own trace together with the cyclic property of the trace, and the third uses the linearity of expectation.
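The key identity E[y] = N can likewise be checked numerically. This sketch draws samples from a zero-mean Gaussian with a randomly generated positive-definite covariance (an illustrative choice; any distribution with that covariance would do) and verifies that the sample mean of y approaches the dimension N.

```python
import numpy as np

# Monte Carlo check that E[y] = E[(X - mu)^T V^{-1} (X - mu)] = N.
rng = np.random.default_rng(1)
N, n_samples = 4, 200_000

A = rng.standard_normal((N, N))
V = A @ A.T + N * np.eye(N)   # a random symmetric positive-definite covariance
X = rng.multivariate_normal(np.zeros(N), V, size=n_samples)

# y_i = X_i^T V^{-1} X_i for each sample (here mu = 0)
y = np.einsum('ij,jk,ik->i', X, np.linalg.inv(V), X)
print(y.mean())  # close to N = 4
```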

Wikimedia Foundation. 2010.


