Proof of the law of large numbers
Given $X_1, X_2, \ldots$ an infinite sequence of i.i.d. random variables with finite expected value $E(X_1) = E(X_2) = \cdots = \mu < \infty$, we are interested in the convergence of the sample average
:$\overline{X}_n = \tfrac{1}{n}\left(X_1 + \cdots + X_n\right).$
The weak law
Theorem:
:$\overline{X}_n \ \xrightarrow{P}\ \mu \quad \text{when } n \to \infty.$
Proof using Chebyshev's inequality
This proof uses the assumption of finite variance $\operatorname{Var}(X_i) = \sigma^2$ (for all $i$). The independence of the random variables implies no correlation between them, and we have that
:$\operatorname{Var}(\overline{X}_n) = \operatorname{Var}\!\left(\tfrac{1}{n}(X_1 + \cdots + X_n)\right) = \frac{1}{n^2}\operatorname{Var}(X_1 + \cdots + X_n) = \frac{n\sigma^2}{n^2} = \frac{\sigma^2}{n}.$
The common mean μ of the sequence is the mean of the sample average:
:$E(\overline{X}_n) = \mu.$
Using Chebyshev's inequality on $\overline{X}_n$ results in
:$P\!\left(\left|\overline{X}_n - \mu\right| \geq \varepsilon\right) \leq \frac{\sigma^2}{n\varepsilon^2}.$
This may be used to obtain the following:
:$P\!\left(\left|\overline{X}_n - \mu\right| < \varepsilon\right) = 1 - P\!\left(\left|\overline{X}_n - \mu\right| \geq \varepsilon\right) \geq 1 - \frac{\sigma^2}{n\varepsilon^2}.$
As $n$ approaches infinity, the expression on the right-hand side approaches 1. By the definition of convergence in probability (see Convergence of random variables), we have obtained
:$\overline{X}_n \ \xrightarrow{P}\ \mu \quad \text{when } n \to \infty.$
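The Chebyshev bound above can be checked numerically. The following Python sketch (not part of the original proof) estimates $P(|\overline{X}_n - \mu| \geq \varepsilon)$ by Monte Carlo for Uniform(0, 1) samples, a distribution chosen here purely for illustration (μ = 1/2, σ² = 1/12), and compares the estimate with σ²/(nε²):

```python
import random

def deviation_probability(n, eps, trials=5000, seed=0):
    """Monte Carlo estimate of P(|sample mean - mu| >= eps) for Uniform(0, 1)."""
    rng = random.Random(seed)
    mu = 0.5  # mean of Uniform(0, 1)
    hits = 0
    for _ in range(trials):
        sample_mean = sum(rng.random() for _ in range(n)) / n
        if abs(sample_mean - mu) >= eps:
            hits += 1
    return hits / trials

sigma2 = 1.0 / 12.0  # variance of Uniform(0, 1)
eps = 0.05
for n in (10, 100, 1000):
    bound = min(sigma2 / (n * eps**2), 1.0)  # Chebyshev upper bound, capped at 1
    print(f"n={n:4d}  empirical={deviation_probability(n, eps):.4f}  bound={bound:.4f}")
```

As $n$ grows, both the empirical deviation probability and the Chebyshev bound shrink toward 0, matching the convergence-in-probability statement; the bound is loose but always valid.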
Proof using convergence of characteristic functions
By Taylor's theorem for complex functions, the characteristic function of any random variable, $X$, with finite mean μ, can be written as
:$\varphi_X(t) = 1 + it\mu + o(t), \quad t \to 0.$
All $X_1, X_2, \ldots$ have the same characteristic function, so we will simply denote this $\varphi_X$.
Among the basic properties of characteristic functions there are
:$\varphi_{\frac{1}{n}X}(t) = \varphi_X\!\left(\tfrac{t}{n}\right) \quad \text{and} \quad \varphi_{X+Y}(t) = \varphi_X(t)\,\varphi_Y(t) \quad \text{if } X \text{ and } Y \text{ are independent.}$
These rules can be used to calculate the characteristic function of $\overline{X}_n$ in terms of $\varphi_X$:
:$\varphi_{\overline{X}_n}(t) = \left[\varphi_X\!\left(\tfrac{t}{n}\right)\right]^n = \left[1 + \frac{i\mu t}{n} + o\!\left(\tfrac{t}{n}\right)\right]^n \;\to\; e^{it\mu} \quad \text{as } n \to \infty.$
The limit $e^{it\mu}$ is the characteristic function of the constant random variable μ, and hence by the Lévy continuity theorem, $\overline{X}_n$ converges in distribution to μ:
:$\overline{X}_n \ \xrightarrow{\mathcal{D}}\ \mu \quad \text{for } n \to \infty.$
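The limit $[\varphi_X(t/n)]^n \to e^{it\mu}$ can be illustrated numerically. The Python sketch below (not from the original article) uses an Exponential(1) variable, whose characteristic function $1/(1 - it)$ and mean $\mu = 1$ are standard, and measures the distance from $\varphi_{\overline{X}_n}(t)$ to the limit $e^{it\mu}$ as $n$ grows:

```python
import cmath

def phi_exponential(t):
    """Characteristic function of Exponential(1): phi(t) = 1 / (1 - i t)."""
    return 1.0 / (1.0 - 1j * t)

def phi_sample_mean(t, n):
    """Characteristic function of the mean of n i.i.d. copies: [phi(t/n)]^n."""
    return phi_exponential(t / n) ** n

t, mu = 0.7, 1.0
limit = cmath.exp(1j * mu * t)  # characteristic function of the constant mu
for n in (1, 10, 100, 10000):
    print(n, abs(phi_sample_mean(t, n) - limit))
```

The printed distances decrease toward 0, consistent with the pointwise convergence that the Lévy continuity theorem turns into convergence in distribution.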
μ is a constant, which implies that convergence in distribution to μ and convergence in probability to μ are equivalent (see Convergence of random variables). This implies that
:$\overline{X}_n \ \xrightarrow{P}\ \mu \quad \text{when } n \to \infty.$
This proof shows, in fact, that the sample mean converges in probability to the derivative of the characteristic function at the origin, as long as the latter exists.