List of convolutions of probability distributions
In probability theory, the probability distribution of the sum of two or more independent random variables is the convolution of their individual distributions. The term is motivated by the fact that the probability mass function or probability density function of a sum of independent random variables is the convolution of their corresponding probability mass functions or probability density functions respectively. Many well-known distributions have simple convolutions. The following is a list of these convolutions. Each statement is of the form

:$\sum_{i=1}^n X_i \sim Y$

where $X_1, X_2, \dots, X_n$ are independent random variables (identically distributed when the parameters do not depend on $i$). In place of $X_i$ and $Y$ the names of the corresponding distributions and their parameters have been indicated.
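To illustrate the defining fact above, here is a minimal sketch (an illustration, not part of the source) that convolves the probability mass function of a Bernoulli($p$) variable with itself and checks that the result matches the Binomial($2, p$) mass function; the value $p = 0.3$ is an arbitrary choice.

```python
import numpy as np
from scipy.stats import binom

p = 0.3
bernoulli_pmf = np.array([1 - p, p])   # P(X=0), P(X=1)

# The pmf of X1 + X2 is the convolution of the individual pmfs.
sum_pmf = np.convolve(bernoulli_pmf, bernoulli_pmf)

binomial_pmf = binom.pmf([0, 1, 2], n=2, p=p)
assert np.allclose(sum_pmf, binomial_pmf)
print(sum_pmf)   # [0.49 0.42 0.09]
```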
Discrete distributions

* $\sum_{i=1}^n \mathrm{Bernoulli}(p) \sim \mathrm{Binomial}(n,p) \qquad 0<p<1 \quad n=1,2,\dots$
* $\sum_{i=1}^n \mathrm{Binomial}(n_i,p) \sim \mathrm{Binomial}\left(\sum_{i=1}^n n_i,p\right) \qquad 0<p<1 \quad n_i=1,2,\dots$
* $\sum_{i=1}^n \mathrm{NegativeBinomial}(n_i,p) \sim \mathrm{NegativeBinomial}\left(\sum_{i=1}^n n_i,p\right) \qquad 0<p<1 \quad n_i=1,2,\dots$
* $\sum_{i=1}^n \mathrm{Geometric}(p) \sim \mathrm{NegativeBinomial}(n,p) \qquad 0<p<1 \quad n=1,2,\dots$
* $\sum_{i=1}^n \mathrm{Poisson}(\lambda_i) \sim \mathrm{Poisson}\left(\sum_{i=1}^n \lambda_i\right) \qquad \lambda_i>0$
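Any of the discrete identities above can be spot-checked by simulation. The following sketch (illustrative, not from the source) draws independent Poisson variates with different rates, sums them, and compares the empirical frequencies of the sums against the Poisson pmf with rate $\sum_i \lambda_i$; the rates and seed are arbitrary.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
rates = [0.5, 1.2, 2.3]              # arbitrary example rates
n_samples = 100_000

# Draw n_samples variates per rate, then sum across rates samplewise.
sums = sum(rng.poisson(lam, n_samples) for lam in rates)

# Empirical pmf of the sums vs. Poisson(sum of rates).
support = np.arange(sums.max() + 1)
empirical = np.bincount(sums) / n_samples
theoretical = poisson.pmf(support, mu=sum(rates))
print(np.max(np.abs(empirical - theoretical)))   # should be small
```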
Continuous distributions

* $\sum_{i=1}^n \mathrm{Normal}(\mu_i,\sigma_i^2) \sim \mathrm{Normal}\left(\sum_{i=1}^n \mu_i, \sum_{i=1}^n \sigma_i^2\right) \qquad -\infty<\mu_i<\infty \quad \sigma_i^2>0$
* $\sum_{i=1}^n \mathrm{Gamma}(\alpha_i,\beta) \sim \mathrm{Gamma}\left(\sum_{i=1}^n \alpha_i,\beta\right) \qquad \alpha_i>0 \quad \beta>0$
* $\sum_{i=1}^n \mathrm{Exponential}(\theta) \sim \mathrm{Gamma}(n,\theta) \qquad \theta>0 \quad n=1,2,\dots$
* $\sum_{i=1}^n \chi^2(r_i) \sim \chi^2\left(\sum_{i=1}^n r_i\right) \qquad r_i=1,2,\dots$
* $\sum_{i=1}^r N^2(0,1) \sim \chi^2_r \qquad r=1,2,\dots$
* $\sum_{i=1}^n (X_i - \bar X)^2 \sim \sigma^2 \chi^2_{n-1} \qquad \mathrm{where} \quad X_i \sim N(\mu,\sigma^2) \quad \mathrm{and} \quad \bar X = \frac{1}{n} \sum_{i=1}^n X_i$
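As a concrete check of the exponential entry above, the sketch below (illustrative, not from the source) sums $n$ independent exponential variates and compares the result to the corresponding gamma distribution with a Kolmogorov–Smirnov test. It assumes $\theta$ is the scale (mean) parameter, matching SciPy's `scale` convention; the values of $n$, $\theta$, and the seed are arbitrary.

```python
import numpy as np
from scipy.stats import gamma, kstest

rng = np.random.default_rng(1)
n, theta = 5, 2.0                    # shape n, scale theta (assumed convention)
n_samples = 100_000

# Each row holds n independent Exponential(theta) draws; sum across the row.
sums = rng.exponential(scale=theta, size=(n_samples, n)).sum(axis=1)

# Compare against Gamma(n, scale=theta); a large p-value is consistent
# with the claimed distribution of the sum.
stat, pvalue = kstest(sums, gamma(a=n, scale=theta).cdf)
print(stat, pvalue)
```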
Example proof

There are various ways to prove the above relations. A straightforward technique is to use the moment generating function, which is unique to a given distribution.
= Proof that $\sum_{i=1}^n \mathrm{Bernoulli}(p) \sim \mathrm{Binomial}(n,p)$ =

:$X_i \sim \mathrm{Bernoulli}(p) \quad 0<p<1 \quad i=1,2,\dots,n$
:$Y=\sum_{i=1}^n X_i$
:$Z \sim \mathrm{Binomial}(n,p)$
The moment generating function of each $X_i$ and of $Z$ is

:$M_{X_i}(t)=1-p+pe^t \qquad M_Z(t)=\left(1-p+pe^t\right)^n$

where $t$ is within some neighborhood of zero.
:$M_Y(t)=\operatorname{E}\left(e^{t\sum_{i=1}^n X_i}\right)=\operatorname{E}\left(\prod_{i=1}^n e^{tX_i}\right)=\prod_{i=1}^n \operatorname{E}\left(e^{tX_i}\right)=\prod_{i=1}^n \left(1-p+pe^t\right)=\left(1-p+pe^t\right)^n=M_Z(t)$
The expectation of the product is the product of the expectations, since the $X_i$ are independent. Since $Y$ and $Z$ have the same moment generating function, they must have the same distribution.
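The key identity in this proof, that the binomial MGF computed from its pmf equals $(1-p+pe^t)^n$, can also be confirmed numerically. The sketch below (an illustration with arbitrarily chosen values, not part of the source) evaluates both sides for one choice of $n$, $p$, and $t$.

```python
import math

n, p, t = 10, 0.3, 0.5   # arbitrary example values

# MGF of Binomial(n, p) computed directly from its pmf:
# sum over k of C(n, k) p^k (1-p)^(n-k) e^{tk}
mgf_from_pmf = sum(
    math.comb(n, k) * p**k * (1 - p)**(n - k) * math.exp(t * k)
    for k in range(n + 1)
)

# MGF of a sum of n independent Bernoulli(p): (1 - p + p e^t)^n
mgf_product = (1 - p + p * math.exp(t))**n

assert math.isclose(mgf_from_pmf, mgf_product)
print(mgf_from_pmf, mgf_product)
```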
See also
* Uniform distribution
* Bernoulli distribution
* Binomial distribution
* Geometric distribution
* Negative binomial distribution
* Poisson distribution
* Exponential distribution
* Beta distribution
* Gamma distribution
* Chi-square distribution
* Normal distribution