Convolution of probability distributions

The convolution of probability distributions arises in probability theory and statistics as the operation on probability distributions that corresponds to the addition of independent random variables and, by extension, to forming linear combinations of random variables. The operation here is a special case of the general notion of convolution, for which additional results hold because the functions involved are probability distributions.

Introduction

The probability distribution of the sum of two or more independent random variables is the convolution of their individual distributions. The term is motivated by the fact that the probability mass function or probability density function of a sum of independent random variables is the convolution of their corresponding probability mass functions or probability density functions, respectively. Many well-known distributions have simple convolutions: see List of convolutions of probability distributions.
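As a concrete illustration, the discrete convolution in this statement can be computed directly. The following is a minimal sketch in Python (not part of the original article; numpy and the fair-die example are assumptions chosen for concreteness):

    import numpy as np

    # PMF of a fair six-sided die on the support {1, ..., 6}
    die = np.full(6, 1/6)

    # PMF of the sum of two independent dice: the discrete convolution
    # of the two individual PMFs, supported on {2, ..., 12}
    sum_pmf = np.convolve(die, die)

    for total, prob in enumerate(sum_pmf, start=2):
        print(f"P(sum = {total:2d}) = {prob:.4f}")

The triangular shape of the printed values is the familiar distribution of the sum of two dice, peaking at 7.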

Example derivation

There are several ways to derive formulae for the convolution of probability distributions. Often the manipulation of integrals can be avoided by using some type of generating function. Such methods can also be useful in deriving properties of the resulting distribution, such as its moments, even when an explicit formula for the distribution itself cannot be derived.

One of the most straightforward techniques is to use characteristic functions, which always exist and uniquely determine the distribution of a random variable.
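For instance, moments of a sum can be read off its characteristic function by differentiating at zero, without ever writing down the resulting probability mass function. The following sketch does this for the Bernoulli example treated below (the use of sympy is an assumption for illustration, not something from the original article):

    import sympy as sp

    t, p = sp.symbols('t p', real=True)

    # Characteristic function of Y = X_1 + X_2 with X_k ~ Bernoulli(p) independent
    phi = (1 - p + p * sp.exp(sp.I * t)) ** 2

    # E[Y^n] = (-i)^n * (d^n/dt^n) phi(t) evaluated at t = 0
    mean = sp.simplify(sp.diff(phi, t).subs(t, 0) / sp.I)
    second_moment = sp.simplify(sp.diff(phi, t, 2).subs(t, 0) / sp.I**2)
    variance = sp.expand(second_moment - mean**2)

    print(mean)      # 2*p
    print(variance)  # -2*p**2 + 2*p, i.e. 2p(1-p)

These agree with the mean 2p and variance 2p(1-p) of a Binomial(2, p) random variable.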

Convolution of Bernoulli distributions

The convolution of two independent Bernoulli random variables with the same parameter p is a binomial random variable. That is, in a shorthand notation,

 \sum_{i=1}^2 \mathrm{Bernoulli}(p) \sim \mathrm{Binomial}(2,p).

To show this let

X_i \sim \mathrm{Bernoulli}(p), \quad 0<p<1, \quad 1 \le i \le 2

and define

Y=\sum_{i=1}^2 X_i.

Also, let Z denote a generic binomial random variable:

Z \sim \mathrm{Binomial}(2,p).

Using probability mass functions

As X1 and X2 are independent,

\begin{align}\mathbb{P}[Y=n]&=\mathbb{P}\left[\sum_{i=1}^2 X_i=n\right] \\ 
&=\sum_{m\in\mathbb{Z}} \mathbb{P}[X_1=m]\times\mathbb{P}[X_2=n-m] \\
&=\sum_{m\in\mathbb{Z}}\left[\binom{1}{m}p^m\left(1-p\right)^{1-m}\right]\left[\binom{1}{n-m}p^{n-m}\left(1-p\right)^{1-n+m}\right]\\
&=p^n\left(1-p\right)^{2-n}\sum_{m\in\mathbb{Z}}\binom{1}{m}\binom{1}{n-m} \\
&=p^n\left(1-p\right)^{2-n}\left[\binom{1}{n}\binom{1}{0}+\binom{1}{n-1}\binom{1}{1}\right]\\
&=\binom{2}{n}p^n\left(1-p\right)^{2-n}=\mathbb{P}[Z=n]
\end{align}


Here, use was made of the convention that \tbinom{1}{m}=0 for m<0 or m>1, which reduces the sum over m\in\mathbb{Z} to the two terms m=n and m=n-1, and of Pascal's rule \tbinom{1}{n}+\tbinom{1}{n-1}=\tbinom{2}{n} in the second-to-last equality.
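A quick numerical check of this identity, as a sketch in Python (numpy and scipy are assumptions for illustration, and p = 0.3 is an arbitrary example value):

    import numpy as np
    from scipy.stats import binom

    p = 0.3  # any value with 0 < p < 1

    # Bernoulli(p) PMF on the support {0, 1}
    bern = np.array([1 - p, p])

    # Distribution of Y = X_1 + X_2: the convolution of the two Bernoulli PMFs
    y_pmf = np.convolve(bern, bern)

    # Binomial(2, p) PMF on the support {0, 1, 2}
    z_pmf = binom.pmf([0, 1, 2], n=2, p=p)

    print(y_pmf)                      # [0.49 0.42 0.09]
    print(np.allclose(y_pmf, z_pmf))  # True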

Using characteristic functions

The characteristic function of each Xk and of Z is

\varphi_{X_k}(t)=1-p+pe^{it} \qquad \varphi_Z(t)=\left(1-p+pe^{it}\right)^2

for all real t.

\begin{align}\varphi_Y(t)&=\mathbb{E}\left(e^{it\sum_{k=1}^2 X_k}\right)=\mathbb{E}\left(\prod_{k=1}^2 e^{itX_k}\right)\\
&=\prod_{k=1}^2 \mathbb{E}\left(e^{itX_k}\right)=\prod_{k=1}^2 \left(1-p+pe^{it}\right)\\
&=\left(1-p+pe^{it}\right)^2=\varphi_Z(t)\end{align}

The expectation of the product is the product of the expectations because X1 and X2 are independent. Since Y and Z have the same characteristic function, they must have the same distribution.
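The same conclusion can be checked numerically by evaluating both characteristic functions on a grid of real t values (a sketch assuming numpy; p = 0.3 and the grid are arbitrary illustration choices):

    import numpy as np

    p = 0.3
    t = np.linspace(-5.0, 5.0, 101)  # sample points on the real line

    # phi_X(t) = 1 - p + p e^{it} for a single Bernoulli(p) variable,
    # so phi_Y(t) = phi_X(t)^2 by independence of X_1 and X_2
    phi_y = (1 - p + p * np.exp(1j * t)) ** 2

    # phi_Z(t) computed directly from the Binomial(2, p) PMF:
    # phi_Z(t) = sum_n P(Z = n) e^{itn}
    z_pmf = np.array([(1 - p) ** 2, 2 * p * (1 - p), p ** 2])
    phi_z = sum(z_pmf[n] * np.exp(1j * t * n) for n in range(3))

    print(np.allclose(phi_y, phi_z))  # True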
