Skellam distribution

Probability distribution
*name: Skellam
*type: mass
*pmf figure: Examples of the probability mass function for the Skellam distribution. The horizontal axis is the index k. (The function is defined only at integer values of k; the connecting lines do not indicate continuity.)
*parameters: \mu_1 \ge 0,\ \mu_2 \ge 0
*support: \{\ldots, -2, -1, 0, 1, 2, \ldots\}
*pmf: e^{-(\mu_1+\mu_2)} \left(\frac{\mu_1}{\mu_2}\right)^{k/2} I_k(2\sqrt{\mu_1\mu_2})
*mean: \mu_1 - \mu_2
*median: N/A
*variance: \mu_1 + \mu_2
*skewness: \frac{\mu_1-\mu_2}{(\mu_1+\mu_2)^{3/2}}
*excess kurtosis: 1/(\mu_1+\mu_2)
*mgf: e^{-(\mu_1+\mu_2)+\mu_1 e^t+\mu_2 e^{-t}}
*characteristic function: e^{-(\mu_1+\mu_2)+\mu_1 e^{it}+\mu_2 e^{-it}}
The Skellam distribution is the discrete probability distribution of the difference K_1 - K_2 of two correlated or uncorrelated random variables K_1 and K_2, each having a Poisson distribution with expected value \mu_1 or \mu_2 respectively. It is useful for describing the statistics of the difference of two images with simple photon noise, as well as the point-spread distribution in sports where all scored points are equal, such as baseball, hockey and soccer.

Only the case of uncorrelated variables will be considered in this article. See Karlis & Ntzoufras (2003) for the use of the Skellam distribution to describe the difference of correlated Poisson-distributed variables.

Note that the probability mass function of a Poisson distribution with mean μ is given by

: f(k;\mu) = \frac{\mu^k}{k!} e^{-\mu}.

The Skellam probability mass function is the cross-correlation of two Poisson distributions (Skellam, 1946):

: f(k;\mu_1,\mu_2) = \sum_{n=-\infty}^\infty f(k+n;\mu_1)\, f(n;\mu_2)

: = e^{-(\mu_1+\mu_2)} \sum_{n=-\infty}^\infty \frac{\mu_1^{k+n} \mu_2^n}{n!\,(k+n)!}

: = e^{-(\mu_1+\mu_2)} \left(\frac{\mu_1}{\mu_2}\right)^{k/2} I_k(2\sqrt{\mu_1\mu_2})

where I_k(z) is the modified Bessel function of the first kind. The above formulas assume that any term with a negative factorial is set to zero. The special case \mu_1 = \mu_2 (= \mu) is given by Irwin (1937):

: f(k;\mu,\mu) = e^{-2\mu} I_k(2\mu).

Note also that, using the limiting values of the modified Bessel function for small arguments, the Poisson distribution is recovered as a special case of the Skellam distribution for \mu_2 = 0.
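As a quick numerical check, the cross-correlation sum above can be evaluated directly by truncating it at a finite number of terms. The sketch below is pure Python; the function names and the truncation limit `terms` are arbitrary choices, and `lgamma` is used in place of factorials to avoid float overflow.

```python
import math

def poisson_pmf(j, mu):
    """Poisson pmf; zero for j < 0 (the 'negative factorial' convention)."""
    if j < 0:
        return 0.0
    if mu == 0.0:
        return 1.0 if j == 0 else 0.0
    # exp(j ln(mu) - mu - ln(j!)) avoids overflow for large j
    return math.exp(j * math.log(mu) - mu - math.lgamma(j + 1))

def skellam_pmf(k, mu1, mu2, terms=200):
    """Skellam pmf as the truncated cross-correlation of two Poisson pmfs.

    f(k; mu1, mu2) = sum_n f(k + n; mu1) f(n; mu2); terms with a negative
    argument vanish, so the sum effectively starts at n = max(0, -k).
    """
    return sum(poisson_pmf(k + n, mu1) * poisson_pmf(n, mu2)
               for n in range(terms))

# With mu2 = 0 the Skellam pmf reduces to the Poisson pmf itself:
print(skellam_pmf(3, 2.0, 0.0))       # matches e^{-2} 2^3 / 3!
print(math.exp(-2.0) * 2.0**3 / 6.0)
```

The symmetry f(k; \mu_1, \mu_2) = f(-k; \mu_2, \mu_1), evident from the defining difference K_1 - K_2, also holds numerically under this evaluation.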

Properties

The Skellam probability mass function is normalized:

: \sum_{k=-\infty}^\infty f(k;\mu_1,\mu_2) = 1.
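This normalization can be confirmed numerically by summing the pmf over a wide range of k. A sketch, assuming the truncated cross-correlation evaluation of the pmf (the parameter values and summation window are arbitrary):

```python
import math

def poisson_pmf(j, mu):
    # zero for negative j; lgamma avoids factorial overflow
    if j < 0:
        return 0.0
    return math.exp(j * math.log(mu) - mu - math.lgamma(j + 1))

def skellam_pmf(k, mu1, mu2, terms=200):
    # truncated cross-correlation of two Poisson pmfs
    return sum(poisson_pmf(k + n, mu1) * poisson_pmf(n, mu2)
               for n in range(terms))

# Sum the pmf over a window wide enough that the tails are negligible:
total = sum(skellam_pmf(k, 3.0, 2.0) for k in range(-40, 41))
print(total)  # very close to 1
```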

We know that the generating function for a Poisson distribution is:

: G(t;\mu) = e^{\mu(t-1)}.

It follows that the generating function G(t;\mu_1,\mu_2) for the Skellam probability mass function is:

: G(t;\mu_1,\mu_2) = \sum_{k=-\infty}^\infty f(k;\mu_1,\mu_2)\, t^k

: = G(t;\mu_1)\, G(1/t;\mu_2)

: = e^{-(\mu_1+\mu_2) + \mu_1 t + \mu_2/t}.

Notice that the form of the generating function implies that the distribution of the sum or the difference of any number of independent Skellam-distributed variables is again Skellam-distributed.

It is sometimes claimed that any linear combination of two Skellam-distributed variables is again Skellam-distributed, but this is clearly not true, since any multiplier other than ±1 would change the support of the distribution.
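The closure under sums can be checked numerically: convolving two Skellam pmfs reproduces the Skellam pmf with the summed parameters, exactly as the product of generating functions predicts. A sketch (the parameter values and truncation windows are arbitrary choices):

```python
import math

def poisson_pmf(j, mu):
    # zero for negative j; lgamma avoids factorial overflow
    if j < 0:
        return 0.0
    return math.exp(j * math.log(mu) - mu - math.lgamma(j + 1))

def skellam_pmf(k, mu1, mu2, terms=200):
    # truncated cross-correlation of two Poisson pmfs
    return sum(poisson_pmf(k + n, mu1) * poisson_pmf(n, mu2)
               for n in range(terms))

# Convolve Skellam(1.0, 2.0) with an independent Skellam(0.5, 1.5) at k = 1:
conv = sum(skellam_pmf(j, 1.0, 2.0) * skellam_pmf(1 - j, 0.5, 1.5)
           for j in range(-40, 41))
# The generating-function argument predicts Skellam(1.5, 3.5):
direct = skellam_pmf(1, 1.5, 3.5)
print(conv, direct)
```

The two values agree to within the truncation error of the sums, illustrating why the \mu_1 parameters add and the \mu_2 parameters add under convolution.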

The moment-generating function is given by:

: M(t;\mu_1,\mu_2) = G(e^t;\mu_1,\mu_2)

: = \sum_{k=0}^\infty \frac{t^k}{k!}\, m_k

which yields the raw moments m_k. Define:

: \Delta \stackrel{\mathrm{def}}{=} \mu_1 - \mu_2,

: \mu \stackrel{\mathrm{def}}{=} (\mu_1 + \mu_2)/2.

Then the raw moments m_k are

: m_1 = \Delta,

: m_2 = 2\mu + \Delta^2,

: m_3 = \Delta(1 + 6\mu + \Delta^2).

The central moments M_k are

: M_2 = 2\mu,

: M_3 = \Delta,

: M_4 = 2\mu + 12\mu^2.

The mean, variance, skewness, and excess kurtosis are respectively:

: \mathrm{E}(k) = \Delta,

: \sigma^2 = 2\mu,

: \gamma_1 = \Delta/(2\mu)^{3/2},

: \gamma_2 = 1/(2\mu).
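These closed forms can be compared against moments computed directly from the pmf. A sketch, reusing the truncated cross-correlation evaluation (the parameter values and summation window are arbitrary):

```python
import math

def poisson_pmf(j, mu):
    # zero for negative j; lgamma avoids factorial overflow
    if j < 0:
        return 0.0
    return math.exp(j * math.log(mu) - mu - math.lgamma(j + 1))

def skellam_pmf(k, mu1, mu2, terms=300):
    # truncated cross-correlation of two Poisson pmfs
    return sum(poisson_pmf(k + n, mu1) * poisson_pmf(n, mu2)
               for n in range(terms))

mu1, mu2 = 4.0, 1.5
ks = range(-60, 61)
pmf = [skellam_pmf(k, mu1, mu2) for k in ks]

mean = sum(k * p for k, p in zip(ks, pmf))
var = sum((k - mean) ** 2 * p for k, p in zip(ks, pmf))
skew = sum((k - mean) ** 3 * p for k, p in zip(ks, pmf)) / var ** 1.5

# Closed forms: mean = mu1 - mu2, variance = mu1 + mu2,
# skewness = (mu1 - mu2) / (mu1 + mu2)^{3/2}
print(mean, var, skew)
```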

The cumulant-generating function is given by:

: K(t;\mu_1,\mu_2) \stackrel{\mathrm{def}}{=} \ln M(t;\mu_1,\mu_2) = \sum_{k=1}^\infty \frac{t^k}{k!}\, \kappa_k

which yields the cumulants:

: \kappa_{2k} = 2\mu,

: \kappa_{2k+1} = \Delta.

For the special case when \mu_1 = \mu_2, an asymptotic expansion of the modified Bessel function of the first kind yields, for large \mu:

: f(k;\mu,\mu) \sim \frac{1}{\sqrt{4\pi\mu}} \left[1 + \sum_{n=1}^\infty (-1)^n \frac{\{4k^2-1^2\}\{4k^2-3^2\}\cdots\{4k^2-(2n-1)^2\}}{n!\, 2^{3n}\, (2\mu)^n}\right]

(Abramowitz & Stegun 1972, p. 377). Also, for this special case, when k is large and of the order of the square root of 2\mu, the distribution tends to a normal distribution:

: f(k;\mu,\mu) \sim \frac{e^{-k^2/4\mu}}{\sqrt{4\pi\mu}}.
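The quality of this normal approximation is easy to probe numerically against the exact pmf. A sketch (\mu = 30 and the chosen k values are arbitrary):

```python
import math

def poisson_pmf(j, mu):
    # zero for negative j; lgamma avoids factorial overflow
    if j < 0:
        return 0.0
    return math.exp(j * math.log(mu) - mu - math.lgamma(j + 1))

def skellam_pmf(k, mu1, mu2, terms=400):
    # truncated cross-correlation of two Poisson pmfs
    return sum(poisson_pmf(k + n, mu1) * poisson_pmf(n, mu2)
               for n in range(terms))

mu = 30.0
for k in (0, 5, 10):
    exact = skellam_pmf(k, mu, mu)
    approx = math.exp(-k ** 2 / (4.0 * mu)) / math.sqrt(4.0 * math.pi * mu)
    print(k, exact, approx)  # the two columns agree closely
```

For \mu = 30 the approximation already matches the exact pmf to within a few percent at these k values, consistent with the asymptotic expansion above.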

These special results can easily be extended to the more general case of different means.

References

*Abramowitz, M. and Stegun, I. A. (Eds.). 1972. Modified Bessel functions I and K. Sections 9.6–9.7 in "Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables", 9th printing, pp. 374–378. New York: Dover.
*Irwin, J. O. 1937. The frequency distribution of the difference between two independent variates following the same Poisson distribution. "Journal of the Royal Statistical Society: Series A" 100 (3): 415–416. [http://links.jstor.org/sici?sici=0952-8385%281937%29100%3A3%3C415%3ATFDOTD%3E2.0.CO%3B2-R]
*Karlis, D. and Ntzoufras, I. 2003. Analysis of sports data using bivariate Poisson models. "Journal of the Royal Statistical Society: Series D (The Statistician)" 52 (3): 381–393. [http://dx.doi.org/10.1111/1467-9884.00366 doi:10.1111/1467-9884.00366]
*Karlis, D. and Ntzoufras, I. 2006. Bayesian analysis of the differences of count data. "Statistics in Medicine" 25: 1885–1905. [http://stat-athens.aueb.gr/~jbn/papers/paper11.htm]
*Skellam, J. G. 1946. The frequency distribution of the difference between two Poisson variates belonging to different populations. "Journal of the Royal Statistical Society: Series A" 109 (3): 296. [http://links.jstor.org/sici?sici=0952-8385%281946%29109%3A3%3C296%3ATFDOTD%3E2.0.CO%3B2-U]

