Multinomial distribution

Parameters: n > 0, number of trials (integer); p1, ..., pk, event probabilities (Σpi = 1)
Support: {(x1, ..., xk) : xi ∈ {0, ..., n}, Σxi = n}
pmf: n!/(x1! ⋯ xk!) · p1^x1 ⋯ pk^xk
Mean: E{Xi} = n pi
Variance: Var(Xi) = n pi (1 − pi); Cov(Xi, Xj) = −n pi pj (i ≠ j)
mgf: (Σi pi e^(ti))^n
pgf: (Σi pi zi)^n

In probability theory, the multinomial distribution is a generalization of the binomial distribution.
The binomial distribution is the probability distribution of the number of "successes" in n independent Bernoulli trials, with the same probability of "success" on each trial. In a multinomial distribution, the analog of the Bernoulli distribution is the categorical distribution, where each trial results in exactly one of some fixed finite number k of possible outcomes, with probabilities p1, ..., pk (so that pi ≥ 0 for i = 1, ..., k and Σpi = 1), and there are n independent trials. Then let the random variables Xi indicate the number of times outcome number i was observed over the n trials. The vector X = (X1, ..., Xk) follows a multinomial distribution with parameters n and p, where p = (p1, ..., pk).
Note that, in some fields, such as natural language processing, the categorical and multinomial distributions are conflated, and it is common to speak of a "multinomial distribution" when a categorical distribution is actually meant. This stems from the fact that it is sometimes convenient to express the outcome of a categorical distribution as a "1-of-K" vector (a vector with one element containing a 1 and all other elements containing a 0) rather than as an integer in the range 1 to k; in this form, a categorical distribution is equivalent to a multinomial distribution over a single observation.
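This equivalence can be seen directly with NumPy (assuming NumPy is available; the seed and probabilities below are arbitrary): a multinomial draw with n = 1 is exactly a "1-of-K" vector.

```python
import numpy as np

rng = np.random.default_rng(42)
p = [0.2, 0.3, 0.5]

# A multinomial draw with n = 1 is a one-hot ("1-of-K") encoding of a
# single categorical outcome: one entry is 1, all others are 0.
one_hot = rng.multinomial(1, p)
print(one_hot)  # e.g. [0 0 1]
```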
Specification
Probability mass function
The probability mass function of the multinomial distribution is:
f(x1, ..., xk; n, p1, ..., pk) = Pr(X1 = x1, ..., Xk = xk) = (n! / (x1! ⋯ xk!)) · p1^x1 ⋯ pk^xk
for non-negative integers x1, ..., xk with x1 + ⋯ + xk = n (and 0 otherwise).
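A minimal sketch of this pmf in Python (`multinomial_pmf` is an illustrative helper name, not a standard library function):

```python
from math import factorial, prod

def multinomial_pmf(xs, ps):
    """n! / (x1! ... xk!) * p1^x1 * ... * pk^xk for counts xs and probabilities ps."""
    coef = factorial(sum(xs))          # n! where n = x1 + ... + xk
    for x in xs:
        coef //= factorial(x)          # divide by x1! ... xk!
    return coef * prod(p ** x for p, x in zip(ps, xs))

# With k = 2 this reduces to the binomial pmf: P(2 heads, 1 tail in 3 fair flips)
print(multinomial_pmf([2, 1], [0.5, 0.5]))  # 0.375
```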
Visualization
As slices of generalized Pascal's triangle
Just like one can interpret the binomial distribution as (normalized) 1D slices of Pascal's triangle, so too can one interpret the multinomial distribution as 2D (triangular) slices of Pascal's pyramid, or 3D/4D/+ (pyramid-shaped) slices of higher-dimensional analogs of Pascal's triangle. This reveals an interpretation of the range of the distribution: discretized equilateral "pyramids" in arbitrary dimension, i.e. a simplex with a grid.
As polynomial coefficients
Similarly, just like one can interpret the binomial distribution as the polynomial coefficients of (px1 + (1 − p)x2)n when expanded, one can interpret the multinomial distribution as the coefficients of (p1x1 + p2x2 + p3x3 + ... + pkxk)n when expanded. (Note that just like the binomial distribution, the coefficients must sum to 1.) This is the origin of the name "multinomial distribution".
Properties
The expected number of times the outcome i was observed over n trials is
E(Xi) = n pi.
The covariance matrix is as follows. Each diagonal entry is the variance of a binomially distributed random variable, and is therefore
Var(Xi) = n pi (1 − pi).
The off-diagonal entries are the covariances:
Cov(Xi, Xj) = −n pi pj
for i, j distinct.
All covariances are negative because for fixed n, an increase in one component of a multinomial vector requires a decrease in another component.
This is a k × k positive-semidefinite matrix of rank k − 1.
The off-diagonal entries of the corresponding correlation matrix are
ρ(Xi, Xj) = −√( pi pj / ((1 − pi)(1 − pj)) ).
Note that the sample size n drops out of this expression.
Each of the k components separately has a binomial distribution with parameters n and pi, for the appropriate value of the subscript i.
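These moment formulas can be checked empirically with NumPy (a Monte Carlo sketch; the seed, n, and probabilities are arbitrary choices, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 10, np.array([0.2, 0.3, 0.5])

# Empirical moments from many draws should match the formulas above:
# mean n*pi, variance n*pi*(1-pi), covariance -n*pi*pj for i != j.
samples = rng.multinomial(n, p, size=200_000)
print(samples.mean(axis=0))                 # ≈ n * p = [2.0, 3.0, 5.0]
print(samples.var(axis=0))                  # ≈ n * p * (1 - p) = [1.6, 2.1, 2.5]
print(np.cov(samples, rowvar=False)[0, 1])  # ≈ -n * p[0] * p[1] = -0.6
```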
The support of the multinomial distribution is the set
{(x1, ..., xk) : xi ∈ {0, 1, ..., n}, x1 + ⋯ + xk = n}.
Its number of elements is
C(n + k − 1, k − 1),
the number of n-combinations of a multiset with k types, or multiset coefficient.
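The count can be verified by brute-force enumeration for small n and k (the values 6 and 3 below are arbitrary illustration choices):

```python
from itertools import product
from math import comb

n, k = 6, 3

# Enumerate the support directly: all k-tuples of non-negative
# integers summing to n, then compare against C(n + k - 1, k - 1).
support = [x for x in product(range(n + 1), repeat=k) if sum(x) == n]
print(len(support), comb(n + k - 1, k - 1))  # 28 28
```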
Example
In a recent three-way election for a large country, candidate A received 20% of the votes, candidate B received 30% of the votes, and candidate C received 50% of the votes. If six voters are selected randomly, what is the probability that there will be exactly one supporter for candidate A, two supporters for candidate B and three supporters for candidate C in the sample?
Note: Since we’re assuming that the voting population is large, it is reasonable and permissible to think of the probabilities as unchanging once a voter is selected for the sample. Technically speaking this is sampling without replacement, so the correct distribution is the multivariate hypergeometric distribution, but the distributions converge as the population grows large.
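Plugging these numbers into the multinomial pmf gives the answer directly:

```latex
P(X_A = 1,\, X_B = 2,\, X_C = 3)
  = \frac{6!}{1!\,2!\,3!}\,(0.2)^1 (0.3)^2 (0.5)^3
  = 60 \times 0.2 \times 0.09 \times 0.125
  = 0.135
```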
Sampling from a multinomial distribution
First, reorder the parameters such that they are sorted in descending order (this is only to speed up computation and not strictly necessary). Now, for each trial, draw an auxiliary variable X from a uniform (0, 1) distribution. The resulting outcome is the component
i = min { i′ ∈ {1, ..., k} : p1 + ⋯ + pi′ ≥ X }.
This is a sample for the multinomial distribution with n = 1. A sum of independent repetitions of this experiment is a sample from a multinomial distribution with n equal to the number of such repetitions.
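A sketch of this procedure in plain Python, skipping the optional sorting step (`sample_categorical` and `sample_multinomial` are illustrative names):

```python
import random

def sample_categorical(ps):
    """One trial: return the first index at which the running sum of
    probabilities exceeds a uniform (0, 1) draw (inverse-CDF method)."""
    x = random.random()
    cumulative = 0.0
    for i, p in enumerate(ps):
        cumulative += p
        if x < cumulative:
            return i
    return len(ps) - 1  # guard against floating-point round-off

def sample_multinomial(n, ps):
    """Sum n independent categorical draws into a vector of counts."""
    counts = [0] * len(ps)
    for _ in range(n):
        counts[sample_categorical(ps)] += 1
    return counts

random.seed(1)
print(sample_multinomial(6, [0.5, 0.3, 0.2]))  # counts always sum to 6
```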
Related distributions
- When k = 2, the multinomial distribution is the binomial distribution.
- The continuous analogue is the multivariate normal distribution.
- Categorical distribution, the distribution of each trial; for k = 2, this is the Bernoulli distribution.
- The Dirichlet distribution is the conjugate prior of the multinomial in Bayesian statistics.
- Multivariate Pólya distribution.
- Beta-binomial model.