# Perplexity

Perplexity is a measure in information theory. It is defined as 2 raised to the power of the entropy or, more often, as 2 raised to the power of the cross-entropy. The latter definition is commonly used to compare probability models empirically.

## Perplexity of a probability distribution

The perplexity of a discrete probability distribution $p$ is defined as

$$2^{H(p)} = 2^{-\sum_x p(x) \log_2 p(x)}$$

where $H(p)$ is the entropy of the distribution and $x$ ranges over events.

One may also define the perplexity of a random variable $X$ as the perplexity of the distribution over its possible values $x$.

In the special case where $p$ models a fair $k$-sided die (a uniform distribution over $k$ discrete events), its perplexity is $k$: the entropy of the uniform distribution is $\log_2 k$, so the perplexity is $2^{\log_2 k} = k$. We may therefore regard any random variable with perplexity $k$ as having the same uncertainty as a fair $k$-sided die; we are "$k$-ways perplexed" about the value of such a random variable. (Unless it is a fair $k$-sided die, more than $k$ values may be possible, but the overall uncertainty is no greater, because some of these values will have probability greater than $1/k$.)
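As a quick numerical check of the fair-die case, the definition can be computed directly. The following Python sketch is illustrative only; the `perplexity` helper is not from any particular library:

```python
import math

def perplexity(p):
    """Perplexity of a discrete distribution given as a list of
    probabilities: 2 ** H(p), where H(p) is the Shannon entropy in
    bits (zero-probability events are skipped)."""
    entropy = -sum(px * math.log2(px) for px in p if px > 0)
    return 2 ** entropy

# A fair 6-sided die has perplexity 6.
fair_die = [1 / 6] * 6
print(perplexity(fair_die))

# A biased distribution over the same six outcomes is less perplexing:
# its perplexity falls below 6, even though six values remain possible.
biased = [0.5, 0.25, 0.125, 0.0625, 0.03125, 0.03125]
print(perplexity(biased))
```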

## Perplexity of a probability model

Often one tries to model an unknown probability distribution $p$, based on a training sample that was drawn from $p$. Given a proposed probability model $q$, one may evaluate $q$ by asking how well it predicts a separate test sample $x_1, x_2, \ldots, x_N$, also drawn from $p$. The perplexity of the model $q$ is defined as

$$2^{-\sum_{i=1}^N \frac{1}{N} \log_2 q(x_i)}$$

Better models $q$ of the unknown distribution $p$ will tend to assign higher probabilities $q(x_i)$ to the test events. Thus, they have lower perplexity: they are less surprised by the test sample.

The exponent above may be regarded as the average number of bits needed to represent a test event $x_i$ if one uses an optimal code based on $q$. Low-perplexity models do a better job of compressing the test sample, requiring fewer bits per test element on average because $q(x_i)$ tends to be high.
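Concretely, the average-bits interpretation can be sketched as follows; the toy model `q` and the test sample are invented for illustration:

```python
import math

def model_perplexity(q, test_sample):
    """Perplexity of model q on a test sample:
    2 ** (average number of bits needed per test event under q)."""
    n = len(test_sample)
    avg_bits = -sum(math.log2(q[x]) for x in test_sample) / n
    return 2 ** avg_bits

# Toy model over a three-symbol alphabet (illustrative numbers only).
q = {"a": 0.5, "b": 0.25, "c": 0.25}
sample = ["a", "a", "b", "c"]

# An optimal code based on q spends 1, 1, 2, 2 bits on these events,
# so the average is 1.5 bits and the perplexity is 2 ** 1.5, about 2.83.
print(model_perplexity(q, sample))
```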

The exponent may also be regarded as a cross-entropy,

$$H(\tilde{p}, q) = -\sum_x \tilde{p}(x) \log_2 q(x)$$

where $\tilde{p}$ denotes the empirical distribution of the test sample (i.e., $\tilde{p}(x) = n/N$ if $x$ appeared $n$ times in the test sample of size $N$).
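The equivalence between the cross-entropy form and the per-sample average is easy to check numerically. In this sketch (same invented toy numbers as above are assumed), grouping repeated events by their empirical frequency gives the same perplexity as averaging over the sample directly:

```python
import math
from collections import Counter

def perplexity_via_cross_entropy(q, test_sample):
    """Perplexity as 2 ** H(p~, q), where p~ is the empirical
    distribution of the test sample."""
    n = len(test_sample)
    counts = Counter(test_sample)
    h = -sum((c / n) * math.log2(q[x]) for x, c in counts.items())
    return 2 ** h

q = {"a": 0.5, "b": 0.25, "c": 0.25}
sample = ["a", "a", "b", "c"]

# p~("a") = 1/2 and p~("b") = p~("c") = 1/4, so
# H(p~, q) = 0.5 * 1 + 0.25 * 2 + 0.25 * 2 = 1.5 bits,
# exactly the exponent obtained by averaging over the sample.
print(perplexity_via_cross_entropy(q, sample))
```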

## Perplexity per word

In natural language processing, perplexity is a common way of evaluating language models. A language model is a probability distribution over entire sentences or texts.

Using the definition above, one might find that the average sentence $x_i$ in the test sample could be coded in 190 bits (i.e., the test sentences had an average log-probability of $-190$). This would give an enormous model perplexity of $2^{190}$ per sentence. However, it is more common to normalize for sentence length and consider only the number of bits per word. Thus, if the test sample's sentences comprised a total of 1,000 words and could be coded using a total of 7,950 bits, one could report a model perplexity of $2^{7.95} = 247$ per word. In other words, the model is as confused on test data as if it had to choose uniformly and independently among 247 possibilities for each word.
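The per-word arithmetic from the example above can be reproduced in a few lines (the totals are the ones quoted in the text):

```python
# Per-word perplexity from total code length and word count.
total_bits = 7950.0   # bits needed to code all test sentences
total_words = 1000    # words in the test sample

bits_per_word = total_bits / total_words    # 7.95 bits per word
per_word_perplexity = 2 ** bits_per_word    # 2 ** 7.95, about 247

print(round(per_word_perplexity))
```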

The lowest perplexity that has been published on the Brown Corpus (1 million words of American English of varying topics and genres) is indeed about 247 per word, corresponding to a cross-entropy of $\log_2 247 = 7.95$ bits per word or 1.75 bits per letter (Brown, Peter F., et al. "An Estimate of an Upper Bound for the Entropy of English." *Computational Linguistics* 18.1, March 1992. http://acl.ldc.upenn.edu/J/J92/J92-1002.pdf), using a trigram model. It is often possible to achieve lower perplexity on more specialized corpora, as they are more predictable.
