Min-entropy
In probability theory or information theory, the min-entropy of a discrete random event $X$ with possible states (or outcomes) $1, \ldots, n$ and corresponding probabilities $p_1, \ldots, p_n$ is

$$H_\infty(X) = \min_i \bigl(-\log p_i\bigr) = -\log \max_i p_i.$$
The base of the logarithm is just a scaling constant; for a result in bits, use a base-2 logarithm. Thus, a distribution has a min-entropy of at least $b$ bits if no possible state has a probability greater than $2^{-b}$.
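As a concrete illustration, here is a minimal Python sketch computing min-entropy in bits; the function name `min_entropy` and the example distribution are illustrative choices, not from the article:

```python
import math

def min_entropy(probs, base=2):
    """Min-entropy: -log of the largest probability; base 2 gives bits."""
    return -math.log(max(probs), base)

# Illustrative distribution: the largest probability is 0.5 = 2**-1,
# so the min-entropy is exactly 1 bit.
print(min_entropy([0.5, 0.25, 0.25]))  # 1.0
```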
The min-entropy is always less than or equal to the Shannon entropy; the two are equal exactly when all the probabilities $p_i$ are equal. Min-entropy is important in the theory of randomness extractors.
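A quick numerical check of this inequality, using the same kind of illustrative distributions as above (none of these values come from the article):

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy: -sum_i p_i * log(p_i)."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

def min_entropy(probs, base=2):
    return -math.log(max(probs), base)

biased = [0.5, 0.25, 0.25]
uniform = [0.25] * 4

print(min_entropy(biased), shannon_entropy(biased))    # 1.0 1.5  (strictly less)
print(min_entropy(uniform), shannon_entropy(uniform))  # 2.0 2.0  (equal: uniform case)
```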
The notation $H_\infty$ derives from a parameterized family of Shannon-like entropy measures, the Rényi entropy,

$$H_k(X) = \frac{1}{1-k} \log \Bigl( \sum_{i=1}^{n} p_i^{\,k} \Bigr).$$

In the limit $k \to 1$ this recovers the Shannon entropy. As $k$ is increased, more weight is given to the larger probabilities, and in the limit $k \to \infty$, only the largest $p_i$ has any effect on the result, yielding the min-entropy.
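A short sketch of this limiting behavior, reusing the illustrative distribution from above (the function name and sample values are my own):

```python
import math

def renyi_entropy(probs, k, base=2):
    """Rényi entropy H_k = log(sum_i p_i**k) / (1 - k); undefined at k = 1,
    where its limit is the Shannon entropy."""
    return math.log(sum(p**k for p in probs), base) / (1 - k)

p = [0.5, 0.25, 0.25]
for k in [2, 5, 20, 100]:
    print(k, renyi_entropy(p, k))
# H_k decreases toward the min-entropy -log2(0.5) = 1.0 as k grows:
# 2 -> 1.415..., 5 -> 1.228..., 20 -> 1.052..., 100 -> 1.010...
```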
See also
- Rényi entropy
- Leftover hash lemma
- Extractor