Entropy power inequality

In mathematics, the entropy power inequality is a result in probability theory that relates to so-called "entropy power" of random variables. It shows that the entropy power of suitably well-behaved random variables is a superadditive function. The entropy power inequality was proved in 1948 by Claude Shannon in his seminal paper "A Mathematical Theory of Communication". Shannon also provided a sufficient condition for equality to hold; Stam (1959) showed that the condition is in fact necessary.

Statement of the inequality

For a random variable X : Ω → R^n with probability density function f : R^n → R, the differential entropy of X, denoted h(X), is defined to be

:h(X) = -\int_{\mathbb{R}^n} f(x) \log f(x) \, \mathrm{d}x

and the entropy power of X, denoted N(X), is defined to be

:N(X) = \frac{1}{2 \pi e} \exp\left( \frac{2}{n} h(X) \right).

In particular, N(X) = |K|^{1/n} when X is a multivariate normal random variable with covariance matrix K.
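The Gaussian case above can be checked numerically. The sketch below (illustrative only; the function names are not from the source) evaluates the closed-form differential entropy of a multivariate normal, h = (1/2) log((2πe)^n |K|), plugs it into the definition of entropy power, and compares the result with |K|^{1/n}.

```python
import numpy as np

def gaussian_entropy(K):
    """Differential entropy h(X), in nats, of X ~ N(0, K):
    h = (1/2) * log((2*pi*e)^n * det(K))."""
    n = K.shape[0]
    return 0.5 * np.log((2 * np.pi * np.e) ** n * np.linalg.det(K))

def entropy_power(h, n):
    """Entropy power N(X) = (1 / (2*pi*e)) * exp(2*h / n)."""
    return np.exp(2.0 * h / n) / (2 * np.pi * np.e)

# A 2x2 positive-definite covariance matrix (arbitrary example values).
K = np.array([[2.0, 0.5],
              [0.5, 1.0]])
n = K.shape[0]

N = entropy_power(gaussian_entropy(K), n)
# For a Gaussian, the entropy power reduces to |K|^(1/n):
print(N, np.linalg.det(K) ** (1.0 / n))  # both values agree
```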

Let X and Y be independent random variables with probability density functions in the L^p space L^p(R^n) for some p > 1. Then

:N(X + Y) \geq N(X) + N(Y).

Moreover, equality holds if and only if X and Y are multivariate normal random variables with proportional covariance matrices.
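For independent Gaussians the inequality can be verified directly: if X ~ N(0, K₁) and Y ~ N(0, K₂) are independent, then X + Y ~ N(0, K₁ + K₂), and by the Gaussian formula above the inequality becomes |K₁ + K₂|^{1/n} ≥ |K₁|^{1/n} + |K₂|^{1/n} (Minkowski's determinant inequality). The sketch below, with arbitrarily chosen example covariances, shows strict inequality for non-proportional covariances and equality for proportional ones.

```python
import numpy as np

def gaussian_entropy_power(K):
    """Entropy power of N(0, K); for a Gaussian this equals |K|^(1/n)."""
    n = K.shape[0]
    return np.linalg.det(K) ** (1.0 / n)

K1 = np.array([[1.0, 0.2],
               [0.2, 2.0]])
K2 = np.array([[3.0, 0.0],
               [0.0, 1.0]])

# Independence means the covariance of X + Y is K1 + K2.
lhs = gaussian_entropy_power(K1 + K2)
rhs = gaussian_entropy_power(K1) + gaussian_entropy_power(K2)
print(lhs, rhs)  # lhs > rhs: strict, since K1 and K2 are not proportional

# Equality when the covariances are proportional (here K2 = 3 * K1):
lhs_eq = gaussian_entropy_power(K1 + 3.0 * K1)
rhs_eq = gaussian_entropy_power(K1) + gaussian_entropy_power(3.0 * K1)
print(lhs_eq, rhs_eq)  # equal
```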

References

* Dembo, Amir; Cover, Thomas M.; Thomas, Joy A. (1991). "Information-theoretic inequalities". IEEE Trans. Inform. Theory 37 (6): 1501–1518. ISSN 0018-9448. doi:10.1109/18.104312. MR 1134291.
* Gardner, Richard J. (2002). "The Brunn–Minkowski inequality". Bull. Amer. Math. Soc. (N.S.) 39 (3): 355–405 (electronic). doi:10.1090/S0273-0979-02-00941-2.

* Shannon, Claude E. (1948). "A mathematical theory of communication". Bell System Tech. J. 27: 379–423, 623–656.

* Stam, A. J. (1959). "Some inequalities satisfied by the quantities of information of Fisher and Shannon". Information and Control 2: 101–112. doi:10.1016/S0019-9958(59)90348-1.

