Entropic vector
The entropic vector is a concept arising in information theory. Shannon's information entropy measures, and their associated identities and inequalities (both constrained and unconstrained), have received considerable attention ever since Shannon introduced his concept of information entropy. Many such identities and inequalities have been found and are available in standard information theory texts. More recently, however, researchers have focused on finding and characterizing all possible identities and inequalities (both constrained and unconstrained) on such entropies. The entropic vector lays down the basic framework for such a study.

Definition
Consider "n" jointly distributed
random variables with a jointprobability density function . Let be a subset of . Now we define where . Clearly there are "2" "n" "- 1" non-empty subsets of . Corresponding to each , we have the jointentropy defined as . A vector in consisting of as its elements for all non-empty subsets of . Such a vector is called an entropic vector.Example
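The construction can be made concrete in a few lines of code. The following is a minimal Python sketch (using NumPy; the names entropy and entropic_vector are illustrative, not from any standard library) that enumerates the $2^n - 1$ non-empty subsets and computes each $H(X_\alpha)$ by marginalizing an $n$-dimensional array of joint probabilities:

```python
import itertools
import numpy as np

def entropy(p):
    """Shannon entropy in bits of an array of probabilities (zeros ignored)."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def entropic_vector(joint):
    """Map each non-empty subset alpha of {0, ..., n-1} to H(X_alpha),
    where `joint` is an n-dimensional array of joint probabilities."""
    n = joint.ndim
    vec = {}
    for r in range(1, n + 1):
        for alpha in itertools.combinations(range(n), r):
            # Marginalize out the variables not in alpha, then take the entropy.
            other = tuple(i for i in range(n) if i not in alpha)
            vec[alpha] = entropy(joint.sum(axis=other))
    return vec
```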
Let "X","Y" be 2 independent binary random variables with probability of each symbol as one-half. Then : Note that
mutual information is then given by : This is because X and Y are independent. The entropic vector is thus :We note that is in as there exists random variables with the entries in the vector as its entropies.Open problem
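These values can be verified numerically. A self-contained sketch (the helper name H is illustrative):

```python
import numpy as np

# Joint distribution of two independent fair bits: p(x, y) = 1/4 everywhere.
joint = np.full((2, 2), 0.25)

def H(p):
    """Shannon entropy in bits of an array of probabilities (zeros ignored)."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

H_X = H(joint.sum(axis=1))      # marginal of X -> 1.0 bit
H_Y = H(joint.sum(axis=0))      # marginal of Y -> 1.0 bit
H_XY = H(joint)                 # joint entropy -> 2.0 bits
I_XY = H_X + H_Y - H_XY         # mutual information -> 0.0
print((H_X, H_Y, H_XY), I_XY)   # (1.0, 1.0, 2.0) 0.0
```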
Open problem

Given a vector $h \in \mathbb{R}^{2^n - 1}$, is it possible to say whether there exist $n$ random variables such that their joint entropies are given by $h$? It turns out that for $n \le 3$ the problem has been solved, but for $n \ge 4$ it still remains open: for $n \le 3$ the closure of the region is cut out exactly by the basic Shannon-type inequalities, while for $n \ge 4$ additional non-Shannon-type inequalities (such as the Zhang–Yeung inequality) are known to hold. Defining the set of all such vectors that can be constructed from $n$ random variables as $\Gamma_n^*$, we see that a complete characterization of this space remains an open problem.
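For small $n$ the solved cases can be checked mechanically. The sketch below (the helper name satisfies_shannon is illustrative, not standard) tests a candidate vector against the polymatroid conditions, namely $H(\emptyset) = 0$, monotonicity, and submodularity; for $n \le 3$ passing this test places the vector in the closure of $\Gamma_n^*$, while for $n \ge 4$ it is necessary but no longer sufficient:

```python
import itertools

def satisfies_shannon(h, n, tol=1e-9):
    """Check the basic Shannon-type (polymatroid) inequalities for a
    candidate entropy function h: a dict mapping each frozenset subset
    of {0, ..., n-1} (including frozenset()) to a real number."""
    subsets = [frozenset(s) for r in range(n + 1)
               for s in itertools.combinations(range(n), r)]
    if abs(h[frozenset()]) > tol:
        return False                                  # H(empty set) must be 0
    for A in subsets:
        for B in subsets:
            if A <= B and h[A] > h[B] + tol:
                return False                          # monotonicity fails
            if h[A] + h[B] + tol < h[A | B] + h[A & B]:
                return False                          # submodularity fails
    return True

# The example vector (1, 1, 2), padded with H(empty set) = 0:
h = {frozenset(): 0.0, frozenset({0}): 1.0,
     frozenset({1}): 1.0, frozenset({0, 1}): 2.0}
print(satisfies_shannon(h, 2))   # True
```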
References
* Thomas M. Cover and Joy A. Thomas, *Elements of Information Theory*, New York: Wiley, 1991. ISBN 0-471-06259-6.