- Entropic vector
The entropic vector is a concept arising in information theory. Identities and inequalities (both constrained and unconstrained) satisfied by Shannon's information entropy have received a great deal of attention since Shannon introduced the concept, and many of them are available in standard information theory texts. More recently, researchers have focused on finding and characterizing all possible identities and inequalities satisfied by such entropies. The entropic vector provides the basic framework for this study.

Definition
Consider "n" jointly distributed
random variables X_1, X_2, ldots, X_n with a jointprobability density function p(X_1, X_2, ldots, X_n). Let alpha} be a subset of N = lbrace 1,2,ldots,n brace. Now we define p(X_{alpha}) = p(X_{i_1}, X_{i_2}, ldots, X_{i_k}) where alpha = lbrace i_1, i_2, ldots, i_k brace . Clearly there are "2" "n" "- 1" non-empty subsets of N. Corresponding to each alpha} , we have the jointentropy defined as H left (X_{alpha} ight ). A vector in R^{2^n -1} consisting of H left (X_{alpha} ight ) as its elements for all non-empty subsets alpha} of N. Such a vector is called an entropic vector.Example
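The following is a minimal sketch of this construction for discrete random variables, computing H(X_\alpha) for every non-empty subset \alpha from a joint probability mass function. The names (joint_pmf, entropic_vector) and the choice of base-2 logarithms are illustrative assumptions, not part of the article.

```python
import itertools
import math

def marginal_entropy(joint_pmf, alpha):
    """H(X_alpha): entropy (in bits) of the marginal over the index set alpha.

    joint_pmf maps outcome tuples (x_1, ..., x_n) to probabilities.
    """
    marginal = {}
    for outcome, p in joint_pmf.items():
        key = tuple(outcome[i] for i in alpha)
        marginal[key] = marginal.get(key, 0.0) + p
    return -sum(p * math.log2(p) for p in marginal.values() if p > 0)

def entropic_vector(joint_pmf, n):
    """Return {alpha: H(X_alpha)} over all 2^n - 1 non-empty subsets of {0, ..., n-1}."""
    subsets = [c for k in range(1, n + 1)
                 for c in itertools.combinations(range(n), k)]
    return {alpha: marginal_entropy(joint_pmf, alpha) for alpha in subsets}
```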
Let "X","Y" be 2 independent binary random variables with probability of each symbol as one-half. Then :H left (X ight ) = H(Y) = 1, H(X,Y) = 2 Note that
mutual information is then given by :I left (X;Y ight ) = H(X) + H(Y) - H(X,Y) = 0 This is because X and Y are independent. The entropic vector is thus :v = left ( 1,1,2 ight )^TWe note that v in R^3 is in Gamma}^{*}_2 as there exists random variables with the entries in the vector as its entropies.Open problem
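This example can be reproduced with the sketch above; the uniform joint pmf over \{0, 1\}^2 encodes the independence assumption.

```python
# Two independent fair bits X and Y: the uniform joint pmf on {0,1}^2.
joint_pmf = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
v = entropic_vector(joint_pmf, 2)
print(v)  # {(0,): 1.0, (1,): 1.0, (0, 1): 2.0} -- the vector (1, 1, 2)^T
print(v[(0,)] + v[(1,)] - v[(0, 1)])  # I(X; Y) = 0.0, as X and Y are independent
```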
Open problem

Given a vector v \in \mathbb{R}^{2^n - 1}, do there exist n random variables whose joint entropies are given by v? For n = 2, 3 this problem has been solved, but for n \geq 4 it remains open. Defining \Gamma^*_n as the set of all vectors v \in \mathbb{R}^{2^n - 1} that can be constructed from a set of n random variables in this way, a complete characterization of this region remains unsolved.
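For small n, the solved cases can be checked computationally: a vector that violates one of the basic Shannon inequalities (non-negativity, monotonicity, and submodularity of joint entropy) cannot be entropic, and for n = 2, 3 these inequalities characterize the closure of \Gamma^*_n. Below is a minimal sketch of such a check, assuming the vector is stored as a map from non-empty frozensets of \{0, \ldots, n-1\} to entropies; the function name and tolerance are illustrative.

```python
import itertools

def is_polymatroid(h, n, tol=1e-12):
    """Check the basic Shannon inequalities for a candidate entropy vector h.

    h maps each non-empty frozenset of {0, ..., n-1} to a value H(X_alpha).
    Returns True iff h is non-negative, monotone, and submodular.
    """
    def H(s):
        s = frozenset(s)
        return 0.0 if not s else h[s]

    subsets = [frozenset(c) for k in range(n + 1)
                            for c in itertools.combinations(range(n), k)]
    for a in subsets:
        if H(a) < -tol:                                  # non-negativity
            return False
        for b in subsets:
            if a <= b and H(a) > H(b) + tol:             # monotonicity
                return False
            if H(a) + H(b) + tol < H(a | b) + H(a & b):  # submodularity
                return False
    return True

# The entropic vector (1, 1, 2)^T from the example passes the check ...
print(is_polymatroid({frozenset({0}): 1.0, frozenset({1}): 1.0,
                      frozenset({0, 1}): 2.0}, 2))  # True
# ... while (1, 1, 3)^T violates submodularity and so cannot be entropic.
print(is_polymatroid({frozenset({0}): 1.0, frozenset({1}): 1.0,
                      frozenset({0, 1}): 3.0}, 2))  # False
```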
References
* Thomas M. Cover, Joy A. Thomas. "Elements of Information Theory". New York: Wiley, 1991. ISBN 0-471-06259-6.