Entropy rate
The entropy rate of a stochastic process is, informally, the time density of the average information in the process. For stochastic processes with a countable index, the entropy rate H(X) is the limit of the joint entropy of n members of the process divided by n, as n tends to infinity:

H(X) = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \ldots, X_n)

when the limit exists. An alternative, related quantity is:

H'(X) = \lim_{n \to \infty} H(X_n \mid X_{n-1}, X_{n-2}, \ldots, X_1)
For strongly stationary stochastic processes, H(X) = H'(X).
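The definition can be checked numerically with a plug-in block-entropy estimate. The following minimal Python sketch (the function name block_entropy and the Bernoulli(0.3) source are illustrative choices, not from the article) simulates an i.i.d. binary sequence and prints H(X_1, ..., X_n)/n for increasing n:

```python
import math
import random
from collections import Counter

def block_entropy(seq, n):
    """Plug-in (empirical) entropy, in bits, of length-n blocks of seq."""
    blocks = [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]
    counts = Counter(blocks)
    total = len(blocks)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

# Simulated i.i.d. binary source with P(1) = 0.3.
random.seed(0)
seq = [1 if random.random() < 0.3 else 0 for _ in range(200_000)]

# H(X_1, ..., X_n) / n should approach the per-symbol entropy
# -(0.3 log2 0.3 + 0.7 log2 0.7) ≈ 0.881 bits.
for n in (1, 2, 4, 8):
    print(n, block_entropy(seq, n) / n)
```

For this source the per-symbol entropy is about 0.881 bits, and the block estimates stay near that value for each n, subject to finite-sample bias at larger block lengths.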
Entropy rates for Markov chains

Since a stochastic process defined by a Markov chain that is irreducible and aperiodic has a stationary distribution, the entropy rate is independent of the initial distribution.

For example, for such a Markov chain Y_k defined on a countable number of states, given the transition matrix P_ij, the entropy rate H(Y) is given by:

H(Y) = -\sum_{ij} \mu_i P_{ij} \log P_{ij}

where μ_i is the stationary distribution of the chain.
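This formula translates directly into code. Below is a minimal Python sketch assuming a finite state space and NumPy; the function entropy_rate and the two-state example chain are illustrative, not part of the article:

```python
import numpy as np

def entropy_rate(P):
    """Entropy rate, in bits, of a finite, irreducible, aperiodic Markov
    chain with row-stochastic transition matrix P."""
    # Stationary distribution mu solves mu P = mu: take the left
    # eigenvector of P for the eigenvalue closest to 1, normalized.
    evals, evecs = np.linalg.eig(P.T)
    mu = np.real(evecs[:, np.argmin(np.abs(evals - 1))])
    mu /= mu.sum()
    # H(Y) = -sum_{ij} mu_i P_ij log2 P_ij, with the 0 log 0 = 0 convention.
    terms = np.where(P > 0, P * np.log2(np.where(P > 0, P, 1.0)), 0.0)
    return -np.sum(mu[:, None] * terms)

# Two-state example: the chain flips state with probability 0.3.
P = np.array([[0.7, 0.3],
              [0.3, 0.7]])
print(entropy_rate(P))  # ≈ 0.881 bits per step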
A simple consequence of this definition is that an i.i.d. stochastic process has an entropy rate equal to the entropy of any individual member of the process.
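In outline, this follows from the definition: independence gives H(X_1, \ldots, X_n) = n H(X_1) for identically distributed X_i, so

H(X) = \lim_{n \to \infty} \frac{1}{n} H(X_1, \ldots, X_n) = \lim_{n \to \infty} \frac{n \, H(X_1)}{n} = H(X_1).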
See also

* Information source (mathematics)
* Markov information source

References
* Cover, T. and Thomas, J. (1991) Elements of Information Theory, John Wiley and Sons, Inc. ISBN 0471062596
External links
* [http://www.eng.ox.ac.uk/samp Systems Analysis, Modelling and Prediction (SAMP), University of Oxford]: MATLAB code for estimating information-theoretic quantities for stochastic processes.