Entropy rate

The entropy rate of a stochastic process is, informally, the time density of the average information in the process. For a stochastic process with a countable index, the entropy rate H(X) is the limit, as n tends to infinity, of the joint entropy of the first n members of the process X_k divided by n:

:H(X) = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \dots, X_n)

when the limit exists. An alternative, related quantity is:

:H'(X) = \lim_{n \to \infty} H(X_n \mid X_{n-1}, X_{n-2}, \dots, X_1)

For strongly stationary stochastic processes, H(X) = H'(X).
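
In practice, the limit can be approximated from a finite sample by estimating block entropies. The following minimal sketch (the plug-in block-entropy estimator and the simulated Bernoulli source are illustrative assumptions, not part of the original article) computes H(X_1, X_2, \dots, X_n)/n for increasing n:

 import math
 import random
 from collections import Counter
 
 def block_entropy(seq, n):
     """Plug-in (empirical) entropy, in bits, of the length-n blocks of seq."""
     blocks = [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]
     counts = Counter(blocks)
     total = len(blocks)
     return -sum(c / total * math.log2(c / total) for c in counts.values())
 
 random.seed(0)
 # Simulated i.i.d. Bernoulli(0.3) source; its true entropy rate is the
 # binary entropy -0.3*log2(0.3) - 0.7*log2(0.7), about 0.881 bits/symbol.
 sample = [1 if random.random() < 0.3 else 0 for _ in range(100000)]
 
 for n in (1, 2, 4, 8):
     print(n, block_entropy(sample, n) / n)  # H(X_1,...,X_n)/n approaches H(X)

Note that for large n the estimate is biased downward unless the sample is much longer than the number of possible blocks; this is a sketch of the definition, not a production estimator.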

Entropy rates for Markov chains

Since a stochastic process defined by a Markov chain that is irreducible and aperiodic has a stationary distribution, the entropy rate is independent of the initial distribution.

For example, for such a Markov chain Y_k defined on a countable number of states, with transition matrix P_{ij}, H(Y) is given by:

:H(Y) = -\sum_{ij} \mu_i P_{ij} \log P_{ij}

where μ_i is the i-th component of the stationary distribution μ of the chain.
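
As a numerical illustration (a minimal sketch; the two-state transition matrix and the function name are illustrative assumptions), the stationary distribution μ can be obtained as the left eigenvector of P for eigenvalue 1, after which the formula is a direct sum:

 import numpy as np
 
 def entropy_rate(P):
     """Entropy rate, in bits, of an irreducible aperiodic Markov chain
     with row-stochastic transition matrix P: -sum_ij mu_i P_ij log2 P_ij."""
     # Stationary distribution mu solves mu P = mu, i.e. it is the left
     # eigenvector of P associated with eigenvalue 1.
     eigvals, eigvecs = np.linalg.eig(P.T)
     mu = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
     mu = mu / mu.sum()
     # Treat 0 * log 0 as 0 by masking zero transition probabilities.
     logP = np.log2(P, out=np.zeros_like(P), where=P > 0)
     return float(-np.sum(mu[:, None] * P * logP))
 
 # Illustrative two-state chain (rows sum to 1).
 P = np.array([[0.9, 0.1],
               [0.4, 0.6]])
 print(entropy_rate(P))  # about 0.57 bits per step

For this chain the stationary distribution is μ = (0.8, 0.2), so the entropy rate is the μ-weighted average of the entropies of the two rows of P.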

A simple consequence of this definition is that an i.i.d. stochastic process has an entropy rate equal to the entropy of any individual member of the process.
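
This follows because, by independence, H(X_1, X_2, \dots, X_n) = n H(X_1), so the normalization cancels:

:H(X) = \lim_{n \to \infty} \frac{1}{n}\, n H(X_1) = H(X_1)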

See also

* Information source (mathematics)
* Markov information source

External links

* [http://www.eng.ox.ac.uk/samp Systems Analysis, Modelling and Prediction (SAMP), University of Oxford] MATLAB code for estimating information-theoretic quantities for stochastic processes.

