Limiting density of discrete points

The limiting density of discrete points is used to formulate an adjustment made by Edwin Thompson Jaynes to Claude Shannon's fundamental uniqueness theorem. Jaynes introduced the adjustment so that Shannon's information measure could be carried over consistently to continuous distributions.

The limiting density of discrete points, as formulated by Jaynes, relates a sequence of n discrete points {x_i} to the invariant measure m(x) by requiring that the density of the points converge to m(x):

: \lim_{n\to\infty}\frac{1}{n}\,(\mbox{number of points in }a<x<b)=\int_a^b m(x)\,dx

Whereas the information entropy, H(X), of the discrete distribution p(x) is as shown below in (1):

: H(X)=-\sum_{i=1}^n p(x_i)\log p(x_i)\qquad\qquad (1)
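The discrete entropy (1) can be illustrated with a short numeric sketch (a minimal example, not part of Jaynes's paper; the function name and the use of natural logarithms, giving entropy in nats, are choices made here):

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(X) = -sum_i p_i * log(p_i), in nats.

    Terms with p_i == 0 are skipped, following the convention
    0 * log(0) = 0.
    """
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# A uniform distribution over 4 outcomes has entropy log(4):
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # -> 1.386... (= log 4)
```

The maximum-entropy property is easy to check with this function: any non-uniform distribution over the same 4 outcomes yields a smaller value.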

the relative entropy of a continuous distribution p(x) is as shown in (2):

: H(X)=-\int p(x)\log\frac{p(x)}{m(x)}\,dx\qquad\qquad (2)
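Expression (2) can be approximated numerically on a grid. The sketch below (an illustration with assumed inputs, not a prescribed method) takes p and m as callables and uses a plain Riemann sum; when m(x) = 1 (Lebesgue measure), (2) reduces to the ordinary differential entropy, which for a standard normal is (1/2) log(2πe) ≈ 1.4189:

```python
import math

def relative_entropy(p, m, xs, dx):
    """Approximate H(X) = -∫ p(x) log(p(x)/m(x)) dx by a Riemann sum.

    p, m : callables giving the density and the invariant measure.
    xs   : a uniform grid of sample points with spacing dx.
    Grid points where p(x) == 0 contribute nothing (0 * log 0 = 0).
    """
    total = 0.0
    for x in xs:
        px = p(x)
        if px > 0:
            total += px * math.log(px / m(x))
    return -total * dx

# Standard normal density against m(x) = 1 on a grid covering [-10, 10];
# the result approximates the differential entropy 0.5 * log(2*pi*e).
dx = 0.001
xs = [-10 + i * dx for i in range(20001)]
p = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
print(relative_entropy(p, lambda x: 1.0, xs, dx))  # ≈ 1.4189
```

Choosing a non-constant m(x) in the same call shows the point of Jaynes's adjustment: the value of (2) stays invariant under a change of variables applied jointly to p and m, which the bare differential entropy does not.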

References

*Jaynes, E. T., "Prior Probabilities," IEEE Transactions on Systems Science and Cybernetics, SSC-4 (1968), 227.
*Jaynes, E. T., 1983, Papers on Probability, Statistics and Statistical Physics, edited by R. D. Rosenkrantz, D. Reidel Publishing Co., Dordrecht, Holland.


Wikimedia Foundation. 2010.
