Hartley function

The Hartley function is a measure of uncertainty, introduced by Ralph Hartley in 1928. If we pick a sample from a finite set A uniformly at random, the information revealed after we know the outcome is given by the Hartley function

 H_0(A) := \log_b |A| .

If the base of the logarithm is 2, then the uncertainty is measured in bits; if it is the natural logarithm, the unit is the nat. (Hartley himself used a base-ten logarithm, and this unit of information is sometimes called the hartley in his honor.) The Hartley function is also known as the Hartley entropy.
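For concreteness, here is a minimal Python sketch (the helper name hartley is just an illustrative choice) that evaluates H_0 of an eight-element set in bits, nats, and hartleys:

    import math

    def hartley(A, base=2):
        """Hartley function H_0(A) = log_base |A| of a finite set A."""
        return math.log(len(A), base)

    A = set("abcdefgh")          # |A| = 8
    print(hartley(A, 2))         # about 3.0   (bits)
    print(hartley(A, math.e))    # about 2.079 (nats)
    print(hartley(A, 10))        # about 0.903 (hartleys)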

Hartley function, Shannon's entropy, and Rényi entropy

The Hartley function coincides with the Shannon entropy (as well as with the Rényi entropies of all orders) in the case of a uniform probability distribution. It is in fact the Rényi entropy of order zero, since

H_0(X) = \frac 1 {1-0} \log \sum_{i=1}^{|X|} p_i^0 = \log |X|.
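A small Python sketch (the helper renyi_entropy is illustrative, not a standard library routine) makes this concrete for a uniform distribution on a four-element set:

    import math

    def renyi_entropy(p, alpha, base=2):
        """Rényi entropy of order alpha (alpha != 1) of a probability vector p."""
        total = sum(pi ** alpha for pi in p if pi > 0)
        return math.log(total, base) / (1 - alpha)

    uniform = [0.25] * 4
    print(renyi_entropy(uniform, 0))   # about 2.0 = log2(4), the Hartley function
    print(renyi_entropy(uniform, 2))   # also about 2.0: all Rényi orders agree on a uniform distribution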

The Hartley function can also be viewed as a more primitive notion, however, since, as emphasized by Kolmogorov and Rényi (see George J. Klir, "Uncertainty and Information", p. 423), it can be defined without introducing any notion of probability.

Characterization of the Hartley function

The Hartley function only depends on the number of elements in a set, and hence can be viewed as a function on natural numbers. Rényi showed that the Hartley function in base 2 is the only function mapping natural numbers to real numbers that satisfies

  1. H(mn) = H(m) + H(n) (additivity)
  2. H(m) \leq H(m+1) (monotonicity)
  3. H(2) = 1 (normalization)

Condition 1 says that the uncertainty of the Cartesian product of two finite sets A and B is the sum of the uncertainties of A and B. Condition 2 says that a larger set has at least as much uncertainty as a smaller one.
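As a quick numerical sanity check, the following Python sketch verifies that H(n) = log2(n) satisfies all three conditions (additivity only up to floating-point rounding):

    import math

    H = math.log2  # candidate function H(n) = log2(n)

    # Condition 1 (additivity): H(mn) = H(m) + H(n), up to rounding error
    assert abs(H(6 * 35) - (H(6) + H(35))) < 1e-12
    # Condition 2 (monotonicity): H(m) <= H(m + 1)
    assert all(H(m) <= H(m + 1) for m in range(1, 10_000))
    # Condition 3 (normalization): H(2) = 1
    assert H(2) == 1.0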

Derivation of the Hartley function

We want to show that the Hartley function, log2(n), is the only function mapping natural numbers to real numbers that satisfies

  1. H(mn) = H(m)+H(n)\, (additivity)
  2. H(m) \leq H(m+1)\, (monotonicity)
  3. H(2)=1\, (normalization)

Let f be a function on positive integers that satisfies the above three properties. From the additivity property, it follows that for any positive integers n and k,

f(n^k) = kf(n).
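This identity can be seen by splitting off one factor at a time and applying additivity at each step:

f(n^k) = f(n^{k-1} \cdot n) = f(n^{k-1}) + f(n) = \cdots = kf(n).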

Let a and b be integers greater than 1, and let t be a positive integer. There is a unique non-negative integer s determined by

a^s \leq b^t < a^{s+1}. \qquad (1)

Therefore,

s \log_2 a \leq t \log_2 b < (s+1) \log_2 a

and

\frac{s}{t} \leq \frac{\log_2 b}{\log_2 a} < \frac{s+1}{t}.
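As a concrete illustration, the following Python sketch (the helper name bracket is illustrative) computes s for a = 2 and b = 3, showing the bounds s/t and (s+1)/t squeezing log2(3) into an interval of width 1/t:

    import math

    def bracket(a, b, t):
        """Return the unique s with a**s <= b**t < a**(s + 1), plus the bounds s/t and (s + 1)/t."""
        power_bt = b ** t
        s = 0
        while a ** (s + 1) <= power_bt:
            s += 1
        return s, s / t, (s + 1) / t

    for t in (5, 50, 500):
        s, lower, upper = bracket(2, 3, t)
        print(t, s, lower, math.log2(3), upper)   # lower <= log2(3) < upper; the interval has width 1/t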

On the other hand, since f is non-decreasing by monotonicity, Equation (1) gives

f(a^s) \leq f(b^t) \leq f(a^{s+1}).

Applying the identity f(n^k) = kf(n), this becomes

s f(a) \leq t f(b) \leq (s+1) f(a),

and, since f(a) \geq f(2) = 1 > 0,

\frac{s}{t} \leq \frac{f(b)}{f(a)} \leq \frac{s+1}{t}.

Hence,

\Big\vert \frac{f(b)}{f(a)} - \frac{\log_2 b}{\log_2 a} \Big\vert \leq \frac{1}{t}.

Since t can be taken arbitrarily large while a and b remain fixed, the difference on the left-hand side of the above inequality must be zero,

\frac{f(b)}{f(a)} = \frac{\log_2 b}{\log_2 a}.

Hence the ratio f(a) / \log_2 a has the same value for every integer a > 1, that is,

f(a) = \mu \log_2 a

for some constant \mu. Taking a = 2 shows that \mu = f(2) = 1 by the normalization property. Finally, additivity gives f(1) = f(1 \cdot 1) = 2f(1), so f(1) = 0 = \log_2 1, and therefore f(n) = \log_2 n for all natural numbers n.

References

This article incorporates material from Hartley function on PlanetMath, which is licensed under the Creative Commons Attribution/Share-Alike License.

This article incorporates material from Derivation of Hartley function on PlanetMath, which is licensed under the Creative Commons Attribution/Share-Alike License.

