Minimum Fisher information

In information theory, the principle of minimum Fisher information (MFI) is a variational principle which, when applied with the constraints needed to reproduce empirically known expectation values, determines the probability distribution that best characterizes the system. (See also Fisher information.)

Measures of information

Information measures (IMs) are among the most important tools of information theory. They quantify either the amount of positive information, or the amount of "missing" information, that an observer possesses with regard to a system of interest. The best-known IM is the Shannon entropy (1948), which determines how much additional information the observer still requires in order to have complete knowledge of a given system S, when all that is available is a probability density function (PD) defined on the appropriate elements of that system. The Shannon entropy is thus a measure of "missing" information, and it is a functional of the PD alone. If the observer does not possess such a PD, but only a finite set of empirically determined mean values of the system, then a fundamental scientific principle, the Maximum Entropy principle (MaxEnt), asserts that the "best" PD is the one that reproduces the known expectation values while otherwise maximizing Shannon's IM.
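As a brief sketch of the MaxEnt prescription (standard material; the constraint functions A_k are generic placeholders introduced here for illustration): one maximizes the Shannon entropy

    S[p] = -\int p(x)\,\ln p(x)\,dx

subject to normalization and to M empirically known mean values \langle A_k \rangle = \int A_k(x)\,p(x)\,dx. Introducing Lagrange multipliers \alpha, \lambda_1, \ldots, \lambda_M and setting the variation to zero,

    \delta\Big[\, S[p] - \alpha \int p(x)\,dx - \sum_k \lambda_k \int A_k(x)\,p(x)\,dx \,\Big] = 0,

gives -\ln p(x) - 1 - \alpha - \sum_k \lambda_k A_k(x) = 0, i.e. the exponential family

    p(x) = Z^{-1} \exp\Big(-\sum_k \lambda_k A_k(x)\Big),

with Z fixed by normalization and the \lambda_k fixed by the known expectation values.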

Fisher's information measure

Fisher's information measure (FIM), named after Ronald Fisher (1925), is a different kind of measure in two respects:

1) it reflects the amount of (positive) information possessed by the observer;
2) it depends not only on the PD but also on its first derivatives, a property that makes it a local quantity (Shannon's is instead a global one); the explicit form is given below.
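In standard notation, for a one-parameter family of densities p(x|\theta) the FIM reads

    I(\theta) = \int dx\; p(x|\theta)\,\Big[\frac{\partial \ln p(x|\theta)}{\partial \theta}\Big]^2,

and for a translation family p(x|\theta) = p(x - \theta) it reduces to the shift-invariant functional

    I[p] = \int dx\; \frac{[p'(x)]^2}{p(x)}.

The explicit dependence on the derivative p' is what makes the measure local: sharp gradients of the PD contribute heavily, whereas Shannon's entropy, containing no derivatives, is insensitive to local rearrangements of the PD.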

The counterpart of MaxEnt is now the minimization of the FIM, since Fisher's measure grows when Shannon's diminishes, and vice versa. The minimization referred to here (MFI) is an important theoretical tool in a manifold of disciplines, beginning with physics. In a sense it is clearly superior to MaxEnt, because the latter procedure always yields an exponential PD as its solution, while the MFI solution is a differential equation for the PD, which allows for greater flexibility and versatility.
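The variational character of the MFI can be illustrated numerically. The following sketch (illustrative only, not taken from the cited works; the function names are chosen here) checks a classic special case: among PDs with the same variance, the Gaussian attains the smallest shift-invariant Fisher information (Stam's inequality, I >= 1/\sigma^2), and is therefore the MFI solution when the second moment is the only constraint.

    import numpy as np

    def fisher_information(p, dx):
        """Shift-invariant Fisher information of a density sampled on a uniform grid."""
        dp = np.gradient(p, dx)
        mask = p > 1e-12               # avoid division by ~0 in the far tails
        return np.sum(dp[mask] ** 2 / p[mask]) * dx

    x = np.linspace(-20.0, 20.0, 40001)
    dx = x[1] - x[0]
    sigma = 1.0

    # Gaussian with variance sigma^2: analytically I = 1/sigma^2 (the minimum).
    gauss = np.exp(-x**2 / (2.0 * sigma**2)) / (sigma * np.sqrt(2.0 * np.pi))

    # Laplace density rescaled to the same variance (var = 2 b^2): analytically I = 2/sigma^2.
    b = sigma / np.sqrt(2.0)
    laplace = np.exp(-np.abs(x) / b) / (2.0 * b)

    print(fisher_information(gauss, dx))    # ~1.0
    print(fisher_information(laplace, dx))  # ~2.0

Both densities share the same variance, yet the Gaussian's Fisher information is half that of the Laplace PD, in agreement with the analytic values.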

Applications of the MFI

Thermodynamics

Much effort has been devoted to Fisher's information measure, shedding much light on its manifold physical applications [1][2][3][4][5][6][7][8][9][10][11][12][13][14][15]. As a small sample, it can be shown that the whole field of thermodynamics (both equilibrium and non-equilibrium) can be derived from the MFI approach [16]. Here the FIM is specialized to the particular but important case of translation families, i.e., distribution functions whose form does not change under translational transformations. In this case the Fisher measure becomes shift-invariant. Minimizing the Fisher measure then leads to a Schrödinger-like equation for the probability amplitude, where the ground state describes equilibrium physics and the excited states account for non-equilibrium situations [17].
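Schematically (a sketch of the construction in [16][17], with constants absorbed into the multipliers): writing the PD as p(x) = \psi(x)^2, the shift-invariant measure becomes

    I[p] = 4 \int dx\; [\psi'(x)]^2.

Minimizing it subject to normalization (multiplier \lambda_0) and to the known mean values \langle A_k \rangle (multipliers \lambda_k) gives the Euler-Lagrange equation

    -\psi''(x) + \frac{1}{4} \sum_k \lambda_k A_k(x)\, \psi(x) = -\frac{\lambda_0}{4}\, \psi(x),

a Schrödinger-like eigenvalue equation in which the constraint functions A_k play the role of an effective potential.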

Scale-invariant phenomena

More recently, Zipf's law has been shown to arise as the variational solution of the MFI when scale invariance is introduced in the measure, providing for the first time an explanation of this regularity from first principles [18]. It has also been shown that the MFI can be used to formulate a thermodynamics based on scale invariance instead of translational invariance, allowing the definition of the Scale-Free Ideal Gas, the scale-invariant counterpart of the Ideal Gas [19].
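A quick consistency check (a heuristic, not the full variational derivation of [18]): the change of variable u = \ln x maps scale transformations x \to \lambda x into translations u \to u + \ln\lambda, with densities transforming as q(u) = x\,p(x). Zipf's law then corresponds to the flat, translation-invariant solution in the logarithmic variable:

    p(x) \propto 1/x \;\Longleftrightarrow\; q(u) = \mathrm{const}.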

References

  1. ^ B. R. Frieden, Science from Fisher Information, Cambridge University Press, Cambridge, England, 2004.
  2. ^ B. R. Frieden, Am. J. Phys. 57 (1989) 1004.
  3. ^ B. R. Frieden, Phys. Lett. A 169 (1992) 123.
  4. ^ B. R. Frieden, in Advances in Imaging and Electron Physics, edited by P. W. Hawkes, Academic, New York, 1994, Vol. 90, pp. 123–204.
  5. ^ B. R. Frieden, Physica A 198 (1993) 262.
  6. ^ B. R. Frieden and R. J. Hughes, Phys. Rev. E 49 (1994) 2644.
  7. ^ B. Nikolov and B. R. Frieden, Phys. Rev. E 49 (1994) 4815.
  8. ^ B. R. Frieden, Phys. Rev. A 41 (1990) 4265.
  9. ^ B. R. Frieden and B. H. Soffer, Phys. Rev. E 52 (1995) 2274.
  10. ^ B. R. Frieden, Found. Phys. 21 (1991) 757.
  11. ^ R. N. Silver, in E. T. Jaynes: Physics and Probability, edited by W. T. Grandy, Jr. and P. W. Milonni, Cambridge University Press, Cambridge, England, 1992.
  12. ^ A. Plastino, A. R. Plastino, H. G. Miller, and F. C. Khanna, Phys. Lett. A 221 (1996) 29.
  13. ^ A. R. Plastino and A. Plastino, Phys. Rev. E 54 (1996) 4423.
  14. ^ A. R. Plastino, A. Plastino, and H. G. Miller, Phys. Rev. E 56 (1997) 3927.
  15. ^ A. Plastino, A. R. Plastino, and H. G. Miller, Phys. Lett. A 235 (1997) 129.
  16. ^ B. R. Frieden, A. Plastino, A. R. Plastino, and B. H. Soffer, Phys. Rev. E 60 (1999) 48.
  17. ^ B. R. Frieden, A. Plastino, A. R. Plastino, and B. H. Soffer, Phys. Rev. E 66 (2002) 046128.
  18. ^ A. Hernando, D. Puigdomènech, D. Villuendas, C. Vesperinas, and A. Plastino, Zipf's law from a Fisher variational-principle, to be published in Physics Letters A (http://arxiv.org/pdf/0908.0501).
  19. ^ A. Hernando, C. Vesperinas, and A. Plastino, Fisher information and the thermodynamics of scale-invariant systems, to be published in Physica A (http://arxiv.org/pdf/0908.0504).
