Method of moments (probability theory)
In probability theory, the method of moments is a way of proving convergence in distribution by proving convergence of a sequence of moment sequences.[1] Suppose X is a random variable and that all of its moments

E(X^k), k = 1, 2, 3, …,

exist. Further suppose that the probability distribution of X is completely determined by its moments, i.e., there is no other probability distribution with the same sequence of moments (cf. the problem of moments). If X_1, X_2, … is a sequence of random variables, each having moments of all orders, and

lim_{n→∞} E(X_n^k) = E(X^k)

for all values of k, then the sequence {X_n} converges to X in distribution.
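As a quick numerical illustration of the theorem (a minimal sketch, not from the article: the choice of Uniform(0,1) summands, the sample count, and the function names are all assumptions), the following Python snippet estimates the moments of the standardized sum X_n = (S_n − n/2)/√(n/12), where S_n is a sum of n i.i.d. Uniform(0,1) variables, and compares them with the limiting standard normal moments E(Z^k), which are 0 for odd k and (k − 1)!! for even k:

```python
# Illustrative sketch: moments of a standardized sum of Uniform(0,1)
# variables approaching the standard normal moments as n grows.
import numpy as np

rng = np.random.default_rng(0)

def standardized_sum_moments(n, ks, samples=200_000):
    """Monte Carlo estimates of E(X_n^k) for the standardized sum X_n."""
    u = rng.random((samples, n))               # i.i.d. Uniform(0, 1) draws
    s = u.sum(axis=1)                          # S_n for each sample
    x = (s - n * 0.5) / np.sqrt(n / 12.0)      # mean 1/2, variance 1/12 per term
    return {k: float(np.mean(x ** k)) for k in ks}

def normal_moment(k):
    """E(Z^k) for Z ~ N(0, 1): zero for odd k, (k - 1)!! for even k."""
    if k % 2 == 1:
        return 0.0
    m = 1.0
    for j in range(k - 1, 0, -2):
        m *= j
    return m

ks = [1, 2, 3, 4, 6]
for n in [1, 5, 50]:
    est = standardized_sum_moments(n, ks)
    print(f"n={n:3d}", {k: round(est[k], 3) for k in ks})
print("limit ", {k: normal_moment(k) for k in ks})
```

Since the normal distribution is determined by its moments, the observed convergence of each E(X_n^k) is, by the theorem above, enough to conclude convergence in distribution, recovering a special case of the central limit theorem.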
The method of moments was introduced by Pafnuty Chebyshev for proving the central limit theorem; Chebyshev cited earlier contributions by Irénée-Jules Bienaymé.[2] More recently, it was applied by Eugene Wigner to prove Wigner's semicircle law, and it has since found numerous applications in the theory of random matrices.[3]
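The semicircle-law application proceeds in the same spirit: one shows that the empirical moments of the rescaled eigenvalues converge to the moments of the semicircle distribution on [−2, 2], which are 0 for odd k and the Catalan number C_{k/2} for even k. The following Python sketch checks this numerically (the matrix size, the Gaussian entries, and the symmetrization used here are illustrative assumptions, not details taken from the cited sources):

```python
# Illustrative sketch: empirical eigenvalue moments of a random symmetric
# matrix versus the semicircle moments (Catalan numbers).
import numpy as np
from math import comb

rng = np.random.default_rng(1)

def catalan(m):
    """The m-th Catalan number C_m = (2m choose m) / (m + 1)."""
    return comb(2 * m, m) // (m + 1)

n = 1000
a = rng.standard_normal((n, n))
h = (a + a.T) / np.sqrt(2)                  # symmetric; off-diagonal variance 1
eigs = np.linalg.eigvalsh(h) / np.sqrt(n)   # rescaled eigenvalues

for k in [2, 4, 6, 8]:
    print(k, round(float(np.mean(eigs ** k)), 3), "-> Catalan:", catalan(k // 2))
```

The odd empirical moments are close to 0 and are omitted; the even ones approach 1, 2, 5, 14, … as n grows.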
Notes
- ^ Prokhorov, A. V. "Moments, method of (in probability theory)". In M. Hazewinkel (ed.), Encyclopaedia of Mathematics (online). Springer. ISBN 1402006098. MR 1375697. http://eom.springer.de/m/m064610.htm.
- ^ Fischer, H. (2011). "4. Chebyshev's and Markov's Contributions". A History of the Central Limit Theorem: From Classical to Modern Probability Theory. Sources and Studies in the History of Mathematics and Physical Sciences. New York: Springer. ISBN 978-0-387-87856-0. MR 2743162.
- ^ Anderson, G. W.; Guionnet, A.; Zeitouni, O. (2010). "2.1". An Introduction to Random Matrices. Cambridge: Cambridge University Press. ISBN 978-0-521-19452-5.