Stein's example

Stein's example, sometimes referred to as Stein's phenomenon or Stein's paradox, is a surprising effect observed in decision theory and estimation theory. Simply stated, it demonstrates that when three or more parameters are estimated simultaneously, there exist combined estimators that are more accurate on average (that is, having lower expected mean squared error) than any method which handles the parameters separately. This is surprising because the parameters and the measurements might be totally unrelated. The phenomenon is named after its discoverer, Charles Stein.

Formal statement

Let \(\boldsymbol\theta\) be a vector consisting of \(n \ge 3\) unknown parameters. To estimate these parameters, a single measurement \(X_i\) is performed for each parameter \(\theta_i\), resulting in a vector \(\mathbf{X}\) of length \(n\). Suppose the measurements are independent Gaussian random variables with mean \(\boldsymbol\theta\) and variance 1, i.e.,

\[
\mathbf{X} \sim N(\boldsymbol\theta, I).
\]

Thus, each parameter is estimated using a single noisy measurement, and each measurement is equally inaccurate.

Under such conditions, it is most intuitive (and most common) to use each measurement as an estimate of its corresponding parameter. This so-called "ordinary" decision rule can be written as

\[
\hat{\boldsymbol\theta} = \mathbf{X}.
\]

The quality of such an estimator is measured by its risk function. A commonly used risk function is the mean squared error, defined as

\[
E\left[ \left\| \boldsymbol\theta - \hat{\boldsymbol\theta} \right\|^2 \right].
\]

Surprisingly, it turns out that the "ordinary" estimator proposed above is suboptimal in terms of mean squared error. In other words, in the setting discussed here, there exist alternative estimators which "always" achieve lower mean squared error, no matter what the value of \(\boldsymbol\theta\) is.

More precisely, an estimator \(\hat{\boldsymbol\theta}_1\) is said to dominate another estimator \(\hat{\boldsymbol\theta}_2\) if, for all values of \(\boldsymbol\theta\), the risk of \(\hat{\boldsymbol\theta}_1\) is lower than, or equal to, the risk of \(\hat{\boldsymbol\theta}_2\), "and" if the inequality is strict for some \(\boldsymbol\theta\). An estimator is said to be admissible if no other estimator dominates it; otherwise it is "inadmissible". Thus, Stein's example can be stated simply as follows: "The ordinary decision rule for estimating the mean of a multivariate Gaussian distribution is inadmissible under mean squared error risk."

Many simple, practical estimators achieve better performance than the ordinary estimator. The best-known example is the James-Stein estimator.
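Under the model above, this improvement can be checked numerically. The following Monte Carlo sketch (an illustration added here, not part of the original article) compares the empirical risk of the ordinary estimator with that of the James-Stein estimator, using the standard shrinkage factor \(1 - (n-2)/\|\mathbf{X}\|^2\); the particular \(\boldsymbol\theta\) is chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 10, 100_000
theta = rng.normal(size=n)  # arbitrary fixed parameter vector

# Each row is one measurement vector X ~ N(theta, I)
X = theta + rng.normal(size=(trials, n))

# "Ordinary" estimator: theta_hat = X
mse_ordinary = np.mean(np.sum((X - theta) ** 2, axis=1))

# James-Stein estimator: shrink X toward the origin
shrink = 1 - (n - 2) / np.sum(X ** 2, axis=1)
js = shrink[:, None] * X
mse_js = np.mean(np.sum((js - theta) ** 2, axis=1))

print(mse_ordinary)  # close to n = 10, the exact risk of the ordinary rule
print(mse_js)        # smaller, as dominance predicts
```

Repeating the experiment with any other choice of \(\boldsymbol\theta\) gives the same qualitative outcome, which is exactly what dominance asserts.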

For a sketch of the proof of this result, see Proof of Stein's example.


Intuitive explanation

Stein's example is surprising, since the "ordinary" decision rule is intuitive and commonly used. In fact, numerous methods for estimator construction, including maximum likelihood estimation, best linear unbiased estimation, least squares estimation and optimal equivariant estimation, all result in the "ordinary" estimator. Yet, as discussed above, this estimator is suboptimal.

To demonstrate the unintuitive nature of Stein's example, consider the following real-world example. Suppose we are to estimate three unrelated parameters, such as the US wheat yield for 1993, the number of spectators at the Wimbledon tennis tournament in 2001, and the weight of a randomly chosen candy bar from the supermarket. Suppose we have independent Gaussian measurements of each of these quantities. Stein's example then tells us that we can obtain a better estimator for the three parameters jointly, in terms of total mean squared error, by using all three unrelated measurements simultaneously.

At first sight it appears that somehow we get a better estimate for US wheat yield by measuring other unrelated statistics such as the number of spectators at Wimbledon and the weight of a candy bar. This is of course absurd; we have not obtained a better estimate for US wheat yield alone, but we have produced an estimator for the means of "all" of the random variables which has a reduced "total" risk. So the cost of a bad estimate in one component can be compensated by a better estimate in another component.

Resolution of the "paradox"

One may ask how the simultaneous measurement of several parameters reduces the total error of the parameters. This stems from the fact that some properties of a distribution can be estimated more accurately when multiple observations are present, even if those observations are statistically independent. For example, consider the squared norm of the parameter vector, \(\|\boldsymbol\theta\|^2\). One might consider estimating this value using \(\|\mathbf{X}\|^2\). However, the expectation of this estimate can be shown to be

\[
E\left[ \|\mathbf{X}\|^2 \right] = \|\boldsymbol\theta\|^2 + n,
\]

so that \(\|\mathbf{X}\|^2\) tends to be an overestimate of \(\|\boldsymbol\theta\|^2\). Furthermore, \(\|\boldsymbol\theta\|^2\) can be estimated more accurately when more parameters are present.
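This identity is easy to verify by simulation. The following minimal Python check (an illustrative sketch, with an arbitrarily chosen \(\boldsymbol\theta\)) draws many independent realizations of \(\mathbf{X}\) and compares the empirical mean of \(\|\mathbf{X}\|^2\) with \(\|\boldsymbol\theta\|^2 + n\):

```python
import numpy as np

rng = np.random.default_rng(1)
theta = np.array([1.0, -2.0, 0.5, 3.0, 0.0])  # arbitrary parameter vector
n = theta.size
trials = 200_000

# Draw many independent measurement vectors X ~ N(theta, I)
X = theta + rng.normal(size=(trials, n))

empirical = np.mean(np.sum(X ** 2, axis=1))
expected = np.sum(theta ** 2) + n  # ||theta||^2 + n

print(empirical, expected)  # the two values nearly agree
```

The empirical average settles near \(\|\boldsymbol\theta\|^2 + n\) rather than \(\|\boldsymbol\theta\|^2\), making the upward bias of \(\|\mathbf{X}\|^2\) concrete.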

It follows from the above equation that the "ordinary" estimate tends to overestimate the norm of the parameters. This can be corrected by shrinking the ordinary estimator, using, for example, the James-Stein estimator.


References

A good introduction to the phenomenon:
* Efron, B.; Morris, C. (1977). "Stein's paradox in statistics". Scientific American 236 (5): 119–127.
A textbook with an extensive discussion of Stein-type estimators:
* Lehmann, E. L.; Casella, G. (1998). Theory of Point Estimation (2nd ed.), ch. 5.

Stein's original paper:
* Stein, C. (1956). "Inadmissibility of the usual estimator for the mean of a multivariate distribution". Proc. Third Berkeley Symp. Math. Statist. Prob. 1: 197–206.

See also

* James-Stein estimator
* Decision theory

Wikimedia Foundation. 2010.
