Mean squared prediction error

In statistics, the mean squared prediction error (MSPE) of a smoothing procedure is the expected sum of squared deviations of the fitted values \widehat{g}(x_i) from the values of the (unobservable) function g(x_i). If the smoothing procedure has operator matrix L, then

\operatorname{MSPE}(L)=\operatorname{E}\left[\sum_{i=1}^n\left( g(x_i)-\widehat{g}(x_i)\right)^2\right].

Just as the mean squared error can be decomposed into bias and variance, the MSPE can be decomposed into two terms: the sum of squared biases of the fitted values and the sum of variances of the fitted values:

\operatorname{MSPE}(L)=\sum_{i=1}^n\left(\operatorname{E}\left[\widehat{g}(x_i)\right]-g(x_i)\right)^2+\sum_{i=1}^n\operatorname{var}\left[\widehat{g}(x_i)\right].

Note that knowledge of g is required in order to calculate MSPE exactly.
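
As a concrete illustration (not part of the original article), the following sketch checks this bias-variance decomposition numerically for a simple linear smoother. The true function g, the design points, the noise level and the moving-average operator matrix L are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
n, sigma = 50, 0.3
x = np.linspace(0, 1, n)
g = np.sin(2 * np.pi * x)                  # assumed "true" (normally unobservable) function

# Operator matrix L of a crude 5-point moving-average smoother (illustrative choice)
L = np.zeros((n, n))
for i in range(n):
    window = np.arange(max(0, i - 2), min(n, i + 3))
    L[i, window] = 1.0 / window.size

# Monte Carlo approximation of MSPE(L) = E[ sum_i (g(x_i) - ghat(x_i))^2 ]
reps = 2000
fits = np.array([L @ (g + sigma * rng.standard_normal(n)) for _ in range(reps)])
mspe_mc = np.mean(np.sum((fits - g) ** 2, axis=1))

# Decomposition: sum of squared biases plus sum of variances of the fitted values
bias_sq = np.sum((fits.mean(axis=0) - g) ** 2)
total_var = np.sum(fits.var(axis=0))
print(mspe_mc, bias_sq + total_var)        # the two numbers should agree up to Monte Carlo error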

Estimation of MSPE

For the model y_i = g(x_i) + \sigma\varepsilon_i, where \varepsilon_i\sim\mathcal{N}(0,1), one may write

\operatorname{MSPE}(L)=g'(I-L)'(I-L)g+\sigma^2\operatorname{tr}\left[L'L\right].
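
As an illustration (not from the original text), this expression can be evaluated directly once L, g and \sigma are fixed; the straight-line hat matrix, the function g and the noise level used below are assumptions made for the example.

import numpy as np

n, sigma = 50, 0.3
x = np.linspace(0, 1, n)
g = np.sin(2 * np.pi * x)                        # assumed known "true" function

# Hat matrix of a straight-line least-squares fit, used here as the operator matrix L
X = np.column_stack([np.ones(n), x])
L = X @ np.linalg.solve(X.T @ X, X.T)

I = np.eye(n)
bias_term = g @ (I - L).T @ (I - L) @ g          # g'(I-L)'(I-L)g
variance_term = sigma ** 2 * np.trace(L.T @ L)   # sigma^2 tr[L'L]
print("MSPE(L) =", bias_term + variance_term)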

The first (squared-bias) term is equivalent to

\sum_{i=1}^n\left(\operatorname{E}\left[\widehat{g}(x_i)\right]-g(x_i)\right)^2
=\operatorname{E}\left[\sum_{i=1}^n\left(y_i-\widehat{g}(x_i)\right)^2\right]-\sigma^2\operatorname{tr}\left[\left(I-L\right)'\left(I-L\right)\right].

Thus,

\operatorname{MSPE}(L)=\operatorname{E}\left[\sum_{i=1}^n\left(y_i-\widehat{g}(x_i)\right)^2\right]-\sigma^2\left(n-2\operatorname{tr}\left[L\right]\right).
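
A quick simulation (an illustrative sketch, not from the article) can verify this identity: the expected residual sum of squares minus \sigma^2(n - 2\operatorname{tr}[L]) should match the exact MSPE computed from the matrix expression above. The moving-average smoother, g and \sigma are assumptions made for the example.

import numpy as np

rng = np.random.default_rng(1)
n, sigma = 50, 0.3
x = np.linspace(0, 1, n)
g = np.sin(2 * np.pi * x)

# 5-point moving-average smoother as an example operator matrix L
L = np.zeros((n, n))
for i in range(n):
    window = np.arange(max(0, i - 2), min(n, i + 3))
    L[i, window] = 1.0 / window.size

# Left-hand side: E[RSS] - sigma^2 (n - 2 tr L), approximated by simulation
reps = 5000
rss = np.empty(reps)
for r in range(reps):
    y = g + sigma * rng.standard_normal(n)
    rss[r] = np.sum((y - L @ y) ** 2)
lhs = rss.mean() - sigma ** 2 * (n - 2 * np.trace(L))

# Right-hand side: exact MSPE(L) from the matrix expression
I = np.eye(n)
rhs = g @ (I - L).T @ (I - L) @ g + sigma ** 2 * np.trace(L.T @ L)
print(lhs, rhs)                                  # should agree up to Monte Carlo error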

If σ² is known or well-estimated by \widehat{\sigma}^2, it becomes possible to estimate the MSPE by

\operatorname{\widehat{MSPE}}(L)=\sum_{i=1}^n\left(y_i-\widehat{g}(x_i)\right)^2-\widehat{\sigma}^2\left(n-2\operatorname{tr}\left[L\right]\right).
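
The following sketch (an illustrative setup, not from the article) computes this estimate from a single simulated sample and compares it with the exact MSPE; the cubic-polynomial hat matrix, the true g and the noise level are assumptions, and \sigma^2 is treated as known for simplicity.

import numpy as np

rng = np.random.default_rng(2)
n, sigma = 50, 0.3
x = np.linspace(0, 1, n)
g = np.sin(2 * np.pi * x)
y = g + sigma * rng.standard_normal(n)           # one observed sample y_i = g(x_i) + sigma*eps_i

# Hat matrix of a cubic polynomial fit, used here as the smoother L
X = np.column_stack([x ** k for k in range(4)])
L = X @ np.linalg.solve(X.T @ X, X.T)
g_hat = L @ y

rss = np.sum((y - g_hat) ** 2)
mspe_hat = rss - sigma ** 2 * (n - 2 * np.trace(L))   # estimated MSPE (sigma^2 treated as known)

# Exact MSPE for comparison (possible only because g is known in the simulation)
I = np.eye(n)
mspe_exact = g @ (I - L).T @ (I - L) @ g + sigma ** 2 * np.trace(L.T @ L)
print(mspe_hat, mspe_exact)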

Colin Mallows advocated this method in the construction of his model selection statistic C_p, which is a normalized version of the estimated MSPE:

C_p=\frac{\sum_{i=1}^n\left(y_i-\widehat{g}(x_i)\right)^2}{\widehat{\sigma}^2}-n+2\operatorname{tr}\left[L\right],

where p comes from the fact that the number of parameters p estimated for a parametric smoother is given by p=\operatorname{tr}\left[L\right], and C is in honor of Cuthbert Daniel.
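
As an illustration (not from the original article), the sketch below computes C_p for polynomial smoothers of increasing degree and uses it to compare them; the data-generating setup is an assumption, and σ² is treated as known rather than estimated.

import numpy as np

rng = np.random.default_rng(3)
n, sigma = 50, 0.3
x = np.linspace(0, 1, n)
g = np.sin(2 * np.pi * x)
y = g + sigma * rng.standard_normal(n)

for degree in range(1, 7):
    # Hat matrix of a polynomial fit of the given degree; tr[L] = degree + 1 = p
    X = np.column_stack([x ** k for k in range(degree + 1)])
    L = X @ np.linalg.solve(X.T @ X, X.T)
    rss = np.sum((y - L @ y) ** 2)
    cp = rss / sigma ** 2 - n + 2 * np.trace(L)   # sigma^2 treated as known here
    print(degree, round(cp, 2))                   # a smaller C_p indicates a better bias-variance trade-off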
