Residual sum of squares

In statistics, the residual sum of squares (RSS) is the sum of the squares of residuals. It measures the discrepancy between the data and an estimation model: the smaller the RSS, the more closely the model fits the data.

RSS = \sum_{i=1}^{n} \left(y_i - f(x_i)\right)^2,

where y_i is the ith observed value and f(x_i) is the value predicted by the model.
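A minimal sketch of this definition in Python; the observations and model predictions below are made-up values standing in for y_i and f(x_i):

```python
import numpy as np

def residual_sum_of_squares(y, y_pred):
    """RSS = sum over i of (y_i - f(x_i))^2."""
    y = np.asarray(y, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.sum((y - y_pred) ** 2))

# Illustrative data: observed values y_i and model predictions f(x_i).
y_obs = [2.0, 3.9, 6.1, 8.2]
y_fit = [2.1, 4.0, 5.9, 8.0]
print(residual_sum_of_squares(y_obs, y_fit))  # ~0.10
```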

In a standard regression model y_i = a + b x_i + \varepsilon_i, where a and b are coefficients, y and x are the regressand and the regressor, respectively, and \varepsilon is the error term, the sum of squares of residuals is the sum of the squares of the estimates of \varepsilon_i; that is,

RSS = \sum_{i=1}^{n} \left(y_i - (a + b x_i)\right)^2.
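A sketch of this simple regression case, assuming a and b are estimated by ordinary least squares (here via numpy.polyfit); the data values are illustrative only:

```python
import numpy as np

# Illustrative data for the model y_i = a + b*x_i + error.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 5.9, 8.2, 9.8])

# polyfit with degree 1 returns the estimated slope b and intercept a.
b, a = np.polyfit(x, y, 1)

# RSS of the fitted line.
rss = np.sum((y - (a + b * x)) ** 2)
print(f"a = {a:.3f}, b = {b:.3f}, RSS = {rss:.4f}")
```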

For ordinary least squares regression with an intercept term, the total sum of squares decomposes as: total sum of squares = explained sum of squares + residual sum of squares.
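A quick numerical check of this decomposition, again assuming an ordinary least squares fit with an intercept (the identity does not hold for arbitrary models) and the same made-up data as above:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 5.9, 8.2, 9.8])

b, a = np.polyfit(x, y, 1)   # OLS fit with an intercept
y_hat = a + b * x

tss = np.sum((y - y.mean()) ** 2)      # total sum of squares
ess = np.sum((y_hat - y.mean()) ** 2)  # explained sum of squares
rss = np.sum((y - y_hat) ** 2)         # residual sum of squares

print(tss, ess + rss)  # equal up to floating-point rounding
```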

See also

* Sum of squares

