Explained sum of squares
In statistics, the explained sum of squares (ESS) is the sum of the squared deviations of the predicted values from the mean of the response variable in a standard regression model, for example

$y_i = a + b x_i + e_i,$

where $y_i$ is the response variable, $x_i$ is the explanatory variable, $a$ and $b$ are coefficients, $i$ indexes the observations from $1$ to $n$, and $e_i$ is the error term. In general, the greater the ESS, the better the estimated model performs.

If $\hat{a}$ and $\hat{b}$ are the estimated coefficients, then $\hat{y}_i = \hat{a} + \hat{b} x_i$ is the predicted value of the response variable. The ESS is the sum of the squares of the differences between the predicted values and the grand mean:

$\mathrm{ESS} = \sum_{i=1}^{n} (\hat{y}_i - \bar{y})^2.$
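A minimal numeric sketch of this definition, using hypothetical data and an ordinary least squares fit of a simple linear model:

```python
import numpy as np

# Hypothetical data for a simple linear regression y = a + b*x + e.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Estimate the coefficients by ordinary least squares;
# np.polyfit with degree 1 returns [slope, intercept].
b_hat, a_hat = np.polyfit(x, y, 1)
y_hat = a_hat + b_hat * x  # predicted values

# ESS: sum of squared deviations of the predictions from the grand mean.
ess = np.sum((y_hat - y.mean()) ** 2)
print(ess)
```

The same computation applies unchanged to multiple regression once `y_hat` is obtained from the fitted model.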
In general:

total sum of squares = explained sum of squares + residual sum of squares.

Type I SS
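For ordinary least squares with an intercept, this decomposition holds exactly, which can be checked numerically (hypothetical data):

```python
import numpy as np

# Hypothetical data; fit y = a + b*x by ordinary least squares.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.2, 2.8, 5.1, 6.9, 9.2])
b_hat, a_hat = np.polyfit(x, y, 1)
y_hat = a_hat + b_hat * x

tss = np.sum((y - y.mean()) ** 2)      # total sum of squares
ess = np.sum((y_hat - y.mean()) ** 2)  # explained sum of squares
rss = np.sum((y - y_hat) ** 2)         # residual sum of squares

# For an OLS fit that includes an intercept, TSS = ESS + RSS.
assert np.isclose(tss, ess + rss)
```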
Type I estimates of the sum of squares explained by a variable are obtained when the sums of squares for a model are calculated sequentially, in the order in which the terms enter the model. For the model $Y = aX_1 + bX_2 + cX_3$: the sum of squares for $a$ is calculated using the model $Y = aX_1$, the sum of squares for $b$ is calculated using the model $Y = aX_1 + bX_2$, and the sum of squares for $c$ is calculated using the full model $Y = aX_1 + bX_2 + cX_3$.
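The sequential procedure can be sketched as the reduction in residual sum of squares as each term is added in order. The data here are simulated for illustration; `rss` is a hypothetical helper, not a standard library function:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
# Hypothetical predictors X1, X2, X3 and response with unit effects plus noise.
X1, X2, X3 = rng.normal(size=(3, n))
Y = X1 + X2 + X3 + rng.normal(scale=0.5, size=n)

def rss(columns):
    """Residual sum of squares of an OLS fit of Y on an intercept plus columns."""
    X = np.column_stack([np.ones(n)] + list(columns))
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return np.sum((Y - X @ beta) ** 2)

# Type I (sequential) SS: reduction in RSS as each term enters, in order.
ss_a = rss([]) - rss([X1])
ss_b = rss([X1]) - rss([X1, X2])
ss_c = rss([X1, X2]) - rss([X1, X2, X3])
print(ss_a, ss_b, ss_c)
```

Because the models are nested, each reduction is nonnegative; the values depend on the order in which the terms enter.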
Type III SS
The Type III sum of squares is calculated by comparing the full model to the full model without the variable of interest, so it represents the additional variability explained by adding that variable. It equals the Type I SS when the variable is the last one entered into the model.
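For continuous predictors, this full-versus-reduced comparison can be sketched as below (for categorical factors, Type III SS additionally depends on the contrast coding, which this simplified sketch does not cover). The data are simulated and `rss` is a hypothetical helper:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 60
# Hypothetical predictors; X2 has a true effect of 1, X3 has none.
X1, X2, X3 = rng.normal(size=(3, n))
Y = 2 * X1 + X2 + rng.normal(size=n)

def rss(columns):
    """Residual sum of squares of an OLS fit of Y on an intercept plus columns."""
    X = np.column_stack([np.ones(n)] + list(columns))
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return np.sum((Y - X @ beta) ** 2)

# Type III SS for X2: full model versus the full model with X2 removed.
ss_x2 = rss([X1, X3]) - rss([X1, X2, X3])
print(ss_x2)
```

Note that for the last variable in the sequence, the Type I comparison (all earlier terms versus all terms) is literally the same pair of models as the Type III comparison, which is why the two agree there.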
See also
* Sum of squares