- Heteroscedasticity
In statistics, a sequence or a vector of random variables is heteroskedastic, or heteroscedastic, if the random variables have different variances. The complementary concept is called homoskedasticity. The term means "differing variance" and comes from the Greek "hetero" ('different') and "skedasis" ('dispersion').

When using some statistical techniques, such as ordinary least squares (OLS), a number of assumptions are typically made. One of these is that the error term has a constant variance. This will be true if the observations of the error term are assumed to be drawn from identical distributions. Heteroskedasticity is a violation of this assumption. For example, the error term could vary or increase with each observation, something that is often the case with cross-sectional or time series measurements. Heteroskedasticity is often studied as part of econometrics, which frequently deals with data exhibiting it. With the advent of robust standard errors, which allow inference without specifying the conditional second moment of the error term, testing for conditional homoskedasticity is less important than it once was.
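In symbols, for a linear regression with error terms \(\varepsilon_i\) and regressors \(X\) (notation chosen here purely for illustration), the assumption and its violation can be written as

\[
\text{homoskedasticity: } \operatorname{Var}(\varepsilon_i \mid X) = \sigma^2 \ \text{for all } i,
\qquad
\text{heteroskedasticity: } \operatorname{Var}(\varepsilon_i \mid X) = \sigma_i^2, \ \text{varying with } i.
\]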
The econometrician Robert Engle won the 2003 Nobel Memorial Prize in Economic Sciences for his studies on regression analysis in the presence of heteroskedasticity, which led to his formulation of the ARCH (AutoRegressive Conditional Heteroskedasticity) modeling technique.
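For reference, a first-order ARCH model makes the conditional variance of the error term depend on the previous squared error; a standard textbook formulation (the symbols below are not taken from this article) is

\[
\varepsilon_t = \sigma_t z_t, \qquad \sigma_t^2 = \alpha_0 + \alpha_1 \varepsilon_{t-1}^2, \qquad z_t \sim \text{i.i.d.}(0,1), \quad \alpha_0 > 0, \ \alpha_1 \ge 0.
\]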
Consequences
Heteroskedasticity does not cause OLS coefficient estimates to be biased. However, the usual OLS estimates of the variance (and, thus, the standard errors) of the coefficients tend to be underestimated, inflating t-scores and sometimes making insignificant variables appear to be statistically significant.
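A small Monte Carlo sketch in Python illustrates this. The data-generating process below is an assumption chosen for the example (n = 200, slope 2, error standard deviation proportional to x): the slope estimates average out to the true value, while the classical OLS standard error tends to understate their actual sampling variability.

```python
# Monte Carlo sketch: unbiased OLS slope, but misleading classical standard errors.
import numpy as np

rng = np.random.default_rng(0)
n, reps = 200, 2000
slopes, nominal_ses = [], []
for _ in range(reps):
    x = rng.uniform(1, 10, n)
    e = rng.normal(0, x)                 # heteroskedastic: error s.d. grows with x
    y = 1.0 + 2.0 * x + e
    X = np.column_stack([np.ones(n), x])
    XtX_inv = np.linalg.inv(X.T @ X)
    b = XtX_inv @ X.T @ y                # OLS coefficients
    resid = y - X @ b
    s2 = resid @ resid / (n - 2)         # classical error-variance estimate
    slopes.append(b[1])
    nominal_ses.append(np.sqrt(s2 * XtX_inv[1, 1]))

print("mean slope estimate :", np.mean(slopes))       # close to the true value 2
print("true sampling s.d.  :", np.std(slopes))        # actual variability of the estimates
print("mean classical s.e. :", np.mean(nominal_ses))  # understates it for this design
```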
Detection
There are several methods to test for the presence of heteroskedasticity (a sketch of the Breusch-Pagan test appears after the list):
*Park test
*Glejser test (1969)
*White test
*Breusch-Pagan test
*Goldfeld-Quandt test
*Cook-Weisberg test
*Harrison-McCabe test
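As an illustration of one of the tests above, the sketch below applies the Breusch-Pagan test from the Python library statsmodels to simulated data (the data-generating process and variable names are assumptions made for the example):

```python
# A minimal sketch of the Breusch-Pagan test on simulated heteroskedastic data.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(1)
n = 500
x = rng.uniform(1, 10, n)
y = 1.0 + 2.0 * x + rng.normal(0, x)     # error s.d. grows with x

X = sm.add_constant(x)
ols_fit = sm.OLS(y, X).fit()

# het_breuschpagan regresses the squared OLS residuals on the given regressors.
lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(ols_fit.resid, X)
print(f"LM statistic = {lm_stat:.2f}, p-value = {lm_pvalue:.4f}")
# A small p-value rejects the null hypothesis of homoskedasticity.
```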
Fixes
There are three common corrections for heteroskedasticity (the second and third are sketched after the list):
* Use a different specification for the model (different X variables, or perhaps non-linear transformations of the X variables).
* Apply a weighted least squares estimation method, in which OLS is applied to transformed or weighted values of X and Y. The weights vary over observations, depending on the changing error variances.
* Heteroskedasticity-Consistent Standard Errors (HCSE), while still biased, improve upon OLS estimates (White 1980). Generally, HCSEs are greater than their OLS counterparts, resulting in lower t-scores and a reduced probability of finding coefficients statistically significant. The White method corrects for heteroskedasticity without altering the values of the coefficients; it may be preferable to relying on ordinary OLS standard errors because it corrects for heteroskedasticity when it is present and loses little when it is not.
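A sketch of the second and third corrections, again using the Python library statsmodels on simulated data, is given below. The weights 1/x² encode an assumed, known form of the error variance; in practice that form must be guessed or estimated.

```python
# Weighted least squares vs. OLS with heteroskedasticity-consistent (White) standard errors.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 500
x = rng.uniform(1, 10, n)
y = 1.0 + 2.0 * x + rng.normal(0, x)     # error s.d. grows with x
X = sm.add_constant(x)

# Correction 2: weighted least squares, with weights inversely proportional
# to the (assumed known) error variance.
wls_fit = sm.WLS(y, X, weights=1.0 / x**2).fit()

# Correction 3: keep the OLS coefficients, replace the classical
# standard errors with heteroskedasticity-consistent (HC1) ones.
hc_fit = sm.OLS(y, X).fit(cov_type="HC1")

print("WLS slope and s.e.    :", wls_fit.params[1], wls_fit.bse[1])
print("OLS slope and HC1 s.e.:", hc_fit.params[1], hc_fit.bse[1])
```

Note that weighted least squares changes the coefficient estimates themselves, whereas the White/HC correction leaves the OLS coefficients untouched and only adjusts the standard errors.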
Examples
Heteroskedasticity often occurs when there is a large difference among the sizes of the observations.
* The classic example of heteroskedasticity is that of income versus expenditure on meals. As one's income increases, the variability of food consumption will increase. A poorer person will spend a rather constant amount by always eating fast food; a wealthier person may occasionally buy fast food and other times eat an expensive meal. Those with higher incomes display a greater variability of food consumption.
* Imagine you are watching a rocket take off nearby and measuring the distance it has traveled once each second. In the first couple of seconds your measurements may be accurate to the nearest centimeter, say. However, 5 minutes later, as the rocket recedes into space, the accuracy of your measurements may only be good to 100 m, because of the increased distance, atmospheric distortion and a variety of other factors. The data you collect would exhibit heteroskedasticity.
See also
*Kurtosis (peakedness)
*Breusch-Pagan test of heteroskedasticity of the residuals of a linear regression
*Regression analysis
*Homoskedasticity
* Autoregressive conditional heteroskedasticity (ARCH)
*White test
Further reading
Most statistics textbooks will include at least some material on heteroskedasticity. Some examples are:
#Studenmund, A.H., Using Econometrics, 2nd ed., ISBN 0-673-52125-7 (devotes a chapter to heteroskedasticity).
#Verbeek, Marno (2004), A Guide to Modern Econometrics, 2nd ed., Chichester: John Wiley & Sons.
#Greene, W.H. (1993), Econometric Analysis, Prentice-Hall, ISBN 0-13-013297-7, an introductory but thorough general text, considered the standard for a pre-doctorate university econometrics course.
#Hamilton, J.D. (1994), Time Series Analysis, Princeton University Press, ISBN 0-691-04289-6, a standard reference for time series analysis; it contains an introduction to ARCH models.
#White, Halbert (1980), "A Heteroskedasticity-Consistent Covariance Matrix Estimator and a Direct Test for Heteroskedasticity", Econometrica, Vol. 48, pp. 817-838.
Special subjects:
*Glejser test: Furno, Marilena (Universita di Cassino, Italy, 2005), "The Glejser Test and the Median Regression", Sankhya - The Indian Journal of Statistics, Special Issue on Quantile Regression and Related Methods, Vol. 67, Part 2, pp. 335-358: http://sankhya.isical.ac.in/search/67_2/2005015.pdf
*Heteroskedasticity in QSAR Modeling: http://www.qsarworld.com/qsar-statistics-heteroscedasticity.php