Kolmogorov-Smirnov test


In statistics, the Kolmogorov–Smirnov test (also called the K-S test for brevity) is a form of minimum distance estimation used as a nonparametric test of equality of one-dimensional probability distributions. It is used either to compare a sample with a reference probability distribution (one-sample K-S test) or to compare two samples (two-sample K-S test). The Kolmogorov-Smirnov statistic quantifies a distance between the empirical distribution function of the sample and the cumulative distribution function of the reference distribution, or between the empirical distribution functions of two samples. The null distribution of this statistic is calculated under the null hypothesis that the samples are drawn from the same distribution (in the two-sample case) or that the sample is drawn from the reference distribution (in the one-sample case). In each case, the distributions considered under the null hypothesis are continuous but otherwise unrestricted.

The two-sample KS test is one of the most useful and general nonparametric methods for comparing two samples, as it is sensitive to differences in both location and shape of the empirical cumulative distribution functions of the two samples.

The Kolmogorov-Smirnov test can be modified to serve as a goodness-of-fit test. For normality testing, samples are standardised and compared with a standard normal reference distribution. This is equivalent to setting the mean and variance of the reference distribution equal to the sample estimates, and it is known that using the sample to modify the null hypothesis reduces the power of a test. Correcting for this bias leads to the Lilliefors test. However, even Lilliefors' modification is less powerful than the Shapiro-Wilk test or the Anderson-Darling test for testing normality.
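
For illustration, a minimal sketch of this standardise-then-test procedure in Python, assuming NumPy and SciPy's scipy.stats.kstest are available; the simulated sample and all variable names are illustrative only:

    import numpy as np
    from scipy import stats

    # Hypothetical sample whose normality is to be assessed.
    rng = np.random.default_rng(0)
    sample = rng.normal(loc=5.0, scale=2.0, size=200)

    # Standardise with the sample mean and standard deviation, then compare
    # against the standard normal reference distribution.
    standardised = (sample - sample.mean()) / sample.std(ddof=1)
    result = stats.kstest(standardised, "norm")

    # Because the reference distribution was fitted to the same data, the
    # reported p-value is biased upwards; the Lilliefors test corrects this.
    print(result.statistic, result.pvalue)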

Kolmogorov-Smirnov statistic

The empirical distribution function F_n for n i.i.d. observations X_i is defined as

:F_n(x) = \frac{1}{n}\sum_{i=1}^n I_{X_i \le x},

where I_{X_i \le x} is the indicator function.
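
As a sketch of how this definition translates into code (Python with NumPy; the function and variable names are arbitrary), F_n can be evaluated by counting observations:

    import numpy as np

    def empirical_cdf(sample, points):
        """F_n evaluated at the given points: the fraction of observations <= each point."""
        sample = np.asarray(sample)
        return np.array([(sample <= p).mean() for p in np.atleast_1d(points)])

    data = np.array([0.2, 0.5, 0.5, 0.9, 1.3])
    print(empirical_cdf(data, [0.0, 0.5, 1.0, 2.0]))  # [0.0, 0.6, 0.8, 1.0]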

The Kolmogorov-Smirnov statistic for a given cumulative distribution function F(x) is

:D_n = \sup_x |F_n(x) - F(x)|,

where \sup S is the supremum of the set S. By the Glivenko-Cantelli theorem, if the sample comes from distribution F(x), then D_n converges to 0 almost surely. Kolmogorov strengthened this result by effectively providing the rate of this convergence (see below). The Donsker theorem provides a yet stronger result.
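
Because F_n is a step function, for a continuous reference F the supremum can be computed from the values of F at the order statistics. A minimal Python sketch (assuming NumPy and SciPy; all names are illustrative):

    import numpy as np
    from scipy import stats

    def ks_statistic(sample, cdf):
        """D_n = sup_x |F_n(x) - F(x)| for a continuous reference CDF."""
        x = np.sort(np.asarray(sample))
        n = len(x)
        f = cdf(x)
        # The supremum is reached just after or just before a jump of F_n.
        d_plus = np.max(np.arange(1, n + 1) / n - f)
        d_minus = np.max(f - np.arange(0, n) / n)
        return max(d_plus, d_minus)

    rng = np.random.default_rng(1)
    sample = rng.uniform(size=100)
    print(ks_statistic(sample, stats.uniform.cdf))  # small: correct reference
    print(ks_statistic(sample, stats.norm.cdf))     # large: wrong reference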

Kolmogorov distribution

The Kolmogorov distribution is the distribution of the random variable

:K = \sup_{t \in [0,1]} |B(t)|,

where B(t) is the Brownian bridge. The cumulative distribution function of K is given by

:\operatorname{Pr}(K \le x) = 1 - 2\sum_{i=1}^\infty (-1)^{i-1} e^{-2 i^2 x^2} = \frac{\sqrt{2\pi}}{x} \sum_{i=1}^\infty e^{-(2i-1)^2 \pi^2 / (8 x^2)}.
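
A numerical sketch in Python (NumPy), evaluating the first, alternating form of this series truncated after a fixed number of terms; the function name and truncation length are arbitrary choices:

    import numpy as np

    def kolmogorov_cdf(x, terms=100):
        """Pr(K <= x) from the alternating series above, truncated after `terms` terms."""
        if x <= 0:
            return 0.0
        i = np.arange(1, terms + 1)
        return 1.0 - 2.0 * np.sum((-1.0) ** (i - 1) * np.exp(-2.0 * i**2 * x**2))

    # The last two arguments are near the commonly quoted 5% and 1% critical points.
    for x in (0.5, 1.0, 1.3581, 1.6276):
        print(x, round(kolmogorov_cdf(x), 4))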

Kolmogorov-Smirnov test

Under the null hypothesis that the sample comes from the hypothesized distribution F(x),

:\sqrt{n}\, D_n \xrightarrow{n \to \infty} \sup_t |B(F(t))|

in distribution, where B(t) is the Brownian bridge.

If "F" is continuous then under the null hypothesis sqrt{n}D_n converges to the Kolmogorov distribution, which does not depend on "F". This result may also be known as the Kolmogorov theorem; see Kolmogorov's theorem for disambiguation.

The "goodness-of-fit" test or the Kolmogorov-Smirnov test is constructed by using the critical values of the Kolmogorov distribution.

The null hypothesis is rejected at level \alpha if

:\sqrt{n}\, D_n > K_\alpha,

where K_\alpha is found from

:\operatorname{Pr}(K \le K_\alpha) = 1 - \alpha.

The asymptotic power of this test is 1. If the form or parameters of F(x) are determined from the X_i, the inequality may not hold at the stated level. In this case, Monte Carlo or other methods are required to determine the rejection level \alpha.

A more familiar form of the test, found in various references, is

:D_n > \frac{K_\alpha}{\sqrt{n}}.
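
A minimal Python sketch of this decision rule, assuming SciPy's scipy.stats.kstest for the statistic and taking the commonly tabulated asymptotic 5% critical value K_{0.05} \approx 1.3581 as given:

    import numpy as np
    from scipy import stats

    def one_sample_ks_test(sample, cdf, k_alpha=1.3581):
        """Reject H0 if sqrt(n) * D_n > K_alpha, i.e. D_n > K_alpha / sqrt(n).

        The default k_alpha is the commonly tabulated asymptotic 5% critical
        value of the Kolmogorov distribution, Pr(K <= 1.3581) ~ 0.95.
        """
        n = len(sample)
        d_n = stats.kstest(sample, cdf).statistic
        return d_n, d_n > k_alpha / np.sqrt(n)

    rng = np.random.default_rng(3)
    print(one_sample_ks_test(rng.uniform(size=200), stats.uniform.cdf))  # typically not rejected
    print(one_sample_ks_test(rng.normal(size=200), stats.uniform.cdf))   # rejected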

Two-sample Kolmogorov-Smirnov test

The Kolmogorov-Smirnov test may also be used to test whether two underlying one-dimensional probability distributions differ. In this case, the Kolmogorov-Smirnov statistic is

:D_{n,n'} = \sup_x |F_n(x) - F_{n'}(x)|,

and the null hypothesis is rejected at level \alpha if

:\sqrt{\frac{n n'}{n + n'}}\, D_{n,n'} > K_\alpha.
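
A sketch of the two-sample statistic and this asymptotic rejection rule in Python with NumPy; the function name is arbitrary and the 5% critical value 1.3581 is an assumed tabulated constant:

    import numpy as np

    def two_sample_ks(sample1, sample2, k_alpha=1.3581):
        """Two-sample statistic D_{n,n'} and the asymptotic 5% rejection rule."""
        x, y = np.sort(sample1), np.sort(sample2)
        n, m = len(x), len(y)
        # Evaluate both empirical CDFs on the pooled observations; the supremum
        # of their difference is attained at one of these points.
        grid = np.concatenate([x, y])
        f_x = np.searchsorted(x, grid, side="right") / n
        f_y = np.searchsorted(y, grid, side="right") / m
        d = np.max(np.abs(f_x - f_y))
        reject = np.sqrt(n * m / (n + m)) * d > k_alpha
        return d, reject

    rng = np.random.default_rng(4)
    print(two_sample_ks(rng.normal(size=300), rng.normal(size=300)))            # same law
    print(two_sample_ks(rng.normal(size=300), rng.normal(1.0, 1.0, size=300)))  # shifted by 1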

Setting confidence limits for the shape of a distribution function

While the Kolmogorov-Smirnov test is usually used to test whether a given F(x) is the underlying probability distribution of F_n(x), the procedure may be inverted to give confidence limits on F(x) itself. If one chooses a critical value of the test statistic D_\alpha such that \operatorname{Pr}(D_n > D_\alpha) = \alpha, then a band of width \pm D_\alpha around F_n(x) will entirely contain F(x) with probability 1 - \alpha.
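
A sketch of this inversion in Python with NumPy, using the asymptotic approximation D_\alpha \approx K_\alpha / \sqrt{n} for the critical value (an assumption; exact finite-sample values differ slightly):

    import numpy as np

    def ks_confidence_band(sample, k_alpha=1.3581):
        """Asymptotic ~95% confidence band for the true CDF: F_n(x) +/- D_alpha."""
        x = np.sort(np.asarray(sample))
        n = len(x)
        f_n = np.arange(1, n + 1) / n
        d_alpha = k_alpha / np.sqrt(n)   # asymptotic approximation to D_alpha
        lower = np.clip(f_n - d_alpha, 0.0, 1.0)
        upper = np.clip(f_n + d_alpha, 0.0, 1.0)
        return x, lower, upper

    rng = np.random.default_rng(5)
    x, lo, hi = ks_confidence_band(rng.normal(size=200))
    # With probability ~0.95 the true CDF lies between lo and hi at every point.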

See also

* Non-parametric statistics
* Andrey Kolmogorov
* Lilliefors test
* Jarque-Bera test
* Shapiro-Wilk test
* Anderson-Darling test
* Kuiper's test
* Cramér–von Mises test
* Donsker theorem
* Siegel-Tukey test

References

* Eadie, W.T.; D. Drijard, F.E. James, M. Roos and B. Sadoulet (1971). Statistical Methods in Experimental Physics. Amsterdam: North-Holland. pp. 269–271.
* Stuart, Alan; Keith Ord and Steven Arnold (1999). Kendall's Advanced Theory of Statistics, Volume 2A. London: Arnold, a member of the Hodder Headline Group. Sections 25.37–25.43.

External links

* [http://www.physics.csbsju.edu/stats/KS-test.html Short introduction]
* [http://www.analyse-it.com/blog/2008/8/testing-the-assumption-of-normality.aspx Testing the assumption of normality]
* [http://www.itl.nist.gov/div898/handbook/eda/section3/eda35g.htm KS test explanation]
* [http://www.ciphersbyritter.com/JAVASCRP/NORMCHIK.HTM JavaScript implementation of one- and two-sided tests]
* [http://jumk.de/statistic-calculator/ Online calculator with the K-S test]
* Open-source C++ code to compute the [http://root.cern.ch/root/html/TMath.html#TMath:KolmogorovProb Kolmogorov distribution] and perform the [http://root.cern.ch/root/html/TMath.html#TMath:KolmogorovTest K-S test]

