Edgeworth series

The **Gram-Charlier A series** and the **Edgeworth series**, named in honor of Francis Ysidro Edgeworth, are series that approximate a probability distribution in terms of its cumulants. The series are the same, but the arrangement of terms (and thus the accuracy of truncating the series) differs.

**Gram-Charlier A series**

The key idea of these expansions is to write the characteristic function of the distribution whose probability density function F is to be approximated in terms of the characteristic function of a distribution with known and suitable properties, and to recover F through the inverse Fourier transform.

Let f be the characteristic function of the distribution whose density function is F, and κ_r its cumulants. We expand in terms of a known distribution with probability density function Ψ, characteristic function ψ, and standardized cumulants γ_r. The density Ψ is generally chosen to be that of the normal distribution, but other choices are possible as well. By the definition of the cumulants, we have the following formal identity:

:$f(t)=\exp\left[\sum_{r=1}^\infty(\kappa_r-\gamma_r)\frac{(it)^r}{r!}\right]\psi(t)\,.$
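This identity can be checked numerically for a concrete pair of distributions. A minimal Python sketch, choosing (as an illustrative assumption, not from the text above) F to be a Gamma(k, θ) distribution, whose cumulants are κ_r = (r−1)! k θ^r, and ψ the characteristic function of the normal with the same mean and variance:

```python
import cmath
import math

# Illustrative choice: Gamma(k, theta), exact characteristic function (1 - i*theta*t)^(-k)
k, theta = 2.0, 1.0

def f_exact(t):
    return (1 - 1j * theta * t) ** (-k)

# Cumulants of Gamma(k, theta): kappa_r = (r-1)! * k * theta^r.
# The reference normal has gamma_1 = kappa_1, gamma_2 = kappa_2, gamma_r = 0 for r >= 3,
# so the r = 1, 2 terms of the sum are absorbed into psi and only r >= 3 remain.
kappa = {r: math.factorial(r - 1) * k * theta ** r for r in range(1, 11)}
mu, var = kappa[1], kappa[2]

def psi(t):
    # characteristic function of N(mu, var)
    return cmath.exp(1j * mu * t - 0.5 * var * t * t)

def f_approx(t, R=10):
    # exp( sum_{r=3}^{R} kappa_r (it)^r / r! ) * psi(t), truncated at r = R
    s = sum(kappa[r] * (1j * t) ** r / math.factorial(r) for r in range(3, R + 1))
    return cmath.exp(s) * psi(t)

print(abs(f_approx(0.3) - f_exact(0.3)))  # tiny truncation error for small t
```

For small t the truncated sum reproduces the exact characteristic function to high accuracy; the identity is only formal, so for larger t more terms (or none at all, if the series diverges) may suffice.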

By the properties of the Fourier transform, (it)^r ψ(t) is the Fourier transform of (−1)^r D^r Ψ(x), where D is the differential operator with respect to x. Thus, we find for F the formal expansion

:$F(x)=\exp\left[\sum_{r=1}^\infty(\kappa_r-\gamma_r)\frac{(-D)^r}{r!}\right]\Psi(x)\,.$

If Ψ is chosen as the normal density with mean and variance as given by F, that is, mean μ = κ_1 and variance σ^2 = κ_2, then the expansion becomes

:$F(x)=\exp\left[\sum_{r=3}^\infty\kappa_r\frac{(-D)^r}{r!}\right]\frac{1}{\sqrt{2\pi}\,\sigma}\exp\left[-\frac{(x-\mu)^2}{2\sigma^2}\right]\,.$
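Truncating this operator exponential after the κ_3 and κ_4 terms gives the two-term series worked out next in the text. A minimal Python sketch of that truncated density, using the Hermite polynomials H_3(x) = x^3 − 3x and H_4(x) = x^4 − 6x^2 + 3 that appear below:

```python
import math

def gram_charlier_pdf(x, mu, sigma, kappa3, kappa4):
    """Gram-Charlier A density truncated after the kappa_3 and kappa_4 terms."""
    z = (x - mu) / sigma
    phi = math.exp(-0.5 * z * z) / (math.sqrt(2 * math.pi) * sigma)
    h3 = z ** 3 - 3 * z           # Hermite polynomial H_3
    h4 = z ** 4 - 6 * z ** 2 + 3  # Hermite polynomial H_4
    return phi * (1 + kappa3 / (6 * sigma ** 3) * h3
                    + kappa4 / (24 * sigma ** 4) * h4)

# With kappa3 = kappa4 = 0 the correction vanishes and the plain normal density remains
print(gram_charlier_pdf(0.0, 0.0, 1.0, 0.0, 0.0))  # 1/sqrt(2*pi) ≈ 0.3989
```

Because the Hermite correction terms integrate to zero against the normal density, the truncated expression still integrates to 1, even though (as noted below) it need not stay positive.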

By expanding the exponential and collecting terms according to the order of the derivatives, we arrive at the Gram-Charlier A series. If we include only the first two correction terms to the normal distribution, we obtain

:$F(x)=\frac{1}{\sqrt{2\pi}\,\sigma}\exp\left[-\frac{(x-\mu)^2}{2\sigma^2}\right]\left[1+\frac{\kappa_3}{3!\,\sigma^3}H_3\!\left(\frac{x-\mu}{\sigma}\right)+\frac{\kappa_4}{4!\,\sigma^4}H_4\!\left(\frac{x-\mu}{\sigma}\right)\right]\,,$

with H_3(x) = x^3 − 3x and H_4(x) = x^4 − 6x^2 + 3 (these are Hermite polynomials).

Note that this expression is not guaranteed to be positive, and is therefore not a valid probability density. The Gram-Charlier A series diverges in many cases of interest: it converges only if F(x) falls off faster than exp(−x^2/4) at infinity (Cramér 1957). When it does not converge, the series is also not a true asymptotic expansion, because it is not possible to estimate the error of the expansion. For this reason, the Edgeworth series (see next section) is generally preferred over the Gram-Charlier A series.

**Edgeworth series**

Edgeworth developed a similar expansion as an improvement to the central limit theorem. The advantage of the Edgeworth series is that the error is controlled, so that it is a true asymptotic expansion.

Let X_i be a sequence of independent, identically distributed random variables, and Y_n the standardized sum

:$Y_n = \frac{\sum_{i=1}^n (X_i-\mathrm{E}[X_i])}{\sqrt{\sum_{i=1}^n \operatorname{var}[X_i]}}\,.$
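The standardized sum is easy to simulate. A short Python sketch, using Exponential(1) variables (mean 1, variance 1) as an arbitrary illustrative choice, with NumPy assumed available:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n, reps = 50, 20000

# reps independent realizations of n iid Exponential(1) variables
X = rng.exponential(scale=1.0, size=(reps, n))

# Y_n = (sum_i X_i - n * E[X_i]) / sqrt(n * var[X_i])
Y = (X.sum(axis=1) - n * 1.0) / np.sqrt(n * 1.0)

print(Y.mean(), Y.var())  # approximately 0 and 1, as the standardization requires
```

The sample mean and variance of Y_n are close to 0 and 1 by construction; it is the higher cumulants of Y_n, which vanish only slowly in n, that the Edgeworth series tracks.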

Further, let F_n be the probability density function of the variables Y_n. By the central limit theorem,

:$\lim_{n \rightarrow \infty} F_n(x) = \frac{1}{\sqrt{2\pi}}\exp\left(-\tfrac{1}{2}x^2\right)$

for every x, as long as the means and variances are finite and the sum of variances diverges to infinity. (Generally, the conclusion of the central limit theorem is about the limit of cumulative distribution functions, not of probability density functions, and therefore applies to discrete distributions as well. But discrete distributions are not contemplated in the present context.)

Now assume that the random variables X_i have mean μ, variance σ^2, and higher cumulants κ_r = σ^r λ_r. If we expand in terms of the unit normal distribution, that is, if we set

:$\Psi(x)=\frac{1}{\sqrt{2\pi}}\exp\left(-\tfrac{1}{2}x^2\right)$

then the cumulant differences in the formal expression of the characteristic function f_n(t) of F_n are

:$\kappa_1-\gamma_1 = 0\,,$

:$\kappa_2-\gamma_2 = 0\,,$

:$\kappa_r-\gamma_r = \frac{\lambda_r}{n^{r/2-1}}\,; \qquad r\geq 3\,.$
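This scaling follows from two standard facts restated here: cumulants are additive over independent summands, and homogeneous of degree r under scaling. A short derivation for r ≥ 3:

```latex
\kappa_r(Y_n)
  = \frac{\kappa_r\!\left(\sum_{i=1}^n X_i\right)}{\left(n\sigma^2\right)^{r/2}}
  = \frac{n\,\kappa_r(X_1)}{n^{r/2}\,\sigma^r}
  = \frac{n\,\sigma^r \lambda_r}{n^{r/2}\,\sigma^r}
  = \frac{\lambda_r}{n^{r/2-1}}\,,
\qquad r \ge 3,
```

and since the unit normal has γ_r = 0 for r ≥ 3, the difference κ_r − γ_r is exactly λ_r / n^{r/2−1}.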

The Edgeworth series is developed similarly to the Gram-Charlier A series, only that now terms are collected according to powers of n. Thus, we have

:$f_n(t)=\left[1+\sum_{j=1}^\infty \frac{P_j(it)}{n^{j/2}}\right]\exp(-t^2/2)\,,$

where P_j(x) is a polynomial of degree 3j. Again, after the inverse Fourier transform, the density function F_n follows as

:$F_n(x) = \Psi(x) + \sum_{j=1}^\infty \frac{P_j(-D)}{n^{j/2}}\, \Psi(x)\,.$
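The j = 1 term can be made concrete: as the truncated expansion below shows, it equals −λ_3 Ψ^(3)(x)/(6√n). A Python sketch comparing this first-order approximation with the exact density of the standardized sum of n Exponential(1) variables, for which λ_3 = 2 (the exponential choice is purely illustrative; the exact sum is then a standardized Gamma(n, 1) variable):

```python
import math

n, lam3 = 100.0, 2.0   # sum of n Exponential(1) variables: lambda_3 = 2

def phi(y):
    # standard normal density Psi(y)
    return math.exp(-0.5 * y * y) / math.sqrt(2 * math.pi)

def edgeworth1(y):
    # Psi(y) - lam3 * Psi'''(y) / (6 sqrt(n)), with Psi'''(y) = -(y^3 - 3y) phi(y)
    return phi(y) * (1 + lam3 * (y ** 3 - 3 * y) / (6 * math.sqrt(n)))

def exact(y):
    # Y = (S - n)/sqrt(n) with S ~ Gamma(n, 1): f_Y(y) = sqrt(n) s^(n-1) e^(-s) / Gamma(n)
    s = n + math.sqrt(n) * y
    return math.sqrt(n) * math.exp((n - 1) * math.log(s) - s - math.lgamma(n))

grid = [-4 + 0.01 * i for i in range(801)]
err_normal = max(abs(phi(y) - exact(y)) for y in grid)
err_edge = max(abs(edgeworth1(y) - exact(y)) for y in grid)
print(err_normal, err_edge)  # the first-order term reduces the worst-case error
```

The plain normal approximation misses the skewness of the sum; the single 1/√n correction already shrinks the worst-case density error noticeably, consistent with the O(1/n) residual of the expansion.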

The first three terms of the expansion are (Cramér 1957)

:$F_n(x)\approx \Psi(x) - \frac{\lambda_3\, \Psi^{(3)}(x)}{6\sqrt{n}} + \frac{1}{n}\left[\frac{\lambda_4\, \Psi^{(4)}(x)}{24}+\frac{\lambda_3^2\, \Psi^{(6)}(x)}{72}\right] + O\!\left(1/n^{3/2}\right)\,.$

Here, Ψ^(j)(x) is the jth derivative of Ψ(x) with respect to x. Blinnikov and Moessner (1998) have given a simple algorithm to calculate higher-order terms of the expansion.

**Further reading**

* H. Cramér (1957). "Mathematical Methods of Statistics". Princeton University Press, Princeton.

* D. L. Wallace (1958). "Asymptotic approximations to distributions". "Ann. Math. Stat." 29:635-654.

* P. McCullagh (1987). "Tensor Methods in Statistics". Chapman and Hall, London.

* D. R. Cox and O. E. Barndorff-Nielsen (1989). "Asymptotic Techniques for Use in Statistics". Chapman and Hall, London.

* P. Hall (1992). "The Bootstrap and Edgeworth Expansion". Springer, New York.

* S. Blinnikov and R. Moessner (1998). "Expansions for nearly Gaussian distributions". "Astron. Astrophys. Suppl. Ser." 130:193-205.

* J. E. Kolassa (2006). "Series Approximation Methods in Statistics, 3rd Edition". (Lecture Notes in Statistics #88). Springer, New York.
