Law of total cumulance

In probability theory and mathematical statistics, the law of total cumulance is a generalization to cumulants of the law of total probability, the law of total expectation, and the law of total variance. It has applications in the analysis of time series. It was introduced by David Brillinger (see "References" below).

It is most transparent when stated in its most general form, for "joint" cumulants, rather than for cumulants of a specified order for just one random variable. In general, we have

:\kappa(X_1,\dots,X_n)=\sum_\pi \kappa(\kappa(X_i : i\in B \mid Y) : B \in \pi),

where

* κ("X"1, ..., "X""n") is the joint cumulant of "n" random variables "X"1, ..., "X""n", and

* the sum is over all partitions pi of the set { 1, ..., "n" } of indices, and

* "B" ∈ π" means "B" runs through the whole list of "blocks" of the partition π, and

* κ("X""i" : "i" ∈ "B" | "Y") is a conditional cumulant given the value of the random variable "Y". It is therefore a random variable in its own right—a function of the random variable "Y".

Examples


The special case of just one random variable and n = 2 or 3

Only for n = 2 or n = 3 is the nth cumulant the same as the nth central moment. The case n = 2 is well known (see the law of total variance). Below is the case n = 3. The notation μ3 denotes the third central moment.

:\mu_3(X)=E(\mu_3(X\mid Y))+\mu_3(E(X\mid Y))+3\,\operatorname{cov}(E(X\mid Y),\operatorname{var}(X\mid Y)).
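As a sanity check, the identity can be verified numerically. The sketch below assumes a particular hierarchical model chosen purely for illustration (Y Bernoulli with parameter p, X conditionally normal given Y, arbitrary parameter values) and compares a Monte Carlo estimate of μ3(X) with the right-hand side computed from the conditional distributions.

```python
# Monte Carlo check of the n = 3 identity under an assumed model:
#   Y ~ Bernoulli(p),  X | Y=1 ~ N(m1, s1^2),  X | Y=0 ~ N(m0, s0^2).
import numpy as np

rng = np.random.default_rng(0)
p, m1, s1, m0, s0 = 0.3, 2.0, 1.5, -1.0, 0.5
n = 2_000_000

y = rng.random(n) < p
x = np.where(y, rng.normal(m1, s1, n), rng.normal(m0, s0, n))

lhs = np.mean((x - x.mean()) ** 3)          # mu_3(X), estimated from the sample

# Right-hand side, computed exactly from the conditional distributions:
#   mu_3(X | Y) = 0 for a normal, so E(mu_3(X | Y)) = 0
#   E(X | Y)   takes the value m1 w.p. p and m0 w.p. 1 - p
#   var(X | Y) takes the value s1^2 w.p. p and s0^2 w.p. 1 - p
q = 1 - p
rhs = (p * q * (q - p) * (m1 - m0) ** 3              # mu_3(E(X | Y))
       + 3 * p * q * (m1 - m0) * (s1**2 - s0**2))    # 3 cov(E(X|Y), var(X|Y))

print(lhs, rhs)   # the two numbers should agree to Monte Carlo accuracy
```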

General 4th-order joint cumulants

For general 4th-order cumulants, the rule gives a sum of 15 terms, as follows:

:\kappa(X_1,X_2,X_3,X_4)

::=\kappa(\kappa(X_1,X_2,X_3,X_4\mid Y))

:::\left.\begin{matrix} & {}+\kappa(\kappa(X_1,X_2,X_3\mid Y),\kappa(X_4\mid Y)) \\ & {}+\kappa(\kappa(X_1,X_2,X_4\mid Y),\kappa(X_3\mid Y)) \\ & {}+\kappa(\kappa(X_1,X_3,X_4\mid Y),\kappa(X_2\mid Y)) \\ & {}+\kappa(\kappa(X_2,X_3,X_4\mid Y),\kappa(X_1\mid Y)) \end{matrix}\right\}\ (\text{partitions of the } 3+1 \text{ form})

:::\left.\begin{matrix} & {}+\kappa(\kappa(X_1,X_2\mid Y),\kappa(X_3,X_4\mid Y)) \\ & {}+\kappa(\kappa(X_1,X_3\mid Y),\kappa(X_2,X_4\mid Y)) \\ & {}+\kappa(\kappa(X_1,X_4\mid Y),\kappa(X_2,X_3\mid Y)) \end{matrix}\right\}\ (\text{partitions of the } 2+2 \text{ form})

:::\left.\begin{matrix} & {}+\kappa(\kappa(X_1,X_2\mid Y),\kappa(X_3\mid Y),\kappa(X_4\mid Y)) \\ & {}+\kappa(\kappa(X_1,X_3\mid Y),\kappa(X_2\mid Y),\kappa(X_4\mid Y)) \\ & {}+\kappa(\kappa(X_1,X_4\mid Y),\kappa(X_2\mid Y),\kappa(X_3\mid Y)) \\ & {}+\kappa(\kappa(X_2,X_3\mid Y),\kappa(X_1\mid Y),\kappa(X_4\mid Y)) \\ & {}+\kappa(\kappa(X_2,X_4\mid Y),\kappa(X_1\mid Y),\kappa(X_3\mid Y)) \\ & {}+\kappa(\kappa(X_3,X_4\mid Y),\kappa(X_1\mid Y),\kappa(X_2\mid Y)) \end{matrix}\right\}\ (\text{partitions of the } 2+1+1 \text{ form})

:::{}+\kappa(\kappa(X_1\mid Y),\kappa(X_2\mid Y),\kappa(X_3\mid Y),\kappa(X_4\mid Y)).
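The 1 + 4 + 3 + 6 + 1 = 15 bookkeeping above can be checked by enumerating the partitions of { 1, 2, 3, 4 } and grouping them by block-size pattern; the short sketch below does exactly that (again, only sympy's partition enumerator is assumed).

```python
# Count the partitions of {1, 2, 3, 4} by block-size pattern and confirm that
# there are 15 of them in total (the Bell number B_4).
from collections import Counter
from sympy.utilities.iterables import multiset_partitions

patterns = Counter(
    tuple(sorted((len(B) for B in pi), reverse=True))
    for pi in multiset_partitions([1, 2, 3, 4])
)
print(patterns)
# e.g. Counter({(2, 1, 1): 6, (3, 1): 4, (2, 2): 3, (4,): 1, (1, 1, 1, 1): 1})
print(sum(patterns.values()))   # 15
```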

Cumulants of compound Poisson random variables

Suppose "Y" has a Poisson distribution with expected value 1, and "X" is the sum of "Y" independent copies of "W".

:X=\sum_{y=1}^Y W_y.

All of the cumulants of the Poisson distribution are equal to each other, and so in this case they are all equal to 1. Also recall that if the random variables W_1, ..., W_m are independent, then the nth cumulant is additive:

:\kappa_n(W_1+\cdots+W_m)=\kappa_n(W_1)+\cdots+\kappa_n(W_m).
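The claim that every cumulant of a Poisson distribution equals its mean can be checked symbolically from the cumulant generating function K(t) = λ(e^t − 1): the nth cumulant is the nth derivative of K at t = 0. A small sketch (sympy assumed):

```python
# Symbolic check that all cumulants of Poisson(lam) equal lam.
import sympy as sp

t, lam = sp.symbols("t lam")
K = lam * (sp.exp(t) - 1)                      # cumulant generating function of Poisson(lam)
cumulants = [sp.diff(K, t, n).subs(t, 0) for n in range(1, 6)]
print(cumulants)                               # [lam, lam, lam, lam, lam]
```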

We will find the 4th cumulant of X. We have:

:\kappa_4(X)=\kappa(X,X,X,X)

::=\kappa_1(\kappa_4(X\mid Y))+4\kappa(\kappa_3(X\mid Y),\kappa_1(X\mid Y))+3\kappa_2(\kappa_2(X\mid Y))

:::{}+6\kappa(\kappa_2(X\mid Y),\kappa_1(X\mid Y),\kappa_1(X\mid Y))+\kappa_4(\kappa_1(X\mid Y))

::=\kappa_1(Y\kappa_4(W))+4\kappa(Y\kappa_3(W),Y\kappa_1(W))+3\kappa_2(Y\kappa_2(W))

:::{}+6\kappa(Y\kappa_2(W),Y\kappa_1(W),Y\kappa_1(W))+\kappa_4(Y\kappa_1(W))

::=\kappa_4(W)\kappa_1(Y)+4\kappa_3(W)\kappa_1(W)\kappa_2(Y)+3\kappa_2(W)^2 \kappa_2(Y)

:::{}+6\kappa_2(W)\kappa_1(W)^2 \kappa_3(Y)+\kappa_1(W)^4 \kappa_4(Y)

::=\kappa_4(W)+4\kappa_3(W)\kappa_1(W)+3\kappa_2(W)^2+6\kappa_2(W)\kappa_1(W)^2+\kappa_1(W)^4

::=E(W^4) \qquad (\text{the punch line; see the explanation below}).

We recognize this last sum as the sum, over all partitions of the set { 1, 2, 3, 4 }, of the product over all blocks of the partition of cumulants of W whose order equals the size of the block. That is precisely the 4th raw moment of W (see the article on cumulants for a more leisurely discussion of this fact). Hence the moments of W are the cumulants of X.
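A numerical illustration of the punch line: the sketch below simulates a compound Poisson variable with rate 1 and jump distribution W chosen, for concreteness, to be exponential with mean 1 (so that E(W^4) = 4! = 24), and compares an unbiased estimate of κ4(X) with that value. The choice of W, the sample size, and the use of scipy's k-statistic are assumptions made only for the illustration.

```python
# Monte Carlo sketch: for X compound Poisson with rate 1 and jumps W,
# kappa_4(X) should equal E(W^4).  Here W ~ Exponential(1), so E(W^4) = 24.
import numpy as np
from scipy.stats import kstat

rng = np.random.default_rng(1)
n_samples = 1_000_000

counts = rng.poisson(1.0, n_samples)             # Y ~ Poisson(1), one value per sample
w = rng.exponential(1.0, counts.sum())           # all the jumps W_y, concatenated
owner = np.repeat(np.arange(n_samples), counts)  # which sample each jump belongs to
x = np.bincount(owner, weights=w, minlength=n_samples)   # X_j = sum of its jumps

print(kstat(x, 4))   # unbiased estimate of kappa_4(X); should be close to 24
print(24.0)          # E(W^4) = 4! for an Exponential(1) jump distribution
```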

In this way we see that every moment sequence is also a cumulant sequence (the converse is not true, since cumulants of even order ≥ 4 are in some cases negative, and also because the cumulant sequence of the normal distribution is not the moment sequence of any probability distribution).

Conditioning on a Bernoulli random variable

Suppose "Y" = 1 with probability "p" and "Y" = 0 with probability "q" = 1 − "p". Suppose the conditional probability distribution of "X" given "Y" is "F" if "Y" = 1 and "G" if "Y" = 0. Then we have

:\kappa_n(X)=p\,\kappa_n(F)+q\,\kappa_n(G)+\sum_{\pi<\widehat{1}} \kappa_{\left|\pi\right|}(Y)\prod_{B\in\pi}\left(\kappa_{\left|B\right|}(F)-\kappa_{\left|B\right|}(G)\right),

where π < \widehat{1} means that π is a partition of the set { 1, ..., n } that is strictly finer than the coarsest (one-block) partition; the sum is over all partitions except that one. For example, if n = 3, then we have

:\kappa_3(X)=p\kappa_3(F)+q\kappa_3(G)+3pq(\kappa_2(F)-\kappa_2(G))(\kappa_1(F)-\kappa_1(G))+pq(q-p)(\kappa_1(F)-\kappa_1(G))^3.
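The n = 3 formula can likewise be checked numerically. In the sketch below, the component distributions F and G are chosen arbitrarily (a normal and an exponential), their first three cumulants are known in closed form, and a Monte Carlo estimate of κ3(X) is compared with the value given by the formula.

```python
# Numeric check of the n = 3 mixture formula under assumed components:
#   F = N(2, 1) when Y = 1,  G = Exponential(1) when Y = 0.
import numpy as np
from scipy.stats import kstat

rng = np.random.default_rng(2)
p, n = 0.4, 2_000_000
q = 1 - p

y = rng.random(n) < p
x = np.where(y, rng.normal(2.0, 1.0, n), rng.exponential(1.0, n))

# First three cumulants of the components:
# F = N(2, 1): (k1, k2, k3) = (2, 1, 0);  G = Exponential(1): (1, 1, 2).
k1F, k2F, k3F = 2.0, 1.0, 0.0
k1G, k2G, k3G = 1.0, 1.0, 2.0

formula = (p * k3F + q * k3G
           + 3 * p * q * (k2F - k2G) * (k1F - k1G)
           + p * q * (q - p) * (k1F - k1G) ** 3)

print(kstat(x, 3))   # estimated kappa_3(X)
print(formula)       # formula value; the two should agree to sampling error
```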

References

* David Brillinger, "The calculation of cumulants via conditioning", Annals of the Institute of Statistical Mathematics, Vol. 21 (1969), pp. 215–218.

