Law of total cumulance
In probability theory and mathematical statistics, the law of total cumulance is a generalization to cumulants of the law of total probability, the law of total expectation, and the law of total variance. It has applications in the analysis of time series. It was introduced by David Brillinger.

It is most transparent when stated in its most general form, for joint cumulants, rather than for cumulants of a specified order for just one random variable. In general, we have

$$\kappa(X_1,\dots,X_n) = \sum_\pi \kappa\bigl(\kappa(X_i : i \in B \mid Y) : B \in \pi\bigr),$$
where
* κ(X1, ..., Xn) is the joint cumulant of the n random variables X1, ..., Xn,
* the sum is over all partitions π of the set {1, ..., n} of indices,
* "B ∈ π" means that B runs through the whole list of blocks of the partition π, and
* κ(Xi : i ∈ B | Y) is a conditional cumulant given the value of the random variable Y; it is therefore a random variable in its own right, a function of the random variable Y (the simplest instance of the law, n = 2, is checked in the sketch following this list).
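The case n = 2 of this law is the familiar law of total variance, var(X) = E(var(X | Y)) + var(E(X | Y)). A minimal sketch checking it exactly, under a purely illustrative assumption not taken from the article: Y is discrete on {0, 1, 2} with probabilities p[y], and X given Y = y is normal with mean a[y] and standard deviation s[y], so both sides can be computed from moments in closed form.

```python
import numpy as np

# Hypothetical toy model: Y in {0, 1, 2} with probabilities p[y],
# and X | Y = y ~ Normal(a[y], s[y]^2).  All numbers are arbitrary.
p = np.array([0.2, 0.5, 0.3])
a = np.array([-1.0, 0.5, 2.0])   # conditional means  E(X | Y = y)
s = np.array([0.7, 1.2, 0.4])    # conditional standard deviations

# Left-hand side: var(X) from the mixture's raw moments.
M1 = (p * a).sum()               # E(X)
M2 = (p * (a**2 + s**2)).sum()   # E(X^2)
lhs = M2 - M1**2

# Right-hand side: E(var(X|Y)) + var(E(X|Y)).
rhs = (p * s**2).sum() + ((p * a**2).sum() - M1**2)

assert np.isclose(lhs, rhs)
print(lhs, rhs)
```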
Examples
The special case of just one random variable and n = 2 or 3

Only in the case n = 2 or n = 3 is the nth cumulant the same as the nth central moment. The case n = 2 is well known (see law of total variance). Below is the case n = 3. The notation μ3 means the third central moment:

$$\mu_3(X) = \operatorname{E}\bigl(\mu_3(X \mid Y)\bigr) + \mu_3\bigl(\operatorname{E}(X \mid Y)\bigr) + 3\,\operatorname{cov}\bigl(\operatorname{E}(X \mid Y), \operatorname{var}(X \mid Y)\bigr).$$
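A sketch checking this n = 3 identity exactly, using the same assumed toy model as above (Y discrete, X conditionally normal). Because each conditional law is normal, μ3(X | Y) vanishes, so only the last two terms contribute.

```python
import numpy as np

# Same hypothetical toy model: Y in {0, 1, 2}, X | Y = y ~ Normal(a[y], s[y]^2).
p = np.array([0.2, 0.5, 0.3])
a = np.array([-1.0, 0.5, 2.0])
s = np.array([0.7, 1.2, 0.4])

# Raw moments of Normal(m, v): E X = m, E X^2 = m^2 + v, E X^3 = m^3 + 3 m v.
M1 = (p * a).sum()
M2 = (p * (a**2 + s**2)).sum()
M3 = (p * (a**3 + 3 * a * s**2)).sum()
mu3_X = M3 - 3 * M2 * M1 + 2 * M1**3           # third central moment of X

# Right-hand side.  E(mu3(X|Y)) = 0 because each conditional law is normal.
mu3_condmean = (p * (a - M1)**3).sum()          # mu3(E(X|Y))
cov_term = (p * (a - M1) * (s**2 - (p * s**2).sum())).sum()  # cov(E(X|Y), var(X|Y))
rhs = mu3_condmean + 3 * cov_term

assert np.isclose(mu3_X, rhs)
print(mu3_X, rhs)
```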
General 4th-order joint cumulants
For general 4th-order cumulants, the rule gives a sum of 15 terms, one for each partition of {1, 2, 3, 4}, as follows:

$$\begin{aligned}
\kappa(X_1,X_2,X_3,X_4) = {} & \kappa\bigl(\kappa(X_1,X_2,X_3,X_4\mid Y)\bigr) \\[6pt]
& {}+\kappa\bigl(\kappa(X_1,X_2,X_3\mid Y),\,\kappa(X_4\mid Y)\bigr)
 +\kappa\bigl(\kappa(X_1,X_2,X_4\mid Y),\,\kappa(X_3\mid Y)\bigr) \\
& {}+\kappa\bigl(\kappa(X_1,X_3,X_4\mid Y),\,\kappa(X_2\mid Y)\bigr)
 +\kappa\bigl(\kappa(X_2,X_3,X_4\mid Y),\,\kappa(X_1\mid Y)\bigr) \\[6pt]
& {}+\kappa\bigl(\kappa(X_1,X_2\mid Y),\,\kappa(X_3,X_4\mid Y)\bigr)
 +\kappa\bigl(\kappa(X_1,X_3\mid Y),\,\kappa(X_2,X_4\mid Y)\bigr) \\
& {}+\kappa\bigl(\kappa(X_1,X_4\mid Y),\,\kappa(X_2,X_3\mid Y)\bigr) \\[6pt]
& {}+\kappa\bigl(\kappa(X_1,X_2\mid Y),\,\kappa(X_3\mid Y),\,\kappa(X_4\mid Y)\bigr)
 +\kappa\bigl(\kappa(X_1,X_3\mid Y),\,\kappa(X_2\mid Y),\,\kappa(X_4\mid Y)\bigr) \\
& {}+\kappa\bigl(\kappa(X_1,X_4\mid Y),\,\kappa(X_2\mid Y),\,\kappa(X_3\mid Y)\bigr)
 +\kappa\bigl(\kappa(X_2,X_3\mid Y),\,\kappa(X_1\mid Y),\,\kappa(X_4\mid Y)\bigr) \\
& {}+\kappa\bigl(\kappa(X_2,X_4\mid Y),\,\kappa(X_1\mid Y),\,\kappa(X_3\mid Y)\bigr)
 +\kappa\bigl(\kappa(X_3,X_4\mid Y),\,\kappa(X_1\mid Y),\,\kappa(X_2\mid Y)\bigr) \\[6pt]
& {}+\kappa\bigl(\kappa(X_1\mid Y),\,\kappa(X_2\mid Y),\,\kappa(X_3\mid Y),\,\kappa(X_4\mid Y)\bigr).
\end{aligned}$$
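A way to see where the 15 terms come from: one term per partition of {1, 2, 3, 4}. The following sketch (standard library only; the helper `partitions` is written here for illustration) enumerates the partitions and tallies them by block sizes, giving the 1 + 4 + 3 + 6 + 1 = 15 count visible in the grouping above.

```python
from collections import Counter

def partitions(elems):
    """Yield every partition of the list `elems` as a list of blocks."""
    if not elems:
        yield []
        return
    first, rest = elems[0], elems[1:]
    for part in partitions(rest):
        # Put `first` into each existing block in turn ...
        for i in range(len(part)):
            yield part[:i] + [[first] + part[i]] + part[i + 1:]
        # ... or into a new singleton block.
        yield [[first]] + part

parts = list(partitions([1, 2, 3, 4]))
print(len(parts))  # 15, the Bell number B_4

# Tally by multiset of block sizes: 4 / 3+1 / 2+2 / 2+1+1 / 1+1+1+1.
shapes = Counter(tuple(sorted(map(len, part), reverse=True)) for part in parts)
print(shapes)  # counts 1, 4, 3, 6, 1 respectively
```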
Cumulants of compound Poisson random variables
Suppose Y has a Poisson distribution with expected value 1, and X is the sum of Y independent copies of W:

$$X = \sum_{y=1}^{Y} W_y.$$
All of the cumulants of the Poisson distribution are equal to each other, and so in this case are equal to 1. Also recall that if random variables W1, ..., Wm are independent, then the nth cumulant is additive:

$$\kappa_n(W_1 + \cdots + W_m) = \kappa_n(W_1) + \cdots + \kappa_n(W_m).$$
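A quick sketch of this additivity for n = 3, with two arbitrarily chosen exponential distributions (not from the article): the third cumulant of the independent sum is computed from the raw moments of the convolution and compared with the sum of the individual third cumulants.

```python
from math import comb, factorial

def exp_raw_moments(lam, kmax):
    # Raw moments of an Exponential(rate lam): E X^k = k! / lam^k.
    return [factorial(k) / lam**k for k in range(kmax + 1)]

def third_cumulant(m):
    # kappa_3 from raw moments m[0..3].
    return m[3] - 3 * m[2] * m[1] + 2 * m[1]**3

lam1, lam2 = 1.0, 2.5                          # arbitrary rates
A, B = exp_raw_moments(lam1, 3), exp_raw_moments(lam2, 3)

# Raw moments of the independent sum W1 + W2, by the binomial expansion.
S = [sum(comb(k, j) * A[j] * B[k - j] for j in range(k + 1)) for k in range(4)]

print(third_cumulant(S))                       # kappa_3(W1 + W2)
print(third_cumulant(A) + third_cumulant(B))   # kappa_3(W1) + kappa_3(W2): equal
```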
We will find the 4th cumulant of X. Given Y, X is the sum of Y independent copies of W, so by additivity κn(X | Y) = Yκn(W). Using the law of total cumulance and the fact that every cumulant of Y is equal to 1, we have:

$$\begin{aligned}
\kappa_4(X) = {} & \kappa(X,X,X,X) \\
= {} & \kappa\bigl(\kappa_4(X\mid Y)\bigr) + 4\,\kappa\bigl(\kappa_3(X\mid Y),\,\kappa_1(X\mid Y)\bigr) + 3\,\kappa\bigl(\kappa_2(X\mid Y),\,\kappa_2(X\mid Y)\bigr) \\
& {}+ 6\,\kappa\bigl(\kappa_2(X\mid Y),\,\kappa_1(X\mid Y),\,\kappa_1(X\mid Y)\bigr) + \kappa\bigl(\kappa_1(X\mid Y),\,\kappa_1(X\mid Y),\,\kappa_1(X\mid Y),\,\kappa_1(X\mid Y)\bigr) \\
= {} & \kappa\bigl(Y\kappa_4(W)\bigr) + 4\,\kappa\bigl(Y\kappa_3(W),\,Y\kappa_1(W)\bigr) + 3\,\kappa\bigl(Y\kappa_2(W),\,Y\kappa_2(W)\bigr) \\
& {}+ 6\,\kappa\bigl(Y\kappa_2(W),\,Y\kappa_1(W),\,Y\kappa_1(W)\bigr) + \kappa\bigl(Y\kappa_1(W),\,Y\kappa_1(W),\,Y\kappa_1(W),\,Y\kappa_1(W)\bigr) \\
= {} & \kappa_4(W)\,\kappa(Y) + 4\,\kappa_3(W)\kappa_1(W)\,\kappa(Y,Y) + 3\,\kappa_2(W)^2\,\kappa(Y,Y) \\
& {}+ 6\,\kappa_2(W)\kappa_1(W)^2\,\kappa(Y,Y,Y) + \kappa_1(W)^4\,\kappa(Y,Y,Y,Y) \\
= {} & \kappa_4(W) + 4\,\kappa_3(W)\kappa_1(W) + 3\,\kappa_2(W)^2 + 6\,\kappa_2(W)\kappa_1(W)^2 + \kappa_1(W)^4
\end{aligned}$$

(the punch line; see the explanation below).
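A Monte Carlo sketch of this result, with the purely illustrative choice W ~ Exponential(1); as explained just below, the sum collapses to the 4th raw moment E(W⁴) = 4! = 24.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10**6

# Y ~ Poisson(1); X is the sum of Y i.i.d. copies of W.
# W ~ Exponential(1) is an arbitrary illustrative choice, with E(W^4) = 24.
Y = rng.poisson(1.0, size=n)
W = rng.exponential(1.0, size=Y.sum())
X = np.bincount(np.repeat(np.arange(n), Y), weights=W, minlength=n)

# Sample 4th cumulant: kappa_4 = mu_4 - 3 mu_2^2 (central moments).
c = X - X.mean()
k4 = (c**4).mean() - 3 * (c**2).mean()**2
print(k4)   # close to E(W^4) = 24, up to Monte Carlo error
```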
We recognize this last sum as the sum, over all partitions of the set {1, 2, 3, 4}, of the product over all blocks of the partition of cumulants of W of order equal to the size of the block. That is precisely the 4th raw moment of W (see cumulant for a more leisurely discussion of this fact). Hence the moments of W are the cumulants of X.

In this way we see that every moment sequence is also a cumulant sequence (the converse cannot be true, since cumulants of even order ≥ 4 are in some cases negative, and also because the cumulant sequence of the normal distribution is not a moment sequence of any probability distribution).

Conditioning on a Bernoulli random variable
Suppose Y = 1 with probability p and Y = 0 with probability q = 1 − p. Suppose the conditional probability distribution of X given Y is F if Y = 1 and G if Y = 0. Then we have

$$\kappa_n(X) = p\,\kappa_n(F) + q\,\kappa_n(G) + \sum_{\pi < \hat{1}} \kappa_{|\pi|}(Y) \prod_{B \in \pi} \bigl(\kappa_{|B|}(F) - \kappa_{|B|}(G)\bigr),$$

where π < 1̂ means the sum is over all partitions π of the set {1, ..., n} other than the one-block partition, and κ|π|(Y) is the cumulant of Y of order equal to the number of blocks of π. (The one-block partition yields the first two terms; each remaining term follows by multilinearity, since the conditional cumulant of each block equals κ|B|(G) + (κ|B|(F) − κ|B|(G))Y and joint cumulants of order ≥ 2 are unaffected by added constants.)
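A sketch checking the n = 3 instance exactly; the choices F = Exponential(rate 1), G = Exponential(rate 2), and p = 0.3 are arbitrary. For n = 3 the partition sum reduces to 3κ2(Y)Δ2Δ1 + κ3(Y)Δ1³, where Δk = κk(F) − κk(G), κ2(Y) = pq, and κ3(Y) = pq(q − p).

```python
import numpy as np
from math import factorial

def exp_cumulants(lam):
    # First three cumulants of an Exponential(rate lam): kappa_k = (k-1)! / lam^k.
    return [factorial(k - 1) / lam**k for k in (1, 2, 3)]

def exp_raw_moments(lam):
    # Raw moments E X^k = k! / lam^k for k = 1, 2, 3.
    return [factorial(k) / lam**k for k in (1, 2, 3)]

p = 0.3
q = 1 - p
kF, kG = exp_cumulants(1.0), exp_cumulants(2.0)
d = [f - g for f, g in zip(kF, kG)]            # deltas kappa_k(F) - kappa_k(G)

# Right-hand side of the law for n = 3.
rhs = p * kF[2] + q * kG[2] + 3 * (p * q) * d[1] * d[0] + (p * q * (q - p)) * d[0]**3

# Left-hand side: kappa_3 of the mixture, from its raw moments.
mF, mG = exp_raw_moments(1.0), exp_raw_moments(2.0)
M = [p * f + q * g for f, g in zip(mF, mG)]
lhs = M[2] - 3 * M[1] * M[0] + 2 * M[0]**3

assert np.isclose(lhs, rhs)
print(lhs, rhs)
```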