Table of contents
1 Cumulants of probability distributions
2 Some properties of cumulants
3 Cumulants of particular probability distributions
4 Joint cumulants
5 History
6 "Formal" cumulants
7 One well-known example
8 Cumulants of a polynomial sequence of binomial type

Cumulants of probability distributions

In probability theory and statistics, the cumulants κn of a probability distribution are given by

    g(t) = log E(exp(tX)) = κ1 t + κ2 t²/2! + κ3 t³/3! + ···
where X is any random variable whose probability distribution is the one whose cumulants are taken. In other words, κn/n! is the nth coefficient in the power series representation of the logarithm of the moment-generating function. The logarithm of the moment-generating function is therefore called the cumulant-generating function.
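
As a concrete illustration (using the exponential distribution, which is not otherwise discussed in this article), let X be exponentially distributed with rate λ > 0, so that E(exp(tX)) = λ/(λ − t) for t < λ. Then

    g(t) = log E(exp(tX)) = −log(1 − t/λ) = (t/λ) + (1/2)(t/λ)² + (1/3)(t/λ)³ + ···,

so the coefficient of tⁿ/n! is (n − 1)!/λⁿ, i.e. κn = (n − 1)!/λⁿ; in particular κ1 = 1/λ is the mean and κ2 = 1/λ² is the variance.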

The "problem of cumulants" seeks characterizations of sequences that are cumulants of some probability distribution.

Some properties of cumulants

Invariance and equivariance

The first cumulant is shift-equivariant; all of the others are shift-invariant. To state this less tersely, denote by κn(X) the nth cumulant of the probability distribution of the random variable X. The statement is that if c is constant then κ1(X + c) = κ1(X) + c and κn(X + c) = κn(X) for n ≥ 2, i.e., c is added to the first cumulant, but all higher cumulants are unchanged.
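
One way to see this is through the cumulant-generating function: since E(exp(t(X + c))) = exp(ct) E(exp(tX)), the cumulant-generating function of X + c is

    ct + g(t),

where g is the cumulant-generating function of X. Only the coefficient of t, i.e. the first cumulant, is affected.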

Homogeneity

The nth cumulant is homogeneous of degree n, i.e. if c is any constant, then

    κn(cX) = cⁿ κn(X).

Additivity

If X and Y are independent random variables then κn(X + Y) = κn(X) + κn(Y).
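
Both of these properties can also be read off from the cumulant-generating function. The cumulant-generating function of cX is g(ct), in which the coefficient of tⁿ/n! is cⁿκn; and if X and Y are independent, then

    log E(exp(t(X + Y))) = log E(exp(tX)) + log E(exp(tY)),

so the cumulant-generating function of X + Y is the sum of those of X and Y, and therefore the cumulants add.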

Cumulants and moments

The cumulants are related to the moments by the following recursion formula:

    κn = μ′n − Σ C(n−1, m−1) κm μ′n−m

where the sum runs over m = 1, ..., n − 1, C(n−1, m−1) is a binomial coefficient, and μ′n−m denotes the (n − m)th moment. The nth moment μ′n is an nth-degree polynomial in the first n cumulants, thus:

    μ′1 = κ1
    μ′2 = κ2 + κ1²
    μ′3 = κ3 + 3κ2κ1 + κ1³
    μ′4 = κ4 + 4κ3κ1 + 3κ2² + 6κ2κ1² + κ1⁴
    μ′5 = κ5 + 5κ4κ1 + 10κ3κ2 + 10κ3κ1² + 15κ2²κ1 + 10κ2κ1³ + κ1⁵
    μ′6 = κ6 + 6κ5κ1 + 15κ4κ2 + 15κ4κ1² + 10κ3² + 60κ3κ2κ1 + 20κ3κ1³ + 15κ2³ + 45κ2²κ1² + 15κ2κ1⁴ + κ1⁶

The "prime" distinguishes the moments μ′n from the central moments μn. To express the central moments as functions of the cumulants, just drop from these polynomials all terms in which κ1 appears as a factor.

The coefficients are precisely those that occur in Faà di Bruno's formula.
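
For example, applying the rule above (dropping every term containing κ1) to the listed polynomials gives the central moments

    μ2 = κ2
    μ3 = κ3
    μ4 = κ4 + 3κ2²
    μ5 = κ5 + 10κ3κ2
    μ6 = κ6 + 15κ4κ2 + 10κ3² + 15κ2³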

Cumulants and set-partitions

These polynomials have a remarkable combinatorial interpretation: the coefficients count certain partitions of sets. A general form of these polynomials is

    μ′n = Σπ ∏B ∈ π κ|B|
where

  • π runs through the list of all partitions of a set of size n;

  • "B ∈ π" means B is one of the "blocks" into which the set is partitioned; and

  • |B| is the size of the set B.

Thus each monomial is a constant times a product of cumulants in which the sum of the indices is n (e.g., in the term κ3 κ2² κ1, the sum of the indices is 3 + 2 + 2 + 1 = 8; this term appears in the polynomial that expresses the 8th moment as a function of the first eight cumulants). Each term corresponds to a partition of the integer n. The coefficient of each term is the number of partitions of a set of n members that collapse to that partition of the integer n when the members of the set become indistinguishable.
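
For instance, take n = 3. The set { 1, 2, 3 } has five partitions: one consisting of a single block of size 3, three consisting of a block of size 2 together with a singleton, and one consisting of three singletons. The rule above therefore gives

    μ′3 = κ3 + 3κ2κ1 + κ1³,

and the coefficient 3 counts the three set partitions that collapse to the partition 2 + 1 of the integer 3.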

Cumulants of particular probability distributions

The cumulants of the normal distribution with expected value μ and variance σ² are κ1 = μ, κ2 = σ², and κn = 0 for n > 2.

All of the cumulants of the Poisson distribution are equal to the expected value.
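
Both statements can be read off from the cumulant-generating functions. For the normal distribution,

    g(t) = μt + σ²t²/2,

so every coefficient beyond the second vanishes, while for the Poisson distribution with expected value λ,

    g(t) = λ(e^t − 1) = λ(t + t²/2! + t³/3! + ···),

so κn = λ for every n.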

Joint cumulants

The joint cumulant of several random variables X1, ..., Xn is

    κ(X1, ..., Xn) = Σπ (|π| − 1)! (−1)^(|π|−1) ∏B ∈ π E( ∏i ∈ B Xi ),

where π runs through the list of all partitions of { 1, ..., n }, and B runs through the list of all blocks of the partition π. For example,

    κ(X, Y) = E(XY) − E(X) E(Y),
    κ(X, Y, Z) = E(XYZ) − E(XY) E(Z) − E(XZ) E(Y) − E(YZ) E(X) + 2 E(X) E(Y) E(Z).

The joint cumulant of just one random variable is its expected value, and that of two random variables is their covariance. If some of the random variables are independent of all of the others, then the joint cumulant is zero. If all n random variables are the same, then the joint cumulant is the nth ordinary cumulant.
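
As a quick check of the independence property using the formulas above, suppose Z is independent of the pair (X, Y). Then E(XYZ) = E(XY) E(Z), E(XZ) = E(X) E(Z), and E(YZ) = E(Y) E(Z), so

    κ(X, Y, Z) = E(XY) E(Z) − E(XY) E(Z) − E(X) E(Z) E(Y) − E(Y) E(Z) E(X) + 2 E(X) E(Y) E(Z) = 0.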

The combinatorial meaning of the expression of moments in terms of cumulants is easier to understand than that of cumulants in terms of moments:

    E(X1 ··· Xn) = Σπ ∏B ∈ π κB(X1, ..., Xn),

where κB(X1, ..., Xn) is the joint cumulant of those among the random variables X1, ..., Xn whose indices are included in the block B. For example:

    E(XYZ) = κ(X, Y, Z) + κ(X, Y) κ(Z) + κ(X, Z) κ(Y) + κ(Y, Z) κ(X) + κ(X) κ(Y) κ(Z).

Conditional cumulants

The law of total expectation and the law of total variance generalize naturally to conditional cumulants. The case n = 3, expressed in the language of (central) moments rather than that of cumulants, says that for random variables X and Y

    μ3(X) = E(μ3(X | Y)) + μ3(E(X | Y)) + 3 cov(E(X | Y), var(X | Y)),

where μ3 denotes the third central moment.

Finding the general law for higher cumulants is left as an exercise for the reader. Hint: Think about partitions.

History

Cumulants were first introduced by the Danish astronomer, actuary, mathematician, and statistician Thorvald N. Thiele (1838–1910) in 1889. Thiele called them half-invariants. They were first called cumulants in a 1931 paper, "The derivation of the pattern formulae of two-way partitions from those of simpler patterns", Proceedings of the London Mathematical Society, Series 2, v. 33, pp. 195–208, by the statistical geneticist Sir Ronald Fisher and the statistician John Wishart, eponym of the Wishart distribution. In another paper, published in 1929, Fisher had called them cumulative moment functions.

"Formal" cumulants

More generally, the cumulants of a sequence { mn : n = 1, 2, 3, ... }, not necessarily the moments of any probability distribution, are given by

    1 + m1 t + m2 t²/2! + m3 t³/3! + ··· = exp(κ1 t + κ2 t²/2! + κ3 t³/3! + ···),
where the values of κn for n = 1, 2, 3, ... are found "formally", i.e., by algebra alone, in disregard of questions of whether any series converges. All of the difficulties of the "problem of cumulants" are absent when one works "formally". The simplest example is that the second cumulant of a probability distribution must always be nonnegative, and is zero only if all of the higher cumulants are zero. "Formal" cumulants are subject to no such constraints.
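
For example, any sequence with m1 = 0 and m2 = −1 has second formal cumulant κ2 = m2 − m1² = −1. Since this is negative, { mn } cannot be the sequence of moments of any probability distribution, yet its formal cumulants are perfectly well defined.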

One well-known example

In combinatorics, the nth Bell number is the number of partitions of a set of size n. All of the cumulants of the sequence of Bell numbers are equal to 1. The Bell numbers are the moments of the Poisson distribution with expected value 1.
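
Concretely, the exponential generating function of the Bell numbers is exp(e^t − 1), so

    log(1 + B1 t + B2 t²/2! + B3 t³/3! + ···) = e^t − 1 = t + t²/2! + t³/3! + ···,

where Bn denotes the nth Bell number; every coefficient on the right is 1, and e^t − 1 is also exactly the cumulant-generating function of the Poisson distribution with expected value 1.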

Cumulants of a polynomial sequence of binomial type

For any sequence { κn : n = 1, 2, 3, ... } of scalars in a field of characteristic zero, considered as formal cumulants, there is a corresponding sequence { μ′n : n = 1, 2, 3, ... } of formal moments, given by the polynomials above. For those polynomials, construct a polynomial sequence in the following way. Out of the polynomial

    μ′4 = κ4 + 4κ3κ1 + 3κ2² + 6κ2κ1² + κ1⁴

make a new polynomial in these plus one additional variable x:

    p4(x) = κ4 x + (4κ3κ1 + 3κ2²) x² + 6κ2κ1² x³ + κ1⁴ x⁴
... and generalize the pattern. The pattern is that the numbers of blocks in the aforementioned partitions are the exponents on x. Each coefficient is a polynomial in the cumulants; these are the Bell polynomials, named after Eric Temple Bell.

This sequence of polynomials is of binomial type. In fact, no other sequences of binomial type exist; every polynomial sequence of binomial type is completely determined by its sequence of cumulants.
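
Being of binomial type means that these polynomials satisfy

    pn(x + y) = Σ C(n, k) pk(x) pn−k(y),   the sum running over k = 0, 1, ..., n, with p0 = 1.

In the special case in which every cumulant κn equals 1, the pn are the Touchard polynomials (also called exponential polynomials), and pn(1) is the nth Bell number, which ties this construction back to the example in the preceding section.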