A probability distribution assigns to every interval of the real numbers a probability, so that the probability axioms are satisfied. In technical terms, a probability distribution is a probability measure whose domain is the Borel algebra on the reals.

A probability distribution is a special case of the more general notion of a probability measure, which is a function that assigns probabilities satisfying the Kolmogorov axioms to the measurable sets of a measurable space.

Every random variable gives rise to a probability distribution, and this distribution contains most of the important information about the variable. If X is a random variable, the corresponding probability distribution assigns to the interval [a, b] the probability Pr[a ≤ X ≤ b], i.e. the probability that the variable X will take a value in the interval [a, b].

The probability distribution of the variable X can be uniquely described by its cumulative distribution function F(x), which is defined by

F(x) = Pr[X ≤ x]

for any x in R.
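
A minimal sketch of this relationship, using SciPy's standard normal distribution purely as an example (the distribution, the variable names and the interval endpoints are illustrative, not part of the definition above): the probability of the interval [a, b] is F(b) − F(a).

```python
from scipy.stats import norm  # standard normal, used only as an example

a, b = -1.0, 1.0

# F(x) = Pr[X <= x] is the cumulative distribution function.
prob_interval = norm.cdf(b) - norm.cdf(a)  # Pr[a <= X <= b]
print(prob_interval)                       # about 0.6827 for the standard normal
```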

A distribution is called discrete if its cumulative distribution function consists of a sequence of finite jumps, which means that it is the distribution of a discrete random variable X: a variable which can take values only from a certain finite or countable set. A distribution is called continuous if its cumulative distribution function is continuous, which means that it is the distribution of a random variable X for which Pr[ X = x ] = 0 for all x in R.
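
The contrast can be checked numerically. A sketch, again using SciPy with illustrative parameter choices: a discrete variable places positive mass on individual points, while for a continuous variable every single point has probability zero and only intervals carry positive probability.

```python
from scipy.stats import bernoulli, norm

# Discrete: a Bernoulli(0.3) variable has a jump of size 0.3 in its CDF at x = 1.
print(bernoulli.pmf(1, 0.3))           # 0.3 = Pr[X = 1]

# Continuous: the normal CDF has no jumps, so Pr[X = x] = 0 for every x.
print(norm.cdf(1.0) - norm.cdf(1.0))   # 0.0 = Pr[X = 1]
print(norm.cdf(1.1) - norm.cdf(0.9))   # positive probability for the interval [0.9, 1.1]
```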

The so-called absolutely continuous distributions can be expressed by a probability density function: a non-negative Lebesgue integrable function f defined on the reals such that

Pr[a ≤ X ≤ b] = ∫_a^b f(x) dx

for all a and b. That discrete distributions do not admit such a density is unsurprising, but there are continuous distributions, such as the devil's staircase (the Cantor distribution), that also do not admit a density.
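
For an absolutely continuous distribution the two descriptions agree: integrating the density over [a, b] reproduces F(b) − F(a). A sketch, using SciPy's normal density as an illustrative example:

```python
from scipy.integrate import quad
from scipy.stats import norm

a, b = -1.0, 2.0

# Pr[a <= X <= b] obtained by integrating the density f over [a, b] ...
integral, _ = quad(norm.pdf, a, b)

# ... matches the difference of the cumulative distribution function.
print(integral)                   # about 0.8186
print(norm.cdf(b) - norm.cdf(a))  # same value
```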

The support of a distribution is the smallest closed set whose complement has probability zero.

List of important probability distributions

Several probability distributions are so important in theory or applications that they have been given specific names:

  • Discrete distributions
    • With finite support
      • The degenerate distribution at x0, where X is certain to take the value x0. This does not look random, but it satisfies the definition of random variable. This is useful because it puts deterministic variables and random variables in the same formalism.
      • The discrete uniform distribution, where all elements of a finite set are equally likely. This is the intended model of a balanced coin, an unbiased die, a casino roulette wheel or a well-shuffled deck. Also, one can use measurements of quantum states to generate uniform random variables. All these are "physical" or "mechanical" devices, subject to design flaws or perturbations, so the uniform distribution is only an approximation of their behaviour. In digital computers, pseudo-random number generators are used to produce a statistically random discrete uniform distribution.
      • The Bernoulli distribution, which takes value 1 with probability p and value 0 with probability q=1-p.
      • The binomial distribution, which describes the number of successes in a series of independent Yes/No experiments, each with the same probability of success (a simulation sketch appears after this list).
      • The hypergeometric distribution, which describes the number of successes in the first m of a series of n independent Yes/No experiments, if the total number of successes is known.
    • With infinite support
      • The Poisson distribution, which describes the number of rare events occurring in a fixed interval of time or space.
      • The geometric distribution, which describes the number of independent Yes/No experiments needed to obtain the first success.
      • The negative binomial distribution, a generalization of the geometric distribution to the number of experiments needed for a fixed number of successes.
  • Continuous distributions
    • Supported on a finite interval
      • The uniform distribution on [a,b], where all points in a finite interval are equally likely.
      • The Beta distribution on [0,1], of which the uniform distribution is a special case, and which is useful in estimating success probabilities.
      • The Triangular distribution on [a, b], often used when only a minimum, a maximum and a most likely value are available.
    • Supported on semi-infinite intervals, usually [0,∞)
      • The exponential distribution, which describes the time between rare random events.
      • The Gamma distribution, which describes the time until n rare random events occur.
      • The Log-normal distribution, describing variables which can be modelled as the product of many small independent positive variables.
      • The Weibull distribution, of which the exponential distribution is a special case, is used to model the lifetime of technical devices.
      • The chi-square distribution, which is the distribution of the sum of the squares of n independent standard Gaussian random variables. It is a special case of the Gamma distribution, and it is used in goodness-of-fit tests in statistics (a simulation sketch appears after this list).
      • The F-distribution, which is the distribution of the ratio of two independent chi-square distributed random variables, each divided by its number of degrees of freedom; it is used in the analysis of variance.
    • Supported on the whole real line
      • The normal distribution, also called the Gaussian or the bell curve. It is ubiquitous in nature and statistics due to the central limit theorem: every variable that can be modelled as a sum of many small independent variables is approximately normally distributed (a simulation sketch appears after this list).
      • Student's t-distribution, useful for estimating unknown means of Gaussian populations.
      • The Cauchy distribution, an example of a distribution which has neither an expected value nor a variance. In physics it is usually called a Lorentzian, and it is the distribution of the energy of an unstable state in quantum mechanics. In particle physics, the extremely short-lived particles associated with unstable states are called resonances.
  • Joint distributions
    • Two or more random variables on the same sample space
      • Bivariate distribution
      • Conditional distribution
      • Multivariate distribution
      • Multinomial distribution, a generalization of the binomial distribution.
  • Matrix-valued distributions
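
The binomial entry above can be checked by simulation. A sketch; the success probability, the number of trials per experiment and the sample size are illustrative choices, and the code relies on NumPy and SciPy:

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(0)
n, p = 10, 0.3                  # 10 independent Yes/No trials, success probability 0.3

# Simulate many series of trials and count the successes in each series.
successes = rng.binomial(n, p, size=100_000)

# The empirical frequency of exactly 3 successes is close to the binomial pmf.
print((successes == 3).mean())  # roughly 0.267
print(binom.pmf(3, n, p))       # 0.2668...
```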
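
Likewise, the chi-square entry can be illustrated by summing the squares of n independent standard Gaussian variables and comparing the result with the chi-square distribution with n degrees of freedom. A sketch with illustrative parameters:

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(1)
n = 5                              # degrees of freedom

# Sum of squares of n independent standard Gaussian variables per sample.
samples = (rng.standard_normal((100_000, n)) ** 2).sum(axis=1)

# An empirical tail probability agrees with the chi-square distribution.
print((samples > 9.0).mean())      # roughly 0.109
print(chi2.sf(9.0, n))             # 0.1091...
```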
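
Finally, the central limit theorem mentioned in the normal-distribution entry can be observed directly: sums of many small independent variables, here uniform on [0, 1], are approximately normally distributed. A sketch with illustrative choices:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
k = 50                                    # number of small independent terms per sum

# Each row is a sum of k independent uniform variables on [0, 1].
sums = rng.random((100_000, k)).sum(axis=1)

# Standardize (mean k/2, variance k/12) and compare a tail probability
# with the standard normal distribution.
z = (sums - k * 0.5) / np.sqrt(k / 12.0)
print((z > 1.0).mean())                   # close to the normal tail probability
print(1 - norm.cdf(1.0))                  # 0.1587
```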

See also

list of statistical topics -- random variable -- cumulative distribution function -- probability density function -- likelihood