# Probability metric

A probability metric is a function defining a distance between random variables or random vectors. In particular, a probability metric need not satisfy the identity of indiscernibles condition required of the metric of a metric space: the distance between a variable and itself may be nonzero.

Probability metric of random variables

A probability metric "D" between two random variables "X" and "Y" may be defined as:

:$D(X, Y) = E(|X - Y|).$

If the joint probability distribution is absolutely continuous, this is the same as

:$\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} |x-y| \, F(x, y) \, dx \, dy,$

where "F"("x", "y") denotes the joint probability density function of the random variables "X" and "Y". If "X" and "Y" are independent of each other, the equation above transforms into:

:$D(X, Y) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} |x-y| \, f(x) \, g(y) \, dx \, dy,$

where "f"("x") and "g"("y") are the probability density functions of "X" and "Y" respectively.
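As an illustrative numerical sketch (not part of the original text), the definition "D"("X", "Y") = E(|"X" − "Y"|) for independent variables can be estimated by Monte Carlo sampling; the function name and parameters below are hypothetical:

```python
import random

def prob_metric_mc(draw_x, draw_y, n=200_000, seed=0):
    """Monte Carlo estimate of D(X, Y) = E(|X - Y|) for independent X and Y.

    draw_x and draw_y are callables drawing one sample each (illustrative
    helpers, not from the article)."""
    rng = random.Random(seed)
    return sum(abs(draw_x(rng) - draw_y(rng)) for _ in range(n)) / n

# Two independent normals with means 0 and 3 and unit standard deviation:
d_xy = prob_metric_mc(lambda r: r.gauss(0.0, 1.0), lambda r: r.gauss(3.0, 1.0))
print(d_xy)  # slightly above 3, the distance between the two means
```

Note that the estimate stays strictly above the distance between the means, anticipating the failure of the identity of indiscernibles discussed below.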

One may easily show that such a probability metric satisfies the identity of indiscernibles condition of a metric if and only if both of its arguments "X", "Y" are certain events described by Dirac delta probability density functions. In this case:

:$D_{\delta\delta}(X, Y) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} |x-y| \, \delta(x-\mu_x) \, \delta(y-\mu_y) \, dx \, dy = |\mu_x-\mu_y|,$

and the probability metric simply transforms into the metric between the expected values $\mu_x$, $\mu_y$ of the variables "X" and "Y". Obviously:

:$D_{\delta\delta}(X, X) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} |x-x'| \, \delta(x-\mu_x) \, \delta(x'-\mu_x) \, dx \, dx' = |\mu_x-\mu_x| = 0.$

For all other cases:

:$D(X, X) > 0.$
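A small sketch (with assumed example values) contrasting the two cases above: degenerate "certain event" variables recover the ordinary distance between means, while a non-degenerate variable has a strictly positive distance to itself:

```python
import random

# Certain events (Dirac-delta densities) behave like constants, so the
# metric reduces to the distance between the means (values assumed):
mu_x, mu_y = 2.0, 5.0
d_delta = abs(mu_x - mu_y)   # 3.0

# Non-degenerate case: two independent copies of the same normal distribution.
rng = random.Random(1)
n = 100_000
d_xx = sum(abs(rng.gauss(0.0, 1.0) - rng.gauss(0.0, 1.0)) for _ in range(n)) / n

print(d_delta)      # 3.0
print(d_xx > 0.0)   # True: D(X, X) > 0 for a non-degenerate X
```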

[Figure: probability metric between two random variables "X" and "Y", both having normal distributions with the same standard deviation $\sigma$ ($\sigma = 0, 0.2, 0.4, 0.6, 0.8, 1$, beginning with the bottom curve); $m_{xy} = |\mu_x-\mu_y|$ denotes the distance between the means of "X" and "Y".]

Example: two continuous random variables with normal distributions (NN)

If both probability distribution functions of random variables "X" and "Y" are normal distributions ("N") having the same standard deviation $\sigma$, and moreover "X" and "Y" are independent, then evaluating "D"("X", "Y") yields:

:$D_{NN}(X, Y) = \mu_{xy} + \frac{2\sigma}{\sqrt{\pi}} \exp\left(-\frac{\mu_{xy}^2}{4\sigma^2}\right) - \mu_{xy} \operatorname{erfc}\left(\frac{\mu_{xy}}{2\sigma}\right),$

where $\mu_{xy} = |\mu_x-\mu_y|$, erfc("x") is the complementary error function, and the subscript NN indicates the type of the metric.

In this case the "zero value" of the probability metric $D_{NN}(X, Y)$ amounts to:

:$\lim_{\mu_{xy} \to 0} D_{NN}(X, Y) = D_{NN}(X, X) = \frac{2\sigma}{\sqrt{\pi}}.$
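The NN formula above can be coded directly with the standard library's complementary error function; this is a sketch and the function name is illustrative:

```python
import math

def d_nn(mu_x, mu_y, sigma):
    """Probability metric for two independent normal variables sharing the
    same standard deviation sigma (the closed-form NN case)."""
    m = abs(mu_x - mu_y)
    return (m
            + (2.0 * sigma / math.sqrt(math.pi)) * math.exp(-m**2 / (4.0 * sigma**2))
            - m * math.erfc(m / (2.0 * sigma)))

# "Zero value": as the means coincide, the metric tends to 2*sigma/sqrt(pi).
print(abs(d_nn(0.0, 0.0, 1.0) - 2.0 / math.sqrt(math.pi)) < 1e-12)  # True
```

For large $\mu_{xy}$ the exponential and erfc corrections vanish and $D_{NN}$ approaches the ordinary distance between the means.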

Example: two continuous random variables with uniform distributions (RR)

In case both random variables "X" and "Y" are characterized by uniform distributions ("R") having the same standard deviation $\sigma$, integrating "D"("X", "Y") yields:

:$D_{RR}(X, Y) = \begin{cases} \mu_{xy} + \dfrac{\left(2\sqrt{3}\,\sigma - \mu_{xy}\right)^3}{36\sigma^2}, & \mu_{xy} < 2\sqrt{3}\,\sigma \\ \mu_{xy}, & \mu_{xy} \ge 2\sqrt{3}\,\sigma \end{cases}$

The minimal value of this kind of probability metric amounts to:

:$D_{RR}(X, X) = \frac{2\sigma}{\sqrt{3}}.$
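This minimal value can be checked numerically: a uniform distribution with standard deviation σ has support width 2√3·σ, and the Monte Carlo mean of |X − X′| for two independent copies approaches 2σ/√3 (an illustrative sketch, names assumed):

```python
import math
import random

def d_rr_self(sigma, n=200_000, seed=2):
    """Monte Carlo estimate of D(X, X) for a uniform distribution with
    standard deviation sigma (support width 2*sqrt(3)*sigma)."""
    rng = random.Random(seed)
    half = math.sqrt(3.0) * sigma                    # half-width of the support
    return sum(abs(rng.uniform(-half, half) - rng.uniform(-half, half))
               for _ in range(n)) / n

sigma = 0.5
print(abs(d_rr_self(sigma) - 2.0 * sigma / math.sqrt(3.0)) < 0.01)  # True
```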

Probability metric of discrete random variables

In case random variables "X" and "Y" are characterized by discrete probability distributions, the probability metric "D" may be defined as:

:$D(X, Y) = \sum_{i} \sum_{j} |x_i-y_j| \, P(X=x_i) \, P(Y=y_j).$
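For finite discrete distributions the double sum can be evaluated directly; a minimal sketch with distributions given as value-to-probability mappings (the helper name is mine):

```python
def discrete_prob_metric(px, py):
    """D(X, Y) = sum_i sum_j |x_i - y_j| P(X = x_i) P(Y = y_j)."""
    return sum(abs(x - y) * p * q
               for x, p in px.items()
               for y, q in py.items())

# Two certain events: the metric is just the distance between the values.
print(discrete_prob_metric({2: 1.0}, {5: 1.0}))   # 3.0

# Two fair dice: identical distributions, yet the distance is 35/18 > 0.
die = {k: 1 / 6 for k in range(1, 7)}
print(discrete_prob_metric(die, die))
```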

For example, for two discrete Poisson-distributed random variables "X" and "Y", the equation above transforms into:

:$D_{PP}(X, Y) = \sum_{x=0}^{\infty} \sum_{y=0}^{\infty} |x-y| \, \frac{\lambda_x^x e^{-\lambda_x}}{x!} \, \frac{\lambda_y^y e^{-\lambda_y}}{y!},$

where $\lambda_x$ and $\lambda_y$ are the means of "X" and "Y" and the subscript PP indicates the type of the metric.
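The Poisson case can be evaluated by truncating the infinite double sum once the tail probabilities are negligible; a sketch (parameter names assumed, the recurrence P(k) = P(k−1)·λ/k avoids factorial overflow):

```python
import math

def d_pp(lam_x, lam_y, n_max=100):
    """Probability metric for two independent Poisson variables with means
    lam_x and lam_y, truncating the infinite double sum at n_max."""
    def pmf(lam):
        p = [math.exp(-lam)]                 # P(K = 0) = e^{-lam}
        for k in range(1, n_max + 1):
            p.append(p[-1] * lam / k)        # P(K = k) = P(K = k-1) * lam / k
        return p
    px, py = pmf(lam_x), pmf(lam_y)
    return sum(abs(x - y) * px[x] * py[y]
               for x in range(n_max + 1)
               for y in range(n_max + 1))

print(d_pp(4.0, 4.0) > 0.0)   # True: equal means, yet a positive distance
print(d_pp(1.0, 10.0))        # close to 9, the distance between the means
```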

Example: distance between two quantum particles in a potential well

Consider two particles "X" and "Y" in a one-dimensional potential well of length "L", occupying energy levels "m" and "n" and described by the wavefunctions

:$\psi_m(x) = \sqrt{\frac{2}{L}} \sin\left(\frac{m \pi x}{L}\right),$

:$\psi_n(y) = \sqrt{\frac{2}{L}} \sin\left(\frac{n \pi y}{L}\right).$

The distance between the particles may be defined in terms of the probability metric of independent random variables as:

:$D(X, Y) = \int_0^L \int_0^L |x-y| \, \psi_m^2(x) \, \psi_n^2(y) \, dx \, dy.$

The distance between particles "X" and "Y" is obviously minimal for "m" = 1 and "n" = 1, that is, for the minimum energy levels of these particles, and amounts to:

:$\min(D(X, Y)) = L\left(\frac{4}{3}-\frac{4}{\pi^2}\right) \approx 0.93L.$

In accordance with the properties of the probability metric, this minimum distance is nonzero. In fact it is close to the length "L" of the potential well, and for other energy levels it is even greater than the length of the well.

External references

* [http://www.springerlink.com/content/y4fbdb0m0r12701p/ A new concept of probability metric and its applications in approximation of scattered data sets]
