Quantum mutual information

In quantum information theory, quantum mutual information, or von Neumann mutual information, is a measure of correlation between subsystems of a quantum state. It is the quantum mechanical analog of the Shannon mutual information.

Motivation

For simplicity, it will be assumed that all objects in the article are finite-dimensional.

The definition of quantum mutual information is motivated by the classical case. For a joint probability distribution p(x, y) of two variables, the two marginal distributions are

:p(x) = \sum_{y} p(x,y), \qquad p(y) = \sum_{x} p(x,y).

The classical mutual information I(X, Y) is defined by

:I(X,Y) = S(p(x)) + S(p(y)) - S(p(x,y)),

where "S"("q") denotes the Shannon entropy of the probability distribution "q".

One can calculate directly

:S(p(x)) + S(p(y))

:\; = -\left( \sum_x p(x) \log p(x) + \sum_y p(y) \log p(y) \right)

:\; = -\left( \sum_x \Big( \sum_{y'} p(x,y') \Big) \log \Big( \sum_{y'} p(x,y') \Big) + \sum_y \Big( \sum_{x'} p(x',y) \Big) \log \Big( \sum_{x'} p(x',y) \Big) \right)

:\; = -\sum_{x,y} p(x,y) \left( \log \sum_{y'} p(x,y') + \log \sum_{x'} p(x',y) \right)

:\; = -\sum_{x,y} p(x,y) \log p(x)\, p(y).

So the mutual information is

:I(X,Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\, p(y)}.

But this is precisely the relative entropy between p(x, y) and p(x)p(y). In other words, if we assume the two variables x and y to be uncorrelated, mutual information is the "discrepancy in uncertainty" resulting from this (possibly erroneous) assumption.

It follows from the non-negativity of relative entropy that I(X, Y) ≥ 0, and equality holds if and only if p(x, y) = p(x)p(y).
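Continuing the hypothetical numerical example above, the relative-entropy form can be checked against the entropy-difference form, and it indeed vanishes for a product distribution; again, the helper names are illustrative.

```python
import numpy as np

def relative_entropy(p, q):
    """Kullback–Leibler divergence D(p || q) in bits; assumes q > 0 wherever p > 0."""
    p = np.asarray(p, dtype=float).ravel()
    q = np.asarray(q, dtype=float).ravel()
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
p_x = p_xy.sum(axis=1)
p_y = p_xy.sum(axis=0)

# Product of the marginals, p(x)p(y), as a table of the same shape as p(x, y).
p_x_times_y = np.outer(p_x, p_y)

# Mutual information as a relative entropy: non-negative, zero iff p(x,y) = p(x)p(y).
print(relative_entropy(p_xy, p_x_times_y))          # ≈ 0.278 bits, matching the entropy formula
print(relative_entropy(p_x_times_y, p_x_times_y))   # 0.0: an uncorrelated distribution has I = 0
```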

Definition

The quantum mechanical counterparts of classical probability distributions are density matrices.

Consider a composite quantum system whose state space is the tensor product

:H = H_A \otimes H_B.

Let "ρ""AB" be a density matrix acting on "H". The von Neumann entropy of "ρ", which is the quantum mechanical analaog of the Shannon entropy, is given by

:S(\rho^{AB}) = - \operatorname{Tr} \rho^{AB} \log \rho^{AB}.
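A direct way to evaluate this in practice is to diagonalize the density matrix and sum −λ log λ over its nonzero eigenvalues. The following Python sketch (logarithms taken base 2, so the entropy is in bits) illustrates this on two standard single-qubit examples; the function name is illustrative.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho) in bits, computed from the eigenvalues of rho."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]      # 0 * log 0 is taken to be 0
    return -np.sum(lam * np.log2(lam))

# A pure state has zero entropy; the maximally mixed qubit state has one bit of entropy.
pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])
mixed = np.eye(2) / 2
print(von_neumann_entropy(pure))    # ≈ 0: a pure state has zero entropy
print(von_neumann_entropy(mixed))   # 1.0
```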

For a probability distribution p(x, y), the marginal distributions are obtained by summing away the variable x or y. The corresponding operation for density matrices is the partial trace. So one can assign to ρ^{AB} a state on the subsystem A by

:\rho^A = \operatorname{Tr}_B \; \rho^{AB},

where Tr"B" is partial trace with respect to system "B". This is the reduced state of "ρAB" on system "A". The reduced von Neumann entropy' of "ρAB" with respect to system "A" is

:S(\rho^A).

"S"("ρB") is defined in the same way.

"Technical Note:" In mathematical language, passing from the classical to quantum setting can be described as follows. The "algebra of observables" of a physical system is a C*-algebra and states are unital linear functionals on the algebra. Classical systems are described by commutative C*-algebras, therefore classical states are probability measures. Quantum mechanical systems have non-commutative observable algebras. In concrete considerations, quantum states are density operators. If the probability measure "μ" is a state on classical composite system consisting of two subsystem "A" and "B", we project "μ" onto the system "A" to obtain the reduced state. As stated above, the quantum analog of this is the partial trace operation, which can be viewed as projection onto a tensor component. "End of note"

We can now see that the appropriate definition of quantum mutual information is

:I(\rho^{AB}) = S(\rho^A) + S(\rho^B) - S(\rho^{AB}).
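Putting the pieces together, the quantum mutual information of a concrete state can be computed from the three von Neumann entropies. The Python sketch below does this for a two-qubit Bell state, repeating the eigenvalue-based entropy and reshape-based partial trace helpers from the sketches above so it is self-contained; all names are illustrative.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho) in bits, via the eigenvalues of rho."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]
    return -np.sum(lam * np.log2(lam))

def partial_trace(rho_AB, dim_A, dim_B, keep):
    """Reduced state of a bipartite density matrix; keep = 'A' or 'B'."""
    rho = rho_AB.reshape(dim_A, dim_B, dim_A, dim_B)
    if keep == 'A':
        return np.trace(rho, axis1=1, axis2=3)
    return np.trace(rho, axis1=0, axis2=2)

# Bell state (|00> + |11>)/√2: a pure joint state with maximally mixed marginals.
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_AB = np.outer(psi, psi)
rho_A = partial_trace(rho_AB, 2, 2, 'A')
rho_B = partial_trace(rho_AB, 2, 2, 'B')

# I(rho^AB) = S(rho^A) + S(rho^B) - S(rho^AB)
I = von_neumann_entropy(rho_A) + von_neumann_entropy(rho_B) - von_neumann_entropy(rho_AB)
print(I)   # 2.0 bits: twice the classical maximum for a pair of bits
```

The value of 2 bits, exceeding the 1 bit attainable by any pair of classical binary variables, reflects the entanglement of the Bell state.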

Quantum mutual information can be interpreted in the same way as in the classical case: it can be shown that

:I(\rho^{AB}) = S(\rho^{AB} \,\|\, \rho^A \otimes \rho^B),

where S(\cdot \,\|\, \cdot) denotes the quantum relative entropy.
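This identity can also be checked numerically. The sketch below evaluates the quantum relative entropy via matrix logarithms (SciPy's scipy.linalg.logm), so it assumes a full-rank state; a Bell state mixed with a little white noise serves as a hypothetical example, and the helper names are illustrative.

```python
import numpy as np
from scipy.linalg import logm

def von_neumann_entropy(rho):
    # S(rho) in bits, via the eigenvalues of rho
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]
    return -np.sum(lam * np.log2(lam))

def partial_trace(rho_AB, dim_A, dim_B, keep):
    # reduced state on A or B, by reshaping and tracing out the other subsystem
    rho = rho_AB.reshape(dim_A, dim_B, dim_A, dim_B)
    return np.trace(rho, axis1=1, axis2=3) if keep == 'A' else np.trace(rho, axis1=0, axis2=2)

def quantum_relative_entropy(rho, sigma):
    """S(rho || sigma) = Tr[rho (log rho - log sigma)] in bits; assumes full-rank inputs."""
    return float(np.real(np.trace(rho @ (logm(rho) - logm(sigma))))) / np.log(2.0)

# A full-rank correlated two-qubit state: a Bell state mixed with white noise.
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_AB = 0.9 * np.outer(psi, psi) + 0.1 * np.eye(4) / 4

rho_A = partial_trace(rho_AB, 2, 2, 'A')
rho_B = partial_trace(rho_AB, 2, 2, 'B')

via_entropies = (von_neumann_entropy(rho_A) + von_neumann_entropy(rho_B)
                 - von_neumann_entropy(rho_AB))
via_relative_entropy = quantum_relative_entropy(rho_AB, np.kron(rho_A, rho_B))
print(via_entropies, via_relative_entropy)   # the two values agree (≈ 1.50 bits)
```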

