Conditional quantum entropy

The conditional quantum entropy is an entropy measure used in quantum information theory. It generalizes the conditional entropy of classical information theory. The conditional quantum entropy is written S(ρ|σ) or H(ρ|σ), depending on the notation used for the von Neumann entropy.

For the remainder of the article, we use the notation S(ρ) for the von Neumann entropy.

Definition

Given two quantum states ρ and σ, their von Neumann entropies are S(ρ) and S(σ), where S(ρ) = −Tr(ρ log₂ ρ). The von Neumann entropy measures how uncertain we are about a state, that is, how mixed it is. The joint quantum entropy S(ρ,σ) measures the uncertainty about the joint system that contains both states.

By analogy with the classical conditional entropy, one defines the conditional quantum entropy as

S(ρ|σ) := S(ρ,σ) − S(σ).
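To make the definition concrete, here is a minimal numerical sketch, assuming NumPy and a two-qubit system whose joint density matrix is given; the helper names von_neumann_entropy and conditional_entropy are our own, not a standard API. The conditioning state σ is obtained from the joint state by a partial trace over the first subsystem.

    import numpy as np

    def von_neumann_entropy(rho):
        # S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho
        evals = np.linalg.eigvalsh(rho)
        evals = evals[evals > 1e-12]            # 0 log 0 = 0 by convention
        return float(-np.sum(evals * np.log2(evals)))

    def conditional_entropy(rho_joint, d1, d2):
        # S(rho|sigma) = S(rho,sigma) - S(sigma), where sigma is the reduced
        # state of the second (conditioning) subsystem of dimension d2
        sigma = np.trace(rho_joint.reshape(d1, d2, d1, d2), axis1=0, axis2=2)
        return von_neumann_entropy(rho_joint) - von_neumann_entropy(sigma)

For example, for the product of two maximally mixed qubits, conditional_entropy(np.eye(4) / 4, 2, 2) returns 2 − 1 = 1, matching the classical intuition that conditioning on an uncorrelated system removes no uncertainty.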

An equivalent and more intuitive operational definition of the conditional quantum entropy, as a measure of the quantum communication cost or surplus when performing quantum state merging, was given by Michał Horodecki, Jonathan Oppenheim, and Andreas Winter in their paper "Quantum information can be negative" [1].

Properties

Unlike the classical conditional entropy, the conditional quantum entropy can be negative. This is true even though the von Neumann entropy of a single variable is never negative.
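As an illustration (the Bell-state example below is ours, though it is the standard one): for the maximally entangled two-qubit state |Φ⁺⟩ = (|00⟩ + |11⟩)/√2, the joint state is pure, so S(ρ,σ) = 0, while the marginal σ is maximally mixed, so S(σ) = 1, giving S(ρ|σ) = 0 − 1 = −1. A short self-contained NumPy check, under the same conventions as the sketch above:

    import numpy as np

    def S(rho):
        # von Neumann entropy in bits
        ev = np.linalg.eigvalsh(rho)
        ev = ev[ev > 1e-12]
        return float(-np.sum(ev * np.log2(ev)))

    phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)  # |00> + |11>, normalized
    rho_joint = np.outer(phi, phi)                     # pure joint state: S = 0
    sigma = np.trace(rho_joint.reshape(2, 2, 2, 2),    # trace out the first qubit
                     axis1=0, axis2=2)                 # marginal = I/2: S = 1
    print(S(rho_joint) - S(sigma))                     # prints -1.0

A negative value is impossible classically and witnesses entanglement between the two subsystems; this is the "negative information" given an operational meaning by quantum state merging.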

References

[1] Horodecki, Michał; Oppenheim, Jonathan; Winter, Andreas (2005). "Quantum information can be negative" (published as "Partial quantum information", Nature 436: 673–676). arXiv:quant-ph/0505062.

[2] Nielsen, Michael A.; Chuang, Isaac L. (2000). Quantum Computation and Quantum Information. Cambridge University Press. ISBN 0-521-63503-9.

