Von Neumann entropy

In quantum statistical mechanics, the von Neumann entropy is the extension of the classical entropy concept to quantum mechanics.

John von Neumann rigorously established the mathematical framework for quantum mechanics in his work "Mathematische Grundlagen der Quantenmechanik". In this work he provided a theory of measurement in which the usual notion of wave-function collapse is described as an irreversible process (the so-called von Neumann or projective measurement).

Description

The density matrix was introduced, with different motivations, by von Neumann and by Lev Landau. The motivation that inspired Landau was the impossibility of describing a subsystem of a composite quantum system by a state vector. Von Neumann, on the other hand, introduced the density matrix in order to develop both quantum statistical mechanics and a theory of quantum measurements.

The density matrix formalism extends the tools of classical statistical mechanics to the quantum domain. In the classical framework one computes the partition function of the system in order to evaluate all possible thermodynamic quantities. Von Neumann introduced the density matrix in the context of states and operators in a Hilbert space: knowledge of the statistical density matrix operator allows one to compute all average quantities in a conceptually similar, but mathematically different, way.

Suppose we have a set of wave functions |Ψ⟩ which depend parametrically on a set of quantum numbers n_1, n_2, ..., n_N. The natural variable is the amplitude with which a particular wave function of the basic set participates in the actual wave function of the system. Let us denote the square of this amplitude by p(n_1, n_2, ..., n_N). The goal is to turn this quantity p into the classical density function in phase space: one has to verify that p goes over into the density function in the classical limit and that it has ergodic properties. After checking that p(n_1, n_2, ..., n_N) is a constant of motion, an ergodic assumption for the probabilities p(n_1, n_2, ..., n_N) makes p a function of the energy only.

After this procedure, one finally arrives at the density matrix formalism when seeking a form where p(n_1, n_2, ..., n_N) is invariant with respect to the representation used. In the form written above, p will only yield the correct expectation values for quantities which are diagonal with respect to the quantum numbers n_1, n_2, ..., n_N; expectation values of operators which are not diagonal involve the phases of the quantum amplitudes. Suppose we encode the quantum numbers n_1, n_2, ..., n_N into a single index i (or j). Then the wave function has the form

: |\Psi\rangle = \sum_i a_i |\psi_i\rangle.

The expectation value of an operator B which is not diagonal in these wave functions is then

: E(B) = \sum_{i,j} a_i^{*} a_j \langle i| B |j \rangle.

The role originally reserved for the quantities |a_i|^2 is thus taken over by the density matrix of the system S:

: \langle j| \rho |i \rangle = a_j a_i^{*}.

Therefore E(B) reads

: E(B) = \mathrm{Tr}(\rho B).

The invariance of this expression follows from matrix theory: the trace is invariant under cyclic permutations and changes of basis. We have thus described a framework in which the expectation value of a quantum operator, represented by a matrix, is obtained by taking the trace of the product of the density operator ρ and the operator B (the Hilbert–Schmidt scalar product between operators). Although the formalism was introduced here in the statistical-mechanics framework, it applies equally to finite quantum systems, which is typically the case where the state of the system cannot be described by a pure state but only by a statistical operator ρ of the form above. Mathematically, ρ is a positive semidefinite Hermitian matrix with unit trace.
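
The trace formula above can be checked numerically. The following sketch uses NumPy (an assumed dependency); the weights p_i, the basis states and the observable B are illustrative choices, not taken from the text:

```python
import numpy as np

# A mixed state of a two-level system: rho = sum_i p_i |psi_i><psi_i|.
# The weights and states below are illustrative choices.
p = [0.7, 0.3]
psi = [np.array([1.0, 0.0]), np.array([1.0, 1.0]) / np.sqrt(2)]
rho = sum(w * np.outer(v, v.conj()) for w, v in zip(p, psi))

# rho is Hermitian, positive semidefinite, with unit trace.
assert np.isclose(np.trace(rho), 1.0)

# Pauli-x as an observable B that is not diagonal in the chosen basis.
B = np.array([[0.0, 1.0], [1.0, 0.0]])

# Expectation value E(B) = Tr(rho B); it matches the ensemble average
# sum_i p_i <psi_i|B|psi_i>.
expect_trace = np.trace(rho @ B).real
expect_ensemble = sum(w * (v.conj() @ B @ v).real for w, v in zip(p, psi))
assert np.isclose(expect_trace, expect_ensemble)
```

Note that Tr(ρB) requires no choice of basis: any unitary change of basis leaves the trace unchanged.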

Given the density matrix ρ, von Neumann defined the entropy as

: S(\rho) = -\mathrm{Tr}(\rho \ln \rho),

which is a proper extension of the Gibbs entropy (and the Shannon entropy) to the quantum case. To compute S(ρ) one finds a basis in which ρ possesses a diagonal representation. Note that the entropy S(ρ) multiplied by the Boltzmann constant k_B equals the thermodynamical (physical) entropy. If the system is finite (finite-dimensional matrix representation), the entropy describes the departure of the system from a pure state; in other words, it measures the degree of mixture of the state describing a given finite system.

Properties of the von Neumann entropy

* S(ρ) is zero if and only if ρ is a pure state.
* S(ρ) is maximal, equal to ln N, for a maximally mixed state, N being the dimension of the Hilbert space.
* S(ρ) is invariant under changes of basis of ρ, that is, S(ρ) = S(U ρ U†), with U a unitary transformation.
* S(ρ) is concave, that is, given a collection of positive numbers λ_i summing to one and density operators ρ_i, we have S(∑_{i=1}^k λ_i ρ_i) ≥ ∑_{i=1}^k λ_i S(ρ_i).

* S(ρ) is additive: given two density matrices ρ_A, ρ_B describing independent systems A and B, S(ρ_A ⊗ ρ_B) = S(ρ_A) + S(ρ_B).
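
Several of these properties can be verified numerically; a minimal sketch, assuming NumPy and illustrative example states:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]   # 0 ln 0 = 0 by convention
    return float(-np.sum(evals * np.log(evals)))

# A pure state has zero entropy.
pure = np.array([[1.0, 0.0], [0.0, 0.0]])
assert np.isclose(von_neumann_entropy(pure), 0.0)

# The maximally mixed state in dimension N has entropy ln N.
N = 4
mixed = np.eye(N) / N
assert np.isclose(von_neumann_entropy(mixed), np.log(N))

# Additivity for independent systems: S(rho_A ⊗ rho_B) = S(rho_A) + S(rho_B).
rho_A = np.diag([0.5, 0.5])
rho_B = np.diag([0.25, 0.75])
s_joint = von_neumann_entropy(np.kron(rho_A, rho_B))
assert np.isclose(s_joint,
                  von_neumann_entropy(rho_A) + von_neumann_entropy(rho_B))
```

Diagonalizing ρ reduces the quantum entropy to the Shannon entropy of its eigenvalue distribution, which is what the helper function exploits.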

If instead ρ_A and ρ_B are the reduced density matrices of a general joint state ρ_AB, then

: |S(\rho_A) - S(\rho_B)| \leq S(\rho_{AB}) \leq S(\rho_A) + S(\rho_B).

The right-hand inequality is known as "subadditivity"; the two inequalities together are sometimes known as the "triangle inequality". They were proved in 1970 by Huzihiro Araki and Elliott H. Lieb. [Huzihiro Araki and Elliott H. Lieb, "Entropy Inequalities," Communications in Mathematical Physics, vol. 18, 160–170 (1970).] While in Shannon's theory the entropy of a composite system can never be lower than the entropy of any of its parts, in quantum theory this is not the case: it is possible that S(ρ_AB) = 0 while S(ρ_A) > 0 and S(ρ_B) > 0. Indeed, this can be seen as an indicator of an entangled state ρ_AB.
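
The Bell state is the standard example of this purely quantum effect: the joint state is pure (zero entropy), yet each subsystem is maximally mixed. A sketch assuming NumPy:

```python
import numpy as np

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

# Bell state (|00> + |11>)/sqrt(2): a pure, maximally entangled state of AB.
psi = np.zeros(4)
psi[0] = psi[3] = 1.0 / np.sqrt(2)
rho_AB = np.outer(psi, psi)

# Reduced state of A: trace out B. Index the matrix as rho[a b, a' b']
# and sum over the repeated B index.
rho_A = np.einsum('ijkj->ik', rho_AB.reshape(2, 2, 2, 2))

s_AB = von_neumann_entropy(rho_AB)   # 0: the joint state is pure
s_A = von_neumann_entropy(rho_A)     # ln 2: the subsystem is maximally mixed
assert np.isclose(s_AB, 0.0)
assert np.isclose(s_A, np.log(2))
```

Here S(ρ_AB) = 0 while S(ρ_A) = ln 2 > 0, saturating the left-hand (triangle) inequality since |S(ρ_A) - S(ρ_B)| = 0.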

* The von Neumann entropy is also "strongly subadditive": given three Hilbert spaces A, B, C,

: S(\rho_{ABC}) + S(\rho_{B}) \leq S(\rho_{AB}) + S(\rho_{BC}).

This is a much more difficult theorem; it was proved in 1973 by Elliott H. Lieb and Mary Beth Ruskai, [Elliott H. Lieb and Mary Beth Ruskai, "Proof of the Strong Subadditivity of Quantum-Mechanical Entropy," Journal of Mathematical Physics, vol. 14, 1938–1941 (1973).] using a matrix inequality of Elliott H. Lieb proved in the same year. [Elliott H. Lieb, "Convex Trace Functions and the Wigner–Yanase–Dyson Conjecture," Advances in Mathematics, vol. 11, 267–288 (1973).]

The von Neumann entropy is extensively used in various forms (conditional entropies, relative entropies, etc.) in the framework of quantum information theory, and entanglement measures are based upon quantities directly related to it. However, several papers in the literature have dealt with the possible inadequacy of the Shannon information measure, and consequently of the von Neumann entropy, as an appropriate quantum generalization of Shannon entropy. The main argument is that in classical measurement the Shannon information measure is a natural measure of our ignorance about the properties of a system, whose existence is independent of measurement. Conversely, quantum measurement cannot be claimed to reveal the properties of a system that existed before the measurement was made. This controversy has encouraged some authors to put forward the non-additivity of Tsallis' entropy (a generalization of the standard Boltzmann–Gibbs entropy) as the key to recovering a true quantal information measure, claiming that this non-additivity makes it suited to describing non-local correlations.

In 2004 A. Stotland, A. A. Pomeransky, E. Bachmat and D. Cohen introduced a new definition of entropy that reflects the inherent uncertainty of quantum mechanical states. This definition allows one to distinguish between the minimum uncertainty entropy of pure states and the excess statistical entropy of mixtures, and it satisfies the basic inequalities of information theory.

References

* John von Neumann, Mathematische Grundlagen der Quantenmechanik (Mathematical Foundations of Quantum Mechanics), Springer, Berlin, 1955. ISBN 3540592075.

* P. Pluch "Theory for Quantum Probability," PhD Thesis, Klagenfurt University (2006)
* A. Stotland, A. A. Pomeransky, E. Bachmat and D. Cohen, "The information entropy of quantum mechanical states," Europhysics Letters 67, 700 (2004). [http://arxiv.org/abs/quant-ph/0401021]


Wikimedia Foundation. 2010.
