Matrix determinant lemma

In mathematics, in particular linear algebra, the matrix determinant lemma[1][2] computes the determinant of the sum of an invertible matrix A and the dyadic product, uv^T, of a column vector u and a row vector v^T.

Statement

Suppose A is an invertible square matrix and u, v are column vectors. Then the matrix determinant lemma states that

\det(\mathbf{A}+\mathbf{uv}^\mathrm{T}) = (1 + \mathbf{v}^\mathrm{T}\mathbf{A}^{-1}\mathbf{u})\,\det(\mathbf{A}).

Here, uvT is the dyadic product of two vectors u and v.

Proof

First, the special case A = I follows from the equality:[3]


\begin{pmatrix} \mathbf{I} & 0 \\ \mathbf{v}^\mathrm{T} & 1 \end{pmatrix}
\begin{pmatrix} \mathbf{I}+\mathbf{uv}^\mathrm{T} & \mathbf{u} \\ 0 & 1 \end{pmatrix}
\begin{pmatrix} \mathbf{I} & 0 \\ -\mathbf{v}^\mathrm{T} & 1 \end{pmatrix} =
\begin{pmatrix} \mathbf{I} & \mathbf{u} \\ 0 & 1 + \mathbf{v}^\mathrm{T}\mathbf{u} \end{pmatrix}.

The determinant of the left-hand side is the product of the determinants of the three matrices. Since the first and third matrices are triangular with unit diagonal, each has determinant 1. The determinant of the middle matrix is our desired value. The determinant of the right-hand side is simply (1 + v^Tu). So we have the result:

\det(\mathbf{I}+\mathbf{uv}^\mathrm{T}) = (1 + \mathbf{v}^\mathrm{T}\mathbf{u}).
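This special case can be checked numerically; the following is a small sketch using NumPy, with arbitrary example vectors:

```python
import numpy as np

# Arbitrary example vectors (any column vectors work).
rng = np.random.default_rng(3)
n = 4
u = rng.standard_normal((n, 1))
v = rng.standard_normal((n, 1))

lhs = np.linalg.det(np.eye(n) + u @ v.T)   # n-by-n determinant
rhs = 1.0 + (v.T @ u).item()               # scalar given by the lemma
assert np.isclose(lhs, rhs)
```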

The general case then follows by writing A + uv^T = A(I + A^{-1}uv^T) and applying the special case with A^{-1}u in place of u:


\begin{align}
\det(\mathbf{A} + \mathbf{uv}^\mathrm{T}) &= \det(\mathbf{A}) \det(\mathbf{I} + \mathbf{A}^{-1}\mathbf{uv}^\mathrm{T})\\
&= \det(\mathbf{A}) (1 + \mathbf{v}^\mathrm{T} \mathbf{A}^{-1}\mathbf{u}).
\end{align}
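The general identity can likewise be verified numerically. In this sketch the shift by n·I is only an illustrative way to keep the random A invertible:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
# Shift by n*I to keep A comfortably invertible (illustrative choice).
A = rng.standard_normal((n, n)) + n * np.eye(n)
u = rng.standard_normal((n, 1))
v = rng.standard_normal((n, 1))

lhs = np.linalg.det(A + u @ v.T)
# Use solve(A, u) rather than forming A^{-1} explicitly.
rhs = (1.0 + (v.T @ np.linalg.solve(A, u)).item()) * np.linalg.det(A)
assert np.isclose(lhs, rhs)
```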

Application

If the determinant and inverse of A are already known, the formula provides a numerically cheap way to compute the determinant of A corrected by the rank-one matrix uv^T. The computation is cheap because the determinant of A + uv^T does not have to be computed from scratch: given A^{-1}, evaluating the right-hand side costs only O(n^2) operations, whereas computing a determinant in general costs O(n^3). Using unit vectors for u and/or v, individual columns, rows or elements[4] of A may be manipulated and a correspondingly updated determinant computed relatively cheaply in this way.

When the matrix determinant lemma is used in conjunction with the Sherman–Morrison formula, both the inverse and determinant may be conveniently updated together.
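A sketch of such a combined update, assuming A^{-1} and det(A) are already available. Both updates run in O(n^2), with no fresh factorization of A + uv^T:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = rng.standard_normal((n, n)) + n * np.eye(n)  # illustrative invertible A
u = rng.standard_normal((n, 1))
v = rng.standard_normal((n, 1))

A_inv = np.linalg.inv(A)   # assumed already known
det_A = np.linalg.det(A)   # assumed already known

# O(n^2) rank-one updates:
w = A_inv @ u
s = 1.0 + (v.T @ w).item()
det_new = s * det_A                            # matrix determinant lemma
A_inv_new = A_inv - (w @ (v.T @ A_inv)) / s    # Sherman-Morrison formula

assert np.isclose(det_new, np.linalg.det(A + u @ v.T))
assert np.allclose(A_inv_new, np.linalg.inv(A + u @ v.T))
```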

Generalization

Suppose A is an invertible n-by-n matrix and U, V are n-by-m matrices. Then

\det(\mathbf{A}+\mathbf{UV}^\mathrm{T}) = \det(\mathbf{I} + \mathbf{V}^\mathrm{T}\mathbf{A}^{-1}\mathbf{U})\,\det(\mathbf{A}).

In the special case \mathbf{A}=\mathbf{I} this is Sylvester's theorem for determinants.
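Both the generalized lemma and its Sylvester special case can be checked numerically; note how an n-by-n determinant reduces to an m-by-m one (the matrices below are arbitrary examples):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 6, 2
A = rng.standard_normal((n, n)) + n * np.eye(n)  # illustrative invertible A
U = rng.standard_normal((n, m))
V = rng.standard_normal((n, m))

# Generalized lemma: the right-hand determinant is only m-by-m.
lhs = np.linalg.det(A + U @ V.T)
rhs = np.linalg.det(np.eye(m) + V.T @ np.linalg.solve(A, U)) * np.linalg.det(A)

# Sylvester's theorem (A = I): det(I_n + U V^T) = det(I_m + V^T U).
syl_lhs = np.linalg.det(np.eye(n) + U @ V.T)
syl_rhs = np.linalg.det(np.eye(m) + V.T @ U)

assert np.isclose(lhs, rhs)
assert np.isclose(syl_lhs, syl_rhs)
```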

Given additionally an invertible m-by-m matrix W, the relationship can also be expressed as

\det(\mathbf{A}+\mathbf{UWV}^\mathrm{T}) = \det(\mathbf{W}^{-1} + \mathbf{V}^\mathrm{T}\mathbf{A}^{-1}\mathbf{U})\,\det(\mathbf{W})\,\det(\mathbf{A}).
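A numerical sketch of this weighted form, with W kept invertible by the same illustrative diagonal shift:

```python
import numpy as np

rng = np.random.default_rng(4)
n, m = 5, 3
A = rng.standard_normal((n, n)) + n * np.eye(n)  # illustrative invertible A
W = rng.standard_normal((m, m)) + m * np.eye(m)  # illustrative invertible W
U = rng.standard_normal((n, m))
V = rng.standard_normal((n, m))

lhs = np.linalg.det(A + U @ W @ V.T)
rhs = (np.linalg.det(np.linalg.inv(W) + V.T @ np.linalg.solve(A, U))
       * np.linalg.det(W) * np.linalg.det(A))
assert np.isclose(lhs, rhs)
```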

See also

  • The Sherman–Morrison formula, which shows how to update the inverse, A−1, to obtain (A+uvT)−1.
  • The Woodbury formula, which shows how to update the inverse, A−1, to obtain (A+UCVT)−1.

References

  1. ^ Harville, D. A. (1997). Matrix Algebra From a Statistician’s Perspective. Springer-Verlag. 
  2. ^ Brookes, M. (2005). "The Matrix Reference Manual" [online]. http://www.ee.ic.ac.uk/hp/staff/dmb/matrix/intro.html. 
  3. ^ Ding, J., Zhou, A. (2007). "Eigenvalues of rank-one updated matrices with some applications". Applied Mathematics Letters 20 (12): 1223–1226. doi:10.1016/j.aml.2006.11.016. ISSN 0893-9659. http://www.sciencedirect.com/science/article/B6TY9-4N3P02W-5/2/b7f582211325150af4c44674b5e06dd1. 
  4. ^ William H. Press, Brian P. Flannery, Saul A. Teukolsky, William T. Vetterling (1992). Numerical Recipes in C: The Art of Scientific Computing. Cambridge University Press. p. 73. ISBN 0-521-43108-5. 

Wikimedia Foundation. 2010.
