Triangular matrix
In the mathematical discipline of linear algebra, a triangular matrix is a special kind of square matrix where the entries either below or above the main diagonal are zero. Because matrix equations with triangular matrices are easier to solve, they are very important in numerical analysis. The LU decomposition gives an algorithm to decompose any invertible matrix "A" into a normed lower triangular matrix "L" and an upper triangular matrix "U".

Description
A matrix of the form
:\mathbf{L} = \begin{bmatrix}
l_{1,1} &         &        &           & 0 \\
l_{2,1} & l_{2,2} &        &           &   \\
l_{3,1} & l_{3,2} & \ddots &           &   \\
\vdots  & \vdots  & \ddots & \ddots    &   \\
l_{n,1} & l_{n,2} & \ldots & l_{n,n-1} & l_{n,n}
\end{bmatrix}
is called a lower triangular matrix or left triangular matrix, and analogously a matrix of the form
:\mathbf{U} = \begin{bmatrix}
u_{1,1} & u_{1,2} & u_{1,3} & \ldots & u_{1,n}   \\
        & u_{2,2} & u_{2,3} & \ldots & u_{2,n}   \\
        &         & \ddots  & \ddots & \vdots    \\
        &         &         & \ddots & u_{n-1,n} \\
0       &         &         &        & u_{n,n}
\end{bmatrix}
is called an upper triangular matrix or right triangular matrix.
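As a small illustration (assuming NumPy is available; the matrix values are arbitrary), the lower and upper triangular parts of a matrix can be extracted with numpy.tril and numpy.triu, and triangularity can be tested the same way:

import numpy as np

A = np.array([[1., 4., 2.],
              [2., 3., 4.],
              [4., 9., 7.]])

L = np.tril(A)   # keeps the entries on and below the main diagonal, zeroes the rest
U = np.triu(A)   # keeps the entries on and above the main diagonal, zeroes the rest

def is_lower_triangular(M):
    # A matrix is lower triangular exactly when zeroing its strictly upper part changes nothing.
    return np.allclose(M, np.tril(M))

print(is_lower_triangular(L))  # True
print(is_lower_triangular(A))  # False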
The standard operations on triangular matrices conveniently preserve the triangular form: the sum and the product of two upper triangular matrices are again upper triangular. The inverse of an invertible upper triangular matrix is also upper triangular, and a scalar multiple of an upper triangular matrix remains upper triangular. This means that the upper triangular matrices form a subalgebra of the ring of square matrices of any given size. The analogous result holds for lower triangular matrices. Note, however, that the product of a "lower" triangular with an "upper" triangular matrix does "not" preserve triangularity.
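A minimal sketch of these closure properties (assuming NumPy; the matrices are arbitrary examples):

import numpy as np

def is_upper_triangular(M):
    return np.allclose(M, np.triu(M))

U1 = np.array([[1., 4., 2.],
               [0., 3., 4.],
               [0., 0., 1.]])
U2 = np.array([[2., 0., 5.],
               [0., 1., 3.],
               [0., 0., 6.]])
L1 = np.array([[1., 0., 0.],
               [2., 8., 0.],
               [4., 9., 7.]])

print(is_upper_triangular(U1 + U2))            # True: sums stay upper triangular
print(is_upper_triangular(U1 @ U2))            # True: products stay upper triangular
print(is_upper_triangular(np.linalg.inv(U1)))  # True: the inverse stays upper triangular
print(is_upper_triangular(L1 @ U1))            # False: lower times upper is generally not triangular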
Special forms

Strictly triangular matrix
A triangular matrix with zero entries on the main diagonal is called strictly upper (or lower) triangular. All strictly triangular matrices are nilpotent.

Unitriangular matrix
If the entries on the main diagonal are 1, the matrix is called (upper or lower) unitriangular. All unitriangular matrices are unipotent. Other names used for these matrices are unit (upper or lower) triangular (of which "unitriangular" might be a contraction), or normed upper/lower triangular (very rarely used). However, a "unit" triangular matrix is not the same as the "unit matrix", and a "normed" triangular matrix has nothing to do with the notion of matrix norm.

Gauss matrix
A Gauss matrix is a special form of a unitriangular matrix, where all of the off-diagonal entries are zero except for the entries in a single column. Such a matrix is also called an atomic upper/lower triangular or Gauss (transformation) matrix. An atomic lower triangular matrix is thus of the form
:\mathbf{L}_{i} = \begin{bmatrix}
     1 &        &        &           &        &        &   & 0 \\
     0 & \ddots &        &           &        &        &   &   \\
     0 & \ddots & 1      &           &        &        &   &   \\
     0 & \ddots & 0      & 1         &        &        &   &   \\
       &        & 0      & l_{i+1,i} & 1      &        &   &   \\
\vdots &        & 0      & l_{i+2,i} & 0      & \ddots &   &   \\
       &        & \vdots & \vdots    & \vdots & \ddots & 1 &   \\
     0 & \dots  & 0      & l_{n,i}   & 0      & \dots  & 0 & 1
\end{bmatrix}.
The inverse of an atomic triangular matrix is again atomic triangular. Indeed, we have
:\mathbf{L}_{i}^{-1} = \begin{bmatrix}
     1 &        &        &            &        &        &   & 0 \\
     0 & \ddots &        &            &        &        &   &   \\
     0 & \ddots & 1      &            &        &        &   &   \\
     0 & \ddots & 0      & 1          &        &        &   &   \\
       &        & 0      & -l_{i+1,i} & 1      &        &   &   \\
\vdots &        & 0      & -l_{i+2,i} & 0      & \ddots &   &   \\
       &        & \vdots & \vdots     & \vdots & \ddots & 1 &   \\
     0 & \dots  & 0      & -l_{n,i}   & 0      & \dots  & 0 & 1
\end{bmatrix},
i.e., the single column of off-diagonal entries is replaced in the inverse matrix by its additive inverses.

Special properties
A matrix which is simultaneously upper and lower triangular is diagonal. The identity matrix is the only matrix which is both upper and lower unitriangular.

A matrix which is simultaneously triangular and normal is also diagonal. This can be seen by comparing the diagonal entries of "A"*"A" and "AA"*, where "A" is a normal triangular matrix.
The transpose of an upper triangular matrix is a lower triangular matrix, and vice versa.

The determinant of a triangular matrix equals the product of its diagonal entries. Since for any triangular matrix "A" the matrix xI − A, whose determinant is the characteristic polynomial of "A", is also triangular, the diagonal entries of "A" in fact give the multiset of eigenvalues of "A" (an eigenvalue with multiplicity "m" occurs exactly "m" times as a diagonal entry). [Sheldon Axler, "Linear Algebra Done Right", Springer-Verlag, 1996, ISBN 0-387-98258-2, pp. 86-87, 169.]

Any complex square matrix is similar to a triangular matrix. In fact, a matrix "A" over a field containing all of the eigenvalues of "A" (for example, any matrix over an algebraically closed field) is similar to a triangular matrix. A more precise statement is given by the Jordan normal form theorem, which states that in this situation, "A" is similar to an upper triangular matrix of a very particular form. The simpler triangularization result is often sufficient, however, and in any case it is used in proving the Jordan normal form theorem. [I. N. Herstein, "Topics in Algebra" (2nd edition), John Wiley and Sons, 1975, ISBN 0-471-01090-1, pp. 285-290.]

In the case of complex matrices, it is possible to say more about triangularization, namely, that any square matrix "A" has a Schur decomposition. This means that "A" is unitarily equivalent (i.e. similar, using a unitary matrix as change of basis) to an upper triangular matrix.

The variable "L" is commonly used for a lower triangular matrix, standing for lower/left, while the variable "U" or "R" is commonly used for an upper triangular matrix, standing for upper/right.
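A short numerical sketch of the determinant, eigenvalue, and Schur-decomposition statements above (assuming NumPy and SciPy; the matrices are arbitrary examples):

import numpy as np
from scipy.linalg import schur

U = np.array([[1., 4., 2.],
              [0., 3., 4.],
              [0., 0., 5.]])

# The determinant equals the product of the diagonal entries.
print(np.linalg.det(U), np.prod(np.diag(U)))                # both 15 (up to rounding)

# The eigenvalues are exactly the diagonal entries.
print(np.sort(np.linalg.eigvals(U)), np.sort(np.diag(U)))   # [1. 3. 5.] twice

# Schur decomposition: any square matrix is unitarily similar to an upper triangular matrix.
A = np.array([[0., -1.],
              [1.,  0.]])                  # a rotation; not triangular, complex eigenvalues
T, Z = schur(A, output='complex')          # A = Z T Z*, with T upper triangular and Z unitary
print(np.allclose(A, Z @ T @ Z.conj().T))  # True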
Many operations can be performed more easily on triangular matrices than on general matrices. Notably, matrix inversion (when possible) can be done by back substitution, avoiding the complications of general Gaussian elimination.
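For illustration, a triangular matrix can be inverted by back-substituting against the columns of the identity matrix; a minimal sketch using SciPy's triangular solver (assuming SciPy is available; the matrix values are arbitrary):

import numpy as np
from scipy.linalg import solve_triangular

U = np.array([[2., 1., 3.],
              [0., 4., 5.],
              [0., 0., 6.]])

# Solving U X = I by back substitution, one identity column at a time, yields X = U^{-1}.
U_inv = solve_triangular(U, np.eye(3), lower=False)

print(np.allclose(U @ U_inv, np.eye(3)))   # True
print(np.allclose(U_inv, np.triu(U_inv)))  # True: the inverse is again upper triangular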
Generalizations

Because the product of two upper triangular matrices is again upper triangular, the set of upper triangular matrices forms an algebra. Algebras of upper triangular matrices have a natural generalization in functional analysis which yields nest algebras on Hilbert spaces.

The set of invertible triangular matrices of a given kind (upper or lower) forms a group, which is a subgroup of the general linear group of all invertible matrices. The group of 2 by 2 upper unitriangular matrices is isomorphic to the additive group of the field of scalars, as the worked product at the end of this section shows; in the case of complex numbers it corresponds to a group formed of parabolic Möbius transformations; the 3 by 3 upper unitriangular matrices form the Heisenberg group.

The upper triangular matrices are precisely those that stabilize the standard flag. The invertible ones among them form a subgroup of the general linear group, whose conjugate subgroups are those defined as the stabilizer of some (other) complete flag. These subgroups are Borel subgroups. The group of invertible lower triangular matrices is such a subgroup, since it is the stabilizer of the standard flag associated to the standard basis in reverse order.

The stabilizer of a partial flag obtained by forgetting some parts of the standard flag can be described as a set of block upper triangular matrices (but its elements are "not" all triangular matrices). The conjugates of such a group are the subgroups defined as the stabilizer of some partial flag. These subgroups are called parabolic subgroups.
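For instance, the isomorphism between the 2 by 2 upper unitriangular matrices and the additive group of scalars mentioned above can be seen from a direct computation:
:\begin{bmatrix} 1 & a \\ 0 & 1 \end{bmatrix}
\begin{bmatrix} 1 & b \\ 0 & 1 \end{bmatrix}
=
\begin{bmatrix} 1 & a+b \\ 0 & 1 \end{bmatrix},
so multiplying such matrices simply adds their off-diagonal entries.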
Examples

The matrix
:\begin{bmatrix}
1 & 4 & 2 \\
0 & 3 & 4 \\
0 & 0 & 1
\end{bmatrix}
is upper triangular and
:\begin{bmatrix}
1 & 0 & 0 \\
2 & 8 & 0 \\
4 & 9 & 7
\end{bmatrix}
is lower triangular.

The matrix
:\begin{bmatrix}
1 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 \\
0 & 4 & 1 & 0 \\
0 & 2 & 0 & 1
\end{bmatrix}
is atomic lower triangular. Its inverse is
:\begin{bmatrix}
1 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 \\
0 & -4 & 1 & 0 \\
0 & -2 & 0 & 1
\end{bmatrix}.
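These examples can be checked numerically; a minimal sketch, assuming NumPy:

import numpy as np

L = np.array([[1., 0., 0., 0.],
              [0., 1., 0., 0.],
              [0., 4., 1., 0.],
              [0., 2., 0., 1.]])
L_inv = np.array([[1., 0., 0., 0.],
                  [0., 1., 0., 0.],
                  [0., -4., 1., 0.],
                  [0., -2., 0., 1.]])

print(np.allclose(L @ L_inv, np.eye(4)))  # True: negating the off-diagonal column inverts the matrix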
Forward and back substitution
A matrix equation in the form \mathbf{L}\mathbf{x} = \mathbf{b} or \mathbf{U}\mathbf{x} = \mathbf{b} is very easy to solve [assuming that L (respectively, U) is invertible, i.e. has non-zero diagonal entries; otherwise it cannot be solved in general] by an iterative process called forward substitution for lower triangular matrices and, analogously, back substitution for upper triangular matrices. The process is so called because for lower triangular matrices, one first computes x_1, then substitutes that "forward" into the "next" equation to solve for x_2, and repeats through to x_n. In an upper triangular matrix, one works "backwards": first computing x_n, then substituting that "back" into the "previous" equation to solve for x_{n-1}, and repeating through x_1.
Notice that this does not require inverting the matrix.
Forward substitution
The matrix equation L"x" = "b" can be written as a system of linear equations
:\begin{matrix}
l_{1,1} x_1 &   &             &            &             & = & b_1 \\
l_{2,1} x_1 & + & l_{2,2} x_2 &            &             & = & b_2 \\
\vdots      &   & \vdots      & \ddots     &             &   & \vdots \\
l_{m,1} x_1 & + & l_{m,2} x_2 & + \dotsb + & l_{m,m} x_m & = & b_m
\end{matrix}
Observe that the first equation (l_{1,1} x_1 = b_1) only involves x_1, and thus one can solve for x_1 directly. The second equation only involves x_1 and x_2, and thus can be solved once one substitutes in the already solved value for x_1. Continuing in this way, the k-th equation only involves x_1, \dots, x_k, and one can solve for x_k using the previously solved values for x_1, \dots, x_{k-1}.
The resulting formulas are:
:x_1 = \frac{b_1}{l_{1,1}},
:x_2 = \frac{b_2 - l_{2,1} x_1}{l_{2,2}},
:\quad\vdots
:x_m = \frac{b_m - \sum_{i=1}^{m-1} l_{m,i} x_i}{l_{m,m}}.
A matrix equation with an upper triangular matrix U can be solved in an analogous way, only working backwards.
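A minimal implementation of both procedures, following the formulas above (assuming NumPy; the function names and example values are illustrative, not from the original article):

import numpy as np

def forward_substitution(L, b):
    # Solve L x = b for a lower triangular L with non-zero diagonal entries.
    n = len(b)
    x = np.zeros(n)
    for k in range(n):
        x[k] = (b[k] - L[k, :k] @ x[:k]) / L[k, k]
    return x

def back_substitution(U, b):
    # Solve U x = b for an upper triangular U with non-zero diagonal entries.
    n = len(b)
    x = np.zeros(n)
    for k in range(n - 1, -1, -1):
        x[k] = (b[k] - U[k, k + 1:] @ x[k + 1:]) / U[k, k]
    return x

L = np.array([[2., 0., 0.],
              [3., 1., 0.],
              [1., 4., 5.]])
b = np.array([2., 5., 19.])
x = forward_substitution(L, b)
print(x)                      # [1. 2. 2.]
print(np.allclose(L @ x, b))  # True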
Applications
Forward substitution is used in financial bootstrapping to construct a yield curve.

See also
* Gaussian elimination
* QR decomposition
* Hessenberg matrix
* Tridiagonal matrix
* Invariant subspace