Tensor product

In mathematics, the tensor product, denoted by ⊗, may be applied in different contexts to vectors, matrices, tensors, vector spaces, algebras, topological vector spaces, and modules. In each case the significance of the symbol is the same: the most general bilinear operation. In some contexts, this product is also referred to as the outer product. The term "tensor product" is also used in relation to monoidal categories.

Example:

:\begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix} \otimes \begin{bmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \end{bmatrix} = \begin{bmatrix} a_{11} \begin{bmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \end{bmatrix} & a_{12} \begin{bmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \end{bmatrix} \\ a_{21} \begin{bmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \end{bmatrix} & a_{22} \begin{bmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \end{bmatrix} \end{bmatrix} = \begin{bmatrix} a_{11} b_{11} & a_{11} b_{12} & a_{12} b_{11} & a_{12} b_{12} \\ a_{11} b_{21} & a_{11} b_{22} & a_{12} b_{21} & a_{12} b_{22} \\ a_{21} b_{11} & a_{21} b_{12} & a_{22} b_{11} & a_{22} b_{12} \\ a_{21} b_{21} & a_{21} b_{22} & a_{22} b_{21} & a_{22} b_{22} \end{bmatrix}

Resultant rank = 4, resultant dimension = 4×4.

Here rank denotes the tensor rank (the number of requisite indices), while dimension counts the number of degrees of freedom in the resulting array; this tensor rank should not be confused with matrix rank, which for a Kronecker product is the product of the matrix ranks of the factors.

A representative case is the Kronecker product of any two rectangular arrays, considered as matrices. A dyadic product is the special case of the tensor product between two vectors of the same dimension.
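
As a quick computational sketch (not part of the original example; it assumes Python with NumPy, whose numpy.kron implements exactly this block construction), the 2×2 case above can be checked as follows:

 # Sketch: reproducing the 2x2 Kronecker (tensor) product example with NumPy.
 import numpy as np

 A = np.array([[1, 2],
               [3, 4]])          # plays the role of (a_ij)
 B = np.array([[5, 6],
               [7, 8]])          # plays the role of (b_kl)

 K = np.kron(A, B)               # 4x4 block matrix: each a_ij is replaced by a_ij * B
 print(K.shape)                  # (4, 4)
 print(np.array_equal(K[:2, :2], A[0, 0] * B))  # True: the top-left block is a_11 * B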

Tensor product of vector spaces

The tensor product "V" ⊗ "W" of two vector spaces "V" and "W" over a field "K" can be defined by the method of "generators and relations".

To construct "V" ⊗ "W", one begins with the set of ordered pairs in the Cartesian product "V" × "W". For the purposes of this construction, regard this Cartesian product as a set rather than a vector space. The free vector space on "V" × "W" is defined as the vector space in which the elements of "V" × "W" form a basis. Symbolically,

:F(V \times W) = \left\{ \sum_{i=1}^n \alpha_i e_{(v_i \times w_i)} \mid n \in \mathbb{N}, \alpha_i \in K, (v_i \times w_i) \in V \times W \right\},

where we have used the symbol "e"(v × w) to emphasize that these are taken to be linearly independent for distinct ("v" × "w") ∈ "V" × "W".

The tensor product arises by imposing the following three relations in "F"("V" × "W"):

* e_{(v_1+v_2) \times w} \sim e_{v_1 \times w} + e_{v_2 \times w}
* e_{v \times (w_1+w_2)} \sim e_{v \times w_1} + e_{v \times w_2}
* c e_{v \times w} \sim e_{(cv) \times w} \sim e_{v \times (cw)}

where "v", "v"i, "w", and "w"i are vectors from "V" and "W" (respectively), and "c" is from the underlying field "K". Denoting by "R" the space generated by these three equivalence relations, the tensor product of the two vector spaces "V" and "W" is then the quotient space

:V \otimes W = F(V \times W) / R

It is also called the tensor product space of "V" and "W" and is a vector space (which can be verified by directly checking the vector space axioms). The tensor product of two elements "v" and "w" is the equivalence class "e"("v" × "w") + "R" of "e"("v" × "w") in "V" ⊗ "W", denoted "v" ⊗ "w". This notation can somewhat obscure the fact that tensors are always cosets: manipulations performed via the representatives ("v", "w") must be checked to be independent of the particular choice of representative.

The space "R" is mapped to zero in "V" ⊗ "W", so that the above three equivalence relations become equalities in the tensor product space

* (v_1+v_2) \otimes w - (v_1 \otimes w + v_2 \otimes w) = 0
* v \otimes (w_1+w_2) - (v \otimes w_1 + v \otimes w_2) = 0
* cv \otimes w = v \otimes cw = c(v \otimes w).

Given bases {v_i} and {w_j} for "V" and "W" respectively, the tensors {v_i ⊗ w_j} form a basis for "V" ⊗ "W" (generally ordered so that v_i ⊗ w_{j+1} comes before v_{i+1} ⊗ w_j). The dimension of the tensor product is therefore the product of the dimensions of the original spaces; for instance, R^m ⊗ R^n has dimension "mn".

Elements of "V" ⊗ "W" are sometimes referred to as tensors, although this term refers to many other related concepts as well. [See tensor or tensor (intrinsic definition).] An element of "V" ⊗ "W" of the form "v" ⊗ "w" is called a pure or simple tensor. In general, an element of the tensor product space is not a pure tensor, but rather a finite linear combination of pure tensors. To wit, if "v"1 and "v"2 are linearly independent, and "w"1 and "w"2 are also linearly independent, then "v"1 ⊗ "w"1 + "v"2 ⊗ "w"2 cannot be written as a pure tensor.

Characterization by a universal property

The tensor product is characterized by a universal property. Consider the problem of mapping the Cartesian product "V" × "W" into a vector space "X" via a bilinear map "ψ". The tensor product construction "V" ⊗ "W", together with the natural bilinear embedding map "φ" : "V" × "W" → "V" ⊗ "W" given by

:\varphi(u, w) = u \otimes w,

is the "universal" solution to this problem in the following sense. For any other such pair ("X", "ψ"), where "X" is a vector space and "ψ" is a bilinear mapping "V" × "W" → "X", there exists a "unique" linear map

:T : V \otimes W \rightarrow X

such that

:\psi = T \circ \varphi.

As with any universal property, this characterizes the tensor product uniquely up to unique isomorphism.

An immediate consequence is the identification of the bilinear maps from "V" × "W" to "X"

:L^2(V \times W, X),

and the linear maps

:L(V \otimes W, X)

obtained by sending "ψ" to "T".
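
A finite-dimensional sketch of this correspondence (an added illustration, assuming NumPy; "V" = R^m and "W" = R^n are identified with coordinate vectors, and "V" ⊗ "W" with R^{mn} via numpy.kron): the bilinear map ψ("v", "w") = "v"ᵀM"w" corresponds to the linear map "T" whose coefficient vector is M flattened row-major.

 # Sketch: the bilinear map psi(v, w) = v^T M w on R^m x R^n corresponds to the linear
 # functional T on R^m (x) R^n ~ R^{mn} with coefficient vector M.reshape(-1), because
 # np.kron(v, w) lists the products v_i * w_j in the same row-major order.
 import numpy as np

 rng = np.random.default_rng(0)
 m, n = 3, 4
 M = rng.standard_normal((m, n))      # defines the bilinear map psi
 v = rng.standard_normal(m)
 w = rng.standard_normal(n)

 psi = v @ M @ w                      # bilinear map applied to (v, w)
 T = M.reshape(-1) @ np.kron(v, w)    # induced linear map applied to v (x) w

 print(np.allclose(psi, T))           # True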

As a functor

The tensor product also operates on linear maps between vector spaces. Specifically, given two linear maps "S" : "V" → "X" and "T" : "W" → "Y" between vector spaces, the tensor product of the two linear maps "S" and "T" is a linear map

:S \otimes T : V \otimes W \rightarrow X \otimes Y

defined by

:(S \otimes T)(v \otimes w) = S(v) \otimes T(w).

In this way, the tensor product becomes a bifunctor from the category of vector spaces to itself, covariant in both arguments.

The Kronecker product of two matrices is the matrix of the tensor product of the two corresponding linear maps under a standard choice of bases of the tensor products (see the article on Kronecker products).
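
A numerical sketch of this statement (an added illustration, assuming NumPy; the matrices S and T below are arbitrary stand-ins for the two linear maps):

 # Sketch: (S (x) T)(v (x) w) = S(v) (x) T(w), checked via Kronecker products.
 import numpy as np

 rng = np.random.default_rng(1)
 S = rng.standard_normal((2, 3))      # linear map V -> X  (V = R^3, X = R^2)
 T = rng.standard_normal((4, 5))      # linear map W -> Y  (W = R^5, Y = R^4)
 v = rng.standard_normal(3)
 w = rng.standard_normal(5)

 lhs = np.kron(S, T) @ np.kron(v, w)  # apply the matrix of S (x) T to v (x) w
 rhs = np.kron(S @ v, T @ w)          # S(v) (x) T(w)

 print(np.allclose(lhs, rhs))         # True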

More than two vector spaces

The construction and the universal property of the tensor product can be extended to allow for more than two vector spaces. For example, suppose that "V"1, "V"2, and "V"3 are three vector spaces. The tensor product "V"1⊗"V"2⊗"V"3 is defined along with a trilinear mapping from the direct product

:\varphi : V_1 \times V_2 \times V_3 \to V_1 \otimes V_2 \otimes V_3

so that any trilinear map "F" from the direct product to a vector space "W"

:F : V_1 \times V_2 \times V_3 \to W

factors uniquely as

:F = L \circ \varphi

where "L" is a linear map. The tensor product is uniquely characterized by this property, up to a unique isomorphism.

This construction is related to repeated tensor products of two spaces. For example, if "V"1, "V"2, and "V"3 are three vector spaces, then there are (natural) isomorphisms

:V_1 \otimes V_2 \otimes V_3 \cong V_1 \otimes (V_2 \otimes V_3) \cong (V_1 \otimes V_2) \otimes V_3.
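
In coordinates (a sketch assuming NumPy and the standard bases), these isomorphisms are invisible: the Kronecker product of coordinate vectors is literally associative.

 # Sketch: associativity of the tensor product at the level of coordinate vectors.
 import numpy as np

 a = np.array([1.0, 2.0])
 b = np.array([3.0, 5.0, 7.0])
 c = np.array([2.0, -1.0])

 left = np.kron(np.kron(a, b), c)     # (V1 (x) V2) (x) V3
 right = np.kron(a, np.kron(b, c))    # V1 (x) (V2 (x) V3)

 print(np.allclose(left, right))      # True: the same vector in R^(2*3*2)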

More generally, the tensor product of an arbitrary indexed family "V"i, "i" ∈ "I", is defined to be universal with respect to multilinear mappings of the direct product \prod_{i \in I} V_i.

Tensor powers and braiding

Let "n" be a non-negative integer. The "n"th tensor power of the vector space "V" is the "n"-fold tensor product of "V" with itself. That is:V^{otimes n} stackrel{def}{=} underbrace{Votimescdotsotimes V}_{n}.

A permutation σ of the set {1, 2, …, "n"} determines a mapping of the "n"th Cartesian power of "V",

:\sigma : V^n \to V^n,

defined by

:\sigma(v_1, v_2, \dots, v_n) = (v_{\sigma 1}, v_{\sigma 2}, \dots, v_{\sigma n}).

Let

:\varphi : V^n \to V^{\otimes n}

be the natural multilinear embedding of the Cartesian power of "V" into the tensor power of "V". Then, by the universal property, there is a unique isomorphism

:\tau_\sigma : V^{\otimes n} \to V^{\otimes n}

such that

:\varphi \circ \sigma = \tau_\sigma \circ \varphi.

The isomorphism τσ is called the braiding map associated to the permutation σ.
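
A small sketch of a braiding map (an added illustration, assuming NumPy and V = R^d, with V ⊗ V identified with d×d arrays): for the transposition σ = (1 2), the map τσ is simply the swap of array axes.

 # Sketch: the braiding map for the transposition sigma = (1 2) on V (x) V, V = R^d.
 # Identify V (x) V with d x d arrays; tau_sigma is then the axis transpose.
 import numpy as np

 d = 3
 rng = np.random.default_rng(2)
 v = rng.standard_normal(d)
 w = rng.standard_normal(d)

 t = np.kron(v, w).reshape(d, d)      # the pure tensor v (x) w as a 2-index array
 tau_t = t.transpose(1, 0)            # braiding: swap the two tensor factors

 print(np.allclose(tau_t.reshape(-1), np.kron(w, v)))  # True: tau(v (x) w) = w (x) v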

Tensor product of two tensors

A "tensor on "V" (generally abbreviated to simply a "tensor") is an element of a vector space of the form

:T^r_s(V) = \underbrace{V \otimes \dots \otimes V}_{r} \otimes \underbrace{V^* \otimes \dots \otimes V^*}_{s} = V^{\otimes r} \otimes (V^*)^{\otimes s}

for non-negative integers "r" and "s". There is a general formula for the components of a (tensor) product of two (or more) tensors. For example, if "F" and "G" are two covariant tensors of rank "m" and "n" respectively (i.e. F \in T^0_m(V) and G \in T^0_n(V)), then the components of their tensor product are given by

:(F \otimes G)_{i_1 i_2 \dots i_{m+n}} = F_{i_1 i_2 \dots i_m}\, G_{i_{m+1} i_{m+2} \dots i_{m+n}}.

[Analogous formulas also hold for contravariant tensors, as well as tensors of mixed variance; in many cases, such as when there is an inner product defined, the distinction is irrelevant.] In this example, it is assumed that there is a chosen basis "B" of the vector space "V", and the basis on any tensor space T^r_s is tacitly assumed to be the standard one (this basis is described in the article on Kronecker products). Thus, the components of the tensor product of two tensors are the ordinary product of the components of each tensor.

Note that in the tensor product, the factor "F" consumes the first rank("F") indices and the factor "G" consumes the next rank("G") indices, so

:\mathrm{rank}(F \otimes G) = \mathrm{rank}(F) + \mathrm{rank}(G).
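
A sketch of the component formula and the rank count (an added illustration, assuming NumPy; numpy.multiply.outer forms all products of components, adding one index per factor index):

 # Sketch: (F (x) G)_{i1...i_{m+n}} = F_{i1...im} * G_{i_{m+1}...i_{m+n}}, realized with
 # NumPy's outer product of arrays; the tensor rank (number of indices) adds.
 import numpy as np

 rng = np.random.default_rng(3)
 F = rng.standard_normal((2, 3))        # a tensor with m = 2 indices
 G = rng.standard_normal((4, 2, 5))     # a tensor with n = 3 indices

 FG = np.multiply.outer(F, G)           # same as np.tensordot(F, G, axes=0)

 print(FG.shape)                        # (2, 3, 4, 2, 5): rank 2 + 3 = 5
 print(np.isclose(FG[1, 2, 3, 0, 4], F[1, 2] * G[3, 0, 4]))  # True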

Example

Let "U" be a tensor of type (1,1) with components U^\alpha{}_\beta, and let "V" be a tensor of type (1,0) with components V^\gamma. Then

:U^\alpha{}_\beta V^\gamma = (U \otimes V)^\alpha{}_\beta{}^\gamma

and

:V^\mu U^\nu{}_\sigma = (V \otimes U)^{\mu\nu}{}_\sigma.

The tensor product inherits all the indices of its factors.

See also: Classical treatment of tensors

Kronecker product of two matrices

With matrices this operation is usually called the "Kronecker product", a term used to make clear that the result has a particular block structure imposed upon it, in which each element of the first matrix is replaced by the second matrix, scaled by that element. For matrices U and V this is:

:U \otimes V = \begin{bmatrix} u_{11}V & u_{12}V & \cdots \\ u_{21}V & u_{22}V \\ \vdots & & \ddots \end{bmatrix} = \begin{bmatrix} u_{11}v_{11} & u_{11}v_{12} & \cdots & u_{12}v_{11} & u_{12}v_{12} & \cdots \\ u_{11}v_{21} & u_{11}v_{22} & & u_{12}v_{21} & u_{12}v_{22} \\ \vdots & & \ddots \\ u_{21}v_{11} & u_{21}v_{12} \\ u_{21}v_{21} & u_{21}v_{22} \\ \vdots \end{bmatrix}.
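
As a sketch of how this block structure arises (an added illustration, assuming NumPy), one can form the four-index array of all products u_{ij}v_{kl} and then merge the row indices and the column indices:

 # Sketch: building the Kronecker product by hand from the 4-index array u_ij * v_kl.
 import numpy as np

 rng = np.random.default_rng(4)
 U = rng.standard_normal((2, 3))
 V = rng.standard_normal((4, 5))

 # 4-index tensor of all products, ordered so the row indices (i, k) and the
 # column indices (j, l) sit next to each other, then merge each pair.
 blocks = np.einsum('ij,kl->ikjl', U, V)
 K = blocks.reshape(U.shape[0] * V.shape[0], U.shape[1] * V.shape[1])

 print(np.allclose(K, np.kron(U, V)))   # True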

Tensor product of multilinear maps

Given multilinear maps f(x_1, \dots, x_k) and g(x_1, \dots, x_m), their tensor product is the multilinear function

:(f \otimes g)(x_1, \dots, x_{k+m}) = f(x_1, \dots, x_k)\, g(x_{k+1}, \dots, x_{k+m}).
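
A minimal sketch in Python (the helper tensor_product below is a hypothetical illustration, not a library function): the tensor product of multilinear maps splits its argument list and multiplies the two results.

 # Sketch: tensor product of multilinear maps as a higher-order function.
 # `tensor_product` is a hypothetical helper, not a standard library routine.
 def tensor_product(f, k, g):
     """Return (f (x) g)(x_1, ..., x_{k+m}) = f(x_1, ..., x_k) * g(x_{k+1}, ..., x_{k+m})."""
     def fg(*xs):
         return f(*xs[:k]) * g(*xs[k:])
     return fg

 # Example with two bilinear maps on R (k = m = 2).
 f = lambda x1, x2: 2 * x1 * x2
 g = lambda y1, y2: y1 * y2
 h = tensor_product(f, 2, g)
 print(h(1.0, 2.0, 3.0, 4.0))   # f(1,2) * g(3,4) = 4 * 12 = 48.0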

Tensor product of Hilbert spaces

The tensor product of two Hilbert spaces is another Hilbert space, which is defined as described below.

Definition

The discussion so far has been purely algebraic. In light of the extra structure on Hilbert spaces, one would like to introduce an inner product, and therefore a topology, on the tensor product that arise naturally from those of the factors. Let "H"1 and "H"2 be two Hilbert spaces with inner products \langle \cdot, \cdot \rangle_1 and \langle \cdot, \cdot \rangle_2, respectively. Construct the tensor product of "H"1 and "H"2 as vector spaces as explained above. We can turn this vector space tensor product into an inner product space by defining

:\langle \phi_1 \otimes \phi_2, \psi_1 \otimes \psi_2 \rangle = \langle \phi_1, \psi_1 \rangle_1 \, \langle \phi_2, \psi_2 \rangle_2 \quad \mbox{for all } \phi_1, \psi_1 \in H_1 \mbox{ and } \phi_2, \psi_2 \in H_2

and extending by linearity. That this inner product is the natural one is justified by the identification of scalar-valued bilinear maps on "H"1 × "H"2 and linear functionals on their vector space tensor product. Finally, take the completion under this inner product. The resulting Hilbert space is the tensor product of "H"1 and "H"2.
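
A finite-dimensional sketch of the defining identity (an added illustration, assuming NumPy, with "H"1 = C^2 and "H"2 = C^3 carrying the standard inner products; numpy.vdot conjugates its first argument):

 # Sketch: <phi1 (x) phi2, psi1 (x) psi2> = <phi1, psi1>_1 * <phi2, psi2>_2
 # for H1 = C^2 and H2 = C^3 with the standard inner products.
 import numpy as np

 rng = np.random.default_rng(5)
 phi1 = rng.standard_normal(2) + 1j * rng.standard_normal(2)
 psi1 = rng.standard_normal(2) + 1j * rng.standard_normal(2)
 phi2 = rng.standard_normal(3) + 1j * rng.standard_normal(3)
 psi2 = rng.standard_normal(3) + 1j * rng.standard_normal(3)

 lhs = np.vdot(np.kron(phi1, phi2), np.kron(psi1, psi2))
 rhs = np.vdot(phi1, psi1) * np.vdot(phi2, psi2)
 print(np.isclose(lhs, rhs))   # True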

Properties

If "H"1 and "H"2 have orthonormal bases"k"} and {ψ"l"}, respectively, then {φ"k" ⊗ ψ"l"} is an orthonormal basis for "H"1 ⊗ "H"2.

Examples and applications

The following examples show how tensor products arise naturally.

Given two measure spaces "X" and "Y", with measures μ and ν respectively, one may look at L2("X" × "Y"), the space of functions on "X" × "Y" that are square integrable with respect to the product measure μ × ν. If "f" is a square integrable function on "X", and "g" is a square integrable function on "Y", then we can define a function "h" on "X" × "Y" by "h"("x","y") = "f"("x") "g"("y"). The definition of the product measure ensures that all functions of this form are square integrable, so this defines a bilinear mapping L2("X") × L2("Y") → L2("X" × "Y"). Linear combinations of functions of the form "f"("x") "g"("y") are also in L2("X" × "Y"). It turns out that the set of linear combinations is in fact dense in L2("X" × "Y"), if L2("X") and L2("Y") are separable. This shows that L2("X") ⊗ L2("Y") is isomorphic to L2("X" × "Y"), and it also explains why we need to take the completion in the construction of the Hilbert space tensor product.

Similarly, we can show that L2("X"; "H"), denoting the space of square integrable functions "X" → "H", is isomorphic to L2("X") ⊗ "H" if this space is separable. The isomorphism maps "f"("x") ⊗ φ ∈ L2("X") ⊗ "H" to "f"("x")φ ∈ L2("X"; "H"). We can combine this with the previous example and conclude that L2("X") ⊗ L2("Y") and L2("X" × "Y") are both isomorphic to L2("X"; L2("Y")).

Tensor products of Hilbert spaces arise often in quantum mechanics. If some particle is described by the Hilbert space "H"1, and another particle is described by "H"2, then the system consisting of both particles is described by the tensor product of "H"1 and "H"2. For example, the state space of a quantum harmonic oscillator is L2(R), so the state space of two oscillators is L2(R) ⊗ L2(R), which is isomorphic to L2(R2). Therefore, the two-particle system is described by wave functions of the form φ("x"1, "x"2). A more intricate example is provided by the Fock spaces, which describe a variable number of particles.
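
A small sketch of this composite-system rule in finite dimensions (an added illustration, assuming NumPy; two qubits with single-particle space C^2 stand in for the oscillators):

 # Sketch: composite quantum states live in the tensor product of the factor spaces.
 # Two qubits: each single-particle space is C^2, the pair lives in C^2 (x) C^2 = C^4.
 import numpy as np

 zero = np.array([1.0, 0.0])              # |0>
 one = np.array([0.0, 1.0])               # |1>

 product_state = np.kron(zero, one)       # |0> (x) |1>, a pure (separable) state

 # The Bell state (|00> + |11>)/sqrt(2) is a sum of simple tensors that is not itself
 # a simple tensor: reshaped to a 2x2 matrix it has matrix rank 2 (it is entangled).
 bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)
 print(np.linalg.matrix_rank(product_state.reshape(2, 2)))  # 1
 print(np.linalg.matrix_rank(bell.reshape(2, 2)))           # 2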

Relation with the dual space

In the discussion on the universal property, replacing "X" by the underlying scalar field of "V" and "W" yields that the space (V \otimes W)^\star (the dual space of V \otimes W, containing all linear functionals on that space) is naturally identified with the space of all bilinear functionals on "V" × "W". In other words, every bilinear functional is a functional on the tensor product, and vice versa.

Whenever "V" and "W" are finite dimensional, there is a natural isomorphism between V^\star \otimes W^\star and (V \otimes W)^\star, whereas for vector spaces of arbitrary dimension we only have an inclusion V^\star \otimes W^\star \subset (V \otimes W)^\star. So, tensor products of linear functionals are bilinear functionals. This gives us a new way to look at the space of bilinear functionals, as a tensor product itself.

Types of tensors, e.g., alternating

Linear subspaces of the bilinear operators (or in general, multilinear operators) determine natural quotient spaces of the tensor space, which are frequently useful. See wedge product for the first major example. Another would be the treatment of algebraic forms as symmetric tensors.

Over more general rings

The notation \otimes_R refers to a tensor product of modules over a ring "R".

Tensor product for computer programmers


Array programming languages

Array programming languages may have this pattern built in. For example, in APL the tensor product is expressed as ∘.× (for example A ∘.× B or A ∘.× B ∘.× C). In J the tensor product is the dyadic form of */ (for example a */ b or a */ b */ c).

Note that J's treatment also allows the representation of some tensor fields (as a and b may be functions instead of constants; the result is then a derived function, and if a and b are differentiable, then a*/b is differentiable).

However, these kinds of notation are not universally present in array languages. Other array languages may require explicit treatment of indices (for example, Matlab), and/or may not support higher-order functions such as the Jacobian derivative (for example, Fortran/APL).
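
For comparison, a sketch of the same pattern in NumPy (an added illustration; not an array language in the APL/J sense, but its outer-product primitive plays the same role):

 # Sketch: NumPy analogues of APL's A ∘.× B and J's a */ b.
 import numpy as np

 a = np.array([1, 2, 3])
 b = np.array([10, 20])

 print(np.multiply.outer(a, b))        # 3x2 table of products, like a */ b in J
 print(np.multiply.outer(a, np.multiply.outer(b, a)).shape)  # (3, 2, 3), like a */ b */ a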

See also

*Outer product
*Tensor product of modules
*Tensor product of R-algebras
*Tensor product of fields
*Topological tensor product
*Tensor product of line bundles
*Tensor product of graphs
*Tensor product of quadratic forms
*Dyadic product


