Tensor products provide a systematic way to combine vector spaces into a larger space that encodes bilinear structure. They appear throughout modern mathematics, physics, numerical analysis, differential geometry, representation theory, and machine learning.
The tensor product transforms multilinear problems into linear problems. This reduction is the central reason for its importance.
If ordinary vectors describe quantities with one index, tensors describe quantities with multiple indices. Tensor products give the algebraic framework underlying this idea.
98.1 Motivation
Suppose $V$ and $W$ are vector spaces over a field $F$.
We often encounter maps
$$B : V \times W \to U$$
(where $U$ is another vector space over $F$) that are linear in each variable separately.
Such maps are called bilinear.
For example, the dot product
$$\cdot \,:\, \mathbb{R}^n \times \mathbb{R}^n \to \mathbb{R}$$
is bilinear because
$$(u_1 + u_2) \cdot v = u_1 \cdot v + u_2 \cdot v$$
and
$$(c\,u) \cdot v = c\,(u \cdot v).$$
The same properties hold in the second argument.
The tensor product constructs a vector space
$$V \otimes W$$
such that every bilinear map from $V \times W$ factors uniquely through a linear map from $V \otimes W$.
This converts bilinear algebra into ordinary linear algebra.
98.2 Bilinear Maps
A map
$$B : V \times W \to U$$
is bilinear if
$$B(v_1 + v_2, w) = B(v_1, w) + B(v_2, w)$$
and
$$B(c\,v, w) = c\,B(v, w)$$
for all $v, v_1, v_2 \in V$, $w \in W$, and scalars $c$.
Similarly,
$$B(v, w_1 + w_2) = B(v, w_1) + B(v, w_2)$$
and
$$B(v, c\,w) = c\,B(v, w).$$
Thus bilinearity means linearity in each argument independently.
Examples include:
| Bilinear map | Domain | Codomain |
|---|---|---|
| Dot product | $\mathbb{R}^n \times \mathbb{R}^n$ | $\mathbb{R}$ |
| Matrix multiplication | $\mathbb{R}^{m \times n} \times \mathbb{R}^{n \times p}$ | $\mathbb{R}^{m \times p}$ |
| Polynomial multiplication | $F[x] \times F[x]$ | $F[x]$ |
| Inner product | $V \times V$ | $F$ |
Bilinear maps arise naturally whenever two independent linear inputs interact.
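As a quick sanity check, the defining identities can be verified numerically for the dot product. A minimal sketch using NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
u1, u2, v = rng.standard_normal((3, 4))  # three random vectors in R^4
c = 2.5

# Additivity and homogeneity in the first argument:
assert np.isclose(np.dot(u1 + u2, v), np.dot(u1, v) + np.dot(u2, v))
assert np.isclose(np.dot(c * u1, v), c * np.dot(u1, v))

# ...and in the second argument:
assert np.isclose(np.dot(v, u1 + u2), np.dot(v, u1) + np.dot(v, u2))
```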
98.3 Definition of the Tensor Product
The tensor product of $V$ and $W$ is a vector space $V \otimes W$ together with a bilinear map
$$\otimes : V \times W \to V \otimes W$$
satisfying the following universal property:
For every vector space $U$ and every bilinear map
$$B : V \times W \to U,$$
there exists a unique linear map
$$\tilde{B} : V \otimes W \to U$$
such that
$$\tilde{B}(v \otimes w) = B(v, w) \quad \text{for all } v \in V,\ w \in W.$$
This definition may be represented by a commutative diagram in which $B$ factors through $\otimes$:
the composition $\tilde{B} \circ \otimes$ equals the original bilinear map $B$.
The tensor product therefore represents all bilinear maps simultaneously.
98.4 Pure Tensors
The image of a pair $(v, w)$ under $\otimes$ is written
$$v \otimes w.$$
Such elements are called pure tensors or simple tensors.
The tensor product space is generated by pure tensors.
The bilinearity conditions imply:
$$(v_1 + v_2) \otimes w = v_1 \otimes w + v_2 \otimes w, \qquad v \otimes (w_1 + w_2) = v \otimes w_1 + v \otimes w_2,$$
and
$$(c\,v) \otimes w = c\,(v \otimes w) = v \otimes (c\,w).$$
These identities define the algebraic structure of tensor products.
A general tensor is usually a sum of pure tensors:
$$t = \sum_{i=1}^{k} v_i \otimes w_i.$$
Not every tensor is itself pure.
98.5 Construction of the Tensor Product
The tensor product can be constructed explicitly.
Start with the free vector space generated by all pairs $(v, w) \in V \times W$. Then impose the bilinearity relations:
$$(v_1 + v_2, w) \sim (v_1, w) + (v_2, w), \qquad (v, w_1 + w_2) \sim (v, w_1) + (v, w_2),$$
$$(c\,v, w) \sim c\,(v, w) \sim (v, c\,w).$$
The quotient space obtained after imposing these relations is $V \otimes W$.
This construction ensures that bilinearity is built directly into the space itself.
98.6 Basis of a Tensor Product
Suppose
$$\{e_1, \dots, e_m\}$$
is a basis for $V$, and
$$\{f_1, \dots, f_n\}$$
is a basis for $W$.
Then
$$\{\, e_i \otimes f_j : 1 \le i \le m,\ 1 \le j \le n \,\}$$
is a basis for $V \otimes W$.
Therefore,
$$\dim(V \otimes W) = \dim(V)\,\dim(W).$$
If $\dim V = m$ and $\dim W = n$, then
$$\dim(V \otimes W) = mn.$$
This resembles the size of Cartesian products, but tensor products encode linear structure rather than ordered pairs.
98.7 Coordinates in Tensor Products
Suppose
$$v = \sum_{i} a_i e_i \quad \text{and} \quad w = \sum_{j} b_j f_j.$$
Then
$$v \otimes w = \sum_{i,j} a_i b_j \, (e_i \otimes f_j).$$
The coefficients multiply because of bilinearity.
This behavior explains why tensors naturally represent multidimensional arrays.
For example, if
$$v = (a_1, a_2) \quad \text{and} \quad w = (b_1, b_2),$$
then
$$v \otimes w = (a_1 b_1,\ a_1 b_2,\ a_2 b_1,\ a_2 b_2)$$
in the basis $\{e_1 \otimes f_1,\ e_1 \otimes f_2,\ e_2 \otimes f_1,\ e_2 \otimes f_2\}$.
The tensor product records all pairwise products of coordinates.
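In coordinates, this is exactly the outer product of the coordinate vectors. A minimal sketch with NumPy (the specific numbers are illustrative):

```python
import numpy as np

v = np.array([1.0, 2.0])
w = np.array([3.0, 4.0])

# Coordinates of v ⊗ w in the basis e_i ⊗ f_j: all pairwise products a_i * b_j.
t = np.outer(v, w)
# t == [[3., 4.],
#       [6., 8.]]
```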
98.8 Tensor Product of Linear Maps
Suppose
$$S : V_1 \to W_1$$
and
$$T : V_2 \to W_2$$
are linear maps.
The tensor product map
$$S \otimes T : V_1 \otimes V_2 \to W_1 \otimes W_2$$
is defined by
$$(S \otimes T)(v_1 \otimes v_2) = S(v_1) \otimes T(v_2).$$
This definition extends linearly to all tensors.
Tensor products therefore preserve linear structure at the level of transformations as well as spaces.
If matrices represent and , then the matrix representing is the Kronecker product of the matrices.
98.9 Kronecker Products
If
$$A = (a_{ij}) \in \mathbb{R}^{m \times n}$$
and
$$B \in \mathbb{R}^{p \times q},$$
then the Kronecker product $A \otimes B$ is the $mp \times nq$ block matrix whose $(i, j)$ block is $a_{ij} B$.
Explicitly,
$$A \otimes B = \begin{pmatrix} a_{11} B & \cdots & a_{1n} B \\ \vdots & \ddots & \vdots \\ a_{m1} B & \cdots & a_{mn} B \end{pmatrix}.$$
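A sketch with NumPy's `np.kron`, which also checks compatibility with the tensor product of linear maps: $(A \otimes B)(x \otimes y) = (Ax) \otimes (By)$.

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])

K = np.kron(A, B)   # 4x4 block matrix [[a11*B, a12*B], [a21*B, a22*B]]

# The Kronecker product acts factorwise on pure tensors:
x = np.array([1.0, -1.0])
y = np.array([2.0, 5.0])
assert np.allclose(K @ np.kron(x, y), np.kron(A @ x, B @ y))
```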
Kronecker products appear in signal processing, quantum mechanics, numerical PDEs, and tensor computation.
98.10 Tensor Rank
A tensor has rank $1$ if it is pure:
$$t = v \otimes w.$$
The rank of a tensor $t$ is the smallest number of pure tensors needed to express it:
$$\operatorname{rank}(t) = \min \Big\{\, r : t = \sum_{i=1}^{r} v_i \otimes w_i \,\Big\}.$$
Tensor rank generalizes matrix rank.
Unlike matrix rank, tensor rank is often difficult to compute: deciding the rank of a tensor of order three or higher is NP-hard in general.
Low-rank tensor approximation is important in machine learning and scientific computing because it compresses large multidimensional datasets.
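For second-order tensors (matrices), tensor rank coincides with matrix rank, and best low-rank approximations are computable via the singular value decomposition. A sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
# Build a rank-2 matrix as a sum of two pure tensors (outer products).
M = (np.outer(rng.standard_normal(5), rng.standard_normal(6))
     + np.outer(rng.standard_normal(5), rng.standard_normal(6)))

# For matrices, tensor rank equals matrix rank (2 here, for generic factors).
r = np.linalg.matrix_rank(M)

# Best rank-1 approximation in the Frobenius norm: keep the top singular triple.
U, s, Vt = np.linalg.svd(M)
M1 = s[0] * np.outer(U[:, 0], Vt[0])
```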
98.11 Dual Spaces and Tensor Products
Let $V^*$ denote the dual space of $V$.
The tensor product
$$V^* \otimes W$$
can be identified with the space of linear maps
$$V \to W.$$
If
$$\varphi \in V^* \quad \text{and} \quad w \in W,$$
then the tensor
$$\varphi \otimes w$$
acts on $v \in V$ by
$$(\varphi \otimes w)(v) = \varphi(v)\, w.$$
This is a rank-one linear operator.
More generally, for finite-dimensional $V$ and $W$,
$$V^* \otimes W \cong \operatorname{Hom}(V, W),$$
and every linear map is a finite sum of such rank-one operators.
This identification is fundamental throughout multilinear algebra.
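In coordinates, a choice of bases identifies $\varphi$ with a row vector and $\varphi \otimes w$ with the outer product $w\,\varphi^{\mathsf T}$. A minimal sketch (the specific vectors are illustrative):

```python
import numpy as np

phi = np.array([1.0, 0.0, 2.0])   # a linear functional on R^3 (as a covector)
w = np.array([4.0, 5.0])          # a vector in R^2

T = np.outer(w, phi)              # the operator phi ⊗ w, a 2x3 matrix

v = np.array([3.0, 1.0, 1.0])
# (phi ⊗ w)(v) = phi(v) * w
assert np.allclose(T @ v, np.dot(phi, v) * w)
# And it is indeed a rank-one operator:
assert np.linalg.matrix_rank(T) == 1
```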
98.12 Symmetric and Alternating Tensors
Tensor products contain important subspaces.
A tensor is symmetric if swapping indices does not change it.
For example,
$$v \otimes w + w \otimes v$$
is symmetric.
A tensor is alternating if swapping indices changes the sign.
For example,
$$v \otimes w - w \otimes v$$
is alternating.
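Over a field of characteristic not $2$, every second-order tensor splits into a symmetric part plus an alternating part. A numerical sketch for $2$-tensors on $\mathbb{R}^3$, represented as $3 \times 3$ arrays:

```python
import numpy as np

rng = np.random.default_rng(2)
T = rng.standard_normal((3, 3))   # a generic second-order tensor on R^3

S = (T + T.T) / 2                 # symmetric part: S[i, j] == S[j, i]
A = (T - T.T) / 2                 # alternating part: A[i, j] == -A[j, i]

assert np.allclose(S, S.T)
assert np.allclose(A, -A.T)
assert np.allclose(S + A, T)      # the two parts recover the original tensor
```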
These ideas lead to:
| Structure | Associated algebra |
|---|---|
| Symmetric tensors | Symmetric algebra |
| Alternating tensors | Exterior algebra |
| Mixed tensors | General tensor algebra |
Alternating tensors are central in differential geometry and topology.
98.13 Tensor Algebra
The tensor algebra of $V$ is
$$T(V) = \bigoplus_{k=0}^{\infty} V^{\otimes k}.$$
Here
$$V^{\otimes 0} = F, \qquad V^{\otimes 1} = V, \qquad V^{\otimes k} = \underbrace{V \otimes \cdots \otimes V}_{k \text{ factors}}.$$
The tensor algebra contains tensors of every order.
Elements of each graded piece have a natural interpretation:

| Space | Interpretation |
|---|---|
| $V$ | Vectors |
| $V \otimes V$ | Second-order tensors |
| $V \otimes V \otimes V$ | Third-order tensors |
| $V^{\otimes k}$ | $k$-tensors |
Tensor algebras provide the foundation for exterior algebras, Clifford algebras, and representation theory.
98.14 Tensors in Physics
Tensors describe physical quantities that remain meaningful under coordinate transformations.
Examples include:
| Tensor | Physical meaning |
|---|---|
| Stress tensor | Internal forces in materials |
| Metric tensor | Geometry of spacetime |
| Electromagnetic tensor | Electric and magnetic fields |
| Inertia tensor | Rotational dynamics |
| Curvature tensor | Spacetime curvature |
In relativity, tensors encode laws of physics independently of coordinates.
This coordinate-independent formulation is one of the major achievements of tensor calculus.
98.15 Tensors in Machine Learning
Modern machine learning frequently represents data as tensors.
Examples include:
| Data type | Tensor order |
|---|---|
| Vector | 1 |
| Matrix | 2 |
| RGB image | 3 |
| Video | 4 |
| Transformer attention weights | Higher-order |
Tensor decompositions generalize matrix factorizations such as the singular value decomposition.
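For matrices, the SVD is already a decomposition into pure tensors, $M = \sum_i \sigma_i\, u_i \otimes v_i$, which is exactly the pattern that the CP decomposition generalizes to higher order. A sketch:

```python
import numpy as np

rng = np.random.default_rng(3)
M = rng.standard_normal((4, 5))

# SVD writes M as a sum of pure (rank-one) tensors: sum_i s[i] * u_i ⊗ v_i.
U, s, Vt = np.linalg.svd(M, full_matrices=False)
M_rebuilt = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in range(len(s)))
assert np.allclose(M, M_rebuilt)
```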
Popular decompositions include:
| Decomposition | Purpose |
|---|---|
| CP decomposition | Rank decomposition |
| Tucker decomposition | Multilinear compression |
| Tensor train | Efficient high-dimensional representation |
Large-scale tensor computation is fundamental in deep learning systems.
98.16 Universal Properties
The universal property defines tensor products abstractly and uniquely.
This viewpoint is important because it avoids dependence on coordinates or explicit constructions.
Many advanced mathematical structures are defined by universal properties.
The tensor product is one of the simplest and most important examples.
The key principle is:
Bilinear maps from $V \times W$ correspond exactly to linear maps from $V \otimes W$.
This correspondence is natural and canonical.
98.17 Example
Let
$$V = W = \mathbb{R}^2$$
with standard basis
$$\{e_1, e_2\}.$$
Then
$$V \otimes W$$
has basis
$$\{\, e_1 \otimes e_1,\ e_1 \otimes e_2,\ e_2 \otimes e_1,\ e_2 \otimes e_2 \,\}.$$
Hence
$$\dim(V \otimes W) = 4.$$
If
$$v = a_1 e_1 + a_2 e_2 \quad \text{and} \quad w = b_1 e_1 + b_2 e_2,$$
then
$$v \otimes w = (a_1 e_1 + a_2 e_2) \otimes (b_1 e_1 + b_2 e_2).$$
Using bilinearity,
$$v \otimes w = a_1 b_1\, e_1 \otimes e_1 + a_1 b_2\, e_1 \otimes e_2 + a_2 b_1\, e_2 \otimes e_1 + a_2 b_2\, e_2 \otimes e_2.$$
This expansion illustrates how tensor products distribute across linear combinations.
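The expansion can be checked numerically via the Kronecker product, using illustrative coefficients $a_1 = 1,\ a_2 = 2,\ b_1 = 3,\ b_2 = -1$ (my choice, not from the text):

```python
import numpy as np

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
v = 1 * e1 + 2 * e2
w = 3 * e1 - 1 * e2

# Expand v ⊗ w by bilinearity in the basis {e_i ⊗ e_j} (pure tensors via np.kron).
expansion = (3 * np.kron(e1, e1) - 1 * np.kron(e1, e2)
             + 6 * np.kron(e2, e1) - 2 * np.kron(e2, e2))
assert np.allclose(np.kron(v, w), expansion)
```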
98.18 Summary
Tensor products combine vector spaces into larger spaces that encode multilinear structure.
The tensor product:
| Concept | Meaning |
|---|---|
| $V \otimes W$ | Space generated by bilinear combinations |
| $v \otimes w$ | Pure tensor |
| Universal property | Bilinear maps become linear |
| Basis tensors | Products of basis vectors |
| Kronecker product | Matrix realization of tensor products |
| Tensor rank | Minimal pure decomposition |
Tensor products unify algebra, geometry, analysis, physics, and computation. They provide the language for multilinear structures and higher-dimensional interactions. Much of modern mathematics can be viewed as the systematic study of structures built from tensor products.