An invariant subspace is a subspace that is preserved by a linear operator.
Let $V$ be a vector space over a field $F$, and let
$$T : V \to V$$
be a linear operator. A subspace $W \subseteq V$ is called invariant under $T$, or $T$-invariant, if
$$T(W) \subseteq W.$$
Equivalently,
$$T(w) \in W \quad \text{for every } w \in W.$$
The operator $T$ may move vectors inside $W$, but it does not move them outside $W$. This is the standard definition of an invariant subspace.
43.1 Basic Meaning
An invariant subspace is a smaller space on which the operator acts by itself.
If
$$W \subseteq V$$
is $T$-invariant, then the restriction
$$T|_W : W \to W$$
is a well-defined linear operator on $W$.
This is the main reason invariant subspaces are important. They allow us to study a large operator by studying its action on smaller subspaces.
For example, if $T$ preserves a plane $W$, then the behavior of $T$ on that plane can be analyzed separately from the rest of the space.
43.2 First Examples
Every operator $T : V \to V$ has at least two invariant subspaces:
$$\{0\} \quad \text{and} \quad V.$$
The zero subspace is invariant because
$$T(0) = 0 \in \{0\}.$$
The whole space is invariant because
$$T(v) \in V \quad \text{for every } v \in V.$$
These are called trivial invariant subspaces. The interesting question is whether an operator has nontrivial invariant subspaces, meaning subspaces other than $\{0\}$ and $V$.
43.3 Kernel and Image
The kernel and image of a linear operator are invariant subspaces.
Let
$$T : V \to V$$
be linear.
The kernel is
$$\ker T = \{ v \in V : T(v) = 0 \}.$$
If $v \in \ker T$, then
$$T(v) = 0.$$
Since
$$T(0) = 0,$$
we have
$$0 \in \ker T, \qquad \text{so} \qquad T(v) = 0 \in \ker T.$$
Thus
$\ker T$
is $T$-invariant.
The image is
$$\operatorname{im} T = \{ T(v) : v \in V \}.$$
If $w \in \operatorname{im} T$, then
$$w = T(v)$$
for some $v \in V$. Applying $T$ again gives
$$T(w) = T(T(v)).$$
Since $T(w)$ is still an output of $T$, it lies in $\operatorname{im} T$. Therefore
$\operatorname{im} T$
is $T$-invariant. Kernel, image, and eigenspaces are standard examples of invariant subspaces.
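Both invariances can be checked numerically. The matrix below is an arbitrary singular sample chosen for illustration, not one taken from the text:

```python
import numpy as np

# A singular 2x2 matrix (sample choice) with a nontrivial kernel.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# v = (2, -1) satisfies A v = 0, so v spans ker A.
v = np.array([2.0, -1.0])
assert np.allclose(A @ v, 0)          # v lies in the kernel
assert np.allclose(A @ (A @ v), 0)    # A v = 0 stays in the kernel

# The image of A is spanned by (1, 2); applying A to a vector in the
# image returns a multiple of (1, 2), so the image is invariant.
w = A @ np.array([1.0, 0.0])          # w = (1, 2) lies in im A
Aw = A @ w                            # Aw = 5 * (1, 2), still in im A
assert np.allclose(Aw, 5 * w)
```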
43.4 Eigenspaces
Eigenspaces are invariant subspaces.
Let $\lambda$ be an eigenvalue of $T$. The eigenspace associated with $\lambda$ is
$$E_\lambda = \{ v \in V : T(v) = \lambda v \}.$$
Thus
$$E_\lambda = \ker(T - \lambda I).$$
If $v \in E_\lambda$, then
$$T(v) = \lambda v.$$
Since $E_\lambda$ is a subspace, and $v \in E_\lambda$, we have
$$\lambda v \in E_\lambda.$$
Therefore
$$T(v) \in E_\lambda.$$
So
$E_\lambda$
is invariant under $T$.
This means that an eigenvector does more than give a special direction. The whole eigenspace is preserved by the operator.
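A quick numeric sketch, using a sample symmetric matrix of our own choosing whose eigenvalue $3$ has eigenspace $\operatorname{span}\{(1,1)\}$:

```python
import numpy as np

# Sample matrix with eigenvalue 3 and eigenvector (1, 1).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
v = np.array([1.0, 1.0])

assert np.allclose(A @ v, 3 * v)              # T(v) = 3v
# The whole eigenline is preserved: A(cv) = 3(cv) for any scalar c.
c = -4.0
assert np.allclose(A @ (c * v), 3 * (c * v))
```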
43.5 Example: A Diagonal Matrix
Let
$$A = \begin{pmatrix} a & 0 & 0 \\ 0 & b & 0 \\ 0 & 0 & c \end{pmatrix}$$
be a diagonal matrix acting on $\mathbb{R}^3$.
The coordinate axes are invariant under $A$. For example, the $x$-axis
$$W_1 = \{ (t, 0, 0) : t \in \mathbb{R} \}$$
is invariant because
$$A \begin{pmatrix} t \\ 0 \\ 0 \end{pmatrix} = \begin{pmatrix} a t \\ 0 \\ 0 \end{pmatrix} \in W_1.$$
Similarly, the $y$-axis
$$W_2 = \{ (0, t, 0) : t \in \mathbb{R} \}$$
and the $z$-axis
$$W_3 = \{ (0, 0, t) : t \in \mathbb{R} \}$$
are invariant.
The coordinate planes are also invariant. For example, consider the $xy$-plane
$$W = \{ (x, y, 0) : x, y \in \mathbb{R} \}.$$
If
$$v = \begin{pmatrix} x \\ y \\ 0 \end{pmatrix},$$
then
$$A v = \begin{pmatrix} a x \\ b y \\ 0 \end{pmatrix}.$$
This remains in $W$. Hence $W$ is invariant.
Diagonal matrices make invariant subspaces visible because each coordinate direction is scaled independently.
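The diagonal case is easy to verify numerically; the diagonal entries $2, 3, 5$ below are arbitrary sample values:

```python
import numpy as np

# Diagonal matrix diag(a, b, c); axes and coordinate planes are invariant.
a, b, c = 2.0, 3.0, 5.0     # arbitrary sample entries
A = np.diag([a, b, c])

e1 = np.array([1.0, 0.0, 0.0])
assert np.allclose(A @ e1, a * e1)   # the x-axis is scaled onto itself

v = np.array([4.0, -1.0, 0.0])       # a vector in the xy-plane
Av = A @ v
assert Av[2] == 0.0                  # the result stays in the xy-plane
```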
43.6 Example: A Shear
Let
$$S = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}.$$
Then
$$S \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} x + y \\ y \end{pmatrix}.$$
The $x$-axis
$$W = \{ (t, 0) : t \in \mathbb{R} \}$$
is invariant, because
$$S \begin{pmatrix} t \\ 0 \end{pmatrix} = \begin{pmatrix} t \\ 0 \end{pmatrix} \in W.$$
The $y$-axis is not invariant. Indeed,
$$S \begin{pmatrix} 0 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 1 \end{pmatrix},$$
which does not lie on the $y$-axis.
Thus a subspace may look natural geometrically but fail to be invariant under a given operator.
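A minimal numeric check, taking the standard horizontal shear $\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$ as the concrete operator:

```python
import numpy as np

# Shear: the x-axis is invariant, the y-axis is not.
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])

x_vec = np.array([1.0, 0.0])            # a vector on the x-axis
assert np.allclose(S @ x_vec, x_vec)    # it stays on the x-axis

y_vec = np.array([0.0, 1.0])            # a vector on the y-axis
Sy = S @ y_vec                          # Sy = (1, 1), off the y-axis
assert Sy[0] == 1.0
```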
43.7 Restriction to an Invariant Subspace
If $W$ is invariant under $T$, then the restriction
$$T|_W : W \to W$$
is defined by
$$T|_W(w) = T(w) \quad \text{for } w \in W.$$
The invariance condition ensures that the output lies in $W$.
The restriction is linear because $T$ is linear. For $u, w \in W$,
$$T|_W(u + w) = T(u + w) = T(u) + T(w).$$
Since
$$T(u), \, T(w) \in W,$$
this is addition inside $W$. Similarly,
$$T|_W(c w) = T(c w) = c\, T(w) = c\, T|_W(w).$$
Thus invariant subspaces allow us to form smaller linear operators from larger ones.
43.8 Matrix Form
Let $T : V \to V$ be linear, and suppose $W \subseteq V$ is $T$-invariant.
Choose a basis
$$w_1, \dots, w_k$$
of $W$, and extend it to a basis of $V$:
$$w_1, \dots, w_k, v_{k+1}, \dots, v_n.$$
Since $W$ is invariant, each
$$T(w_j), \qquad 1 \le j \le k,$$
is a linear combination only of
$$w_1, \dots, w_k.$$
Therefore the matrix of $T$ in this basis has block form
$$[T] = \begin{pmatrix} A & B \\ 0 & C \end{pmatrix},$$
where $A$ is the $k \times k$ matrix of the restriction $T|_W$.
The zero block appears because vectors in $W$ have no component outside $W$ after applying $T$.
This block upper triangular form is one of the main uses of invariant subspaces.
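The block form can be seen by an explicit change of basis. The matrix and basis below are sample choices: the first basis vector spans an invariant line, the second extends it to a basis of $\mathbb{R}^2$:

```python
import numpy as np

# A has the invariant line span{(1, 1)} (eigenvector for eigenvalue 3).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Columns of P: w1 = (1, 1) spanning W, then v2 = (1, 0) extending it.
P = np.array([[1.0, 1.0],
              [1.0, 0.0]])
M = np.linalg.inv(P) @ A @ P    # matrix of A in the adapted basis

# Block upper triangular: the entry below the diagonal in column 1
# vanishes, because A w1 is a combination of w1 alone.
assert np.allclose(M, [[3.0, 1.0], [0.0, 1.0]])
```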
43.9 Reducing Subspaces
A stronger condition occurs when both a subspace and its complement are invariant.
Suppose
$$V = W_1 \oplus W_2,$$
and both $W_1$ and $W_2$ are invariant under $T$. Then $T$ decomposes into two independent operators:
$$T|_{W_1} : W_1 \to W_1$$
and
$$T|_{W_2} : W_2 \to W_2.$$
In a basis adapted to the decomposition, the matrix of $T$ has block diagonal form
$$[T] = \begin{pmatrix} A_1 & 0 \\ 0 & A_2 \end{pmatrix}.$$
This is stronger than block upper triangular form. It means the operator does not mix the two subspaces in either direction.
Such a decomposition allows the study of $T$ to split into the study of smaller operators.
43.10 Invariant Lines
A one-dimensional invariant subspace is an invariant line through the origin.
Let
$$W = \operatorname{span}\{ v \}, \qquad v \neq 0.$$
Then $W$ is invariant under $T$ if and only if
$$T(v) \in W.$$
This means there exists a scalar $\lambda$ such that
$$T(v) = \lambda v.$$
Therefore invariant lines are exactly the lines spanned by eigenvectors.
Thus finding one-dimensional invariant subspaces is the same as finding eigenvectors.
43.11 Invariant Planes
A two-dimensional invariant subspace is an invariant plane through the origin.
Let
$$W = \operatorname{span}\{ u, v \}$$
be two-dimensional.
The subspace $W$ is invariant under $T$ if and only if both
$$T(u)$$
and
$$T(v)$$
belong to $W$.
This condition is enough because every vector in $W$ has the form
$$w = a u + b v.$$
Then
$$T(w) = a\, T(u) + b\, T(v),$$
which lies in $W$ if both basis images lie in $W$.
Thus to check invariance of a finite-dimensional subspace, it is enough to check a basis.
43.12 Cyclic Subspaces
Given a vector $v \in V$, the cyclic subspace generated by $v$ under $T$ is
$$Z(v, T) = \operatorname{span}\{ v, T(v), T^2(v), T^3(v), \dots \}.$$
This subspace is invariant under $T$.
Indeed, if
$$w = a_0 v + a_1 T(v) + \cdots + a_m T^m(v),$$
then
$$T(w) = a_0 T(v) + a_1 T^2(v) + \cdots + a_m T^{m+1}(v).$$
This again lies in
$Z(v, T)$.
Cyclic subspaces appear in Krylov methods, rational canonical form, and the study of minimal polynomials.
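The cyclic subspace can be generated explicitly by stacking the vectors $v, T(v), T^2(v), \dots$ as columns (this is exactly a Krylov matrix). The $3 \times 3$ permutation matrix below is a sample choice for which the cyclic subspace of $e_1$ is all of $\mathbb{R}^3$:

```python
import numpy as np

# Sample matrix: cyclic permutation of the coordinates.
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])
v = np.array([1.0, 0.0, 0.0])

# Krylov matrix with columns v, Av, A^2 v.
krylov = np.column_stack([v, A @ v, A @ A @ v])

# The three columns are independent, so Z(v, A) = R^3,
# which is trivially invariant under A.
assert np.linalg.matrix_rank(krylov) == 3
```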
43.13 Invariant Subspaces and Polynomials
If $W$ is invariant under $T$, then $W$ is invariant under every polynomial in $T$.
Let
$$p(x) = a_0 + a_1 x + \cdots + a_m x^m.$$
Then
$$p(T) = a_0 I + a_1 T + \cdots + a_m T^m.$$
If $w \in W$, then
$$T(w) \in W.$$
Repeated application gives
$$T^k(w) \in W$$
for every $k \ge 0$. Since $W$ is closed under linear combinations,
$$p(T)(w) = a_0 w + a_1 T(w) + \cdots + a_m T^m(w) \in W.$$
Therefore $W$ is invariant under $p(T)$.
This fact connects invariant subspaces with minimal polynomials and canonical forms.
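A small check of this fact on an eigenline: if $W = \operatorname{span}\{v\}$ with $Av = \lambda v$, then $p(A)v = p(\lambda)v$, so $W$ is invariant under $p(A)$. The matrix and the polynomial $p(x) = x^2 + 2x + 1$ are sample choices:

```python
import numpy as np

# Sample matrix with eigenvector v = (1, 1) for eigenvalue 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
v = np.array([1.0, 1.0])
lam = 3.0

# p(A) for the sample polynomial p(x) = x^2 + 2x + 1.
pA = A @ A + 2 * A + np.eye(2)
p_lam = lam**2 + 2 * lam + 1     # p(3) = 16

# p(A) maps the eigenline into itself, scaling by p(3).
assert np.allclose(pA @ v, p_lam * v)
```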
43.14 Generalized Eigenspaces
Let $T : V \to V$ be linear, and let $\lambda$ be an eigenvalue. The generalized eigenspace associated with $\lambda$ is
$$G_\lambda = \ker (T - \lambda I)^k$$
for a sufficiently large positive integer $k$.
This subspace is invariant under $T$.
To see this, let
$$v \in G_\lambda.$$
Since $T$ commutes with $(T - \lambda I)^k$, we have
$$(T - \lambda I)^k T(v) = T (T - \lambda I)^k v.$$
If
$$(T - \lambda I)^k v = 0,$$
then
$$(T - \lambda I)^k T(v) = T(0) = 0.$$
Therefore
$$T(v) \in G_\lambda.$$
So
$G_\lambda$ is invariant under $T$.
Generalized eigenspaces are the invariant pieces used in Jordan canonical form.
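A concrete sketch with a sample $2 \times 2$ Jordan block: for $J$ below, $(J - 2I)^2 = 0$, so the generalized eigenspace for $\lambda = 2$ is all of $\mathbb{R}^2$, and applying $J$ keeps every vector inside it:

```python
import numpy as np

# Sample Jordan block with eigenvalue 2.
J = np.array([[2.0, 1.0],
              [0.0, 2.0]])
N = J - 2 * np.eye(2)        # nilpotent part J - 2I

assert not np.allclose(N, 0)     # (J - 2I)^1 is nonzero
assert np.allclose(N @ N, 0)     # (J - 2I)^2 = 0, so k = 2 suffices

# v is a generalized eigenvector but not an ordinary eigenvector,
# and J v still satisfies (J - 2I)^2 (J v) = 0.
v = np.array([0.0, 1.0])
assert not np.allclose(N @ v, 0)
assert np.allclose(N @ (N @ (J @ v)), 0)
```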
43.15 Direct Sums of Invariant Subspaces
Suppose
$$V = W_1 \oplus W_2 \oplus \cdots \oplus W_r,$$
and each $W_i$ is invariant under $T$.
Then $T$ splits into operators
$$T|_{W_i} : W_i \to W_i, \qquad i = 1, \dots, r.$$
In a basis formed by concatenating bases of the $W_i$, the matrix of $T$ is block diagonal:
$$[T] = \begin{pmatrix} A_1 & & \\ & \ddots & \\ & & A_r \end{pmatrix}.$$
This is the algebraic reason direct sum decompositions are powerful. They convert one large operator into several smaller operators.
43.16 Invariant Subspaces and Triangular Form
A chain of invariant subspaces gives a triangular matrix.
Suppose $V$ has a basis
$$v_1, \dots, v_n$$
such that
$$W_j = \operatorname{span}\{ v_1, \dots, v_j \}$$
is invariant under $T$ for each $j$.
Then the matrix of $T$ in this basis is upper triangular.
Indeed, invariance of $W_j$ means
$$T(v_j) \in W_j = \operatorname{span}\{ v_1, \dots, v_j \}.$$
Therefore the $j$-th column of the matrix has zero entries below row $j$.
Thus nested invariant subspaces correspond to triangular representations.
43.17 Invariant Subspaces and Diagonalization
Diagonalization is the best-case invariant subspace decomposition.
If $V$ has a basis of eigenvectors
$$v_1, \dots, v_n, \qquad T(v_i) = \lambda_i v_i,$$
then each line
$$W_i = \operatorname{span}\{ v_i \}$$
is invariant under $T$.
Moreover,
$$V = W_1 \oplus W_2 \oplus \cdots \oplus W_n.$$
In this basis, the matrix of $T$ is diagonal:
$$[T] = \begin{pmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{pmatrix}.$$
Thus diagonalization means decomposing the vector space into one-dimensional invariant subspaces.
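This decomposition can be computed directly with an eigendecomposition; the symmetric matrix below is a sample choice:

```python
import numpy as np

# Sample diagonalizable matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, eigvecs = np.linalg.eig(A)

# Each eigenvector spans an invariant line: A v_i = lambda_i v_i.
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)

# In the basis of eigenvectors, A becomes diagonal.
P = eigvecs
D = np.linalg.inv(P) @ A @ P
assert np.allclose(D, np.diag(eigvals))
```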
43.18 Invariant Subspaces Over Complex Fields
Over an algebraically closed field, every finite-dimensional operator on a nonzero vector space has at least one eigenvalue. Therefore it has at least one one-dimensional invariant subspace.
For complex vector spaces, this means every operator
$$T : V \to V$$
with
$$\dim V \ge 1$$
has an invariant line.
This follows from the characteristic polynomial: over $\mathbb{C}$, it has a root $\lambda$. Then
$$\ker(T - \lambda I)$$
contains a nonzero vector and is invariant.
This fact is one reason complex linear algebra has especially clean spectral theory.
43.19 Invariant Subspaces Over Real Fields
Over $\mathbb{R}$, a finite-dimensional operator may have no invariant line.
For example, the rotation
$$R_\theta = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}, \qquad \sin\theta \neq 0,$$
has no real eigenvectors. Therefore it has no one-dimensional real invariant subspace.
However, the whole plane is invariant. More generally, real operators may have invariant planes corresponding to pairs of complex conjugate eigenvalues.
Thus real linear algebra often replaces invariant lines by invariant planes.
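The absence of a real invariant line can be confirmed by computing the eigenvalues of a sample rotation, here rotation by $90^\circ$:

```python
import numpy as np

# Rotation by 90 degrees: no real eigenvectors, so no invariant line in R^2.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])
eigvals = np.linalg.eigvals(R)

# The eigenvalues are the complex conjugate pair +i and -i;
# both have nonzero imaginary part, so no real eigenvector exists.
assert np.all(np.abs(eigvals.imag) > 0)
```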
43.20 Nontrivial Invariant Subspaces
In finite-dimensional complex vector spaces, nontrivial invariant subspaces are common. If
$$\dim V \ge 2,$$
then an eigenline gives a nontrivial invariant subspace.
In infinite-dimensional spaces, the situation is much more delicate. The invariant subspace problem asks, in one classical form, whether every bounded operator on a complex separable Hilbert space of dimension greater than one has a nontrivial closed invariant subspace. This problem remains unsolved in that setting.
This infinite-dimensional problem belongs to functional analysis, but it shows how central the idea of invariance is beyond finite-dimensional linear algebra.
43.21 How to Test Invariance
To test whether a subspace $W$ is invariant under $T$, use a basis of $W$.
Let
$$w_1, \dots, w_k$$
be a basis of $W$. Then $W$ is invariant under $T$ if and only if
$$T(w_i) \in W$$
for every
$$i = 1, \dots, k.$$
For a matrix $A$, this means checking whether each $A w_i$ is a linear combination of the basis vectors of $W$.
Equivalently, if $B$ is the matrix with columns $w_1, \dots, w_k$, then $W$ is invariant under $A$ if and only if there exists a $k \times k$ matrix $C$ such that
$$A B = B C.$$
The matrix $C$ is the matrix of the restricted operator in the chosen basis of $W$.
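The condition $AB = BC$ suggests a simple numeric test: solve for $C$ by least squares and check whether the equation holds exactly. The helper name `is_invariant` and the sample matrices below are our own choices:

```python
import numpy as np

def is_invariant(A, B, tol=1e-10):
    """Check whether the column space of B is invariant under A.

    Solves B C ~ A B by least squares; the column space is invariant
    exactly when the residual vanishes, i.e. when every column of A B
    lies in the column space of B.
    """
    C, *_ = np.linalg.lstsq(B, A @ B, rcond=None)
    return np.allclose(B @ C, A @ B, atol=tol)

# Sample operator: a diagonal matrix.
A = np.diag([2.0, 3.0, 5.0])

B_plane = np.array([[1.0, 0.0],
                    [0.0, 1.0],
                    [0.0, 0.0]])    # xy-plane: invariant
B_tilted = np.array([[1.0],
                     [1.0],
                     [1.0]])        # span{(1,1,1)}: not invariant

assert is_invariant(A, B_plane)
assert not is_invariant(A, B_tilted)
```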
43.22 Summary
An invariant subspace of a linear operator
$$T : V \to V$$
is a subspace $W$ satisfying
$$T(W) \subseteq W.$$
Equivalently, every vector in $W$ is sent back into $W$.
The restriction
$$T|_W : W \to W$$
is then a well-defined linear operator.
Important examples include the kernel, the image,
eigenspaces, generalized eigenspaces, and cyclic subspaces.
Invariant subspaces explain block triangular form, block diagonal form, diagonalization, Jordan form, and rational canonical form. They are the subspaces on which an operator can be studied separately.
The central idea is that an invariant subspace is a part of the vector space that the operator does not leave.