An eigenspace is the subspace formed by all eigenvectors associated with a fixed eigenvalue, together with the zero vector.
Eigenvalues describe scaling factors. Eigenvectors describe directions. Eigenspaces collect all directions that share the same scaling factor.
If $A$ is a square matrix and $\lambda$ is an eigenvalue of $A$, then the eigenspace corresponding to $\lambda$ is

$$E_\lambda(A) = \{ v : Av = \lambda v \}.$$

Equivalently,

$$E_\lambda(A) = \{ v : (A - \lambda I)v = 0 \}.$$

This identity is the main computational form of an eigenspace: it is the null space of $A - \lambda I$. Hence an eigenspace is a vector subspace.
62.1 From Eigenvectors to Eigenspaces
Suppose $A$ is an $n \times n$ matrix. A nonzero vector $v$ is an eigenvector of $A$ with eigenvalue $\lambda$ if

$$Av = \lambda v.$$

The equation may be rewritten as

$$Av - \lambda v = 0.$$

Since

$$\lambda v = \lambda I v,$$

we obtain

$$(A - \lambda I)v = 0.$$

Thus the eigenvectors for $\lambda$ are precisely the nonzero solutions of this homogeneous system.

The eigenspace includes those nonzero solutions and also includes the zero vector:

$$E_\lambda(A) = \{ v : (A - \lambda I)v = 0 \}.$$
The zero vector is included so that the collection becomes a subspace. The zero vector itself is not called an eigenvector.
62.2 Definition
Let $A$ be an $n \times n$ matrix over a field $F$. Let $\lambda$ be an eigenvalue of $A$.

The eigenspace of $A$ corresponding to $\lambda$ is

$$E_\lambda(A) = \{ v \in F^n : Av = \lambda v \}.$$

Equivalently,

$$E_\lambda(A) = \{ v \in F^n : (A - \lambda I)v = 0 \}.$$

Thus

$$E_\lambda(A) = N(A - \lambda I),$$

the null space of $A - \lambda I$.

When the matrix $A$ is clear from context, we usually write $E_\lambda$.
62.3 Why an Eigenspace Is a Subspace
An eigenspace is a null space. Every null space is a subspace.
We can also prove this directly.
Let $u, v \in E_\lambda(A)$. Then

$$Au = \lambda u$$

and

$$Av = \lambda v.$$

By linearity,

$$A(u + v) = Au + Av.$$

Substitute the eigenvalue equations:

$$A(u + v) = \lambda u + \lambda v.$$

Factor:

$$A(u + v) = \lambda(u + v).$$

Thus

$$u + v \in E_\lambda(A).$$

Now let $c$ be a scalar. Since $Av = \lambda v$, linearity gives

$$A(cv) = c(Av) = c(\lambda v) = \lambda(cv).$$

Therefore

$$cv \in E_\lambda(A).$$
The eigenspace is closed under addition and scalar multiplication, and it contains the zero vector. Hence it is a subspace.
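Both closure properties can be verified on a small numeric example. The following sketch uses sympy; the matrix and vectors are illustrative choices, not taken from the text:

```python
from sympy import Matrix, Rational

# Illustrative matrix: it acts as multiplication by 2 on span{e1, e2},
# so that eigenspace is two-dimensional and we can pick two eigenvectors.
A = Matrix([[2, 0, 0],
            [0, 2, 0],
            [0, 0, 5]])

u = Matrix([1, 0, 0])   # eigenvector for eigenvalue 2
v = Matrix([0, 3, 0])   # another eigenvector for eigenvalue 2
lam = 2

# Closure under addition: A(u + v) == lam * (u + v)
assert A * (u + v) == lam * (u + v)

# Closure under scalar multiplication: A(c v) == lam * (c v)
c = Rational(7, 2)
assert A * (c * v) == lam * (c * v)
```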
62.4 Computing an Eigenspace
To compute an eigenspace, use the following procedure.
| Step | Operation |
|---|---|
| 1 | Find an eigenvalue $\lambda$. |
| 2 | Form $A - \lambda I$. |
| 3 | Solve $(A - \lambda I)v = 0$. |
| 4 | Write the solution set as a span. |
The result is a subspace, usually described by a basis.
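The procedure above can be sketched in code. This is one possible implementation using sympy's exact arithmetic; the matrix is an illustrative choice, not from the text:

```python
from sympy import Matrix, eye

def eigenspace_basis(A, lam):
    """Steps 2-4 of the procedure: form A - lam*I and return a basis
    for its null space, which is the eigenspace E_lam(A)."""
    return (A - lam * eye(A.rows)).nullspace()

# Illustrative matrix with eigenvalues 1 and 4.
A = Matrix([[2, 2],
            [1, 3]])

# Step 1: find the eigenvalues (roots of the characteristic polynomial).
for lam in A.eigenvals():
    for b in eigenspace_basis(A, lam):
        assert A * b == lam * b   # every basis vector is an eigenvector
```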
62.5 Example with Two One-Dimensional Eigenspaces
Let

$$A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}.$$

The eigenvalues are

$$\lambda_1 = 3, \qquad \lambda_2 = 1.$$

First compute the eigenspace for $\lambda_1 = 3$.

Solve

$$(A - 3I)v = 0.$$

That is,

$$\begin{pmatrix} -1 & 1 \\ 1 & -1 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}.$$

The equation is

$$-x + y = 0.$$

Hence

$$y = x.$$

Therefore

$$v = x \begin{pmatrix} 1 \\ 1 \end{pmatrix}.$$

So

$$E_3(A) = \operatorname{span}\left\{ \begin{pmatrix} 1 \\ 1 \end{pmatrix} \right\}.$$
Now compute the eigenspace for $\lambda_2 = 1$.

Solve

$$(A - I)v = 0.$$

That is,

$$\begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}.$$

The equation is

$$x + y = 0.$$

Hence

$$y = -x.$$

Therefore

$$E_1(A) = \operatorname{span}\left\{ \begin{pmatrix} 1 \\ -1 \end{pmatrix} \right\}.$$
The two eigenspaces are two different lines through the origin.
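A short sympy check confirms a computation of this kind. The matrix below, $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ with eigenvalues $3$ and $1$, is assumed to be the example matrix:

```python
from sympy import Matrix, eye

A = Matrix([[2, 1],
            [1, 2]])   # assumed example matrix with eigenvalues 3 and 1

# Eigenspace for eigenvalue 3: the null space of A - 3I.
E3 = (A - 3 * eye(2)).nullspace()
assert len(E3) == 1                  # a line through the origin
assert A * E3[0] == 3 * E3[0]

# Eigenspace for eigenvalue 1: the null space of A - I.
E1 = (A - eye(2)).nullspace()
assert len(E1) == 1                  # a different line through the origin
assert A * E1[0] == 1 * E1[0]
```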
62.6 Eigenspaces as Invariant Subspaces
An eigenspace is invariant under the matrix $A$.

A subspace $W$ is invariant under $A$ if

$$Aw \in W$$

for every $w \in W$.

If $v \in E_\lambda(A)$, then

$$Av = \lambda v.$$

Since $E_\lambda(A)$ is closed under scalar multiplication,

$$\lambda v \in E_\lambda(A).$$

Thus

$$Av \in E_\lambda(A).$$
So each eigenspace is an invariant subspace.
In fact, on $E_\lambda(A)$, the transformation acts in the simplest possible way: it is just scalar multiplication by $\lambda$.
62.7 Dimension of an Eigenspace
The dimension of $E_\lambda(A)$ is called the geometric multiplicity of $\lambda$.

Since

$$E_\lambda(A) = N(A - \lambda I),$$

we have

$$\dim E_\lambda(A) = \operatorname{nullity}(A - \lambda I).$$

By the rank-nullity theorem,

$$\dim E_\lambda(A) = n - \operatorname{rank}(A - \lambda I).$$

Thus the dimension of an eigenspace can be computed by row-reducing $A - \lambda I$.
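The rank-nullity computation can be sketched as follows (the matrix is an illustrative choice, not from the text):

```python
from sympy import Matrix, eye

# Illustrative 3x3 matrix with eigenvalue 2.
A = Matrix([[2, 0, 0],
            [0, 2, 1],
            [0, 0, 2]])
lam, n = 2, 3

M = A - lam * eye(n)
# Rank-nullity: dim E_lam(A) = n - rank(A - lam*I).
geometric_multiplicity = n - M.rank()

# The same number counts the basis vectors of the null space.
assert geometric_multiplicity == len(M.nullspace())
```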
62.8 Algebraic Multiplicity and Geometric Multiplicity
Let $\lambda$ be an eigenvalue of $A$.

The algebraic multiplicity of $\lambda$ is its multiplicity as a root of the characteristic polynomial.

The geometric multiplicity of $\lambda$ is

$$\dim E_\lambda(A).$$

These numbers satisfy

$$1 \le \dim E_\lambda(A) \le \text{algebraic multiplicity of } \lambda.$$

The lower bound holds because $\lambda$ is an eigenvalue, so at least one nonzero eigenvector exists.
The upper bound is deeper. It expresses a limit on how many independent eigenvectors can belong to a repeated root of the characteristic polynomial.
62.9 Example with a Defective Eigenspace
Let

$$A = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}.$$

The characteristic polynomial is

$$\det(A - \lambda I) = (1 - \lambda)^2.$$

Thus $\lambda = 1$ has algebraic multiplicity $2$.

Now compute the eigenspace:

Solve

$$(A - I)v = 0.$$

That is,

$$\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}.$$

The equation is

$$y = 0.$$

Hence

$$v = x \begin{pmatrix} 1 \\ 0 \end{pmatrix}.$$

Therefore

$$E_1(A) = \operatorname{span}\left\{ \begin{pmatrix} 1 \\ 0 \end{pmatrix} \right\}.$$

The eigenspace has dimension $1$, although the eigenvalue has algebraic multiplicity $2$. This matrix does not have enough eigenvectors to be diagonalized.
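The defect can be exhibited directly in sympy; the shear matrix $\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$ is assumed to be the example matrix:

```python
from sympy import Matrix, eye

A = Matrix([[1, 1],
            [0, 1]])   # assumed example: a shear matrix

# Eigenvalue 1 with algebraic multiplicity 2.
assert A.eigenvals() == {1: 2}

# But its eigenspace is only one-dimensional: geometric multiplicity 1.
E1 = (A - eye(2)).nullspace()
assert len(E1) == 1

# Too few independent eigenvectors, so A cannot be diagonalized.
assert not A.is_diagonalizable()
```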
62.10 Example with a Full Eigenspace
Let

$$A = 2I = \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix}.$$

Then for every vector $v$,

$$Av = 2v.$$

Thus every nonzero vector is an eigenvector with eigenvalue $2$.

The eigenspace is

$$E_2(A) = \mathbb{R}^2.$$

Its dimension is $2$.
This shows that a repeated eigenvalue may have a large eigenspace. The behavior depends on the matrix, not only on the characteristic polynomial.
62.11 Eigenspaces for Diagonal Matrices
Let

$$D = \begin{pmatrix} d_1 & & \\ & \ddots & \\ & & d_n \end{pmatrix}.$$

If the diagonal entries are distinct, then each standard basis vector spans one eigenspace:

$$D e_i = d_i e_i.$$

Thus

$$E_{d_i}(D) = \operatorname{span}\{ e_i \}.$$

If a diagonal value is repeated, its eigenspace is spanned by all standard basis vectors whose diagonal entries equal that value.

For example,

$$D = \begin{pmatrix} 2 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 5 \end{pmatrix}.$$

Then

$$E_2(D) = \operatorname{span}\{ e_1, e_2 \}$$

and

$$E_5(D) = \operatorname{span}\{ e_3 \}.$$
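A diagonal example of this kind is easy to check; the values $\operatorname{diag}(2, 2, 5)$ are assumed here:

```python
from sympy import diag, eye

D = diag(2, 2, 5)   # assumed example: repeated value 2, single value 5

# The eigenspace for 2 is spanned by e1 and e2.
E2 = (D - 2 * eye(3)).nullspace()
assert len(E2) == 2

# The eigenspace for 5 is spanned by e3 alone.
E5 = (D - 5 * eye(3)).nullspace()
assert len(E5) == 1
```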
62.12 Eigenspaces and Direct Sums
Eigenspaces corresponding to distinct eigenvalues intersect only at the zero vector.
Suppose

$$v \in E_\lambda(A) \cap E_\mu(A),$$

where

$$\lambda \neq \mu.$$

Then

$$Av = \lambda v$$

and

$$Av = \mu v.$$

Therefore

$$\lambda v = \mu v.$$

So

$$(\lambda - \mu)v = 0.$$

Since

$$\lambda - \mu \neq 0,$$

we must have

$$v = 0.$$

Thus

$$E_\lambda(A) \cap E_\mu(A) = \{ 0 \}.$$
This means that distinct eigenspaces do not overlap except at the origin.
More generally, eigenspaces belonging to distinct eigenvalues form a direct sum.
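For a concrete check that two eigenspaces meet only at the origin, it suffices to see that their basis vectors are linearly independent. This sketch assumes the matrix $\begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ with eigenvalues $3$ and $1$:

```python
from sympy import Matrix, eye

A = Matrix([[2, 1],
            [1, 2]])   # assumed example matrix with eigenvalues 3 and 1

b3 = (A - 3 * eye(2)).nullspace()[0]   # basis vector of E_3
b1 = (A - eye(2)).nullspace()[0]       # basis vector of E_1

# Stacking the basis vectors gives a full-rank matrix, so the two
# lines intersect only at the zero vector.
M = Matrix.hstack(b3, b1)
assert M.rank() == 2
```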
62.13 Eigenspaces and Diagonalization
A matrix $A$ is diagonalizable if the whole space has a basis made of eigenvectors of $A$.

Equivalently, $A$ is diagonalizable if the direct sum of its eigenspaces is the whole space.
For an $n \times n$ matrix, this means

$$\sum_\lambda \dim E_\lambda(A) = n,$$

where the sum is taken over all distinct eigenvalues $\lambda$ of $A$.
If this condition holds, we can choose a basis from the eigenspaces. In that basis, the matrix of the transformation is diagonal.
The diagonal entries are the corresponding eigenvalues.
62.14 Example of Diagonalization from Eigenspaces
Let

$$A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}.$$

We found

$$E_3(A) = \operatorname{span}\left\{ \begin{pmatrix} 1 \\ 1 \end{pmatrix} \right\}$$

and

$$E_1(A) = \operatorname{span}\left\{ \begin{pmatrix} 1 \\ -1 \end{pmatrix} \right\}.$$

The dimensions add to

$$1 + 1 = 2.$$

Since the ambient space is $\mathbb{R}^2$, these eigenspaces provide a basis.

Let

$$P = \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}.$$

Let

$$D = \begin{pmatrix} 3 & 0 \\ 0 & 1 \end{pmatrix}.$$

Then

$$P^{-1} A P = D.$$

The columns of $P$ are chosen from the eigenspaces. The diagonal entries of $D$ are the corresponding eigenvalues.
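The factorization can be verified mechanically; the matrices below ($A$ with eigenvalues $3$ and $1$, eigenvectors $(1,1)$ and $(1,-1)$) are assumed values for this sketch:

```python
from sympy import Matrix, diag

A = Matrix([[2, 1],
            [1, 2]])          # assumed example matrix
P = Matrix([[1, 1],
            [1, -1]])         # columns: eigenvectors for eigenvalues 3 and 1
D = diag(3, 1)

# Diagonalization: conjugating A by P yields the diagonal matrix D.
assert P.inv() * A * P == D
```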
62.15 Eigenspaces over Different Fields
The field matters.
A real matrix may have no real eigenspaces for some complex eigenvalues.
Consider

$$A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}.$$

This matrix rotates the plane by $90^\circ$. Its characteristic polynomial is

$$\lambda^2 + 1.$$

Over $\mathbb{R}$, this polynomial has no roots. Therefore there are no real eigenspaces.

Over $\mathbb{C}$, the eigenvalues are

$$\lambda = i \quad \text{and} \quad \lambda = -i.$$

The corresponding eigenspaces are subspaces of $\mathbb{C}^2$, not $\mathbb{R}^2$.
Thus eigenspaces must always be understood relative to the chosen scalar field.
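sympy works over the complex numbers by default, so it finds the eigenvalues $\pm i$ of the $90^\circ$ rotation matrix directly:

```python
from sympy import Matrix, I, eye

A = Matrix([[0, -1],
            [1,  0]])         # rotation of the plane by 90 degrees

# Over C the eigenvalues exist: i and -i, each simple.
assert A.eigenvals() == {I: 1, -I: 1}

# The eigenspace for i is a subspace of C^2: its basis vector
# necessarily has complex entries.
Ei = (A - I * eye(2)).nullspace()
assert A * Ei[0] == I * Ei[0]
```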
62.16 Eigenspaces of Linear Transformations
The definition does not require matrices.
Let

$$T : V \to V$$

be a linear transformation. If $\lambda$ is an eigenvalue of $T$, then the eigenspace corresponding to $\lambda$ is

$$E_\lambda(T) = \{ v \in V : T(v) = \lambda v \}.$$

Equivalently,

$$E_\lambda(T) = \ker(T - \lambda I),$$

where $I$ is the identity transformation on $V$.

This definition applies to finite-dimensional vector spaces, polynomial spaces, function spaces, and many other settings.

For example, consider the differentiation operator

$$D = \frac{d}{dx}$$

on a suitable function space. The function

$$f(x) = e^{\lambda x}$$

satisfies

$$Df = \lambda f.$$

Thus exponential functions are eigenvectors of differentiation. In this context, they are usually called eigenfunctions.
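The eigenfunction relation $Df = \lambda f$ can be confirmed symbolically:

```python
from sympy import symbols, exp, diff

x, lam = symbols('x lam')
f = exp(lam * x)

# Differentiation scales the exponential by lam: D f = lam * f.
assert diff(f, x) == lam * f
```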
62.17 Eigenspaces and Coordinates
When a vector is expressed in an eigenbasis, the action of the matrix becomes simple.
Suppose

$$F^n = E_{\lambda_1}(A) \oplus \cdots \oplus E_{\lambda_k}(A).$$

Then every vector can be written uniquely as

$$v = v_1 + \cdots + v_k,$$

where

$$v_i \in E_{\lambda_i}(A).$$

Applying $A$,

$$Av = Av_1 + \cdots + Av_k.$$

Since $Av_i = \lambda_i v_i$ for each $i$, this becomes

$$Av = \lambda_1 v_1 + \cdots + \lambda_k v_k.$$
So each eigenspace component is scaled independently.
This is the structural meaning of diagonalization.
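Independent scaling of eigenbasis components can be demonstrated numerically. This sketch assumes the matrix $\begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ with eigenvectors $(1,1)$ and $(1,-1)$ for eigenvalues $3$ and $1$:

```python
from sympy import Matrix

A = Matrix([[2, 1],
            [1, 2]])                 # assumed example matrix
v1 = Matrix([1, 1])                  # eigenvector for eigenvalue 3
v2 = Matrix([1, -1])                 # eigenvector for eigenvalue 1

# A vector expressed in the eigenbasis...
v = 4 * v1 + 5 * v2

# ...has each component scaled by its own eigenvalue under A.
assert A * v == 3 * (4 * v1) + 1 * (5 * v2)
```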
62.18 Eigenspaces in Applications
Eigenspaces often represent modes, directions, or states that behave uniformly under a transformation.
| Area | Meaning of eigenspace |
|---|---|
| Differential equations | Set of solutions with the same exponential rate |
| Mechanics | Modes with the same natural frequency |
| Statistics | Principal directions with the same variance |
| Graph theory | Structural modes of an adjacency or Laplacian matrix |
| Markov chains | Long-term or transient state spaces |
| Quantum mechanics | States with the same measured value |
| Numerical analysis | Subspaces controlling convergence |
When an eigenvalue has eigenspace dimension greater than $1$, there are several independent directions with the same scaling behavior.
62.19 Common Errors
The first common error is to call the zero vector an eigenvector. The zero vector belongs to every eigenspace, but it is not an eigenvector.
The second common error is to confuse an eigenvalue with an eigenspace. The eigenvalue is a scalar. The eigenspace is a subspace.
The third common error is to compute only one eigenvector and forget the full span. An eigenspace contains all scalar multiples and all linear combinations of its basis eigenvectors.
The fourth common error is to ignore the field. A matrix may have complex eigenspaces even when all its entries are real.
62.20 Summary
For a square matrix $A$ and an eigenvalue $\lambda$, the eigenspace is

$$E_\lambda(A) = \{ v : Av = \lambda v \}.$$

Equivalently,

$$E_\lambda(A) = N(A - \lambda I).$$
An eigenspace is a subspace. Its nonzero vectors are eigenvectors. Its dimension is the geometric multiplicity of the eigenvalue.
Eigenspaces organize eigenvectors into linear subspaces. They determine whether a matrix has enough eigenvectors to be diagonalized and provide the natural coordinates in which a linear transformation acts by independent scaling.