# Chapter 62. Eigenspaces

An eigenspace is the subspace formed by all eigenvectors associated with a fixed eigenvalue, together with the zero vector.

Eigenvalues describe scaling factors. Eigenvectors describe directions. Eigenspaces collect all directions that share the same scaling factor.

If \(A\) is a square matrix and \(\lambda\) is an eigenvalue of \(A\), then the eigenspace corresponding to \(\lambda\) is

$$
E_\lambda = \{v : Av = \lambda v\}.
$$

Equivalently,

$$
E_\lambda = \ker(A-\lambda I).
$$

This identity is the main computational description of an eigenspace: it is the null space of \(A-\lambda I\), and therefore a vector subspace.

## 62.1 From Eigenvectors to Eigenspaces

Suppose \(A\) is an \(n \times n\) matrix. A nonzero vector \(v\) is an eigenvector of \(A\) with eigenvalue \(\lambda\) if

$$
Av = \lambda v.
$$

The equation may be rewritten as

$$
Av-\lambda v=0.
$$

Since

$$
\lambda v = \lambda I v,
$$

we obtain

$$
(A-\lambda I)v=0.
$$

Thus the eigenvectors for \(\lambda\) are precisely the nonzero solutions of this homogeneous system.

The eigenspace consists of these nonzero solutions together with the zero vector:

$$
E_\lambda = \ker(A-\lambda I).
$$

The zero vector is included so that the collection becomes a subspace. The zero vector itself is not called an eigenvector.

## 62.2 Definition

Let \(A\) be an \(n \times n\) matrix over a field \(F\). Let \(\lambda\) be an eigenvalue of \(A\).

The eigenspace of \(A\) corresponding to \(\lambda\) is

$$
E_\lambda(A)=\{v\in F^n : Av=\lambda v\}.
$$

Equivalently,

$$
E_\lambda(A)=\{v\in F^n : (A-\lambda I)v=0\}.
$$

Thus

$$
E_\lambda(A)=\operatorname{Null}(A-\lambda I).
$$

When the matrix \(A\) is clear from context, we usually write \(E_\lambda\).

## 62.3 Why an Eigenspace Is a Subspace

An eigenspace is a null space. Every null space is a subspace.

We can also prove this directly.

Let \(u,v\in E_\lambda\). Then

$$
Au=\lambda u
$$

and

$$
Av=\lambda v.
$$

By linearity,

$$
A(u+v)=Au+Av.
$$

Substitute the eigenvalue equations:

$$
A(u+v)=\lambda u+\lambda v.
$$

Factor:

$$
A(u+v)=\lambda(u+v).
$$

Thus

$$
u+v\in E_\lambda.
$$

Now let \(c\) be a scalar. Since \(v\in E_\lambda\),

$$
Av=\lambda v.
$$

Then

$$
A(cv)=cAv=c\lambda v=\lambda(cv).
$$

Therefore

$$
cv\in E_\lambda.
$$

The eigenspace is closed under addition and scalar multiplication, and it contains the zero vector. Hence it is a subspace.

## 62.4 Computing an Eigenspace

To compute an eigenspace, use the following procedure.

| Step | Operation |
|---|---|
| 1 | Find an eigenvalue \(\lambda\). |
| 2 | Form \(A-\lambda I\). |
| 3 | Solve \((A-\lambda I)v=0\). |
| 4 | Write the solution set as a span. |

The result is a subspace, usually described by a basis.
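
This procedure can be carried out numerically. As a sketch (assuming NumPy is available), one way to extract a basis of \(\ker(A-\lambda I)\) is from the right-singular vectors of \(A-\lambda I\) whose singular values are numerically zero:

```python
import numpy as np

def eigenspace_basis(A, lam, tol=1e-10):
    """Return a basis of E_lam = Null(A - lam*I) as the columns of a matrix.

    The right-singular vectors of A - lam*I whose singular values are
    numerically zero span its null space.
    """
    n = A.shape[0]
    _, s, vh = np.linalg.svd(A - lam * np.eye(n))
    return vh[s < tol].T  # columns form a basis of the eigenspace

A = np.array([[2.0, 1.0], [1.0, 2.0]])
B = eigenspace_basis(A, 3.0)  # one column, proportional to (1, 1)
```

The SVD is used here (rather than naive row reduction) because it is the numerically stable way to detect rank deficiency in floating-point arithmetic.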

## 62.5 Example with Two One-Dimensional Eigenspaces

Let

$$
A=
\begin{bmatrix}
2 & 1 \\
1 & 2
\end{bmatrix}.
$$

The eigenvalues are

$$
\lambda=3
\qquad
\text{and}
\qquad
\lambda=1.
$$

First compute the eigenspace for \(\lambda=3\).

$$
A-3I=
\begin{bmatrix}
-1 & 1 \\
1 & -1
\end{bmatrix}.
$$

Solve

$$
(A-3I)v=0.
$$

That is,

$$
\begin{bmatrix}
-1 & 1 \\
1 & -1
\end{bmatrix}
\begin{bmatrix}
x \\
y
\end{bmatrix} =
\begin{bmatrix}
0 \\
0
\end{bmatrix}.
$$

The equation is

$$
-x+y=0.
$$

Hence

$$
y=x.
$$

Therefore

$$
E_3=
\left\{
t
\begin{bmatrix}
1 \\
1
\end{bmatrix}
:t\in\mathbb{R}
\right\}.
$$

So

$$
E_3=
\operatorname{span}
\left\{
\begin{bmatrix}
1 \\
1
\end{bmatrix}
\right\}.
$$

Now compute the eigenspace for \(\lambda=1\).

$$
A-I=
\begin{bmatrix}
1 & 1 \\
1 & 1
\end{bmatrix}.
$$

Solve

$$
(A-I)v=0.
$$

That is,

$$
\begin{bmatrix}
1 & 1 \\
1 & 1
\end{bmatrix}
\begin{bmatrix}
x \\
y
\end{bmatrix} =
\begin{bmatrix}
0 \\
0
\end{bmatrix}.
$$

The equation is

$$
x+y=0.
$$

Hence

$$
y=-x.
$$

Therefore

$$
E_1=
\operatorname{span}
\left\{
\begin{bmatrix}
1 \\
-1
\end{bmatrix}
\right\}.
$$

The two eigenspaces are two distinct lines through the origin.
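
The two spanning vectors can be checked directly against the eigenvalue equation (a NumPy sketch):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])
v3 = np.array([1.0, 1.0])   # spans E_3
v1 = np.array([1.0, -1.0])  # spans E_1

# Each basis vector is scaled by its eigenvalue:
print(A @ v3)  # [3. 3.]  = 3 * v3
print(A @ v1)  # [1. -1.] = 1 * v1
```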

## 62.6 Eigenspaces as Invariant Subspaces

An eigenspace is invariant under the matrix \(A\).

A subspace \(W\) is invariant under \(A\) if

$$
Aw\in W
$$

for every \(w\in W\).

If \(v\in E_\lambda\), then

$$
Av=\lambda v.
$$

Since \(E_\lambda\) is closed under scalar multiplication,

$$
\lambda v\in E_\lambda.
$$

Thus

$$
Av\in E_\lambda.
$$

So each eigenspace is an invariant subspace.

In fact, on \(E_\lambda\), the transformation \(A\) acts in the simplest possible way: it is just scalar multiplication by \(\lambda\).

## 62.7 Dimension of an Eigenspace

The dimension of \(E_\lambda\) is called the geometric multiplicity of \(\lambda\).

Since

$$
E_\lambda=\ker(A-\lambda I),
$$

we have

$$
\dim E_\lambda =
\operatorname{nullity}(A-\lambda I).
$$

By the rank-nullity theorem,

$$
\dim E_\lambda =
n-\operatorname{rank}(A-\lambda I).
$$

Thus the dimension of an eigenspace can be computed by row-reducing \(A-\lambda I\).

## 62.8 Algebraic Multiplicity and Geometric Multiplicity

Let \(\lambda\) be an eigenvalue of \(A\).

The algebraic multiplicity of \(\lambda\) is its multiplicity as a root of the characteristic polynomial.

The geometric multiplicity of \(\lambda\) is

$$
\dim E_\lambda.
$$

These numbers satisfy

$$
1
\leq
\dim E_\lambda
\leq
\text{algebraic multiplicity of }\lambda.
$$

The lower bound holds because \(\lambda\) is an eigenvalue, so at least one nonzero eigenvector exists.

The upper bound is deeper. It expresses a limit on how many independent eigenvectors can belong to a repeated root of the characteristic polynomial.

## 62.9 Example with a Defective Eigenspace

Let

$$
A=
\begin{bmatrix}
2 & 1 \\
0 & 2
\end{bmatrix}.
$$

The characteristic polynomial is

$$
(2-\lambda)^2.
$$

Thus \(\lambda=2\) has algebraic multiplicity \(2\).

Now compute the eigenspace:

$$
A-2I=
\begin{bmatrix}
0 & 1 \\
0 & 0
\end{bmatrix}.
$$

Solve

$$
(A-2I)v=0.
$$

That is,

$$
\begin{bmatrix}
0 & 1 \\
0 & 0
\end{bmatrix}
\begin{bmatrix}
x \\
y
\end{bmatrix} =
\begin{bmatrix}
0 \\
0
\end{bmatrix}.
$$

The equation is

$$
y=0.
$$

Hence

$$
v=
\begin{bmatrix}
x \\
0
\end{bmatrix} =
x
\begin{bmatrix}
1 \\
0
\end{bmatrix}.
$$

Therefore

$$
E_2=
\operatorname{span}
\left\{
\begin{bmatrix}
1 \\
0
\end{bmatrix}
\right\}.
$$

The eigenspace has dimension \(1\), although the eigenvalue has algebraic multiplicity \(2\). This matrix does not have enough eigenvectors to be diagonalized.
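
The deficiency is visible in the rank of \(A-2I\); a quick NumPy check:

```python
import numpy as np

A = np.array([[2.0, 1.0], [0.0, 2.0]])
M = A - 2.0 * np.eye(2)

rank = np.linalg.matrix_rank(M)
print(rank)      # 1
print(2 - rank)  # geometric multiplicity 1 < algebraic multiplicity 2
```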

## 62.10 Example with a Full Eigenspace

Let

$$
A=
\begin{bmatrix}
2 & 0 \\
0 & 2
\end{bmatrix} =
2I.
$$

Then for every vector \(v\in\mathbb{R}^2\),

$$
Av=2v.
$$

Thus every nonzero vector is an eigenvector with eigenvalue \(2\).

The eigenspace is

$$
E_2=\mathbb{R}^2.
$$

Its dimension is \(2\).

This shows that a repeated eigenvalue may have a large eigenspace. The behavior depends on the matrix, not only on the characteristic polynomial.

## 62.11 Eigenspaces for Diagonal Matrices

Let

$$
D=
\begin{bmatrix}
d_1 & 0 & 0 \\
0 & d_2 & 0 \\
0 & 0 & d_3
\end{bmatrix}.
$$

If the diagonal entries are distinct, then each standard basis vector spans one eigenspace:

$$
De_1=d_1e_1,
$$

$$
De_2=d_2e_2,
$$

$$
De_3=d_3e_3.
$$

Thus

$$
E_{d_1}=\operatorname{span}\{e_1\},
$$

$$
E_{d_2}=\operatorname{span}\{e_2\},
$$

$$
E_{d_3}=\operatorname{span}\{e_3\}.
$$

If a diagonal value is repeated, its eigenspace is spanned by all standard basis vectors whose corresponding diagonal entries equal that value.

For example,

$$
D=
\begin{bmatrix}
4 & 0 & 0 \\
0 & 4 & 0 \\
0 & 0 & 7
\end{bmatrix}.
$$

Then

$$
E_4=\operatorname{span}\{e_1,e_2\},
$$

and

$$
E_7=\operatorname{span}\{e_3\}.
$$

## 62.12 Eigenspaces and Direct Sums

Eigenspaces corresponding to distinct eigenvalues intersect only at the zero vector.

Suppose

$$
v\in E_\lambda\cap E_\mu
$$

where

$$
\lambda\neq \mu.
$$

Then

$$
Av=\lambda v
$$

and

$$
Av=\mu v.
$$

Therefore

$$
\lambda v=\mu v.
$$

So

$$
(\lambda-\mu)v=0.
$$

Since

$$
\lambda-\mu\neq 0,
$$

we must have

$$
v=0.
$$

Thus

$$
E_\lambda\cap E_\mu=\{0\}.
$$

This means that distinct eigenspaces do not overlap except at the origin.

More generally, eigenspaces belonging to distinct eigenvalues form a direct sum.

## 62.13 Eigenspaces and Diagonalization

A matrix \(A\) is diagonalizable if the whole space has a basis made of eigenvectors of \(A\).

Equivalently, \(A\) is diagonalizable if the direct sum of its eigenspaces is the whole space.

For an \(n\times n\) matrix, this means

$$
\sum_{\lambda}\dim E_\lambda=n,
$$

where the sum is taken over all distinct eigenvalues of \(A\).

If this condition holds, we can choose a basis from the eigenspaces. In that basis, the matrix of the transformation is diagonal.

The diagonal entries are the corresponding eigenvalues.

## 62.14 Example of Diagonalization from Eigenspaces

Let

$$
A=
\begin{bmatrix}
2 & 1 \\
1 & 2
\end{bmatrix}.
$$

We found

$$
E_3=
\operatorname{span}
\left\{
\begin{bmatrix}
1 \\
1
\end{bmatrix}
\right\}
$$

and

$$
E_1=
\operatorname{span}
\left\{
\begin{bmatrix}
1 \\
-1
\end{bmatrix}
\right\}.
$$

The dimensions add to

$$
\dim E_3+\dim E_1=1+1=2.
$$

Since the ambient space is \(\mathbb{R}^2\), these eigenspaces provide a basis.

Let

$$
P=
\begin{bmatrix}
1 & 1 \\
1 & -1
\end{bmatrix}.
$$

Let

$$
D=
\begin{bmatrix}
3 & 0 \\
0 & 1
\end{bmatrix}.
$$

Then

$$
A=PDP^{-1}.
$$

The columns of \(P\) are chosen from the eigenspaces. The diagonal entries of \(D\) are the corresponding eigenvalues.
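
The factorization can be verified by multiplying it back out (a NumPy sketch):

```python
import numpy as np

P = np.array([[1.0, 1.0], [1.0, -1.0]])  # eigenvector columns from E_3, E_1
D = np.diag([3.0, 1.0])                  # matching eigenvalues

A = P @ D @ np.linalg.inv(P)
print(A)  # recovers [[2. 1.], [1. 2.]]
```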

## 62.15 Eigenspaces over Different Fields

The field matters.

A real matrix may have no real eigenspaces for some complex eigenvalues.

Consider

$$
A=
\begin{bmatrix}
0 & -1 \\
1 & 0
\end{bmatrix}.
$$

This matrix rotates the plane by \(90^\circ\). Its characteristic polynomial is

$$
\lambda^2+1.
$$

Over \(\mathbb{R}\), this polynomial has no roots, so \(A\) has no real eigenvalues and therefore no real eigenspaces.

Over \(\mathbb{C}\), the eigenvalues are

$$
i
\qquad
\text{and}
\qquad
-i.
$$

The corresponding eigenspaces are subspaces of \(\mathbb{C}^2\), not \(\mathbb{R}^2\).

Thus eigenspaces must always be understood relative to the chosen scalar field.
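
Numerical libraries make the role of the field concrete: asked for the eigendecomposition of this real rotation matrix, NumPy returns complex eigenvalues and complex eigenvectors, i.e. it silently works over \(\mathbb{C}\):

```python
import numpy as np

A = np.array([[0.0, -1.0], [1.0, 0.0]])  # rotation by 90 degrees
evals, evecs = np.linalg.eig(A)

print(evals)                    # i and -i, in some order
print(np.iscomplexobj(evecs))   # True: the eigenvectors live in C^2
```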

## 62.16 Eigenspaces of Linear Transformations

The definition does not require matrices.

Let

$$
T:V\to V
$$

be a linear transformation. If \(\lambda\) is an eigenvalue of \(T\), then the eigenspace corresponding to \(\lambda\) is

$$
E_\lambda(T)=\{v\in V:T(v)=\lambda v\}.
$$

Equivalently,

$$
E_\lambda(T)=\ker(T-\lambda I).
$$

This definition applies to finite-dimensional vector spaces, polynomial spaces, function spaces, and many other settings.

For example, consider the differentiation operator

$$
D(f)=f'
$$

on a suitable function space. The function

$$
f(x)=e^{\lambda x}
$$

satisfies

$$
D(f)=\lambda f.
$$

Thus exponential functions are eigenvectors of differentiation. In this context, they are usually called eigenfunctions.
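
The eigenfunction relation \(D(e^{\lambda x})=\lambda e^{\lambda x}\) can be sanity-checked numerically by sampling \(f\) on a grid and approximating \(f'\) by centered differences (a sketch; the value \(\lambda=0.7\) and the grid are arbitrary choices):

```python
import numpy as np

lam = 0.7
x = np.linspace(0.0, 1.0, 2001)
f = np.exp(lam * x)

# Centered finite-difference approximation to f'
df = np.gradient(f, x)

# Away from the endpoints, f' should agree with lam * f
err = np.max(np.abs(df[1:-1] - lam * f[1:-1]))
print(err)  # small, limited only by the grid spacing
```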

## 62.17 Eigenspaces and Coordinates

When a vector is expressed in an eigenbasis, the action of the matrix becomes simple.

Suppose

$$
V=E_{\lambda_1}\oplus E_{\lambda_2}\oplus\cdots\oplus E_{\lambda_k}.
$$

Then every vector \(v\in V\) can be written uniquely as

$$
v=v_1+v_2+\cdots+v_k,
$$

where

$$
v_i\in E_{\lambda_i}.
$$

Applying \(A\),

$$
Av=Av_1+Av_2+\cdots+Av_k.
$$

Since \(v_i\in E_{\lambda_i}\),

$$
Av_i=\lambda_i v_i.
$$

Therefore

$$
Av=\lambda_1v_1+\lambda_2v_2+\cdots+\lambda_kv_k.
$$

So each eigenspace component is scaled independently.

This is the structural meaning of diagonalization.
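
For the symmetric example from earlier sections the eigenspaces are orthogonal, so the components \(v_i\) can be found by projection; a NumPy sketch of the independent-scaling identity:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])

# Orthonormal eigenbasis: E_3 = span{(1,1)}, E_1 = span{(1,-1)}
u3 = np.array([1.0, 1.0]) / np.sqrt(2.0)
u1 = np.array([1.0, -1.0]) / np.sqrt(2.0)

v = np.array([4.0, 2.0])
v3 = (v @ u3) * u3  # component of v in E_3
v1 = (v @ u1) * u1  # component of v in E_1

# A scales each eigenspace component by its own eigenvalue:
print(np.allclose(A @ v, 3.0 * v3 + 1.0 * v1))  # True
```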

## 62.18 Eigenspaces in Applications

Eigenspaces often represent modes, directions, or states that behave uniformly under a transformation.

| Area | Meaning of eigenspace |
|---|---|
| Differential equations | Set of solutions with the same exponential rate |
| Mechanics | Modes with the same natural frequency |
| Statistics | Principal directions with the same variance |
| Graph theory | Structural modes of an adjacency or Laplacian matrix |
| Markov chains | Long-term or transient state spaces |
| Quantum mechanics | States with the same measured value |
| Numerical analysis | Subspaces controlling convergence |

When an eigenvalue has eigenspace dimension greater than \(1\), there are several independent directions with the same scaling behavior.

## 62.19 Common Errors

The first common error is to call the zero vector an eigenvector. The zero vector belongs to every eigenspace, but it is not an eigenvector.

The second common error is to confuse an eigenvalue with an eigenspace. The eigenvalue is a scalar. The eigenspace is a subspace.

The third common error is to compute only one eigenvector and forget the full span. An eigenspace contains all scalar multiples and all linear combinations of its basis eigenvectors.

The fourth common error is to ignore the field. A matrix may have complex eigenspaces even when all its entries are real.

## 62.20 Summary

For a square matrix \(A\) and an eigenvalue \(\lambda\), the eigenspace is

$$
E_\lambda=\{v:Av=\lambda v\}.
$$

Equivalently,

$$
E_\lambda=\ker(A-\lambda I).
$$

An eigenspace is a subspace. Its nonzero vectors are eigenvectors. Its dimension is the geometric multiplicity of the eigenvalue.

Eigenspaces organize eigenvectors into linear subspaces. They determine whether a matrix has enough eigenvectors to be diagonalized and provide the natural coordinates in which a linear transformation acts by independent scaling.
