Chapter 62. Eigenspaces

An eigenspace is the subspace formed by all eigenvectors associated with a fixed eigenvalue, together with the zero vector.

Eigenvalues describe scaling factors. Eigenvectors describe directions. Eigenspaces collect all directions that share the same scaling factor.

If $A$ is a square matrix and $\lambda$ is an eigenvalue of $A$, then the eigenspace corresponding to $\lambda$ is

E_\lambda = \{v : Av = \lambda v\}.

Equivalently,

E_\lambda = \ker(A - \lambda I).

This identity is the main computational form of an eigenspace: it is the null space of $A - \lambda I$. Hence an eigenspace is a vector subspace.

62.1 From Eigenvectors to Eigenspaces

Suppose $A$ is an $n \times n$ matrix. A nonzero vector $v$ is an eigenvector of $A$ with eigenvalue $\lambda$ if

Av = \lambda v.

The equation may be rewritten as

Av - \lambda v = 0.

Since

\lambda v = \lambda I v,

we obtain

(A - \lambda I)v = 0.

Thus the eigenvectors for $\lambda$ are precisely the nonzero solutions of this homogeneous system.

The eigenspace includes those nonzero solutions and also includes the zero vector:

E_\lambda = \ker(A - \lambda I).

The zero vector is included so that the collection becomes a subspace. The zero vector itself is not called an eigenvector.

62.2 Definition

Let $A$ be an $n \times n$ matrix over a field $F$. Let $\lambda$ be an eigenvalue of $A$.

The eigenspace of $A$ corresponding to $\lambda$ is

E_\lambda(A) = \{v \in F^n : Av = \lambda v\}.

Equivalently,

E_\lambda(A) = \{v \in F^n : (A - \lambda I)v = 0\}.

Thus

E_\lambda(A) = \operatorname{Null}(A - \lambda I).

When the matrix $A$ is clear from context, we usually write $E_\lambda$.

62.3 Why an Eigenspace Is a Subspace

An eigenspace is a null space. Every null space is a subspace.

We can also prove this directly.

Let $u, v \in E_\lambda$. Then

Au = \lambda u

and

Av = \lambda v.

By linearity,

A(u + v) = Au + Av.

Substitute the eigenvalue equations:

A(u + v) = \lambda u + \lambda v.

Factor:

A(u + v) = \lambda(u + v).

Thus

u + v \in E_\lambda.

Now let $c$ be a scalar. Since $v \in E_\lambda$,

Av = \lambda v.

Then

A(cv) = cAv = c\lambda v = \lambda(cv).

Therefore

cv \in E_\lambda.

The eigenspace is closed under addition and scalar multiplication, and it contains the zero vector. Hence it is a subspace.

62.4 Computing an Eigenspace

To compute an eigenspace, use the following procedure.

Step | Operation
---- | ---------
1 | Find an eigenvalue $\lambda$.
2 | Form $A - \lambda I$.
3 | Solve $(A - \lambda I)v = 0$.
4 | Write the solution set as a span.

The result is a subspace, usually described by a basis.
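The procedure above can be sketched numerically. This is a minimal sketch, assuming NumPy; the function name `eigenspace_basis` and the rank cutoff `tol` are illustrative choices, not standard API. It computes the null space of $A - \lambda I$ from the singular value decomposition:

```python
import numpy as np

def eigenspace_basis(A, lam, tol=1e-10):
    """Orthonormal basis for E_lam = Null(A - lam*I).

    Right singular vectors of A - lam*I whose singular values are
    numerically zero span the null space. Returns an n-by-k array
    whose columns form a basis; k = 0 if lam is not an eigenvalue.
    """
    n = A.shape[0]
    _, s, Vt = np.linalg.svd(A - lam * np.eye(n))
    return Vt[s <= tol].T

# Steps 1-4 for a 2-by-2 example with lambda = 3:
A = np.array([[2.0, 1.0], [1.0, 2.0]])
basis = eigenspace_basis(A, 3.0)  # one column, proportional to (1, 1)
```

The SVD is used here instead of row reduction because it gives a numerically stable rank decision; for exact hand computation, row reduction as described above is the usual route.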

62.5 Example with Two One-Dimensional Eigenspaces

Let

A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}.

The eigenvalues are

\lambda = 3 \qquad \text{and} \qquad \lambda = 1.

First compute the eigenspace for $\lambda = 3$.

A - 3I = \begin{bmatrix} -1 & 1 \\ 1 & -1 \end{bmatrix}.

Solve

(A - 3I)v = 0.

That is,

\begin{bmatrix} -1 & 1 \\ 1 & -1 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}.

The equation is

-x + y = 0.

Hence

y = x.

Therefore

E_3 = \left\{ t \begin{bmatrix} 1 \\ 1 \end{bmatrix} : t \in \mathbb{R} \right\}.

So

E_3 = \operatorname{span} \left\{ \begin{bmatrix} 1 \\ 1 \end{bmatrix} \right\}.

Now compute the eigenspace for $\lambda = 1$.

A - I = \begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix}.

Solve

(A - I)v = 0.

That is,

\begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}.

The equation is

x + y = 0.

Hence

y = -x.

Therefore

E_1 = \operatorname{span} \left\{ \begin{bmatrix} 1 \\ -1 \end{bmatrix} \right\}.

The two eigenspaces are two different lines through the origin.
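A quick numerical check of the two hand computations; a minimal sketch assuming NumPy:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])
v3 = np.array([1.0, 1.0])    # basis vector of E_3
v1 = np.array([1.0, -1.0])   # basis vector of E_1

# A scales each line by its eigenvalue: A v3 = 3*v3 and A v1 = 1*v1.
check3 = A @ v3
check1 = A @ v1
```

Any scalar multiple of `v3` passes the same check, reflecting that $E_3$ is the entire line through $(1, 1)$, not a single vector.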

62.6 Eigenspaces as Invariant Subspaces

An eigenspace is invariant under the matrix $A$.

A subspace $W$ is invariant under $A$ if

Aw \in W

for every $w \in W$.

If $v \in E_\lambda$, then

Av = \lambda v.

Since $E_\lambda$ is closed under scalar multiplication,

\lambda v \in E_\lambda.

Thus

Av \in E_\lambda.

So each eigenspace is an invariant subspace.

In fact, on $E_\lambda$, the transformation $A$ acts in the simplest possible way: it is just scalar multiplication by $\lambda$.

62.7 Dimension of an Eigenspace

The dimension of $E_\lambda$ is called the geometric multiplicity of $\lambda$.

Since

E_\lambda = \ker(A - \lambda I),

we have

\dim E_\lambda = \operatorname{nullity}(A - \lambda I).

By the rank-nullity theorem,

\dim E_\lambda = n - \operatorname{rank}(A - \lambda I).

Thus the dimension of an eigenspace can be computed by row-reducing $A - \lambda I$.
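In floating point, row reduction is typically replaced by a numerical rank computation. A minimal sketch, assuming NumPy; the name `geometric_multiplicity` and the tolerance `tol` are illustrative choices:

```python
import numpy as np

def geometric_multiplicity(A, lam, tol=1e-10):
    """dim E_lam = n - rank(A - lam*I), by the rank-nullity theorem."""
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n), tol=tol)

A = np.array([[2.0, 1.0], [1.0, 2.0]])
m3 = geometric_multiplicity(A, 3.0)  # rank(A - 3I) = 1, so dim E_3 = 1
```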

62.8 Algebraic Multiplicity and Geometric Multiplicity

Let $\lambda$ be an eigenvalue of $A$.

The algebraic multiplicity of $\lambda$ is its multiplicity as a root of the characteristic polynomial.

The geometric multiplicity of $\lambda$ is

\dim E_\lambda.

These numbers satisfy

1 \leq \dim E_\lambda \leq \text{algebraic multiplicity of } \lambda.

The lower bound holds because $\lambda$ is an eigenvalue, so at least one nonzero eigenvector exists.

The upper bound is deeper. It expresses a limit on how many independent eigenvectors can belong to a repeated root of the characteristic polynomial.

62.9 Example with a Defective Eigenspace

Let

A = \begin{bmatrix} 2 & 1 \\ 0 & 2 \end{bmatrix}.

The characteristic polynomial is

(2 - \lambda)^2.

Thus $\lambda = 2$ has algebraic multiplicity $2$.

Now compute the eigenspace:

A - 2I = \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}.

Solve

(A - 2I)v = 0.

That is,

\begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}.

The equation is

y = 0.

Hence

v = \begin{bmatrix} x \\ 0 \end{bmatrix} = x \begin{bmatrix} 1 \\ 0 \end{bmatrix}.

Therefore

E_2 = \operatorname{span} \left\{ \begin{bmatrix} 1 \\ 0 \end{bmatrix} \right\}.

The eigenspace has dimension $1$, although the eigenvalue has algebraic multiplicity $2$. This matrix does not have enough eigenvectors to be diagonalized.
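The defect is visible in a rank computation; a minimal sketch assuming NumPy:

```python
import numpy as np

A = np.array([[2.0, 1.0], [0.0, 2.0]])

# rank(A - 2I) = 1, so dim E_2 = 2 - 1 = 1, while the algebraic
# multiplicity of lambda = 2 is 2: the matrix is defective.
geo_mult = 2 - np.linalg.matrix_rank(A - 2.0 * np.eye(2))
```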

62.10 Example with a Full Eigenspace

Let

A = \begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix} = 2I.

Then for every vector $v \in \mathbb{R}^2$,

Av = 2v.

Thus every nonzero vector is an eigenvector with eigenvalue $2$.

The eigenspace is

E_2 = \mathbb{R}^2.

Its dimension is $2$.

This shows that a repeated eigenvalue may have a large eigenspace. The behavior depends on the matrix, not only on the characteristic polynomial.

62.11 Eigenspaces for Diagonal Matrices

Let

D = \begin{bmatrix} d_1 & 0 & 0 \\ 0 & d_2 & 0 \\ 0 & 0 & d_3 \end{bmatrix}.

If the diagonal entries are distinct, then each standard basis vector spans one eigenspace:

De_1 = d_1 e_1, \qquad De_2 = d_2 e_2, \qquad De_3 = d_3 e_3.

Thus

E_{d_1} = \operatorname{span}\{e_1\}, \qquad E_{d_2} = \operatorname{span}\{e_2\}, \qquad E_{d_3} = \operatorname{span}\{e_3\}.

If a diagonal value is repeated, its eigenspace is spanned by all standard basis vectors whose diagonal entries equal that value.

For example,

D = \begin{bmatrix} 4 & 0 & 0 \\ 0 & 4 & 0 \\ 0 & 0 & 7 \end{bmatrix}.

Then

E_4 = \operatorname{span}\{e_1, e_2\},

and

E_7 = \operatorname{span}\{e_3\}.

62.12 Eigenspaces and Direct Sums

Eigenspaces corresponding to distinct eigenvalues intersect only at the zero vector.

Suppose

v \in E_\lambda \cap E_\mu

where

\lambda \neq \mu.

Then

Av = \lambda v

and

Av = \mu v.

Therefore

\lambda v = \mu v.

So

(\lambda - \mu)v = 0.

Since

\lambda - \mu \neq 0,

we must have

v = 0.

Thus

E_\lambda \cap E_\mu = \{0\}.

This means that distinct eigenspaces do not overlap except at the origin.

More generally, eigenspaces belonging to distinct eigenvalues form a direct sum.

62.13 Eigenspaces and Diagonalization

A matrix $A$ is diagonalizable if the whole space has a basis made of eigenvectors of $A$.

Equivalently, $A$ is diagonalizable if the direct sum of its eigenspaces is the whole space.

For an $n \times n$ matrix, this means

\sum_{\lambda} \dim E_\lambda = n,

where the sum is taken over all distinct eigenvalues of $A$.

If this condition holds, we can choose a basis from the eigenspaces. In that basis, the matrix of the transformation is diagonal.

The diagonal entries are the corresponding eigenvalues.

62.14 Example of Diagonalization from Eigenspaces

Let

A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}.

We found

E_3 = \operatorname{span} \left\{ \begin{bmatrix} 1 \\ 1 \end{bmatrix} \right\}

and

E_1 = \operatorname{span} \left\{ \begin{bmatrix} 1 \\ -1 \end{bmatrix} \right\}.

The dimensions add to

\dim E_3 + \dim E_1 = 1 + 1 = 2.

Since the ambient space is $\mathbb{R}^2$, these eigenspaces provide a basis.

Let

P = \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}.

Let

D = \begin{bmatrix} 3 & 0 \\ 0 & 1 \end{bmatrix}.

Then

A = PDP^{-1}.

The columns of $P$ are chosen from the eigenspaces. The diagonal entries of $D$ are the corresponding eigenvalues.
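The factorization can be verified directly; a minimal sketch assuming NumPy:

```python
import numpy as np

P = np.array([[1.0,  1.0],
              [1.0, -1.0]])   # columns from E_3 and E_1
D = np.diag([3.0, 1.0])       # matching eigenvalues

# Reassembling P D P^{-1} recovers the original matrix A.
A_rebuilt = P @ D @ np.linalg.inv(P)
```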

62.15 Eigenspaces over Different Fields

The field matters.

A real matrix may have no real eigenspaces when its eigenvalues are complex.

Consider

A = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}.

This matrix rotates the plane by $90^\circ$. Its characteristic polynomial is

\lambda^2 + 1.

Over $\mathbb{R}$, this polynomial has no roots, so there are no real eigenvalues and therefore no real eigenspaces.

Over $\mathbb{C}$, the eigenvalues are

i \qquad \text{and} \qquad -i.

The corresponding eigenspaces are subspaces of $\mathbb{C}^2$, not $\mathbb{R}^2$.

Thus eigenspaces must always be understood relative to the chosen scalar field.
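Numerical libraries reflect the same distinction: for a real rotation matrix, the computed eigenvalues come back complex, and the eigenvectors live in $\mathbb{C}^2$. A minimal sketch assuming NumPy:

```python
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])   # rotation by 90 degrees

# Real input, complex output: the eigenvalues are +i and -i.
eigvals = np.linalg.eigvals(A)
```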

62.16 Eigenspaces of Linear Transformations

The definition does not require matrices.

Let

T : V \to V

be a linear transformation. If $\lambda$ is an eigenvalue of $T$, then the eigenspace corresponding to $\lambda$ is

E_\lambda(T) = \{v \in V : T(v) = \lambda v\}.

Equivalently,

E_\lambda(T) = \ker(T - \lambda I).

This definition applies to finite-dimensional vector spaces, polynomial spaces, function spaces, and many other settings.

For example, consider the differentiation operator

D(f) = f'

on a suitable function space. The function

f(x) = e^{\lambda x}

satisfies

D(f) = \lambda f.

Thus exponential functions are eigenvectors of differentiation. In this context, they are usually called eigenfunctions.
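The eigenfunction relation can be checked numerically on a grid. This is a sketch assuming NumPy; the rate `lam = 0.7` and the grid are arbitrary illustrative choices, and a finite-difference derivative stands in for the operator $D$:

```python
import numpy as np

lam = 0.7
x = np.linspace(0.0, 1.0, 401)
f = np.exp(lam * x)

# np.gradient approximates D(f) = f'; for f = e^{lam x} the result
# should match lam * f up to discretization error.
df = np.gradient(f, x, edge_order=2)
```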

62.17 Eigenspaces and Coordinates

When a vector is expressed in an eigenbasis, the action of the matrix becomes simple.

Suppose

V = E_{\lambda_1} \oplus E_{\lambda_2} \oplus \cdots \oplus E_{\lambda_k}.

Then every vector $v \in V$ can be written uniquely as

v = v_1 + v_2 + \cdots + v_k,

where

v_i \in E_{\lambda_i}.

Applying $A$,

Av = Av_1 + Av_2 + \cdots + Av_k.

Since $v_i \in E_{\lambda_i}$,

Av_i = \lambda_i v_i.

Therefore

Av = \lambda_1 v_1 + \lambda_2 v_2 + \cdots + \lambda_k v_k.

So each eigenspace component is scaled independently.

This is the structural meaning of diagonalization.
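The independent scaling can be made concrete with a $2 \times 2$ example. A sketch assuming NumPy; the test vector `v = (5, 1)` is an arbitrary illustrative choice, and orthogonal projection computes the components here because the eigenbasis happens to be orthonormal:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])
u1 = np.array([1.0,  1.0]) / np.sqrt(2.0)  # unit vector spanning E_3
u2 = np.array([1.0, -1.0]) / np.sqrt(2.0)  # unit vector spanning E_1

v = np.array([5.0, 1.0])
v1 = (v @ u1) * u1   # component of v in E_3
v2 = (v @ u2) * u2   # component of v in E_1

# Each eigenspace component is scaled by its own eigenvalue:
# Av = 3*v1 + 1*v2.
Av = A @ v
```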

62.18 Eigenspaces in Applications

Eigenspaces often represent modes, directions, or states that behave uniformly under a transformation.

Area | Meaning of eigenspace
---- | ---------------------
Differential equations | Set of solutions with the same exponential rate
Mechanics | Modes with the same natural frequency
Statistics | Principal directions with the same variance
Graph theory | Structural modes of an adjacency or Laplacian matrix
Markov chains | Long-term or transient state spaces
Quantum mechanics | States with the same measured value
Numerical analysis | Subspaces controlling convergence

When an eigenvalue has eigenspace dimension greater than $1$, there are several independent directions with the same scaling behavior.

62.19 Common Errors

The first common error is to call the zero vector an eigenvector. The zero vector belongs to every eigenspace, but it is not an eigenvector.

The second common error is to confuse an eigenvalue with an eigenspace. The eigenvalue is a scalar. The eigenspace is a subspace.

The third common error is to compute only one eigenvector and forget the full span. An eigenspace contains all scalar multiples and all linear combinations of its basis eigenvectors.

The fourth common error is to ignore the field. A matrix may have complex eigenspaces even when all its entries are real.

62.20 Summary

For a square matrix $A$ and an eigenvalue $\lambda$, the eigenspace is

E_\lambda = \{v : Av = \lambda v\}.

Equivalently,

E_\lambda = \ker(A - \lambda I).

An eigenspace is a subspace. Its nonzero vectors are eigenvectors. Its dimension is the geometric multiplicity of the eigenvalue.

Eigenspaces organize eigenvectors into linear subspaces. They determine whether a matrix has enough eigenvectors to be diagonalized and provide the natural coordinates in which a linear transformation acts by independent scaling.