Chapter 81. Spectral Theorems

Spectral theorems explain why eigenvalues and eigenvectors are useful in graph theory. They say that certain graph matrices can be decomposed into simple orthogonal pieces. This makes it possible to study a graph by studying the numerical spectrum of a matrix associated with it.

For a finite simple undirected graph, the adjacency matrix A is real and symmetric. The Laplacian matrix L is also real and symmetric. Therefore both matrices satisfy the real spectral theorem: their eigenvalues are real, and their eigenvectors may be chosen to form an orthonormal basis. This is the linear algebraic foundation of spectral graph theory.

81.1 The Real Spectral Theorem

Let

M\in \mathbb{R}^{n\times n}

be a real symmetric matrix:

M^T=M.

The real spectral theorem states that there exist real eigenvalues

\lambda_1,\lambda_2,\ldots,\lambda_n

and an orthonormal basis of eigenvectors

q_1,q_2,\ldots,q_n

such that

Mq_i=\lambda_i q_i

for each i.

Equivalently, there is an orthogonal matrix

Q=\begin{bmatrix} q_1 & q_2 & \cdots & q_n \end{bmatrix}

and a diagonal matrix

\Lambda=\operatorname{diag}(\lambda_1,\lambda_2,\ldots,\lambda_n)

such that

M=Q\Lambda Q^T.

This is called an orthogonal diagonalization of M.
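As a concrete check, the decomposition can be computed numerically. The sketch below is illustrative, not part of the text: it uses NumPy's `eigh` routine (designed for symmetric matrices) on the adjacency matrix of a path with three vertices.

```python
import numpy as np

# Adjacency matrix of the path graph on 3 vertices: a symmetric matrix.
M = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)

# eigh handles symmetric matrices: it returns real eigenvalues in
# ascending order and orthonormal eigenvectors as the columns of Q.
lam, Q = np.linalg.eigh(M)
Lam = np.diag(lam)

print(np.allclose(M, Q @ Lam @ Q.T))    # True: M = Q Lam Q^T
print(np.allclose(Q.T @ Q, np.eye(3)))  # True: Q is orthogonal
```

Any symmetric matrix would work in place of this adjacency matrix; the two checks are exactly the two claims of the theorem.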

81.2 Orthogonal Matrices

A matrix Q is orthogonal if

Q^TQ=I.

Equivalently,

Q^{-1}=Q^T.

The columns of an orthogonal matrix form an orthonormal basis. Thus

q_i^Tq_j=\begin{cases} 1, & i=j,\\ 0, & i\neq j. \end{cases}

Orthogonal matrices preserve lengths and angles. If Q is orthogonal, then

\|Qx\|=\|x\|

for every vector x. In spectral graph theory, this matters because decomposing a vector into eigenvector components does not distort the geometry of the vector space.

81.3 Spectral Decomposition

The identity

M=Q\Lambda Q^T

can be written as a sum of rank-one matrices:

M=\sum_{i=1}^{n}\lambda_i q_iq_i^T.

Each matrix

q_iq_i^T

is the orthogonal projection onto the one-dimensional subspace spanned by q_i.

Thus the matrix M is decomposed into independent spectral components. Each component has a direction q_i and a scale λ_i.

For a graph matrix, this means that the graph's action can be split into basic modes on the vertex set.
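The rank-one sum can be verified directly. A minimal sketch (the 2×2 matrix is an arbitrary illustrative choice):

```python
import numpy as np

# Any small symmetric matrix will do for the check.
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, Q = np.linalg.eigh(M)

# Summing the rank-one pieces lam_i * q_i q_i^T rebuilds M exactly.
M_rebuilt = sum(lam[i] * np.outer(Q[:, i], Q[:, i]) for i in range(2))
print(np.allclose(M, M_rebuilt))  # True

# Each q_i q_i^T is a projection: applying it twice changes nothing.
P0 = np.outer(Q[:, 0], Q[:, 0])
print(np.allclose(P0 @ P0, P0))   # True
```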

81.4 Vertex Functions and Eigenbasis Expansion

A vector

x\in \mathbb{R}^n

may be viewed as a function on the vertices of a graph. If

q_1,q_2,\ldots,q_n

is an orthonormal eigenbasis, then every vertex function x has a unique expansion

x=\sum_{i=1}^{n} c_i q_i,

where

c_i=q_i^T x.

The coefficients c_i are the spectral coordinates of x.

This is analogous to writing a signal as a sum of Fourier modes. The eigenvectors play the role of graph-dependent basis functions.
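The expansion and its coefficients can be computed in one matrix multiplication each. A sketch on an illustrative 4-cycle and an arbitrary vertex function:

```python
import numpy as np

# Adjacency matrix of a 4-cycle; its eigenvectors form an eigenbasis.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
lam, Q = np.linalg.eigh(A)

# An arbitrary function on the four vertices.
x = np.array([3.0, -1.0, 2.0, 0.5])

# Spectral coordinates c_i = q_i^T x, all at once as c = Q^T x.
c = Q.T @ x

# The expansion x = sum_i c_i q_i recovers the original function.
print(np.allclose(x, Q @ c))  # True
```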

81.5 Matrix Action in Spectral Coordinates

If

x=\sum_{i=1}^{n} c_i q_i,

then

Mx=\sum_{i=1}^{n} c_i Mq_i = \sum_{i=1}^{n} c_i\lambda_i q_i.

Thus M acts by multiplying each spectral coordinate by its eigenvalue.

In ordinary coordinates, M may mix many entries. In spectral coordinates, it acts diagonally. This is the main computational and conceptual value of the spectral theorem.

81.6 Powers of a Graph Matrix

The spectral theorem gives a simple formula for powers:

M^k=Q\Lambda^kQ^T.

Equivalently,

M^k=\sum_{i=1}^{n}\lambda_i^k q_iq_i^T.

For the adjacency matrix A, the entry (A^k)_{uv} counts walks of length k from u to v. Therefore the spectral theorem connects walk counts to eigenvalues and eigenvectors.

Large eigenvalues dominate high powers. Small eigenvalues decay relative to large ones. Negative eigenvalues create sign alternation between odd and even powers.
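The walk-counting connection can be tested by brute force on a tiny graph. This sketch (the triangle K3 and the length k = 5 are illustrative choices) compares the spectral power formula against explicit enumeration of walks:

```python
import numpy as np
from itertools import product

# Adjacency matrix of the triangle K3.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)
lam, Q = np.linalg.eigh(A)
k = 5

# Spectral formula for powers: A^k = Q diag(lam^k) Q^T.
Ak = Q @ np.diag(lam**k) @ Q.T

# Brute-force count of walks of length k from vertex 0 to vertex 1:
# try every sequence of k-1 intermediate vertices.
walks = 0
for middle in product(range(3), repeat=k - 1):
    path = (0,) + middle + (1,)
    if all(A[path[j], path[j + 1]] == 1 for j in range(k)):
        walks += 1

print(walks == round(Ak[0, 1]))  # True: the spectral power counts walks
```

For K3 the eigenvalues are 2 and -1 (twice), so (A^5)_{01} = (2^5 - (-1)^5)/3 = 11, which the enumeration confirms.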

81.7 Functions of Matrices

The same idea applies to functions of matrices. If

M=Q\Lambda Q^T,

then for a suitable function f,

f(M)=Qf(\Lambda)Q^T,

where

f(\Lambda)=\operatorname{diag}(f(\lambda_1),f(\lambda_2),\ldots,f(\lambda_n)).

For example,

e^M=Qe^\Lambda Q^T,

where

e^\Lambda=\operatorname{diag}(e^{\lambda_1},e^{\lambda_2},\ldots,e^{\lambda_n}).

This construction appears in heat flow on graphs, diffusion kernels, communicability measures, and graph signal processing.

81.8 The Adjacency Spectral Theorem

Let G be a finite simple undirected graph with adjacency matrix A. Since

A^T=A,

the real spectral theorem applies. Hence

A=Q\Lambda Q^T

for an orthogonal matrix Q and a real diagonal matrix Λ.

The eigenvalues of A form the adjacency spectrum of G. The columns of Q are orthonormal vertex functions. The equation

Aq_i=\lambda_i q_i

means that, at every vertex, the sum of the neighboring values of q_i equals λ_i times the value at that vertex.

This gives a precise spectral description of adjacency.
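The vertex-by-vertex reading of the eigen-equation can be checked directly. A sketch on an illustrative path with four vertices, using the eigenvector for the largest eigenvalue:

```python
import numpy as np

# Adjacency matrix of the path graph on 4 vertices.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
lam, Q = np.linalg.eigh(A)

# For each vertex v, the sum of q over the neighbors of v
# equals lam * q[v] -- the eigen-equation read locally.
q, top = Q[:, -1], lam[-1]
for v in range(4):
    neighbor_sum = sum(q[u] for u in range(4) if A[v, u] == 1)
    assert np.isclose(neighbor_sum, top * q[v])
print("neighbor sums match at every vertex")
```

The same check passes for every column of Q, each with its own eigenvalue.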

81.9 The Laplacian Spectral Theorem

Let L be the Laplacian matrix of a finite simple undirected graph:

L=\Delta-A.

Since both the degree matrix Δ and A are symmetric,

L^T=L.

Therefore

L=U\Lambda U^T,

where U is orthogonal and

\Lambda=\operatorname{diag}(\mu_1,\mu_2,\ldots,\mu_n)

contains the Laplacian eigenvalues.

Because

x^TLx=\sum_{\{u,v\}\in E}(x_u-x_v)^2\geq 0,

all Laplacian eigenvalues are nonnegative. Thus they may be ordered as

0=\mu_1\leq \mu_2\leq \cdots\leq \mu_n.

The Laplacian spectral theorem is the basis for spectral partitioning, graph smoothing, and graph Fourier analysis.
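Both conclusions, smallest eigenvalue zero and all eigenvalues nonnegative, can be observed numerically. A sketch on an illustrative star graph:

```python
import numpy as np

# Star graph on 4 vertices: center 0 joined to 1, 2, 3.
A = np.array([[0, 1, 1, 1],
              [1, 0, 0, 0],
              [1, 0, 0, 0],
              [1, 0, 0, 0]], dtype=float)
Delta = np.diag(A.sum(axis=1))  # degree matrix
L = Delta - A

mu, U = np.linalg.eigh(L)       # eigenvalues in ascending order
print(np.isclose(mu[0], 0.0))   # True: smallest Laplacian eigenvalue is 0
print(np.all(mu > -1e-10))      # True: all eigenvalues are nonnegative
```

For this star the spectrum is 0, 1, 1, 4; the constant vector spans the zero eigenspace because the graph is connected.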

81.10 Orthogonality of Distinct Eigenvectors

If M is symmetric, eigenvectors for distinct eigenvalues are orthogonal.

Suppose

Mx=\lambda x, \qquad My=\mu y,

with

\lambda\neq \mu.

Then

x^TMy=\mu x^Ty.

But since M is symmetric,

x^TMy=(Mx)^Ty=(\lambda x)^Ty=\lambda x^Ty.

Therefore

\lambda x^Ty=\mu x^Ty.

Since λ ≠ μ,

x^Ty=0.

This proves orthogonality for distinct eigenspaces.

81.11 Multiplicity

An eigenvalue may have multiplicity greater than one. In that case, several linearly independent eigenvectors share the same eigenvalue.

For a symmetric matrix, the algebraic multiplicity and geometric multiplicity agree. Therefore an eigenvalue of multiplicity r has an r-dimensional eigenspace.

Any orthonormal basis of that eigenspace may be used in the spectral decomposition. Individual eigenvectors inside a repeated eigenspace may not be uniquely determined, but the eigenspace itself is intrinsic.

In graph theory, repeated eigenvalues often reflect symmetry, regularity, or special structure.

81.12 Projection Operators

For each eigenvector q_i, the matrix

P_i=q_iq_i^T

is the orthogonal projection onto the line spanned by q_i.

If an eigenvalue λ has eigenspace E_λ, the projection onto that eigenspace is

P_\lambda=\sum_{i:\lambda_i=\lambda} q_iq_i^T.

Then

M=\sum_{\lambda}\lambda P_\lambda,

where the sum is over distinct eigenvalues.

This form separates the matrix into eigenspace projections rather than individual eigenvectors. It is often cleaner when eigenvalues have multiplicity greater than one.
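The eigenspace form can be assembled by grouping computed eigenvectors by eigenvalue. A sketch on K3, whose spectrum is 2 (simple) and -1 (multiplicity 2); the rounding used to group nearly equal numerical eigenvalues is an implementation convenience, not part of the mathematics:

```python
import numpy as np

# K3 adjacency: eigenvalue 2 (simple) and -1 (multiplicity 2).
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)
lam, Q = np.linalg.eigh(A)

# Group eigenvectors by (rounded) eigenvalue into eigenspace projections.
proj = {}
for i, l in enumerate(np.round(lam, 8)):
    proj[l] = proj.get(l, 0) + np.outer(Q[:, i], Q[:, i])

# M = sum over distinct eigenvalues of lambda * P_lambda.
print(np.allclose(A, sum(l * P for l, P in proj.items())))  # True

# Each eigenspace projection is idempotent: P^2 = P.
print(all(np.allclose(P @ P, P) for P in proj.values()))    # True
```

Note that the two eigenvectors returned for eigenvalue -1 are not canonical, but the 2-dimensional projection they build is.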

81.13 Rayleigh Quotients

For a real symmetric matrix M, the Rayleigh quotient of a nonzero vector x is

R_M(x)=\frac{x^TMx}{x^Tx}.

If x is a unit eigenvector with eigenvalue λ, then

R_M(x)=\lambda.

The spectral theorem implies that the largest eigenvalue is

\lambda_{\max}=\max_{x\neq 0}\frac{x^TMx}{x^Tx},

and the smallest eigenvalue is

\lambda_{\min}=\min_{x\neq 0}\frac{x^TMx}{x^Tx}.

Rayleigh quotients are central in spectral graph theory because many graph quantities can be expressed as quadratic forms.
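The extremal property is easy to probe with random vectors. A sketch (the matrix and sample count are illustrative): every Rayleigh quotient lands between the smallest and largest eigenvalue.

```python
import numpy as np

rng = np.random.default_rng(0)
M = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
lam = np.linalg.eigh(M)[0]  # ascending eigenvalues

def rayleigh(M, x):
    # Quadratic form x^T M x divided by the squared norm x^T x.
    return (x @ M @ x) / (x @ x)

# Random quotients never escape the interval [lam_min, lam_max].
quotients = [rayleigh(M, rng.standard_normal(3)) for _ in range(1000)]
print(all(lam[0] - 1e-12 <= r <= lam[-1] + 1e-12 for r in quotients))  # True
```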

81.14 Variational Form for the Laplacian

For the Laplacian matrix,

R_L(x)=\frac{x^TLx}{x^Tx}=\frac{\sum_{\{u,v\}\in E}(x_u-x_v)^2}{\sum_{v\in V}x_v^2}.

The smallest value is 0, attained by vectors that are constant on each connected component.

For a connected graph, the second smallest eigenvalue has the variational form

\mu_2=\min_{\substack{x\neq 0\\ x\perp \mathbf{1}}}\frac{x^TLx}{x^Tx}.

This formula explains why μ_2 measures connectivity. It asks how small the edge variation can be among nonconstant vertex functions.
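The variational characterization can be checked numerically: the second eigenvector attains μ_2, and random vectors orthogonal to the all-ones vector never do better. A sketch on an illustrative path graph:

```python
import numpy as np

# Laplacian of the path graph on 4 vertices.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
mu, U = np.linalg.eigh(L)

# The Fiedler vector (second eigenvector) attains the minimum ...
fiedler = U[:, 1]
print(np.isclose((fiedler @ L @ fiedler) / (fiedler @ fiedler), mu[1]))

# ... and no vector orthogonal to the all-ones vector goes below mu_2.
rng = np.random.default_rng(1)
for _ in range(2000):
    x = rng.standard_normal(4)
    x -= x.mean()                       # enforce x ⊥ 1
    assert (x @ L @ x) / (x @ x) >= mu[1] - 1e-10
print("mu_2 is a lower bound over all x orthogonal to 1")
```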

81.15 Min-Max Principle

The spectral theorem also gives the Courant-Fischer min-max principle.

For a symmetric matrix M with eigenvalues

\lambda_1\leq \lambda_2\leq \cdots\leq \lambda_n,

one has

\lambda_k=\min_{\dim S=k}\ \max_{\substack{x\in S\\ x\neq 0}}\frac{x^TMx}{x^Tx}.

Equivalently,

\lambda_k=\max_{\dim S=n-k+1}\ \min_{\substack{x\in S\\ x\neq 0}}\frac{x^TMx}{x^Tx}.

This principle turns eigenvalue bounds into optimization problems over subspaces. In graph theory, it is used to prove inequalities involving connectivity, cuts, expansion, and graph partitioning.

81.16 Positive Semidefinite Matrices

A symmetric matrix M is positive semidefinite if

x^TMx\geq 0

for every vector x.

By the spectral theorem, this is equivalent to all eigenvalues being nonnegative.

The Laplacian matrix is positive semidefinite because

x^TLx=\sum_{\{u,v\}\in E}(x_u-x_v)^2.

Therefore all Laplacian eigenvalues are nonnegative. This is a direct application of the spectral theorem to graph structure.

81.17 Normalized Laplacian

The symmetric normalized Laplacian is

L_{\mathrm{sym}}=I-\Delta^{-1/2}A\Delta^{-1/2}.

When the graph has no isolated vertices, this matrix is real and symmetric. Therefore it also satisfies the real spectral theorem.

Its eigenvalues are real and lie in the interval

[0,2].

The normalized Laplacian is especially useful when degrees vary widely, because it describes graph geometry after normalizing for degree.
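The eigenvalue range [0, 2] can be observed on a graph with very uneven degrees. A sketch on an illustrative star graph:

```python
import numpy as np

# Star graph on 4 vertices: degrees 3, 1, 1, 1 vary widely.
A = np.array([[0, 1, 1, 1],
              [1, 0, 0, 0],
              [1, 0, 0, 0],
              [1, 0, 0, 0]], dtype=float)
deg = A.sum(axis=1)
D_inv_sqrt = np.diag(deg ** -0.5)
L_sym = np.eye(4) - D_inv_sqrt @ A @ D_inv_sqrt

nu = np.linalg.eigh(L_sym)[0]
# All eigenvalues land in the interval [0, 2].
print(np.all(nu > -1e-10) and np.all(nu < 2 + 1e-10))  # True
```

Here the spectrum is 0, 1, 1, 2; the value 2 appears because the star is bipartite.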

81.18 Random Walk Matrix

The random walk matrix is

P=\Delta^{-1}A.

This matrix is generally not symmetric. Therefore the real spectral theorem does not apply directly.

However, if the graph is undirected and has no isolated vertices, then P is similar to the symmetric matrix

\Delta^{-1/2}A\Delta^{-1/2}.

Indeed,

P=\Delta^{-1/2}\left(\Delta^{-1/2}A\Delta^{-1/2}\right)\Delta^{1/2}.

Thus P has real eigenvalues and can be studied through a symmetric matrix. This is a common technique in the spectral theory of random walks on graphs.
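The similarity argument can be seen numerically: P is not symmetric when degrees differ, yet its spectrum matches that of the symmetric conjugate. A sketch on an illustrative path graph:

```python
import numpy as np

# Path graph on 3 vertices: degrees 1, 2, 1, so P is not symmetric.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
deg = A.sum(axis=1)
P = np.diag(1.0 / deg) @ A                           # random walk matrix
S = np.diag(deg ** -0.5) @ A @ np.diag(deg ** -0.5)  # symmetric conjugate

# Similar matrices share eigenvalues, so P's spectrum is real.
lam_P = np.sort(np.linalg.eigvals(P).real)
lam_S = np.linalg.eigh(S)[0]
print(not np.allclose(P, P.T))    # True: P itself is not symmetric
print(np.allclose(lam_P, lam_S))  # True: its eigenvalues match S's
```

The largest eigenvalue is 1, as always for the random walk matrix of a connected graph.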

81.19 Directed Graphs

For directed graphs, the adjacency matrix is usually not symmetric. Hence the real spectral theorem may fail.

A directed adjacency matrix can have complex eigenvalues. It may also fail to have a full basis of eigenvectors.

For this reason, spectral theory of directed graphs is more delicate. One may use singular values, Hermitian symmetrizations, normal matrices, magnetic Laplacians, or Perron-Frobenius theory for nonnegative matrices.

The cleanest form of the spectral theorem belongs to undirected graphs because their main matrices are symmetric.

81.20 Spectral Filtering

The spectral theorem allows one to define filters on graphs.

Let

L=U\Lambda U^T.

A graph signal x can be transformed to spectral coordinates by

\hat{x}=U^Tx.

A filter g acts by multiplying each spectral coordinate:

g(L)x=Ug(\Lambda)U^Tx.

Here

g(\Lambda)=\operatorname{diag}(g(\mu_1),g(\mu_2),\ldots,g(\mu_n)).

Low-pass filters preserve small Laplacian eigenvalues and reduce large ones. High-pass filters do the opposite. This formalism is used in graph signal processing and graph neural networks.
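A low-pass filter can be sketched in a few lines. The cycle graph, the signal values, and the cutoff 1.5 below are all illustrative assumptions, not prescribed by the text:

```python
import numpy as np

# Cycle graph on 6 vertices and its Laplacian.
n = 6
A = np.zeros((n, n))
for v in range(n):
    A[v, (v + 1) % n] = A[(v + 1) % n, v] = 1.0
L = np.diag(A.sum(axis=1)) - A
mu, U = np.linalg.eigh(L)

# An illustrative low-pass filter: keep components with mu_i below a cutoff.
g = np.where(mu < 1.5, 1.0, 0.0)

x = np.array([1.0, -2.0, 3.0, 0.0, 1.5, -1.0])
x_hat = U.T @ x               # graph Fourier transform
x_low = U @ (g * x_hat)       # filter, then transform back

# The high-frequency spectral coordinates of the result are (near) zero.
print(np.allclose((U.T @ x_low)[mu >= 1.5], 0.0))  # True
```

A high-pass filter is the same construction with the mask reversed; smoother choices of g interpolate between the two.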

81.21 Heat Flow on Graphs

The heat equation on a graph can be written as

\frac{dx}{dt}=-Lx.

Using the spectral theorem,

x(t)=e^{-tL}x(0).

If

L=U\Lambda U^T,

then

e^{-tL}=Ue^{-t\Lambda}U^T.

Each spectral component decays at rate

e^{-t\mu_i}.

The zero eigenvalue component remains unchanged. Large eigenvalue components decay quickly. Thus heat flow smooths functions on the graph by suppressing high-frequency variation.
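Both effects, conservation of the zero mode and smoothing toward a constant, show up in a direct computation. A sketch with heat initially concentrated at one vertex of an illustrative path graph:

```python
import numpy as np

# Path graph on 3 vertices; all heat starts at vertex 0.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
mu, U = np.linalg.eigh(L)

x0 = np.array([1.0, 0.0, 0.0])
t = 5.0

# x(t) = U e^{-t mu} U^T x(0), assembled from the spectral decomposition.
xt = U @ np.diag(np.exp(-t * mu)) @ U.T @ x0

print(np.isclose(xt.sum(), x0.sum()))               # total heat is conserved
print(np.allclose(xt, np.full(3, 1/3), atol=0.05))  # spreads toward uniform
```

As t grows, only the zero-eigenvalue component survives, so x(t) tends to the constant vector with the same total mass.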

81.22 Summary

The spectral theorem is the linear algebraic engine behind spectral graph theory.

For any real symmetric graph matrix M,

M=Q\Lambda Q^T.

This gives an orthonormal eigenbasis, real eigenvalues, diagonal action in spectral coordinates, and a decomposition into independent modes.

| Concept | Spectral meaning |
| --- | --- |
| Symmetric graph matrix | Real eigenvalues and orthonormal eigenvectors |
| Orthogonal diagonalization | M = QΛQ^T |
| Spectral decomposition | M = Σ_i λ_i q_i q_i^T |
| Matrix powers | M^k = QΛ^k Q^T |
| Rayleigh quotient | Variational form of eigenvalues |
| Laplacian positivity | Nonnegative eigenvalues |
| Heat flow | Spectral components decay by e^{-tμ_i} |
| Graph filtering | Eigenvectors act as graph frequency modes |

The adjacency matrix, Laplacian matrix, and normalized Laplacian become powerful because they are symmetric in the undirected case. The spectral theorem converts their action into independent scalar multiplications along orthogonal eigenvector directions.