# Chapter 81. Spectral Theorems

Spectral theorems explain why eigenvalues and eigenvectors are useful in graph theory. They say that certain graph matrices can be decomposed into simple orthogonal pieces. This makes it possible to study a graph by studying the numerical spectrum of a matrix associated with it.

For a finite simple undirected graph, the adjacency matrix \(A\) is real and symmetric. The Laplacian matrix \(L\) is also real and symmetric. Therefore both matrices satisfy the real spectral theorem: their eigenvalues are real, and their eigenvectors may be chosen to form an orthonormal basis. This is the linear algebraic foundation of spectral graph theory.

## 81.1 The Real Spectral Theorem

Let

$$
M\in \mathbb{R}^{n\times n}
$$

be a real symmetric matrix:

$$
M^T=M.
$$

The real spectral theorem states that there exist real eigenvalues

$$
\lambda_1,\lambda_2,\ldots,\lambda_n
$$

and an orthonormal basis of eigenvectors

$$
q_1,q_2,\ldots,q_n
$$

such that

$$
Mq_i=\lambda_i q_i
$$

for each \(i\).

Equivalently, there is an orthogonal matrix

$$
Q=
\begin{bmatrix}
q_1 & q_2 & \cdots & q_n
\end{bmatrix}
$$

and a diagonal matrix

$$
\Lambda=
\operatorname{diag}(\lambda_1,\lambda_2,\ldots,\lambda_n)
$$

such that

$$
M=Q\Lambda Q^T.
$$

This is called an orthogonal diagonalization of \(M\).
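This decomposition can be checked numerically. A minimal NumPy sketch, using a small hypothetical symmetric matrix `M`:

```python
import numpy as np

# A small symmetric matrix (hypothetical example data).
M = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# np.linalg.eigh is specialized for symmetric matrices: it returns
# real eigenvalues in ascending order and orthonormal eigenvectors.
eigvals, Q = np.linalg.eigh(M)
Lam = np.diag(eigvals)

# Verify the orthogonal diagonalization M = Q Lam Q^T.
assert np.allclose(Q @ Lam @ Q.T, M)
# Verify that Q is orthogonal: Q^T Q = I.
assert np.allclose(Q.T @ Q, np.eye(3))
```

The routine `eigh` exploits symmetry, so it is both faster and more accurate than the general `eig` for these matrices.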

## 81.2 Orthogonal Matrices

A matrix \(Q\) is orthogonal if

$$
Q^TQ=I.
$$

Equivalently,

$$
Q^{-1}=Q^T.
$$

The columns of an orthogonal matrix form an orthonormal basis. Thus

$$
q_i^Tq_j =
\begin{cases}
1, & i=j,\\
0, & i\neq j.
\end{cases}
$$

Orthogonal matrices preserve lengths and angles. If \(Q\) is orthogonal, then

$$
\|Qx\|=\|x\|
$$

for every vector \(x\). In spectral graph theory, this matters because decomposition into eigenvectors does not distort the geometry of the vector space.
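The length-preservation property is easy to verify on random vectors; a short sketch, reusing the orthogonal eigenvector matrix of a hypothetical symmetric matrix:

```python
import numpy as np

rng = np.random.default_rng(3)
# Any eigenvector matrix returned by eigh is orthogonal.
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])
_, Q = np.linalg.eigh(M)

# Orthogonal matrices preserve Euclidean length: ||Qx|| = ||x||.
for _ in range(10):
    x = rng.standard_normal(2)
    assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))
```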

## 81.3 Spectral Decomposition

The identity

$$
M=Q\Lambda Q^T
$$

can be written as a sum of rank-one matrices:

$$
M=\sum_{i=1}^{n}\lambda_i q_iq_i^T.
$$

Each matrix

$$
q_iq_i^T
$$

is the orthogonal projection onto the one-dimensional subspace spanned by \(q_i\).

Thus the matrix \(M\) is decomposed into independent spectral components. Each component has a direction \(q_i\) and a scale \(\lambda_i\).

For a graph matrix, this means that the graph action can be split into basic modes on the vertex set.
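The rank-one decomposition can be reconstructed directly; a minimal sketch with a hypothetical symmetric matrix:

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, Q = np.linalg.eigh(M)

# Rebuild M as a sum of rank-one terms lambda_i * q_i q_i^T.
M_rebuilt = sum(lam * np.outer(q, q) for lam, q in zip(eigvals, Q.T))
assert np.allclose(M_rebuilt, M)

# Each q_i q_i^T is an orthogonal projection: symmetric and idempotent.
P0 = np.outer(Q[:, 0], Q[:, 0])
assert np.allclose(P0 @ P0, P0)
assert np.allclose(P0, P0.T)
```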

## 81.4 Vertex Functions and Eigenbasis Expansion

A vector

$$
x\in \mathbb{R}^n
$$

may be viewed as a function on the vertices of a graph. If

$$
q_1,q_2,\ldots,q_n
$$

is an orthonormal eigenbasis, then every vertex function \(x\) has a unique expansion

$$
x=\sum_{i=1}^{n} c_i q_i,
$$

where

$$
c_i=q_i^T x.
$$

The coefficients \(c_i\) are the spectral coordinates of \(x\).

This is analogous to writing a signal as a sum of Fourier modes. The eigenvectors play the role of graph-dependent basis functions.
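The expansion and its coefficients can be computed in one line each; a sketch with a hypothetical matrix and a random vertex function:

```python
import numpy as np

rng = np.random.default_rng(0)
M = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
_, Q = np.linalg.eigh(M)

# A vertex function x and its spectral coordinates c_i = q_i^T x.
x = rng.standard_normal(3)
c = Q.T @ x

# The expansion x = sum_i c_i q_i recovers x exactly.
assert np.allclose(Q @ c, x)
```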

## 81.5 Matrix Action in Spectral Coordinates

If

$$
x=\sum_{i=1}^{n} c_i q_i,
$$

then

$$
Mx=\sum_{i=1}^{n} c_i Mq_i =
\sum_{i=1}^{n} c_i\lambda_i q_i.
$$

Thus \(M\) acts by multiplying each spectral coordinate by its eigenvalue.

In ordinary coordinates, \(M\) may mix many entries. In spectral coordinates, it acts diagonally. This is the main computational and conceptual value of the spectral theorem.
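The diagonal action is visible numerically: the spectral coordinates of \(Mx\) are exactly \(\lambda_i c_i\). A sketch with a hypothetical matrix:

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, Q = np.linalg.eigh(M)

x = np.array([1.0, -2.0])
c = Q.T @ x                     # spectral coordinates of x

# In spectral coordinates, M acts by scaling each coordinate:
# the coordinates of Mx are lambda_i * c_i.
assert np.allclose(Q.T @ (M @ x), eigvals * c)
```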

## 81.6 Powers of a Graph Matrix

The spectral theorem gives a simple formula for powers:

$$
M^k=Q\Lambda^kQ^T.
$$

Equivalently,

$$
M^k=\sum_{i=1}^{n}\lambda_i^k q_iq_i^T.
$$

For the adjacency matrix \(A\), the entry \((A^k)_{uv}\) counts walks of length \(k\) from \(u\) to \(v\). Therefore the spectral theorem connects walk counts to eigenvalues and eigenvectors.

Eigenvalues of largest absolute value dominate high powers, while smaller ones decay relative to them. Negative eigenvalues produce sign alternation between odd and even powers.
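The walk-counting interpretation can be checked on a tiny graph; a sketch using the path graph on three vertices:

```python
import numpy as np

# Adjacency matrix of the path graph 0 - 1 - 2.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
eigvals, Q = np.linalg.eigh(A)

# Compute A^4 spectrally: A^4 = Q Lam^4 Q^T.
A4 = Q @ np.diag(eigvals**4) @ Q.T

# Compare with direct matrix powering; entries count walks of length 4.
assert np.allclose(A4, np.linalg.matrix_power(A, 4))

# There are 4 closed walks of length 4 at the middle vertex:
# 1-0-1-0-1, 1-0-1-2-1, 1-2-1-0-1, 1-2-1-2-1.
assert round(A4[1, 1]) == 4
```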

## 81.7 Functions of Matrices

The same idea applies to functions of matrices. If

$$
M=Q\Lambda Q^T,
$$

then for a suitable function \(f\),

$$
f(M)=Qf(\Lambda)Q^T,
$$

where

$$
f(\Lambda)=
\operatorname{diag}(f(\lambda_1),f(\lambda_2),\ldots,f(\lambda_n)).
$$

For example,

$$
e^M=Qe^\Lambda Q^T,
$$

where

$$
e^\Lambda=
\operatorname{diag}(e^{\lambda_1},e^{\lambda_2},\ldots,e^{\lambda_n}).
$$

This construction appears in heat flow on graphs, diffusion kernels, communicability measures, and graph signal processing.
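The matrix exponential formula can be cross-checked against the defining Taylor series \(e^M=\sum_k M^k/k!\); a sketch with a hypothetical symmetric matrix:

```python
import math

import numpy as np

M = np.array([[0.0, 1.0],
              [1.0, 0.0]])
eigvals, Q = np.linalg.eigh(M)

# e^M via the spectral theorem: e^M = Q e^Lam Q^T.
expM = Q @ np.diag(np.exp(eigvals)) @ Q.T

# Cross-check against a truncated Taylor series sum_k M^k / k!.
series = sum(np.linalg.matrix_power(M, k) / math.factorial(k)
             for k in range(20))
assert np.allclose(expM, series)
```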

## 81.8 The Adjacency Spectral Theorem

Let \(G\) be a finite simple undirected graph with adjacency matrix \(A\). Since

$$
A^T=A,
$$

the real spectral theorem applies. Hence

$$
A=Q\Lambda Q^T
$$

for an orthogonal matrix \(Q\) and a real diagonal matrix \(\Lambda\).

The eigenvalues of \(A\) form the adjacency spectrum of \(G\). The columns of \(Q\) are orthonormal vertex functions. The equation

$$
Aq_i=\lambda_i q_i
$$

means that, at every vertex, the sum of neighboring values in \(q_i\) equals \(\lambda_i\) times the value at that vertex.

This gives a precise spectral description of adjacency.

## 81.9 The Laplacian Spectral Theorem

Let \(L\) be the Laplacian matrix of a finite simple undirected graph, where \(\Delta\) denotes the diagonal degree matrix:

$$
L=\Delta-A.
$$

Since both \(\Delta\) and \(A\) are symmetric,

$$
L^T=L.
$$

Therefore

$$
L=U\,\mathrm{M}\,U^T,
$$

where \(U\) is orthogonal and

$$
\mathrm{M}=
\operatorname{diag}(\mu_1,\mu_2,\ldots,\mu_n)
$$

contains the Laplacian eigenvalues.

Because

$$
x^TLx=\sum_{\{u,v\}\in E}(x_u-x_v)^2\geq 0,
$$

all Laplacian eigenvalues are nonnegative. Thus they may be ordered as

$$
0=\mu_1\leq \mu_2\leq \cdots\leq \mu_n.
$$

The Laplacian spectral theorem is the basis for spectral partitioning, graph smoothing, and graph Fourier analysis.
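Both facts, nonnegativity and the zero eigenvalue with a constant eigenvector, are easy to observe; a sketch using the 4-cycle as a hypothetical example graph:

```python
import numpy as np

# Adjacency matrix of the 4-cycle.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
Delta = np.diag(A.sum(axis=1))   # diagonal degree matrix
L = Delta - A                    # graph Laplacian

mu, U = np.linalg.eigh(L)        # eigenvalues in ascending order

# All Laplacian eigenvalues are nonnegative and the smallest is 0.
assert np.all(mu >= -1e-12)
assert abs(mu[0]) < 1e-12
# The eigenvector for mu_1 = 0 is constant on this connected graph.
assert np.allclose(U[:, 0], U[0, 0])
```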

## 81.10 Orthogonality of Distinct Eigenvectors

If \(M\) is symmetric, eigenvectors for distinct eigenvalues are orthogonal.

Suppose

$$
Mx=\lambda x,
\qquad
My=\mu y,
$$

with

$$
\lambda\neq \mu.
$$

Then

$$
x^TMy=\mu x^Ty.
$$

But since \(M\) is symmetric,

$$
x^TMy=(Mx)^Ty=(\lambda x)^Ty=\lambda x^Ty.
$$

Therefore

$$
\lambda x^Ty=\mu x^Ty.
$$

Since \(\lambda\neq \mu\),

$$
x^Ty=0.
$$

This proves orthogonality for distinct eigenspaces.

## 81.11 Multiplicity

An eigenvalue may have multiplicity greater than one. In that case, several linearly independent eigenvectors share that eigenvalue, and together they span its eigenspace.

For a symmetric matrix, the algebraic multiplicity and geometric multiplicity agree. Therefore an eigenvalue of multiplicity \(r\) has an \(r\)-dimensional eigenspace.

Any orthonormal basis of that eigenspace may be used in the spectral decomposition. Individual eigenvectors inside a repeated eigenspace may not be uniquely determined, but the eigenspace itself is intrinsic.

In graph theory, repeated eigenvalues often reflect symmetry, regularity, or special structure.

## 81.12 Projection Operators

For each eigenvector \(q_i\), the matrix

$$
P_i=q_iq_i^T
$$

is the orthogonal projection onto the line spanned by \(q_i\).

If an eigenvalue \(\lambda\) has eigenspace \(E_\lambda\), the projection onto that eigenspace is

$$
P_\lambda=\sum_{i:\lambda_i=\lambda} q_iq_i^T.
$$

Then

$$
M=\sum_{\lambda}\lambda P_\lambda,
$$

where the sum is over distinct eigenvalues.

This form separates the matrix into eigenspace projections rather than individual eigenvectors. It is often cleaner when eigenvalues have multiplicity greater than one.
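The eigenspace form can be assembled by grouping eigenvectors with equal eigenvalues; a sketch using a hypothetical matrix with eigenvalues \(1,1,4\):

```python
import numpy as np

# A symmetric matrix with a repeated eigenvalue: spectrum {1, 1, 4}.
M = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])
eigvals, Q = np.linalg.eigh(M)

# Group eigenvectors by (rounded) eigenvalue to form eigenspace projections.
P = {}
for lam, q in zip(np.round(eigvals, 8), Q.T):
    P[lam] = P.get(lam, np.zeros((3, 3))) + np.outer(q, q)

# Each P_lambda is an orthogonal projection ...
for Pl in P.values():
    assert np.allclose(Pl @ Pl, Pl)

# ... and M = sum over distinct eigenvalues of lambda * P_lambda.
M_rebuilt = sum(lam * Pl for lam, Pl in P.items())
assert np.allclose(M_rebuilt, M)
```

Rounding the eigenvalues before grouping is a pragmatic way to recognize a repeated eigenvalue in floating point; the projections themselves are independent of which orthonormal basis `eigh` happened to return inside the repeated eigenspace.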

## 81.13 Rayleigh Quotients

For a real symmetric matrix \(M\), the Rayleigh quotient of a nonzero vector \(x\) is

$$
R_M(x)=\frac{x^TMx}{x^Tx}.
$$

If \(x\) is a unit eigenvector with eigenvalue \(\lambda\), then

$$
R_M(x)=\lambda.
$$

The spectral theorem implies that the largest eigenvalue is

$$
\lambda_{\max} =
\max_{x\neq 0}\frac{x^TMx}{x^Tx},
$$

and the smallest eigenvalue is

$$
\lambda_{\min} =
\min_{x\neq 0}\frac{x^TMx}{x^Tx}.
$$

Rayleigh quotients are central in spectral graph theory because many graph quantities can be expressed as quadratic forms.
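The extremal characterization implies that every Rayleigh quotient lies between \(\lambda_{\min}\) and \(\lambda_{\max}\); a sketch testing this on random vectors for a hypothetical matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, _ = np.linalg.eigh(M)   # ascending: lambda_min first

def rayleigh(M, x):
    return (x @ M @ x) / (x @ x)

# Every Rayleigh quotient lies in [lambda_min, lambda_max].
for _ in range(100):
    x = rng.standard_normal(2)
    r = rayleigh(M, x)
    assert eigvals[0] - 1e-12 <= r <= eigvals[-1] + 1e-12
```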

## 81.14 Variational Form for the Laplacian

For the Laplacian matrix,

$$
R_L(x)=\frac{x^TLx}{x^Tx} =
\frac{\sum_{\{u,v\}\in E}(x_u-x_v)^2}{\sum_{v\in V}x_v^2}.
$$

The smallest value is \(0\), attained by vectors that are constant on each connected component.

For a connected graph, the second smallest eigenvalue has the variational form

$$
\mu_2 =
\min_{\substack{x\neq 0\\x\perp \mathbf{1}}}
\frac{x^TLx}{x^Tx}.
$$

This formula explains why \(\mu_2\) measures connectivity. It asks how small the edge variation can be among nonconstant vertex functions.
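The constrained minimization can be explored numerically: projecting random vectors orthogonal to \(\mathbf{1}\) and taking the smallest Rayleigh quotient approaches \(\mu_2\) from above. A sketch on the path graph with three vertices, whose Laplacian spectrum is \(\{0,1,3\}\):

```python
import numpy as np

# Laplacian of the path graph 0 - 1 - 2.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
mu, _ = np.linalg.eigh(L)

rng = np.random.default_rng(2)
ones = np.ones(3)
best = np.inf
for _ in range(2000):
    x = rng.standard_normal(3)
    x -= (x @ ones) / 3 * ones          # project out the constant part
    best = min(best, (x @ L @ x) / (x @ x))

assert best >= mu[1] - 1e-9             # never below mu_2 = 1
assert best - mu[1] < 0.05              # random search gets close
```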

## 81.15 Min-Max Principle

The spectral theorem also gives the Courant-Fischer min-max principle.

For a symmetric matrix \(M\) with eigenvalues

$$
\lambda_1\leq \lambda_2\leq \cdots\leq \lambda_n,
$$

one has

$$
\lambda_k =
\min_{\dim S=k}
\max_{\substack{x\in S\\x\neq 0}}
\frac{x^TMx}{x^Tx}.
$$

Equivalently,

$$
\lambda_k =
\max_{\dim S=n-k+1}
\min_{\substack{x\in S\\x\neq 0}}
\frac{x^TMx}{x^Tx}.
$$

This principle turns eigenvalue bounds into optimization problems over subspaces. In graph theory, it is used to prove inequalities involving connectivity, cuts, expansion, and graph partitioning.

## 81.16 Positive Semidefinite Matrices

A symmetric matrix \(M\) is positive semidefinite if

$$
x^TMx\geq 0
$$

for every vector \(x\).

By the spectral theorem, this is equivalent to all eigenvalues being nonnegative.

The Laplacian matrix is positive semidefinite because

$$
x^TLx=\sum_{\{u,v\}\in E}(x_u-x_v)^2.
$$

Therefore all Laplacian eigenvalues are nonnegative. This is a direct application of the spectral theorem to graph structure.

## 81.17 Normalized Laplacian

The symmetric normalized Laplacian is

$$
L_{\mathrm{sym}} =
I-\Delta^{-1/2}A\Delta^{-1/2}.
$$

When the graph has no isolated vertices, this matrix is real and symmetric. Therefore it also satisfies the real spectral theorem.

Its eigenvalues are real and lie in the interval

$$
[0,2].
$$

The normalized Laplacian is especially useful when degrees vary widely. It describes graph geometry after normalizing for degree.
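The eigenvalue bound \([0,2]\) can be observed directly; a sketch on a small hypothetical graph with varying degrees:

```python
import numpy as np

# A small graph on 4 vertices with degrees 2, 2, 3, 1.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
d = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))

# Symmetric normalized Laplacian L_sym = I - D^{-1/2} A D^{-1/2}.
L_sym = np.eye(4) - D_inv_sqrt @ A @ D_inv_sqrt

nu = np.linalg.eigvalsh(L_sym)
# All eigenvalues of the normalized Laplacian lie in [0, 2].
assert np.all(nu >= -1e-12) and np.all(nu <= 2 + 1e-12)
```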

## 81.18 Random Walk Matrix

The random walk matrix is

$$
P=\Delta^{-1}A.
$$

This matrix is generally not symmetric. Therefore the real spectral theorem does not apply directly.

However, if the graph is undirected and has no isolated vertices, then \(P\) is similar to the symmetric matrix

$$
\Delta^{-1/2}A\Delta^{-1/2}.
$$

Indeed,

$$
P =
\Delta^{-1/2}
\left(\Delta^{-1/2}A\Delta^{-1/2}\right)
\Delta^{1/2}.
$$

Thus \(P\) has real eigenvalues and can be studied through a symmetric matrix. This is a common technique in the spectral theory of random walks on graphs.
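The similarity argument can be verified numerically: \(P\) and \(\Delta^{-1/2}A\Delta^{-1/2}\) share the same (real) eigenvalues. A sketch on the star graph with three vertices:

```python
import numpy as np

# Star graph: vertex 0 joined to vertices 1 and 2.
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)
Delta = np.diag(A.sum(axis=1))
D_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))

P = np.linalg.inv(Delta) @ A        # random walk matrix, not symmetric
S = D_inv_sqrt @ A @ D_inv_sqrt     # similar symmetric matrix

# P and S are similar, so they share eigenvalues; since S is
# symmetric, those eigenvalues are real.
eig_P = np.sort(np.linalg.eigvals(P).real)
eig_S = np.sort(np.linalg.eigvalsh(S))
assert np.allclose(eig_P, eig_S)
assert np.max(np.abs(np.linalg.eigvals(P).imag)) < 1e-12
```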

## 81.19 Directed Graphs

For directed graphs, the adjacency matrix is usually not symmetric. Hence the real spectral theorem may fail.

A directed adjacency matrix can have complex eigenvalues. It may also fail to have a full basis of eigenvectors.

For this reason, spectral theory of directed graphs is more delicate. One may use singular values, Hermitian symmetrizations, normal matrices, magnetic Laplacians, or Perron-Frobenius theory for nonnegative matrices.

The cleanest form of the spectral theorem belongs to undirected graphs because their main matrices are symmetric.

## 81.20 Spectral Filtering

The spectral theorem allows one to define filters on graphs.

Let

$$
L=U\,\mathrm{M}\,U^T.
$$

A graph signal \(x\) can be transformed to spectral coordinates by

$$
\hat{x}=U^Tx.
$$

A filter \(g\) acts by multiplying each spectral coordinate:

$$
g(L)x =
Ug(\mathrm{M})U^Tx.
$$

Here

$$
g(\mathrm{M})=\operatorname{diag}(g(\mu_1),g(\mu_2),\ldots,g(\mu_n)).
$$

Low-pass filters preserve small Laplacian eigenvalues and reduce large ones. High-pass filters do the opposite. This formalism is used in graph signal processing and graph neural networks.
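A low-pass filter can be sketched with a hard spectral cutoff (the cutoff value 1.0 is a hypothetical choice for illustration):

```python
import numpy as np

# Laplacian of the path graph on 4 vertices.
A = np.diag(np.ones(3), 1) + np.diag(np.ones(3), -1)
L = np.diag(A.sum(axis=1)) - A
mu, U = np.linalg.eigh(L)

x = np.array([1.0, -1.0, 1.0, -1.0])    # a highly oscillatory signal

# Low-pass filter: keep only components with small Laplacian
# eigenvalue (hypothetical hard cutoff at 1.0).
g = (mu < 1.0).astype(float)
x_filtered = U @ np.diag(g) @ U.T @ x

# The filtered signal has smaller Dirichlet energy x^T L x.
assert x_filtered @ L @ x_filtered <= x @ L @ x + 1e-12
```

Smoother filter shapes (e.g. \(g(\mu)=e^{-\tau\mu}\)) follow the same pattern: apply \(g\) entrywise to the eigenvalues and conjugate by \(U\).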

## 81.21 Heat Flow on Graphs

The heat equation on a graph can be written as

$$
\frac{dx}{dt}=-Lx.
$$

Using the spectral theorem,

$$
x(t)=e^{-tL}x(0).
$$

If

$$
L=U\,\mathrm{M}\,U^T,
$$

then

$$
e^{-tL}=Ue^{-t\mathrm{M}}U^T.
$$

Each spectral component decays at rate

$$
e^{-t\mu_i}.
$$

The zero eigenvalue component remains unchanged. Large eigenvalue components decay quickly. Thus heat flow smooths functions on the graph by suppressing high-frequency variation.
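Heat flow can be simulated with this formula; a sketch on the path graph with three vertices, starting from a unit spike of heat at one endpoint:

```python
import numpy as np

# Heat flow on the path graph 0 - 1 - 2.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
mu, U = np.linalg.eigh(L)

x0 = np.array([1.0, 0.0, 0.0])   # initial spike at vertex 0

def heat(t):
    # x(t) = U e^{-t M} U^T x(0), where M = diag(mu_i).
    return U @ np.diag(np.exp(-t * mu)) @ U.T @ x0

# Total heat is conserved (the mu = 0 mode never decays), and the
# solution approaches the constant vector with the same mean.
assert abs(heat(5.0).sum() - x0.sum()) < 1e-9
assert np.allclose(heat(50.0), np.full(3, x0.sum() / 3), atol=1e-6)
```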

## 81.22 Summary

The spectral theorem is the linear algebraic engine behind spectral graph theory.

For any real symmetric graph matrix \(M\),

$$
M=Q\Lambda Q^T.
$$

This gives an orthonormal eigenbasis, real eigenvalues, diagonal action in spectral coordinates, and a decomposition into independent modes.

| Concept | Spectral meaning |
|---|---|
| Symmetric graph matrix | Has real eigenvalues and orthonormal eigenvectors |
| Orthogonal diagonalization | \(M=Q\Lambda Q^T\) |
| Spectral decomposition | \(M=\sum_i \lambda_i q_iq_i^T\) |
| Matrix powers | \(M^k=Q\Lambda^kQ^T\) |
| Rayleigh quotient | Variational form of eigenvalues |
| Laplacian positivity | Nonnegative eigenvalues |
| Heat flow | Spectral components decay by \(e^{-t\mu_i}\) |
| Graph filtering | Eigenvectors act as graph frequency modes |

The adjacency matrix, Laplacian matrix, and normalized Laplacian become powerful because they are symmetric in the undirected case. The spectral theorem converts their action into independent scalar multiplications along orthogonal eigenvector directions.
