Spectral theorems explain why eigenvalues and eigenvectors are useful in graph theory. They say that certain graph matrices can be decomposed into simple orthogonal pieces. This makes it possible to study a graph by studying the numerical spectrum of a matrix associated with it.
For a finite simple undirected graph, the adjacency matrix is real and symmetric. The Laplacian matrix is also real and symmetric. Therefore both matrices satisfy the real spectral theorem: their eigenvalues are real, and their eigenvectors may be chosen to form an orthonormal basis. This is the linear algebraic foundation of spectral graph theory.
81.1 The Real Spectral Theorem
Let $A \in \mathbb{R}^{n \times n}$ be a real symmetric matrix: $A^{\top} = A$.
The real spectral theorem states that there exist real eigenvalues $\lambda_1, \lambda_2, \dots, \lambda_n$ and an orthonormal basis of eigenvectors $v_1, v_2, \dots, v_n$ such that
$$A v_i = \lambda_i v_i$$
for each $i$.
Equivalently, there is an orthogonal matrix $Q$, whose columns are the eigenvectors $v_i$, and a diagonal matrix $\Lambda = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$ such that
$$A = Q \Lambda Q^{\top}.$$
This is called an orthogonal diagonalization of $A$.
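As a quick numerical sketch, the diagonalization can be computed and checked with NumPy's `eigh`, which is designed for symmetric matrices. The path graph on four vertices used here is an arbitrary example, not one fixed by the text.

```python
import numpy as np

# Adjacency matrix of the path graph 0-1-2-3 (an assumed small example).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# eigh handles symmetric matrices: real eigenvalues in ascending order,
# orthonormal eigenvectors as the columns of Q.
eigenvalues, Q = np.linalg.eigh(A)
Lam = np.diag(eigenvalues)

print(np.allclose(A, Q @ Lam @ Q.T))    # orthogonal diagonalization holds
print(np.allclose(Q.T @ Q, np.eye(4)))  # Q is orthogonal
```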
81.2 Orthogonal Matrices
A matrix $Q \in \mathbb{R}^{n \times n}$ is orthogonal if
$$Q^{\top} Q = Q Q^{\top} = I.$$
Equivalently, $Q^{-1} = Q^{\top}$.
The columns $q_1, \dots, q_n$ of an orthogonal matrix form an orthonormal basis. Thus $q_i^{\top} q_j = 1$ if $i = j$ and $q_i^{\top} q_j = 0$ otherwise.
Orthogonal matrices preserve lengths and angles. If $Q$ is orthogonal, then $\|Q x\| = \|x\|$ for every vector $x$. In spectral graph theory, this matters because decomposition into eigenvectors does not distort the geometry of the vector space.
81.3 Spectral Decomposition
The identity $A = Q \Lambda Q^{\top}$ can be written as a sum of rank-one matrices:
$$A = \sum_{i=1}^{n} \lambda_i \, v_i v_i^{\top}.$$
Each matrix $P_i = v_i v_i^{\top}$ is the orthogonal projection onto the one-dimensional subspace spanned by $v_i$.
Thus the matrix $A$ is decomposed into $n$ independent spectral components. Each component has a direction $v_i$ and a scale $\lambda_i$.
For a graph matrix, this means that the graph action can be split into basic modes on the vertex set.
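The rank-one decomposition can be verified directly, again on an assumed small example graph:

```python
import numpy as np

# Adjacency matrix of the path graph 0-1-2-3 (assumed example).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
eigenvalues, Q = np.linalg.eigh(A)

# Rebuild A as the sum of rank-one pieces lambda_i * v_i v_i^T.
A_rebuilt = sum(lam * np.outer(v, v) for lam, v in zip(eigenvalues, Q.T))

# Each piece v_i v_i^T is an orthogonal projection: symmetric and idempotent.
P0 = np.outer(Q[:, 0], Q[:, 0])
print(np.allclose(A_rebuilt, A))
print(np.allclose(P0 @ P0, P0))
```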
81.4 Vertex Functions and Eigenbasis Expansion
A vector $f \in \mathbb{R}^{n}$ may be viewed as a function on the vertices of a graph with $n$ vertices. If $v_1, \dots, v_n$ is an orthonormal eigenbasis, then every vertex function has a unique expansion
$$f = \sum_{i=1}^{n} \hat{f}_i \, v_i, \qquad \text{where} \qquad \hat{f}_i = v_i^{\top} f.$$
The coefficients $\hat{f}_i$ are the spectral coordinates of $f$.
This is analogous to writing a signal as a sum of Fourier modes. The eigenvectors play the role of graph-dependent basis functions.
81.5 Matrix Action in Spectral Coordinates
If $f = \sum_i \hat{f}_i v_i$, then
$$A f = \sum_{i=1}^{n} \lambda_i \hat{f}_i \, v_i.$$
Thus $A$ acts by multiplying each spectral coordinate by its eigenvalue.
In ordinary coordinates, may mix many entries. In spectral coordinates, it acts diagonally. This is the main computational and conceptual value of the spectral theorem.
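The eigenbasis expansion and the diagonal action can both be checked numerically on an assumed example graph and an arbitrary vertex function:

```python
import numpy as np

# Path graph 0-1-2-3 (assumed example).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
eigenvalues, Q = np.linalg.eigh(A)

f = np.array([1.0, 2.0, 0.0, -1.0])   # an arbitrary vertex function

f_hat = Q.T @ f     # spectral coordinates  f_hat_i = v_i . f
f_back = Q @ f_hat  # the expansion f = sum_i f_hat_i v_i recovers f

# A acts diagonally in spectral coordinates: multiply each one by lambda_i.
Af_spectral = Q @ (eigenvalues * f_hat)
Af_direct = A @ f
print(np.allclose(Af_spectral, Af_direct))
```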
81.6 Powers of a Graph Matrix
The spectral theorem gives a simple formula for powers:
$$A^{k} = Q \Lambda^{k} Q^{\top}.$$
Equivalently,
$$A^{k} = \sum_{i=1}^{n} \lambda_i^{k} \, v_i v_i^{\top}.$$
For the adjacency matrix $A$, the entry $(A^{k})_{uv}$ counts walks of length $k$ from $u$ to $v$. Therefore the spectral theorem connects walk counts to eigenvalues and eigenvectors.
Large eigenvalues dominate high powers. Small eigenvalues decay relative to large ones. Negative eigenvalues create sign alternation in odd and even powers.
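A small walk-counting sketch, using the same assumed path graph: the entries of $A^3$ can be checked against walks listed by hand, and against the spectral formula.

```python
import numpy as np

# Path graph 0-1-2-3 (assumed example).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])

A3 = np.linalg.matrix_power(A, 3)

# Length-3 walks from 0 to 1: 0-1-0-1 and 0-1-2-1, so the entry is 2.
print(A3[0, 1])
# Length-3 walks from 1 to 2: 1-0-1-2, 1-2-1-2, 1-2-3-2, so the entry is 3.
print(A3[1, 2])

# The same power via the spectral formula A^k = Q Lambda^k Q^T.
eigenvalues, Q = np.linalg.eigh(A.astype(float))
A3_spectral = Q @ np.diag(eigenvalues**3) @ Q.T
```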
81.7 Functions of Matrices
The same idea applies to functions of matrices. If $A = Q \Lambda Q^{\top}$, then for a suitable function $g$,
$$g(A) = Q \, g(\Lambda) \, Q^{\top},$$
where
$$g(\Lambda) = \operatorname{diag}\bigl(g(\lambda_1), \dots, g(\lambda_n)\bigr).$$
For example,
$$e^{A} = Q \, e^{\Lambda} \, Q^{\top}, \qquad \text{where} \qquad e^{\Lambda} = \operatorname{diag}\bigl(e^{\lambda_1}, \dots, e^{\lambda_n}\bigr).$$
This construction appears in heat flow on graphs, diffusion kernels, communicability measures, and graph signal processing.
81.8 The Adjacency Spectral Theorem
Let $G$ be a finite simple undirected graph with adjacency matrix $A$. Since $A^{\top} = A$, the real spectral theorem applies. Hence
$$A = Q \Lambda Q^{\top}$$
for an orthogonal matrix $Q$ and a real diagonal matrix $\Lambda$.
The eigenvalues of $A$ form the adjacency spectrum of $G$. The columns of $Q$ are orthonormal vertex functions. The equation $A v = \lambda v$ means that, at every vertex, the sum of the values of $v$ over the neighboring vertices equals $\lambda$ times the value at that vertex.
This gives a precise spectral description of adjacency.
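The neighbor-sum description of the eigenvalue equation can be checked directly, here for the largest adjacency eigenvalue of an assumed path graph:

```python
import numpy as np

# Path graph 0-1-2-3 (assumed example).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
eigenvalues, Q = np.linalg.eigh(A)

lam = eigenvalues[-1]   # largest adjacency eigenvalue
v = Q[:, -1]            # a corresponding unit eigenvector

# At each vertex u, summing v over the neighbors of u gives lam * v[u]:
# A @ v computes exactly those neighbor sums.
neighbor_sums = A @ v
print(np.allclose(neighbor_sums, lam * v))
```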
81.9 The Laplacian Spectral Theorem
Let $L$ be the Laplacian matrix of a finite simple undirected graph:
$$L = D - A,$$
where $D$ is the diagonal degree matrix.
Since both $D$ and $A$ are symmetric, $L^{\top} = L$. Therefore
$$L = Q \Lambda Q^{\top},$$
where $Q$ is orthogonal and $\Lambda = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$ contains the Laplacian eigenvalues.
Because
$$x^{\top} L x = \sum_{\{u,v\} \in E} (x_u - x_v)^2 \ge 0,$$
all Laplacian eigenvalues are nonnegative. Thus they may be ordered as
$$0 = \lambda_1 \le \lambda_2 \le \cdots \le \lambda_n.$$
The Laplacian spectral theorem is the basis for spectral partitioning, graph smoothing, and graph Fourier analysis.
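A minimal check of the quadratic-form identity and of eigenvalue nonnegativity, on the same assumed example graph:

```python
import numpy as np

# Path graph 0-1-2-3 (assumed example): adjacency, degrees, Laplacian.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

mu, _ = np.linalg.eigh(L)   # ascending order; the smallest is 0

# The quadratic form equals the sum of squared differences over edges.
x = np.array([3.0, 1.0, 4.0, 1.0])           # arbitrary vertex function
edges = [(0, 1), (1, 2), (2, 3)]
edge_sum = sum((x[u] - x[v]) ** 2 for u, v in edges)
print(np.isclose(x @ L @ x, edge_sum))
```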
81.10 Orthogonality of Distinct Eigenvectors
If $A$ is symmetric, eigenvectors for distinct eigenvalues are orthogonal.
Suppose
$$A v = \lambda v \qquad \text{and} \qquad A w = \mu w$$
with $\lambda \ne \mu$. Then
$$\lambda \, v^{\top} w = (A v)^{\top} w = v^{\top} A^{\top} w.$$
But since $A$ is symmetric,
$$v^{\top} A^{\top} w = v^{\top} A w = \mu \, v^{\top} w.$$
Therefore
$$(\lambda - \mu) \, v^{\top} w = 0.$$
Since $\lambda \ne \mu$, it follows that $v^{\top} w = 0$.
This proves orthogonality for distinct eigenspaces.
81.11 Multiplicity
An eigenvalue may have multiplicity greater than one. In that case, its eigenspace contains more than one independent direction.
For a symmetric matrix, the algebraic multiplicity and geometric multiplicity agree. Therefore an eigenvalue of multiplicity $m$ has an $m$-dimensional eigenspace.
Any orthonormal basis of that eigenspace may be used in the spectral decomposition. Individual eigenvectors inside a repeated eigenspace may not be uniquely determined, but the eigenspace itself is intrinsic.
In graph theory, repeated eigenvalues often reflect symmetry, regularity, or special structure.
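The 4-cycle is a small example of symmetry forcing a repeated eigenvalue (the choice of graph here is an assumption for illustration):

```python
import numpy as np

# Adjacency matrix of the 4-cycle 0-1-2-3-0 (assumed example).
# Its rotational symmetry forces a repeated eigenvalue:
# the adjacency spectrum of C4 is -2, 0, 0, 2.
C = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

eigenvalues, Q = np.linalg.eigh(C)
print(np.round(eigenvalues, 8))   # the eigenvalue 0 has multiplicity 2
```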
81.12 Projection Operators
For each eigenvector $v_i$, the matrix
$$P_i = v_i v_i^{\top}$$
is the orthogonal projection onto the line spanned by $v_i$.
If an eigenvalue $\lambda$ has eigenspace $E_\lambda$ with orthonormal basis $v_{i_1}, \dots, v_{i_m}$, the projection onto that eigenspace is
$$P_\lambda = \sum_{j=1}^{m} v_{i_j} v_{i_j}^{\top}.$$
Then
$$A = \sum_{\lambda} \lambda \, P_\lambda,$$
where the sum is over distinct eigenvalues.
This form separates the matrix into eigenspace projections rather than individual eigenvectors. It is often cleaner when eigenvalues have multiplicity greater than one.
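A sketch of the eigenspace-projection form, on the 4-cycle (an assumed example whose eigenvalue 0 is repeated). Eigenvalues are grouped by rounding, an ad hoc but adequate device at this scale:

```python
import numpy as np

# 4-cycle 0-1-2-3-0 (assumed example); spectrum -2, 0, 0, 2.
C = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
eigenvalues, Q = np.linalg.eigh(C)

# Group eigenvectors by (rounded) eigenvalue into eigenspace projections.
projections = {}
for lam, v in zip(np.round(eigenvalues, 8), Q.T):
    projections[lam] = projections.get(lam, np.zeros((4, 4))) + np.outer(v, v)

# Rebuild C as the sum over distinct eigenvalues of lam * P_lam.
C_rebuilt = sum(lam * P for lam, P in projections.items())
print(len(projections))            # 3 distinct eigenvalues
print(np.allclose(C_rebuilt, C))
```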
81.13 Rayleigh Quotients
For a real symmetric matrix $A$, the Rayleigh quotient of a nonzero vector $x$ is
$$R(x) = \frac{x^{\top} A x}{x^{\top} x}.$$
If $x$ is a unit eigenvector with eigenvalue $\lambda$, then $R(x) = \lambda$.
The spectral theorem implies that the largest eigenvalue is
$$\lambda_{\max} = \max_{x \ne 0} \frac{x^{\top} A x}{x^{\top} x},$$
and the smallest eigenvalue is
$$\lambda_{\min} = \min_{x \ne 0} \frac{x^{\top} A x}{x^{\top} x}.$$
Rayleigh quotients are central in spectral graph theory because many graph quantities can be expressed as quadratic forms.
81.14 Variational Form for the Laplacian
For the Laplacian matrix,
$$\frac{x^{\top} L x}{x^{\top} x} = \frac{\sum_{\{u,v\} \in E} (x_u - x_v)^2}{\sum_{u} x_u^2}.$$
The smallest value is $0$, attained by constant vectors on each connected component.
For a connected graph, the second smallest eigenvalue $\lambda_2$ has the variational form
$$\lambda_2 = \min_{\substack{x \ne 0 \\ x \perp \mathbf{1}}} \frac{\sum_{\{u,v\} \in E} (x_u - x_v)^2}{\sum_{u} x_u^2}.$$
This formula explains why $\lambda_2$ measures connectivity. It asks how small the edge variation can be among nonconstant vertex functions.
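The variational characterization can be checked numerically: the second eigenvector attains the minimum, and any other vector orthogonal to the constants scores at least as high. The path graph is again an assumed example.

```python
import numpy as np

# Path graph 0-1-2-3 (assumed example) and its Laplacian.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
mu, Q = np.linalg.eigh(L)

lambda2 = mu[1]     # the algebraic connectivity
fiedler = Q[:, 1]   # its eigenvector, orthogonal to the constant vector

# The Rayleigh quotient of the second eigenvector attains lambda2.
rq = fiedler @ L @ fiedler / (fiedler @ fiedler)

# Any other vector orthogonal to the all-ones vector scores at least lambda2.
x = np.array([3.0, 1.0, -1.0, -3.0])   # entries sum to zero
rq_x = x @ L @ x / (x @ x)
print(rq_x >= lambda2)
```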
81.15 Min-Max Principle
The spectral theorem also gives the Courant-Fischer min-max principle.
For a symmetric matrix $A$ with eigenvalues
$$\lambda_1 \le \lambda_2 \le \cdots \le \lambda_n,$$
one has
$$\lambda_k = \min_{\dim S = k} \; \max_{0 \ne x \in S} \frac{x^{\top} A x}{x^{\top} x}.$$
Equivalently,
$$\lambda_k = \max_{\dim S = n - k + 1} \; \min_{0 \ne x \in S} \frac{x^{\top} A x}{x^{\top} x}.$$
This principle turns eigenvalue bounds into optimization problems over subspaces. In graph theory, it is used to prove inequalities involving connectivity, cuts, expansion, and graph partitioning.
81.16 Positive Semidefinite Matrices
A symmetric matrix $M$ is positive semidefinite if
$$x^{\top} M x \ge 0$$
for every vector $x$.
By the spectral theorem, this is equivalent to all eigenvalues being nonnegative.
The Laplacian matrix is positive semidefinite because
$$x^{\top} L x = \sum_{\{u,v\} \in E} (x_u - x_v)^2 \ge 0.$$
Therefore all Laplacian eigenvalues are nonnegative. This is a direct application of the spectral theorem to graph structure.
81.17 Normalized Laplacian
The symmetric normalized Laplacian is
$$\mathcal{L} = I - D^{-1/2} A D^{-1/2}.$$
When the graph has no isolated vertices, this matrix is real and symmetric. Therefore it also satisfies the real spectral theorem.
Its eigenvalues are real and lie in the interval $[0, 2]$.
The normalized Laplacian is especially useful when degrees vary widely. It studies graph geometry after degree normalization.
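A quick sketch of the normalized Laplacian on an assumed example graph, confirming symmetry and the eigenvalue range:

```python
import numpy as np

# Path graph 0-1-2-3 (assumed example); it has no isolated vertices.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
d = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))

# Symmetric normalized Laplacian I - D^{-1/2} A D^{-1/2}.
L_norm = np.eye(4) - D_inv_sqrt @ A @ D_inv_sqrt
nu, _ = np.linalg.eigh(L_norm)
print(nu)   # real eigenvalues, all within [0, 2], smallest equal to 0
```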
81.18 Random Walk Matrix
The random walk matrix is
$$P = D^{-1} A.$$
This matrix is generally not symmetric. Therefore the real spectral theorem does not apply directly.
However, if the graph is undirected and has no isolated vertices, then $P$ is similar to the symmetric matrix
$$\mathcal{N} = D^{-1/2} A D^{-1/2}.$$
Indeed,
$$P = D^{-1} A = D^{-1/2} \bigl( D^{-1/2} A D^{-1/2} \bigr) D^{1/2} = D^{-1/2} \, \mathcal{N} \, D^{1/2}.$$
Thus $P$ has real eigenvalues and can be studied through a symmetric matrix. This is a common technique in the spectral theory of random walks on graphs.
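The similarity argument can be checked numerically on an assumed example graph: the random walk matrix is not symmetric, yet its spectrum matches that of the symmetric matrix it is similar to.

```python
import numpy as np

# Path graph 0-1-2-3 (assumed example); no isolated vertices.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
d = A.sum(axis=1)

P = A / d[:, None]               # random walk matrix D^{-1} A
N = A / np.sqrt(np.outer(d, d))  # symmetric D^{-1/2} A D^{-1/2}

# P is not symmetric, but it is similar to N, so its eigenvalues are
# the real eigenvalues of N.
eig_P = np.sort(np.linalg.eigvals(P).real)
eig_N = np.sort(np.linalg.eigh(N)[0])
print(np.allclose(eig_P, eig_N, atol=1e-8))
```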
81.19 Directed Graphs
For directed graphs, the adjacency matrix is usually not symmetric. Hence the real spectral theorem does not apply.
A directed adjacency matrix can have complex eigenvalues. It may also fail to have a full basis of eigenvectors.
For this reason, spectral theory of directed graphs is more delicate. One may use singular values, Hermitian symmetrizations, normal matrices, magnetic Laplacians, or Perron-Frobenius theory for nonnegative matrices.
The cleanest form of the spectral theorem belongs to undirected graphs because their main matrices are symmetric.
81.20 Spectral Filtering
The spectral theorem allows one to define filters on graphs.
Let
$$L = Q \Lambda Q^{\top}.$$
A graph signal $f$ can be transformed to spectral coordinates by
$$\hat{f} = Q^{\top} f.$$
A filter acts by multiplying each spectral coordinate:
$$g(L) f = Q \, g(\Lambda) \, \hat{f}.$$
Here
$$g(\Lambda) = \operatorname{diag}\bigl(g(\lambda_1), \dots, g(\lambda_n)\bigr).$$
Low-pass filters preserve the coordinates with small Laplacian eigenvalues and reduce those with large ones. High-pass filters do the opposite. This formalism is used in graph signal processing and graph neural networks.
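A minimal low-pass filter sketch on an assumed example graph; the hard cutoff at eigenvalue 1.0 is an arbitrary choice made only for this illustration:

```python
import numpy as np

# Path graph 0-1-2-3 (assumed example) and its Laplacian.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
mu, Q = np.linalg.eigh(L)

f = np.array([1.0, -1.0, 1.0, -1.0])   # a rough, rapidly varying signal

# Low-pass filter: keep only spectral coordinates whose Laplacian
# eigenvalue is below the (arbitrary) cutoff 1.0.
h = (mu < 1.0).astype(float)
f_smooth = Q @ (h * (Q.T @ f))

# The filtered signal has smaller edge variation x^T L x.
variation_before = f @ L @ f
variation_after = f_smooth @ L @ f_smooth
print(variation_after <= variation_before)
```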
81.21 Heat Flow on Graphs
The heat equation on a graph can be written as
$$\frac{d f}{d t} = -L f.$$
Using the spectral theorem, the solution is
$$f(t) = e^{-t L} f(0) = Q \, e^{-t \Lambda} \, Q^{\top} f(0).$$
If
$$f(0) = \sum_{i=1}^{n} \hat{f}_i \, v_i,$$
then
$$f(t) = \sum_{i=1}^{n} e^{-\lambda_i t} \hat{f}_i \, v_i.$$
Each spectral component decays by the factor $e^{-\lambda_i t}$.
The zero eigenvalue component remains unchanged. Large eigenvalue components decay quickly. Thus heat flow smooths functions on the graph by suppressing high-frequency variation.
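A sketch of heat flow computed in spectral coordinates, on an assumed example graph with heat initially concentrated at one vertex; the time $t = 5$ is an arbitrary choice:

```python
import numpy as np

# Path graph 0-1-2-3 (assumed example) and its Laplacian.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
mu, Q = np.linalg.eigh(L)

f0 = np.array([1.0, 0.0, 0.0, 0.0])   # heat concentrated at vertex 0
t = 5.0

# f(t) = e^{-tL} f(0): each spectral component is damped by exp(-mu_i t);
# the mu = 0 component (total heat) survives unchanged.
f_t = Q @ (np.exp(-t * mu) * (Q.T @ f0))
print(f_t)   # close to the uniform distribution 0.25 on each vertex
```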
81.22 Summary
The spectral theorem is the linear algebraic engine behind spectral graph theory.
For any real symmetric graph matrix $M$,
$$M = Q \Lambda Q^{\top} = \sum_{i=1}^{n} \lambda_i \, v_i v_i^{\top}.$$
This gives an orthonormal eigenbasis, real eigenvalues, diagonal action in spectral coordinates, and a decomposition into independent modes.
| Concept | Spectral meaning |
|---|---|
| Symmetric graph matrix | Has real eigenvalues and orthonormal eigenvectors |
| Orthogonal diagonalization | $A = Q \Lambda Q^{\top}$ |
| Spectral decomposition | $A = \sum_i \lambda_i v_i v_i^{\top}$ |
| Matrix powers | $A^k = \sum_i \lambda_i^k v_i v_i^{\top}$ |
| Rayleigh quotient | Variational form of eigenvalues |
| Laplacian positivity | Nonnegative eigenvalues |
| Heat flow | Spectral components decay by $e^{-\lambda_i t}$ |
| Graph filtering | Eigenvectors act as graph frequency modes |
The adjacency matrix, Laplacian matrix, and normalized Laplacian become powerful because they are symmetric in the undirected case. The spectral theorem converts their action into independent scalar multiplications along orthogonal eigenvector directions.