# Chapter 136. Spectral Graph Theory

## 136.1 Introduction

Spectral graph theory studies graphs using matrices and their eigenvalues.

A graph is a combinatorial object made of vertices and edges. Linear algebra enters through matrices associated with the graph, such as adjacency matrices and Laplacian matrices. The eigenvalues and eigenvectors of these matrices encode structural information about connectivity, expansion, clustering, random walks, and dynamics on the graph.

The subject lies at the intersection of:

| Area | Connection |
|---|---|
| Linear algebra | Eigenvalues and eigenvectors |
| Graph theory | Combinatorial structure |
| Probability | Random walks and Markov chains |
| Computer science | Networks and algorithms |
| Geometry | Discrete analogues of manifolds |
| Physics | Diffusion and vibration |
| Data science | Spectral clustering |

Spectral graph theory translates combinatorial problems into matrix problems.

## 136.2 Graphs

A graph \(G\) consists of:

1. A set of vertices \(V\),
2. A set of edges \(E\).

If the graph has \(n\) vertices, one often labels them

$$
1,2,\ldots,n.
$$

An edge between vertices \(i\) and \(j\) is written

$$
(i,j).
$$

A graph is undirected if:

$$
(i,j)\in E
\quad\Longrightarrow\quad
(j,i)\in E.
$$

Otherwise it is directed.

Throughout most of spectral graph theory, graphs are undirected unless stated otherwise.

## 136.3 Adjacency Matrix

The adjacency matrix of a graph with \(n\) vertices is the matrix

$$
A=(a_{ij}),
$$

where

$$
a_{ij} =
\begin{cases}
1, & \text{if } i \text{ and } j \text{ are adjacent},\\
0, & \text{otherwise}.
\end{cases}
$$

For an undirected graph,

$$
A=A^T.
$$

Thus the adjacency matrix is symmetric, so its eigenvalues are real.

Example:

If the graph has edges

$$
(1,2),\quad (2,3),
$$

then

$$
A =
\begin{bmatrix}
0 & 1 & 0\\
1 & 0 & 1\\
0 & 1 & 0
\end{bmatrix}.
$$

The adjacency matrix converts graph structure into linear algebra.
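As an illustration, the adjacency matrix above can be built directly from the edge list; a minimal NumPy sketch (vertices are 0-indexed in code, so vertex \(k\) in the text is index \(k-1\)):

```python
import numpy as np

# Edges (1,2) and (2,3) from the example, written 0-indexed.
edges = [(0, 1), (1, 2)]
n = 3

A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = 1.0
    A[j, i] = 1.0  # undirected graph: store each edge in both directions
```

Symmetry of `A` is automatic here because both directions of every edge are stored.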

## 136.4 Degree Matrix

The degree of a vertex is the number of edges incident to it.

If vertex \(i\) has degree \(d_i\), then the degree matrix is

$$
D =
\operatorname{diag}(d_1,\ldots,d_n).
$$

This is a diagonal matrix.

For the graph

$$
1-2-3,
$$

the degrees are:

| Vertex | Degree |
|---|---|
| \(1\) | \(1\) |
| \(2\) | \(2\) |
| \(3\) | \(1\) |

Thus

$$
D =
\begin{bmatrix}
1 & 0 & 0\\
0 & 2 & 0\\
0 & 0 & 1
\end{bmatrix}.
$$

The degree matrix records local connectivity information.

## 136.5 Graph Laplacian

The Laplacian matrix of a graph is

$$
L=D-A.
$$

Explicitly:

$$
L_{ij} =
\begin{cases}
d_i, & i=j,\\
-1, & i\ne j \text{ and adjacent},\\
0, & \text{otherwise}.
\end{cases}
$$

For the path graph

$$
1-2-3,
$$

the Laplacian is

$$
L =
\begin{bmatrix}
1 & -1 & 0\\
-1 & 2 & -1\\
0 & -1 & 1
\end{bmatrix}.
$$

The Laplacian is one of the central matrices of spectral graph theory.

It acts as a discrete analogue of the continuous Laplacian operator from differential equations.
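The construction \(L = D - A\) for the path graph above takes two lines in NumPy; a minimal sketch:

```python
import numpy as np

# Adjacency matrix of the path graph 1 - 2 - 3.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])

D = np.diag(A.sum(axis=1))  # degree matrix: row sums on the diagonal
L = D - A                   # graph Laplacian
```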

## 136.6 Basic Properties of the Laplacian

The graph Laplacian has several important properties.

### Symmetry

For undirected graphs:

$$
L=L^T.
$$

Therefore all eigenvalues are real.

### Positive Semidefiniteness

For every vector \(x\in\mathbb{R}^n\),

$$
x^TLx =
\sum_{\{i,j\}\in E}(x_i-x_j)^2,
$$

where the sum runs over each edge once.

Thus:

$$
x^TLx\ge0.
$$

So the Laplacian is positive semidefinite.

### Zero Eigenvalue

The vector

$$
\mathbf{1} =
\begin{bmatrix}
1\\
1\\
\vdots\\
1
\end{bmatrix}
$$

satisfies

$$
L\mathbf{1}=0.
$$

Thus \(0\) is always an eigenvalue.
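The three properties above can be checked numerically on the path-graph Laplacian; a minimal NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Laplacian of the path graph 1 - 2 - 3.
L = np.array([[ 1., -1.,  0.],
              [-1.,  2., -1.],
              [ 0., -1.,  1.]])

ones = np.ones(3)
kernel_check = L @ ones            # the all-ones vector lies in the kernel

x = rng.standard_normal(3)
quad = x @ L @ x                   # quadratic form x^T L x
edge_sum = (x[0] - x[1])**2 + (x[1] - x[2])**2  # sum over edges, each once
```

The quadratic form agrees with the edge sum, is nonnegative, and `L @ ones` vanishes.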

## 136.7 Connectivity and Eigenvalues

The Laplacian eigenvalues encode connectivity.

Order the eigenvalues as:

$$
0=\lambda_1\le\lambda_2\le\cdots\le\lambda_n.
$$

The multiplicity of the eigenvalue \(0\) equals the number of connected components of the graph.

Thus:

| Multiplicity of \(0\) | Graph property |
|---|---|
| \(1\) | Connected |
| \(k\) | \(k\) connected components |

This result is fundamental because it translates connectivity into a linear-algebraic statement.
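The component count can therefore be read off the spectrum; a minimal sketch that counts near-zero Laplacian eigenvalues (the tolerance `tol` absorbs floating-point error):

```python
import numpy as np

def num_components(A, tol=1e-8):
    """Connected components = multiplicity of the eigenvalue 0 of L = D - A."""
    L = np.diag(A.sum(axis=1)) - A
    eigvals = np.linalg.eigvalsh(L)  # real eigenvalues, ascending order
    return int(np.sum(eigvals < tol))

# Two disjoint edges 0-1 and 2-3: two connected components.
A = np.zeros((4, 4))
for i, j in [(0, 1), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
```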

## 136.8 Algebraic Connectivity

The second-smallest Laplacian eigenvalue \(\lambda_2\) is called the algebraic connectivity or Fiedler value.

It measures how well connected the graph is.

Properties include:

| Value of \(\lambda_2\) | Interpretation |
|---|---|
| \(0\) | Graph disconnected |
| Small positive | Weakly connected |
| Large | Well connected |

The corresponding eigenvector is called the Fiedler vector.

It is important in graph partitioning and clustering.
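Computing \(\lambda_2\) makes the first table row concrete: removing an edge that disconnects a graph drives the algebraic connectivity to zero. A minimal sketch:

```python
import numpy as np

def algebraic_connectivity(A):
    """Second-smallest eigenvalue of the Laplacian L = D - A."""
    L = np.diag(A.sum(axis=1)) - A
    return np.linalg.eigvalsh(L)[1]

# Connected path 0-1-2 versus the same graph with the edge (1,2) removed.
path = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
broken = np.array([[0, 1, 0], [1, 0, 0], [0, 0, 0]], dtype=float)
```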

## 136.9 Spectral Clustering

Spectral clustering uses eigenvectors of graph matrices to partition data.

The basic idea is:

1. Construct a similarity graph,
2. Form a Laplacian matrix,
3. Compute eigenvectors,
4. Embed the data into a low-dimensional spectral space,
5. Cluster in that space.

The Fiedler vector often separates the graph into two weakly connected regions.
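This separating behavior is easy to see on a "barbell" graph, two triangles joined by a single bridge edge; the sign of the Fiedler vector recovers the two triangles. A minimal NumPy sketch:

```python
import numpy as np

# Two triangles {0,1,2} and {3,4,5} joined by the bridge edge (2,3).
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
n = 6
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

L = np.diag(A.sum(axis=1)) - A
eigvals, eigvecs = np.linalg.eigh(L)
fiedler = eigvecs[:, 1]       # eigenvector for lambda_2

# Sign of the Fiedler vector gives a two-way partition.
cluster = fiedler > 0
```

The partition `cluster` puts all of {0, 1, 2} on one side and all of {3, 4, 5} on the other, cutting only the bridge.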

Spectral clustering is widely used in:

| Application | Use |
|---|---|
| Image segmentation | Pixel grouping |
| Social networks | Community detection |
| Machine learning | Unsupervised clustering |
| Data analysis | Dimensional reduction |

Linear algebra provides the embedding coordinates.

## 136.10 Normalized Laplacian

The normalized Laplacian is

$$
\mathcal{L} =
I-D^{-1/2}AD^{-1/2},
$$

defined when the graph has no isolated vertices (so that \(D\) is invertible).

It adjusts for varying vertex degrees.

Another form is the random-walk Laplacian:

$$
L_{\mathrm{rw}} =
I-D^{-1}A.
$$

Normalized Laplacians are useful when degree variation is large.

Their eigenvalues lie in the interval

$$
[0,2].
$$

Normalized forms are common in probability and machine learning.
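The definition and the eigenvalue bound can both be verified directly; a minimal sketch for the path graph (which has no isolated vertices):

```python
import numpy as np

# Normalized Laplacian of the path graph 1 - 2 - 3.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
d = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
L_norm = np.eye(3) - D_inv_sqrt @ A @ D_inv_sqrt

eigvals = np.linalg.eigvalsh(L_norm)  # all lie in [0, 2]
```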

## 136.11 Random Walks on Graphs

A random walk moves from vertex to adjacent vertex randomly.

The transition matrix is

$$
P=D^{-1}A.
$$

The entry \(P_{ij}=a_{ij}/d_i\) gives the probability of moving from vertex \(i\) to vertex \(j\).

Properties of the random walk are closely related to spectral properties of \(P\) and the Laplacian.

Examples include:

| Spectral quantity | Probabilistic meaning |
|---|---|
| Largest eigenvalue | Stationary behavior |
| Spectral gap | Mixing speed |
| Eigenvectors | Long-term structure |

Random walks connect graph theory with Markov chains.
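One concrete spectral-probabilistic link: for an undirected graph, the distribution proportional to the vertex degrees is stationary for \(P=D^{-1}A\). A minimal sketch:

```python
import numpy as np

# Transition matrix of the simple random walk on the path 1 - 2 - 3.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
d = A.sum(axis=1)
P = A / d[:, None]     # P = D^{-1} A: each row sums to 1

# Stationary distribution proportional to the degrees.
pi = d / d.sum()
```

Here `pi @ P` reproduces `pi`, confirming stationarity.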

## 136.12 Expansion

An expander graph is a sparse graph with strong connectivity properties.

Informally, every subset of vertices has many edges leaving it.

Expansion is related to the spectral gap \(\lambda_2\): a large spectral gap implies strong expansion.

Expanders are important in:

| Area | Application |
|---|---|
| Computer science | Network design |
| Complexity theory | Derandomization |
| Coding theory | Error correction |
| Distributed systems | Communication efficiency |

Spectral graph theory provides quantitative measures of expansion.

## 136.13 Cheeger's Inequality

Cheeger's inequality relates graph expansion to Laplacian eigenvalues.

The conductance of a graph measures how difficult it is to disconnect the graph.

Cheeger's inequality states, roughly, that \(\lambda_2\) controls the conductance up to multiplicative constants: for the normalized Laplacian, the conductance \(h(G)\) satisfies

$$
\frac{\lambda_2}{2}\le h(G)\le\sqrt{2\lambda_2}.
$$

Thus spectral information determines combinatorial bottlenecks.

This theorem is fundamental in spectral partitioning.

## 136.14 Eigenvectors and Geometry

Eigenvectors of graph matrices often reveal geometric structure.

For example:

| Eigenvector | Interpretation |
|---|---|
| Constant eigenvector | Global uniform mode |
| Fiedler vector | Weakest separation direction |
| Higher eigenvectors | Finer structural modes |

This resembles Fourier analysis.

Low-frequency eigenvectors vary slowly across edges.

High-frequency eigenvectors oscillate rapidly.

Thus graph eigenvectors act like discrete harmonic modes.

## 136.15 Discrete Laplacian and PDE Analogy

The graph Laplacian is analogous to the continuous Laplace operator

$$
\Delta.
$$

Continuous Laplacian:

$$
\Delta f =
\sum_i
\frac{\partial^2 f}{\partial x_i^2}.
$$

Discrete Laplacian:

$$
(Lf)(i) =
\sum_{j\sim i}(f(i)-f(j)).
$$

Both measure local variation.

This analogy explains why spectral graph theory resembles harmonic analysis and differential geometry.
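The equivalence of the matrix form \(Lf\) and the neighbor-sum form \(\sum_{j\sim i}(f(i)-f(j))\) can be checked entry by entry; a minimal sketch:

```python
import numpy as np

# Check that (L f)(i) equals the sum over neighbors j of (f(i) - f(j)).
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

f = np.array([2.0, -1.0, 5.0])   # an arbitrary function on the vertices

# Entry-wise version of the same operator: a_ij weights neighbor differences.
Lf = np.array([sum(A[i, j] * (f[i] - f[j]) for j in range(3))
               for i in range(3)])
```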

## 136.16 Heat Diffusion on Graphs

Heat flow on a graph satisfies the differential equation:

$$
\frac{du}{dt} =
-Lu.
$$

The solution is:

$$
u(t) =
e^{-tL}u(0).
$$

The matrix exponential

$$
e^{-tL}
$$

acts as a diffusion operator.

Eigenvalues determine diffusion rates:

| Eigenvalue | Effect |
|---|---|
| Small | Slow decay |
| Large | Fast decay |

Heat diffusion smooths functions on graphs.
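The solution \(u(t)=e^{-tL}u(0)\) can be computed from the eigendecomposition \(L=U\Lambda U^T\), since \(e^{-tL}=U e^{-t\Lambda}U^T\). A minimal sketch on the path graph, with all heat starting at one vertex:

```python
import numpy as np

A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

eigvals, U = np.linalg.eigh(L)   # L = U diag(eigvals) U^T

def heat(u0, t):
    """u(t) = exp(-tL) u(0), via the spectral decomposition of L."""
    return U @ (np.exp(-t * eigvals) * (U.T @ u0))

u0 = np.array([1.0, 0.0, 0.0])   # all heat starts at the first vertex
u = heat(u0, 5.0)
```

Total heat is conserved (the all-ones vector is in the kernel of \(L\)), and on a connected graph the solution converges to the constant average.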

## 136.17 Graph Fourier Transform

The eigenvectors of the Laplacian define a graph Fourier basis.

If:

$$
L=U\Lambda U^T,
$$

then the columns of \(U\) are orthonormal eigenvectors.

For a graph signal \(f\), the graph Fourier transform is:

$$
\widehat{f}=U^Tf.
$$

Low-frequency components correspond to small eigenvalues.

Graph signal processing extends Fourier analysis to irregular discrete structures.

Applications include:

| Application | Goal |
|---|---|
| Sensor networks | Signal smoothing |
| Recommendation systems | Graph filtering |
| Image processing | Structured denoising |
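Because the columns of \(U\) are orthonormal, the transform \(\widehat{f}=U^Tf\) is invertible via \(f=U\widehat{f}\) and preserves the Euclidean norm (a Parseval identity); a minimal sketch:

```python
import numpy as np

# Graph Fourier transform: expand a signal in the Laplacian eigenbasis.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

eigvals, U = np.linalg.eigh(L)   # columns of U: orthonormal eigenvectors

f = np.array([1.0, 3.0, 2.0])    # a signal on the vertices
f_hat = U.T @ f                  # forward transform
f_rec = U @ f_hat                # inverse transform
```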

## 136.18 Directed Graphs

Directed graphs lead to nonsymmetric matrices.

If:

$$
A\ne A^T,
$$

then eigenvalues may become complex.

Many properties of symmetric spectral theory fail or become more complicated.

Directed spectral theory often uses:

| Matrix | Purpose |
|---|---|
| Transition matrix | Markov chains |
| Directed Laplacian | Flow analysis |
| PageRank matrix | Web ranking |

Nonsymmetric spectral theory is generally harder than symmetric theory.

## 136.19 Random Graphs

Random graphs produce random matrices.

For example, in the Erdős–Rényi model:

$$
G(n,p),
$$

each edge appears independently with probability \(p\).

The adjacency matrix becomes a random symmetric matrix.

Spectral graph theory and random matrix theory interact strongly in this setting.

Important questions include:

| Question | Meaning |
|---|---|
| Largest eigenvalue | Connectivity strength |
| Spectral distribution | Global randomness |
| Eigenvector localization | Structural concentration |

Random graph spectra reveal hidden structure in networks.
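Sampling \(G(n,p)\) and inspecting its spectrum is straightforward; a minimal sketch (the seed `42` is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(42)

def erdos_renyi(n, p, rng):
    """Adjacency matrix of G(n, p): each edge present independently with prob p."""
    upper = rng.random((n, n)) < p
    A = np.triu(upper, k=1).astype(float)  # strict upper triangle: no loops
    return A + A.T                         # symmetrize: undirected graph

A = erdos_renyi(200, 0.1, rng)
eigvals = np.linalg.eigvalsh(A)   # spectrum of a random symmetric matrix
```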

## 136.20 Applications

Spectral graph theory has many applications.

### Computer Science

| Problem | Spectral tool |
|---|---|
| Graph partitioning | Fiedler vector |
| Web search | Eigenvector ranking |
| Recommendation systems | Spectral embeddings |

### Machine Learning

| Task | Spectral method |
|---|---|
| Clustering | Laplacian eigenvectors |
| Manifold learning | Spectral embeddings |
| Semi-supervised learning | Graph diffusion |

### Physics

| Phenomenon | Graph interpretation |
|---|---|
| Diffusion | Heat equation |
| Vibrations | Laplacian eigenmodes |
| Quantum transport | Spectral propagation |

### Network Science

| Structure | Spectral indicator |
|---|---|
| Communities | Eigenvector structure |
| Robustness | Spectral gap |
| Synchronization | Laplacian eigenvalues |

## 136.21 Summary

Spectral graph theory studies graphs using eigenvalues and eigenvectors of associated matrices.

The main ideas are:

| Concept | Meaning |
|---|---|
| Adjacency matrix | Encodes edges |
| Degree matrix | Encodes vertex degrees |
| Laplacian | Discrete differential operator |
| Spectral gap | Connectivity strength |
| Fiedler vector | Graph partition direction |
| Random walk matrix | Probabilistic dynamics |
| Expansion | Sparse but highly connected structure |
| Graph Fourier basis | Laplacian eigenvectors |
| Heat diffusion | Dynamics generated by Laplacian |
| Spectral clustering | Eigenvector-based partitioning |

Spectral graph theory converts combinatorial structure into linear algebra. Eigenvalues and eigenvectors reveal connectivity, geometry, diffusion, clustering, and randomness in discrete networks.
