136.1 Introduction
Spectral graph theory studies graphs using matrices and their eigenvalues.
A graph is a combinatorial object made of vertices and edges. Linear algebra enters through matrices associated with the graph, such as adjacency matrices and Laplacian matrices. The eigenvalues and eigenvectors of these matrices encode structural information about connectivity, expansion, clustering, random walks, and dynamics on the graph.
The subject lies at the intersection of:
| Area | Connection |
|---|---|
| Linear algebra | Eigenvalues and eigenvectors |
| Graph theory | Combinatorial structure |
| Probability | Random walks and Markov chains |
| Computer science | Networks and algorithms |
| Geometry | Discrete analogues of manifolds |
| Physics | Diffusion and vibration |
| Data science | Spectral clustering |
Spectral graph theory translates combinatorial problems into matrix problems.
136.2 Graphs
A graph $G = (V, E)$ consists of:
- A set of vertices $V$,
- A set of edges $E \subseteq V \times V$.
If the graph has $n$ vertices, one often labels them $1, 2, \dots, n$.
An edge between vertices $i$ and $j$ is written $(i, j)$.
A graph is undirected if
$$(i, j) \in E \iff (j, i) \in E.$$
Otherwise it is directed.
Throughout most of spectral graph theory, graphs are undirected unless stated otherwise.
136.3 Adjacency Matrix
The adjacency matrix of a graph with $n$ vertices is the $n \times n$ matrix $A = (A_{ij})$, where
$$A_{ij} = \begin{cases} 1 & \text{if } (i, j) \in E, \\ 0 & \text{otherwise.} \end{cases}$$
For an undirected graph,
$$A_{ij} = A_{ji}.$$
Thus the adjacency matrix is symmetric, so its eigenvalues are real.
Example:
If the graph is the path $1 - 2 - 3$, with edges $\{1, 2\}$ and $\{2, 3\}$, then
$$A = \begin{pmatrix} 0 & 1 & 0 \\ 1 & 0 & 1 \\ 0 & 1 & 0 \end{pmatrix}.$$
The adjacency matrix converts graph structure into linear algebra.
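As a quick sketch (Python with NumPy assumed), an adjacency matrix can be built directly from an edge list; the path graph $1 - 2 - 3$ serves as a small example:

```python
import numpy as np

# Path graph 1 - 2 - 3, stored 0-indexed as (0, 1) and (1, 2).
edges = [(0, 1), (1, 2)]
n = 3

A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = 1
    A[j, i] = 1  # undirected: store both directions, so A is symmetric

print(A)
```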
136.4 Degree Matrix
The degree $d_i$ of a vertex $i$ is the number of edges incident to it.
If vertex $i$ has degree $d_i$, then the degree matrix is
$$D = \operatorname{diag}(d_1, d_2, \dots, d_n).$$
This is a diagonal matrix.
For the path graph $1 - 2 - 3$, the degrees are:
| Vertex | Degree |
|---|---|
| 1 | 1 |
| 2 | 2 |
| 3 | 1 |
Thus
$$D = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 1 \end{pmatrix}.$$
The degree matrix records local connectivity information.
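A minimal sketch: the degree matrix computed from an adjacency matrix, using the path graph $1 - 2 - 3$ as the example.

```python
import numpy as np

# Adjacency matrix of the path graph 1 - 2 - 3.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])

degrees = A.sum(axis=1)  # d_i = number of neighbours of vertex i
D = np.diag(degrees)     # degree matrix: degrees on the diagonal, zeros elsewhere

print(degrees)  # [1 2 1]
```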
136.5 Graph Laplacian
The Laplacian matrix of a graph is
$$L = D - A.$$
Explicitly:
$$L_{ij} = \begin{cases} d_i & \text{if } i = j, \\ -1 & \text{if } i \ne j \text{ and } (i, j) \in E, \\ 0 & \text{otherwise.} \end{cases}$$
For the path graph $1 - 2 - 3$,
the Laplacian is
$$L = \begin{pmatrix} 1 & -1 & 0 \\ -1 & 2 & -1 \\ 0 & -1 & 1 \end{pmatrix}.$$
The Laplacian is one of the central matrices of spectral graph theory.
It acts as a discrete analogue of the continuous Laplacian operator from differential equations.
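Computed in NumPy (a sketch, again on the path graph $1 - 2 - 3$):

```python
import numpy as np

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])
D = np.diag(A.sum(axis=1))

L = D - A  # graph Laplacian: degrees on the diagonal, -1 per edge
print(L)
```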
136.6 Basic Properties of the Laplacian
The graph Laplacian has several important properties.
Symmetry
For undirected graphs:
$$L^T = L.$$
Therefore all eigenvalues are real.
Positive Semidefiniteness
For every vector $x \in \mathbb{R}^n$,
$$x^T L x = \sum_{(i,j) \in E} (x_i - x_j)^2.$$
Thus:
$$x^T L x \ge 0 \quad \text{for all } x.$$
So the Laplacian is positive semidefinite.
Zero Eigenvalue
The all-ones vector
$$\mathbf{1} = (1, 1, \dots, 1)^T$$
satisfies
$$L\mathbf{1} = 0.$$
Thus $\lambda = 0$ is always an eigenvalue.
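All three properties are easy to check numerically (a sketch assuming NumPy, on the path graph $1 - 2 - 3$):

```python
import numpy as np

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])
L = np.diag(A.sum(axis=1)) - A

# Symmetry: L equals its transpose, so eigvalsh applies (ascending order).
assert np.allclose(L, L.T)
eigvals = np.linalg.eigvalsh(L)
print(eigvals)  # ≈ [0, 1, 3]

# Positive semidefiniteness: no eigenvalue below zero (up to round-off).
assert np.all(eigvals > -1e-10)

# Zero eigenvalue: the all-ones vector is in the kernel.
assert np.allclose(L @ np.ones(3), 0)
```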
136.7 Connectivity and Eigenvalues
The Laplacian eigenvalues encode connectivity.
Order the eigenvalues as:
$$0 = \lambda_1 \le \lambda_2 \le \cdots \le \lambda_n.$$
The multiplicity of the eigenvalue $0$ equals the number of connected components of the graph.
Thus:
| Multiplicity of $\lambda = 0$ | Graph property |
|---|---|
| $1$ | Connected |
| $k$ | $k$ connected components |
This result is fundamental because it translates connectivity into a linear-algebraic statement.
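A sketch of this fact in code: counting near-zero Laplacian eigenvalues counts components.

```python
import numpy as np

def num_components(A, tol=1e-8):
    """Connected components = multiplicity of the Laplacian eigenvalue 0."""
    L = np.diag(A.sum(axis=1)) - A
    return int(np.sum(np.linalg.eigvalsh(L) < tol))

# Two disjoint edges: components {0, 1} and {2, 3}.
A2 = np.array([[0, 1, 0, 0],
               [1, 0, 0, 0],
               [0, 0, 0, 1],
               [0, 0, 1, 0]])
print(num_components(A2))  # 2

# Connected path 0 - 1 - 2.
A1 = np.array([[0, 1, 0],
               [1, 0, 1],
               [0, 1, 0]])
print(num_components(A1))  # 1
```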
136.8 Algebraic Connectivity
The second-smallest Laplacian eigenvalue $\lambda_2$ is called the algebraic connectivity or Fiedler value.
It measures how well connected the graph is.
Properties include:
| Value of $\lambda_2$ | Interpretation |
|---|---|
| $\lambda_2 = 0$ | Graph disconnected |
| Small positive | Weakly connected |
| Large | Strongly connected |
The corresponding eigenvector is called the Fiedler vector.
It is important in graph partitioning and clustering.
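A sketch: for two triangles joined by a single bridge edge (a graph chosen here for illustration), $\lambda_2$ is small because of the bottleneck, and the signs of the Fiedler vector split the graph across the bridge.

```python
import numpy as np

# Two triangles {0,1,2} and {3,4,5} joined by the bridge edge (2, 3).
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
n = 6
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1
L = np.diag(A.sum(axis=1)) - A

eigvals, eigvecs = np.linalg.eigh(L)
lambda2 = eigvals[1]        # algebraic connectivity: small, one bottleneck edge
fiedler = eigvecs[:, 1]     # Fiedler vector

print(round(lambda2, 3))             # small positive value
print(np.sign(fiedler).astype(int))  # one triangle positive, the other negative
```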
136.9 Spectral Clustering
Spectral clustering uses eigenvectors of graph matrices to partition data.
The basic idea is:
- Construct a similarity graph,
- Form a Laplacian matrix,
- Compute eigenvectors,
- Embed the data into a low-dimensional spectral space,
- Cluster in that space.
The Fiedler vector often separates the graph into two weakly connected regions.
Spectral clustering is widely used in:
| Application | Use |
|---|---|
| Image segmentation | Pixel grouping |
| Social networks | Community detection |
| Machine learning | Unsupervised clustering |
| Data analysis | Dimensional reduction |
Linear algebra provides the embedding coordinates.
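The steps above can be sketched end to end on toy data (all data and parameters here are illustrative, and the final clustering step is just a sign threshold on the Fiedler vector):

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data: two well-separated clouds of 2-D points.
pts = np.vstack([rng.normal(0.0, 0.3, (10, 2)),
                 rng.normal(5.0, 0.3, (10, 2))])

# 1. Similarity graph: Gaussian kernel weights, zero diagonal.
d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(axis=-1)
W = np.exp(-d2)
np.fill_diagonal(W, 0)

# 2. Laplacian.  3. Eigenvectors.
L = np.diag(W.sum(axis=1)) - W
_, eigvecs = np.linalg.eigh(L)

# 4.-5. Embed by the Fiedler vector and cluster by its sign.
labels = (eigvecs[:, 1] > 0).astype(int)
print(labels)
```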
136.10 Normalized Laplacian
The normalized Laplacian is:
$$\mathcal{L} = D^{-1/2} L D^{-1/2} = I - D^{-1/2} A D^{-1/2}.$$
It adjusts for varying vertex degrees.
Another form is the random-walk Laplacian:
$$L_{\mathrm{rw}} = D^{-1} L = I - D^{-1} A.$$
Normalized Laplacians are useful when degree variation is large.
Their eigenvalues lie in the interval $[0, 2]$.
Normalized forms are common in probability and machine learning.
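Both normalized forms, sketched in NumPy for the path graph $1 - 2 - 3$:

```python
import numpy as np

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
d = A.sum(axis=1)

# Symmetric normalized Laplacian: I - D^{-1/2} A D^{-1/2}.
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
L_sym = np.eye(3) - D_inv_sqrt @ A @ D_inv_sqrt

# Random-walk Laplacian: I - D^{-1} A.
L_rw = np.eye(3) - np.diag(1.0 / d) @ A

eigvals = np.linalg.eigvalsh(L_sym)
print(eigvals)  # all lie in [0, 2]
```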
136.11 Random Walks on Graphs
A random walk moves from vertex to adjacent vertex randomly.
The transition matrix is
$$P = D^{-1} A.$$
The entry
$$P_{ij} = \frac{A_{ij}}{d_i}$$
gives the probability of moving from vertex $i$ to vertex $j$.
Properties of the random walk are closely related to spectral properties of $P$ and the Laplacian.
Examples include:
| Spectral quantity | Probabilistic meaning |
|---|---|
| Largest eigenvalue | Stationary behavior |
| Spectral gap | Mixing speed |
| Eigenvectors | Long-term structure |
Random walks connect graph theory with Markov chains.
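A sketch of a random walk on the path graph; a lazy walk (stay put with probability $1/2$) is used here because the plain walk on a bipartite graph oscillates and never converges.

```python
import numpy as np

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
d = A.sum(axis=1)
P = A / d[:, None]               # transition matrix P = D^{-1} A; rows sum to 1

P_lazy = 0.5 * (np.eye(3) + P)   # lazy walk: breaks the bipartite periodicity
pi = np.array([1.0, 0.0, 0.0])   # start at vertex 1
for _ in range(200):
    pi = pi @ P_lazy             # one step of the walk

print(pi)  # ≈ [0.25, 0.5, 0.25]: stationary distribution proportional to degree
```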
136.12 Expansion
An expander graph is a sparse graph with strong connectivity properties.
Informally, every subset of vertices has many edges leaving it.
Expansion is related to the spectral gap: a large spectral gap (a large $\lambda_2$) implies strong expansion.
Expanders are important in:
| Area | Application |
|---|---|
| Computer science | Network design |
| Complexity theory | Derandomization |
| Coding theory | Error correction |
| Distributed systems | Communication efficiency |
Spectral graph theory provides quantitative measures of expansion.
136.13 Cheeger’s Inequality
Cheeger’s inequality relates graph expansion to Laplacian eigenvalues.
The conductance $\phi(G)$ of a graph measures how hard it is to cut the graph into two large pieces using few edges.
Cheeger's inequality states, roughly:
$$\frac{\lambda_2}{2} \le \phi(G) \le \sqrt{2\lambda_2},$$
where $\lambda_2$ is the second-smallest eigenvalue of the normalized Laplacian. So $\lambda_2$ controls conductance up to multiplicative constants and a square root.
In this way, spectral information determines combinatorial bottlenecks.
This theorem is fundamental in spectral partitioning.
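For a small graph the inequality can be verified directly, using $\lambda_2$ of the normalized Laplacian and a brute-force search over all vertex subsets (a sketch; feasible only for tiny graphs):

```python
import numpy as np
from itertools import combinations

# Two triangles joined by a bridge: the bridge is the obvious bottleneck.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
n = 6
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1
d = A.sum(axis=1)

# lambda_2 of the normalized Laplacian I - D^{-1/2} A D^{-1/2}.
Dis = np.diag(1.0 / np.sqrt(d))
lam2 = np.linalg.eigvalsh(np.eye(n) - Dis @ A @ Dis)[1]

def conductance(S):
    """phi(S) = (edges leaving S) / min(vol(S), vol(complement))."""
    S = set(S)
    cut = sum(1 for i, j in edges if (i in S) != (j in S))
    vol_S = d[list(S)].sum()
    return cut / min(vol_S, d.sum() - vol_S)

# Conductance of the graph: minimum over all nonempty proper subsets.
phi = min(conductance(S)
          for k in range(1, n)
          for S in combinations(range(n), k))

print(lam2 / 2, phi, np.sqrt(2 * lam2))  # lam2/2 <= phi <= sqrt(2*lam2)
```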
136.14 Eigenvectors and Geometry
Eigenvectors of graph matrices often reveal geometric structure.
For example:
| Eigenvector | Interpretation |
|---|---|
| Constant eigenvector | Global uniform mode |
| Fiedler vector | Weakest separation direction |
| Higher eigenvectors | Finer structural modes |
This resembles Fourier analysis.
Low-frequency eigenvectors vary slowly across edges.
High-frequency eigenvectors oscillate rapidly.
Thus graph eigenvectors act like discrete harmonic modes.
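The Fourier analogy is easy to see on a path graph, where (as a sketch) eigenvector $k$ behaves like a cosine mode and flips sign $k$ times along the path:

```python
import numpy as np

# Path graph on 8 vertices.
n = 8
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1
L = np.diag(A.sum(axis=1)) - A
eigvals, eigvecs = np.linalg.eigh(L)

def sign_changes(v):
    """Number of sign flips between consecutive entries along the path."""
    s = np.sign(v)
    return int(np.sum(s[:-1] * s[1:] < 0))

for k in range(n):
    # Larger eigenvalue -> more oscillation: eigenvector k flips sign k times.
    print(k, round(eigvals[k], 3), sign_changes(eigvecs[:, k]))
```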
136.15 Discrete Laplacian and PDE Analogy
The graph Laplacian is analogous to the continuous Laplace operator (up to sign: the positive-semidefinite $L$ corresponds to $-\Delta$).
Continuous Laplacian:
$$\Delta f = \sum_i \frac{\partial^2 f}{\partial x_i^2}$$
Discrete Laplacian:
$$(Lf)(i) = \sum_{j:(i,j)\in E} \big(f(i) - f(j)\big)$$
Both measure local variation.
This analogy explains why spectral graph theory resembles harmonic analysis and differential geometry.
136.16 Heat Diffusion on Graphs
Heat flow on a graph satisfies the differential equation:
$$\frac{du}{dt} = -Lu.$$
The solution is:
$$u(t) = e^{-tL} u(0).$$
The matrix exponential
$$e^{-tL} = \sum_{k=0}^{\infty} \frac{(-tL)^k}{k!}$$
acts as a diffusion operator.
Eigenvalues determine diffusion rates:
| Eigenvalue | Effect |
|---|---|
| Small | Slow decay |
| Large | Fast decay |
Heat diffusion smooths functions on graphs.
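A sketch of the solution via the eigendecomposition, so $e^{-tL}$ is applied without forming a matrix exponential explicitly:

```python
import numpy as np

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

# L = U diag(lam) U^T, so e^{-tL} u0 = U diag(e^{-t*lam}) U^T u0.
lam, U = np.linalg.eigh(L)

def heat(u0, t):
    return U @ (np.exp(-t * lam) * (U.T @ u0))

u0 = np.array([1.0, 0.0, 0.0])  # all heat starts at vertex 1
print(heat(u0, 0.1))    # slightly smoothed
print(heat(u0, 100.0))  # ≈ [1/3, 1/3, 1/3]: heat spreads out uniformly
```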
136.17 Graph Fourier Transform
The eigenvectors of the Laplacian define a graph Fourier basis.
If:
$$L = U \Lambda U^T,$$
then the columns of $U$ are orthonormal eigenvectors.
For a graph signal $f \in \mathbb{R}^n$, the graph Fourier transform is:
$$\hat{f} = U^T f.$$
Low-frequency components correspond to small eigenvalues.
Graph signal processing extends Fourier analysis to irregular discrete structures.
Applications include:
| Application | Goal |
|---|---|
| Sensor networks | Signal smoothing |
| Recommendation systems | Graph filtering |
| Image processing | Structured denoising |
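A sketch of graph-domain filtering: transform a noisy signal, zero out high-frequency coefficients (the cutoff below is an arbitrary illustrative choice), and transform back.

```python
import numpy as np

# Path graph on 8 vertices; its Laplacian eigenvectors form the Fourier basis.
n = 8
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1
lam, U = np.linalg.eigh(np.diag(A.sum(axis=1)) - A)

rng = np.random.default_rng(1)
smooth = np.linspace(0.0, 1.0, n)        # slowly varying signal
noisy = smooth + 0.1 * rng.normal(size=n)

f_hat = U.T @ noisy        # graph Fourier transform
f_hat[lam > 1.0] = 0.0     # crude low-pass filter (illustrative cutoff)
denoised = U @ f_hat       # inverse transform

print(np.round(denoised, 2))
```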
136.18 Directed Graphs
Directed graphs lead to nonsymmetric matrices.
If:
$$A \ne A^T,$$
then eigenvalues may become complex.
Many properties of symmetric spectral theory fail or become more complicated.
Directed spectral theory often uses:
| Matrix | Purpose |
|---|---|
| Transition matrix | Markov chains |
| Directed Laplacian | Flow analysis |
| PageRank matrix | Web ranking |
Nonsymmetric spectral theory is generally harder than symmetric theory.
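A minimal illustration of complex spectra: the directed 3-cycle.

```python
import numpy as np

# Directed cycle 1 -> 2 -> 3 -> 1: the adjacency matrix is not symmetric.
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]])
assert not np.array_equal(A, A.T)

eigvals = np.linalg.eigvals(A)
print(eigvals)  # the three cube roots of unity; two are genuinely complex
```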
136.19 Random Graphs
Random graphs produce random matrices.
For example, in the Erdős–Rényi model $G(n, p)$:
each edge appears independently with probability $p$.
The adjacency matrix becomes a random symmetric matrix.
Spectral graph theory and random matrix theory interact strongly in this setting.
Important questions include:
| Question | Meaning |
|---|---|
| Largest eigenvalue | Connectivity strength |
| Spectral distribution | Global randomness |
| Eigenvector localization | Structural concentration |
Random graph spectra reveal hidden structure in networks.
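A sketch of an Erdős–Rényi spectrum: the largest adjacency eigenvalue concentrates near $np$, well separated from a bulk of width roughly $2\sqrt{np(1-p)}$ (the values in the comments are rough expectations, not exact results).

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 500, 0.2

# G(n, p): each potential edge appears independently with probability p.
upper = rng.random((n, n)) < p
A = np.triu(upper, 1).astype(float)  # keep the strict upper triangle
A = A + A.T                          # symmetrize: undirected random graph

eigvals = np.linalg.eigvalsh(A)
print(eigvals[-1])  # roughly n * p = 100 (the dominant eigenvalue)
print(eigvals[-2])  # roughly 2 * sqrt(n * p * (1 - p)) ≈ 17.9 (bulk edge)
```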
136.20 Applications
Spectral graph theory has many applications.
Computer Science
| Problem | Spectral tool |
|---|---|
| Graph partitioning | Fiedler vector |
| Web search | Eigenvector ranking |
| Recommendation systems | Spectral embeddings |
Machine Learning
| Task | Spectral method |
|---|---|
| Clustering | Laplacian eigenvectors |
| Manifold learning | Spectral embeddings |
| Semi-supervised learning | Graph diffusion |
Physics
| Phenomenon | Graph interpretation |
|---|---|
| Diffusion | Heat equation |
| Vibrations | Laplacian eigenmodes |
| Quantum transport | Spectral propagation |
Network Science
| Structure | Spectral indicator |
|---|---|
| Communities | Eigenvector structure |
| Robustness | Spectral gap |
| Synchronization | Laplacian eigenvalues |
136.21 Summary
Spectral graph theory studies graphs using eigenvalues and eigenvectors of associated matrices.
The main ideas are:
| Concept | Meaning |
|---|---|
| Adjacency matrix | Encodes edges |
| Degree matrix | Encodes vertex degrees |
| Laplacian | Discrete differential operator |
| Spectral gap | Connectivity strength |
| Fiedler vector | Graph partition direction |
| Random walk matrix | Probabilistic dynamics |
| Expansion | Sparse but highly connected structure |
| Graph Fourier basis | Laplacian eigenvectors |
| Heat diffusion | Dynamics generated by Laplacian |
| Spectral clustering | Eigenvector-based partitioning |
Spectral graph theory converts combinatorial structure into linear algebra. Eigenvalues and eigenvectors reveal connectivity, geometry, diffusion, clustering, and randomness in discrete networks.