
Chapter 136. Spectral Graph Theory

136.1 Introduction

Spectral graph theory studies graphs using matrices and their eigenvalues.

A graph is a combinatorial object made of vertices and edges. Linear algebra enters through matrices associated with the graph, such as adjacency matrices and Laplacian matrices. The eigenvalues and eigenvectors of these matrices encode structural information about connectivity, expansion, clustering, random walks, and dynamics on the graph.

The subject lies at the intersection of:

| Area | Connection |
|---|---|
| Linear algebra | Eigenvalues and eigenvectors |
| Graph theory | Combinatorial structure |
| Probability | Random walks and Markov chains |
| Computer science | Networks and algorithms |
| Geometry | Discrete analogues of manifolds |
| Physics | Diffusion and vibration |
| Data science | Spectral clustering |

Spectral graph theory translates combinatorial problems into matrix problems.

136.2 Graphs

A graph G consists of:

  1. A set of vertices V,
  2. A set of edges E.

If the graph has n vertices, one often labels them

1, 2, \ldots, n.

An edge between vertices i and j is written

(i,j).

A graph is undirected if:

(i,j)\in E \quad\Longrightarrow\quad (j,i)\in E.

Otherwise it is directed.

Throughout most of spectral graph theory, graphs are undirected unless stated otherwise.

136.3 Adjacency Matrix

The adjacency matrix of a graph with n vertices is the matrix

A = (a_{ij}),

where

a_{ij} = \begin{cases} 1, & \text{if } i \text{ and } j \text{ are adjacent},\\ 0, & \text{otherwise}. \end{cases}

For an undirected graph,

A = A^T.

Thus the adjacency matrix is symmetric, so its eigenvalues are real.

Example:

If the graph has edges

(1,2), \quad (2,3),

then

A = \begin{bmatrix} 0 & 1 & 0\\ 1 & 0 & 1\\ 0 & 1 & 0 \end{bmatrix}.

The adjacency matrix converts graph structure into linear algebra.
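As a quick check, the 3-vertex example above can be assembled with NumPy (a minimal sketch; the variable names are illustrative):

```python
import numpy as np

# Adjacency matrix of the graph with edges (1,2) and (2,3),
# stored with 0-based indices 0, 1, 2.
n = 3
edges = [(0, 1), (1, 2)]

A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = 1
    A[j, i] = 1  # undirected: record the edge in both directions

# A is symmetric, so its eigenvalues are real.
assert np.array_equal(A, A.T)
```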

136.4 Degree Matrix

The degree of a vertex is the number of edges incident to it.

If vertex i has degree d_i, then the degree matrix is

D = \operatorname{diag}(d_1, \ldots, d_n).

This is a diagonal matrix.

For the graph

1-2-3,

the degrees are:

| Vertex | Degree |
|---|---|
| 1 | 1 |
| 2 | 2 |
| 3 | 1 |

Thus

D = \begin{bmatrix} 1 & 0 & 0\\ 0 & 2 & 0\\ 0 & 0 & 1 \end{bmatrix}.

The degree matrix records local connectivity information.

136.5 Graph Laplacian

The Laplacian matrix of a graph is

L = D - A.

Explicitly:

L_{ij} = \begin{cases} d_i, & i=j,\\ -1, & i\ne j \text{ and adjacent},\\ 0, & \text{otherwise}. \end{cases}

For the path graph

1-2-3,

the Laplacian is

L = \begin{bmatrix} 1 & -1 & 0\\ -1 & 2 & -1\\ 0 & -1 & 1 \end{bmatrix}.

The Laplacian is one of the central matrices of spectral graph theory.

It acts as a discrete analogue of the continuous Laplacian operator from differential equations.
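The construction L = D - A for the path graph 1-2-3 can be carried out directly; this sketch assumes NumPy:

```python
import numpy as np

# Laplacian L = D - A of the path graph 1-2-3.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])
D = np.diag(A.sum(axis=1))  # degree matrix: row sums of A
L = D - A
```

The result matches the Laplacian displayed above.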

136.6 Basic Properties of the Laplacian

The graph Laplacian has several important properties.

Symmetry

For undirected graphs:

L = L^T.

Therefore all eigenvalues are real.

Positive Semidefiniteness

For every vector x \in \mathbb{R}^n,

x^T L x = \sum_{\{i,j\}\in E}(x_i - x_j)^2,

where the sum runs over each edge once. Thus:

x^T L x \ge 0.

So the Laplacian is positive semidefinite.

Zero Eigenvalue

The vector

\mathbf{1} = \begin{bmatrix} 1\\ 1\\ \vdots\\ 1 \end{bmatrix}

satisfies

L\mathbf{1} = 0.

Thus 0 is always an eigenvalue.
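Both properties can be checked numerically on the path graph 1-2-3 (a minimal sketch assuming NumPy; the test vector x is arbitrary):

```python
import numpy as np

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])
L = np.diag(A.sum(axis=1)) - A
edges = [(0, 1), (1, 2)]

# The quadratic form equals the sum of squared differences over edges.
x = np.array([3.0, -1.0, 2.0])
quad = x @ L @ x
edge_sum = sum((x[i] - x[j]) ** 2 for i, j in edges)
assert np.isclose(quad, edge_sum) and quad >= 0

# The all-ones vector lies in the kernel: L @ 1 = 0.
assert np.allclose(L @ np.ones(3), 0)
```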

136.7 Connectivity and Eigenvalues

The Laplacian eigenvalues encode connectivity.

Order the eigenvalues as:

0 = \lambda_1 \le \lambda_2 \le \cdots \le \lambda_n.

The multiplicity of the eigenvalue 0 equals the number of connected components of the graph.

Thus:

| Multiplicity of 0 | Graph property |
|---|---|
| 1 | Connected |
| k | k connected components |

This result is fundamental because it translates connectivity into a linear-algebraic statement.
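The statement can be tested by counting near-zero Laplacian eigenvalues; this sketch (assuming NumPy, with an illustrative helper name and tolerance) compares the connected path 1-2-3 against the same vertices with the edge (2,3) removed:

```python
import numpy as np

def zero_multiplicity(A, tol=1e-9):
    """Number of (near-)zero Laplacian eigenvalues = number of components."""
    L = np.diag(A.sum(axis=1)) - A
    eigvals = np.linalg.eigvalsh(L)  # real, ascending, since L is symmetric
    return int(np.sum(eigvals < tol))

# Connected path 1-2-3: one component.
path = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
# Edge (2,3) removed: two components, {1,2} and {3}.
split = np.array([[0, 1, 0], [1, 0, 0], [0, 0, 0]], dtype=float)
```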

136.8 Algebraic Connectivity

The second-smallest Laplacian eigenvalue

\lambda_2

is called the algebraic connectivity or Fiedler value.

It measures how well connected the graph is.

Properties include:

| Value of \lambda_2 | Interpretation |
|---|---|
| 0 | Graph disconnected |
| Small positive | Weakly connected |
| Large | Strongly connected |

The corresponding eigenvector is called the Fiedler vector.

It is important in graph partitioning and clustering.
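For the path graph on four vertices, the algebraic connectivity and Fiedler vector can be computed directly (a sketch assuming NumPy; the sign of an eigenvector is arbitrary, so only the sign pattern is meaningful):

```python
import numpy as np

# Path graph 1-2-3-4.
A = np.zeros((4, 4))
for i, j in [(0, 1), (1, 2), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

eigvals, eigvecs = np.linalg.eigh(L)  # eigenvalues in ascending order
lam2 = eigvals[1]            # algebraic connectivity (Fiedler value)
fiedler = eigvecs[:, 1]      # Fiedler vector

# The sign pattern splits the path into its two halves {1,2} and {3,4}.
partition = fiedler > 0
```

For this path, lam2 equals 2 - \sqrt{2}.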

136.9 Spectral Clustering

Spectral clustering uses eigenvectors of graph matrices to partition data.

The basic idea is:

  1. Construct a similarity graph,
  2. Form a Laplacian matrix,
  3. Compute eigenvectors,
  4. Embed the data into a low-dimensional spectral space,
  5. Cluster in that space.

The Fiedler vector often separates the graph into two weakly connected regions.
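The five-step recipe above can be sketched on a toy graph: two triangles joined by a single weak edge, clustered by the sign of the Fiedler embedding (assuming NumPy; a real pipeline would build the similarity graph from data and use k-means on several eigenvectors):

```python
import numpy as np

# Step 1: a toy similarity graph, two triangles {0,1,2} and {3,4,5}
# joined by one weak edge (2,3).
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
A = np.zeros((6, 6))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

# Step 2: Laplacian.  Steps 3-4: eigenvectors and a 1-D spectral embedding.
L = np.diag(A.sum(axis=1)) - A
_, eigvecs = np.linalg.eigh(L)
embedding = eigvecs[:, 1]          # Fiedler vector as 1-D coordinates

# Step 5: cluster in spectral space (here, simply by sign).
labels = (embedding > 0).astype(int)
```

The weak edge becomes the cut: the two triangles receive different labels.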

Spectral clustering is widely used in:

| Application | Use |
|---|---|
| Image segmentation | Pixel grouping |
| Social networks | Community detection |
| Machine learning | Unsupervised clustering |
| Data analysis | Dimensionality reduction |

Linear algebra provides the embedding coordinates.

136.10 Normalized Laplacian

The normalized Laplacian is:

\mathcal{L} = I - D^{-1/2} A D^{-1/2}.

It adjusts for varying vertex degrees.

Another form is the random-walk Laplacian:

L_{\mathrm{rw}} = I - D^{-1} A.

Normalized Laplacians are useful when degree variation is large.

Their eigenvalues lie in the interval

[0, 2].

Normalized forms are common in probability and machine learning.
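Both normalized forms can be built for the path graph 1-2-3, and the eigenvalue bound checked numerically (a sketch assuming NumPy; names are illustrative):

```python
import numpy as np

# Normalized Laplacians for the path graph 1-2-3.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
d = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))

L_sym = np.eye(3) - D_inv_sqrt @ A @ D_inv_sqrt  # symmetric normalized form
L_rw = np.eye(3) - np.diag(1.0 / d) @ A          # random-walk form

# Eigenvalues of the normalized Laplacian lie in [0, 2].
eigvals = np.linalg.eigvalsh(L_sym)
assert eigvals.min() > -1e-9 and eigvals.max() < 2 + 1e-9
```

For this path the normalized spectrum is exactly {0, 1, 2}, so the upper bound 2 is attained.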

136.11 Random Walks on Graphs

A random walk moves from vertex to adjacent vertex randomly.

The transition matrix is

P = D^{-1} A.

The entry

P_{ij}

gives the probability of moving from vertex i to vertex j.

Properties of the random walk are closely related to spectral properties of PP and the Laplacian.
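The transition matrix P = D^{-1}A is row-stochastic, as a small check on the path graph 1-2-3 shows (a sketch assuming NumPy):

```python
import numpy as np

# Random-walk transition matrix P = D^{-1} A for the path graph 1-2-3.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
P = np.diag(1.0 / A.sum(axis=1)) @ A

# Each row sums to 1: P is a valid Markov transition matrix.
assert np.allclose(P.sum(axis=1), 1.0)

# From the middle vertex the walk moves to either neighbor with probability 1/2.
assert np.allclose(P[1], [0.5, 0.0, 0.5])
```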

Examples include:

| Spectral quantity | Probabilistic meaning |
|---|---|
| Largest eigenvalue | Stationary behavior |
| Spectral gap | Mixing speed |
| Eigenvectors | Long-term structure |

Random walks connect graph theory with Markov chains.

136.12 Expansion

An expander graph is a sparse graph with strong connectivity properties.

Informally, every subset of vertices has many edges leaving it.

Expansion is related to the spectral gap:

\lambda_2.

A large spectral gap implies strong expansion.

Expanders are important in:

| Area | Application |
|---|---|
| Computer science | Network design |
| Complexity theory | Derandomization |
| Coding theory | Error correction |
| Distributed systems | Communication efficiency |

Spectral graph theory provides quantitative measures of expansion.

136.13 Cheeger’s Inequality

Cheeger’s inequality relates graph expansion to Laplacian eigenvalues.

The conductance of a graph measures how difficult it is to disconnect the graph.

Cheeger’s inequality states, roughly:

\lambda_2

controls conductance up to multiplicative constants.

Thus spectral information determines combinatorial bottlenecks.

This theorem is fundamental in spectral partitioning.

136.14 Eigenvectors and Geometry

Eigenvectors of graph matrices often reveal geometric structure.

For example:

| Eigenvector | Interpretation |
|---|---|
| Constant eigenvector | Global uniform mode |
| Fiedler vector | Weakest separation direction |
| Higher eigenvectors | Finer structural modes |

This resembles Fourier analysis.

Low-frequency eigenvectors vary slowly across edges.

High-frequency eigenvectors oscillate rapidly.

Thus graph eigenvectors act like discrete harmonic modes.

136.15 Discrete Laplacian and PDE Analogy

The graph Laplacian is analogous to the continuous Laplace operator

\Delta.

Continuous Laplacian:

\Delta f = \sum_i \frac{\partial^2 f}{\partial x_i^2}.

Discrete Laplacian:

(Lf)(i) = \sum_{j\sim i}\bigl(f(i) - f(j)\bigr).

Both measure local variation.

This analogy explains why spectral graph theory resembles harmonic analysis and differential geometry.

136.16 Heat Diffusion on Graphs

Heat flow on a graph satisfies the differential equation:

\frac{du}{dt} = -Lu.

The solution is:

u(t) = e^{-tL} u(0).

The matrix exponential

e^{-tL}

acts as a diffusion operator.

Eigenvalues determine diffusion rates:

| Eigenvalue | Effect |
|---|---|
| Small | Slow decay |
| Large | Fast decay |

Heat diffusion smooths functions on graphs.
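The solution u(t) = e^{-tL} u(0) can be computed through the eigendecomposition of L, since e^{-tL} = U \operatorname{diag}(e^{-t\lambda_i}) U^T. A sketch on the path graph 1-2-3, assuming NumPy (the helper name is illustrative):

```python
import numpy as np

# Heat diffusion u(t) = exp(-tL) u(0) on the path graph 1-2-3.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
lam, U = np.linalg.eigh(L)   # L = U diag(lam) U^T

def heat(u0, t):
    # exp(-tL) u0 = U diag(exp(-t*lam)) U^T u0
    return U @ (np.exp(-t * lam) * (U.T @ u0))

u0 = np.array([1.0, 0.0, 0.0])   # all heat starts at vertex 1
u = heat(u0, 10.0)

# Diffusion preserves total heat and smooths toward the uniform state.
assert np.isclose(u.sum(), u0.sum())
assert np.allclose(u, u.sum() / 3, atol=1e-3)
```

Only the zero-eigenvalue (constant) mode survives for large t; the modes with eigenvalues 1 and 3 decay at rates e^{-t} and e^{-3t}.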

136.17 Graph Fourier Transform

The eigenvectors of the Laplacian define a graph Fourier basis.

If:

L = U \Lambda U^T,

then the columns of U are orthonormal eigenvectors.

For a graph signal f, the graph Fourier transform is:

\widehat{f} = U^T f.

Low-frequency components correspond to small eigenvalues.

Graph signal processing extends Fourier analysis to irregular discrete structures.
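Because U is orthogonal, the transform is invertible and norm-preserving, as a sketch on the path graph 1-2-3 shows (assuming NumPy; the signal values are arbitrary):

```python
import numpy as np

# Graph Fourier transform on the path graph 1-2-3.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
lam, U = np.linalg.eigh(L)   # columns of U: orthonormal Laplacian eigenvectors

f = np.array([1.0, 2.0, 0.5])   # a signal on the vertices
f_hat = U.T @ f                 # forward graph Fourier transform
f_back = U @ f_hat              # inverse transform

# Orthonormality of U gives perfect reconstruction and a Parseval identity.
assert np.allclose(f_back, f)
assert np.isclose(np.linalg.norm(f_hat), np.linalg.norm(f))
```

Low-pass filtering on a graph amounts to shrinking the entries of f_hat associated with large eigenvalues before inverting.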

Applications include:

| Application | Goal |
|---|---|
| Sensor networks | Signal smoothing |
| Recommendation systems | Graph filtering |
| Image processing | Structured denoising |

136.18 Directed Graphs

Directed graphs lead to nonsymmetric matrices.

If:

A \ne A^T,

then eigenvalues may become complex.

Many properties of symmetric spectral theory fail or become more complicated.

Directed spectral theory often uses:

| Matrix | Purpose |
|---|---|
| Transition matrix | Markov chains |
| Directed Laplacian | Flow analysis |
| PageRank matrix | Web ranking |

Nonsymmetric spectral theory is generally harder than symmetric theory.

136.19 Random Graphs

Random graphs produce random matrices.

For example, in the Erdős-Rényi model

G(n, p),

each edge appears independently with probability p.

The adjacency matrix becomes a random symmetric matrix.

Spectral graph theory and random matrix theory interact strongly in this setting.
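Sampling such a random adjacency matrix takes a few lines (a sketch assuming NumPy; n, p, and the seed are arbitrary choices). For dense G(n, p), the largest adjacency eigenvalue concentrates near the mean degree n*p:

```python
import numpy as np

# Sample the adjacency matrix of an Erdos-Renyi graph G(n, p).
rng = np.random.default_rng(0)
n, p = 200, 0.1

upper = rng.random((n, n)) < p          # independent coin flips per pair
A = np.triu(upper, k=1).astype(float)   # keep the strict upper triangle
A = A + A.T                             # symmetrize: undirected graph

eigvals = np.linalg.eigvalsh(A)
top = eigvals[-1]   # largest eigenvalue, roughly n * p = 20 here
```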

Important questions include:

| Question | Meaning |
|---|---|
| Largest eigenvalue | Connectivity strength |
| Spectral distribution | Global randomness |
| Eigenvector localization | Structural concentration |

Random graph spectra reveal hidden structure in networks.

136.20 Applications

Spectral graph theory has many applications.

Computer Science

| Problem | Spectral tool |
|---|---|
| Graph partitioning | Fiedler vector |
| Web search | Eigenvector ranking |
| Recommendation systems | Spectral embeddings |

Machine Learning

| Task | Spectral method |
|---|---|
| Clustering | Laplacian eigenvectors |
| Manifold learning | Spectral embeddings |
| Semi-supervised learning | Graph diffusion |

Physics

| Phenomenon | Graph interpretation |
|---|---|
| Diffusion | Heat equation |
| Vibrations | Laplacian eigenmodes |
| Quantum transport | Spectral propagation |

Network Science

| Structure | Spectral indicator |
|---|---|
| Communities | Eigenvector structure |
| Robustness | Spectral gap |
| Synchronization | Laplacian eigenvalues |

136.21 Summary

Spectral graph theory studies graphs using eigenvalues and eigenvectors of associated matrices.

The main ideas are:

| Concept | Meaning |
|---|---|
| Adjacency matrix | Encodes edges |
| Degree matrix | Encodes vertex degrees |
| Laplacian | Discrete differential operator |
| Spectral gap | Connectivity strength |
| Fiedler vector | Graph partition direction |
| Random walk matrix | Probabilistic dynamics |
| Expansion | Sparse but highly connected structure |
| Graph Fourier basis | Laplacian eigenvectors |
| Heat diffusion | Dynamics generated by the Laplacian |
| Spectral clustering | Eigenvector-based partitioning |

Spectral graph theory converts combinatorial structure into linear algebra. Eigenvalues and eigenvectors reveal connectivity, geometry, diffusion, clustering, and randomness in discrete networks.