A graph is a mathematical object used to describe pairwise relationships.
The objects being related are called vertices. The relationships between them are called edges. Graphs appear in network analysis, computer science, chemistry, optimization, social systems, transportation, web search, and machine learning.
Linear algebra enters graph theory by encoding a graph as a matrix. Once a graph is represented by a matrix, its structure can be studied using rank, eigenvalues, eigenvectors, matrix powers, null spaces, and decompositions. This is the main idea behind algebraic and spectral graph theory, where graph properties are studied through matrices associated with the graph.
111.1 Graphs
A graph is an ordered pair

$$G = (V, E),$$

where $V$ is a set of vertices and $E$ is a set of edges.

If

$$|V| = n,$$

then the graph has $n$ vertices.
An edge joins two vertices. In an undirected graph, the edge between $u$ and $v$ has no direction. In a directed graph, the edge has an initial vertex and a terminal vertex.
For example, if

$$V = \{1, 2, 3, 4\}$$

and

$$E = \{\{1, 2\}, \{1, 3\}, \{2, 4\}\},$$

then the graph has four vertices and three undirected edges.

The edge set says that vertex $1$ is adjacent to vertices $2$ and $3$, and vertex $2$ is adjacent to vertex $4$.
111.2 Adjacency
Two vertices are adjacent if an edge connects them.
Adjacency is the basic relation in a graph. Most graph questions begin with this relation:
| Question | Meaning |
|---|---|
| Is $u$ connected to $v$? | Adjacency |
| How many neighbors does $v$ have? | Degree |
| Can $u$ reach $v$? | Connectivity |
| How many walks go from $u$ to $v$? | Matrix powers |
| How central is $v$? | Spectral information |
Linear algebra makes these questions computable by storing adjacency information in a matrix.
111.3 The Adjacency Matrix
Let $G = (V, E)$ be a graph with vertices

$$v_1, v_2, \dots, v_n.$$

The adjacency matrix of $G$ is the $n \times n$ matrix $A$ whose entries are

$$a_{ij} = \begin{cases} 1 & \text{if } v_i \text{ and } v_j \text{ are adjacent}, \\ 0 & \text{otherwise}. \end{cases}$$

For a finite simple graph, this gives a square $0$-$1$ matrix with zeros on the diagonal. For an undirected graph, the adjacency matrix is symmetric.
For the graph

$$V = \{1, 2, 3, 4\}, \qquad E = \{\{1, 2\}, \{1, 3\}, \{2, 4\}\},$$

the adjacency matrix is

$$A = \begin{pmatrix} 0 & 1 & 1 & 0 \\ 1 & 0 & 0 & 1 \\ 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{pmatrix}.$$

The entry $a_{12} = 1$ because vertices $1$ and $2$ are adjacent. The entry $a_{14} = 0$ because vertices $1$ and $4$ are not adjacent.
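As an illustrative sketch (not part of the formal development), the adjacency matrix of a small graph can be built directly from an edge list in NumPy; the four-vertex example below is assumed, with vertices 0-indexed:

```python
import numpy as np

# Assumed example: V = {1, 2, 3, 4}, edges {1,2}, {1,3}, {2,4},
# written here with 0-indexed vertex labels.
edges = [(0, 1), (0, 2), (1, 3)]
n = 4

A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = 1
    A[j, i] = 1  # undirected: store both directions, so A is symmetric

print(A)
```

The symmetric assignment inside the loop is exactly what makes the matrix of an undirected graph symmetric.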
111.4 Symmetry
For an undirected graph,

$$\{v_i, v_j\} \in E \iff \{v_j, v_i\} \in E.$$

Therefore the adjacency matrix satisfies

$$a_{ij} = a_{ji}, \qquad \text{that is,} \qquad A^{T} = A.$$

Thus the adjacency matrix of an undirected graph is symmetric.
This has important consequences. A real symmetric matrix has real eigenvalues and an orthogonal basis of eigenvectors. Therefore an undirected graph can be studied using the spectral theorem. The eigenvalues of the adjacency matrix form part of the spectrum of the graph.
For directed graphs, symmetry usually fails. If there is an edge from $v_i$ to $v_j$, but no edge from $v_j$ to $v_i$, then

$$a_{ij} = 1 \quad \text{but} \quad a_{ji} = 0.$$

In that case $A$ need not be symmetric.
111.5 Degree
The degree of a vertex in an undirected graph is the number of edges incident to it.
For a simple undirected graph, the degree of $v_i$ is

$$\deg(v_i) = \sum_{j=1}^{n} a_{ij}.$$

Thus the degree is the sum of the entries in row $i$ of the adjacency matrix.
For the example matrix

$$A = \begin{pmatrix} 0 & 1 & 1 & 0 \\ 1 & 0 & 0 & 1 \\ 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{pmatrix},$$

the degrees are

$$\deg(1) = 2, \quad \deg(2) = 2, \quad \deg(3) = 1, \quad \deg(4) = 1.$$

The degree vector is

$$d = \begin{pmatrix} 2 \\ 2 \\ 1 \\ 1 \end{pmatrix}.$$

It can be computed as

$$d = A\mathbf{1},$$

where $\mathbf{1}$ is the vector whose entries are all $1$.
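The degree computation is one matrix-vector product. A minimal NumPy sketch, using an assumed four-vertex example:

```python
import numpy as np

# Assumed example adjacency matrix (undirected, four vertices).
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 0],
              [0, 1, 0, 0]])

ones = np.ones(4)   # the all-ones vector
d = A @ ones        # degree vector: d = A * ones
print(d)            # [2. 2. 1. 1.]
```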
111.6 Directed Graphs
For a directed graph, adjacency has direction.
The adjacency matrix is usually defined by

$$a_{ij} = \begin{cases} 1 & \text{if there is an edge from } v_i \text{ to } v_j, \\ 0 & \text{otherwise}. \end{cases}$$

The row sum gives the out-degree:

$$\deg^{+}(v_i) = \sum_{j=1}^{n} a_{ij}.$$

The column sum gives the in-degree:

$$\deg^{-}(v_j) = \sum_{i=1}^{n} a_{ij}.$$
Thus directed graphs naturally distinguish between outgoing and incoming structure.
A web graph is a directed graph. A page links to another page, but the reverse link need not exist.
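A hypothetical three-page web graph makes the row/column asymmetry concrete; the link structure below is assumed for illustration:

```python
import numpy as np

# Assumed directed graph: page 0 links to pages 1 and 2;
# page 1 links to page 2; page 2 links nowhere.
A = np.array([[0, 1, 1],
              [0, 0, 1],
              [0, 0, 0]])

out_deg = A.sum(axis=1)  # row sums: outgoing links
in_deg = A.sum(axis=0)   # column sums: incoming links
print(out_deg)  # [2 1 0]
print(in_deg)   # [0 1 2]
```

Page 2 receives two links but sends none, which only a directed adjacency matrix can record.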
111.7 Weighted Graphs
In a weighted graph, edges carry numerical weights.
The adjacency matrix becomes

$$a_{ij} = \begin{cases} w_{ij} & \text{if there is an edge between } v_i \text{ and } v_j, \\ 0 & \text{otherwise}. \end{cases}$$

The weight $w_{ij}$ may represent distance, cost, strength, similarity, probability, flow capacity, or frequency.
For example,

$$A = \begin{pmatrix} 0 & 2 & 0 \\ 2 & 0 & 5 \\ 0 & 5 & 0 \end{pmatrix}$$

describes an undirected weighted graph with three vertices. The edge between vertices $1$ and $2$ has weight $2$. The edge between vertices $2$ and $3$ has weight $5$.
Weighted adjacency matrices allow graph algorithms to use numerical intensity, not only presence or absence.
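A short sketch with assumed weights (2 and 5) shows that the row-sum computation still works; for a weighted graph it yields the weighted degree, often called the strength:

```python
import numpy as np

# Assumed weighted example: edge {1,2} with weight 2,
# edge {2,3} with weight 5 (vertices 0-indexed).
A = np.array([[0.0, 2.0, 0.0],
              [2.0, 0.0, 5.0],
              [0.0, 5.0, 0.0]])

strength = A.sum(axis=1)  # weighted degree of each vertex
print(strength)  # [2. 7. 5.]
```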
111.8 Matrix Powers and Walks
One of the most important facts about adjacency matrices is that powers of count walks.
A walk of length $k$ from $v_i$ to $v_j$ is a sequence of $k$ edges starting at $v_i$ and ending at $v_j$.

The entry $(A^k)_{ij}$ equals the number of walks of length $k$ from $v_i$ to $v_j$.

For $k = 2$,

$$(A^2)_{ij} = \sum_{m=1}^{n} a_{im} a_{mj}.$$

Each nonzero term corresponds to a vertex $v_m$ such that

$$v_i \to v_m \to v_j$$

is a walk of length $2$.

For $k = 3$, the entries of $A^3$ count walks of length $3$. The same pattern continues for all positive integers $k$.
This connects graph traversal to matrix multiplication.
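A quick numerical check, on an assumed four-vertex example: a diagonal entry of $A^2$ counts closed walks of length two, which for a simple graph equals the vertex's degree.

```python
import numpy as np

# Assumed example adjacency matrix.
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 0],
              [0, 1, 0, 0]])

A2 = np.linalg.matrix_power(A, 2)

# Closed walks of length 2 at vertex 1 (index 0): out to a
# neighbor and back, one per neighbor, so this equals its degree.
print(A2[0, 0])  # 2

# One walk of length 2 from vertex 1 to vertex 4 (index 3),
# namely through vertex 2.
print(A2[0, 3])  # 1
```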
111.9 Paths, Walks, and Reachability
A walk may repeat vertices and edges. A path does not repeat vertices.
Matrix powers count walks, not necessarily paths. Still, they are useful for determining reachability.
If there exists some $k \ge 1$ such that

$$(A^k)_{ij} > 0,$$

then there is a walk from $v_i$ to $v_j$. In a finite graph, the existence of a walk implies the existence of a path.

Thus reachability can be studied through the sequence

$$A, \; A^2, \; A^3, \; \dots$$

For a graph with $n$ vertices, if $v_i$ can reach $v_j$, then there is a path of length at most $n - 1$. Therefore it is enough to examine powers up to $A^{n-1}$.
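This bound gives a direct (if inefficient) reachability test; a sketch, with the helper name `reachable` and the small two-component graph chosen for illustration:

```python
import numpy as np

def reachable(A, i, j):
    """Test whether vertex j is reachable from vertex i by
    examining the powers A, A^2, ..., A^(n-1)."""
    n = A.shape[0]
    if i == j:
        return True
    Ak = np.eye(n, dtype=int)
    for _ in range(n - 1):
        Ak = Ak @ A          # Ak is now the next power of A
        if Ak[i, j] > 0:     # some walk of this length exists
            return True
    return False

# Assumed example with two components: {0, 1} and {2}.
A = np.array([[0, 1, 0],
              [1, 0, 0],
              [0, 0, 0]])
print(reachable(A, 0, 1), reachable(A, 0, 2))  # True False
```

In practice breadth-first search is far cheaper, but the matrix-power formulation is the conceptual bridge to linear algebra.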
111.10 The Degree Matrix
The degree matrix of an undirected graph is the diagonal matrix

$$D = \operatorname{diag}\bigl(\deg(v_1), \deg(v_2), \dots, \deg(v_n)\bigr).$$

The diagonal entries are the degrees of the vertices.

For the earlier example,

$$D = \begin{pmatrix} 2 & 0 & 0 & 0 \\ 0 & 2 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}.$$
The degree matrix is often used together with the adjacency matrix to form the graph Laplacian.
111.11 The Graph Laplacian
The graph Laplacian is

$$L = D - A.$$

For an undirected graph, $L$ is symmetric.

Using the example,

$$L = D - A = \begin{pmatrix} 2 & -1 & -1 & 0 \\ -1 & 2 & 0 & -1 \\ -1 & 0 & 1 & 0 \\ 0 & -1 & 0 & 1 \end{pmatrix}.$$
The Laplacian records both adjacency and degree information. It is central in spectral graph theory, random walks, electrical networks, graph partitioning, clustering, and discrete analogues of differential operators.
111.12 Quadratic Form of the Laplacian
For an undirected graph, the Laplacian has a useful quadratic form.
If $x \in \mathbb{R}^n$, then

$$x^{T} L x = \sum_{\{i, j\} \in E} (x_i - x_j)^2.$$

This identity shows that $x^{T} L x$ measures how much the values of $x$ vary across edges.
If adjacent vertices have similar values, the sum is small. If adjacent vertices have very different values, the sum is large.
This formula explains why the Laplacian appears in smoothing, graph signal processing, clustering, and energy minimization.
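The identity is easy to verify numerically. A sketch on an assumed four-vertex example, with an arbitrary test vector:

```python
import numpy as np

# Assumed example graph: edges {1,2}, {1,3}, {2,4} (0-indexed below).
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 0],
              [0, 1, 0, 0]])
D = np.diag(A.sum(axis=1))
L = D - A

edges = [(0, 1), (0, 2), (1, 3)]
x = np.array([1.0, 2.0, 4.0, 8.0])  # arbitrary vertex values

lhs = x @ L @ x                                   # quadratic form
rhs = sum((x[i] - x[j]) ** 2 for i, j in edges)   # sum over edges
print(lhs, rhs)  # both 46.0: (1-2)^2 + (1-4)^2 + (2-8)^2
```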
111.13 Connected Components
A graph is connected if every vertex can be reached from every other vertex.
The Laplacian detects connected components.
For an undirected graph,

$$L \mathbf{1} = (D - A)\mathbf{1} = d - d = 0.$$

Thus $0$ is always an eigenvalue of $L$.

The multiplicity of the eigenvalue $0$ equals the number of connected components of the graph.

If the graph is connected, then the null space of $L$ is

$$\operatorname{null}(L) = \operatorname{span}\{\mathbf{1}\}.$$
This is a strong example of a graph property expressed as a linear algebra property.
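Counting components therefore reduces to counting zero eigenvalues of $L$; a sketch, with the helper name `num_components` and a small assumed example:

```python
import numpy as np

def num_components(A):
    """Count connected components as the multiplicity of the
    eigenvalue 0 of the Laplacian L = D - A."""
    L = np.diag(A.sum(axis=1)) - A
    eigvals = np.linalg.eigvalsh(L)  # L is real symmetric
    return int(np.sum(np.abs(eigvals) < 1e-9))

# Assumed example: an edge {0, 1} plus an isolated vertex 2,
# giving two components.
A = np.array([[0, 1, 0],
              [1, 0, 0],
              [0, 0, 0]])
print(num_components(A))  # 2
```

The numerical tolerance is needed because floating-point eigenvalues of $L$ are only approximately zero.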
111.14 Eigenvalues of the Adjacency Matrix
The eigenvalues of the adjacency matrix give information about graph structure.
If $A$ is the adjacency matrix of an undirected graph, then $A$ is real and symmetric. Therefore it has real eigenvalues

$$\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_n$$
and an orthonormal basis of eigenvectors.
The multiset of eigenvalues is called the spectrum of the graph. The spectrum is invariant under relabeling of vertices, although it does not fully determine the graph.
A large leading eigenvalue often indicates high connectivity or strong concentration of edges. Eigenvectors can reveal clusters, central vertices, or structural divisions.
111.15 Regular Graphs
A graph is $k$-regular if every vertex has degree $k$.

In a $k$-regular graph,

$$A \mathbf{1} = k \mathbf{1}.$$

Therefore $\mathbf{1}$ is an eigenvector of $A$ with eigenvalue $k$.
For regular graphs, the adjacency matrix and Laplacian are closely related:

$$L = D - A = kI - A.$$

Thus the eigenvalues of $L$ are obtained from the eigenvalues of $A$ by subtracting each from $k$.

If

$$A v = \lambda v,$$

then

$$L v = (kI - A)v = (k - \lambda)v.$$
This relation makes regular graphs especially convenient in spectral analysis.
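The 4-cycle, which is 2-regular, makes a convenient numerical check of the relation $Lv = (k - \lambda)v$; a sketch:

```python
import numpy as np

# The 4-cycle: every vertex has degree k = 2.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]])
k = 2
L = k * np.eye(4) - A  # for a k-regular graph, L = kI - A

adj_eigs = np.sort(np.linalg.eigvalsh(A))
lap_eigs = np.sort(np.linalg.eigvalsh(L))

# Each Laplacian eigenvalue is k minus an adjacency eigenvalue.
print(adj_eigs, lap_eigs)
```

For the 4-cycle the adjacency spectrum is $\{-2, 0, 0, 2\}$ and the Laplacian spectrum is $\{0, 2, 2, 4\}$, matching the shift by $k = 2$.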
111.16 Incidence Matrices
Another matrix representation of a graph is the incidence matrix.
Choose an orientation for each edge. For a graph with $n$ vertices and $m$ edges, the oriented incidence matrix $B$ is an $n \times m$ matrix.

For an edge $e$ directed from $v_i$ to $v_j$,

$$b_{ie} = -1, \qquad b_{je} = +1,$$

and all other entries in column $e$ are $0$.

For an undirected graph, the orientation is arbitrary. The Laplacian satisfies

$$L = B B^{T}.$$

This factorization shows that the Laplacian is positive semidefinite:

$$x^{T} L x = x^{T} B B^{T} x = \|B^{T} x\|^2 \ge 0.$$
Thus a structural graph matrix becomes a Gram-type matrix.
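The factorization can be verified directly; a sketch on an assumed four-vertex example, with each edge oriented from its lower-numbered endpoint:

```python
import numpy as np

# Oriented incidence matrix for the assumed graph with edges
# (0,1), (0,2), (1,3); column e has -1 at the tail, +1 at the head.
B = np.array([[-1, -1,  0],
              [ 1,  0, -1],
              [ 0,  1,  0],
              [ 0,  0,  1]])

A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 0],
              [0, 1, 0, 0]])
L = np.diag(A.sum(axis=1)) - A

print(np.array_equal(B @ B.T, L))  # True
```

Reversing any edge's orientation flips the sign of that column of $B$, which leaves $BB^T$ unchanged; this is why the orientation is arbitrary.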
111.17 Graphs as Linear Operators
An adjacency matrix can be viewed as a linear operator on functions defined on vertices.
A vector

$$x = (x_1, x_2, \dots, x_n)^{T}$$

assigns a number $x_i$ to each vertex $v_i$.

Then

$$(A x)_i = \sum_{j=1}^{n} a_{ij} x_j.$$

For an unweighted graph, this is the sum of the values of $x$ over the neighbors of $v_i$.
Thus acts by aggregating neighboring values. This interpretation is used in random walks, diffusion, message passing, graph neural networks, ranking algorithms, and spectral methods.
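A one-line sketch makes the aggregation concrete, again on an assumed four-vertex example:

```python
import numpy as np

# Assumed example graph: vertex 1 has neighbors 2 and 3,
# vertex 2 has neighbors 1 and 4 (0-indexed below).
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 0],
              [0, 1, 0, 0]])
x = np.array([10.0, 20.0, 30.0, 40.0])  # a value per vertex

# (Ax)_i sums x over the neighbors of vertex i.
print(A @ x)  # [50. 50. 10. 20.]
```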
111.18 Random Walk Matrices
For a graph with degree matrix $D$, the matrix

$$P = D^{-1} A$$

is the transition matrix of a simple random walk, assuming all degrees are positive.

The entry

$$p_{ij} = \frac{a_{ij}}{\deg(v_i)}$$

is the probability of moving from vertex $v_i$ to vertex $v_j$ in one step.

If $\pi$ is a row probability vector, then

$$\pi P$$
is the distribution after one step.
Random walk matrices connect graph theory with Markov chains.
They are used in PageRank, diffusion models, network analysis, and stochastic processes.
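A minimal sketch of one random-walk step, on an assumed four-vertex example with all degrees positive:

```python
import numpy as np

# Assumed example graph; every vertex has positive degree.
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 0],
              [0, 1, 0, 0]])
D_inv = np.diag(1.0 / A.sum(axis=1))
P = D_inv @ A  # transition matrix: each row sums to 1

pi = np.array([1.0, 0.0, 0.0, 0.0])  # walker starts at vertex 1
step = pi @ P
print(step)  # probabilities 0, 0.5, 0.5, 0: half to each neighbor
```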
111.19 Relabeling Vertices
The adjacency matrix depends on the ordering of vertices.
If the vertices are relabeled, the matrix changes. However, the graph itself remains the same.
Relabeling corresponds to multiplication by a permutation matrix.
If $P$ is a permutation matrix, then the relabeled adjacency matrix is

$$A' = P A P^{T}.$$
Matrices related in this way represent the same graph with different vertex labels.
Because $A' = PAP^{T} = PAP^{-1}$ is similar to $A$ (for a permutation matrix, $P^{T} = P^{-1}$), the two matrices have the same eigenvalues.
This explains why the spectrum is a graph invariant.
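A sketch confirming the invariance, on an assumed four-vertex example with two labels swapped:

```python
import numpy as np

# Assumed example adjacency matrix.
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 0],
              [0, 1, 0, 0]])

# Permutation matrix swapping the labels of vertices 1 and 2.
P = np.eye(4, dtype=int)[[1, 0, 2, 3]]
A_relabeled = P @ A @ P.T

eig_a = np.sort(np.linalg.eigvalsh(A))
eig_b = np.sort(np.linalg.eigvalsh(A_relabeled))
print(np.allclose(eig_a, eig_b))  # True
```

The matrices themselves differ entrywise, but their spectra agree, as similarity guarantees.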
111.20 Example: A Path Graph
Consider the path graph on four vertices:

$$v_1 - v_2 - v_3 - v_4.$$

Its adjacency matrix is

$$A = \begin{pmatrix} 0 & 1 & 0 & 0 \\ 1 & 0 & 1 & 0 \\ 0 & 1 & 0 & 1 \\ 0 & 0 & 1 & 0 \end{pmatrix}.$$

The degree matrix is

$$D = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 2 & 0 & 0 \\ 0 & 0 & 2 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}.$$

The Laplacian is

$$L = D - A = \begin{pmatrix} 1 & -1 & 0 & 0 \\ -1 & 2 & -1 & 0 \\ 0 & -1 & 2 & -1 \\ 0 & 0 & -1 & 1 \end{pmatrix}.$$
This matrix records the discrete second-difference structure of the path. It is the graph analogue of a one-dimensional Laplace operator.
111.21 Applications
Adjacency matrices give a common language for many systems.
| System | Vertices | Edges |
|---|---|---|
| Web graph | Web pages | Hyperlinks |
| Social network | People or accounts | Relationships |
| Road network | Intersections | Roads |
| Citation network | Papers | Citations |
| Molecular graph | Atoms | Bonds |
| Computer network | Machines | Connections |
| Knowledge graph | Entities | Relations |
| Recommender system | Users and items | Interactions |
Once encoded as matrices, these systems can be analyzed with linear algebra.
Common tasks include:
| Task | Linear algebra method |
|---|---|
| Ranking | Leading eigenvectors |
| Clustering | Laplacian eigenvectors |
| Connectivity | Null space of Laplacian |
| Similarity | Matrix powers and kernels |
| Diffusion | Random walk matrices |
| Compression | Low-rank approximation |
| Link prediction | Matrix factorization |
111.22 Summary
Graphs describe relationships. Adjacency matrices turn those relationships into linear algebra.
For a graph with $n$ vertices, the adjacency matrix is an $n \times n$ matrix whose entries record which vertices are adjacent. For undirected graphs, the adjacency matrix is symmetric. Its powers count walks. Its row sums give degrees. Together with the degree matrix $D$, it forms the graph Laplacian

$$L = D - A.$$
The Laplacian, adjacency matrix, incidence matrix, and random walk matrix provide complementary views of graph structure.
This chapter shows a general principle: discrete structure can be encoded as linear algebra. Once encoded, graph problems become questions about matrices, subspaces, eigenvalues, and operators.