# Chapter 76. Adjacency Matrices

An adjacency matrix is a matrix representation of a finite graph. It records which pairs of vertices are joined by edges.

Let

$$
G = (V,E)
$$

be a graph with vertex set

$$
V = \{v_1, v_2, \ldots, v_n\}.
$$

The adjacency matrix of \(G\), denoted \(A(G)\), is the \(n \times n\) matrix

$$
A(G) = (a_{ij}),
$$

where

$$
a_{ij} =
\begin{cases}
1, & \text{if } v_i \text{ is adjacent to } v_j, \\
0, & \text{otherwise.}
\end{cases}
$$

For a simple graph, the diagonal entries are zero, since there are no loops. For an undirected graph, the matrix is symmetric, because adjacency does not depend on order: if \(v_i\) is adjacent to \(v_j\), then \(v_j\) is adjacent to \(v_i\).

## 76.1 From Graphs to Matrices

The main purpose of the adjacency matrix is to translate graph structure into linear algebra.

A graph is a discrete object. It consists of vertices and edges. A matrix is an algebraic object. It can be added, multiplied, factored, diagonalized, and studied through eigenvalues.

The adjacency matrix gives a bridge between these two languages. Once a graph is written as a matrix, questions about adjacency, walks, connectivity, and symmetry can be studied with matrix operations.

Consider the graph with vertices

$$
V = \{v_1,v_2,v_3,v_4\}
$$

and edges

$$
E = \{\{v_1,v_2\}, \{v_1,v_3\}, \{v_2,v_3\}, \{v_3,v_4\}\}.
$$

Then the adjacency matrix is

$$
A(G)=
\begin{bmatrix}
0 & 1 & 1 & 0 \\
1 & 0 & 1 & 0 \\
1 & 1 & 0 & 1 \\
0 & 0 & 1 & 0
\end{bmatrix}.
$$

The first row says that \(v_1\) is adjacent to \(v_2\) and \(v_3\), but not to \(v_4\). The third row says that \(v_3\) is adjacent to \(v_1\), \(v_2\), and \(v_4\).
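The construction above can be sketched in code. The following is a minimal illustration (using NumPy; the helper `adjacency_matrix` is an assumed name, not part of the text) that builds the matrix of this example from its edge list:

```python
import numpy as np

# Sketch: build the adjacency matrix of a simple undirected graph
# from a 1-indexed edge list (hypothetical helper for illustration).
def adjacency_matrix(n, edges):
    A = np.zeros((n, n), dtype=int)
    for i, j in edges:
        A[i - 1, j - 1] = 1  # record the edge in both directions,
        A[j - 1, i - 1] = 1  # since the graph is undirected
    return A

edges = [(1, 2), (1, 3), (2, 3), (3, 4)]
A = adjacency_matrix(4, edges)
print(A)   # matches A(G) displayed above
```

Row \(1\) of the printed matrix is \((0, 1, 1, 0)\), agreeing with the adjacencies of \(v_1\).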

## 76.2 Labeling of Vertices

An adjacency matrix depends on an ordering of the vertices.

If the vertices are listed as

$$
v_1, v_2, \ldots, v_n,
$$

then row \(i\) and column \(i\) both correspond to \(v_i\). A different ordering of the same vertices gives a different matrix, but it represents the same graph.

For example, suppose the vertex labels are permuted. The rows and columns of the adjacency matrix are permuted in the same way. Thus the matrix changes by simultaneous row and column permutation.

This is why the adjacency matrix represents a labeled graph directly. For an unlabeled graph, the matrix represents the graph only up to relabeling.

## 76.3 Symmetry

For an undirected graph, adjacency is symmetric:

$$
v_i \sim v_j
\quad \Longleftrightarrow \quad
v_j \sim v_i.
$$

Therefore

$$
a_{ij} = a_{ji}
$$

for all \(i,j\). Hence

$$
A(G)^T = A(G).
$$

Thus the adjacency matrix of an undirected graph is symmetric. This fact is fundamental in algebraic graph theory. A real symmetric matrix has real eigenvalues and admits an orthogonal basis of eigenvectors. Therefore many structural properties of undirected graphs can be studied spectrally.

For a directed graph, symmetry usually fails. If there is an arc from \(v_i\) to \(v_j\), there need not be an arc from \(v_j\) to \(v_i\). In that case, the adjacency matrix may satisfy

$$
a_{ij} \neq a_{ji}.
$$

Thus directed graphs naturally lead to nonsymmetric matrices.

## 76.4 Diagonal Entries

In a simple graph, no vertex is adjacent to itself. Therefore

$$
a_{ii} = 0
$$

for every \(i\). The diagonal of the adjacency matrix is zero.

If loops are allowed, then diagonal entries may be nonzero. A loop at \(v_i\) is recorded in the entry \(a_{ii}\). Different conventions are used. In some settings, a loop contributes \(1\) to \(a_{ii}\). In degree counting for undirected graphs, a loop often contributes \(2\) to the degree. The convention must be stated clearly when loops are present.

## 76.5 Directed Graphs

Let \(D\) be a directed graph with vertices

$$
V = \{v_1, v_2, \ldots, v_n\}.
$$

The adjacency matrix of \(D\) is the matrix

$$
A(D) = (a_{ij}),
$$

where

$$
a_{ij} =
\begin{cases}
1, & \text{if there is an arc from } v_i \text{ to } v_j, \\
0, & \text{otherwise.}
\end{cases}
$$

The row index records the initial vertex. The column index records the terminal vertex.

For example, if the arcs are

$$
v_1 \to v_2,\quad v_2 \to v_3,\quad v_3 \to v_1,
$$

then

$$
A(D)=
\begin{bmatrix}
0 & 1 & 0 \\
0 & 0 & 1 \\
1 & 0 & 0
\end{bmatrix}.
$$

The entry in row \(1\), column \(2\) is \(1\), because there is an arc from \(v_1\) to \(v_2\). The entry in row \(2\), column \(1\) is \(0\), because there is no arc from \(v_2\) to \(v_1\).
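The same directed 3-cycle can be encoded directly, as a minimal sketch (NumPy assumed). Only one entry is set per arc, so no symmetry arises:

```python
import numpy as np

# Sketch: the directed 3-cycle v1 -> v2 -> v3 -> v1 from the text.
# Rows index the initial vertex, columns the terminal vertex.
arcs = [(1, 2), (2, 3), (3, 1)]
A = np.zeros((3, 3), dtype=int)
for i, j in arcs:
    A[i - 1, j - 1] = 1  # one direction only: no symmetric entry

print(A[0, 1])   # 1: arc from v1 to v2
print(A[1, 0])   # 0: no arc from v2 to v1
```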

## 76.6 Multigraphs

For a multigraph, two vertices may be joined by more than one edge. The adjacency matrix records the number of edges between the vertices.

Thus

$$
a_{ij} = \text{the number of edges joining } v_i \text{ and } v_j.
$$

For a directed multigraph,

$$
a_{ij} = \text{the number of arcs from } v_i \text{ to } v_j.
$$

This generalization replaces a \(0\)-\(1\) matrix by a matrix with nonnegative integer entries.

## 76.7 Weighted Graphs

For a weighted graph, each edge carries a numerical weight. The adjacency matrix may store these weights.

If \(w_{ij}\) is the weight of the edge between \(v_i\) and \(v_j\), then

$$
a_{ij} =
\begin{cases}
w_{ij}, & \text{if } v_i \text{ and } v_j \text{ are adjacent,} \\
0, & \text{otherwise.}
\end{cases}
$$

For weighted directed graphs, \(a_{ij}\) stores the weight of the arc from \(v_i\) to \(v_j\).

In weighted graphs, the value \(0\) needs care. If zero-weight edges are allowed, then the matrix entry \(0\) can no longer mean only “no edge.” In such cases, one may use a separate adjacency relation, a special symbol, or another convention.

## 76.8 Degree and Row Sums

For a simple undirected graph, the degree of \(v_i\) is the number of vertices adjacent to \(v_i\). In the adjacency matrix, this is the sum of row \(i\):

$$
\deg(v_i) = \sum_{j=1}^{n} a_{ij}.
$$

Since the matrix is symmetric, the same number is also the sum of column \(i\):

$$
\deg(v_i) = \sum_{j=1}^{n} a_{ji}.
$$

For a directed graph, row sums and column sums have different meanings.

The out-degree of \(v_i\) is

$$
\deg^+(v_i) = \sum_{j=1}^{n} a_{ij}.
$$

The in-degree of \(v_i\) is

$$
\deg^-(v_i) = \sum_{j=1}^{n} a_{ji}.
$$

Thus rows count outgoing arcs, and columns count incoming arcs.
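These sums can be checked on the examples from earlier sections. A minimal sketch (NumPy assumed):

```python
import numpy as np

# Sketch: degrees as row sums (undirected) and in-/out-degrees as
# column/row sums (directed), on the examples from the text.
A_undirected = np.array([[0,1,1,0],[1,0,1,0],[1,1,0,1],[0,0,1,0]])
degrees = A_undirected.sum(axis=1)   # row sums = degrees
print(degrees)                       # [2 2 3 1]

A_directed = np.array([[0,1,0],[0,0,1],[1,0,0]])
out_deg = A_directed.sum(axis=1)     # row sums = out-degrees
in_deg  = A_directed.sum(axis=0)     # column sums = in-degrees
print(out_deg, in_deg)               # every vertex of the 3-cycle: 1 and 1
```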

## 76.9 Edge Count

For a simple undirected graph, each edge appears twice in the adjacency matrix: once as \(a_{ij}=1\) and once as \(a_{ji}=1\). Therefore

$$
|E| = \frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n} a_{ij}.
$$

Equivalently,

$$
\sum_{i=1}^{n} \deg(v_i) = 2|E|.
$$

This is the handshaking identity in matrix form.

For a directed graph, each arc appears once in the matrix. Therefore

$$
|E| = \sum_{i=1}^{n}\sum_{j=1}^{n} a_{ij}.
$$
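Both counting formulas are one-line computations. A minimal sketch (NumPy assumed), on the two running examples:

```python
import numpy as np

# Sketch: edge counts from entry sums.
A = np.array([[0,1,1,0],[1,0,1,0],[1,1,0,1],[0,0,1,0]])
m = A.sum() // 2        # each undirected edge is counted twice
print(m)                # 4 edges

D = np.array([[0,1,0],[0,0,1],[1,0,0]])
print(D.sum())          # 3 arcs: each arc appears exactly once
```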

## 76.10 Matrix Powers and Walks

The powers of the adjacency matrix count walks.

Let \(A\) be the adjacency matrix of a graph \(G\). Then the \((i,j)\)-entry of \(A^k\) equals the number of walks of length \(k\) from \(v_i\) to \(v_j\). This holds for directed and undirected graphs, with the direction respected in the directed case.

For \(k=1\), this is exactly the definition of adjacency. The entry \(a_{ij}\) counts walks of length \(1\) from \(v_i\) to \(v_j\).

For \(k=2\),

$$
(A^2)_{ij} =
\sum_{\ell=1}^{n} a_{i\ell}a_{\ell j}.
$$

Each product \(a_{i\ell}a_{\ell j}\) contributes \(1\) precisely when there is a walk

$$
v_i \to v_\ell \to v_j.
$$

Thus \((A^2)_{ij}\) counts all two-step walks from \(v_i\) to \(v_j\).

The same argument extends by induction to every positive integer \(k\).
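The walk-counting theorem can be verified numerically on the example graph of Section 76.1. A minimal sketch (NumPy assumed):

```python
import numpy as np

# Sketch: entries of A^k count walks of length k.
A = np.array([[0,1,1,0],[1,0,1,0],[1,1,0,1],[0,0,1,0]])

A2 = A @ A
print(A2[0, 3])   # 1: the single length-2 walk v1 -> v3 -> v4
print(A2[0, 0])   # 2: closed 2-walks at v1 equal its degree

A3 = np.linalg.matrix_power(A, 3)
print(A3[0, 0])   # closed 3-walks at v1 (round trips of the triangle)
```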

## 76.11 Connectivity

Adjacency matrices can also detect connectivity.

For an undirected graph with \(n\) vertices, two distinct vertices \(v_i\) and \(v_j\) lie in the same connected component if and only if there exists some integer \(k\) with

$$
1 \leq k \leq n-1
$$

such that

$$
(A^k)_{ij} > 0.
$$

Equivalently, the matrix

$$
A + A^2 + \cdots + A^{n-1}
$$

has a positive \((i,j)\)-entry exactly when there is a walk from \(v_i\) to \(v_j\).

The upper bound \(n-1\) is enough because if two vertices are connected, then there is a simple path between them, and a simple path in an \(n\)-vertex graph has length at most \(n-1\).
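The connectivity criterion can be sketched as follows (NumPy assumed; the two-component graph is a hypothetical example, not from the text):

```python
import numpy as np

# Sketch: reachability via A + A^2 + ... + A^(n-1).
# A positive (i, j)-entry means some walk joins v_i and v_j.
A = np.array([[0,1,0,0],
              [1,0,0,0],
              [0,0,0,1],
              [0,0,1,0]])          # two components: {v1,v2} and {v3,v4}
n = A.shape[0]
reach = sum(np.linalg.matrix_power(A, k) for k in range(1, n))
print((reach > 0)[0, 1])   # True: v1 and v2 are connected
print((reach > 0)[0, 2])   # False: v1 and v3 lie in different components
```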

## 76.12 Triangles

The adjacency matrix gives a compact formula for counting triangles in a simple undirected graph.

A triangle is a cycle of length \(3\). The number of closed walks of length \(3\) beginning and ending at \(v_i\) is the diagonal entry \((A^3)_{ii}\). Therefore the total number of closed walks of length \(3\) is

$$
\operatorname{tr}(A^3).
$$

Each triangle contributes \(6\) such walks: one for each choice of starting vertex and one for each direction around the triangle. Hence the number of triangles in a simple undirected graph is

$$
\frac{\operatorname{tr}(A^3)}{6}.
$$

This formula illustrates the power of matrix methods. A local graph pattern becomes a trace computation.
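The formula can be tested on the example graph of Section 76.1, which contains exactly one triangle, on \(v_1, v_2, v_3\). A minimal sketch (NumPy assumed):

```python
import numpy as np

# Sketch: counting triangles as tr(A^3)/6.
A = np.array([[0,1,1,0],[1,0,1,0],[1,1,0,1],[0,0,1,0]])
A3 = np.linalg.matrix_power(A, 3)
triangles = np.trace(A3) // 6
print(triangles)   # 1: the triangle on v1, v2, v3
```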

## 76.13 Isomorphism and Permutation Matrices

Adjacency matrices also express graph isomorphism.

Let \(G\) and \(H\) be graphs with adjacency matrices \(A_G\) and \(A_H\). If \(G\) and \(H\) are isomorphic, then there is a permutation matrix \(P\) such that

$$
A_H = P A_G P^{-1}.
$$

Since permutation matrices satisfy

$$
P^{-1} = P^T,
$$

this is often written as

$$
A_H = P A_G P^T.
$$

This equation says that relabeling vertices corresponds to permuting rows and columns in the same way. Therefore isomorphic graphs have adjacency matrices that differ only by a simultaneous permutation of rows and columns.

The converse also holds: if such a permutation matrix exists, then the graphs are isomorphic.
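The relabeling equation can be sketched as follows (NumPy assumed; the permutation `perm` is an arbitrary choice for illustration):

```python
import numpy as np

# Sketch: relabeling vertices is a simultaneous row and column
# permutation, A_H = P A_G P^T for a permutation matrix P.
A_G = np.array([[0,1,1,0],[1,0,1,0],[1,1,0,1],[0,0,1,0]])

perm = [2, 0, 3, 1]                # hypothetical relabeling of the vertices
P = np.eye(4, dtype=int)[perm]     # permutation matrix: permuted identity rows
A_H = P @ A_G @ P.T

# A_H describes the same graph with new labels: symmetric,
# with the same multiset of degrees as A_G.
print(sorted(A_H.sum(axis=1)))     # [1, 2, 2, 3], as for A_G
```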

## 76.14 Spectrum of a Graph

For an undirected graph, the adjacency matrix is real and symmetric. Therefore its eigenvalues are real. These eigenvalues form the adjacency spectrum of the graph.

If

$$
A x = \lambda x
$$

for a nonzero vector \(x\), then \(\lambda\) is an eigenvalue of the graph and \(x\) is an eigenvector.

The spectrum is an isomorphism invariant. If two graphs are isomorphic, then their adjacency matrices are related by a similarity transformation. Similar matrices have the same eigenvalues. Hence isomorphic graphs have the same adjacency spectrum.

The converse is false. Non-isomorphic graphs may have the same spectrum. Such graphs are called cospectral graphs.

The spectrum still carries important information. It is related to regularity, bipartiteness, expansion, walks, and many extremal properties of graphs.

## 76.15 Regular Graphs

A graph is \(d\)-regular if every vertex has degree \(d\). In matrix form, this means every row sum of \(A\) is \(d\).

Let

$$
\mathbf{1} =
\begin{bmatrix}
1 \\
1 \\
\vdots \\
1
\end{bmatrix}.
$$

If \(G\) is \(d\)-regular, then

$$
A\mathbf{1} = d\mathbf{1}.
$$

Thus \(d\) is an eigenvalue of \(A\), with eigenvector \(\mathbf{1}\).

This simple fact is the starting point for spectral graph theory. In a regular graph, the largest eigenvalue reflects the common degree, while the remaining eigenvalues describe finer structure.
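The eigenvalue relation can be checked on a small regular graph. A minimal sketch (NumPy assumed), using the 4-cycle \(C_4\), which is \(2\)-regular:

```python
import numpy as np

# Sketch: in a d-regular graph, the all-ones vector is an
# eigenvector of A with eigenvalue d.
A = np.array([[0,1,0,1],
              [1,0,1,0],
              [0,1,0,1],
              [1,0,1,0]])          # the 4-cycle C4, 2-regular
ones = np.ones(4)
print(A @ ones)                    # [2. 2. 2. 2.] = 2 * ones

eigenvalues = np.linalg.eigvalsh(A)   # real, since A is symmetric
# the largest eigenvalue equals the common degree d = 2
```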

## 76.16 Bipartite Graphs

If \(G\) is bipartite with parts \(X\) and \(Y\), and the vertices are ordered so that all vertices of \(X\) come first and all vertices of \(Y\) come second, then the adjacency matrix has block form

$$
A =
\begin{bmatrix}
0 & B \\
B^T & 0
\end{bmatrix}.
$$

The zero blocks occur because there are no edges inside \(X\) and no edges inside \(Y\). All edges go between the two parts.

The matrix \(B\) is called the biadjacency matrix. It records which vertices of \(X\) are adjacent to which vertices of \(Y\).

This block form is useful in matching theory, spectral graph theory, and the study of networks with two types of vertices.
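The block form can be assembled from a biadjacency matrix. A minimal sketch (NumPy assumed; the matrix \(B\) below is a hypothetical example with \(|X| = 2\) and \(|Y| = 3\)):

```python
import numpy as np

# Sketch: assembling A = [[0, B], [B^T, 0]] from a biadjacency matrix B.
B = np.array([[1, 0, 1],
              [0, 1, 1]])
A = np.block([[np.zeros((2, 2), dtype=int), B],
              [B.T, np.zeros((3, 3), dtype=int)]])
print(A)
# The result is symmetric, and the zero blocks confirm that
# no edges run inside X or inside Y.
```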

## 76.17 Complete Graphs

The complete graph \(K_n\) has every pair of distinct vertices adjacent. Its adjacency matrix is

$$
A(K_n) = J_n - I_n,
$$

where \(J_n\) is the \(n \times n\) all-ones matrix and \(I_n\) is the identity matrix.

Thus

$$
A(K_n)=
\begin{bmatrix}
0 & 1 & \cdots & 1 \\
1 & 0 & \cdots & 1 \\
\vdots & \vdots & \ddots & \vdots \\
1 & 1 & \cdots & 0
\end{bmatrix}.
$$

Every row has sum \(n-1\), so \(K_n\) is \((n-1)\)-regular.
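The identity \(A(K_n) = J_n - I_n\) can be checked directly. A minimal sketch (NumPy assumed), for \(n = 4\):

```python
import numpy as np

# Sketch: A(K_n) = J_n - I_n, checked for n = 4.
n = 4
A = np.ones((n, n), dtype=int) - np.eye(n, dtype=int)
print(A)
print(A.sum(axis=1))   # every row sums to n - 1 = 3
```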

## 76.18 Empty Graphs

The empty graph on \(n\) vertices has no edges. Its adjacency matrix is the zero matrix:

$$
A = 0_{n \times n}.
$$

All degrees are zero. All powers of \(A\) are also zero. There are no walks of positive length between any pair of vertices.

This is the opposite extreme from the complete graph.

## 76.19 Paths and Cycles

The path graph \(P_n\) has vertices arranged in a line:

$$
v_1 - v_2 - \cdots - v_n.
$$

Its adjacency matrix has ones immediately above and below the diagonal:

$$
A(P_n)=
\begin{bmatrix}
0 & 1 & 0 & \cdots & 0 \\
1 & 0 & 1 & \cdots & 0 \\
0 & 1 & 0 & \ddots & 0 \\
\vdots & \vdots & \ddots & \ddots & 1 \\
0 & 0 & 0 & 1 & 0
\end{bmatrix}.
$$

The cycle graph \(C_n\) has the same adjacency structure, with an additional edge between \(v_1\) and \(v_n\). Thus its adjacency matrix has two extra ones in the corner positions:

$$
a_{1n} = a_{n1} = 1.
$$

These matrices are sparse. Most entries are zero.
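Both matrices can be built from shifted identity matrices. A minimal sketch (NumPy assumed), for \(n = 5\):

```python
import numpy as np

# Sketch: A(P_n) has ones on the two off-diagonals; A(C_n) adds the corners.
n = 5
A_path = np.eye(n, k=1, dtype=int) + np.eye(n, k=-1, dtype=int)

A_cycle = A_path.copy()
A_cycle[0, n - 1] = A_cycle[n - 1, 0] = 1   # the corner entries a_1n = a_n1 = 1

print(A_path.sum(axis=1))    # endpoints have degree 1, inner vertices 2
print(A_cycle.sum(axis=1))   # every vertex of C_n has degree 2
```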

## 76.20 Storage and Computation

An adjacency matrix uses \(n^2\) entries for a graph with \(n\) vertices. This is efficient for dense graphs, where many possible edges are present. It may be wasteful for sparse graphs, where the number of edges is much smaller than \(n^2\).

The main advantage is constant-time adjacency testing. To check whether \(v_i\) and \(v_j\) are adjacent, one reads the entry \(a_{ij}\).

The main disadvantage is neighbor listing. To find all neighbors of \(v_i\), one must scan row \(i\), which takes time proportional to \(n\). In an adjacency list, listing neighbors takes time proportional to the degree of \(v_i\).

Thus the adjacency matrix is natural for dense graphs, algebraic methods, and spectral theory. The adjacency list is often better for large sparse graphs and traversal algorithms.
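The two basic operations compared above can be sketched as follows (NumPy assumed; `adjacent` and `neighbors` are illustrative helper names, using 0-indexed vertices):

```python
import numpy as np

# Sketch: adjacency testing reads one entry; neighbor listing scans a row.
A = np.array([[0,1,1,0],[1,0,1,0],[1,1,0,1],[0,0,1,0]])

def adjacent(A, i, j):
    return A[i, j] == 1          # constant-time lookup

def neighbors(A, i):
    return np.nonzero(A[i])[0]   # scans row i: time proportional to n

print(adjacent(A, 0, 1))   # True
print(neighbors(A, 2))     # neighbors of v3: vertices 0, 1, 3
```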

## 76.21 Summary

The adjacency matrix records the edge relation of a finite graph in matrix form.

For a simple graph,

$$
a_{ij}=1
$$

when \(v_i\) and \(v_j\) are adjacent, and

$$
a_{ij}=0
$$

otherwise.

For undirected graphs, the matrix is symmetric. For directed graphs, the entry \(a_{ij}\) records an arc from \(v_i\) to \(v_j\). For multigraphs, entries count parallel edges. For weighted graphs, entries may store weights.

The adjacency matrix supports several basic computations.

| Graph property | Matrix expression |
|---|---|
| Degree of \(v_i\) | Row sum \(\sum_j a_{ij}\) |
| Number of edges in simple undirected graph | \(\frac{1}{2}\sum_{i,j}a_{ij}\) |
| Walks of length \(k\) from \(v_i\) to \(v_j\) | Entry \((A^k)_{ij}\) |
| Number of triangles | \(\operatorname{tr}(A^3)/6\) |
| Relabeling vertices | \(PAP^{-1}\) |
| Regularity | \(A\mathbf{1}=d\mathbf{1}\) |

The importance of the adjacency matrix lies in this translation. It turns graph structure into linear algebra. This translation is the foundation of algebraic graph theory, spectral graph theory, and many computational methods for networks.
