# Chapter 79. Spectrum of a Graph

The spectrum of a graph is the collection of eigenvalues of a matrix associated with the graph. Most often, the matrix is the adjacency matrix or the Laplacian matrix. In this chapter, we focus first on the adjacency spectrum, then compare it with the Laplacian spectrum.

Let

$$
G=(V,E)
$$

be a finite simple undirected graph with vertex set

$$
V=\{v_1,v_2,\ldots,v_n\}.
$$

Let \(A\) be the adjacency matrix of \(G\). Since \(G\) is undirected, \(A\) is real and symmetric. Therefore all eigenvalues of \(A\) are real, and \(A\) has an orthonormal basis of eigenvectors. The study of graph properties through eigenvalues and eigenvectors of graph matrices is called spectral graph theory.

## 79.1 Eigenvalues of the Adjacency Matrix

A number \(\lambda\) is an eigenvalue of \(A\) if there is a nonzero vector \(x\) such that

$$
Ax=\lambda x.
$$

The vector \(x\) is called an eigenvector corresponding to \(\lambda\).

The adjacency spectrum of \(G\) is the multiset of eigenvalues of \(A\). Multiplicity matters. If an eigenvalue occurs as a root of the characteristic polynomial \(r\) times, then it appears in the spectrum with multiplicity \(r\).

The characteristic polynomial is

$$
p_A(t)=\det(tI-A).
$$

The eigenvalues are precisely the roots of this polynomial.
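
As a quick numerical check (a sketch assuming NumPy is available), we can compute both the eigenvalues and the characteristic polynomial coefficients for a small graph:

```python
import numpy as np

# Adjacency matrix of the triangle K_3, whose spectrum is {2, -1, -1}.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)

# Eigenvalues of a real symmetric matrix, returned in ascending order.
eigenvalues = np.linalg.eigvalsh(A)

# Coefficients of det(tI - A), highest degree first.
char_poly = np.poly(A)
```

The roots of `char_poly` agree with `eigenvalues`, illustrating that the spectrum and the characteristic polynomial carry the same information.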

## 79.2 Spectrum as a Multiset

The spectrum is written as

$$
\operatorname{Spec}(G)=\{\lambda_1,\lambda_2,\ldots,\lambda_n\},
$$

where eigenvalues are repeated according to multiplicity.

For a simple undirected graph, it is common to order the eigenvalues as

$$
\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_n.
$$

The largest eigenvalue

$$
\lambda_1
$$

is of special importance. More generally, the spectral radius of a matrix \(A\) is

$$
\rho(A)=\max_i |\lambda_i|.
$$

Since an adjacency matrix is nonnegative, \(\rho(A)=\lambda_1\), so for graphs the largest eigenvalue is itself the spectral radius. For connected graphs, it plays a central structural role.

## 79.3 Example: A Path on Three Vertices

Consider the path graph

$$
P_3: v_1-v_2-v_3.
$$

Its adjacency matrix is

$$
A=
\begin{bmatrix}
0&1&0\\
1&0&1\\
0&1&0
\end{bmatrix}.
$$

The characteristic polynomial is

$$
\det(tI-A) =
\det
\begin{bmatrix}
t&-1&0\\
-1&t&-1\\
0&-1&t
\end{bmatrix}.
$$

Expanding gives

$$
p_A(t)=t(t^2-2).
$$

Thus the eigenvalues are

$$
\sqrt{2},\quad 0,\quad -\sqrt{2}.
$$

So

$$
\operatorname{Spec}(P_3)=\{\sqrt{2},0,-\sqrt{2}\}.
$$

This example already shows a common feature of bipartite graphs: eigenvalues occur symmetrically around zero.
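
The hand computation above can be verified numerically (a sketch assuming NumPy is available):

```python
import numpy as np

# Adjacency matrix of the path P_3: v1 - v2 - v3.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)

# eigvalsh returns eigenvalues in ascending order: -sqrt(2), 0, sqrt(2).
spectrum = np.linalg.eigvalsh(A)
```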

## 79.4 Spectrum and Relabeling

The adjacency matrix depends on the chosen ordering of vertices. The spectrum does not.

If the vertices are relabeled, the adjacency matrix changes by a permutation similarity:

$$
A' = P A P^T,
$$

where \(P\) is a permutation matrix.

Since

$$
P^T=P^{-1},
$$

the matrices \(A\) and \(A'\) are similar. Similar matrices have the same characteristic polynomial and the same eigenvalues.

Therefore the spectrum is a graph invariant. Isomorphic graphs have the same spectrum. The converse can fail: non-isomorphic graphs may have the same spectrum. Such graphs are called cospectral graphs.
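
Permutation similarity is easy to demonstrate concretely (a sketch assuming NumPy is available): relabeling changes the matrix but not its eigenvalues.

```python
import numpy as np

# P_3 under the original labeling.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)

# A cyclic relabeling of the three vertices.
P = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]], dtype=float)

A_relabeled = P @ A @ P.T

# The matrices differ entrywise, but their spectra coincide.
```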

## 79.5 Trace and Eigenvalues

The trace of a matrix is the sum of its diagonal entries. It is also the sum of its eigenvalues, counted with multiplicity.

For a simple graph, the diagonal entries of \(A\) are zero. Hence

$$
\operatorname{tr}(A)=0.
$$

Therefore

$$
\lambda_1+\lambda_2+\cdots+\lambda_n=0.
$$

This gives a simple constraint on the adjacency spectrum of a loopless graph.

## 79.6 Edges and the Sum of Squares

The trace of \(A^2\) equals the sum of the squares of the eigenvalues:

$$
\operatorname{tr}(A^2)=\lambda_1^2+\lambda_2^2+\cdots+\lambda_n^2.
$$

For a simple undirected graph, the diagonal entry \((A^2)_{ii}\) equals \(\deg(v_i)\). Therefore

$$
\operatorname{tr}(A^2)=\sum_{i=1}^{n}\deg(v_i)=2|E|.
$$

Thus

$$
\lambda_1^2+\lambda_2^2+\cdots+\lambda_n^2=2|E|.
$$

The spectrum determines the number of edges.
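
Both trace identities, \(\sum_i\lambda_i=0\) and \(\sum_i\lambda_i^2=2|E|\), can be checked on a small example (a sketch assuming NumPy is available):

```python
import numpy as np

# The 4-cycle C_4: 4 vertices, 4 edges, spectrum {2, 0, 0, -2}.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

eigs = np.linalg.eigvalsh(A)
num_edges = np.sum(A) / 2   # each edge is counted twice in A

# sum of eigenvalues is 0; sum of squares is 2|E| = 8
```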

## 79.7 Triangles and the Third Power

The trace of \(A^3\) equals the sum of the cubes of the eigenvalues:

$$
\operatorname{tr}(A^3)=\lambda_1^3+\lambda_2^3+\cdots+\lambda_n^3.
$$

The diagonal entry \((A^3)_{ii}\) counts closed walks of length \(3\) starting and ending at \(v_i\). In a simple undirected graph, each triangle contributes six closed walks of length \(3\): three choices of starting vertex and two directions.

Therefore the number of triangles in \(G\) is

$$
\frac{1}{6}\operatorname{tr}(A^3) =
\frac{1}{6}\sum_{i=1}^{n}\lambda_i^3.
$$

So the adjacency spectrum determines the number of triangles.
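
The triangle-counting formula can be tested on the complete graph \(K_4\), which has \(\binom{4}{3}=4\) triangles (a sketch assuming NumPy is available):

```python
import numpy as np

# K_4 = J - I on 4 vertices; it contains exactly 4 triangles.
n = 4
A = np.ones((n, n)) - np.eye(n)

eigs = np.linalg.eigvalsh(A)

# (1/6) * sum of cubed eigenvalues counts triangles.
triangles = round(np.sum(eigs**3) / 6)
```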

## 79.8 Walk Counts

More generally, the trace of \(A^k\) counts closed walks of length \(k\):

$$
\operatorname{tr}(A^k) =
\sum_{i=1}^{n}\lambda_i^k.
$$

The entry \((A^k)_{ij}\) counts walks of length \(k\) from \(v_i\) to \(v_j\). Hence the diagonal entries count closed walks, and their sum counts all closed walks of length \(k\).

Thus the spectrum determines the number of closed walks of every length. This is one reason eigenvalues contain substantial information about graph structure.
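
The general identity \(\operatorname{tr}(A^k)=\sum_i\lambda_i^k\) can be verified for several values of \(k\) at once (a sketch assuming NumPy is available):

```python
import numpy as np

# P_3 again: closed walks of length k equal the k-th power sum of eigenvalues.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
eigs = np.linalg.eigvalsh(A)

# Closed-walk counts from matrix powers, and from eigenvalue power sums.
trace_counts = [np.trace(np.linalg.matrix_power(A, k)) for k in range(1, 6)]
power_sums = [np.sum(eigs**k) for k in range(1, 6)]
```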

## 79.9 Regular Graphs

A graph is \(d\)-regular if every vertex has degree \(d\).

Let

$$
\mathbf{1}=
\begin{bmatrix}
1\\
1\\
\vdots\\
1
\end{bmatrix}.
$$

If \(G\) is \(d\)-regular, then every row of \(A\) has sum \(d\). Therefore

$$
A\mathbf{1}=d\mathbf{1}.
$$

Thus \(d\) is an eigenvalue of \(A\).

For a connected \(d\)-regular graph, \(d\) is the largest adjacency eigenvalue, and its eigenspace is spanned by \(\mathbf{1}\). The remaining eigenvalues measure deviations from complete uniformity.
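
Both facts, that \(\mathbf{1}\) is an eigenvector and that \(d\) is the largest eigenvalue, can be confirmed on a connected regular example (a sketch assuming NumPy is available):

```python
import numpy as np

# The 5-cycle C_5 is connected and 2-regular.
n, d = 5, 2
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1

ones = np.ones(n)

# Every row sums to d, so the all-ones vector is an eigenvector for d,
# and d is the top of the spectrum.
```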

## 79.10 Complete Graphs

The complete graph \(K_n\) has adjacency matrix

$$
A=J-I,
$$

where \(J\) is the all-ones matrix.

Since

$$
J\mathbf{1}=n\mathbf{1},
$$

we have

$$
A\mathbf{1}=(J-I)\mathbf{1}=(n-1)\mathbf{1}.
$$

Thus \(n-1\) is an eigenvalue.

If \(x\) is orthogonal to \(\mathbf{1}\), then

$$
Jx=0.
$$

Therefore

$$
Ax=(J-I)x=-x.
$$

So \(-1\) is an eigenvalue with multiplicity \(n-1\). Hence

$$
\operatorname{Spec}(K_n)=\{n-1,-1,-1,\ldots,-1\}.
$$
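
The derived spectrum of \(K_n\) is easy to confirm numerically (a sketch assuming NumPy is available):

```python
import numpy as np

# K_6 via A = J - I; expected spectrum: n-1 once, -1 with multiplicity n-1.
n = 6
A = np.ones((n, n)) - np.eye(n)

eigs = np.linalg.eigvalsh(A)  # ascending order
expected = np.array([-1.0] * (n - 1) + [n - 1.0])
```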

## 79.11 Empty Graphs

The empty graph on \(n\) vertices has adjacency matrix

$$
A=0.
$$

Therefore every eigenvalue is zero:

$$
\operatorname{Spec}(\overline{K_n})=\{0,0,\ldots,0\}.
$$

This spectrum reflects the absence of edges. There are no walks of positive length, no nonzero row sums, and no adjacency structure.

## 79.12 Cycle Graphs

The cycle graph \(C_n\) has vertices arranged in a ring. Its adjacency matrix is circulant. The eigenvalues are

$$
\lambda_k = 2\cos\left(\frac{2\pi k}{n}\right),
\qquad k=0,1,\ldots,n-1.
$$

For example, for \(C_4\),

$$
\lambda_0=2,\quad
\lambda_1=0,\quad
\lambda_2=-2,\quad
\lambda_3=0.
$$

Thus

$$
\operatorname{Spec}(C_4)=\{2,0,0,-2\}.
$$

Cycle spectra are important examples because they connect graph theory with Fourier analysis.
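
The cosine formula can be compared against a direct eigenvalue computation (a sketch assuming NumPy is available):

```python
import numpy as np

# The 7-cycle: circulant adjacency matrix.
n = 7
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1

computed = np.sort(np.linalg.eigvalsh(A))
formula = np.sort([2 * np.cos(2 * np.pi * k / n) for k in range(n)])
```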

## 79.13 Bipartite Graphs

A graph is bipartite if its vertex set can be split into two parts \(X\) and \(Y\), with every edge going between the two parts.

If the vertices are ordered with all vertices in \(X\) first and all vertices in \(Y\) second, then the adjacency matrix has block form

$$
A=
\begin{bmatrix}
0&B\\
B^T&0
\end{bmatrix}.
$$

This block form implies spectral symmetry. If \(\lambda\) is an eigenvalue, then \(-\lambda\) is also an eigenvalue with the same multiplicity.

Thus the adjacency spectrum of a bipartite graph is symmetric about zero.
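
The symmetry can be observed on the complete bipartite graph \(K_{2,3}\), built directly in the block form above (a sketch assuming NumPy is available):

```python
import numpy as np

# K_{2,3} in block form [[0, B], [B^T, 0]]; spectrum: sqrt(6), 0, 0, 0, -sqrt(6).
B = np.ones((2, 3))
A = np.block([[np.zeros((2, 2)), B],
              [B.T, np.zeros((3, 3))]])

eigs = np.sort(np.linalg.eigvalsh(A))

# Negating and reversing the sorted spectrum reproduces it exactly.
```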

## 79.14 Spectral Radius

The spectral radius of the adjacency matrix is

$$
\rho(A)=\max_i |\lambda_i|.
$$

For a simple graph, \(\rho(A)\) is bounded by the maximum degree:

$$
\rho(A)\leq \Delta_{\max}.
$$

If \(G\) is \(d\)-regular, then

$$
\rho(A)=d.
$$

The spectral radius is influenced by high-degree vertices and dense substructures. In connected graphs, it is also governed by the Perron-Frobenius theorem, because the adjacency matrix is nonnegative and irreducible precisely when the graph is connected.
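
The bound \(\rho(A)\leq\Delta_{\max}\) can be strict, as the star graph shows (a sketch assuming NumPy is available):

```python
import numpy as np

# Star K_{1,4}: one center adjacent to four leaves.
n = 5
A = np.zeros((n, n))
A[0, 1:] = A[1:, 0] = 1

rho = max(abs(np.linalg.eigvalsh(A)))   # spectral radius, here sqrt(4) = 2
max_degree = A.sum(axis=1).max()        # maximum degree, here 4

# rho = 2 is strictly below the maximum degree 4.
```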

## 79.15 Perron-Frobenius Phenomenon

If \(G\) is connected, then \(A\) is an irreducible nonnegative matrix. The Perron-Frobenius theorem implies that the largest eigenvalue has an eigenvector with strictly positive entries.

This positive eigenvector is called the Perron vector.

It gives a measure of vertex importance. A vertex receives a large coordinate when it is connected to other vertices with large coordinates. This principle underlies eigenvector centrality and related network measures.
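
A small connected example illustrates the Perron vector (a sketch assuming NumPy is available): on a path, interior vertices receive larger coordinates than the leaves.

```python
import numpy as np

# Path P_4: connected, so Perron-Frobenius applies to its adjacency matrix.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

vals, vecs = np.linalg.eigh(A)
perron = vecs[:, -1]          # eigenvector for the largest eigenvalue
if perron.sum() < 0:          # eigh fixes the sign arbitrarily; make it positive
    perron = -perron

# All entries are strictly positive; interior vertices outrank the leaves.
```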

## 79.16 Laplacian Spectrum

The Laplacian matrix is

$$
L=\Delta-A,
$$

where \(\Delta\) is the diagonal matrix whose \(i\)-th diagonal entry is \(\deg(v_i)\).

Its eigenvalues are usually written

$$
0=\mu_1\leq \mu_2\leq \cdots\leq \mu_n.
$$

Unlike adjacency eigenvalues, Laplacian eigenvalues are always nonnegative because

$$
x^TLx=\sum_{\{v_i,v_j\}\in E}(x_i-x_j)^2\geq 0.
$$

The Laplacian spectrum gives information about connectivity, cuts, expansion, and spanning trees. The multiplicity of the eigenvalue \(0\) equals the number of connected components of the graph.
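
The component-counting property can be demonstrated on a disconnected graph (a sketch assuming NumPy is available):

```python
import numpy as np

# Two disjoint triangles: a graph with exactly 2 connected components.
A3 = np.ones((3, 3)) - np.eye(3)
A = np.block([[A3, np.zeros((3, 3))],
              [np.zeros((3, 3)), A3]])

D = np.diag(A.sum(axis=1))   # diagonal degree matrix
L = D - A

mu = np.linalg.eigvalsh(L)
num_components = int(np.sum(np.isclose(mu, 0)))
```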

## 79.17 Comparing Adjacency and Laplacian Spectra

The adjacency spectrum and Laplacian spectrum emphasize different aspects of the graph.

| Spectrum | Matrix | Main information |
|---|---|---|
| Adjacency spectrum | \(A\) | Walks, regularity, triangles, bipartiteness |
| Laplacian spectrum | \(L=\Delta-A\) | Connectivity, cuts, smoothness, spanning trees |
| Normalized Laplacian spectrum | \(I-\Delta^{-1/2}A\Delta^{-1/2}\) | Random walks, clustering, degree-normalized structure |

For regular graphs, the adjacency and Laplacian spectra are closely related. If \(G\) is \(d\)-regular, then

$$
L=dI-A.
$$

Therefore if \(\lambda\) is an adjacency eigenvalue, then

$$
d-\lambda
$$

is a Laplacian eigenvalue.

For irregular graphs, the relationship is less direct because \(\Delta\) is not a scalar multiple of the identity.
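
For a regular graph, the shift \(\lambda\mapsto d-\lambda\) between the two spectra can be checked directly (a sketch assuming NumPy is available):

```python
import numpy as np

# C_6 is 2-regular, so L = dI - A with d = 2.
n, d = 6, 2
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1

L = d * np.eye(n) - A
adj = np.sort(np.linalg.eigvalsh(A))
lap = np.sort(np.linalg.eigvalsh(L))

# Laplacian eigenvalues are exactly d minus the adjacency eigenvalues.
```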

## 79.18 Cospectral Graphs

Two graphs are cospectral if they have the same spectrum.

If two graphs are isomorphic, then they are cospectral. This follows from permutation similarity of their adjacency matrices.

The converse is false. There exist non-isomorphic graphs with the same adjacency spectrum. Therefore the spectrum is a graph invariant, but it is not a complete invariant. Cospectral graphs show that eigenvalues capture much structure, but not all structure.

Cospectrality can also be defined using the Laplacian matrix or normalized Laplacian matrix. Two graphs may be cospectral for one matrix and not for another.

## 79.19 Graphs Determined by Spectrum

A graph is determined by its spectrum if every graph with the same spectrum is isomorphic to it.

Some graph families have this property. Others do not. Determining which graphs are characterized by their spectra is a major theme in spectral graph theory.

This question matters because it measures how much information eigenvalues retain about a graph. If a graph is determined by its spectrum, then its eigenvalues are enough to recover its isomorphism class. If it has cospectral mates, then eigenvalues alone are insufficient.

## 79.20 Spectral Invariants

The spectrum determines several quantities.

| Quantity | Determined by adjacency spectrum? | Reason |
|---|---:|---|
| Number of vertices | Yes | Number of eigenvalues |
| Number of edges | Yes | \(\sum_i \lambda_i^2=2|E|\) |
| Number of triangles | Yes | \(\sum_i \lambda_i^3=6T\) |
| Number of closed walks of length \(k\) | Yes | \(\sum_i \lambda_i^k=\operatorname{tr}(A^k)\) |
| Bipartiteness, for connected graphs | Yes | Spectrum symmetric about zero |
| Isomorphism class | Not always | Cospectral non-isomorphic graphs exist |

This table should be read carefully. The spectrum determines closed walk counts, but it does not determine all subgraph counts or the full adjacency relation.

## 79.21 Why the Spectrum Matters

The spectrum matters because eigenvalues compress global graph information into numerical data.

A graph may have thousands or millions of vertices. Its adjacency matrix may be too large to inspect directly. Eigenvalues give a smaller summary that reflects connectivity, expansion, regularity, symmetry, and walk structure.

Spectral methods are used in graph partitioning, clustering, network analysis, chemistry, coding theory, random walks, and machine learning. They are useful because they convert combinatorial questions into linear algebraic questions.

The translation is not perfect. Eigenvalues lose information. But the information they keep is often the part most relevant to global structure.

## 79.22 Summary

The spectrum of a graph is the multiset of eigenvalues of a graph matrix, usually the adjacency matrix or the Laplacian matrix.

For the adjacency matrix \(A\), the spectrum is determined by

$$
Ax=\lambda x.
$$

For a simple undirected graph, \(A\) is real and symmetric, so its eigenvalues are real. The adjacency spectrum is invariant under relabeling and therefore invariant under graph isomorphism.

The spectrum records important information.

| Spectral fact | Graph meaning |
|---|---|
| \(\sum_i \lambda_i=0\) | The graph has no loops |
| \(\sum_i \lambda_i^2=2|E|\) | The spectrum determines the number of edges |
| \(\sum_i \lambda_i^3=6T\) | The spectrum determines the number of triangles |
| \(\operatorname{tr}(A^k)=\sum_i\lambda_i^k\) | The spectrum determines closed walk counts |
| Symmetry about zero | Characteristic of bipartite structure |
| Largest eigenvalue | Measures dense and highly connected structure |
| Cospectrality | Spectrum does not fully determine graph isomorphism |

The spectrum gives a linear algebraic shadow of the graph. It is compact, computable, and structurally meaningful, but it remains a shadow: different graphs can cast the same one.
