Chapter 79. Spectrum of a Graph

The spectrum of a graph is the collection of eigenvalues of a matrix associated with the graph. Most often, the matrix is the adjacency matrix or the Laplacian matrix. In this chapter, we focus first on the adjacency spectrum, then compare it with the Laplacian spectrum.

Let

G = (V, E)

be a finite simple undirected graph with vertex set

V = \{v_1, v_2, \ldots, v_n\}.

Let A be the adjacency matrix of G. Since G is undirected, A is real and symmetric. Therefore all eigenvalues of A are real, and A has an orthonormal basis of eigenvectors. The study of graph properties through eigenvalues and eigenvectors of graph matrices is called spectral graph theory.

79.1 Eigenvalues of the Adjacency Matrix

A number λ is an eigenvalue of A if there is a nonzero vector x such that

Ax = \lambda x.

The vector x is called an eigenvector corresponding to λ.

The adjacency spectrum of G is the multiset of eigenvalues of A. Multiplicity matters. If an eigenvalue occurs as a root of the characteristic polynomial r times, then it appears in the spectrum with multiplicity r.

The characteristic polynomial is

p_A(t) = \det(tI - A).

The eigenvalues are precisely the roots of this polynomial.
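As a quick numerical check, the equivalence between eigenvalues and roots of the characteristic polynomial can be verified directly. The sketch below uses NumPy (the library choice and the example graph, the triangle K_3, are assumptions of this illustration, not part of the chapter's development):

```python
import numpy as np

# Adjacency matrix of the triangle K3.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)

# eigvalsh handles real symmetric matrices and returns real
# eigenvalues in ascending order.
spectrum = np.linalg.eigvalsh(A)

# np.poly(A) gives the coefficients of det(tI - A); its roots are
# the same eigenvalues, up to floating-point error.
char_roots = np.sort(np.roots(np.poly(A)).real)
```

For K_3 both computations return the multiset -1, -1, 2.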

79.2 Spectrum as a Multiset

The spectrum is written as

\operatorname{Spec}(G) = \{\lambda_1, \lambda_2, \ldots, \lambda_n\},

where eigenvalues are repeated according to multiplicity.

For a simple undirected graph, it is common to order the eigenvalues as

\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_n.

The largest eigenvalue λ_1 is called the spectral radius when the largest absolute value among the eigenvalues equals λ_1; for adjacency matrices of graphs this always holds, because A is nonnegative. More generally, the spectral radius of A is

\rho(A) = \max_i |\lambda_i|.

For adjacency matrices of many connected graphs, the largest eigenvalue plays a central structural role.

79.3 Example: A Path on Three Vertices

Consider the path graph

P_3 : v_1 - v_2 - v_3.

Its adjacency matrix is

A = \begin{bmatrix} 0 & 1 & 0 \\ 1 & 0 & 1 \\ 0 & 1 & 0 \end{bmatrix}.

The characteristic polynomial is

\det(tI - A) = \det \begin{bmatrix} t & -1 & 0 \\ -1 & t & -1 \\ 0 & -1 & t \end{bmatrix}.

Expanding gives

p_A(t) = t(t^2 - 2).

Thus the eigenvalues are

\sqrt{2}, \quad 0, \quad -\sqrt{2}.

So

\operatorname{Spec}(P_3) = \{\sqrt{2}, 0, -\sqrt{2}\}.

This example already shows a common feature of bipartite graphs: eigenvalues occur symmetrically around zero.
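This hand computation can be checked in a few lines. The sketch below uses NumPy (an assumption of this illustration):

```python
import numpy as np

# Adjacency matrix of the path P3: v1 - v2 - v3.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)

# Ascending order: -sqrt(2), 0, sqrt(2), matching the derivation above.
eigenvalues = np.linalg.eigvalsh(A)
```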

79.4 Spectrum and Relabeling

The adjacency matrix depends on the chosen ordering of vertices. The spectrum does not.

If the vertices are relabeled, the adjacency matrix changes by a permutation similarity:

A' = P A P^T,

where P is a permutation matrix.

Since

P^T = P^{-1},

the matrices A and A' are similar. Similar matrices have the same characteristic polynomial and the same eigenvalues.

Therefore the spectrum is a graph invariant. Isomorphic graphs have the same spectrum. The converse can fail: non-isomorphic graphs may have the same spectrum. Such graphs are called cospectral graphs.
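Permutation invariance is easy to observe numerically. The sketch below (using NumPy; the particular 4-vertex graph and relabeling are arbitrary choices for illustration) compares the spectra of A and P A P^T:

```python
import numpy as np

# Adjacency matrix of a small graph on 4 vertices.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

perm = [2, 0, 3, 1]          # an arbitrary relabeling of the vertices
P = np.eye(4)[perm]          # the corresponding permutation matrix

A_relabeled = P @ A @ P.T    # A' = P A P^T

spec_original = np.linalg.eigvalsh(A)
spec_relabeled = np.linalg.eigvalsh(A_relabeled)
# The two sorted spectra coincide.
```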

79.5 Trace and Eigenvalues

The trace of a matrix is the sum of its diagonal entries. It is also the sum of its eigenvalues, counted with multiplicity.

For a simple graph, the diagonal entries of A are zero. Hence

\operatorname{tr}(A) = 0.

Therefore

\lambda_1 + \lambda_2 + \cdots + \lambda_n = 0.

This gives a simple constraint on the adjacency spectrum of a loopless graph.

79.6 Edges and the Sum of Squares

The trace of A^2 equals the sum of the squares of the eigenvalues:

\operatorname{tr}(A^2) = \lambda_1^2 + \lambda_2^2 + \cdots + \lambda_n^2.

For a simple undirected graph, the diagonal entry (A^2)_{ii} equals \deg(v_i). Therefore

\operatorname{tr}(A^2) = \sum_{i=1}^{n} \deg(v_i) = 2|E|.

Thus

\lambda_1^2 + \lambda_2^2 + \cdots + \lambda_n^2 = 2|E|.

The spectrum determines the number of edges.
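Both trace identities can be confirmed numerically. A sketch with NumPy on the 4-cycle, which has four edges and no loops (the example graph is an arbitrary choice):

```python
import numpy as np

# Adjacency matrix of C4: the 4-cycle, with 4 edges and no loops.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

eigenvalues = np.linalg.eigvalsh(A)

trace_check = eigenvalues.sum()            # should be 0 (no loops)
num_edges = (eigenvalues ** 2).sum() / 2   # should equal |E| = 4
```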

79.7 Triangles and the Third Power

The trace of A^3 equals the sum of the cubes of the eigenvalues:

\operatorname{tr}(A^3) = \lambda_1^3 + \lambda_2^3 + \cdots + \lambda_n^3.

The diagonal entry (A^3)_{ii} counts closed walks of length 3 starting and ending at v_i. In a simple undirected graph, each triangle contributes six closed walks of length 3: three choices of starting vertex and two directions.

Therefore the number of triangles in G is

\frac{1}{6}\operatorname{tr}(A^3) = \frac{1}{6}\sum_{i=1}^{n} \lambda_i^3.

So the adjacency spectrum determines the number of triangles.
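A short NumPy check (the complete graph K_4 is an arbitrary test case; it contains exactly four triangles):

```python
import numpy as np

# Adjacency matrix of the complete graph K4: J - I.
n = 4
A = np.ones((n, n)) - np.eye(n)

# Each triangle contributes 6 closed walks of length 3,
# so the triangle count is tr(A^3) / 6.
triangles = np.trace(np.linalg.matrix_power(A, 3)) / 6
```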

79.8 Walk Counts

More generally, the trace of A^k counts closed walks of length k:

\operatorname{tr}(A^k) = \sum_{i=1}^{n} \lambda_i^k.

The entry (A^k)_{ij} counts walks of length k from v_i to v_j. Hence the diagonal entries count closed walks, and their sum counts all closed walks of length k.

Thus the spectrum determines the number of closed walks of every length. This is one reason eigenvalues contain substantial information about graph structure.
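The identity tr(A^k) = Σ_i λ_i^k can be spot-checked for small k; a NumPy sketch on the triangle K_3 (an arbitrary example graph):

```python
import numpy as np

A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)   # the triangle K3

eigenvalues = np.linalg.eigvalsh(A)

# Closed walks of length k, counted two ways: by matrix powers
# and by eigenvalue power sums.
walks_trace = [np.trace(np.linalg.matrix_power(A, k)) for k in range(1, 6)]
walks_spec = [(eigenvalues ** k).sum() for k in range(1, 6)]
```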

79.9 Regular Graphs

A graph is d-regular if every vertex has degree d.

Let

\mathbf{1} = \begin{bmatrix} 1 \\ 1 \\ \vdots \\ 1 \end{bmatrix}.

If G is d-regular, then every row of A has sum d. Therefore

A\mathbf{1} = d\mathbf{1}.

Thus d is an eigenvalue of A.

For a connected d-regular graph, d is the largest adjacency eigenvalue, and its eigenspace is spanned by the all-ones vector. The remaining eigenvalues measure deviations from complete uniformity.
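A numerical illustration with the 2-regular cycle C_5 (the graph choice is arbitrary; the sketch uses NumPy):

```python
import numpy as np

# Adjacency matrix of the cycle C5, a connected 2-regular graph.
n = 5
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = 1
    A[i, (i - 1) % n] = 1

ones = np.ones(n)
row_check = A @ ones          # equals d * 1 with d = 2

eigenvalues = np.linalg.eigvalsh(A)
largest = eigenvalues[-1]     # equals d for a connected d-regular graph
```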

79.10 Complete Graphs

The complete graph K_n has adjacency matrix

A = J - I,

where J is the all-ones matrix.

Since

J\mathbf{1} = n\mathbf{1},

we have

A\mathbf{1} = (J - I)\mathbf{1} = (n-1)\mathbf{1}.

Thus n - 1 is an eigenvalue.

If x is orthogonal to the all-ones vector, then

Jx = 0.

Therefore

Ax = (J - I)x = -x.

So -1 is an eigenvalue with multiplicity n - 1. Hence

\operatorname{Spec}(K_n) = \{n-1, -1, -1, \ldots, -1\}.
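This spectrum is easy to confirm numerically; a NumPy sketch with n = 6 (the value of n is an arbitrary choice):

```python
import numpy as np

n = 6
A = np.ones((n, n)) - np.eye(n)      # adjacency matrix of K6: J - I

eigenvalues = np.linalg.eigvalsh(A)  # ascending: -1 (five times), then 5
```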

79.11 Empty Graphs

The empty graph on n vertices has adjacency matrix

A = 0.

Therefore every eigenvalue is zero:

\operatorname{Spec}(\overline{K_n}) = \{0, 0, \ldots, 0\}.

This spectrum reflects the absence of edges. There are no walks of positive length, no nonzero row sums, and no adjacency structure.

79.12 Cycle Graphs

The cycle graph C_n has vertices arranged in a ring. Its adjacency matrix is circulant. The eigenvalues are

\lambda_k = 2\cos\left(\frac{2\pi k}{n}\right), \qquad k = 0, 1, \ldots, n-1.

For example, for C_4,

\lambda_0 = 2, \quad \lambda_1 = 0, \quad \lambda_2 = -2, \quad \lambda_3 = 0.

Thus

\operatorname{Spec}(C_4) = \{2, 0, 0, -2\}.

Cycle spectra are important examples because they connect graph theory with Fourier analysis.
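The cosine formula can be checked against a direct eigenvalue computation; a NumPy sketch for n = 7 (the value of n is arbitrary):

```python
import numpy as np

n = 7
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = 1
    A[i, (i - 1) % n] = 1

computed = np.sort(np.linalg.eigvalsh(A))
predicted = np.sort(2 * np.cos(2 * np.pi * np.arange(n) / n))
# The two sorted lists agree up to floating-point error.
```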

79.13 Bipartite Graphs

A graph is bipartite if its vertex set can be split into two parts X and Y, with every edge going between the two parts.

If the vertices are ordered with all vertices in X first and all vertices in Y second, then the adjacency matrix has block form

A = \begin{bmatrix} 0 & B \\ B^T & 0 \end{bmatrix}.

This block form implies spectral symmetry: if (x, y) is an eigenvector with eigenvalue λ, then (x, -y) is an eigenvector with eigenvalue -λ. Hence if λ is an eigenvalue, then -λ is also an eigenvalue with the same multiplicity.

Thus the adjacency spectrum of a bipartite graph is symmetric about zero.
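Spectral symmetry is visible in any bipartite example; a NumPy sketch using the complete bipartite graph K_{2,3} (an arbitrary choice):

```python
import numpy as np

# K_{2,3} in block form [[0, B], [B^T, 0]] with B all-ones.
B = np.ones((2, 3))
A = np.block([[np.zeros((2, 2)), B],
              [B.T, np.zeros((3, 3))]])

eigenvalues = np.linalg.eigvalsh(A)
# The sorted spectrum equals its own reversal negated:
# symmetric about zero.
```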

79.14 Spectral Radius

The spectral radius of the adjacency matrix is

\rho(A) = \max_i |\lambda_i|.

For a simple graph, ρ(A) is bounded by the maximum degree:

\rho(A) \leq \Delta_{\max}.

If G is d-regular, then

\rho(A) = d.

The spectral radius is influenced by high-degree vertices and dense substructures. In connected graphs, it is also governed by the Perron-Frobenius theorem, because the adjacency matrix is nonnegative and irreducible precisely when the graph is connected.

79.15 Perron-Frobenius Phenomenon

If G is connected, then A is an irreducible nonnegative matrix. The Perron-Frobenius theorem implies that the largest eigenvalue has an eigenvector with strictly positive entries.

This positive eigenvector is called the Perron vector.

It gives a measure of vertex importance. A vertex receives a large coordinate when it is connected to other vertices with large coordinates. This principle underlies eigenvector centrality and related network measures.

79.16 Laplacian Spectrum

The Laplacian matrix is

L = \Delta - A,

where Δ is the diagonal matrix of vertex degrees.

Its eigenvalues are usually written

0 = \mu_1 \leq \mu_2 \leq \cdots \leq \mu_n.

Unlike adjacency eigenvalues, Laplacian eigenvalues are always nonnegative because

x^T L x = \sum_{\{v_i, v_j\} \in E} (x_i - x_j)^2 \geq 0.

The Laplacian spectrum gives information about connectivity, cuts, expansion, and spanning trees. The multiplicity of the eigenvalue 00 equals the number of connected components of the graph.
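The component-counting property can be demonstrated directly; a NumPy sketch on two disjoint triangles (the example graph is an arbitrary choice):

```python
import numpy as np

# Two disjoint triangles: a graph with 2 connected components.
tri = np.ones((3, 3)) - np.eye(3)
A = np.block([[tri, np.zeros((3, 3))],
              [np.zeros((3, 3)), tri]])

L = np.diag(A.sum(axis=1)) - A       # Laplacian L = Delta - A

mu = np.linalg.eigvalsh(L)           # nonnegative, ascending
num_components = int(np.sum(np.abs(mu) < 1e-9))
```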

79.17 Comparing Adjacency and Laplacian Spectra

The adjacency spectrum and Laplacian spectrum emphasize different aspects of the graph.

Spectrum                      | Matrix                            | Main information
Adjacency spectrum            | A                                 | Walks, regularity, triangles, bipartiteness
Laplacian spectrum            | L = \Delta - A                    | Connectivity, cuts, smoothness, spanning trees
Normalized Laplacian spectrum | I - \Delta^{-1/2} A \Delta^{-1/2} | Random walks, clustering, degree-normalized structure

For regular graphs, the adjacency and Laplacian spectra are closely related. If G is d-regular, then

L = dI - A.

Therefore, if λ is an adjacency eigenvalue, then

d - \lambda

is a Laplacian eigenvalue.
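For a concrete check, C_4 is 2-regular, so its Laplacian eigenvalues should be 2 - λ for each adjacency eigenvalue λ; a NumPy sketch:

```python
import numpy as np

# Adjacency matrix of the 4-cycle C4 (2-regular).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

adj_spec = np.linalg.eigvalsh(A)                   # -2, 0, 0, 2
lap_spec = np.linalg.eigvalsh(2 * np.eye(4) - A)   # L = dI - A with d = 2
```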

For irregular graphs, the relationship is less direct because Δ is not a scalar multiple of the identity.

79.18 Cospectral Graphs

Two graphs are cospectral if they have the same spectrum.

If two graphs are isomorphic, then they are cospectral. This follows from permutation similarity of their adjacency matrices.

The converse is false. There exist non-isomorphic graphs with the same adjacency spectrum. Therefore the spectrum is a graph invariant, but it is not a complete invariant. Cospectral graphs show that eigenvalues capture much structure, but not all structure.

Cospectrality can also be defined using the Laplacian matrix or normalized Laplacian matrix. Two graphs may be cospectral for one matrix and not for another.

79.19 Graphs Determined by Spectrum

A graph is determined by its spectrum if every graph with the same spectrum is isomorphic to it.

Some graph families have this property. Others do not. Determining which graphs are characterized by their spectra is a major theme in spectral graph theory.

This question matters because it measures how much information eigenvalues retain about a graph. If a graph is determined by its spectrum, then its eigenvalues are enough to recover its isomorphism class. If it has cospectral mates, then eigenvalues alone are insufficient.

79.20 Spectral Invariants

The spectrum determines several quantities.

Quantity                            | Determined by adjacency spectrum? | Reason
Number of vertices                  | Yes                               | Number of eigenvalues
Number of edges                     | Yes                               | \sum_i \lambda_i^2 = 2|E|
Number of triangles                 | Yes                               | \sum_i \lambda_i^3 = 6T
Number of closed walks of length k  | Yes                               | \sum_i \lambda_i^k = \operatorname{tr}(A^k)
Bipartiteness, for connected graphs | Yes                               | Spectrum symmetric about zero
Isomorphism class                   | Not always                        | Cospectral non-isomorphic graphs exist

This table should be read carefully. The spectrum determines closed walk counts, but it does not determine all subgraph counts or the full adjacency relation.

79.21 Why the Spectrum Matters

The spectrum matters because eigenvalues compress global graph information into numerical data.

A graph may have thousands or millions of vertices. Its adjacency matrix may be too large to inspect directly. Eigenvalues give a smaller summary that reflects connectivity, expansion, regularity, symmetry, and walk structure.

Spectral methods are used in graph partitioning, clustering, network analysis, chemistry, coding theory, random walks, and machine learning. They are useful because they convert combinatorial questions into linear algebraic questions.

The translation is not perfect. Eigenvalues lose information. But the information they keep is often the part most relevant to global structure.

79.22 Summary

The spectrum of a graph is the multiset of eigenvalues of a graph matrix, usually the adjacency matrix or the Laplacian matrix.

For the adjacency matrix A, the spectrum is determined by

Ax = \lambda x.

For a simple undirected graph, A is real and symmetric, so its eigenvalues are real. The adjacency spectrum is invariant under relabeling and therefore invariant under graph isomorphism.

The spectrum records important information.

Spectral fact                               | Graph meaning
\sum_i \lambda_i = 0                        | The graph has no loops
\sum_i \lambda_i^2 = 2|E|                   | The spectrum determines the number of edges
\sum_i \lambda_i^3 = 6T                     | The spectrum determines the number of triangles
\operatorname{tr}(A^k) = \sum_i \lambda_i^k | The spectrum determines closed walk counts
Symmetry about zero                         | Characteristic of bipartite structure
Largest eigenvalue                          | Measures dense and highly connected structure
Cospectrality                               | Spectrum does not fully determine graph isomorphism

The spectrum gives a linear algebraic shadow of the graph. It is compact, computable, and structurally meaningful, but it remains a shadow: different graphs can cast the same one.