Eigenvalues and eigenvectors describe special directions of a linear transformation. In graph theory, they describe special patterns on the vertices of a graph.
Let $G = (V, E)$ be a finite simple undirected graph with vertices $V = \{1, 2, \ldots, n\}$.
Let $A$ be the adjacency matrix of $G$. A nonzero vector $x \in \mathbb{R}^n$ is an eigenvector of $A$ if there is a scalar $\lambda$ such that
$$Ax = \lambda x.$$
The scalar $\lambda$ is called the corresponding eigenvalue.
In spectral graph theory, eigenvalues and eigenvectors of graph matrices, especially adjacency and Laplacian matrices, are used to study structural properties of graphs. For an undirected graph, the adjacency matrix is real and symmetric, so its eigenvalues are real and it has an orthonormal basis of eigenvectors.
80.1 Eigenvectors as Vertex Functions
A vector $x \in \mathbb{R}^n$ may be viewed as a function on the vertices of $G$. The entry $x_v$ assigns a value to vertex $v$.
The equation
$$Ax = \lambda x$$
can be read one vertex at a time. Since
$$(Ax)_v = \sum_{u \sim v} x_u,$$
the eigenvalue equation becomes
$$\sum_{u \sim v} x_u = \lambda x_v \quad \text{for every vertex } v.$$
Thus an adjacency eigenvector assigns values to vertices so that, at every vertex, the sum of neighboring values is a fixed multiple of the value at that vertex.
This is the basic local meaning of an adjacency eigenvector.
80.2 Example: The Path on Three Vertices
Consider the path $P_3$ with vertices $1 - 2 - 3$.
Its adjacency matrix is
$$A = \begin{pmatrix} 0 & 1 & 0 \\ 1 & 0 & 1 \\ 0 & 1 & 0 \end{pmatrix}.$$
Let
$$x = (1, \sqrt{2}, 1)^\top.$$
Then
$$Ax = (\sqrt{2}, 2, \sqrt{2})^\top = \sqrt{2}\, x.$$
Hence $x$ is an eigenvector with eigenvalue $\lambda = \sqrt{2}$.
The middle vertex receives the largest value. This reflects its central position in the path: both endpoints are adjacent to it, and it connects the two sides of the graph.
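As a quick numerical check, the following sketch (assuming NumPy is available) verifies the eigenvector equation for this example.

```python
# Verify the P3 example: x = (1, sqrt(2), 1) is an eigenvector with eigenvalue sqrt(2).
import numpy as np

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)   # adjacency matrix of the path 1 - 2 - 3
x = np.array([1.0, np.sqrt(2), 1.0])

print(A @ x)                                # [1.414..., 2.0, 1.414...]
print(np.allclose(A @ x, np.sqrt(2) * x))   # True: Ax = sqrt(2) x
```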
80.3 Scaling of Eigenvectors
If $x$ is an eigenvector with eigenvalue $\lambda$, then every nonzero scalar multiple of $x$ is also an eigenvector with eigenvalue $\lambda$.
Indeed, for any scalar $c \neq 0$,
$$A(cx) = c\,Ax = c\lambda x = \lambda (cx).$$
Therefore an eigenvector is not unique as a single vector. It represents a direction in vector space.
For this reason, eigenvectors are often normalized. A common convention is
$$\|x\|_2 = 1.$$
The normalization changes the length of the vector, but not its eigendirection.
80.4 Eigenspaces
The eigenspace corresponding to an eigenvalue $\lambda$ is
$$E_\lambda = \{\, x \in \mathbb{R}^n : Ax = \lambda x \,\}.$$
This set includes the zero vector, even though the zero vector is not called an eigenvector.
Equivalently,
$$E_\lambda = \ker(A - \lambda I).$$
Thus the eigenspace is a subspace of $\mathbb{R}^n$.
The dimension of $E_\lambda$ is the geometric multiplicity of $\lambda$. It counts the number of independent eigenvectors associated with $\lambda$.
80.5 Orthogonality
For a simple undirected graph, the adjacency matrix is symmetric: $A^\top = A$.
A real symmetric matrix has orthogonal eigenspaces for distinct eigenvalues. Therefore, if
$$Ax = \lambda x \quad \text{and} \quad Ay = \mu y \quad \text{with } \lambda \neq \mu,$$
then
$$x^\top y = 0.$$
This orthogonality gives a clean decomposition of vertex functions. Any vector on the vertices can be written as a sum of eigenvector components.
This is one reason spectral graph theory is so useful: it gives a coordinate system adapted to the graph.
80.6 Diagonalization
Since $A$ is real and symmetric, it can be diagonalized orthogonally. There is an orthogonal matrix $Q$ and a diagonal matrix $\Lambda$ such that
$$A = Q \Lambda Q^\top.$$
The columns of $Q$ are orthonormal eigenvectors of $A$. The diagonal entries of $\Lambda$ are the corresponding eigenvalues.
If
$$A = Q \Lambda Q^\top,$$
then many powers and functions of $A$ become simple:
$$A^k = Q \Lambda^k Q^\top, \qquad f(A) = Q f(\Lambda) Q^\top.$$
This formula explains why eigenvalues control walk counts, diffusion, random walks, and many iterative processes on graphs.
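A minimal sketch of this decomposition in NumPy, using the path $P_3$ from the earlier example; `numpy.linalg.eigh` is used because $A$ is real symmetric.

```python
# Orthogonal diagonalization A = Q Lam Q^T and the power formula A^k = Q Lam^k Q^T.
import numpy as np

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)

evals, Q = np.linalg.eigh(A)        # eigh: real eigenvalues, orthonormal eigenvector columns
Lam = np.diag(evals)

print(np.allclose(A, Q @ Lam @ Q.T))                  # A = Q Lam Q^T
print(np.allclose(np.linalg.matrix_power(A, 4),
                  Q @ np.diag(evals**4) @ Q.T))       # A^4 = Q Lam^4 Q^T
```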
80.7 Eigenvalues and Walk Growth
The entry $(A^k)_{uv}$ counts walks of length $k$ from $u$ to $v$. Since
$$A^k = Q \Lambda^k Q^\top,$$
the eigenvalues determine how powers of $A$ grow.
Large eigenvalues dominate high powers. If $\lambda_1$ is the largest eigenvalue in absolute value, then terms involving $\lambda_1^k$ dominate as $k$ becomes large, unless canceled by orthogonality or symmetry.
Thus the largest eigenvalues describe large-scale walk behavior.
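The walk-counting interpretation is easy to check numerically. The sketch below (again on $P_3$, assuming NumPy) compares an entry of $A^k$ with a brute-force enumeration; the helper `count_walks` is illustrative, not a standard library function.

```python
# Compare (A^k)_{uv} with a brute-force count of walks of length k from u to v.
import numpy as np
from itertools import product

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])

def count_walks(A, k, u, v):
    """Count walks of length k from u to v by enumerating intermediate vertices."""
    n = A.shape[0]
    total = 0
    for mid in product(range(n), repeat=k - 1):
        seq = (u, *mid, v)
        if all(A[seq[i], seq[i + 1]] for i in range(k)):
            total += 1
    return total

k = 4
print(np.linalg.matrix_power(A, k)[0, 2], count_walks(A, k, 0, 2))  # both equal 2
```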
80.8 The Largest Eigenvalue
For a connected graph, the adjacency matrix is nonnegative and irreducible. The Perron-Frobenius theorem implies that the largest adjacency eigenvalue has an eigenvector whose entries are strictly positive. This eigenvector is called the Perron vector.
The largest eigenvalue is often denoted $\lambda_1$ or $\lambda_{\max}$.
For a $d$-regular connected graph,
$$\lambda_1 = d.$$
The all-ones vector $\mathbf{1}$ is an eigenvector:
$$A \mathbf{1} = d\, \mathbf{1}.$$
For irregular graphs, the eigenvector for the largest eigenvalue often assigns larger values to vertices that are connected to other important vertices. This is the basis of eigenvector centrality.
80.9 Eigenvector Centrality
Eigenvector centrality assigns a score to each vertex so that a vertex is important when it is adjacent to important vertices.
The defining equation is
$$Ax = \lambda_1 x, \qquad x > 0.$$
At vertex $v$, this says
$$\lambda_1 x_v = \sum_{u \sim v} x_u.$$
Thus $x_v$ is proportional to the sum of the scores of the neighbors of $v$.
The principal eigenvector, when positive, gives a natural centrality measure. It differs from degree centrality. Degree centrality counts neighbors. Eigenvector centrality weights neighbors by their own importance.
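A minimal sketch of eigenvector centrality via power iteration, assuming NumPy; the small connected, non-bipartite example graph (a triangle with a pendant vertex) is chosen here only for illustration.

```python
# Eigenvector centrality by power iteration: repeatedly apply A and renormalize.
import numpy as np

# Triangle on vertices 0, 1, 2 with a pendant vertex 3 attached to vertex 2.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

x = np.ones(A.shape[0])              # start from a positive vector
for _ in range(1000):
    x = A @ x
    x = x / np.linalg.norm(x)        # renormalize so the iterates stay bounded

print(np.round(x, 3))   # vertex 2 scores highest; the pendant vertex 3 scores lowest
```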
80.10 Regular Graphs
Let $G$ be $d$-regular. Then every row of $A$ sums to $d$. Hence
$$A \mathbf{1} = d\, \mathbf{1}.$$
So $d$ is an eigenvalue.
If $G$ is connected, then $d$ is the largest eigenvalue and has multiplicity $1$.
The other eigenvalues measure structure beyond uniform degree. For example, in a good expander graph, all nontrivial eigenvalues are small in absolute value compared with . This means that the graph behaves, in many ways, like a random regular graph.
For regular graphs, spectral information from the adjacency matrix is closely related to spectral information from the Laplacian. For nonregular graphs, the choice of matrix can make a substantial difference.
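As a small check of the regular case, the sketch below (assuming NumPy) uses the $2$-regular cycle $C_6$ as an example.

```python
# For a d-regular connected graph, d is the largest adjacency eigenvalue
# and the all-ones vector is a corresponding eigenvector. Example: C6 with d = 2.
import numpy as np

n = 6
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1       # cycle edges

print(np.linalg.eigvalsh(A))                         # largest eigenvalue is 2
print(np.allclose(A @ np.ones(n), 2 * np.ones(n)))   # A.1 = 2.1
```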
80.11 Complete Graphs
For the complete graph $K_n$, the adjacency matrix is
$$A = J - I,$$
where $J$ is the all-ones matrix.
The vector $\mathbf{1}$ satisfies
$$A \mathbf{1} = (J - I)\mathbf{1} = n\,\mathbf{1} - \mathbf{1} = (n - 1)\,\mathbf{1}.$$
Therefore $n - 1$ is an eigenvalue.
If $x$ is orthogonal to $\mathbf{1}$, then
$$Jx = 0.$$
Hence
$$Ax = (J - I)x = -x.$$
Thus the eigenvalues of $K_n$ are
$$n - 1 \quad \text{(with multiplicity } 1\text{)} \qquad \text{and} \qquad -1 \quad \text{(with multiplicity } n - 1\text{)}.$$
The eigenspace for $-1$ consists of all vectors whose entries sum to zero.
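A quick numerical confirmation of this spectrum for $K_5$ (a sketch assuming NumPy).

```python
# Spectrum of the complete graph K_5: eigenvalue 4 once and -1 four times.
import numpy as np

n = 5
A = np.ones((n, n)) - np.eye(n)              # A = J - I
print(np.round(np.linalg.eigvalsh(A), 6))    # [-1. -1. -1. -1.  4.]
```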
80.12 Empty Graphs
The empty graph on $n$ vertices has adjacency matrix
$$A = 0.$$
Therefore
$$Ax = 0 = 0 \cdot x$$
for every vector $x$. Every nonzero vector is an eigenvector with eigenvalue $0$.
The only eigenvalue is
$$\lambda = 0,$$
with multiplicity $n$.
This reflects the absence of adjacency structure.
80.13 Bipartite Graphs
For a bipartite graph with parts $X$ and $Y$, the adjacency matrix can be written in block form:
$$A = \begin{pmatrix} 0 & B \\ B^\top & 0 \end{pmatrix}.$$
Suppose
$$A \begin{pmatrix} x \\ y \end{pmatrix} = \lambda \begin{pmatrix} x \\ y \end{pmatrix}, \qquad \text{that is,} \quad By = \lambda x \ \text{ and } \ B^\top x = \lambda y.$$
Changing the sign of the second block gives
$$A \begin{pmatrix} x \\ -y \end{pmatrix} = \begin{pmatrix} -By \\ B^\top x \end{pmatrix} = -\lambda \begin{pmatrix} x \\ -y \end{pmatrix}.$$
Therefore if $\lambda$ is an eigenvalue, then $-\lambda$ is also an eigenvalue with the same multiplicity.
Thus the adjacency spectrum of a bipartite graph is symmetric about zero.
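This symmetry can be checked numerically; the sketch below (assuming NumPy) uses the complete bipartite graph $K_{2,3}$ as an example.

```python
# The adjacency spectrum of a bipartite graph is symmetric about zero.
import numpy as np

B = np.ones((2, 3))                              # biadjacency block of K_{2,3}
A = np.block([[np.zeros((2, 2)), B],
              [B.T, np.zeros((3, 3))]])

evals = np.linalg.eigvalsh(A)
print(np.round(evals, 6))                            # approx [-2.449, 0, 0, 0, 2.449]
print(np.allclose(np.sort(evals), np.sort(-evals)))  # True: spectrum equals its negation
```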
80.14 Laplacian Eigenvectors
The Laplacian matrix is
$$L = D - A,$$
where $D$ is the diagonal matrix of vertex degrees.
A Laplacian eigenvector satisfies
$$Lx = \mu x.$$
Since
$$(Lx)_v = \deg(v)\, x_v - \sum_{u \sim v} x_u,$$
the equation at vertex $v$ is
$$\deg(v)\, x_v - \sum_{u \sim v} x_u = \mu x_v.$$
Equivalently,
$$\sum_{u \sim v} (x_v - x_u) = \mu x_v.$$
Thus a Laplacian eigenvector describes how a vertex value differs from neighboring values.
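A short sketch (assuming NumPy) that builds $L = D - A$ for the path $P_3$ and checks the vertex-wise equation on one Laplacian eigenpair.

```python
# Check (Lx)_v = deg(v) x_v - (sum of neighboring values) = mu x_v at every vertex.
import numpy as np

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A       # L = D - A

evals, vecs = np.linalg.eigh(L)
mu, x = evals[1], vecs[:, 1]         # one eigenpair of L

for v in range(3):
    lhs = A[v].sum() * x[v] - A[v] @ x   # deg(v) x_v minus the neighbor sum
    print(np.isclose(lhs, mu * x[v]))    # True for every vertex
```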
80.15 The Zero Laplacian Eigenvalue
The all-ones vector $\mathbf{1}$ satisfies
$$L \mathbf{1} = D \mathbf{1} - A \mathbf{1} = 0.$$
Thus $\mu = 0$ is always a Laplacian eigenvalue.
More generally, if a graph has $c$ connected components, then the eigenspace for eigenvalue $0$ has dimension $c$. Its vectors are constant on each connected component.
In particular, a graph is connected if and only if the zero eigenvalue of $L$ has multiplicity $1$. The multiplicity of $0$ in the Laplacian spectrum equals the number of connected components.
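The multiplicity statement can be verified directly; the sketch below (assuming NumPy) uses a graph with two components, a triangle plus a disjoint edge, chosen for illustration.

```python
# The multiplicity of the Laplacian eigenvalue 0 equals the number of components.
import numpy as np

A = np.zeros((5, 5))
for u, v in [(0, 1), (1, 2), (0, 2), (3, 4)]:    # triangle {0,1,2} and edge {3,4}
    A[u, v] = A[v, u] = 1

L = np.diag(A.sum(axis=1)) - A
evals = np.linalg.eigvalsh(L)
print(np.round(evals, 8))                  # two eigenvalues equal to 0
print(int(np.sum(np.isclose(evals, 0))))   # 2, the number of connected components
```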
80.16 The Fiedler Vector
For a connected graph, the Laplacian eigenvalues are ordered as
$$0 = \mu_1 < \mu_2 \leq \mu_3 \leq \cdots \leq \mu_n.$$
The eigenvalue $\mu_2$ is the algebraic connectivity. An eigenvector for $\mu_2$ is called a Fiedler vector.
The Fiedler vector is important because it often reveals a natural partition of the graph. Vertices with positive entries may be placed on one side, and vertices with negative entries on the other. This idea is used in spectral partitioning and spectral clustering.
The value of $\mu_2$ also measures how strongly connected the graph is. Small $\mu_2$ indicates a bottleneck. Larger $\mu_2$ indicates stronger connectivity. This connection is a standard theme in spectral graph theory.
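A minimal sketch of spectral partitioning with the Fiedler vector, assuming NumPy; the example graph (two triangles joined by a single bridge edge) is chosen because it has an obvious bottleneck.

```python
# Fiedler vector of a graph with a bottleneck: the sign pattern recovers the two sides.
import numpy as np

edges = [(0, 1), (1, 2), (0, 2),     # first triangle
         (3, 4), (4, 5), (3, 5),     # second triangle
         (2, 3)]                     # bridge edge (the bottleneck)
n = 6
A = np.zeros((n, n))
for u, v in edges:
    A[u, v] = A[v, u] = 1

L = np.diag(A.sum(axis=1)) - A
evals, vecs = np.linalg.eigh(L)
fiedler = vecs[:, 1]                 # eigenvector for the second-smallest eigenvalue

print("algebraic connectivity:", round(evals[1], 4))
print("positive side:", np.where(fiedler > 0)[0])   # one triangle
print("negative side:", np.where(fiedler < 0)[0])   # the other triangle
```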
80.17 Normalized Eigenvectors
For graphs with uneven degrees, normalized matrices are often more appropriate.
The symmetric normalized Laplacian is
$$L_{\mathrm{sym}} = I - D^{-1/2} A D^{-1/2}.$$
The random-walk matrix is
$$P = D^{-1} A.$$
Eigenvectors of these matrices account for degree variation. They are common in random walks, spectral clustering, and graph-based data analysis.
For regular graphs, normalization changes little because all degrees are equal. For irregular graphs, normalized and unnormalized eigenvectors may give substantially different information.
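A short sketch (assuming NumPy) that builds both normalized matrices for the small irregular graph used in the centrality example.

```python
# Symmetric normalized Laplacian and random-walk matrix for an irregular graph.
import numpy as np

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)        # triangle with a pendant vertex

deg = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
L_sym = np.eye(4) - D_inv_sqrt @ A @ D_inv_sqrt  # I - D^{-1/2} A D^{-1/2}
P = np.diag(1.0 / deg) @ A                       # D^{-1} A

print(np.round(np.linalg.eigvalsh(L_sym), 4))    # eigenvalues lie in [0, 2]
print(P.sum(axis=1))                             # each row of P sums to 1
```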
80.18 Sign Patterns
Eigenvectors are often useful because of their sign patterns.
For example, a Laplacian eigenvector with both positive and negative entries can suggest a cut. Vertices with similar signs and similar magnitudes tend to be placed near one another in the graph.
The sign pattern alone does not determine the graph structure. However, it often reveals coarse geometry: clusters, bottlenecks, central regions, and peripheral regions.
This is one reason eigenvectors are used for visualization and partitioning.
80.19 Spectral Embeddings
Several eigenvectors can be used together to embed a graph into Euclidean space.
Choose eigenvectors
$$x^{(1)}, x^{(2)}, \ldots, x^{(k)}.$$
Map each vertex $v$ to the point
$$\bigl(x^{(1)}_v, x^{(2)}_v, \ldots, x^{(k)}_v\bigr) \in \mathbb{R}^k.$$
This construction is called a spectral embedding.
Nearby points in the embedding often correspond to vertices with similar graph positions. Spectral embeddings are used for clustering, drawing graphs, dimensionality reduction, and network analysis.
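A minimal sketch of a two-dimensional Laplacian spectral embedding, assuming NumPy; the bridged-triangles graph from the Fiedler example is reused here.

```python
# Embed each vertex at (x2_v, x3_v), the entries of two Laplacian eigenvectors.
import numpy as np

edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
n = 6
A = np.zeros((n, n))
for u, v in edges:
    A[u, v] = A[v, u] = 1

L = np.diag(A.sum(axis=1)) - A
evals, vecs = np.linalg.eigh(L)
embedding = vecs[:, 1:3]             # eigenvectors for the 2nd and 3rd smallest eigenvalues

for v in range(n):
    print(v, np.round(embedding[v], 3))   # vertices in the same triangle tend to land near one another
```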
80.20 Eigenvalues as Frequencies
For Laplacian eigenvectors, eigenvalues can be interpreted as graph frequencies.
The quadratic form
$$x^\top L x = \sum_{\{u, v\} \in E} (x_u - x_v)^2$$
measures variation across edges.
If $Lx = \mu x$ and $\|x\|_2 = 1$, then
$$x^\top L x = \mu.$$
Thus small Laplacian eigenvalues correspond to smooth eigenvectors that vary slowly across edges. Large eigenvalues correspond to oscillatory eigenvectors that change sharply across edges.
This is analogous to Fourier analysis, where low frequencies vary slowly and high frequencies oscillate rapidly.
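The edge-sum identity behind this interpretation is easy to check numerically; the sketch below (assuming NumPy) uses an arbitrary small graph and a random vertex function.

```python
# Check x^T L x = sum over edges of (x_u - x_v)^2.
import numpy as np

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]   # a 4-cycle with one chord
n = 4
A = np.zeros((n, n))
for u, v in edges:
    A[u, v] = A[v, u] = 1
L = np.diag(A.sum(axis=1)) - A

x = np.random.default_rng(0).standard_normal(n)    # an arbitrary vertex function
quad = x @ L @ x
edge_sum = sum((x[u] - x[v]) ** 2 for u, v in edges)
print(np.isclose(quad, edge_sum))                  # True
```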
80.21 Summary
Eigenvalues and eigenvectors translate graph structure into linear algebra.
For the adjacency matrix, an eigenvector is a vertex function whose neighbor sum is proportional to its value:
$$\sum_{u \sim v} x_u = \lambda x_v.$$
For the Laplacian matrix, an eigenvector is a vertex function whose difference from neighboring values is proportional to its value:
$$\sum_{u \sim v} (x_v - x_u) = \mu x_v.$$
| Object | Meaning |
|---|---|
| Adjacency eigenvalue | Scaling factor for neighbor-sum patterns |
| Adjacency eigenvector | Vertex function balanced by neighboring values |
| Largest adjacency eigenvalue | Measures dominant walk growth and centrality |
| Perron vector | Positive eigenvector for a connected graph |
| Laplacian eigenvalue $0$ | Constant functions on connected components |
| Fiedler vector | Eigenvector used to detect bottlenecks and partitions |
| Large Laplacian eigenvalues | Highly oscillatory vertex functions |
| Spectral embedding | Coordinates built from graph eigenvectors |
Eigenvalues give numerical invariants. Eigenvectors give structured functions on vertices. Together they provide a coordinate system for studying graphs through matrices.