
Chapter 59. Eigenvalues

Eigenvalues are numbers that describe how a linear transformation stretches or compresses space along special directions.

Most vectors change direction when a matrix acts on them. A few vectors may keep their direction. These vectors are called eigenvectors. The factors by which they are stretched are called eigenvalues.

Eigenvalues are among the most important objects in linear algebra. They appear in differential equations, quantum mechanics, numerical analysis, graph theory, optimization, statistics, machine learning, and dynamical systems.

The study of eigenvalues connects algebra, geometry, and computation.

59.1 Motivation

Consider the matrix

A = \begin{bmatrix} 2 & 0 \\ 0 & 3 \end{bmatrix}.

Apply A to the vector

v = \begin{bmatrix} 1 \\ 0 \end{bmatrix}.

Then

Av = \begin{bmatrix} 2 \\ 0 \end{bmatrix} = 2v.

The vector keeps its direction. Only its length changes.

Now apply A to

w = \begin{bmatrix} 0 \\ 1 \end{bmatrix}.

Then

Aw = \begin{bmatrix} 0 \\ 3 \end{bmatrix} = 3w.

Again, the direction is preserved.

The vectors v and w are eigenvectors. The numbers 2 and 3 are eigenvalues.

Most vectors do not behave this way. For example,

A \begin{bmatrix} 1 \\ 1 \end{bmatrix} = \begin{bmatrix} 2 \\ 3 \end{bmatrix},

which is not a scalar multiple of the original vector.

Eigenvectors identify the intrinsic directions of a transformation.
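The computations above are easy to verify numerically. A minimal sketch in Python, using a small hand-written matrix-vector product (the helper name matvec is ours, chosen for illustration):

```python
def matvec(A, v):
    """Multiply a 2x2 matrix (given as a list of rows) by a 2-vector."""
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

A = [[2, 0],
     [0, 3]]

print(matvec(A, [1, 0]))  # [2, 0], i.e. 2 * [1, 0]
print(matvec(A, [0, 1]))  # [0, 3], i.e. 3 * [0, 1]
print(matvec(A, [1, 1]))  # [2, 3], not a multiple of [1, 1]
```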

59.2 Definition of Eigenvalue

Let V be a vector space and let

T : V \to V

be a linear transformation.

A nonzero vector v ∈ V is called an eigenvector of T if there exists a scalar λ such that

T(v) = \lambda v.

The scalar λ is called the eigenvalue associated with v.

For matrices, the definition becomes

Av = \lambda v.

The vector v must be nonzero. Otherwise every scalar would satisfy

A0 = \lambda 0.

The eigenvalue equation says that the action of A on v only rescales the vector.

59.3 Geometric Interpretation

Geometrically, eigenvectors are directions that remain invariant under the transformation.

The matrix may:

  • stretch the vector,
  • shrink the vector,
  • reverse the vector,
  • leave the vector unchanged.

If

\lambda > 1,

the vector is stretched.

If

0 < \lambda < 1,

the vector is compressed.

If

\lambda < 0,

the vector reverses direction.

If

\lambda = 1,

the vector remains unchanged.

If

\lambda = 0,

the vector is mapped to zero.

For example, reflection across the x-axis has matrix

A = \begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix}.

Vectors on the x-axis have eigenvalue 1. Vectors on the y-axis have eigenvalue -1.

59.4 Rearranging the Eigenvalue Equation

Starting from

Av = \lambda v,

move all terms to one side:

Av - \lambda v = 0.

Factor out v:

(A - \lambda I)v = 0.

Here I is the identity matrix.

This is a homogeneous system of equations. A nonzero solution exists only if the matrix

A - \lambda I

is singular.

Therefore,

\det(A - \lambda I) = 0.

This equation determines the eigenvalues.

59.5 Characteristic Polynomial

The polynomial

p(\lambda) = \det(A - \lambda I)

is called the characteristic polynomial of A.

Its roots are the eigenvalues of the matrix.

For an n × n matrix, the characteristic polynomial has degree n.

Example

Let

A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}.

Then

A - \lambda I = \begin{bmatrix} 2 - \lambda & 1 \\ 1 & 2 - \lambda \end{bmatrix}.

Compute the determinant:

\det(A - \lambda I) = (2-\lambda)^2 - 1.

Expand:

= 4 - 4\lambda + \lambda^2 - 1 = \lambda^2 - 4\lambda + 3.

Solve

\lambda^2 - 4\lambda + 3 = 0.

Factor:

(\lambda - 1)(\lambda - 3) = 0.

The eigenvalues are

\lambda_1 = 1, \qquad \lambda_2 = 3.
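For any 2 × 2 matrix the characteristic polynomial is λ² − tr(A)λ + det(A), so the eigenvalues follow from the quadratic formula. A minimal sketch, assuming real eigenvalues (the helper name eigvals_2x2 is ours, not a library routine):

```python
import math

def eigvals_2x2(A):
    """Eigenvalues of a 2x2 matrix A as roots of its characteristic
    polynomial lambda^2 - tr(A)*lambda + det(A) = 0 (real roots assumed)."""
    tr = A[0][0] + A[1][1]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    disc = math.sqrt(tr * tr - 4 * det)  # discriminant of the quadratic
    return (tr - disc) / 2, (tr + disc) / 2

print(eigvals_2x2([[2, 1], [1, 2]]))  # (1.0, 3.0)
```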

59.6 Finding Eigenvectors

After computing an eigenvalue, substitute it into

(A - \lambda I)v = 0

and solve for v.

Example

For

A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix},

consider the eigenvalue

\lambda = 3.

Then

A - 3I = \begin{bmatrix} -1 & 1 \\ 1 & -1 \end{bmatrix}.

Solve

\begin{bmatrix} -1 & 1 \\ 1 & -1 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}.

The equations reduce to

x = y.

Thus every nonzero multiple of

\begin{bmatrix} 1 \\ 1 \end{bmatrix}

is an eigenvector corresponding to eigenvalue 3.

Now consider λ = 1:

A - I = \begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix}.

The equations become

x + y = 0.

Thus

\begin{bmatrix} 1 \\ -1 \end{bmatrix}

is an eigenvector.
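In the 2 × 2 case this null-space computation can be automated: if a row (a, b) of A − λI is nonzero, then (b, −a) solves ax + by = 0. A sketch under that observation (eigvec_2x2 is an illustrative helper of ours):

```python
def eigvec_2x2(A, lam):
    """Return a nonzero vector in the null space of A - lam*I, i.e. an
    eigenvector of the 2x2 matrix A for its eigenvalue lam."""
    a, b = A[0][0] - lam, A[0][1]
    if (a, b) != (0, 0):
        return (b, -a)            # solves a*x + b*y = 0
    c, d = A[1][0], A[1][1] - lam
    return (d, -c)                # first row was zero; use the second

A = [[2, 1],
     [1, 2]]
print(eigvec_2x2(A, 3))  # (1, 1)
print(eigvec_2x2(A, 1))  # (1, -1)
```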

59.7 Eigenspaces

The set of all eigenvectors associated with an eigenvalue λ, together with the zero vector, forms a subspace.

This subspace is called the eigenspace corresponding to λ.

The eigenspace is

E_\lambda = \ker(A - \lambda I).

Thus eigenspaces are null spaces.

For the previous example:

E_3 = \operatorname{span} \left\{ \begin{bmatrix} 1 \\ 1 \end{bmatrix} \right\},

and

E_1 = \operatorname{span} \left\{ \begin{bmatrix} 1 \\ -1 \end{bmatrix} \right\}.

Each eigenspace is a line through the origin.

59.8 Algebraic and Geometric Multiplicity

An eigenvalue may appear more than once as a root of the characteristic polynomial.

The number of times it appears is called its algebraic multiplicity.

The dimension of its eigenspace is called its geometric multiplicity.

For every eigenvalue,

1 \leq \text{geometric multiplicity} \leq \text{algebraic multiplicity}.

Example

Consider

A = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}.

The characteristic polynomial is

(1-\lambda)^2.

Thus λ = 1 has algebraic multiplicity 2.

Now solve

(A - I)v = 0.

We obtain

\begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = 0.

Thus

y = 0.

The eigenspace is

\operatorname{span} \left\{ \begin{bmatrix} 1 \\ 0 \end{bmatrix} \right\}.

Its dimension is 1. Therefore the geometric multiplicity is 1.
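The deficiency can be checked directly: the first standard basis vector is an eigenvector for λ = 1, but the second is not. A quick check (matvec is an illustrative helper of ours):

```python
def matvec(A, v):
    """2x2 matrix-vector product."""
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

A = [[1, 1],
     [0, 1]]

print(matvec(A, [1, 0]))  # [1, 0]: e1 is fixed, an eigenvector for lambda = 1
print(matvec(A, [0, 1]))  # [1, 1]: not a multiple of [0, 1], so e2 is not
```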

59.9 Diagonal Matrices

Diagonal matrices provide the simplest example of eigenvalues.

If

D = \begin{bmatrix} d_1 & 0 & \cdots & 0 \\ 0 & d_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & d_n \end{bmatrix},

then the eigenvalues are exactly the diagonal entries:

d_1, d_2, \ldots, d_n.

The standard basis vectors are eigenvectors.

For example,

De_1 = d_1 e_1.

Diagonal matrices are easy to understand because each coordinate acts independently.

Much of spectral theory attempts to reduce matrices to diagonal form.

59.10 Triangular Matrices

For triangular matrices, the eigenvalues are also the diagonal entries.

If

A = \begin{bmatrix} a_{11} & * & * \\ 0 & a_{22} & * \\ 0 & 0 & a_{33} \end{bmatrix},

then

\det(A - \lambda I) = (a_{11} - \lambda)(a_{22} - \lambda)(a_{33} - \lambda).

Therefore the eigenvalues are

a_{11}, \quad a_{22}, \quad a_{33}.

This fact is important in numerical linear algebra because many algorithms reduce matrices to triangular form.
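One way to see this concretely is to evaluate det(A − λI) at each diagonal entry and observe that it vanishes there. A sketch with a concrete upper triangular matrix (the entries 4, 5, 6 and the helper det3 are our choices for illustration):

```python
def det3(M):
    """Determinant of a 3x3 matrix by cofactor expansion along row 0."""
    (a, b, c), (d, e, f), (g, h, i) = M
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

A = [[4, 7, 1],
     [0, 5, 2],
     [0, 0, 6]]

for lam in (4, 5, 6):  # the diagonal entries of A
    M = [[A[r][c] - (lam if r == c else 0) for c in range(3)]
         for r in range(3)]
    print(lam, det3(M))  # each determinant is 0
```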

59.11 Complex Eigenvalues

Real matrices may have complex eigenvalues.

Consider

A = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}.

This matrix rotates vectors by 90°.

No nonzero real vector keeps its direction under this rotation.

Compute the characteristic polynomial:

\lambda^2 + 1 = 0.

The roots are

\lambda = i, \qquad \lambda = -i.

Thus the eigenvalues are complex.
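The quadratic formula still applies over the complex numbers. A sketch using the standard library's cmath module (for this rotation the trace is 0 and the determinant is 1, so the characteristic polynomial is λ² + 1):

```python
import cmath

A = [[0, -1],
     [1,  0]]                                # rotation by 90 degrees

tr = A[0][0] + A[1][1]                       # 0
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]  # 1
disc = cmath.sqrt(tr * tr - 4 * det)         # sqrt(-4), a pure imaginary number

# Roots of lambda^2 + 1 = 0: the eigenvalues i and -i.
print((tr + disc) / 2, (tr - disc) / 2)
```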

Complex eigenvalues are essential in oscillatory systems, wave equations, quantum mechanics, and control theory.

59.12 Determinant and Trace

The eigenvalues are closely related to the determinant and trace.

If A has eigenvalues

\lambda_1, \lambda_2, \ldots, \lambda_n,

counted with multiplicity, then

\det(A) = \lambda_1 \lambda_2 \cdots \lambda_n,

and

\operatorname{tr}(A) = \lambda_1 + \lambda_2 + \cdots + \lambda_n.

These identities follow from the characteristic polynomial.

They connect local geometric scaling with global algebraic quantities.
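For the 2 × 2 example from Section 59.5, whose eigenvalues are 1 and 3, both identities are easy to confirm:

```python
A = [[2, 1],
     [1, 2]]
eigs = (1, 3)  # eigenvalues computed from the characteristic polynomial

det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
tr = A[0][0] + A[1][1]

print(det == eigs[0] * eigs[1])  # True: det(A) = 3 = 1 * 3
print(tr == eigs[0] + eigs[1])   # True: tr(A) = 4 = 1 + 3
```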

59.13 Eigenvalues and Dynamical Systems

Repeated application of a matrix reveals the importance of eigenvalues.

Suppose

x_{k+1} = A x_k.

Then

x_k = A^k x_0.

If v is an eigenvector with eigenvalue λ, then

A^k v = \lambda^k v.

Thus:

  • if |λ| > 1, growth occurs,
  • if |λ| < 1, decay occurs,
  • if |λ| = 1, oscillation or stability occurs.

Eigenvalues therefore determine long-term behavior.
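This is also the idea behind power iteration: repeatedly applying A to a generic starting vector amplifies the component along the eigenvalue of largest magnitude. A sketch with the matrix from Section 59.5, whose eigenvalues are 1 and 3 (20 iterations is an arbitrary choice of ours):

```python
def matvec(A, v):
    """2x2 matrix-vector product."""
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

A = [[2, 1],
     [1, 2]]   # eigenvalues 1 and 3, eigenvectors [1, -1] and [1, 1]
v = [1, 0]     # generic start: a mix of both eigenvectors

for _ in range(20):
    v = matvec(A, v)

# The lambda = 3 component dominates: successive iterates grow by roughly
# a factor of 3, and the direction approaches the eigenvector [1, 1].
w = matvec(A, v)
print(w[0] / v[0])   # close to 3
print(v[1] / v[0])   # close to 1
```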

This principle appears in population models, differential equations, Markov chains, iterative algorithms, and neural networks.

59.14 Spectral Perspective

The collection of eigenvalues of a matrix is called its spectrum.

Spectral theory studies how operators behave through their eigenvalues and eigenvectors.

Many difficult problems become simpler in spectral coordinates.

Examples include:

Problem                         Spectral interpretation
Heat equation                   Modes decay exponentially
Vibrating systems               Natural frequencies
Principal component analysis    Largest variance directions
Quantum mechanics               Energy levels
Graph analysis                  Connectivity structure
Markov chains                   Long-term probability behavior

The spectral viewpoint is one of the unifying themes of modern mathematics.

59.15 Summary

An eigenvector of a matrix A is a nonzero vector v satisfying

Av = \lambda v.

The scalar λ is the eigenvalue.

Eigenvalues are found from

\det(A - \lambda I) = 0.

Eigenvectors are obtained by solving

(A - \lambda I)v = 0.

Eigenvalues describe invariant directions and scaling behavior of linear transformations. They connect algebraic structure, geometric behavior, and dynamical evolution.

The next chapter studies eigenvectors and eigenspaces in greater detail, including independence, bases of eigenvectors, and diagonalization.