# Chapter 59. Eigenvalues

Eigenvalues are numbers that describe how a linear transformation stretches or compresses space along special directions.

Most vectors change direction when a matrix acts on them. A few vectors may keep their direction. These vectors are called eigenvectors. The factors by which they are stretched are called eigenvalues.

Eigenvalues are among the most important objects in linear algebra. They appear in differential equations, quantum mechanics, numerical analysis, graph theory, optimization, statistics, machine learning, and dynamical systems.

The study of eigenvalues connects algebra, geometry, and computation.

## 59.1 Motivation

Consider the matrix

$$
A =
\begin{bmatrix}
2 & 0 \\
0 & 3
\end{bmatrix}.
$$

Apply \(A\) to the vector

$$
v =
\begin{bmatrix}
1 \\
0
\end{bmatrix}.
$$

Then

$$
Av =
\begin{bmatrix}
2 \\
0
\end{bmatrix} =
2v.
$$

The vector keeps its direction. Only its length changes.

Now apply \(A\) to

$$
w =
\begin{bmatrix}
0 \\
1
\end{bmatrix}.
$$

Then

$$
Aw =
\begin{bmatrix}
0 \\
3
\end{bmatrix} =
3w.
$$

Again, the direction is preserved.

The vectors \(v\) and \(w\) are eigenvectors. The numbers \(2\) and \(3\) are eigenvalues.

Most vectors do not behave this way. For example,

$$
A
\begin{bmatrix}
1 \\
1
\end{bmatrix} =
\begin{bmatrix}
2 \\
3
\end{bmatrix},
$$

which is not a scalar multiple of the original vector.

Eigenvectors identify the intrinsic directions of a transformation.
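The calculations above are easy to check numerically. A minimal sketch using NumPy (the library is not assumed by the chapter; the matrix and vectors are from the example):

```python
import numpy as np

A = np.array([[2.0, 0.0], [0.0, 3.0]])
v = np.array([1.0, 0.0])
w = np.array([0.0, 1.0])

print(A @ v)                      # equals 2*v, so v is an eigenvector
print(A @ w)                      # equals 3*w, so w is an eigenvector
print(A @ np.array([1.0, 1.0]))   # [2, 3], not a scalar multiple of [1, 1]
```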

## 59.2 Definition of Eigenvalue

Let \(V\) be a vector space and let

$$
T : V \to V
$$

be a linear transformation.

A nonzero vector \(v \in V\) is called an eigenvector of \(T\) if there exists a scalar \(\lambda\) such that

$$
T(v) = \lambda v.
$$

The scalar \(\lambda\) is called the eigenvalue associated with \(v\).

For matrices, the definition becomes

$$
Av = \lambda v.
$$

The vector \(v\) must be nonzero. Otherwise every scalar \(\lambda\) would trivially satisfy

$$
A0 = \lambda 0.
$$

The eigenvalue equation says that the action of \(A\) on \(v\) only rescales the vector.

## 59.3 Geometric Interpretation

Geometrically, eigenvectors are directions that remain invariant under the transformation.

The matrix may:

- stretch the vector,
- shrink the vector,
- reverse the vector,
- leave the vector unchanged.

If

$$
\lambda > 1,
$$

the vector is stretched.

If

$$
0 < \lambda < 1,
$$

the vector is compressed.

If

$$
\lambda < 0,
$$

the vector reverses direction.

If

$$
\lambda = 1,
$$

the vector remains unchanged.

If

$$
\lambda = 0,
$$

the vector is mapped to zero.

For example, reflection across the \(x\)-axis has matrix

$$
A =
\begin{bmatrix}
1 & 0 \\
0 & -1
\end{bmatrix}.
$$

Vectors on the \(x\)-axis have eigenvalue \(1\). Vectors on the \(y\)-axis have eigenvalue \(-1\).

## 59.4 Rearranging the Eigenvalue Equation

Starting from

$$
Av = \lambda v,
$$

move all terms to one side:

$$
Av - \lambda v = 0.
$$

Factor out \(v\):

$$
(A - \lambda I)v = 0.
$$

Here \(I\) is the identity matrix.

This is a homogeneous system of equations. A nonzero solution exists if and only if the matrix

$$
A - \lambda I
$$

is singular.

Therefore,

$$
\det(A - \lambda I) = 0.
$$

This equation determines the eigenvalues.

## 59.5 Characteristic Polynomial

The polynomial

$$
p(\lambda) = \det(A - \lambda I)
$$

is called the characteristic polynomial of \(A\).

Its roots are the eigenvalues of the matrix.

For an \(n \times n\) matrix, the characteristic polynomial has degree \(n\).

### Example

Let

$$
A =
\begin{bmatrix}
2 & 1 \\
1 & 2
\end{bmatrix}.
$$

Then

$$
A - \lambda I =
\begin{bmatrix}
2 - \lambda & 1 \\
1 & 2 - \lambda
\end{bmatrix}.
$$

Compute the determinant:

$$
\det(A - \lambda I) =
(2-\lambda)^2 - 1.
$$

Expand:

$$
= 4 - 4\lambda + \lambda^2 - 1
$$

$$
= \lambda^2 - 4\lambda + 3.
$$

Solve

$$
\lambda^2 - 4\lambda + 3 = 0.
$$

Factor:

$$
(\lambda - 1)(\lambda - 3) = 0.
$$

The eigenvalues are

$$
\lambda_1 = 1,
\qquad
\lambda_2 = 3.
$$
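The two routes to these eigenvalues, rooting the characteristic polynomial and computing them directly, can be compared numerically. A sketch using NumPy (an illustration, not part of the text):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])

# Roots of the characteristic polynomial lambda^2 - 4*lambda + 3
roots = np.roots([1.0, -4.0, 3.0])

# Direct numerical computation for comparison
eig = np.linalg.eigvals(A)

print(sorted(roots))   # 1 and 3
print(sorted(eig))     # the same eigenvalues
```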

## 59.6 Finding Eigenvectors

After computing an eigenvalue, substitute it into

$$
(A - \lambda I)v = 0
$$

and solve for \(v\).

### Example

For

$$
A =
\begin{bmatrix}
2 & 1 \\
1 & 2
\end{bmatrix},
$$

consider the eigenvalue

$$
\lambda = 3.
$$

Then

$$
A - 3I =
\begin{bmatrix}
-1 & 1 \\
1 & -1
\end{bmatrix}.
$$

Solve

$$
\begin{bmatrix}
-1 & 1 \\
1 & -1
\end{bmatrix}
\begin{bmatrix}
x \\
y
\end{bmatrix} =
\begin{bmatrix}
0 \\
0
\end{bmatrix}.
$$

The equations reduce to

$$
x = y.
$$

Thus every nonzero multiple of

$$
\begin{bmatrix}
1 \\
1
\end{bmatrix}
$$

is an eigenvector corresponding to eigenvalue \(3\).

Now consider \(\lambda = 1\):

$$
A - I =
\begin{bmatrix}
1 & 1 \\
1 & 1
\end{bmatrix}.
$$

The equations become

$$
x + y = 0.
$$

Thus

$$
\begin{bmatrix}
1 \\
-1
\end{bmatrix}
$$

is an eigenvector.
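In practice, eigenvalue–eigenvector pairs are computed together. A sketch using NumPy's `np.linalg.eig`, which returns normalized eigenvectors as columns (so the vectors below are scalar multiples of \([1,1]\) and \([1,-1]\)):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])
eigvals, eigvecs = np.linalg.eig(A)   # column i of eigvecs pairs with eigvals[i]

for lam, v in zip(eigvals, eigvecs.T):
    # Each pair satisfies A v = lambda v up to floating-point error
    print(lam, np.allclose(A @ v, lam * v))
```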

## 59.7 Eigenspaces

The set of all eigenvectors associated with an eigenvalue \(\lambda\), together with the zero vector, forms a subspace.

This subspace is called the eigenspace corresponding to \(\lambda\).

The eigenspace is

$$
E_\lambda =
\ker(A - \lambda I).
$$

Thus eigenspaces are null spaces.

For the previous example:

$$
E_3 =
\operatorname{span}
\left\{
\begin{bmatrix}
1 \\
1
\end{bmatrix}
\right\},
$$

and

$$
E_1 =
\operatorname{span}
\left\{
\begin{bmatrix}
1 \\
-1
\end{bmatrix}
\right\}.
$$

Each eigenspace is a line through the origin.
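Because eigenspaces are null spaces, they can be computed numerically. A sketch via the singular value decomposition (a standard numerical technique for null spaces, not taken from the text), applied to \(E_1\) from the example:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])
M = A - 1.0 * np.eye(2)        # A - lambda*I for lambda = 1

# Right-singular vectors with (near-)zero singular values span ker(M)
U, s, Vt = np.linalg.svd(M)
basis = Vt[s < 1e-10]          # rows form a basis of the eigenspace E_1

v = basis[0]                   # proportional to [1, -1]
print(np.allclose(M @ v, 0))   # v lies in ker(A - I)
```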

## 59.8 Algebraic and Geometric Multiplicity

An eigenvalue may appear more than once as a root of the characteristic polynomial.

The number of times it appears is called its algebraic multiplicity.

The dimension of its eigenspace is called its geometric multiplicity.

For every eigenvalue,

$$
1 \leq \text{geometric multiplicity}
\leq
\text{algebraic multiplicity}.
$$

### Example

Consider

$$
A =
\begin{bmatrix}
1 & 1 \\
0 & 1
\end{bmatrix}.
$$

The characteristic polynomial is

$$
(1-\lambda)^2.
$$

Thus \(\lambda = 1\) has algebraic multiplicity \(2\).

Now solve

$$
(A-I)v = 0.
$$

We obtain

$$
\begin{bmatrix}
0 & 1 \\
0 & 0
\end{bmatrix}
\begin{bmatrix}
x \\
y
\end{bmatrix} =
0.
$$

Thus

$$
y = 0.
$$

The eigenspace is

$$
\operatorname{span}
\left\{
\begin{bmatrix}
1 \\
0
\end{bmatrix}
\right\}.
$$

Its dimension is \(1\). Therefore the geometric multiplicity is \(1\).
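The geometric multiplicity can be read off numerically as \(\dim\ker(A-\lambda I) = n - \operatorname{rank}(A-\lambda I)\). A sketch for the example above, using NumPy's matrix rank (an illustration only):

```python
import numpy as np

A = np.array([[1.0, 1.0], [0.0, 1.0]])
lam = 1.0

M = A - lam * np.eye(2)
geo_mult = 2 - np.linalg.matrix_rank(M)   # dim ker = n - rank

print(geo_mult)   # 1: one independent eigenvector, although the
                  # algebraic multiplicity of lambda = 1 is 2
```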

## 59.9 Diagonal Matrices

Diagonal matrices provide the simplest example of eigenvalues.

If

$$
D =
\begin{bmatrix}
d_1 & 0 & \cdots & 0 \\
0 & d_2 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & d_n
\end{bmatrix},
$$

then the eigenvalues are exactly the diagonal entries:

$$
d_1, d_2, \ldots, d_n.
$$

The standard basis vectors are eigenvectors.

For example,

$$
De_1 = d_1 e_1.
$$

Diagonal matrices are easy to understand because each coordinate acts independently.

Much of spectral theory attempts to reduce matrices to diagonal form.
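A quick numerical illustration of the diagonal case (the diagonal entries below are arbitrary, chosen only for the example):

```python
import numpy as np

D = np.diag([2.0, 5.0, -1.0])
e1 = np.array([1.0, 0.0, 0.0])

print(np.allclose(D @ e1, 2.0 * e1))   # True: D e1 = d1 e1
print(np.linalg.eigvals(D))            # the diagonal entries
```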

## 59.10 Triangular Matrices

For triangular matrices, the eigenvalues are also the diagonal entries.

If

$$
A =
\begin{bmatrix}
a_{11} & * & * \\
0 & a_{22} & * \\
0 & 0 & a_{33}
\end{bmatrix},
$$

then

$$
\det(A - \lambda I) =
(a_{11}-\lambda)
(a_{22}-\lambda)
(a_{33}-\lambda).
$$

Therefore the eigenvalues are

$$
a_{11},
\quad
a_{22},
\quad
a_{33}.
$$

This fact is important in numerical linear algebra because many algorithms reduce matrices to triangular form.
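A numerical check of the triangular case (the off-diagonal entries are arbitrary, standing in for the \(*\) entries above):

```python
import numpy as np

A = np.array([[4.0, 7.0, -2.0],
              [0.0, 5.0,  3.0],
              [0.0, 0.0,  6.0]])

print(sorted(np.linalg.eigvals(A)))   # 4, 5, 6: the diagonal entries
```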

## 59.11 Complex Eigenvalues

Real matrices may have complex eigenvalues.

Consider

$$
A =
\begin{bmatrix}
0 & -1 \\
1 & 0
\end{bmatrix}.
$$

This matrix rotates vectors by \(90^\circ\) counterclockwise.

No nonzero real vector keeps its direction under this rotation.

Compute the characteristic polynomial:

$$
\lambda^2 + 1 = 0.
$$

The roots are

$$
\lambda = i,
\qquad
\lambda = -i.
$$

Thus the eigenvalues are complex.
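Numerical libraries return these complex eigenvalues directly even for a real input matrix. A sketch with NumPy:

```python
import numpy as np

A = np.array([[0.0, -1.0], [1.0, 0.0]])   # rotation by 90 degrees

eig = np.linalg.eigvals(A)
print(eig)   # approximately i and -i
```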

Complex eigenvalues are essential in oscillatory systems, wave equations, quantum mechanics, and control theory.

## 59.12 Determinant and Trace

The eigenvalues are closely related to the determinant and trace.

If \(A\) has eigenvalues

$$
\lambda_1, \lambda_2, \ldots, \lambda_n,
$$

counted with multiplicity, then

$$
\det(A) =
\lambda_1\lambda_2\cdots\lambda_n,
$$

and

$$
\operatorname{tr}(A) =
\lambda_1+\lambda_2+\cdots+\lambda_n.
$$

These identities follow from the characteristic polynomial.

They connect local geometric scaling with global algebraic quantities.
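Both identities can be verified numerically on the earlier example with eigenvalues \(1\) and \(3\) (a sketch, not part of the text):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])
eig = np.linalg.eigvals(A)

print(np.prod(eig), np.linalg.det(A))   # both 3: det = product of eigenvalues
print(np.sum(eig), np.trace(A))         # both 4: trace = sum of eigenvalues
```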

## 59.13 Eigenvalues and Dynamical Systems

Repeated application of a matrix reveals the importance of eigenvalues.

Suppose

$$
x_{k+1} = Ax_k.
$$

Then

$$
x_k = A^k x_0.
$$

If \(v\) is an eigenvector with eigenvalue \(\lambda\), then

$$
A^k v = \lambda^k v.
$$

Thus:

- if \(|\lambda| > 1\), growth occurs,
- if \(|\lambda| < 1\), decay occurs,
- if \(|\lambda| = 1\), the component neither grows nor decays in magnitude; it may oscillate.

Eigenvalues therefore determine long-term behavior.

This principle appears in population models, differential equations, Markov chains, iterative algorithms, and neural networks.
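The dominance of the largest eigenvalue under iteration can be seen directly. A sketch with the earlier matrix, whose eigenvalues are \(1\) and \(3\): starting from \(x_0 = (1,0)\), the component along \([1,1]\) grows like \(3^k\) while the component along \([1,-1]\) stays fixed, so the iterates align with \([1,1]\):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])
x = np.array([1.0, 0.0])

# Repeated application: the eigenvalue-3 component dominates
for _ in range(20):
    x = A @ x

ratio = x[0] / x[1]
print(ratio)   # close to 1: x points along the eigenvector [1, 1]
```

This is the idea behind the power method for computing a dominant eigenvector.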

## 59.14 Spectral Perspective

The collection of eigenvalues of a matrix is called its spectrum.

Spectral theory studies how operators behave through their eigenvalues and eigenvectors.

Many difficult problems become simpler in spectral coordinates.

Examples include:

| Problem | Spectral interpretation |
|---|---|
| Heat equation | Modes decay exponentially |
| Vibrating systems | Natural frequencies |
| Principal component analysis | Largest variance directions |
| Quantum mechanics | Energy levels |
| Graph analysis | Connectivity structure |
| Markov chains | Long-term probability behavior |

The spectral viewpoint is one of the unifying themes of modern mathematics.

## 59.15 Summary

An eigenvector of a matrix \(A\) is a nonzero vector \(v\) satisfying

$$
Av = \lambda v.
$$

The scalar \(\lambda\) is the eigenvalue.

Eigenvalues are found from

$$
\det(A - \lambda I)=0.
$$

Eigenvectors are obtained by solving

$$
(A-\lambda I)v=0.
$$

Eigenvalues describe invariant directions and scaling behavior of linear transformations. They connect algebraic structure, geometric behavior, and dynamical evolution.

The next chapter studies eigenvectors and eigenspaces in greater detail, including independence, bases of eigenvectors, and diagonalization.
