# Chapter 60. Eigenvectors

Eigenvectors are the nonzero vectors that a linear transformation keeps on their own line through the origin. If a matrix sends a vector to a scalar multiple of itself, then that vector is an eigenvector.

The eigenvalue tells how much the vector is scaled. The eigenvector tells which direction is being scaled.

For a square matrix \(A\), the eigenvector equation is

$$
Av = \lambda v.
$$

Here \(v\) is a nonzero vector and \(\lambda\) is a scalar. The vector \(v\) is an eigenvector of \(A\), and \(\lambda\) is its associated eigenvalue.

## 60.1 Direction Preserved by a Matrix

A matrix usually changes both the length and direction of a vector. Eigenvectors are special because they stay on the line they span: the matrix only stretches, shrinks, or flips them.

Let

$$
A =
\begin{bmatrix}
3 & 0 \\
0 & 2
\end{bmatrix}.
$$

Then

$$
A
\begin{bmatrix}
1 \\
0
\end{bmatrix} =
\begin{bmatrix}
3 \\
0
\end{bmatrix} =
3
\begin{bmatrix}
1 \\
0
\end{bmatrix}.
$$

The vector

$$
\begin{bmatrix}
1 \\
0
\end{bmatrix}
$$

is an eigenvector with eigenvalue \(3\).

Also,

$$
A
\begin{bmatrix}
0 \\
1
\end{bmatrix} =
\begin{bmatrix}
0 \\
2
\end{bmatrix} =
2
\begin{bmatrix}
0 \\
1
\end{bmatrix}.
$$

The vector

$$
\begin{bmatrix}
0 \\
1
\end{bmatrix}
$$

is an eigenvector with eigenvalue \(2\).

The coordinate axes are invariant directions for this transformation. Vectors on the first coordinate axis are stretched by \(3\). Vectors on the second coordinate axis are stretched by \(2\).
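
These two computations are easy to verify numerically. Below is a minimal sketch using NumPy (an assumed dependency; the chapter itself fixes no software):

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [0.0, 2.0]])

e1 = np.array([1.0, 0.0])  # first coordinate axis
e2 = np.array([0.0, 1.0])  # second coordinate axis

print(A @ e1)  # [3. 0.] = 3 * e1: eigenvector with eigenvalue 3
print(A @ e2)  # [0. 2.] = 2 * e2: eigenvector with eigenvalue 2
```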

## 60.2 Eigenvectors Are Not Unique

If \(v\) is an eigenvector of \(A\) with eigenvalue \(\lambda\), then every nonzero scalar multiple of \(v\) is also an eigenvector with the same eigenvalue.

Indeed, suppose

$$
Av = \lambda v.
$$

Let \(c\) be a nonzero scalar. Then

$$
A(cv) = cAv = c\lambda v = \lambda(cv).
$$

Thus \(cv\) is also an eigenvector.

Eigenvectors therefore describe directions, not individual arrows. The vectors

$$
\begin{bmatrix}
1 \\
1
\end{bmatrix},
\qquad
\begin{bmatrix}
2 \\
2
\end{bmatrix},
\qquad
\begin{bmatrix}
-5 \\
-5
\end{bmatrix}
$$

all lie on the same line through the origin. If one of them is an eigenvector, all nonzero vectors on that line are eigenvectors with the same eigenvalue.
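
A quick numerical confirmation, again a NumPy sketch, that rescaling an eigenvector preserves the eigenvalue:

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [0.0, 2.0]])
v = np.array([1.0, 0.0])  # eigenvector of A with eigenvalue 3

for c in [2.0, -5.0, 0.25]:
    w = c * v
    # A(cv) must equal 3(cv) for every nonzero scalar c
    assert np.allclose(A @ w, 3.0 * w)
print("every nonzero multiple of v is an eigenvector with eigenvalue 3")
```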

## 60.3 The Zero Vector Is Excluded

The zero vector is not an eigenvector.

This exclusion is part of the definition. The reason is simple. For every scalar \(\lambda\),

$$
A0 = 0 = \lambda 0.
$$

If zero were allowed, every scalar would appear to be an eigenvalue for every matrix. The definition would lose its meaning.

Eigenvectors must therefore be nonzero.

## 60.4 Eigenspaces

For a fixed eigenvalue \(\lambda\), all eigenvectors with eigenvalue \(\lambda\), together with the zero vector, form a subspace.

This subspace is called the eigenspace of \(A\) corresponding to \(\lambda\).

It is written as

$$
E_\lambda = \{v : Av = \lambda v\}.
$$

Equivalently,

$$
E_\lambda = \ker(A - \lambda I).
$$

This identity is central: finding the eigenvectors for \(\lambda\) is the same as finding the null space of \(A - \lambda I\).
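
The identity translates directly into code: a basis for \(E_\lambda\) is a basis for the null space of \(A - \lambda I\). Below is a minimal sketch using scipy.linalg.null_space (SciPy is an assumed dependency), applied to the diagonal matrix of Section 60.1:

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[3.0, 0.0],
              [0.0, 2.0]])
lam = 3.0

# The columns of the result form an orthonormal basis of ker(A - lam*I),
# which is exactly the eigenspace E_lam.
basis = null_space(A - lam * np.eye(2))
print(basis)  # one column, proportional to [1, 0]
```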

## 60.5 Computing an Eigenvector

To compute eigenvectors, first find an eigenvalue \(\lambda\). Then solve

$$
(A - \lambda I)v = 0.
$$

The nonzero solutions are the eigenvectors.

### Example

Let

$$
A =
\begin{bmatrix}
4 & 1 \\
2 & 3
\end{bmatrix}.
$$

The characteristic polynomial is

$$
\det(A-\lambda I) =
\det
\begin{bmatrix}
4-\lambda & 1 \\
2 & 3-\lambda
\end{bmatrix}.
$$

Compute:

$$
(4-\lambda)(3-\lambda)-2 =
12 - 7\lambda + \lambda^2 - 2 =
\lambda^2 - 7\lambda + 10.
$$

Thus

$$
\lambda^2 - 7\lambda + 10 = 0.
$$

Factor:

$$
(\lambda - 5)(\lambda - 2)=0.
$$

The eigenvalues are

$$
\lambda = 5
\qquad
\text{and}
\qquad
\lambda = 2.
$$

Now find the eigenvectors for \(\lambda = 5\).

$$
A - 5I =
\begin{bmatrix}
-1 & 1 \\
2 & -2
\end{bmatrix}.
$$

Solve

$$
\begin{bmatrix}
-1 & 1 \\
2 & -2
\end{bmatrix}
\begin{bmatrix}
x \\
y
\end{bmatrix} =
\begin{bmatrix}
0 \\
0
\end{bmatrix}.
$$

The equation is

$$
-x + y = 0.
$$

Hence

$$
y = x.
$$

So the eigenspace for \(\lambda = 5\) is

$$
E_5 =
\operatorname{span}
\left\{
\begin{bmatrix}
1 \\
1
\end{bmatrix}
\right\}.
$$

Now find the eigenvectors for \(\lambda = 2\).

$$
A - 2I =
\begin{bmatrix}
2 & 1 \\
2 & 1
\end{bmatrix}.
$$

Solve

$$
\begin{bmatrix}
2 & 1 \\
2 & 1
\end{bmatrix}
\begin{bmatrix}
x \\
y
\end{bmatrix} =
\begin{bmatrix}
0 \\
0
\end{bmatrix}.
$$

The equation is

$$
2x + y = 0.
$$

Hence

$$
y = -2x.
$$

So

$$
E_2 =
\operatorname{span}
\left\{
\begin{bmatrix}
1 \\
-2
\end{bmatrix}
\right\}.
$$
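
The whole computation can be reproduced with np.linalg.eig, which returns all eigenvalues together with one unit-length eigenvector each. A sketch (NumPy normalizes its eigenvectors, so expect scalar multiples of \([1,1]\) and \([1,-2]\)):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)  # [5. 2.] (the order is not guaranteed)

for lam, v in zip(eigvals, eigvecs.T):  # eigenvectors are the columns
    assert np.allclose(A @ v, lam * v)
    print(lam, v)  # v is a unit multiple of [1, 1] or [1, -2]
```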

## 60.6 Checking an Eigenvector

A proposed eigenvector should be checked directly.

For

$$
A =
\begin{bmatrix}
4 & 1 \\
2 & 3
\end{bmatrix},
$$

check

$$
v =
\begin{bmatrix}
1 \\
1
\end{bmatrix}.
$$

Then

$$
Av =
\begin{bmatrix}
4 & 1 \\
2 & 3
\end{bmatrix}
\begin{bmatrix}
1 \\
1
\end{bmatrix} =
\begin{bmatrix}
5 \\
5
\end{bmatrix} =
5
\begin{bmatrix}
1 \\
1
\end{bmatrix}.
$$

Thus \(v\) is an eigenvector with eigenvalue \(5\).

Now check

$$
w =
\begin{bmatrix}
1 \\
-2
\end{bmatrix}.
$$

Then

$$
Aw =
\begin{bmatrix}
4 & 1 \\
2 & 3
\end{bmatrix}
\begin{bmatrix}
1 \\
-2
\end{bmatrix} =
\begin{bmatrix}
2 \\
-4
\end{bmatrix} =
2
\begin{bmatrix}
1 \\
-2
\end{bmatrix}.
$$

Thus \(w\) is an eigenvector with eigenvalue \(2\).
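
In code, this check is a single comparison of \(Av\) with \(\lambda v\) up to floating-point tolerance. A minimal sketch (the helper name is hypothetical):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

def is_eigenpair(A, lam, v, tol=1e-10):
    """Check whether Av = lam * v holds to within tol."""
    return np.allclose(A @ v, lam * v, atol=tol)

print(is_eigenpair(A, 5.0, np.array([1.0, 1.0])))   # True
print(is_eigenpair(A, 2.0, np.array([1.0, -2.0])))  # True
print(is_eigenpair(A, 5.0, np.array([1.0, -2.0])))  # False: wrong pairing
```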

## 60.7 Eigenvectors and Linear Independence

Eigenvectors belonging to distinct eigenvalues are linearly independent.

For example, if \(v_1\) and \(v_2\) are eigenvectors with distinct eigenvalues \(\lambda_1\) and \(\lambda_2\), then \(v_1\) and \(v_2\) cannot lie on the same line.

More generally, if

$$
v_1, v_2, \ldots, v_k
$$

are eigenvectors corresponding to distinct eigenvalues

$$
\lambda_1, \lambda_2, \ldots, \lambda_k,
$$

then the list

$$
v_1, v_2, \ldots, v_k
$$

is linearly independent.

This theorem is one of the main reasons eigenvectors are useful. A matrix with enough independent eigenvectors can be described in a simpler coordinate system.
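
Independence is easy to confirm numerically: stack the eigenvectors as columns and check that the matrix has full rank. A sketch for the two eigenvectors found in Section 60.5:

```python
import numpy as np

v1 = np.array([1.0, 1.0])   # eigenvalue 5
v2 = np.array([1.0, -2.0])  # eigenvalue 2

V = np.column_stack([v1, v2])
# Rank 2 (full rank) means v1 and v2 are linearly independent.
print(np.linalg.matrix_rank(V))  # 2
```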

## 60.8 Proof for Two Eigenvectors

Let \(v_1\) and \(v_2\) be eigenvectors of \(A\) with distinct eigenvalues \(\lambda_1\) and \(\lambda_2\).

Suppose

$$
c_1v_1 + c_2v_2 = 0.
$$

Apply \(A\) to both sides:

$$
A(c_1v_1 + c_2v_2) = A0.
$$

Using linearity,

$$
c_1Av_1 + c_2Av_2 = 0.
$$

Since \(Av_1 = \lambda_1v_1\) and \(Av_2 = \lambda_2v_2\),

$$
c_1\lambda_1v_1 + c_2\lambda_2v_2 = 0.
$$

Now multiply the original equation by \(\lambda_1\):

$$
c_1\lambda_1v_1 + c_2\lambda_1v_2 = 0.
$$

Subtract this equation from the previous one:

$$
c_2(\lambda_2-\lambda_1)v_2 = 0.
$$

Since \(v_2 \neq 0\) and \(\lambda_2 \neq \lambda_1\), we must have

$$
c_2 = 0.
$$

Then the original equation gives

$$
c_1v_1 = 0.
$$

Since \(v_1 \neq 0\), we also have

$$
c_1 = 0.
$$

Thus \(v_1\) and \(v_2\) are linearly independent.

## 60.9 Eigenvectors as a Basis

If an \(n \times n\) matrix has \(n\) linearly independent eigenvectors, then those eigenvectors form a basis of the space.

In that basis, the matrix acts diagonally.

Suppose

$$
Av_i = \lambda_i v_i
$$

for

$$
i = 1,2,\ldots,n.
$$

Any vector \(x\) can be written as

$$
x = c_1v_1 + c_2v_2 + \cdots + c_nv_n.
$$

Then

$$
Ax =
A(c_1v_1 + c_2v_2 + \cdots + c_nv_n).
$$

By linearity,

$$
Ax =
c_1Av_1 + c_2Av_2 + \cdots + c_nAv_n.
$$

Using the eigenvector equations,

$$
Ax =
c_1\lambda_1v_1
+
c_2\lambda_2v_2
+
\cdots
+
c_n\lambda_nv_n.
$$

Thus, in an eigenvector basis, the transformation simply rescales each coordinate.

This is the idea behind diagonalization.
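
In code, applying \(A\) in an eigenvector basis means solving for the coordinates \(c_i\), scaling each by its eigenvalue, and recombining. A sketch, assuming the eigenvectors form a basis:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
P = np.array([[1.0, 1.0],    # columns are the eigenvectors [1,1] and [1,-2]
              [1.0, -2.0]])
lams = np.array([5.0, 2.0])  # matching eigenvalues

x = np.array([3.0, 0.0])
c = np.linalg.solve(P, x)      # coordinates of x in the eigenbasis
Ax = P @ (lams * c)            # rescale each coordinate, then recombine
assert np.allclose(Ax, A @ x)  # agrees with direct multiplication
```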

## 60.10 Eigenvectors and Diagonalization

Let

$$
P =
\begin{bmatrix}
| & | & & | \\
v_1 & v_2 & \cdots & v_n \\
| & | & & |
\end{bmatrix}
$$

be the matrix whose columns are eigenvectors of \(A\).

Let

$$
D =
\begin{bmatrix}
\lambda_1 & 0 & \cdots & 0 \\
0 & \lambda_2 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \lambda_n
\end{bmatrix}.
$$

Then

$$
AP = PD.
$$

If the eigenvectors are linearly independent, then \(P\) is invertible. Hence

$$
A = PDP^{-1}.
$$

This representation decomposes \(A\) into a change of basis, a diagonal scaling, and a change back to the original coordinates.
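
For the running example, the factorization can be assembled and verified directly (a sketch):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
P = np.array([[1.0, 1.0],   # eigenvector columns
              [1.0, -2.0]])
D = np.diag([5.0, 2.0])     # matching eigenvalues on the diagonal

assert np.allclose(A @ P, P @ D)                 # AP = PD
assert np.allclose(A, P @ D @ np.linalg.inv(P))  # A = P D P^{-1}
```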

## 60.11 Repeated Eigenvalues

A repeated eigenvalue may have more than one independent eigenvector, or it may have only one.

Consider

$$
A =
\begin{bmatrix}
2 & 0 \\
0 & 2
\end{bmatrix}.
$$

Every nonzero vector is an eigenvector with eigenvalue \(2\), since

$$
Av = 2v.
$$

The eigenspace is all of \(\mathbb{R}^2\).

Now consider

$$
B =
\begin{bmatrix}
2 & 1 \\
0 & 2
\end{bmatrix}.
$$

Again, the only eigenvalue is \(2\). But

$$
B - 2I =
\begin{bmatrix}
0 & 1 \\
0 & 0
\end{bmatrix}.
$$

Solving

$$
(B-2I)v = 0
$$

gives

$$
y = 0.
$$

Thus the eigenspace is only

$$
\operatorname{span}
\left\{
\begin{bmatrix}
1 \\
0
\end{bmatrix}
\right\}.
$$

Both matrices have the same repeated eigenvalue. Their eigenvectors behave differently.
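
The difference shows up as the dimension of the eigenspace, computed here as the null space of each matrix minus \(2I\) (a sketch using scipy.linalg.null_space):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[2.0, 0.0],
              [0.0, 2.0]])
B = np.array([[2.0, 1.0],
              [0.0, 2.0]])

# Number of basis vectors for the eigenspace of lambda = 2 in each case.
print(null_space(A - 2.0 * np.eye(2)).shape[1])  # 2: all of R^2
print(null_space(B - 2.0 * np.eye(2)).shape[1])  # 1: only span{[1, 0]}
```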

## 60.12 Defective Matrices

A matrix is called defective if it does not have enough linearly independent eigenvectors to form a basis.

The matrix

$$
B =
\begin{bmatrix}
2 & 1 \\
0 & 2
\end{bmatrix}
$$

is defective. It is a \(2 \times 2\) matrix, but it has only one independent eigenvector.

Defective matrices cannot be diagonalized. They require a more general form, such as Jordan canonical form.

Defectiveness means that some eigenvalue has geometric multiplicity (the dimension of its eigenspace) strictly smaller than its algebraic multiplicity (its multiplicity as a root of the characteristic polynomial).
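
One practical test for defectiveness, sketched below, sums the eigenspace dimensions over the distinct eigenvalues and compares the total to \(n\). The helper name is hypothetical, and rounding eigenvalues to group numerically repeated ones is a heuristic, not a robust algorithm:

```python
import numpy as np
from scipy.linalg import null_space

def is_defective(M, decimals=8):
    """True if the eigenspace dimensions of M sum to less than n."""
    n = M.shape[0]
    lams = np.unique(np.round(np.linalg.eigvals(M), decimals))
    total = sum(null_space(M - lam * np.eye(n)).shape[1] for lam in lams)
    return total < n

B = np.array([[2.0, 1.0],
              [0.0, 2.0]])
print(is_defective(B))  # True: one eigenvector short of a basis
```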

## 60.13 Symmetric Matrices

Real symmetric matrices have especially good eigenvector behavior.

If

$$
A^T = A,
$$

then eigenvectors corresponding to distinct eigenvalues are orthogonal.

Moreover, a real symmetric matrix has an orthonormal basis of eigenvectors.

This is the content of the spectral theorem for real symmetric matrices.

For example,

$$
A =
\begin{bmatrix}
2 & 1 \\
1 & 2
\end{bmatrix}
$$

has eigenvectors

$$
\begin{bmatrix}
1 \\
1
\end{bmatrix}
\qquad
\text{and}
\qquad
\begin{bmatrix}
1 \\
-1
\end{bmatrix}.
$$

Their dot product is

$$
1 \cdot 1 + 1 \cdot (-1) = 0.
$$

They are orthogonal.
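
For symmetric input, np.linalg.eigh returns an orthonormal set of eigenvectors directly (a sketch):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, Q = np.linalg.eigh(A)  # eigh is specialized to symmetric matrices
print(eigvals)                  # [1. 3.], in ascending order

# The columns of Q are orthonormal eigenvectors: Q^T Q = I.
assert np.allclose(Q.T @ Q, np.eye(2))
assert np.isclose(Q[:, 0] @ Q[:, 1], 0.0)  # distinct eigenvalues: orthogonal
```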

## 60.14 Left and Right Eigenvectors

For a matrix \(A\), a right eigenvector satisfies

$$
Av = \lambda v.
$$

A left eigenvector satisfies

$$
w^T A = \lambda w^T.
$$

Equivalently,

$$
A^T w = \lambda w.
$$

For symmetric matrices, left and right eigenvectors coincide. For general matrices, they may differ.

Left eigenvectors are important in Markov chains, sensitivity analysis, nonnormal matrices, and numerical algorithms.
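
Left eigenvectors can be computed as the right eigenvectors of \(A^T\), exactly as the identity above suggests. A minimal sketch:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, W = np.linalg.eig(A.T)  # columns of W are left eigenvectors of A
for lam, w in zip(eigvals, W.T):
    assert np.allclose(w @ A, lam * w)  # w^T A = lam * w^T
print(eigvals)  # same eigenvalues as A itself
```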

## 60.15 Normalizing Eigenvectors

Because any nonzero scalar multiple of an eigenvector is also an eigenvector, it is often useful to choose a standard length.

For real vectors, one common normalization is

$$
\|v\| = 1.
$$

If

$$
v \neq 0,
$$

then the normalized eigenvector is

$$
u = \frac{v}{\|v\|}.
$$

For example,

$$
v =
\begin{bmatrix}
3 \\
4
\end{bmatrix}
$$

has norm

$$
\|v\| = 5.
$$

The normalized vector is

$$
u =
\begin{bmatrix}
3/5 \\
4/5
\end{bmatrix}.
$$

Both \(v\) and \(u\) point in the same direction. If \(v\) is an eigenvector, then \(u\) is also an eigenvector.
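
The normalization step in code (a sketch):

```python
import numpy as np

v = np.array([3.0, 4.0])
u = v / np.linalg.norm(v)  # divide by ||v|| = 5
print(u)                   # [0.6 0.8], i.e. [3/5, 4/5]
```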

## 60.16 Complex Eigenvectors

When a real matrix has a non-real eigenvalue, the corresponding eigenvectors are necessarily non-real: if \(v\) were real and nonzero, \(Av\) would be real while \(\lambda v\) would not.

Consider

$$
A =
\begin{bmatrix}
0 & -1 \\
1 & 0
\end{bmatrix}.
$$

This matrix rotates the plane by \(90^\circ\). Its eigenvalues are

$$
i
\qquad
\text{and}
\qquad
-i.
$$

For \(\lambda = i\), solve

$$
(A - iI)v = 0.
$$

That is,

$$
\begin{bmatrix}
-i & -1 \\
1 & -i
\end{bmatrix}
\begin{bmatrix}
x \\
y
\end{bmatrix} =
0.
$$

The first equation gives

$$
-ix - y = 0,
$$

so

$$
y = -ix.
$$

Taking \(x = 1\), one eigenvector is

$$
\begin{bmatrix}
1 \\
-i
\end{bmatrix}.
$$

Although the matrix has real entries, its eigenvectors belong to \(\mathbb{C}^2\).
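
NumPy handles this automatically: applied to the rotation matrix, np.linalg.eig returns complex eigenvalues and complex eigenvectors (a sketch):

```python
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0, 0.0]])  # rotation of the plane by 90 degrees

eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)  # [0.+1.j 0.-1.j], i.e. i and -i

v = eigvecs[:, 0]  # a complex unit multiple of [1, -i]
assert np.allclose(A @ v, eigvals[0] * v)
```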

## 60.17 Eigenvectors in Applications

Eigenvectors identify stable directions, dominant modes, and preferred coordinate systems.

| Area | Meaning of eigenvectors |
|---|---|
| Differential equations | Modes of exponential growth or decay |
| Mechanics | Modes of vibration |
| Markov chains | Long-term distributions and transient modes |
| Graph theory | Structural directions of graphs |
| Statistics | Principal component directions |
| Machine learning | Low-dimensional feature directions |
| Quantum mechanics | States with definite measured values |
| Numerical analysis | Directions controlling convergence |

In many applications, the eigenvalues give scale or frequency, while the eigenvectors give shape or direction.

## 60.18 Summary

An eigenvector is a nonzero vector whose direction is preserved by a linear transformation.

For a matrix \(A\), an eigenvector \(v\) satisfies

$$
Av = \lambda v.
$$

The scalar \(\lambda\) is the corresponding eigenvalue.

For each eigenvalue \(\lambda\), the eigenspace is

$$
E_\lambda = \ker(A-\lambda I).
$$

Eigenvectors belonging to distinct eigenvalues are linearly independent. If a matrix has enough independent eigenvectors to form a basis, then it can be diagonalized.

Eigenvectors reveal the directions in which a linear transformation acts in the simplest possible way.
