# Chapter 61. Characteristic Polynomial

The characteristic polynomial is the polynomial attached to a square matrix whose roots are the eigenvalues of that matrix.

It turns the eigenvalue problem into a polynomial problem. Instead of looking directly for nonzero vectors \(v\) satisfying

$$
Av = \lambda v,
$$

we look for scalars \(\lambda\) that make a certain determinant equal to zero.

For an \(n \times n\) matrix \(A\), the characteristic polynomial is commonly written as

$$
p_A(\lambda) = \det(A - \lambda I).
$$

Some books use

$$
p_A(\lambda) = \det(\lambda I - A).
$$

These two conventions differ only by the sign factor \((-1)^n\). They have the same roots, so they give the same eigenvalues. The characteristic polynomial has degree \(n\), and its roots are exactly the eigenvalues of \(A\).
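
Numerically, these coefficients are easy to inspect. NumPy's `np.poly` follows the monic convention \(\det(\lambda I - A)\); a minimal sketch for a small sample matrix:

```python
import numpy as np

# A small sample matrix (chosen for illustration).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.poly returns the coefficients of the monic convention
# det(lambda*I - A), listed from the highest power of lambda down.
coeffs = np.poly(A)   # lambda^2 - 4*lambda + 3
```
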

## 61.1 From Eigenvectors to a Polynomial

Start with the eigenvalue equation

$$
Av = \lambda v.
$$

Move all terms to one side:

$$
Av - \lambda v = 0.
$$

Since \(\lambda v = \lambda I v\), we have

$$
(A - \lambda I)v = 0.
$$

This is a homogeneous system. It has a nonzero solution exactly when the matrix \(A-\lambda I\) is singular. A square matrix is singular exactly when its determinant is zero. Therefore,

$$
\det(A-\lambda I)=0.
$$

This equation is called the characteristic equation. Its left side is the characteristic polynomial.

## 61.2 Definition

Let \(A\) be an \(n \times n\) matrix over a field \(F\). The characteristic polynomial of \(A\) is

$$
p_A(\lambda)=\det(A-\lambda I).
$$

Here \(\lambda\) is treated as an indeterminate. The entries of \(A-\lambda I\) are polynomials in \(\lambda\), and taking the determinant produces a single polynomial in \(\lambda\).

The eigenvalues of \(A\) are precisely the roots of this polynomial:

$$
p_A(\lambda)=0.
$$

Thus the characteristic polynomial is the algebraic object that encodes the eigenvalues.

## 61.3 A Two by Two Formula

Let

$$
A =
\begin{bmatrix}
a & b \\
c & d
\end{bmatrix}.
$$

Then

$$
A-\lambda I =
\begin{bmatrix}
a-\lambda & b \\
c & d-\lambda
\end{bmatrix}.
$$

The characteristic polynomial is

$$
p_A(\lambda) =
\det
\begin{bmatrix}
a-\lambda & b \\
c & d-\lambda
\end{bmatrix}.
$$

Compute the determinant:

$$
p_A(\lambda) =
(a-\lambda)(d-\lambda)-bc.
$$

Expanding gives

$$
p_A(\lambda) =
\lambda^2-(a+d)\lambda+(ad-bc).
$$

Since

$$
\operatorname{tr}(A)=a+d
$$

and

$$
\det(A)=ad-bc,
$$

we get

$$
p_A(\lambda) =
\lambda^2-\operatorname{tr}(A)\lambda+\det(A).
$$

This formula is useful for quick computations with \(2 \times 2\) matrices.
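
As a sanity check of this formula, one might compare the coefficients returned by `np.poly` against the trace and determinant of a random \(2 \times 2\) matrix (a numerical sketch, not part of the formal development):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 2))

# Monic convention: lambda^2 - tr(A)*lambda + det(A).
c = np.poly(A)
assert np.isclose(c[1], -np.trace(A))
assert np.isclose(c[2], np.linalg.det(A))
```
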

## 61.4 Example

Let

$$
A =
\begin{bmatrix}
4 & 1 \\
2 & 3
\end{bmatrix}.
$$

Then

$$
A-\lambda I =
\begin{bmatrix}
4-\lambda & 1 \\
2 & 3-\lambda
\end{bmatrix}.
$$

Hence

$$
p_A(\lambda) =
(4-\lambda)(3-\lambda)-2.
$$

Expand:

$$
p_A(\lambda) =
12-7\lambda+\lambda^2-2.
$$

Therefore,

$$
p_A(\lambda)=\lambda^2-7\lambda+10.
$$

Factor:

$$
p_A(\lambda)=(\lambda-5)(\lambda-2).
$$

The roots are

$$
\lambda=5
\qquad
\text{and}
\qquad
\lambda=2.
$$

Thus the eigenvalues of \(A\) are \(5\) and \(2\).
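
A quick numerical check of this example, assuming NumPy is available:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Roots of lambda^2 - 7*lambda + 10 are 2 and 5.
lams = np.sort(np.linalg.eigvals(A).real)
```
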

## 61.5 Characteristic Equation

The equation

$$
p_A(\lambda)=0
$$

is called the characteristic equation of \(A\).

For the previous matrix, the characteristic equation is

$$
\lambda^2-7\lambda+10=0.
$$

Solving it gives the eigenvalues.

The characteristic equation gives the eigenvalues, but not the eigenvectors. After finding an eigenvalue \(\lambda\), one finds the eigenvectors by solving

$$
(A-\lambda I)v=0.
$$

Thus the eigenvalue computation has two stages:

| Stage | Operation | Output |
|---|---|---|
| 1 | Solve \(\det(A-\lambda I)=0\) | Eigenvalues |
| 2 | Solve \((A-\lambda I)v=0\) | Eigenvectors |
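
The two stages above can be sketched numerically; here the kernel in stage 2 is extracted from the SVD, one reasonable choice among several:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Stage 1: eigenvalues, the roots of det(A - lambda*I) = 0.
lams = np.linalg.eigvals(A)

# Stage 2: for each eigenvalue, an eigenvector spanning ker(A - lambda*I).
# The right singular vector for the ~zero singular value spans the kernel.
for lam in lams:
    _, s, Vt = np.linalg.svd(A - lam * np.eye(2))
    v = Vt[-1]
    assert np.allclose(A @ v, lam * v)
```
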

## 61.6 Degree

If \(A\) is an \(n \times n\) matrix, then \(p_A(\lambda)\) has degree \(n\).

This follows from the determinant expansion. The diagonal entries of \(A-\lambda I\) have the form

$$
a_{ii}-\lambda.
$$

The product of all diagonal terms contributes a term of degree \(n\):

$$
(-\lambda)^n.
$$

Every other term in the determinant expansion omits at least two of the diagonal entries, so its degree is at most \(n-2\). Hence the characteristic polynomial has degree \(n\).

With the convention

$$
p_A(\lambda)=\det(A-\lambda I),
$$

the leading term is

$$
(-1)^n\lambda^n.
$$

With the convention

$$
q_A(\lambda)=\det(\lambda I-A),
$$

the leading term is

$$
\lambda^n.
$$

The roots are the same under both conventions.

## 61.7 Constant Term

The constant term of

$$
p_A(\lambda)=\det(A-\lambda I)
$$

is found by setting \(\lambda=0\):

$$
p_A(0)=\det(A).
$$

Therefore, the constant term of the characteristic polynomial is the determinant of \(A\).

This gives an important relation. If the eigenvalues are

$$
\lambda_1,\lambda_2,\ldots,\lambda_n,
$$

counted with algebraic multiplicity, then

$$
\det(A)=\lambda_1\lambda_2\cdots\lambda_n.
$$

The determinant is the product of the eigenvalues.
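
This relation is easy to test numerically on a random matrix (a sketch; complex eigenvalues of a real matrix come in conjugate pairs, so the product has negligible imaginary part):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))

lams = np.linalg.eigvals(A)
# The product of the eigenvalues (with multiplicity) equals det(A).
prod = np.prod(lams)
assert np.isclose(prod.real, np.linalg.det(A))
```
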

## 61.8 Trace Term

For an \(n \times n\) matrix, the coefficient of the next-highest power of \(\lambda\) is determined by the trace.

With the convention

$$
q_A(\lambda)=\det(\lambda I-A),
$$

the characteristic polynomial has the form

$$
q_A(\lambda) =
\lambda^n-\operatorname{tr}(A)\lambda^{n-1}+\cdots+(-1)^n\det(A).
$$

Thus, if the eigenvalues are

$$
\lambda_1,\lambda_2,\ldots,\lambda_n,
$$

then

$$
\operatorname{tr}(A)=\lambda_1+\lambda_2+\cdots+\lambda_n.
$$

The trace is the sum of the eigenvalues, counted with algebraic multiplicity. The characteristic polynomial encodes determinant and trace among its coefficients.
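
A similar numerical check for the trace, again a sketch with NumPy:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 5))

lams = np.linalg.eigvals(A)
# The sum of the eigenvalues (with multiplicity) equals tr(A).
assert np.isclose(np.sum(lams).real, np.trace(A))

# Equivalently, the lambda^{n-1} coefficient of det(lambda*I - A) is -tr(A).
assert np.isclose(np.poly(A)[1], -np.trace(A))
```
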

## 61.9 Algebraic Multiplicity

An eigenvalue may occur more than once as a root of the characteristic polynomial.

The number of times an eigenvalue appears as a root is called its algebraic multiplicity.

For example,

$$
p_A(\lambda)=(\lambda-2)^3(\lambda+1)
$$

has eigenvalues

$$
2
\qquad
\text{and}
\qquad
-1.
$$

The eigenvalue \(2\) has algebraic multiplicity \(3\). The eigenvalue \(-1\) has algebraic multiplicity \(1\).

The sum of all algebraic multiplicities is the degree of the characteristic polynomial. For an \(n \times n\) matrix, this sum is \(n\).

## 61.10 Geometric Multiplicity Compared

The algebraic multiplicity of an eigenvalue comes from the characteristic polynomial.

The geometric multiplicity comes from the eigenspace:

$$
E_\lambda=\ker(A-\lambda I).
$$

The geometric multiplicity is

$$
\dim E_\lambda.
$$

For every eigenvalue,

$$
1
\leq
\text{geometric multiplicity}
\leq
\text{algebraic multiplicity}.
$$

The characteristic polynomial tells how many times an eigenvalue appears algebraically. The eigenspace tells how many independent eigenvectors belong to it.

These two numbers need not be equal.

## 61.11 Repeated Root Example

Consider

$$
A =
\begin{bmatrix}
2 & 1 \\
0 & 2
\end{bmatrix}.
$$

Then

$$
A-\lambda I =
\begin{bmatrix}
2-\lambda & 1 \\
0 & 2-\lambda
\end{bmatrix}.
$$

The determinant is

$$
p_A(\lambda)=(2-\lambda)^2.
$$

Thus \(\lambda=2\) is an eigenvalue with algebraic multiplicity \(2\).

Now compute the eigenspace:

$$
A-2I =
\begin{bmatrix}
0 & 1 \\
0 & 0
\end{bmatrix}.
$$

Solving

$$
(A-2I)v=0
$$

gives, writing \(v=\begin{bmatrix}x\\ y\end{bmatrix}\), the single condition

$$
y=0,
$$

with \(x\) free.

Hence

$$
E_2=
\operatorname{span}
\left\{
\begin{bmatrix}
1 \\
0
\end{bmatrix}
\right\}.
$$

The geometric multiplicity is \(1\).

The characteristic polynomial has a repeated root, but the matrix has only one independent eigenvector.
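
This gap between the two multiplicities can be checked numerically; the geometric multiplicity is computed here via rank-nullity, \(\dim\ker(A-2I) = 2 - \operatorname{rank}(A-2I)\):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 2.0]])

# lambda = 2 has algebraic multiplicity 2 ...
assert np.allclose(np.linalg.eigvals(A), [2.0, 2.0])

# ... but geometric multiplicity 1: rank(A - 2I) = 1,
# so dim ker(A - 2I) = 2 - 1 = 1.
geo_mult = 2 - np.linalg.matrix_rank(A - 2 * np.eye(2))
assert geo_mult == 1
```
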

## 61.12 Similar Matrices

Two square matrices \(A\) and \(B\) are similar if there is an invertible matrix \(P\) such that

$$
B=P^{-1}AP.
$$

Similar matrices represent the same linear transformation in different bases.

They have the same characteristic polynomial.

Indeed,

$$
B-\lambda I =
P^{-1}AP-\lambda I.
$$

Since

$$
I=P^{-1}IP,
$$

we have

$$
B-\lambda I =
P^{-1}(A-\lambda I)P.
$$

Taking determinants,

$$
\det(B-\lambda I) =
\det(P^{-1})\det(A-\lambda I)\det(P).
$$

Since

$$
\det(P^{-1})\det(P)=1,
$$

it follows that

$$
\det(B-\lambda I)=\det(A-\lambda I).
$$

Thus similar matrices have the same characteristic polynomial and the same eigenvalues.
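
A numerical illustration of this invariance (a sketch; the random \(P\) below is invertible with probability one, which the code assumes rather than verifies):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((3, 3))
P = rng.standard_normal((3, 3))          # generically invertible
B = np.linalg.inv(P) @ A @ P

# Similar matrices share the characteristic polynomial.
assert np.allclose(np.poly(A), np.poly(B))
```
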

## 61.13 Characteristic Polynomial of Diagonal Matrices

Let

$$
D =
\begin{bmatrix}
d_1 & 0 & \cdots & 0 \\
0 & d_2 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & d_n
\end{bmatrix}.
$$

Then

$$
D-\lambda I =
\begin{bmatrix}
d_1-\lambda & 0 & \cdots & 0 \\
0 & d_2-\lambda & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & d_n-\lambda
\end{bmatrix}.
$$

The determinant of a diagonal matrix is the product of its diagonal entries, so

$$
p_D(\lambda) =
(d_1-\lambda)(d_2-\lambda)\cdots(d_n-\lambda).
$$

Therefore the eigenvalues are exactly the diagonal entries.

## 61.14 Characteristic Polynomial of Triangular Matrices

If \(A\) is upper triangular or lower triangular, then \(A-\lambda I\) is also triangular.

For a triangular matrix, the determinant is the product of the diagonal entries.

Thus, if

$$
A =
\begin{bmatrix}
a_{11} & * & \cdots & * \\
0 & a_{22} & \cdots & * \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & a_{nn}
\end{bmatrix},
$$

then

$$
p_A(\lambda) =
(a_{11}-\lambda)(a_{22}-\lambda)\cdots(a_{nn}-\lambda).
$$

The eigenvalues of a triangular matrix are its diagonal entries.

This fact is central in numerical methods, especially the QR algorithm and Schur decomposition.
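
A small numerical check that the eigenvalues of a triangular matrix are its diagonal entries:

```python
import numpy as np

# Upper triangular matrix: the eigenvalues are the diagonal entries.
A = np.array([[1.0, 5.0, -3.0],
              [0.0, 4.0,  2.0],
              [0.0, 0.0, -2.0]])
lams = np.sort(np.linalg.eigvals(A))
```
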

## 61.15 Characteristic Polynomial and Diagonalization

If \(A\) is diagonalizable, then

$$
A=PDP^{-1},
$$

where \(D\) is diagonal.

Since \(A\) and \(D\) are similar, they have the same characteristic polynomial.

If

$$
D=
\begin{bmatrix}
\lambda_1 & 0 & \cdots & 0 \\
0 & \lambda_2 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \lambda_n
\end{bmatrix},
$$

then

$$
p_A(t)=p_D(t) =
(\lambda_1-t)(\lambda_2-t)\cdots(\lambda_n-t)
$$

under the convention \(p_A(t)=\det(A-tI)\).

Thus diagonalization makes the characteristic polynomial transparent.

## 61.16 Complex Roots

Over the real numbers, a characteristic polynomial may have no real roots.

For example,

$$
A =
\begin{bmatrix}
0 & -1 \\
1 & 0
\end{bmatrix}
$$

has

$$
p_A(\lambda) =
\det
\begin{bmatrix}
-\lambda & -1 \\
1 & -\lambda
\end{bmatrix} =
\lambda^2+1.
$$

There are no real roots.

Over the complex numbers,

$$
\lambda^2+1=0
$$

has roots

$$
i
\qquad
\text{and}
\qquad
-i.
$$

Thus the matrix has complex eigenvalues.

For this reason, spectral theory is often developed over \(\mathbb{C}\). Over the complex numbers, every degree \(n\) characteristic polynomial has exactly \(n\) roots counted with multiplicity.
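
A numerical check of this example; NumPy returns the complex eigenvalues directly:

```python
import numpy as np

# 90-degree rotation matrix: no real eigenvalues.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])
lams = np.sort_complex(np.linalg.eigvals(A))   # [-1j, 1j]
```
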

## 61.17 Characteristic Polynomial and Invertibility

A square matrix \(A\) is invertible exactly when \(0\) is not an eigenvalue.

Using the characteristic polynomial,

$$
0 \text{ is an eigenvalue}
$$

exactly when

$$
p_A(0)=0.
$$

But

$$
p_A(0)=\det(A).
$$

Therefore,

$$
A \text{ is invertible}
$$

exactly when

$$
\det(A)\neq 0.
$$

This connects three equivalent facts:

| Statement | Meaning |
|---|---|
| \(A\) is invertible | The transformation can be undone |
| \(\det(A)\neq 0\) | The matrix is nonsingular |
| \(0\) is not an eigenvalue | No nonzero vector is sent to zero |
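
These equivalences can be illustrated on a singular matrix (a sketch; the second row below is twice the first, so \(\det(A)=0\)):

```python
import numpy as np

# Singular matrix: 0 is an eigenvalue and det(A) = 0.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
lams = np.linalg.eigvals(A)
assert np.isclose(np.min(np.abs(lams)), 0.0)
assert np.isclose(np.linalg.det(A), 0.0)
```
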

## 61.18 Characteristic Polynomial of a Linear Transformation

The characteristic polynomial can be defined for a linear transformation

$$
T:V\to V
$$

on a finite-dimensional vector space.

Choose a basis of \(V\), and let \(A\) be the matrix of \(T\) in that basis. Define

$$
p_T(\lambda)=p_A(\lambda).
$$

This definition is well-defined because changing the basis replaces \(A\) by a similar matrix, and similar matrices have the same characteristic polynomial.

Thus the characteristic polynomial belongs to the linear transformation itself, not merely to a particular matrix representation.

## 61.19 What the Characteristic Polynomial Alone Does Not Determine

The characteristic polynomial gives the eigenvalues and their algebraic multiplicities. It also encodes determinant and trace.

However, it does not by itself determine the matrix.

Different matrices can have the same characteristic polynomial.

For example,

$$
A =
\begin{bmatrix}
2 & 0 \\
0 & 2
\end{bmatrix}
$$

and

$$
B =
\begin{bmatrix}
2 & 1 \\
0 & 2
\end{bmatrix}
$$

both have characteristic polynomial

$$
(2-\lambda)^2.
$$

But \(A\) has two independent eigenvectors, while \(B\) has only one.

To understand the full structure, one also studies eigenspaces, minimal polynomials, Jordan form, and invariant subspaces.
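
A numerical illustration of this pair of matrices, comparing characteristic polynomials and eigenspace dimensions:

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 2.0]])
B = np.array([[2.0, 1.0],
              [0.0, 2.0]])

# Same characteristic polynomial ...
assert np.allclose(np.poly(A), np.poly(B))

# ... but different eigenspace dimensions: dim ker(M - 2I).
dim_A = 2 - np.linalg.matrix_rank(A - 2 * np.eye(2))
dim_B = 2 - np.linalg.matrix_rank(B - 2 * np.eye(2))
assert (dim_A, dim_B) == (2, 1)
```
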

## 61.20 Summary

The characteristic polynomial of a square matrix \(A\) is

$$
p_A(\lambda)=\det(A-\lambda I).
$$

Its roots are the eigenvalues of \(A\). Its degree is the size of the matrix. Its constant term is \(\det(A)\), and its next-highest coefficient is governed by \(\operatorname{tr}(A)\).

The characteristic polynomial translates the eigenvalue problem into a polynomial equation. It is one of the main bridges between matrices, determinants, eigenvalues, diagonalization, and spectral theory.
