# Chapter 64. Spectral Theorem

The spectral theorem describes the transformations that take the simplest possible form: those that become diagonal in a well-chosen orthonormal basis.

For a general matrix, diagonalization may fail. A matrix may lack enough eigenvectors. Its eigenvalues may be complex. Its eigenvectors may fail to be orthogonal. The spectral theorem identifies an important class of matrices for which these difficulties disappear.

The theorem says that every real symmetric matrix has an orthonormal basis of eigenvectors. Equivalently, every real symmetric matrix can be diagonalized by an orthogonal matrix. This is the finite-dimensional real form of the spectral theorem.

## 64.1 Symmetric Matrices

A real square matrix \(A\) is symmetric if

$$
A^T=A.
$$

This means that the entry in row \(i\), column \(j\), equals the entry in row \(j\), column \(i\):

$$
a_{ij}=a_{ji}.
$$

For example,

$$
A=
\begin{bmatrix}
2 & 1 \\
1 & 3
\end{bmatrix}
$$

is symmetric.

The matrix

$$
B=
\begin{bmatrix}
2 & 4 \\
1 & 3
\end{bmatrix}
$$

is not symmetric, because

$$
b_{12}=4
$$

but

$$
b_{21}=1.
$$

Symmetric matrices occur naturally in geometry, quadratic forms, least squares, optimization, mechanics, statistics, and graph theory.
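
In code, symmetry is a one-line check. A minimal NumPy sketch using the two matrices above:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
B = np.array([[2.0, 4.0],
              [1.0, 3.0]])

# A matrix is symmetric when it equals its own transpose.
print(np.allclose(A, A.T))  # True
print(np.allclose(B, B.T))  # False: b_12 = 4 but b_21 = 1
```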

## 64.2 Statement of the Real Spectral Theorem

Let \(A\) be a real \(n\times n\) symmetric matrix.

Then there exists an orthonormal basis of \(\mathbb{R}^n\) consisting of eigenvectors of \(A\).

Equivalently, there exists an orthogonal matrix \(Q\) and a real diagonal matrix \(D\) such that

$$
A=QDQ^T.
$$

Here

$$
Q^TQ=I,
$$

so

$$
Q^{-1}=Q^T.
$$

The columns of \(Q\) are orthonormal eigenvectors of \(A\). The diagonal entries of \(D\) are the corresponding eigenvalues.

This is stronger than ordinary diagonalization. Ordinary diagonalization gives

$$
A=PDP^{-1}.
$$

The spectral theorem gives

$$
A=QDQ^T,
$$

with \(Q\) orthogonal.
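
NumPy's `numpy.linalg.eigh`, which is specialized to symmetric and Hermitian matrices, computes exactly this factorization. A minimal sketch:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# eigh returns real eigenvalues (in ascending order) and
# orthonormal eigenvectors as the columns of Q.
eigvals, Q = np.linalg.eigh(A)
D = np.diag(eigvals)

print(np.allclose(Q.T @ Q, np.eye(2)))  # Q is orthogonal
print(np.allclose(Q @ D @ Q.T, A))      # A = Q D Q^T
```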

## 64.3 Why This Is Special

A general matrix may not be diagonalizable.

For example,

$$
\begin{bmatrix}
2 & 1 \\
0 & 2
\end{bmatrix}
$$

has only one independent eigenvector.

A general real matrix may also have no real eigenvectors.

For example,

$$
\begin{bmatrix}
0 & -1 \\
1 & 0
\end{bmatrix}
$$

rotates the plane by \(90^\circ\). Its eigenvalues are \(i\) and \(-i\), so it has no real eigenvectors.

A real symmetric matrix avoids both problems. It has real eigenvalues. It has enough eigenvectors. Those eigenvectors can be chosen orthonormally.

This makes symmetric matrices one of the most important classes of matrices in linear algebra.
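
Both failure modes can be observed numerically with the general-purpose solver `numpy.linalg.eig`. A hedged sketch (the rank test relies on the default numerical tolerance):

```python
import numpy as np

# A defective matrix: its eigenvector matrix is numerically rank 1,
# so there is no basis of eigenvectors.
J = np.array([[2.0, 1.0],
              [0.0, 2.0]])
_, V = np.linalg.eig(J)
print(np.linalg.matrix_rank(V))  # 1

# A rotation by 90 degrees: the eigenvalues come out complex.
R = np.array([[0.0, -1.0],
              [1.0, 0.0]])
print(np.linalg.eigvals(R))  # [0.+1.j, 0.-1.j]
```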

## 64.4 Orthogonal Diagonalization

A matrix \(A\) is orthogonally diagonalizable if there exists an orthogonal matrix \(Q\) and a diagonal matrix \(D\) such that

$$
A=QDQ^T.
$$

Since

$$
Q^{-1}=Q^T,
$$

this is a special case of diagonalization:

$$
A=QDQ^{-1}.
$$

The advantage is numerical and geometric. Orthogonal matrices preserve lengths and angles. They do not distort the inner product.

If \(Q\) is orthogonal, then for all vectors \(x,y\),

$$
(Qx)\cdot(Qy)=x\cdot y.
$$

Thus orthogonal diagonalization changes coordinates by a rotation or reflection, then applies independent scaling, then changes coordinates back.
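
A quick numerical check of this invariance, using random vectors and an orthogonal \(Q\) produced by `eigh` (a minimal sketch):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2            # symmetrize to get a test matrix
_, Q = np.linalg.eigh(A)     # Q is orthogonal

x = rng.standard_normal(4)
y = rng.standard_normal(4)

# Orthogonal maps preserve the dot product, hence lengths and angles.
print(np.allclose((Q @ x) @ (Q @ y), x @ y))  # True
```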

## 64.5 Matrix Form of the Theorem

Suppose

$$
q_1,q_2,\ldots,q_n
$$

are orthonormal eigenvectors of \(A\), with corresponding eigenvalues

$$
\lambda_1,\lambda_2,\ldots,\lambda_n.
$$

Let

$$
Q=
\begin{bmatrix}
| & | & & | \\
q_1 & q_2 & \cdots & q_n \\
| & | & & |
\end{bmatrix}.
$$

Let

$$
D=
\begin{bmatrix}
\lambda_1 & 0 & \cdots & 0 \\
0 & \lambda_2 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \lambda_n
\end{bmatrix}.
$$

Then

$$
AQ=QD.
$$

Since \(Q\) is orthogonal,

$$
A=QDQ^T.
$$

The order of the columns of \(Q\) must match the order of the eigenvalues in \(D\).
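
Column by column, \(AQ=QD\) says \(Aq_i=\lambda_i q_i\). A minimal sketch that checks both forms:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
eigvals, Q = np.linalg.eigh(A)
D = np.diag(eigvals)

print(np.allclose(A @ Q, Q @ D))          # AQ = QD
for lam, q in zip(eigvals, Q.T):          # columns of Q, in D's order
    print(np.allclose(A @ q, lam * q))    # each column is an eigenvector
```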

## 64.6 Real Eigenvalues

A key fact behind the spectral theorem is that a real symmetric matrix has real eigenvalues.

More generally, a Hermitian matrix has real eigenvalues. A matrix \(A\) over \(\mathbb{C}\) is Hermitian if

$$
A^*=A,
$$

where \(A^*\) is the conjugate transpose.

Suppose

$$
Av=\lambda v
$$

with \(v\neq 0\), where \(\lambda\) and the entries of \(v\) are allowed, a priori, to be complex. Let \(\overline{v}\) denote the entrywise complex conjugate of \(v\).

Multiplying the eigenvalue equation by \(\overline{v}^T\) gives

$$
\overline{v}^TAv=\lambda\,\overline{v}^Tv.
$$

Conjugating \(Av=\lambda v\) and using that \(A\) is real gives \(A\overline{v}=\overline{\lambda}\,\overline{v}\). By symmetry of \(A\),

$$
\overline{v}^TAv=(A\overline{v})^Tv=\overline{\lambda}\,\overline{v}^Tv.
$$

Since

$$
\overline{v}^Tv=\sum_i|v_i|^2>0,
$$

comparing the two expressions gives \(\lambda=\overline{\lambda}\). So \(\lambda\) is real.

Over complex inner product spaces, the same argument uses

$$
\langle Av,v\rangle=\langle v,Av\rangle
$$

for Hermitian operators. Hermitian operators have real eigenvalues.
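
Numerically, even the general eigensolver returns real eigenvalues for a symmetric matrix, and `eigh` returns a real array by construction. A minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
A = (M + M.T) / 2                  # a random symmetric matrix

w = np.linalg.eigvals(A)           # general solver: complex dtype allowed
print(np.max(np.abs(w.imag)))      # 0 (or negligibly small)

# eigvalsh exploits symmetry and returns a real array directly.
print(np.linalg.eigvalsh(A).dtype)  # float64
```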

## 64.7 Orthogonality of Eigenvectors

Another key fact is that eigenvectors belonging to distinct eigenvalues are orthogonal.

Let \(A\) be symmetric. Suppose

$$
Av=\lambda v
$$

and

$$
Aw=\mu w,
$$

where

$$
\lambda\neq \mu.
$$

Using symmetry,

$$
(Av)\cdot w = v\cdot(Aw).
$$

Substitute the eigenvalue equations:

$$
(\lambda v)\cdot w = v\cdot(\mu w).
$$

Thus

$$
\lambda(v\cdot w)=\mu(v\cdot w).
$$

So

$$
(\lambda-\mu)(v\cdot w)=0.
$$

Since \(\lambda\neq\mu\),

$$
v\cdot w=0.
$$

Therefore \(v\) and \(w\) are orthogonal.
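
This orthogonality can be checked by forming the Gram matrix \(Q^TQ\) of the computed eigenvectors. A minimal sketch (a random symmetric matrix has distinct eigenvalues almost surely):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2

eigvals, Q = np.linalg.eigh(A)

# The Gram matrix of the eigenvectors is the identity: eigenvectors
# for distinct eigenvalues are orthogonal (and here also normalized).
print(np.allclose(Q.T @ Q, np.eye(4)))  # True
```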

## 64.8 Repeated Eigenvalues

If an eigenvalue has multiplicity greater than one, its eigenspace may have dimension greater than one.

Inside that eigenspace, every nonzero vector is an eigenvector with that eigenvalue, so we are free to choose any basis for it and then make the basis orthonormal.

This is done using Gram-Schmidt orthogonalization.

Suppose

$$
E_\lambda
$$

has basis

$$
v_1,v_2,\ldots,v_k.
$$

Applying Gram-Schmidt produces an orthonormal basis

$$
q_1,q_2,\ldots,q_k
$$

for the same eigenspace.

Since each \(q_i\) remains in \(E_\lambda\), each \(q_i\) is still an eigenvector with eigenvalue \(\lambda\).

Thus repeated eigenvalues cause no difficulty for symmetric matrices.
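
In floating-point practice, Gram-Schmidt is usually carried out via a QR factorization. A hedged sketch on a matrix whose eigenvalue \(2\) has a two-dimensional eigenspace:

```python
import numpy as np

A = np.diag([2.0, 2.0, 5.0])   # eigenvalue 2 has a 2-dimensional eigenspace

# A non-orthogonal basis of that eigenspace.
V = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [0.0, 0.0]])

# QR factorization performs Gram-Schmidt: the columns of Qbasis are an
# orthonormal basis of the same column space.
Qbasis, _ = np.linalg.qr(V)

print(np.allclose(Qbasis.T @ Qbasis, np.eye(2)))  # orthonormal
print(np.allclose(A @ Qbasis, 2 * Qbasis))        # still eigenvectors for 2
```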

## 64.9 Spectral Decomposition

The spectral theorem can also be written as a sum of rank-one projections.

If

$$
A=QDQ^T
$$

with orthonormal eigenvectors

$$
q_1,\ldots,q_n
$$

and eigenvalues

$$
\lambda_1,\ldots,\lambda_n,
$$

then

$$
A=
\lambda_1 q_1q_1^T
+
\lambda_2 q_2q_2^T
+
\cdots
+
\lambda_n q_nq_n^T.
$$

The matrix

$$
q_iq_i^T
$$

is the orthogonal projection onto the line spanned by \(q_i\).

Thus \(A\) is a weighted sum of rank-one orthogonal projections, with the eigenvalues as weights. This representation is called the spectral decomposition.
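
A minimal sketch that rebuilds \(A\) from its rank-one spectral pieces and checks that each piece is a projection:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
eigvals, Q = np.linalg.eigh(A)

# Sum of rank-one pieces lambda_i * q_i q_i^T.
S = sum(lam * np.outer(q, q) for lam, q in zip(eigvals, Q.T))
print(np.allclose(S, A))  # True

# Each q_i q_i^T projects onto the line spanned by q_i.
P = np.outer(Q[:, 0], Q[:, 0])
print(np.allclose(P @ P, P))  # idempotent, as a projection must be
```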

## 64.10 Projection Form by Eigenspaces

If the distinct eigenvalues of \(A\) are

$$
\alpha_1,\alpha_2,\ldots,\alpha_k,
$$

then the space decomposes as an orthogonal direct sum

$$
\mathbb{R}^n =
E_{\alpha_1}
\oplus
E_{\alpha_2}
\oplus
\cdots
\oplus
E_{\alpha_k}.
$$

Let \(P_i\) be the orthogonal projection onto \(E_{\alpha_i}\). Then

$$
A=
\alpha_1P_1+\alpha_2P_2+\cdots+\alpha_kP_k.
$$

This form groups together repeated eigenvalues.

The projection matrices satisfy

$$
P_iP_j=0
$$

for \(i\neq j\), and

$$
P_1+P_2+\cdots+P_k=I.
$$

Thus the identity decomposes into orthogonal spectral components.
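
A hedged sketch that groups the computed eigenvectors by eigenvalue (grouping by rounding, which is safe here because the repetition is exact) and verifies these identities:

```python
import numpy as np

A = np.diag([2.0, 2.0, 5.0])
eigvals, Q = np.linalg.eigh(A)

# Group columns of Q by (rounded) eigenvalue into eigenspace projections.
projections = {}
for lam, q in zip(np.round(eigvals, 8), Q.T):
    projections[lam] = projections.get(lam, 0) + np.outer(q, q)

Ps = list(projections.values())
print(np.allclose(Ps[0] @ Ps[1], 0))      # P_i P_j = 0 for i != j
print(np.allclose(sum(Ps), np.eye(3)))    # P_1 + ... + P_k = I
print(np.allclose(sum(l * P for l, P in projections.items()), A))
```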

## 64.11 Example: Orthogonal Diagonalization

Let

$$
A=
\begin{bmatrix}
2 & 1 \\
1 & 2
\end{bmatrix}.
$$

The characteristic polynomial is

$$
\det(A-\lambda I) =
\det
\begin{bmatrix}
2-\lambda & 1 \\
1 & 2-\lambda
\end{bmatrix}.
$$

Thus

$$
\det(A-\lambda I) =
(2-\lambda)^2-1.
$$

So

$$
\lambda^2-4\lambda+3=0.
$$

The eigenvalues are

$$
\lambda_1=3,
\qquad
\lambda_2=1.
$$

For \(\lambda_1=3\), one eigenvector is

$$
\begin{bmatrix}
1 \\
1
\end{bmatrix}.
$$

Normalize it:

$$
q_1=
\frac{1}{\sqrt2}
\begin{bmatrix}
1 \\
1
\end{bmatrix}.
$$

For \(\lambda_2=1\), one eigenvector is

$$
\begin{bmatrix}
1 \\
-1
\end{bmatrix}.
$$

Normalize it:

$$
q_2=
\frac{1}{\sqrt2}
\begin{bmatrix}
1 \\
-1
\end{bmatrix}.
$$

Then

$$
Q=
\frac{1}{\sqrt2}
\begin{bmatrix}
1 & 1 \\
1 & -1
\end{bmatrix}
$$

and

$$
D=
\begin{bmatrix}
3 & 0 \\
0 & 1
\end{bmatrix}.
$$

The spectral theorem gives

$$
A=QDQ^T.
$$

## 64.12 Checking the Example

First observe that

$$
Q^TQ=I.
$$

Indeed,

$$
Q^TQ =
\frac12
\begin{bmatrix}
1 & 1 \\
1 & -1
\end{bmatrix}
\begin{bmatrix}
1 & 1 \\
1 & -1
\end{bmatrix} =
\begin{bmatrix}
1 & 0 \\
0 & 1
\end{bmatrix}.
$$

Now compute

$$
QDQ^T.
$$

Since

$$
QDQ^T =
\frac12
\begin{bmatrix}
1 & 1 \\
1 & -1
\end{bmatrix}
\begin{bmatrix}
3 & 0 \\
0 & 1
\end{bmatrix}
\begin{bmatrix}
1 & 1 \\
1 & -1
\end{bmatrix},
$$

we get

$$
QDQ^T =
\begin{bmatrix}
2 & 1 \\
1 & 2
\end{bmatrix}.
$$

Thus

$$
A=QDQ^T.
$$
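
The same verification in code, a minimal sketch:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
Q = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)
D = np.diag([3.0, 1.0])

print(np.allclose(Q.T @ Q, np.eye(2)))  # Q^T Q = I
print(np.allclose(Q @ D @ Q.T, A))      # A = Q D Q^T
```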

## 64.13 Quadratic Forms

The spectral theorem is especially useful for quadratic forms.

A quadratic form on \(\mathbb{R}^n\) has the form

$$
x^TAx,
$$

where \(A\) is symmetric.

If

$$
A=QDQ^T,
$$

let

$$
y=Q^Tx.
$$

Since \(Q\) is orthogonal, this is a change to orthonormal coordinates.

Then

$$
x^TAx =
x^TQDQ^Tx =
(Q^Tx)^TD(Q^Tx) =
y^TDy.
$$

If

$$
D=\operatorname{diag}(\lambda_1,\ldots,\lambda_n),
$$

then

$$
y^TDy =
\lambda_1y_1^2+\lambda_2y_2^2+\cdots+\lambda_ny_n^2.
$$

Thus the spectral theorem removes all cross terms from a quadratic form.
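
A minimal numerical check that the cross terms vanish in the rotated coordinates:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, Q = np.linalg.eigh(A)

rng = np.random.default_rng(3)
x = rng.standard_normal(2)
y = Q.T @ x  # orthonormal change of coordinates

# x^T A x equals the cross-term-free sum of lambda_i * y_i^2.
print(np.allclose(x @ A @ x, np.sum(eigvals * y**2)))  # True
```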

## 64.14 Positive Definite Matrices

A symmetric matrix \(A\) is positive definite if

$$
x^TAx>0
$$

for every nonzero vector \(x\).

Using the spectral theorem,

$$
x^TAx =
\lambda_1y_1^2+\cdots+\lambda_ny_n^2.
$$

Therefore \(A\) is positive definite exactly when all eigenvalues are positive.

Similarly:

| Matrix type | Eigenvalue condition |
|---|---|
| Positive definite | All eigenvalues are positive |
| Positive semidefinite | All eigenvalues are nonnegative |
| Negative definite | All eigenvalues are negative |
| Negative semidefinite | All eigenvalues are nonpositive |
| Indefinite | Some eigenvalues are positive and some are negative |

This criterion is one of the main uses of the spectral theorem in optimization and analysis.
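
The table translates directly into an eigenvalue test. A hedged sketch (the helper name `classify` and the tolerance are illustrative choices, not a standard API):

```python
import numpy as np

def classify(A, tol=1e-12):
    """Classify a symmetric matrix by the signs of its eigenvalues."""
    w = np.linalg.eigvalsh(A)
    if np.all(w > tol):
        return "positive definite"
    if np.all(w >= -tol):
        return "positive semidefinite"
    if np.all(w < -tol):
        return "negative definite"
    if np.all(w <= tol):
        return "negative semidefinite"
    return "indefinite"

print(classify(np.array([[2.0, 1.0], [1.0, 2.0]])))   # positive definite
print(classify(np.array([[1.0, 0.0], [0.0, -1.0]])))  # indefinite
```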

## 64.15 Matrix Powers and Functions

If

$$
A=QDQ^T,
$$

then powers of \(A\) are easy to compute:

$$
A^k=QD^kQ^T.
$$

Since \(D\) is diagonal,

$$
D^k=
\operatorname{diag}(\lambda_1^k,\ldots,\lambda_n^k).
$$

More generally, for a function \(f\),

$$
f(A)=Qf(D)Q^T,
$$

where

$$
f(D)=\operatorname{diag}(f(\lambda_1),\ldots,f(\lambda_n)).
$$

This applies to functions such as

$$
A^{1/2},
\qquad
e^A,
\qquad
\log A,
\qquad
(A+\alpha I)^{-1}.
$$

The spectral theorem therefore reduces many matrix operations to scalar operations on eigenvalues.
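
A hedged sketch of \(f(A)=Qf(D)Q^T\) (the helper `apply_fn` is an illustrative name), checked against a direct matrix power and against \(e^Ae^{-A}=I\):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
eigvals, Q = np.linalg.eigh(A)

def apply_fn(f):
    # f(A) = Q f(D) Q^T, computed by applying f to each eigenvalue.
    return Q @ np.diag(f(eigvals)) @ Q.T

print(np.allclose(apply_fn(lambda t: t**3), A @ A @ A))          # A^3
print(np.allclose(apply_fn(np.exp) @ apply_fn(lambda t: np.exp(-t)),
                  np.eye(2)))                                    # e^A e^-A = I
```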

## 64.16 Square Roots of Positive Semidefinite Matrices

If \(A\) is symmetric positive semidefinite, all eigenvalues satisfy

$$
\lambda_i\geq 0.
$$

With

$$
A=QDQ^T,
$$

define

$$
A^{1/2}=QD^{1/2}Q^T,
$$

where

$$
D^{1/2} =
\operatorname{diag}(\sqrt{\lambda_1},\ldots,\sqrt{\lambda_n}).
$$

Then

$$
A^{1/2}A^{1/2}=A.
$$

This square root is symmetric and positive semidefinite.

Matrix square roots are used in covariance matrices, numerical analysis, optimization, and differential equations.
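
A minimal sketch of the square-root construction and its claimed properties:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])            # symmetric positive definite
eigvals, Q = np.linalg.eigh(A)

sqrtA = Q @ np.diag(np.sqrt(eigvals)) @ Q.T

print(np.allclose(sqrtA @ sqrtA, A))            # A^{1/2} A^{1/2} = A
print(np.allclose(sqrtA, sqrtA.T))              # the root is symmetric
print(np.all(np.linalg.eigvalsh(sqrtA) >= 0))   # and positive semidefinite
```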

## 64.17 Singular Value Decomposition Connection

The spectral theorem applies to symmetric or Hermitian matrices. The singular value decomposition applies to every matrix.

For a real matrix \(B\), the matrix

$$
B^TB
$$

is symmetric positive semidefinite. Therefore it has an orthonormal eigenbasis.

The eigenvalues of \(B^TB\) are nonnegative. Their square roots are the singular values of \(B\).

Thus the singular value decomposition is built from the spectral theorem applied to \(B^TB\) and \(BB^T\). Conversely, for a symmetric positive semidefinite matrix, the spectral decomposition \(A=QDQ^T\) is itself a singular value decomposition.
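
A minimal sketch comparing the singular values from `np.linalg.svd` with the square roots of the eigenvalues of \(B^TB\):

```python
import numpy as np

rng = np.random.default_rng(4)
B = rng.standard_normal((5, 3))

# Eigenvalues of B^T B are nonnegative; their square roots are the
# singular values of B.
w = np.linalg.eigvalsh(B.T @ B)       # ascending order
sing_from_eigh = np.sqrt(w)[::-1]     # descending, to match svd

sing = np.linalg.svd(B, compute_uv=False)  # descending order
print(np.allclose(sing, sing_from_eigh))   # True
```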

## 64.18 Complex Version

Over complex vector spaces, the corresponding class is Hermitian matrices.

A matrix \(A\) is Hermitian if

$$
A^*=A.
$$

The spectral theorem for Hermitian matrices states that there exists a unitary matrix \(U\) and a real diagonal matrix \(D\) such that

$$
A=UDU^*.
$$

Here

$$
U^*U=I.
$$

The columns of \(U\) form an orthonormal basis of eigenvectors.

More generally, every normal complex matrix is unitarily diagonalizable. A matrix is normal if

$$
AA^*=A^*A.
$$

For normal matrices, the eigenvalues may be complex, but an orthonormal eigenbasis still exists.
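
`numpy.linalg.eigh` accepts complex Hermitian input as well, and still returns real eigenvalues with unitary eigenvector columns. A minimal sketch:

```python
import numpy as np

# A Hermitian matrix: it equals its own conjugate transpose.
A = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])
print(np.allclose(A, A.conj().T))                 # Hermitian

eigvals, U = np.linalg.eigh(A)
print(eigvals.dtype)                              # float64: real eigenvalues
print(np.allclose(U.conj().T @ U, np.eye(2)))     # U is unitary
print(np.allclose(U @ np.diag(eigvals) @ U.conj().T, A))  # A = U D U*
```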

## 64.19 Spectral Theorem for Linear Transformations

Let \(V\) be a finite-dimensional real inner product space, and let

$$
T:V\to V
$$

be a self-adjoint linear transformation. Self-adjoint means

$$
\langle T v,w\rangle=\langle v,Tw\rangle
$$

for all \(v,w\in V\).

The spectral theorem states that \(V\) has an orthonormal basis consisting of eigenvectors of \(T\).

In such a basis, the matrix of \(T\) is diagonal.

Thus the theorem is fundamentally a statement about self-adjoint transformations, not merely about arrays of numbers.

## 64.20 Common Errors

The first common error is to assume that every diagonalizable matrix is orthogonally diagonalizable. Orthogonal diagonalization is stronger.

The second common error is to forget symmetry. A real matrix with real eigenvalues may still fail to have orthogonal eigenvectors.
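
A minimal sketch of this second error: the matrix below has real eigenvalues \(1\) and \(2\) but is not symmetric, and its eigenvectors are not orthogonal:

```python
import numpy as np

M = np.array([[1.0, 1.0],
              [0.0, 2.0]])   # real eigenvalues 1 and 2, but not symmetric

_, V = np.linalg.eig(M)
v1, v2 = V[:, 0], V[:, 1]
print(v1 @ v2)               # nonzero: the eigenvectors are not orthogonal
```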

The third common error is to write

$$
A=QDQ^{-1}
$$

without using

$$
Q^{-1}=Q^T.
$$

For orthogonal diagonalization, the canonical form is

$$
A=QDQ^T.
$$

The fourth common error is to treat repeated eigenvalues as a problem. For symmetric matrices, repeated eigenspaces can always be given orthonormal bases.

The fifth common error is to ignore the distinction between real symmetric, complex Hermitian, and complex normal matrices.

## 64.21 Summary

The spectral theorem states that every real symmetric matrix has an orthonormal basis of eigenvectors.

Equivalently, if \(A=A^T\), then there exists an orthogonal matrix \(Q\) and a real diagonal matrix \(D\) such that

$$
A=QDQ^T.
$$

The columns of \(Q\) are orthonormal eigenvectors. The entries of \(D\) are eigenvalues.

The theorem makes symmetric matrices structurally transparent. It explains why quadratic forms can be diagonalized by orthogonal changes of coordinates, why positive definiteness is determined by eigenvalues, and why many matrix functions reduce to scalar functions of eigenvalues.
