The spectral theorem describes the best possible form of certain linear transformations.
For a general matrix, diagonalization may fail. A matrix may lack enough eigenvectors. Its eigenvalues may be complex. Its eigenvectors may fail to be orthogonal. The spectral theorem identifies an important class of matrices for which these difficulties disappear.
For real symmetric matrices, the theorem says that every such matrix has an orthonormal basis of eigenvectors. Equivalently, every real symmetric matrix can be diagonalized by an orthogonal matrix. This is the finite-dimensional real form of the spectral theorem.
64.1 Symmetric Matrices
A real square matrix $A$ is symmetric if

$$A^T = A.$$

This means that the entry in row $i$, column $j$, equals the entry in row $j$, column $i$:

$$a_{ij} = a_{ji}.$$
For example,

$$A = \begin{pmatrix} 1 & 2 \\ 2 & 3 \end{pmatrix}$$

is symmetric.

The matrix

$$B = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$$

is not symmetric, because

$$b_{12} = 2$$

but

$$b_{21} = 3.$$
Symmetric matrices occur naturally in geometry, quadratic forms, least squares, optimization, mechanics, statistics, and graph theory.
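The symmetry condition can be tested numerically. A minimal sketch, assuming NumPy is available; `is_symmetric` is an illustrative helper, not a standard library function:

```python
import numpy as np

# Symmetric and non-symmetric examples.
A = np.array([[1.0, 2.0],
              [2.0, 3.0]])
B = np.array([[1.0, 2.0],
              [3.0, 4.0]])

def is_symmetric(M, tol=1e-12):
    """Return True when M equals its transpose, up to round-off."""
    return bool(np.allclose(M, M.T, atol=tol))

print(is_symmetric(A))  # True
print(is_symmetric(B))  # False
```

Comparing with a tolerance rather than exact equality matters in floating point, where products such as $M^T M$ are symmetric only up to round-off.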
64.2 Statement of the Real Spectral Theorem
Let $A$ be an $n \times n$ real symmetric matrix.

Then there exists an orthonormal basis of $\mathbb{R}^n$ consisting of eigenvectors of $A$.

Equivalently, there exists an orthogonal matrix $P$ and a real diagonal matrix $D$ such that

$$A = PDP^T.$$

Here

$$P^T P = I,$$

so

$$P^{-1} = P^T.$$

The columns of $P$ are orthonormal eigenvectors of $A$. The diagonal entries of $D$ are the corresponding eigenvalues.
This is stronger than ordinary diagonalization. Ordinary diagonalization gives

$$A = PDP^{-1}.$$

The spectral theorem gives

$$A = PDP^T$$

with $P$ orthogonal.
64.3 Why This Is Special
A general matrix may not be diagonalizable.
For example,

$$\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$$

has only one independent eigenvector.
A general real matrix may also have no real eigenvectors.
For example,

$$\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$$

rotates the plane by $90^\circ$. Its eigenvalues are $i$ and $-i$, so it has no real eigenvectors.
A real symmetric matrix avoids both problems. It has real eigenvalues. It has enough eigenvectors. Those eigenvectors can be chosen orthonormally.
This makes symmetric matrices one of the most important classes of matrices in linear algebra.
64.4 Orthogonal Diagonalization
A matrix $A$ is orthogonally diagonalizable if there exists an orthogonal matrix $P$ and a diagonal matrix $D$ such that

$$A = PDP^T.$$

Since

$$P^T = P^{-1},$$

this is a special case of diagonalization:

$$A = PDP^{-1}.$$
The advantage is numerical and geometric. Orthogonal matrices preserve lengths and angles. They do not distort the inner product.
If $P$ is orthogonal, then for all vectors $x$ and $y$,

$$\langle Px, Py \rangle = \langle x, y \rangle.$$
Thus orthogonal diagonalization changes coordinates by a rotation or reflection, then applies independent scaling, then changes coordinates back.
64.5 Matrix Form of the Theorem
Suppose

$$v_1, v_2, \dots, v_n$$

are orthonormal eigenvectors of $A$, with corresponding eigenvalues

$$\lambda_1, \lambda_2, \dots, \lambda_n.$$

Let

$$P = \begin{pmatrix} v_1 & v_2 & \cdots & v_n \end{pmatrix}.$$

Let

$$D = \operatorname{diag}(\lambda_1, \lambda_2, \dots, \lambda_n).$$

Then

$$AP = PD.$$

Since $P$ is orthogonal,

$$A = PDP^{-1} = PDP^T.$$
The order of the columns of must match the order of the eigenvalues in .
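The statement can be checked numerically. A minimal sketch, assuming NumPy: `numpy.linalg.eigh` is specialized to symmetric (Hermitian) matrices and returns exactly the objects in the theorem.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh returns real eigenvalues in ascending order and orthonormal
# eigenvectors as the columns of P, in the matching column order.
eigenvalues, P = np.linalg.eigh(A)
D = np.diag(eigenvalues)

print(np.allclose(P.T @ P, np.eye(2)))  # True: P is orthogonal
print(np.allclose(P @ D @ P.T, A))      # True: A = P D P^T
```

Note that `eigh` already pairs columns with eigenvalues correctly; reordering one without the other would break the factorization.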
64.6 Real Eigenvalues
A key fact behind the spectral theorem is that a real symmetric matrix has real eigenvalues.
More generally, a Hermitian matrix has real eigenvalues. A matrix $A$ over $\mathbb{C}$ is Hermitian if

$$A^* = A,$$

where $A^*$ is the conjugate transpose.
Suppose

$$Av = \lambda v$$

with $v \ne 0$. If $A$ is real symmetric, then

$$\overline{v}^T A v$$

is a real number, because it equals its own complex conjugate.

Using the eigenvalue equation,

$$\overline{v}^T A v = \lambda\, \overline{v}^T v = \lambda \|v\|^2.$$

Since

$$\|v\|^2 > 0,$$

the scalar $\lambda$ must be real.
Over complex inner product spaces, the same argument uses

$$\langle Av, v \rangle = \langle v, Av \rangle$$

for Hermitian operators. Hermitian operators have real eigenvalues.
64.7 Orthogonality of Eigenvectors
Another key fact is that eigenvectors belonging to distinct eigenvalues are orthogonal.
Let $A$ be symmetric. Suppose

$$Av = \lambda v$$

and

$$Aw = \mu w,$$

where

$$\lambda \ne \mu.$$

Using symmetry,

$$\langle Av, w \rangle = \langle v, Aw \rangle.$$

Substitute the eigenvalue equations:

$$\langle \lambda v, w \rangle = \langle v, \mu w \rangle.$$

Thus

$$\lambda \langle v, w \rangle = \mu \langle v, w \rangle.$$

So

$$(\lambda - \mu)\langle v, w \rangle = 0.$$

Since $\lambda \ne \mu$,

$$\langle v, w \rangle = 0.$$

Therefore $v$ and $w$ are orthogonal.
64.8 Repeated Eigenvalues
If an eigenvalue has multiplicity greater than one, its eigenspace may have dimension greater than one.
Inside that eigenspace, every vector has the same eigenvalue. The spectral theorem says that we can choose an orthonormal basis for each eigenspace.
This is done using Gram-Schmidt orthogonalization.
Suppose the eigenspace

$$E_\lambda = \{\, v : Av = \lambda v \,\}$$

has basis

$$u_1, u_2, \dots, u_k.$$

Applying Gram-Schmidt produces an orthonormal basis

$$q_1, q_2, \dots, q_k$$

for the same eigenspace.

Since each $q_i$ remains in $E_\lambda$, each $q_i$ is still an eigenvector with eigenvalue $\lambda$.
Thus repeated eigenvalues cause no difficulty for symmetric matrices.
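The orthonormalization step can be sketched directly. A minimal classical Gram-Schmidt, assuming NumPy; `gram_schmidt` and the sample vectors are illustrative:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors."""
    basis = []
    for v in vectors:
        w = v.astype(float)
        for q in basis:
            w = w - (q @ w) * q  # subtract the projection onto q
        basis.append(w / np.linalg.norm(w))
    return basis

# Two independent vectors spanning a hypothetical 2-dimensional
# eigenspace inside R^3.
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, 0.0, 1.0])
q1, q2 = gram_schmidt([u1, u2])

print(abs(q1 @ q2) < 1e-12)  # True: orthogonal unit vectors
```

Because each `q` is a linear combination of the original eigenspace basis, the output vectors stay inside the eigenspace, which is exactly why the construction works here.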
64.9 Spectral Decomposition
The spectral theorem can also be written as a sum of rank-one projections.
If

$$A = PDP^T$$

with orthonormal eigenvectors

$$v_1, v_2, \dots, v_n$$

and eigenvalues

$$\lambda_1, \lambda_2, \dots, \lambda_n,$$

then

$$A = \sum_{i=1}^{n} \lambda_i\, v_i v_i^T.$$

The matrix

$$v_i v_i^T$$

is the orthogonal projection onto the line spanned by $v_i$.
Thus is a weighted sum of orthogonal projections. This representation is called the spectral decomposition. The finite-dimensional spectral theorem gives such a decomposition into eigenspaces or projections.
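The rank-one sum can be verified numerically. A minimal sketch, assuming NumPy:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigenvalues, P = np.linalg.eigh(A)

# Sum of rank-one projections lambda_i * v_i v_i^T, where the
# eigenvectors v_i are the columns of P (i.e. the rows of P.T).
A_rebuilt = sum(lam * np.outer(v, v) for lam, v in zip(eigenvalues, P.T))

print(np.allclose(A_rebuilt, A))  # True
```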
64.10 Projection Form by Eigenspaces
If the distinct eigenvalues of $A$ are

$$\mu_1, \mu_2, \dots, \mu_k,$$

then the space decomposes as an orthogonal direct sum

$$\mathbb{R}^n = E_{\mu_1} \oplus E_{\mu_2} \oplus \cdots \oplus E_{\mu_k}.$$

Let $P_j$ be the orthogonal projection onto $E_{\mu_j}$. Then

$$A = \sum_{j=1}^{k} \mu_j P_j.$$
This form groups together repeated eigenvalues.
The projection matrices satisfy

$$P_i P_j = 0$$

for $i \ne j$, and

$$P_1 + P_2 + \cdots + P_k = I.$$
Thus the identity decomposes into orthogonal spectral components.
64.11 Example: Orthogonal Diagonalization
Let

$$A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}.$$

The characteristic polynomial is

$$\det(A - \lambda I) = (2 - \lambda)^2 - 1.$$

Thus

$$\det(A - \lambda I) = \lambda^2 - 4\lambda + 3.$$

So

$$\det(A - \lambda I) = (\lambda - 3)(\lambda - 1).$$

The eigenvalues are

$$\lambda_1 = 3, \qquad \lambda_2 = 1.$$

For $\lambda_1 = 3$, one eigenvector is

$$\begin{pmatrix} 1 \\ 1 \end{pmatrix}.$$

Normalize it:

$$v_1 = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 \\ 1 \end{pmatrix}.$$

For $\lambda_2 = 1$, one eigenvector is

$$\begin{pmatrix} 1 \\ -1 \end{pmatrix}.$$

Normalize it:

$$v_2 = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 \\ -1 \end{pmatrix}.$$

Then

$$P = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$$

and

$$D = \begin{pmatrix} 3 & 0 \\ 0 & 1 \end{pmatrix}.$$

The spectral theorem gives

$$A = PDP^T.$$
64.12 Checking the Example
First observe that

$$P^T P = I.$$

Indeed,

$$P^T P = \frac{1}{2} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix} = \frac{1}{2} \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix} = I.$$

Now compute

$$PDP^T.$$

Since

$$PD = \frac{1}{\sqrt{2}} \begin{pmatrix} 3 & 1 \\ 3 & -1 \end{pmatrix},$$

we get

$$PDP^T = \frac{1}{2} \begin{pmatrix} 3 & 1 \\ 3 & -1 \end{pmatrix} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix} = \frac{1}{2} \begin{pmatrix} 4 & 2 \\ 2 & 4 \end{pmatrix} = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}.$$

Thus

$$PDP^T = A.$$
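A hand computation like this is easy to confirm in code. A minimal sketch, assuming NumPy and taking as input the $2 \times 2$ matrix $A$ with entries $2, 1, 1, 2$, eigenvalues $3$ and $1$, and the orthogonal $P$ built from the normalized eigenvectors:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
P = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2)
D = np.diag([3.0, 1.0])

print(np.allclose(P.T @ P, np.eye(2)))  # True: P is orthogonal
print(np.allclose(P @ D @ P.T, A))      # True: A = P D P^T
```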
64.13 Quadratic Forms
The spectral theorem is especially useful for quadratic forms.
A quadratic form on $\mathbb{R}^n$ has the form

$$q(x) = x^T A x,$$

where $A$ is symmetric.

If

$$A = PDP^T,$$

let

$$y = P^T x.$$

Since $P$ is orthogonal, this is a change to orthonormal coordinates.

Then

$$q(x) = x^T P D P^T x = y^T D y.$$

If

$$D = \operatorname{diag}(\lambda_1, \dots, \lambda_n),$$

then

$$q = \lambda_1 y_1^2 + \lambda_2 y_2^2 + \cdots + \lambda_n y_n^2.$$
Thus the spectral theorem removes all cross terms from a quadratic form.
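The change of coordinates can be checked on a concrete vector. A minimal sketch, assuming NumPy; the test vector $x$ is arbitrary:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigenvalues, P = np.linalg.eigh(A)

x = np.array([1.0, 2.0])
y = P.T @ x                          # orthonormal change of coordinates

q_original = x @ A @ x               # x^T A x, with cross terms
q_diagonal = eigenvalues @ (y ** 2)  # sum of lambda_i * y_i^2, no cross terms

print(np.isclose(q_original, q_diagonal))  # True
```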
64.14 Positive Definite Matrices
A symmetric matrix $A$ is positive definite if

$$x^T A x > 0$$

for every nonzero vector $x$.

Using the spectral theorem,

$$x^T A x = \lambda_1 y_1^2 + \cdots + \lambda_n y_n^2, \qquad y = P^T x.$$

Therefore $A$ is positive definite exactly when all eigenvalues are positive.
Similarly:
| Matrix type | Eigenvalue condition |
|---|---|
| Positive definite | All eigenvalues are positive |
| Positive semidefinite | All eigenvalues are nonnegative |
| Negative definite | All eigenvalues are negative |
| Negative semidefinite | All eigenvalues are nonpositive |
| Indefinite | Eigenvalues have both positive and negative signs |
This criterion is one of the main uses of the spectral theorem in optimization and analysis.
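The table translates directly into an eigenvalue test. A minimal sketch, assuming NumPy; `classify` is an illustrative helper, and the tolerance guards against round-off near zero:

```python
import numpy as np

def classify(A, tol=1e-12):
    """Classify a symmetric matrix by the signs of its eigenvalues."""
    lams = np.linalg.eigvalsh(A)  # eigenvalues only, symmetric input
    if np.all(lams > tol):
        return "positive definite"
    if np.all(lams >= -tol):
        return "positive semidefinite"
    if np.all(lams < -tol):
        return "negative definite"
    if np.all(lams <= tol):
        return "negative semidefinite"
    return "indefinite"

print(classify(np.array([[2.0, 1.0], [1.0, 2.0]])))   # positive definite
print(classify(np.array([[1.0, 0.0], [0.0, -1.0]])))  # indefinite
```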
64.15 Matrix Powers and Functions
If

$$A = PDP^T,$$

then powers of $A$ are easy to compute:

$$A^k = PD^kP^T.$$

Since $D$ is diagonal,

$$D^k = \operatorname{diag}(\lambda_1^k, \dots, \lambda_n^k).$$

More generally, for a function $f$,

$$f(A) = P f(D) P^T,$$

where

$$f(D) = \operatorname{diag}(f(\lambda_1), \dots, f(\lambda_n)).$$

This applies to functions such as

$$e^A, \qquad \sqrt{A}, \qquad \log A.$$
The spectral theorem therefore reduces many matrix operations to scalar operations on eigenvalues.
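The reduction to scalar operations can be sketched in a few lines. Assuming NumPy; `apply_to_eigenvalues` is an illustrative helper for the formula $f(A) = P\,f(D)\,P^T$:

```python
import numpy as np

def apply_to_eigenvalues(f, A):
    """Compute f(A) = P f(D) P^T for a symmetric matrix A."""
    lams, P = np.linalg.eigh(A)
    return P @ np.diag(f(lams)) @ P.T

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# A^3 computed two ways: directly, and via scalar cubes of eigenvalues.
print(np.allclose(apply_to_eigenvalues(lambda t: t ** 3, A), A @ A @ A))  # True

# Matrix exponential via scalar exponentials of the eigenvalues.
expA = apply_to_eigenvalues(np.exp, A)
```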
64.16 Square Roots of Positive Semidefinite Matrices
If $A$ is symmetric positive semidefinite, all eigenvalues satisfy

$$\lambda_i \ge 0.$$

With

$$A = PDP^T,$$

define

$$\sqrt{A} = P \sqrt{D}\, P^T,$$

where

$$\sqrt{D} = \operatorname{diag}\!\left(\sqrt{\lambda_1}, \dots, \sqrt{\lambda_n}\right).$$

Then

$$\left(\sqrt{A}\right)^2 = P \sqrt{D}\, P^T P \sqrt{D}\, P^T = P D P^T = A.$$

This square root is symmetric and positive semidefinite.
Matrix square roots are used in covariance matrices, numerical analysis, optimization, and differential equations.
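The construction above can be implemented directly. A minimal sketch, assuming NumPy; `sqrtm_psd` is an illustrative helper, and the clipping step only guards against tiny negative eigenvalues produced by round-off:

```python
import numpy as np

def sqrtm_psd(A):
    """Square root of a symmetric positive semidefinite matrix."""
    lams, P = np.linalg.eigh(A)
    lams = np.clip(lams, 0.0, None)  # guard tiny negative round-off
    return P @ np.diag(np.sqrt(lams)) @ P.T

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
R = sqrtm_psd(A)

print(np.allclose(R, R.T))    # True: the root is symmetric
print(np.allclose(R @ R, A))  # True: it squares back to A
```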
64.17 Singular Value Decomposition Connection
The spectral theorem applies to symmetric or Hermitian matrices. The singular value decomposition applies to every matrix.
For a real $m \times n$ matrix $A$, the matrix

$$A^T A$$

is symmetric positive semidefinite. Therefore it has an orthonormal eigenbasis.

The eigenvalues of $A^T A$ are nonnegative. Their square roots are the singular values of $A$.

Thus the singular value decomposition is built from the spectral theorem applied to $A^T A$ and $A A^T$. Conversely, for a symmetric positive semidefinite matrix, the spectral decomposition and the singular value decomposition coincide.
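The relationship between the two decompositions can be observed numerically. A minimal sketch, assuming NumPy; the random matrix is just a test input:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))

# Eigenvalues of A^T A (ascending) are nonnegative; their square
# roots match the singular values of A (returned descending).
lams = np.linalg.eigvalsh(A.T @ A)
singular_values = np.linalg.svd(A, compute_uv=False)

print(np.allclose(np.sqrt(np.clip(lams, 0.0, None))[::-1],
                  singular_values))  # True
```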
64.18 Complex Version
Over complex vector spaces, the corresponding class is Hermitian matrices.
A matrix $A$ is Hermitian if

$$A^* = A.$$

The spectral theorem for Hermitian matrices states that there exists a unitary matrix $U$ and a real diagonal matrix $D$ such that

$$A = UDU^*.$$

Here

$$U^* U = I.$$

The columns of $U$ form an orthonormal basis of eigenvectors.

More generally, every normal complex matrix is unitarily diagonalizable. A matrix $A$ is normal if

$$A^* A = A A^*.$$
For normal matrices, the eigenvalues may be complex, but an orthonormal eigenbasis still exists.
64.19 Spectral Theorem for Linear Transformations
Let $V$ be a finite-dimensional real inner product space, and let

$$T : V \to V$$

be a self-adjoint linear transformation. Self-adjoint means

$$\langle Tv, w \rangle = \langle v, Tw \rangle$$

for all $v, w \in V$.

The spectral theorem states that $V$ has an orthonormal basis consisting of eigenvectors of $T$.

In such a basis, the matrix of $T$ is diagonal.
Thus the theorem is fundamentally a statement about self-adjoint transformations, not merely about arrays of numbers.
64.20 Common Errors
The first common error is to assume that every diagonalizable matrix is orthogonally diagonalizable. Orthogonal diagonalization is stronger.
The second common error is to forget symmetry. A real matrix with real eigenvalues may still fail to have orthogonal eigenvectors.
The third common error is to write

$$A = PDP^{-1}$$

without using

$$P^{-1} = P^T.$$

For orthogonal diagonalization, the canonical form is

$$A = PDP^T.$$
The fourth common error is to treat repeated eigenvalues as a problem. For symmetric matrices, repeated eigenspaces can always be given orthonormal bases.
The fifth common error is to ignore the distinction between real symmetric, complex Hermitian, and complex normal matrices.
64.21 Summary
The spectral theorem states that every real symmetric matrix has an orthonormal basis of eigenvectors.
Equivalently, if $A^T = A$, then there exists an orthogonal matrix $P$ and a real diagonal matrix $D$ such that

$$A = PDP^T.$$

The columns of $P$ are orthonormal eigenvectors. The diagonal entries of $D$ are the corresponding eigenvalues.
The theorem makes symmetric matrices structurally transparent. It explains why quadratic forms can be diagonalized by orthogonal changes of coordinates, why positive definiteness is determined by eigenvalues, and why many matrix functions reduce to scalar functions of eigenvalues.