A symmetric matrix is a square matrix equal to its transpose.
If $A$ is symmetric, then
$$A^{T} = A.$$
Equivalently, the entries of $A$ satisfy
$$a_{ij} = a_{ji}$$
for every pair of indices $i$ and $j$. Thus the matrix is mirrored across its main diagonal.
Symmetric matrices are central because they behave like real self-adjoint operators. Their eigenvalues are real, eigenvectors from distinct eigenvalues are orthogonal, and they can be diagonalized by an orthogonal matrix. These properties make them one of the best-behaved classes of matrices in linear algebra.
65.1 Definition
Let $A$ be an $n \times n$ real matrix. The matrix $A$ is symmetric if
$$A^{T} = A.$$
In entries, this means
$$a_{ij} = a_{ji} \quad \text{for all } i, j.$$
For example,
$$A = \begin{pmatrix} 1 & 2 & 3 \\ 2 & 5 & 4 \\ 3 & 4 & 6 \end{pmatrix}$$
is symmetric, since each entry above the diagonal matches the corresponding entry below the diagonal.
The matrix
$$B = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$$
is not symmetric, because
$$b_{12} = 2$$
but
$$b_{21} = 3.$$
Only square matrices can be symmetric, since the equation $A^{T} = A$ requires $A$ and $A^{T}$ to have the same size.
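As a quick numerical check (a sketch using NumPy; the matrices below are hypothetical examples, not fixed by the text), symmetry amounts to comparing a matrix with its transpose:

```python
import numpy as np

# Hypothetical example: equal to its transpose, hence symmetric.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 5.0, 4.0],
              [3.0, 4.0, 6.0]])
is_symmetric = np.allclose(A, A.T)

# A non-square matrix cannot equal its transpose: the shapes already differ.
B = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
shapes_differ = B.shape != B.T.shape
```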
65.2 Structure of a Symmetric Matrix
A symmetric matrix has free entries on and above the diagonal. The entries below the diagonal are then determined.
For a $3 \times 3$ matrix,
$$A = \begin{pmatrix} a & b & c \\ b & d & e \\ c & e & f \end{pmatrix}.$$
There are six independent entries, not nine.
In general, an $n \times n$ symmetric matrix has
$$\frac{n(n+1)}{2}$$
independent entries. These consist of $n$ diagonal entries and
$$\frac{n(n-1)}{2}$$
entries above the diagonal.
65.3 Basic Examples
Every diagonal matrix is symmetric. If
$$D = \begin{pmatrix} d_1 & 0 & 0 \\ 0 & d_2 & 0 \\ 0 & 0 & d_3 \end{pmatrix},$$
then
$$D^{T} = D.$$
The identity matrix is symmetric: $I^{T} = I$.
The zero matrix is symmetric: $0^{T} = 0$.
If $u \in \mathbb{R}^{n}$, then the outer product
$$A = uu^{T}$$
is symmetric, because
$$(uu^{T})^{T} = (u^{T})^{T}u^{T} = uu^{T}.$$
For example, if
$$u = \begin{pmatrix} 1 \\ 2 \end{pmatrix},$$
then
$$uu^{T} = \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}.$$
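The outer-product construction can be verified numerically; the vector here is a hypothetical example:

```python
import numpy as np

u = np.array([1.0, 2.0])
A = np.outer(u, u)            # u u^T, a 2x2 symmetric matrix
```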
65.4 Symmetric and Skew-Symmetric Parts
Every square matrix can be decomposed into a symmetric part and a skew-symmetric part.
Let $A$ be any square matrix. Define
$$S = \tfrac{1}{2}(A + A^{T})$$
and
$$K = \tfrac{1}{2}(A - A^{T}).$$
Then
$$S^{T} = S$$
and
$$K^{T} = -K.$$
Also,
$$S + K = A.$$
Thus every square matrix is the sum of a symmetric matrix and a skew-symmetric matrix.
The symmetric part controls quadratic expressions such as $x^{T}Ax$. The skew-symmetric part contributes nothing to such expressions over the real numbers, because
$$x^{T}Kx = 0$$
whenever
$$K^{T} = -K.$$
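A minimal sketch of the decomposition, using a random square matrix as a stand-in:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))     # arbitrary square matrix

S = 0.5 * (A + A.T)                 # symmetric part
K = 0.5 * (A - A.T)                 # skew-symmetric part

x = rng.standard_normal(4)
skew_quadratic = x @ K @ x          # vanishes up to rounding
```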
65.5 Symmetric Matrices and Inner Products
A symmetric matrix satisfies a compatibility identity with the Euclidean inner product:
$$(Ax) \cdot y = x \cdot (Ay)$$
for all vectors $x, y \in \mathbb{R}^{n}$.
Proof:
Since
$$(Ax) \cdot y = (Ax)^{T}y,$$
we have
$$(Ax) \cdot y = x^{T}A^{T}y.$$
If $A$ is symmetric, then $A^{T} = A$. Hence
$$(Ax) \cdot y = x^{T}Ay = x \cdot (Ay).$$
This identity is the matrix form of self-adjointness. It is the reason symmetric matrices have real eigenvalues and orthogonal eigenspaces.
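The identity can be spot-checked numerically; this sketch uses randomly generated data:

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((3, 3))
A = M + M.T                         # force symmetry
x = rng.standard_normal(3)
y = rng.standard_normal(3)

lhs = (A @ x) @ y                   # (Ax) . y
rhs = x @ (A @ y)                   # x . (Ay)
```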
65.6 Real Eigenvalues
Every real symmetric matrix has real eigenvalues.
To see the idea, suppose
$$Av = \lambda v$$
for a nonzero complex vector $v$. Use the Hermitian inner product and write $\|v\|^{2} = \bar{v}^{T}v$. Since $A$ is real symmetric, it is also Hermitian when viewed as a complex matrix.
Then
$$\bar{v}^{T}Av = \lambda\,\bar{v}^{T}v = \lambda \|v\|^{2}.$$
But $\bar{v}^{T}Av$ is real for a Hermitian matrix. Since $\|v\|^{2}$ is real and positive, $\lambda$ must be real.
Thus symmetric matrices do not produce nonreal eigenvalues. This sharply contrasts with general real matrices, such as rotation matrices.
For example,
$$R = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$$
has eigenvalues $i$ and $-i$. The matrix is not symmetric.
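NumPy makes the contrast concrete: `eigvalsh` is the symmetry-aware routine and returns real eigenvalues, while the general `eigvals` applied to a 90-degree rotation (a sample non-symmetric matrix) returns a complex pair:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])              # symmetric sample matrix
sym_eigs = np.linalg.eigvalsh(A)        # guaranteed real, ascending order

R = np.array([[0.0, -1.0],
              [1.0,  0.0]])             # 90-degree rotation, not symmetric
rot_eigs = np.linalg.eigvals(R)         # complex pair +i, -i
```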
65.7 Orthogonality of Eigenvectors
Eigenvectors of a symmetric matrix corresponding to distinct eigenvalues are orthogonal.
Let
$$Av_1 = \lambda_1 v_1$$
and
$$Av_2 = \lambda_2 v_2,$$
where
$$\lambda_1 \neq \lambda_2.$$
Using the identity from the previous section,
$$(Av_1) \cdot v_2 = v_1 \cdot (Av_2).$$
Substitute the eigenvalue equations:
$$\lambda_1 (v_1 \cdot v_2) = \lambda_2 (v_1 \cdot v_2).$$
Therefore
$$(\lambda_1 - \lambda_2)(v_1 \cdot v_2) = 0.$$
So either the eigenvalues coincide or the inner product vanishes.
Since
$$\lambda_1 - \lambda_2 \neq 0,$$
we get
$$v_1 \cdot v_2 = 0.$$
Thus $v_1$ and $v_2$ are orthogonal.
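A numerical sketch: `np.linalg.eigh` returns an orthonormal set of eigenvectors for a symmetric matrix, so orthogonality holds even when eigenvalues repeat:

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4))
A = M + M.T                             # random symmetric matrix

w, Q = np.linalg.eigh(A)                # eigenvalues, orthonormal eigenvectors
```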
65.8 Orthogonal Diagonalization
A real symmetric matrix can be orthogonally diagonalized.
This means that if $A^{T} = A$, then there exists an orthogonal matrix $Q$ and a real diagonal matrix $\Lambda$ such that
$$A = Q \Lambda Q^{T}.$$
Here
$$Q^{T}Q = QQ^{T} = I.$$
The columns of $Q$ are orthonormal eigenvectors of $A$. The diagonal entries of $\Lambda$ are the corresponding eigenvalues.
This is the finite-dimensional spectral theorem for real symmetric matrices. It is stronger than ordinary diagonalization because the change-of-basis matrix is orthogonal.
65.9 Example of Orthogonal Diagonalization
Let
$$A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}.$$
The characteristic polynomial is
$$\det(A - \lambda I).$$
Thus
$$\det(A - \lambda I) = (2 - \lambda)^{2} - 1.$$
Expanding,
$$\lambda^{2} - 4\lambda + 3 = 0.$$
Hence
$$(\lambda - 1)(\lambda - 3) = 0.$$
The eigenvalues are
$$\lambda_1 = 1, \qquad \lambda_2 = 3.$$
For $\lambda_1 = 1$, one eigenvector is
$$v_1 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}.$$
Normalize it:
$$q_1 = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 \\ -1 \end{pmatrix}.$$
For $\lambda_2 = 3$, one eigenvector is
$$v_2 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}.$$
Normalize it:
$$q_2 = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 \\ 1 \end{pmatrix}.$$
Set
$$Q = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix}$$
and
$$\Lambda = \begin{pmatrix} 1 & 0 \\ 0 & 3 \end{pmatrix}.$$
Then
$$A = Q \Lambda Q^{T}.$$
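Such a factorization can be verified numerically. This sketch rebuilds a small symmetric matrix from an orthogonal eigenvector matrix and a diagonal eigenvalue matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
s = 1.0 / np.sqrt(2.0)
Q = np.array([[ s, s],
              [-s, s]])                 # columns: unit eigenvectors for 1 and 3
Lam = np.diag([1.0, 3.0])
```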
65.10 Quadratic Forms
A symmetric matrix naturally defines a quadratic form:
$$q(x) = x^{T}Ax.$$
For example, if
$$A = \begin{pmatrix} a & b \\ b & c \end{pmatrix},$$
then
$$q(x) = \begin{pmatrix} x_1 & x_2 \end{pmatrix} \begin{pmatrix} a & b \\ b & c \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}.$$
Computing gives
$$q(x) = a x_1^{2} + 2b x_1 x_2 + c x_2^{2}.$$
The off-diagonal entries appear twice, once from each symmetric position.
Every real quadratic form can be represented by a symmetric matrix. If a non-symmetric matrix $M$ appears in $x^{T}Mx$, only its symmetric part matters:
$$x^{T}Mx = x^{T}\left(\tfrac{1}{2}(M + M^{T})\right)x.$$
65.11 Positive Definite Symmetric Matrices
A real symmetric matrix $A$ is positive definite if
$$x^{T}Ax > 0$$
for every nonzero vector $x$.
It is positive semidefinite if
$$x^{T}Ax \geq 0$$
for every vector $x$.
By the spectral theorem, if
$$A = Q \Lambda Q^{T},$$
then with $y = Q^{T}x$,
$$x^{T}Ax = y^{T}\Lambda y = \sum_{i=1}^{n} \lambda_i y_i^{2}.$$
If
$$\lambda_i > 0 \quad \text{for all } i,$$
then
$$x^{T}Ax > 0 \quad \text{for all } x \neq 0.$$
Therefore:
| Type | Eigenvalue condition |
|---|---|
| Positive definite | $\lambda_i > 0$ for all $i$ |
| Positive semidefinite | $\lambda_i \geq 0$ for all $i$ |
| Negative definite | $\lambda_i < 0$ for all $i$ |
| Negative semidefinite | $\lambda_i \leq 0$ for all $i$ |
| Indefinite | Eigenvalues of both signs |
This criterion is one of the main reasons symmetric matrices are important in optimization.
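The eigenvalue criterion translates directly into code. This sketch uses a small tolerance to absorb rounding; the helper `classify` is a hypothetical name, not a library function:

```python
import numpy as np

def classify(A, tol=1e-10):
    """Classify a symmetric matrix by the signs of its eigenvalues."""
    w = np.linalg.eigvalsh(A)
    if np.all(w > tol):
        return "positive definite"
    if np.all(w >= -tol):
        return "positive semidefinite"
    if np.all(w < -tol):
        return "negative definite"
    if np.all(w <= tol):
        return "negative semidefinite"
    return "indefinite"
```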
65.12 Symmetric Matrices in Least Squares
Symmetric matrices arise in least squares problems.
For a matrix $A$ and right-hand side $b$, the normal equations are
$$A^{T}A\hat{x} = A^{T}b.$$
The matrix
$$A^{T}A$$
is always symmetric, because
$$(A^{T}A)^{T} = A^{T}(A^{T})^{T} = A^{T}A.$$
It is also positive semidefinite, since
$$x^{T}A^{T}Ax = \|Ax\|^{2} \geq 0.$$
If the columns of $A$ are linearly independent, then $A^{T}A$ is positive definite.
Thus least squares problems naturally lead to symmetric positive definite matrices.
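A sketch of solving a least squares problem through the normal equations, checked against NumPy's `lstsq`:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((6, 3))     # tall matrix; columns independent here
b = rng.standard_normal(6)

G = A.T @ A                         # symmetric positive definite Gram matrix
x_hat = np.linalg.solve(G, A.T @ b) # solve the normal equations
```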
65.13 Symmetric Matrices in Optimization
Second derivatives of scalar functions are organized into Hessian matrices.
If
$$f : \mathbb{R}^{n} \to \mathbb{R}$$
has continuous second partial derivatives, its Hessian is
$$H_{ij} = \frac{\partial^{2} f}{\partial x_i \, \partial x_j}.$$
By equality of mixed partial derivatives under standard smoothness assumptions,
$$\frac{\partial^{2} f}{\partial x_i \, \partial x_j} = \frac{\partial^{2} f}{\partial x_j \, \partial x_i}.$$
Thus the Hessian is symmetric.
The eigenvalues of the Hessian determine local curvature. Positive definite Hessians describe strict local minima. Negative definite Hessians describe strict local maxima. Indefinite Hessians describe saddle behavior.
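Hessian symmetry shows up even in finite-difference approximations. In this sketch, both the test function `f` and the helper `numerical_hessian` are hypothetical examples:

```python
import numpy as np

def numerical_hessian(f, x, h=1e-4):
    """Central-difference approximation of the Hessian of f at x."""
    n = len(x)
    H = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            e_i = np.zeros(n); e_i[i] = h
            e_j = np.zeros(n); e_j[j] = h
            H[i, j] = (f(x + e_i + e_j) - f(x + e_i - e_j)
                       - f(x - e_i + e_j) + f(x - e_i - e_j)) / (4 * h * h)
    return H

# Smooth test function: f(x, y) = x^2 y + sin(y).
f = lambda v: v[0] ** 2 * v[1] + np.sin(v[1])
H = numerical_hessian(f, np.array([1.0, 2.0]))
```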
65.14 Symmetric Matrices and Graphs
Undirected graphs often produce symmetric matrices.
If $G$ is an undirected graph, its adjacency matrix $A$ satisfies
$$a_{ij} = a_{ji}.$$
This is because an edge from vertex $i$ to vertex $j$ is also an edge from vertex $j$ to vertex $i$.
Thus the adjacency matrix of an undirected graph is symmetric.
The graph Laplacian
$$L = D - A$$
is also symmetric when the graph is undirected. Here $D$ is the degree matrix and $A$ is the adjacency matrix.
The eigenvalues and eigenvectors of these symmetric matrices encode connectivity, clustering, expansion, random walks, and vibration modes.
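A sketch for a 3-vertex path graph (a hypothetical example); the Laplacian comes out symmetric with zero row sums and nonnegative eigenvalues:

```python
import numpy as np

# Adjacency matrix of the undirected path graph 0 -- 1 -- 2.
A = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])

D = np.diag(A.sum(axis=1))          # degree matrix
L = D - A                           # graph Laplacian
```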
65.15 Symmetric Rank-One Matrices
A rank-one symmetric matrix often has the form
$$A = uu^{T}.$$
For any vector $x$,
$$Ax = uu^{T}x = (u^{T}x)\,u.$$
Since
$$u^{T}x$$
is a scalar, $Ax$ is a scalar multiple of $u$.
Thus $A$ maps every vector onto the direction of $u$.
The matrix is positive semidefinite because
$$x^{T}uu^{T}x = (u^{T}x)^{2} \geq 0.$$
If $u \neq 0$, then $uu^{T}$ has rank $1$. Its nonzero eigenvalue is
$$\lambda = u^{T}u = \|u\|^{2},$$
with eigenvector $u$.
65.16 Projection Matrices
An orthogonal projection matrix is symmetric and idempotent.
A matrix $P$ is idempotent if
$$P^{2} = P.$$
If $P$ is also symmetric, then it represents orthogonal projection onto a subspace.
For example, projection onto the line spanned by a unit vector $u$ is
$$P = uu^{T}.$$
Then
$$P^{T} = (uu^{T})^{T} = uu^{T} = P$$
and
$$P^{2} = uu^{T}uu^{T} = u(u^{T}u)u^{T} = uu^{T} = P,$$
since $u^{T}u = 1$.
The eigenvalues of an orthogonal projection are only $0$ and $1$. Vectors in the projected subspace have eigenvalue $1$. Vectors orthogonal to it have eigenvalue $0$.
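A sketch of a rank-one projection built from a unit vector (the vector is a hypothetical example):

```python
import numpy as np

u = np.array([3.0, 4.0])
u = u / np.linalg.norm(u)           # normalize to a unit vector
P = np.outer(u, u)                  # orthogonal projection onto span(u)
```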
65.17 Symmetric Matrices and Singular Value Decomposition
For any real matrix $A$, the matrices
$$A^{T}A$$
and
$$AA^{T}$$
are symmetric positive semidefinite.
Indeed,
$$(A^{T}A)^{T} = A^{T}A, \qquad x^{T}A^{T}Ax = \|Ax\|^{2} \geq 0,$$
and
$$(AA^{T})^{T} = AA^{T}, \qquad y^{T}AA^{T}y = \|A^{T}y\|^{2} \geq 0.$$
The spectral theorem applies to both. The eigenvalues of $A^{T}A$ are nonnegative, and their square roots are the singular values of $A$.
Thus the singular value decomposition is built from symmetric positive semidefinite matrices.
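A sketch comparing the singular values of a random matrix with the eigenvalues of $A^{T}A$:

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((5, 3))

gram_eigs = np.linalg.eigvalsh(A.T @ A)               # ascending, nonnegative
sing_vals = np.linalg.svd(A, compute_uv=False)        # descending
```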
65.18 Numerical Importance
Symmetric matrices are easier and safer to handle numerically than general matrices.
Eigenvalue algorithms can exploit symmetry to reduce work and improve stability. Symmetric matrices have real eigenvalues, orthogonal eigenspaces, and orthogonal diagonalizations. These properties avoid many complications of general nonsymmetric eigenvalue problems.
In numerical linear algebra, symmetric positive definite systems are especially important. They can often be solved efficiently by Cholesky decomposition or conjugate gradient methods.
The Cholesky factorization writes a symmetric positive definite matrix as
$$A = LL^{T},$$
where $L$ is lower triangular. This factorization is a standard tool for solving linear systems, optimization problems, and covariance computations.
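A sketch: build a symmetric positive definite matrix, factor it with `np.linalg.cholesky`, and solve a linear system with two triangular solves:

```python
import numpy as np

rng = np.random.default_rng(5)
M = rng.standard_normal((4, 4))
A = M.T @ M + 4.0 * np.eye(4)       # symmetric positive definite by construction
b = rng.standard_normal(4)

L = np.linalg.cholesky(A)           # A = L L^T with L lower triangular
y = np.linalg.solve(L, b)           # forward-substitution stage
x = np.linalg.solve(L.T, y)         # back-substitution stage
```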
65.19 Common Errors
The first common error is to confuse symmetric with diagonal. Every diagonal matrix is symmetric, but many symmetric matrices have nonzero off-diagonal entries.
The second common error is to assume that $A^{T}A$ has the same eigenvalues as $A$. In general, it does not.
The third common error is to use ordinary diagonalization when orthogonal diagonalization is available. For symmetric matrices, the stronger form
$$A = Q \Lambda Q^{T}, \qquad Q^{T}Q = I,$$
should be used.
The fourth common error is to forget that symmetry is field-dependent. Over complex vector spaces, the correct analogue of real symmetry is usually Hermitian symmetry:
$$A^{*} = \bar{A}^{T} = A,$$
not merely
$$A^{T} = A.$$
The fifth common error is to ignore ordering. If the first column of $Q$ is an eigenvector for $\lambda_1$, then the first diagonal entry of $\Lambda$ must be $\lambda_1$. The ordering of eigenvectors and eigenvalues must match.
65.20 Summary
A symmetric matrix satisfies
$$A^{T} = A.$$
Its entries mirror across the main diagonal.
Real symmetric matrices have real eigenvalues, orthogonal eigenspaces for distinct eigenvalues, and orthogonal diagonalizations of the form
$$A = Q \Lambda Q^{T}.$$
They are the natural matrices for quadratic forms, least squares, optimization, graph theory, projections, and the singular value decomposition.
Symmetry is a strong structural condition. It turns many difficult matrix questions into problems about orthogonal coordinates and real eigenvalues.