# Chapter 66. Hermitian Operators

Hermitian operators are the complex analogue of real symmetric matrices.

In real inner product spaces, symmetry is expressed by

$$
A^T=A.
$$

In complex inner product spaces, the correct analogue uses the conjugate transpose:

$$
A^*=A.
$$

A matrix satisfying this condition is called Hermitian. A linear operator satisfying the corresponding inner product identity is called self-adjoint or Hermitian.

Hermitian operators are central in spectral theory. Their eigenvalues are real, eigenvectors belonging to distinct eigenvalues are orthogonal, and they admit an orthonormal eigenbasis. Equivalently, every Hermitian matrix can be diagonalized by a unitary matrix.

## 66.1 Complex Inner Product Spaces

Let \(V\) be a vector space over \(\mathbb{C}\). An inner product on \(V\) assigns to each pair of vectors \(u,v\in V\) a complex number

$$
\langle u,v\rangle.
$$

The inner product is linear in one argument and conjugate-linear in the other. With the common mathematical convention, it is linear in the first argument:

$$
\langle au+bv,w\rangle =
a\langle u,w\rangle+b\langle v,w\rangle,
$$

and conjugate-linear in the second argument:

$$
\langle u,av+bw\rangle =
\overline{a}\langle u,v\rangle+\overline{b}\langle u,w\rangle.
$$

It also satisfies conjugate symmetry:

$$
\langle u,v\rangle=\overline{\langle v,u\rangle},
$$

and positivity:

$$
\langle v,v\rangle>0
$$

for every nonzero vector \(v\).

For \(V=\mathbb{C}^n\), the standard inner product compatible with this convention is

$$
\langle x,y\rangle =
y^*x =
x_1\overline{y_1}+x_2\overline{y_2}+\cdots+x_n\overline{y_n}.
$$

Under the opposite convention, common in physics, the conjugates fall on the first argument and one writes \(\langle x,y\rangle=x^*y\). Either way, the essential point is that complex conjugation is part of the inner product.

## 66.2 Conjugate Transpose

For a complex matrix \(A\), the conjugate transpose is denoted by

$$
A^*.
$$

It is obtained by first transposing the matrix and then conjugating every entry:

$$
A^*=\overline{A}^{\,T}.
$$

If

$$
A=
\begin{bmatrix}
1+i & 2 \\
3i & 4-i
\end{bmatrix},
$$

then

$$
A^*=
\begin{bmatrix}
1-i & -3i \\
2 & 4+i
\end{bmatrix}.
$$

The conjugate transpose is also called the adjoint matrix.

When all entries of \(A\) are real,

$$
A^*=A^T.
$$

Thus Hermitian matrices generalize real symmetric matrices.
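This operation is easy to carry out numerically. The sketch below (assuming NumPy is available) computes the conjugate transpose of the example matrix above via `.conj().T` and checks that for a real matrix it reduces to the ordinary transpose:

```python
import numpy as np

# The example matrix from the text.
A = np.array([[1 + 1j, 2],
              [3j, 4 - 1j]])

# Conjugate transpose: transpose the matrix, then conjugate every entry.
A_star = A.conj().T

# For a matrix with only real entries, A* is just the transpose.
R = np.array([[1.0, 2.0],
              [3.0, 4.0]])
assert np.allclose(R.conj().T, R.T)
```

Here `A_star` matches the matrix computed entry by entry above.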

## 66.3 Definition of Hermitian Matrix

A complex square matrix \(A\) is Hermitian if

$$
A^*=A.
$$

Equivalently, its entries satisfy

$$
a_{ij}=\overline{a_{ji}}.
$$

This means the entries below the diagonal are conjugates of the corresponding entries above the diagonal.

For example,

$$
A=
\begin{bmatrix}
2 & 1+i \\
1-i & 3
\end{bmatrix}
$$

is Hermitian.

The diagonal entries of a Hermitian matrix must be real. Indeed, setting \(i=j\) in the condition \(a_{ij}=\overline{a_{ji}}\) gives \(a_{ii}=\overline{a_{ii}}\): each diagonal entry equals its own complex conjugate, so it is real.

The matrix

$$
B=
\begin{bmatrix}
2 & 1+i \\
1+i & 3
\end{bmatrix}
$$

is not Hermitian, because the off-diagonal entries are equal rather than complex conjugates of each other.
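A small helper makes this comparison mechanical. `is_hermitian` below is an illustrative name, not a library function; it simply tests \(A^*=A\) entrywise with a floating-point tolerance:

```python
import numpy as np

def is_hermitian(M, tol=1e-12):
    """Illustrative check that M is square and satisfies M* = M."""
    return M.shape[0] == M.shape[1] and np.allclose(M, M.conj().T, atol=tol)

A = np.array([[2, 1 + 1j],
              [1 - 1j, 3]])   # conjugate off-diagonal pair: Hermitian
B = np.array([[2, 1 + 1j],
              [1 + 1j, 3]])   # equal off-diagonal entries: not Hermitian

assert is_hermitian(A)
assert not is_hermitian(B)
```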

## 66.4 Hermitian Operators

Let \(V\) be a complex inner product space. A linear operator

$$
T:V\to V
$$

is Hermitian, or self-adjoint, if

$$
\langle Tv,w\rangle=\langle v,Tw\rangle
$$

for all \(v,w\in V\).

This identity says that \(T\) can be moved from one side of the inner product to the other without changing the value.

In matrix form, relative to an orthonormal basis, this condition is exactly

$$
A^*=A.
$$

Thus Hermitian matrices are the coordinate representations of Hermitian operators.

## 66.5 Relation to Real Symmetric Matrices

If a Hermitian matrix has only real entries, then

$$
A^*=A^T.
$$

Therefore the Hermitian condition becomes

$$
A^T=A.
$$

So real Hermitian matrices are precisely real symmetric matrices.

This gives the following correspondence:

| Real case | Complex case |
|---|---|
| Symmetric matrix | Hermitian matrix |
| Orthogonal matrix | Unitary matrix |
| Transpose \(A^T\) | Conjugate transpose \(A^*\) |
| Orthogonal diagonalization | Unitary diagonalization |
| Euclidean inner product | Hermitian inner product |

The complex case requires conjugation because the geometry of \(\mathbb{C}^n\) is governed by the Hermitian inner product.

## 66.6 Real Eigenvalues

Every Hermitian operator has real eigenvalues.

Let

$$
Av=\lambda v
$$

with \(v\neq 0\). Then

$$
\langle Av,v\rangle =
\langle \lambda v,v\rangle.
$$

Using linearity,

$$
\langle Av,v\rangle=\lambda\langle v,v\rangle.
$$

Since \(A\) is Hermitian,

$$
\langle Av,v\rangle=\langle v,Av\rangle.
$$

Substitute \(Av=\lambda v\):

$$
\langle v,Av\rangle =
\langle v,\lambda v\rangle.
$$

By conjugate-linearity in the second argument,

$$
\langle v,\lambda v\rangle =
\overline{\lambda}\langle v,v\rangle.
$$

Hence

$$
\lambda\langle v,v\rangle =
\overline{\lambda}\langle v,v\rangle.
$$

Since

$$
\langle v,v\rangle>0,
$$

we obtain

$$
\lambda=\overline{\lambda}.
$$

Therefore \(\lambda\) is real.

This is one of the defining strengths of Hermitian operators: although the vector space is complex, the spectral values are real.
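The realness of the spectrum is visible numerically. The sketch below builds a random Hermitian matrix as \((M+M^*)/2\), which is always Hermitian since this average equals its own conjugate transpose, and checks that even the general (non-Hermitian) eigensolver returns eigenvalues with vanishing imaginary parts:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
H = (M + M.conj().T) / 2   # Hermitian by construction

# np.linalg.eigvals does not assume any structure, yet the
# imaginary parts vanish up to roundoff because H is Hermitian.
eigvals = np.linalg.eigvals(H)
assert np.allclose(eigvals.imag, 0, atol=1e-10)
```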

## 66.7 Orthogonality of Eigenvectors

Eigenvectors corresponding to distinct eigenvalues of a Hermitian operator are orthogonal.

Let

$$
Av=\lambda v
$$

and

$$
Aw=\mu w,
$$

where

$$
\lambda\neq\mu.
$$

Using the Hermitian identity,

$$
\langle Av,w\rangle=\langle v,Aw\rangle.
$$

Substitute the eigenvalue equations:

$$
\langle \lambda v,w\rangle=\langle v,\mu w\rangle.
$$

This gives

$$
\lambda\langle v,w\rangle =
\overline{\mu}\langle v,w\rangle.
$$

Since Hermitian eigenvalues are real,

$$
\overline{\mu}=\mu.
$$

Thus

$$
(\lambda-\mu)\langle v,w\rangle=0.
$$

Since

$$
\lambda\neq\mu,
$$

we conclude that

$$
\langle v,w\rangle=0.
$$

So the eigenvectors are orthogonal.
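The same orthogonality can be observed numerically for a small Hermitian matrix with distinct eigenvalues. The sketch uses the general eigensolver, so no orthogonality is imposed by the algorithm; note that `np.vdot` conjugates its first argument, matching the Hermitian inner product:

```python
import numpy as np

A = np.array([[2, 1j],
              [-1j, 2]])       # Hermitian, eigenvalues 1 and 3 (distinct)

# General solver: eigenvectors are not orthogonalized by the routine.
w, V = np.linalg.eig(A)

# Distinct eigenvalues force orthogonality under the Hermitian inner product.
inner = np.vdot(V[:, 0], V[:, 1])
assert abs(inner) < 1e-10
```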

## 66.8 Spectral Theorem for Hermitian Matrices

The spectral theorem for Hermitian matrices states that every Hermitian matrix has an orthonormal basis of eigenvectors.

Equivalently, if \(A=A^*\), then there exists a unitary matrix \(U\) and a real diagonal matrix \(\Lambda\) such that

$$
A=U\Lambda U^*.
$$

The columns of \(U\) are orthonormal eigenvectors of \(A\). The diagonal entries of \(\Lambda\) are the corresponding real eigenvalues.

This is the complex version of orthogonal diagonalization for real symmetric matrices. The spectral theorem says, more generally, that normal operators on finite-dimensional complex inner product spaces admit orthonormal eigenbases; Hermitian operators form a particularly important subclass, the one whose eigenvalues are real.
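Numerically, `np.linalg.eigh` realizes this theorem for Hermitian input: it returns real eigenvalues in ascending order together with a matrix of eigenvectors that is unitary. A sketch on a random Hermitian matrix:

```python
import numpy as np

rng = np.random.default_rng(5)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
H = (M + M.conj().T) / 2          # Hermitian by construction

# eigh: real eigenvalues (ascending) and unitary eigenvector matrix U.
lam, U = np.linalg.eigh(H)

assert np.allclose(U.conj().T @ U, np.eye(4))          # U is unitary
assert np.allclose(U @ np.diag(lam) @ U.conj().T, H)   # A = U Lambda U*
```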

## 66.9 Unitary Matrices

A complex square matrix \(U\) is unitary if

$$
U^*U=I.
$$

Equivalently,

$$
U^{-1}=U^*.
$$

The columns of a unitary matrix form an orthonormal basis of \(\mathbb{C}^n\).

Unitary matrices preserve inner products:

$$
\langle Ux,Uy\rangle=\langle x,y\rangle.
$$

They also preserve norms:

$$
\|Ux\|=\|x\|.
$$

In real linear algebra, orthogonal matrices represent rotations and reflections. In complex linear algebra, unitary matrices play the same structural role.

Thus the decomposition

$$
A=U\Lambda U^*
$$

means that \(A\) acts by a unitary change of coordinates, followed by real diagonal scaling, followed by the inverse unitary change of coordinates.
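These properties are easy to verify. The QR factorization of an invertible complex matrix produces a unitary factor \(Q\), which the sketch below checks against \(U^*U=I\) and norm preservation:

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

# QR factorization of an invertible matrix yields a unitary Q.
Q, _ = np.linalg.qr(M)

assert np.allclose(Q.conj().T @ Q, np.eye(4))   # U*U = I

# Unitary matrices preserve norms.
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))
```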

## 66.10 Example of a Hermitian Matrix

Consider

$$
A=
\begin{bmatrix}
2 & i \\
-i & 2
\end{bmatrix}.
$$

Then

$$
A^*=
\begin{bmatrix}
2 & i \\
-i & 2
\end{bmatrix}=A,
$$

so \(A\) is Hermitian.

Compute its characteristic polynomial:

$$
\det(A-\lambda I) =
\det
\begin{bmatrix}
2-\lambda & i \\
-i & 2-\lambda
\end{bmatrix}.
$$

Thus

$$
\det(A-\lambda I) =
(2-\lambda)^2-i(-i).
$$

Since

$$
i(-i)=1,
$$

we get

$$
\det(A-\lambda I) =
(2-\lambda)^2-1.
$$

Hence

$$
(2-\lambda)^2-1=0.
$$

So

$$
2-\lambda=\pm 1.
$$

The eigenvalues are

$$
\lambda=1
\qquad
\text{and}
\qquad
\lambda=3.
$$

Both are real, as expected.
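As a check, NumPy's Hermitian eigensolver recovers these two values, returned in ascending order as a real array:

```python
import numpy as np

A = np.array([[2, 1j],
              [-1j, 2]])

# eigvalsh assumes Hermitian input and returns real eigenvalues, ascending.
w = np.linalg.eigvalsh(A)   # eigenvalues 1 and 3, up to roundoff
assert np.allclose(w, [1.0, 3.0])
```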

## 66.11 Eigenvectors in the Example

For

$$
\lambda=3,
$$

we solve

$$
(A-3I)v=0.
$$

Now

$$
A-3I=
\begin{bmatrix}
-1 & i \\
-i & -1
\end{bmatrix}.
$$

Let

$$
v=
\begin{bmatrix}
x \\
y
\end{bmatrix}.
$$

The first equation is

$$
-x+iy=0.
$$

Thus

$$
x=iy.
$$

Take

$$
y=1.
$$

Then

$$
x=i.
$$

So one eigenvector is

$$
v_1=
\begin{bmatrix}
i \\
1
\end{bmatrix}.
$$

For

$$
\lambda=1,
$$

we solve

$$
(A-I)v=0.
$$

Now

$$
A-I=
\begin{bmatrix}
1 & i \\
-i & 1
\end{bmatrix}.
$$

The first equation is

$$
x+iy=0.
$$

Thus

$$
x=-iy.
$$

Take

$$
y=1.
$$

Then

$$
x=-i.
$$

So one eigenvector is

$$
v_2=
\begin{bmatrix}
-i \\
1
\end{bmatrix}.
$$
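Both eigenvectors can be verified directly by multiplying them through \(A\):

```python
import numpy as np

A = np.array([[2, 1j],
              [-1j, 2]])

v1 = np.array([1j, 1])    # eigenvector for lambda = 3
v2 = np.array([-1j, 1])   # eigenvector for lambda = 1

# A v = lambda v holds for both pairs.
assert np.allclose(A @ v1, 3 * v1)
assert np.allclose(A @ v2, 1 * v2)
```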

## 66.12 Orthogonality in the Example

Using the standard Hermitian inner product, compute

$$
\langle v_1,v_2\rangle.
$$

With

$$
v_1=
\begin{bmatrix}
i \\
1
\end{bmatrix},
\qquad
v_2=
\begin{bmatrix}
-i \\
1
\end{bmatrix},
$$

we have

$$
\langle v_1,v_2\rangle =
i\,\overline{(-i)}+1\cdot\overline{1}.
$$

Since

$$
\overline{-i}=i,
$$

this becomes

$$
i\cdot i+1.
$$

Now

$$
i\cdot i=-1.
$$

Therefore

$$
\langle v_1,v_2\rangle=-1+1=0.
$$

The eigenvectors are orthogonal.

Normalize them:

$$
u_1=\frac{1}{\sqrt2}
\begin{bmatrix}
i \\
1
\end{bmatrix},
\qquad
u_2=\frac{1}{\sqrt2}
\begin{bmatrix}
-i \\
1
\end{bmatrix}.
$$

Then \(u_1,u_2\) form an orthonormal basis of \(\mathbb{C}^2\).

## 66.13 Spectral Decomposition

If

$$
A=U\Lambda U^*
$$

with orthonormal eigenvectors

$$
u_1,u_2,\ldots,u_n
$$

and eigenvalues

$$
\lambda_1,\lambda_2,\ldots,\lambda_n,
$$

then

$$
A=
\lambda_1u_1u_1^*
+
\lambda_2u_2u_2^*
+
\cdots
+
\lambda_nu_nu_n^*.
$$

Each matrix

$$
u_iu_i^*
$$

is the orthogonal projection onto the one-dimensional subspace spanned by \(u_i\).

If an eigenvalue has multiplicity greater than one, one may group the terms by eigenspace. If the distinct eigenvalues are

$$
\alpha_1,\ldots,\alpha_k,
$$

and \(P_j\) is the orthogonal projection onto \(E_{\alpha_j}\), then

$$
A=\alpha_1P_1+\cdots+\alpha_kP_k.
$$

This is the spectral decomposition of a Hermitian operator.
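For the \(2\times 2\) example above, the decomposition can be reproduced with outer products \(u_iu_i^*\). The sketch also checks that each \(u_iu_i^*\) behaves as an orthogonal projection (it is idempotent, and the two projections annihilate each other):

```python
import numpy as np

A = np.array([[2, 1j],
              [-1j, 2]])

u1 = np.array([1j, 1]) / np.sqrt(2)    # eigenvalue 3
u2 = np.array([-1j, 1]) / np.sqrt(2)   # eigenvalue 1

# Rank-one projections u u* as outer products.
P1 = np.outer(u1, u1.conj())
P2 = np.outer(u2, u2.conj())

assert np.allclose(3 * P1 + 1 * P2, A)          # spectral decomposition
assert np.allclose(P1 @ P1, P1)                  # idempotent
assert np.allclose(P1 @ P2, np.zeros((2, 2)))    # orthogonal ranges
```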

## 66.14 Hermitian Forms and Quadratic Quantities

For a Hermitian matrix \(A\), the scalar

$$
x^*Ax
$$

is always real.

Indeed,

$$
(x^*Ax)^* =
x^*A^*x.
$$

Since \(A^*=A\),

$$
(x^*Ax)^*=x^*Ax.
$$

A complex number equal to its own conjugate is real.

The expression \(x^*Ax\) is called a Hermitian form. It is the complex analogue of the real quadratic form

$$
x^TAx.
$$

Hermitian forms appear in optimization, statistics, signal processing, numerical analysis, and quantum mechanics.
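A quick numerical check that \(x^*Ax\) is real for a randomly generated Hermitian matrix:

```python
import numpy as np

rng = np.random.default_rng(3)
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
H = (M + M.conj().T) / 2   # Hermitian by construction

x = rng.standard_normal(3) + 1j * rng.standard_normal(3)

# The Hermitian form x* H x: a complex expression with a real value.
q = x.conj() @ H @ x
assert abs(q.imag) < 1e-10
```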

## 66.15 Positive Definite Hermitian Matrices

A Hermitian matrix \(A\) is positive definite if

$$
x^*Ax>0
$$

for every nonzero vector \(x\in\mathbb{C}^n\).

It is positive semidefinite if

$$
x^*Ax\geq 0
$$

for every \(x\).

By the spectral theorem, write

$$
A=U\Lambda U^*.
$$

Let

$$
y=U^*x.
$$

Then

$$
x^*Ax =
x^*U\Lambda U^*x =
y^*\Lambda y.
$$

If

$$
\Lambda=\operatorname{diag}(\lambda_1,\ldots,\lambda_n),
$$

then

$$
y^*\Lambda y =
\lambda_1|y_1|^2+\cdots+\lambda_n|y_n|^2.
$$

Therefore:

| Type | Eigenvalue condition |
|---|---|
| Positive definite | All eigenvalues are positive |
| Positive semidefinite | All eigenvalues are nonnegative |
| Negative definite | All eigenvalues are negative |
| Negative semidefinite | All eigenvalues are nonpositive |
| Indefinite | Eigenvalues of both signs |

The same eigenvalue criterion used for real symmetric matrices holds for Hermitian matrices.
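The table translates directly into code. `definiteness` below is an illustrative helper, not a library routine; it classifies a Hermitian matrix by the signs of its eigenvalues, with a small tolerance for roundoff:

```python
import numpy as np

def definiteness(H, tol=1e-12):
    """Classify a Hermitian matrix by the signs of its eigenvalues."""
    w = np.linalg.eigvalsh(H)
    if np.all(w > tol):
        return "positive definite"
    if np.all(w >= -tol):
        return "positive semidefinite"
    if np.all(w < -tol):
        return "negative definite"
    if np.all(w <= tol):
        return "negative semidefinite"
    return "indefinite"

A = np.array([[2, 1j], [-1j, 2]])          # eigenvalues 1 and 3
assert definiteness(A) == "positive definite"
assert definiteness(np.diag([1.0, -1.0])) == "indefinite"
```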

## 66.16 Rayleigh Quotient

For a Hermitian matrix \(A\), the Rayleigh quotient of a nonzero vector \(x\) is

$$
R_A(x)=\frac{x^*Ax}{x^*x}.
$$

Since \(x^*Ax\) and \(x^*x\) are real, the Rayleigh quotient is real.

If \(x\) is an eigenvector with eigenvalue \(\lambda\), then

$$
R_A(x)=\lambda.
$$

Indeed,

$$
R_A(x) =
\frac{x^*Ax}{x^*x} =
\frac{x^*(\lambda x)}{x^*x} =
\lambda.
$$

The Rayleigh quotient connects Hermitian eigenvalues with optimization. The largest eigenvalue is the maximum value of \(R_A(x)\) over all nonzero \(x\), and the smallest eigenvalue is the minimum value.
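The extremal characterization can be probed by sampling. The sketch below evaluates the Rayleigh quotient at random vectors and confirms that every value lies between the smallest and largest eigenvalues; `rayleigh` is an illustrative helper, not a NumPy function:

```python
import numpy as np

rng = np.random.default_rng(4)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
H = (M + M.conj().T) / 2            # Hermitian by construction
lam = np.linalg.eigvalsh(H)         # real eigenvalues, ascending

def rayleigh(H, x):
    """Rayleigh quotient x*Hx / x*x, real for Hermitian H."""
    return ((x.conj() @ H @ x) / (x.conj() @ x)).real

# Every sample falls in [lambda_min, lambda_max].
samples = [rayleigh(H, rng.standard_normal(4) + 1j * rng.standard_normal(4))
           for _ in range(200)]
assert min(samples) >= lam[0] - 1e-9
assert max(samples) <= lam[-1] + 1e-9
```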

## 66.17 Hermitian Operators in Quantum Mechanics

Hermitian operators are the standard mathematical model for observables in quantum mechanics.

The reason is spectral. Measurements are represented by eigenvalues, and possible measurement values must be real. Hermitian operators guarantee real eigenvalues.

If a system is in an eigenvector state \(v\), and an observable is represented by a Hermitian operator \(A\), then

$$
Av=\lambda v
$$

means the observable has definite value \(\lambda\) in that state.

The spectral decomposition represents the observable as a sum of measurement values times projection operators.

This is one of the most important applications of Hermitian linear algebra.

## 66.18 Hermitian Matrices in Numerical Linear Algebra

Hermitian matrices are numerically favorable.

Their eigenvalues are real. Their eigenvectors can be chosen orthonormally. Their diagonalization uses unitary matrices, which preserve norms and therefore do not amplify rounding errors.

Hermitian positive definite systems can be solved using Cholesky factorization:

$$
A=LL^*,
$$

where \(L\) is lower triangular.

They also support efficient iterative methods, such as conjugate gradient methods, when the matrix is large and sparse.

Many numerical algorithms preserve Hermitian structure because losing that structure can introduce artificial complex eigenvalues or unstable behavior.
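A sketch of this workflow with NumPy, assuming a Hermitian positive definite \(A\) (Cholesky fails otherwise): factor once, then solve \(Ax=b\) with two triangular solves. For brevity the sketch reuses the general solver `np.linalg.solve` on the triangular factors; a production code would use a dedicated triangular solver such as SciPy's `solve_triangular`:

```python
import numpy as np

# A small Hermitian positive definite example (leading minors 4 and 10 > 0).
A = np.array([[4, 1 + 1j],
              [1 - 1j, 3]])

L = np.linalg.cholesky(A)            # lower triangular, A = L L*
assert np.allclose(L @ L.conj().T, A)

# Solve A x = b as two triangular systems: L y = b, then L* x = y.
b = np.array([1.0, 2.0 + 1j])
y = np.linalg.solve(L, b)
x = np.linalg.solve(L.conj().T, y)
assert np.allclose(A @ x, b)
```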

## 66.19 Hermitian, Normal, and Unitary Matrices

Hermitian matrices are part of a larger family called normal matrices.

A matrix \(A\) is normal if

$$
A^*A=AA^*.
$$

Every Hermitian matrix is normal, because if \(A^*=A\), then

$$
A^*A=A^2=AA^*.
$$

Every unitary matrix is also normal, because

$$
U^*U=UU^*=I.
$$

The spectral theorem for normal matrices says that a complex matrix is unitarily diagonalizable if and only if it is normal. Hermitian matrices are the normal matrices whose eigenvalues are real.

| Class | Defining condition | Spectral property |
|---|---|---|
| Hermitian | \(A^*=A\) | Unitarily diagonalizable with real eigenvalues |
| Unitary | \(A^*A=I\) | Unitarily diagonalizable with eigenvalues on the unit circle |
| Normal | \(A^*A=AA^*\) | Unitarily diagonalizable |
| Real symmetric | \(A^T=A\) | Orthogonally diagonalizable with real eigenvalues |
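The containments in this table can be spot-checked numerically. `is_normal` is an illustrative helper testing \(A^*A=AA^*\):

```python
import numpy as np

def is_normal(A, tol=1e-12):
    """Illustrative check of the normality condition A*A = AA*."""
    return np.allclose(A.conj().T @ A, A @ A.conj().T, atol=tol)

H = np.array([[2, 1j], [-1j, 2]])                    # Hermitian
Q, _ = np.linalg.qr(np.array([[1.0, 2], [3, 4]]))    # real unitary (orthogonal)
N = np.array([[1, 1], [0, 1]])                       # a shear: not normal

assert is_normal(H)        # every Hermitian matrix is normal
assert is_normal(Q)        # every unitary matrix is normal
assert not is_normal(N)    # not all matrices are normal
```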

## 66.20 Common Errors

The first common error is to use the transpose instead of the conjugate transpose. Over complex vector spaces, the correct condition is

$$
A^*=A,
$$

not merely

$$
A^T=A.
$$

The second common error is to assume a Hermitian matrix must have real entries. Hermitian matrices may have complex off-diagonal entries, but those entries must occur in conjugate pairs.

The third common error is to forget that diagonal entries must be real.

The fourth common error is to use ordinary dot products without conjugation. Complex inner product geometry requires conjugation.

The fifth common error is to confuse Hermitian and unitary. Hermitian means

$$
A^*=A.
$$

Unitary means

$$
A^*A=I.
$$

A matrix may be both, but the conditions are different.

## 66.21 Summary

A Hermitian matrix satisfies

$$
A^*=A.
$$

A Hermitian operator satisfies

$$
\langle Tv,w\rangle=\langle v,Tw\rangle.
$$

Hermitian operators are the complex analogue of real symmetric operators. Their eigenvalues are real. Eigenvectors corresponding to distinct eigenvalues are orthogonal. They admit an orthonormal eigenbasis.

In matrix form, every Hermitian matrix has a unitary diagonalization

$$
A=U\Lambda U^*,
$$

where \(\Lambda\) is real diagonal.

Hermitian operators form one of the central classes of linear algebra because they combine complex vector spaces with real spectral values and orthogonal geometry.
