
Chapter 54. Unitary and Orthogonal Matrices

Orthogonal and unitary matrices are the matrices that preserve inner product geometry. They preserve lengths, angles, distances, and orthogonality. In real vector spaces, the relevant matrices are orthogonal. In complex vector spaces, the corresponding matrices are unitary.

A real square matrix Q is orthogonal when

Q^TQ = I.

Equivalently,

Q^{-1} = Q^T.

A complex square matrix U is unitary when

U^*U = I,

where U^* denotes the conjugate transpose. Equivalently,

U^{-1} = U^*.

Orthogonal matrices are real unitary matrices. Both classes preserve inner products and therefore act as rigid transformations of Euclidean or complex inner product space.

54.1 Orthogonal Matrices

Let Q\in\mathbb{R}^{n\times n}. The matrix Q is orthogonal if

Q^TQ = I.

This condition means that the columns of Q form an orthonormal basis of \mathbb{R}^n. If

Q = \begin{bmatrix} | & | & & |\\ q_1 & q_2 & \cdots & q_n\\ | & | & & | \end{bmatrix},

then

Q^TQ = \begin{bmatrix} q_1^Tq_1 & q_1^Tq_2 & \cdots & q_1^Tq_n\\ q_2^Tq_1 & q_2^Tq_2 & \cdots & q_2^Tq_n\\ \vdots & \vdots & \ddots & \vdots\\ q_n^Tq_1 & q_n^Tq_2 & \cdots & q_n^Tq_n \end{bmatrix}.

Thus Q^TQ = I means

q_i^Tq_j = \delta_{ij}.

The columns are unit vectors, and distinct columns are orthogonal.

Since Q is square, the same condition also implies

QQ^T = I.

Therefore the rows of Q are also orthonormal.
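As a quick illustration (a NumPy sketch, not part of the text's derivation), a permutation matrix is one of the simplest orthogonal matrices, and both the column and row conditions can be checked numerically:

```python
import numpy as np

# A permutation matrix is a simple orthogonal matrix.
Q = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])

print(np.allclose(Q.T @ Q, np.eye(3)))   # columns orthonormal
print(np.allclose(Q @ Q.T, np.eye(3)))   # rows orthonormal
```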

54.2 Inverse of an Orthogonal Matrix

If Q is orthogonal, then

Q^TQ = I.

Since Q is square, this implies

Q^{-1} = Q^T.

This is one of the main computational advantages of orthogonal matrices. The inverse is obtained by transposition. No elimination or matrix inversion algorithm is required.

For example,

Q = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 & 1\\ 1 & -1 \end{bmatrix}

is orthogonal because

Q^TQ = I.

Also,

Q^{-1} = Q^T = Q.

This particular matrix is symmetric as well as orthogonal, so it is its own inverse.
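The example above can be verified numerically; this short NumPy sketch (an illustration, not from the text) confirms orthogonality, that the inverse equals the transpose, and that the matrix is an involution:

```python
import numpy as np

# The symmetric orthogonal matrix from the example above.
Q = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)

print(np.allclose(Q.T @ Q, np.eye(2)))      # orthogonality: Q^T Q = I
print(np.allclose(np.linalg.inv(Q), Q.T))   # inverse equals transpose
print(np.allclose(Q @ Q, np.eye(2)))        # Q is its own inverse
```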

54.3 Preservation of Inner Products

Orthogonal matrices preserve inner products.

Let Q\in\mathbb{R}^{n\times n} be orthogonal. Then for any x,y\in\mathbb{R}^n,

\langle Qx,Qy\rangle = (Qx)^T(Qy).

Using matrix multiplication,

(Qx)^T(Qy) = x^TQ^TQy.

Since Q^TQ = I,

x^TQ^TQy = x^Ty.

Therefore

\langle Qx,Qy\rangle = \langle x,y\rangle.

This identity is the central meaning of orthogonality for matrices. Orthogonal matrices preserve the dot product. Since length and angle are computed from the dot product, they preserve Euclidean geometry.
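The identity can be tested on a random orthogonal matrix; the sketch below (an assumption-free numerical check, using NumPy's QR factorization to produce an orthogonal Q) shows the dot product is unchanged up to roundoff:

```python
import numpy as np

rng = np.random.default_rng(0)
# The Q factor of a QR factorization is orthogonal.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

x = rng.standard_normal(4)
y = rng.standard_normal(4)

# <Qx, Qy> equals <x, y> up to floating-point roundoff.
print(np.allclose((Q @ x) @ (Q @ y), x @ y))
```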

54.4 Preservation of Norms

Taking y = x in the inner product identity gives

\langle Qx,Qx\rangle = \langle x,x\rangle.

Thus

\|Qx\|_2^2 = \|x\|_2^2.

Since norms are nonnegative,

\|Qx\|_2 = \|x\|_2.

So an orthogonal matrix does not stretch or shrink vectors. It may rotate, reflect, or permute directions, but it preserves length.

This is why orthogonal matrices are called length-preserving transformations or isometries.

54.5 Preservation of Distance

For any x,y\in\mathbb{R}^n,

\|Qx-Qy\|_2 = \|Q(x-y)\|_2.

Since Q preserves norms,

\|Q(x-y)\|_2 = \|x-y\|_2.

Therefore

\|Qx-Qy\|_2 = \|x-y\|_2.

An orthogonal matrix preserves distances between points. This confirms the geometric interpretation: orthogonal matrices are rigid linear transformations.
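Both the norm and distance identities can be confirmed together in a few lines; this NumPy sketch (illustrative, with a random orthogonal Q from a QR factorization) checks them on random vectors:

```python
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))
x, y = rng.standard_normal(5), rng.standard_normal(5)

# Lengths and distances are unchanged by an orthogonal matrix.
print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))
print(np.isclose(np.linalg.norm(Q @ x - Q @ y), np.linalg.norm(x - y)))
```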

54.6 Preservation of Angles

For nonzero vectors x and y, the angle \theta between them is determined by

\cos\theta = \frac{x^Ty}{\|x\|_2\|y\|_2}.

After applying Q, the corresponding value is

\frac{(Qx)^T(Qy)}{\|Qx\|_2\|Qy\|_2}.

Since Q preserves inner products and norms, this equals

\frac{x^Ty}{\|x\|_2\|y\|_2}.

Thus orthogonal matrices preserve angles.

In particular, if

x \perp y,

then

Qx \perp Qy.

Orthogonality is preserved under orthogonal transformations.

54.7 Rotations in the Plane

The standard rotation matrix in \mathbb{R}^2 is

R_\theta = \begin{bmatrix} \cos\theta & -\sin\theta\\ \sin\theta & \cos\theta \end{bmatrix}.

Its columns are

\begin{bmatrix} \cos\theta\\ \sin\theta \end{bmatrix}, \qquad \begin{bmatrix} -\sin\theta\\ \cos\theta \end{bmatrix}.

Each has length one, and their dot product is

-\cos\theta\sin\theta + \sin\theta\cos\theta = 0.

Therefore

R_\theta^TR_\theta = I.

So R_\theta is orthogonal.

It has determinant

\det R_\theta = \cos^2\theta + \sin^2\theta = 1.

Thus it preserves orientation as well as length and angle. It is a pure rotation.
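Both properties of R_\theta are easy to verify numerically; a minimal NumPy sketch (illustrative, with a hypothetical helper `rotation`):

```python
import numpy as np

def rotation(theta):
    """Standard 2x2 rotation matrix R_theta."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s],
                     [s,  c]])

R = rotation(np.pi / 3)
print(np.allclose(R.T @ R, np.eye(2)))     # orthogonal
print(np.isclose(np.linalg.det(R), 1.0))   # orientation-preserving
```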

54.8 Reflections

A reflection matrix is also orthogonal.

For example,

F = \begin{bmatrix} 1 & 0\\ 0 & -1 \end{bmatrix}

reflects the plane across the x-axis. It satisfies

F^TF = I.

Thus it is orthogonal.

Its determinant is

\det F = -1.

The determinant distinguishes rotations from reflections in two dimensions. Orthogonal matrices with determinant 1 preserve orientation. Orthogonal matrices with determinant -1 reverse orientation.

54.9 Determinant of an Orthogonal Matrix

If Q is orthogonal, then

Q^TQ = I.

Taking determinants gives

\det(Q^TQ) = \det I.

Using determinant rules,

\det(Q^T)\det(Q) = 1.

Since

\det(Q^T) = \det(Q),

we obtain

(\det Q)^2 = 1.

Therefore

\det Q = \pm 1.

Every orthogonal matrix has determinant 1 or -1. The converse is false. A matrix may have determinant \pm 1 without being orthogonal.
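A shear matrix is a standard counterexample to the converse; this NumPy sketch (illustrative) shows a matrix with determinant 1 that fails the orthogonality test:

```python
import numpy as np

# A shear has determinant 1 but is not orthogonal.
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])

print(np.isclose(np.linalg.det(S), 1.0))   # determinant is 1
print(np.allclose(S.T @ S, np.eye(2)))     # False: S^T S = [[1,1],[1,2]] != I
```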

54.10 The Orthogonal Group

The set of all n\times n orthogonal matrices is denoted

O(n).

It is called the orthogonal group.

It is a group under matrix multiplication. If Q_1,Q_2\in O(n), then

(Q_1Q_2)^T(Q_1Q_2) = Q_2^TQ_1^TQ_1Q_2 = Q_2^TQ_2 = I.

Thus Q_1Q_2\in O(n).

The identity matrix belongs to O(n), and the inverse of an orthogonal matrix is orthogonal because

Q^{-1} = Q^T.

The subgroup of orthogonal matrices with determinant 1 is denoted

SO(n).

It is called the special orthogonal group. In \mathbb{R}^2 and \mathbb{R}^3, its elements are rotations.
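The closure property can be checked on random group elements; a minimal NumPy sketch (illustrative, using QR factorizations to generate orthogonal matrices):

```python
import numpy as np

rng = np.random.default_rng(2)
Q1, _ = np.linalg.qr(rng.standard_normal((3, 3)))
Q2, _ = np.linalg.qr(rng.standard_normal((3, 3)))

P = Q1 @ Q2
# The product of two orthogonal matrices is orthogonal.
print(np.allclose(P.T @ P, np.eye(3)))
```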

54.11 Unitary Matrices

Let U\in\mathbb{C}^{n\times n}. The matrix U is unitary if

U^*U = I,

where

U^* = \overline{U}^T

is the conjugate transpose.

The columns of U form an orthonormal basis of \mathbb{C}^n with respect to the standard complex inner product.

If

U = \begin{bmatrix} | & | & & |\\ u_1 & u_2 & \cdots & u_n\\ | & | & & | \end{bmatrix},

then

U^*U = I

means

u_i^*u_j = \delta_{ij}.

Thus the columns have complex norm one and are mutually orthogonal.

54.12 Inverse of a Unitary Matrix

If U is unitary, then

U^*U = I.

Since U is square, this also gives

UU^* = I.

Therefore

U^{-1} = U^*.

As in the real case, inversion is reduced to taking an adjoint. This is a major reason unitary transformations are preferred in complex numerical linear algebra and quantum mechanics.
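A numerical check in the complex case (an illustrative NumPy sketch; the QR factorization of a complex matrix yields a unitary Q factor):

```python
import numpy as np

rng = np.random.default_rng(3)
# Random unitary matrix: the Q factor of a complex QR factorization.
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
U, _ = np.linalg.qr(A)

print(np.allclose(U.conj().T @ U, np.eye(3)))     # U is unitary
print(np.allclose(np.linalg.inv(U), U.conj().T))  # inverse is the adjoint
```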

54.13 Preservation of Complex Inner Products

Let U be unitary. For x,y\in\mathbb{C}^n,

\langle Ux,Uy\rangle = (Ux)^*(Uy).

Using matrix algebra,

(Ux)^*(Uy) = x^*U^*Uy.

Since U^*U = I,

x^*U^*Uy = x^*y.

Therefore

\langle Ux,Uy\rangle = \langle x,y\rangle.

Thus unitary matrices preserve the complex inner product. Consequently, they preserve norms, distances, and orthogonality in \mathbb{C}^n.

54.14 Examples of Unitary Matrices

The simplest unitary matrices are complex numbers of modulus one. A 1\times 1 matrix

U = [z]

is unitary exactly when

\overline{z}z = 1,

or

|z| = 1.

Thus

z = e^{i\theta}

gives a unitary transformation of \mathbb{C}.

A diagonal matrix

U = \begin{bmatrix} e^{i\theta_1} & 0 & \cdots & 0\\ 0 & e^{i\theta_2} & \cdots & 0\\ \vdots & \vdots & \ddots & \vdots\\ 0 & 0 & \cdots & e^{i\theta_n} \end{bmatrix}

is unitary because each diagonal entry has modulus one.

The normalized Fourier matrix is another important example. Its entries are complex roots of unity, scaled so that the columns become orthonormal.
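The normalized Fourier matrix can be built explicitly and its unitarity verified; this NumPy sketch (illustrative) constructs the n = 8 case with entries \omega^{jk}/\sqrt{n}, where \omega = e^{-2\pi i/n}:

```python
import numpy as np

n = 8
# Normalized DFT matrix: entry (j, k) is exp(-2*pi*i*j*k/n) / sqrt(n).
j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
F = np.exp(-2j * np.pi * j * k / n) / np.sqrt(n)

print(np.allclose(F.conj().T @ F, np.eye(n)))  # F is unitary
```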

54.15 Orthogonal Matrices as Real Unitary Matrices

If Q is real, then the conjugate transpose Q^* equals the ordinary transpose Q^T.

Thus the unitary condition

Q^*Q = I

becomes

Q^TQ = I.

Therefore a real matrix is unitary exactly when it is orthogonal.

This is why unitary matrices are the complex analogue of orthogonal matrices.

54.16 Orthogonal and Unitary Similarity

Orthogonal and unitary matrices are used for changes of orthonormal coordinates.

In the real case, a change of orthonormal basis has the form

B = Q^TAQ.

This is called an orthogonal similarity transformation.

In the complex case, the corresponding form is

B = U^*AU.

This is called a unitary similarity transformation.

These transformations preserve important matrix properties, including eigenvalues, rank, trace, determinant, and many norm-related quantities. They are central in spectral theory because they change coordinates without distorting inner product geometry.

54.17 Orthogonal Diagonalization

A real symmetric matrix A can be diagonalized by an orthogonal matrix:

A = Q\Lambda Q^T,

where Q is orthogonal and \Lambda is diagonal.

Equivalently,

Q^TAQ = \Lambda.

The columns of Q are orthonormal eigenvectors of A, and the diagonal entries of \Lambda are the corresponding eigenvalues.

This result is the spectral theorem for real symmetric matrices. Orthogonal diagonalization is especially important because it diagonalizes the matrix while preserving Euclidean geometry.
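In NumPy, this factorization is computed by `numpy.linalg.eigh`; the sketch below (illustrative, on a random symmetric matrix) confirms that the eigenvector matrix is orthogonal and that it reassembles A:

```python
import numpy as np

rng = np.random.default_rng(4)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                 # symmetrize to get a real symmetric matrix

# eigh returns eigenvalues and an orthogonal matrix of eigenvectors.
lam, Q = np.linalg.eigh(A)

print(np.allclose(Q.T @ Q, np.eye(4)))          # Q is orthogonal
print(np.allclose(Q @ np.diag(lam) @ Q.T, A))   # A = Q Lambda Q^T
```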

54.18 Unitary Diagonalization

A complex normal matrix A can be diagonalized by a unitary matrix:

A = U\Lambda U^*,

where U is unitary and \Lambda is diagonal.

Equivalently,

U^*AU = \Lambda.

The columns of U are orthonormal eigenvectors of A.

This is the complex spectral theorem. It includes Hermitian matrices, skew-Hermitian matrices, and unitary matrices themselves as important special cases.

Unitary diagonalization is the natural complex analogue of orthogonal diagonalization.

54.19 Eigenvalues of Orthogonal and Unitary Matrices

If U is unitary and

Ux = \lambda x

for some nonzero vector x, then

\|Ux\|_2 = \|\lambda x\|_2.

Since U preserves norm,

\|x\|_2 = |\lambda|\|x\|_2.

Because x \ne 0,

|\lambda| = 1.

Thus every eigenvalue of a unitary matrix has modulus one.

For real orthogonal matrices, complex eigenvalues may occur, but they also have modulus one. Real eigenvalues of an orthogonal matrix must be

1 \quad \text{or} \quad -1.

This matches the geometric interpretation. Orthogonal and unitary matrices do not expand or contract eigenvectors.
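For a concrete case, the eigenvalues of a 2-by-2 rotation are the complex pair e^{\pm i\theta}; this NumPy sketch (illustrative) checks that both have modulus one:

```python
import numpy as np

theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

eigvals = np.linalg.eigvals(R)   # complex conjugate pair e^{+i*theta}, e^{-i*theta}
print(np.allclose(np.abs(eigvals), 1.0))
```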

54.20 Numerical Importance

Orthogonal and unitary matrices are fundamental in numerical linear algebra.

If Q is orthogonal, then

\|Qx\|_2 = \|x\|_2.

Thus multiplying by Q does not amplify vector errors in the Euclidean norm. The condition number of an orthogonal matrix in the 2-norm is

\kappa_2(Q) = 1.

This is the smallest possible condition number for an invertible matrix. For this reason, stable algorithms often rely on orthogonal or unitary transformations.

Examples include:

QR factorization: Householder reflections or Givens rotations
Least squares: orthogonal reduction to triangular form
Eigenvalue algorithms: QR iteration
SVD: orthogonal or unitary singular vector matrices
Fourier methods: unitary Fourier transforms

Orthogonal transformations are numerically safe because they preserve length and avoid artificial error growth.
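The condition-number claim above can be verified directly; this NumPy sketch (illustrative, using a random orthogonal Q) shows that \kappa_2(Q) = 1 up to roundoff:

```python
import numpy as np

rng = np.random.default_rng(5)
Q, _ = np.linalg.qr(rng.standard_normal((6, 6)))

# The 2-norm condition number of an orthogonal matrix is exactly 1.
print(np.isclose(np.linalg.cond(Q, 2), 1.0))
```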

54.21 Summary

Orthogonal matrices are real square matrices satisfying

Q^TQ = I.

Unitary matrices are complex square matrices satisfying

U^*U = I.

They satisfy

Q^{-1} = Q^T, \qquad U^{-1} = U^*.

They preserve inner products:

\langle Qx,Qy\rangle = \langle x,y\rangle,

and

\langle Ux,Uy\rangle = \langle x,y\rangle.

Consequently, they preserve norms, distances, angles, and orthogonality.

Geometrically, orthogonal matrices represent rotations, reflections, and combinations of them. Algebraically, they are changes between orthonormal bases. Computationally, they are the preferred transformations for stable algorithms.