# Chapter 54. Unitary and Orthogonal Matrices
Orthogonal and unitary matrices are the matrices that preserve inner product geometry. They preserve lengths, angles, distances, and orthogonality. In real vector spaces, the relevant matrices are orthogonal. In complex vector spaces, the corresponding matrices are unitary.

A real square matrix \(Q\) is orthogonal when

$$
Q^TQ = I.
$$

Equivalently,

$$
Q^{-1}=Q^T.
$$

A complex square matrix \(U\) is unitary when

$$
U^*U = I,
$$

where \(U^*\) denotes the conjugate transpose. Equivalently,

$$
U^{-1}=U^*.
$$

Orthogonal matrices are real unitary matrices. Both classes preserve inner products and therefore act as rigid transformations of Euclidean or complex inner product space.

## 54.1 Orthogonal Matrices

Let \(Q\in\mathbb{R}^{n\times n}\). The matrix \(Q\) is orthogonal if

$$
Q^TQ=I.
$$

This condition means that the columns of \(Q\) form an orthonormal basis of \(\mathbb{R}^n\). If

$$
Q=
\begin{bmatrix}
| & | & & |\\
q_1 & q_2 & \cdots & q_n\\
| & | & & |
\end{bmatrix},
$$

then

$$
Q^TQ =
\begin{bmatrix}
q_1^Tq_1 & q_1^Tq_2 & \cdots & q_1^Tq_n\\
q_2^Tq_1 & q_2^Tq_2 & \cdots & q_2^Tq_n\\
\vdots & \vdots & \ddots & \vdots\\
q_n^Tq_1 & q_n^Tq_2 & \cdots & q_n^Tq_n
\end{bmatrix}.
$$

Thus \(Q^TQ=I\) means

$$
q_i^Tq_j=\delta_{ij}.
$$

The columns are unit vectors, and distinct columns are orthogonal.

Since \(Q\) is square, the same condition also implies

$$
QQ^T=I.
$$

Therefore the rows of \(Q\) are also orthonormal.

## 54.2 Inverse of an Orthogonal Matrix

If \(Q\) is orthogonal, then

$$
Q^TQ=I.
$$

Since \(Q\) is square, this implies

$$
Q^{-1}=Q^T.
$$

This is one of the main computational advantages of orthogonal matrices. The inverse is obtained by transposition. No elimination or matrix inversion algorithm is required.

For example,

$$
Q=
\frac{1}{\sqrt{2}}
\begin{bmatrix}
1 & 1\\
1 & -1
\end{bmatrix}
$$

is orthogonal because

$$
Q^TQ=I.
$$

Also,

$$
Q^{-1}=Q^T=Q.
$$

This particular matrix is symmetric as well as orthogonal, so it is its own inverse.
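
As a quick numerical check (a minimal sketch using NumPy; the array `Q` below is the matrix above), one can confirm that \(Q^TQ=I\) and that \(Q^{-1}=Q^T=Q\):

```python
import numpy as np

# The symmetric orthogonal matrix from the example above.
Q = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2)

# Columns are orthonormal: Q^T Q = I.
print(np.allclose(Q.T @ Q, np.eye(2)))       # True

# The inverse is the transpose, which here equals Q itself.
print(np.allclose(np.linalg.inv(Q), Q.T))    # True
print(np.allclose(Q.T, Q))                   # True
```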

## 54.3 Preservation of Inner Products

Orthogonal matrices preserve inner products.

Let \(Q\in\mathbb{R}^{n\times n}\) be orthogonal. Then for any \(x,y\in\mathbb{R}^n\),

$$
\langle Qx,Qy\rangle =
(Qx)^T(Qy).
$$

Using matrix multiplication,

$$
(Qx)^T(Qy) =
x^TQ^TQy.
$$

Since \(Q^TQ=I\),

$$
x^TQ^TQy=x^Ty.
$$

Therefore

$$
\langle Qx,Qy\rangle=\langle x,y\rangle.
$$

This identity is the central meaning of orthogonality for matrices. Orthogonal matrices preserve the dot product. Since length and angle are computed from the dot product, they preserve Euclidean geometry.
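
To see this identity numerically, one can build an orthogonal matrix from the QR factorization of a random matrix (a sketch, assuming NumPy) and compare \(\langle Qx,Qy\rangle\) with \(\langle x,y\rangle\):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# The QR factorization of a random matrix yields an orthogonal factor Q.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))

x = rng.standard_normal(n)
y = rng.standard_normal(n)

# <Qx, Qy> equals <x, y> up to rounding error.
print(np.allclose((Q @ x) @ (Q @ y), x @ y))  # True
```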

## 54.4 Preservation of Norms

Taking \(y=x\) in the inner product identity gives

$$
\langle Qx,Qx\rangle=\langle x,x\rangle.
$$

Thus

$$
\|Qx\|_2^2=\|x\|_2^2.
$$

Since norms are nonnegative,

$$
\|Qx\|_2=\|x\|_2.
$$

So an orthogonal matrix does not stretch or shrink vectors. It may rotate, reflect, or permute directions, but it preserves length.

This is why orthogonal matrices are called length-preserving transformations or isometries.

## 54.5 Preservation of Distance

For any \(x,y\in\mathbb{R}^n\),

$$
\|Qx-Qy\|_2 =
\|Q(x-y)\|_2.
$$

Since \(Q\) preserves norms,

$$
\|Q(x-y)\|_2=\|x-y\|_2.
$$

Therefore

$$
\|Qx-Qy\|_2=\|x-y\|_2.
$$

An orthogonal matrix preserves distances between points. This confirms the geometric interpretation: orthogonal matrices are rigid linear transformations.

## 54.6 Preservation of Angles

For nonzero vectors \(x\) and \(y\), the angle \(\theta\) between them is determined by

$$
\cos\theta =
\frac{x^Ty}{\|x\|_2\|y\|_2}.
$$

After applying \(Q\), the corresponding value is

$$
\frac{(Qx)^T(Qy)}{\|Qx\|_2\|Qy\|_2}.
$$

Since \(Q\) preserves inner products and norms, this equals

$$
\frac{x^Ty}{\|x\|_2\|y\|_2}.
$$

Thus orthogonal matrices preserve angles.

In particular, if

$$
x\perp y,
$$

then

$$
Qx\perp Qy.
$$

Orthogonality is preserved under orthogonal transformations.
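
A short check of angle preservation (a NumPy sketch, again using a random orthogonal \(Q\)) compares the cosine of the angle before and after applying \(Q\):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))  # random orthogonal matrix

x = rng.standard_normal(n)
y = rng.standard_normal(n)

def cosine(a, b):
    """Cosine of the angle between two nonzero vectors."""
    return (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

# The cosine of the angle is unchanged by Q.
print(np.allclose(cosine(Q @ x, Q @ y), cosine(x, y)))  # True
```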

## 54.7 Rotations in the Plane

The standard rotation matrix in \(\mathbb{R}^2\) is

$$
R_\theta =
\begin{bmatrix}
\cos\theta & -\sin\theta\\
\sin\theta & \cos\theta
\end{bmatrix}.
$$

Its columns are

$$
\begin{bmatrix}
\cos\theta\\
\sin\theta
\end{bmatrix},
\qquad
\begin{bmatrix}
-\sin\theta\\
\cos\theta
\end{bmatrix}.
$$

Each has length one, and their dot product is

$$
-\cos\theta\sin\theta+\sin\theta\cos\theta=0.
$$

Therefore

$$
R_\theta^TR_\theta=I.
$$

So \(R_\theta\) is orthogonal.

It has determinant

$$
\det R_\theta =
\cos^2\theta+\sin^2\theta =
1.
$$

Thus it preserves orientation as well as length and angle. It is a pure rotation.
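
The following sketch (assuming NumPy) builds \(R_\theta\) for an arbitrary angle and verifies that it is orthogonal with determinant \(1\):

```python
import numpy as np

def rotation(theta):
    """2x2 rotation matrix R_theta."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s],
                     [s,  c]])

R = rotation(0.7)
print(np.allclose(R.T @ R, np.eye(2)))     # True: R is orthogonal
print(np.isclose(np.linalg.det(R), 1.0))   # True: orientation preserved
```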

## 54.8 Reflections

A reflection matrix is also orthogonal.

For example,

$$
F=
\begin{bmatrix}
1 & 0\\
0 & -1
\end{bmatrix}
$$

reflects the plane across the \(x\)-axis. It satisfies

$$
F^TF=I.
$$

Thus it is orthogonal.

Its determinant is

$$
\det F=-1.
$$

The determinant distinguishes rotations from reflections in two dimensions. Orthogonal matrices with determinant \(1\) preserve orientation. Orthogonal matrices with determinant \(-1\) reverse orientation.
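
A corresponding check for the reflection (a NumPy sketch) confirms orthogonality and determinant \(-1\):

```python
import numpy as np

# Reflection across the x-axis.
F = np.array([[1.0,  0.0],
              [0.0, -1.0]])

print(np.allclose(F.T @ F, np.eye(2)))      # True: F is orthogonal
print(np.isclose(np.linalg.det(F), -1.0))   # True: orientation reversed
```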

## 54.9 Determinant of an Orthogonal Matrix

If \(Q\) is orthogonal, then

$$
Q^TQ=I.
$$

Taking determinants gives

$$
\det(Q^TQ)=\det I.
$$

Using determinant rules,

$$
\det(Q^T)\det(Q)=1.
$$

Since

$$
\det(Q^T)=\det(Q),
$$

we obtain

$$
(\det Q)^2=1.
$$

Therefore

$$
\det Q=\pm 1.
$$

Every orthogonal matrix has determinant \(1\) or \(-1\). The converse is false. A matrix may have determinant \(\pm 1\) without being orthogonal.
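
For instance, the shear matrix below has determinant \(1\) yet is not orthogonal, as a quick check shows (NumPy sketch):

```python
import numpy as np

# A shear: determinant 1, but the columns are not orthonormal.
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])

print(np.isclose(np.linalg.det(S), 1.0))    # True
print(np.allclose(S.T @ S, np.eye(2)))      # False: S is not orthogonal
```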

## 54.10 The Orthogonal Group

The set of all \(n\times n\) orthogonal matrices is denoted

$$
O(n).
$$

It is called the orthogonal group.

It is a group under matrix multiplication. If \(Q_1,Q_2\in O(n)\), then

$$
(Q_1Q_2)^T(Q_1Q_2) =
Q_2^TQ_1^TQ_1Q_2 =
Q_2^TQ_2 =
I.
$$

Thus \(Q_1Q_2\in O(n)\).

The identity matrix belongs to \(O(n)\), and the inverse of an orthogonal matrix is orthogonal because

$$
Q^{-1}=Q^T.
$$

The subgroup of orthogonal matrices with determinant \(1\) is denoted

$$
SO(n).
$$

It is called the special orthogonal group. In \(\mathbb{R}^2\) and \(\mathbb{R}^3\), its elements are rotations.
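
The closure property can be checked numerically (a sketch with NumPy): the product of two orthogonal matrices, each obtained from a QR factorization, is again orthogonal.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

# Two random orthogonal matrices from QR factorizations.
Q1, _ = np.linalg.qr(rng.standard_normal((n, n)))
Q2, _ = np.linalg.qr(rng.standard_normal((n, n)))

P = Q1 @ Q2
print(np.allclose(P.T @ P, np.eye(n)))       # True: the product is orthogonal
print(np.allclose(np.linalg.inv(Q1), Q1.T))  # True: the inverse is the transpose
```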

## 54.11 Unitary Matrices

Let \(U\in\mathbb{C}^{n\times n}\). The matrix \(U\) is unitary if

$$
U^*U=I,
$$

where

$$
U^*=\overline{U}^T
$$

is the conjugate transpose.

The columns of \(U\) form an orthonormal basis of \(\mathbb{C}^n\) with respect to the standard complex inner product.

If

$$
U=
\begin{bmatrix}
| & | & & |\\
u_1 & u_2 & \cdots & u_n\\
| & | & & |
\end{bmatrix},
$$

then

$$
U^*U=I
$$

means

$$
u_i^*u_j=\delta_{ij}.
$$

Thus the columns have complex norm one and are mutually orthogonal.

## 54.12 Inverse of a Unitary Matrix

If \(U\) is unitary, then

$$
U^*U=I.
$$

Since \(U\) is square, this also gives

$$
UU^*=I.
$$

Therefore

$$
U^{-1}=U^*.
$$

As in the real case, inversion is reduced to taking an adjoint. This is a major reason unitary transformations are preferred in complex numerical linear algebra and quantum mechanics.
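
As a numerical illustration (a sketch assuming NumPy), the QR factorization of a random complex matrix produces a unitary factor \(U\), and \(U^{-1}\) agrees with \(U^*\):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4

# A random complex matrix; its QR factorization gives a unitary factor U.
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
U, _ = np.linalg.qr(A)

print(np.allclose(U.conj().T @ U, np.eye(n)))      # True: U*U = I
print(np.allclose(np.linalg.inv(U), U.conj().T))   # True: U^{-1} = U*
```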

## 54.13 Preservation of Complex Inner Products

Let \(U\) be unitary. For \(x,y\in\mathbb{C}^n\),

$$
\langle Ux,Uy\rangle =
(Ux)^*(Uy).
$$

Using matrix algebra,

$$
(Ux)^*(Uy) =
x^*U^*Uy.
$$

Since \(U^*U=I\),

$$
x^*U^*Uy=x^*y.
$$

Therefore

$$
\langle Ux,Uy\rangle=\langle x,y\rangle.
$$

Thus unitary matrices preserve the complex inner product. Consequently, they preserve norms, distances, and orthogonality in \(\mathbb{C}^n\).

## 54.14 Examples of Unitary Matrices

The simplest unitary matrices are complex numbers of modulus one. A \(1\times 1\) matrix

$$
U=[z]
$$

is unitary exactly when

$$
\overline{z}z=1,
$$

or

$$
|z|=1.
$$

Thus

$$
z=e^{i\theta}
$$

gives a unitary transformation of \(\mathbb{C}\).

A diagonal matrix

$$
U=
\begin{bmatrix}
e^{i\theta_1} & 0 & \cdots & 0\\
0 & e^{i\theta_2} & \cdots & 0\\
\vdots & \vdots & \ddots & \vdots\\
0 & 0 & \cdots & e^{i\theta_n}
\end{bmatrix}
$$

is unitary because each diagonal entry has modulus one.

The normalized Fourier matrix is another important example. Its entries are powers of a complex root of unity, scaled by \(1/\sqrt{n}\) so that the columns are orthonormal.
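
A sketch of the normalized Fourier (DFT) matrix, assuming NumPy, confirms that it is unitary:

```python
import numpy as np

def fourier_matrix(n):
    """Normalized DFT matrix: F[j, k] = exp(-2*pi*i*j*k/n) / sqrt(n)."""
    jk = np.outer(np.arange(n), np.arange(n))
    return np.exp(-2j * np.pi * jk / n) / np.sqrt(n)

F = fourier_matrix(8)
print(np.allclose(F.conj().T @ F, np.eye(8)))  # True: F is unitary
```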

## 54.15 Orthogonal Matrices as Real Unitary Matrices

If \(Q\) is real, then the conjugate transpose \(Q^*\) equals the ordinary transpose \(Q^T\).

Thus the unitary condition

$$
Q^*Q=I
$$

becomes

$$
Q^TQ=I.
$$

Therefore a real matrix is unitary exactly when it is orthogonal.

This is why unitary matrices are the complex analogue of orthogonal matrices.

## 54.16 Orthogonal and Unitary Similarity

Orthogonal and unitary matrices are used for changes of orthonormal coordinates.

In the real case, a change of orthonormal basis has the form

$$
B=Q^TAQ.
$$

This is called an orthogonal similarity transformation.

In the complex case, the corresponding form is

$$
B=U^*AU.
$$

This is called a unitary similarity transformation.

These transformations preserve important matrix properties, including eigenvalues, rank, trace, determinant, and many norm-related quantities. They are central in spectral theory because they change coordinates without distorting inner product geometry.

## 54.17 Orthogonal Diagonalization

A real symmetric matrix \(A\) can be diagonalized by an orthogonal matrix:

$$
A=Q\Lambda Q^T,
$$

where \(Q\) is orthogonal and \(\Lambda\) is diagonal.

Equivalently,

$$
Q^TAQ=\Lambda.
$$

The columns of \(Q\) are orthonormal eigenvectors of \(A\), and the diagonal entries of \(\Lambda\) are the corresponding eigenvalues.

This result is the spectral theorem for real symmetric matrices. Orthogonal diagonalization is especially important because it diagonalizes the matrix while preserving Euclidean geometry.
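
In practice, this factorization is computed with a symmetric eigensolver. A minimal sketch using NumPy's `eigh`, which returns orthonormal eigenvectors for a symmetric matrix:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4

# A random symmetric matrix.
B = rng.standard_normal((n, n))
A = (B + B.T) / 2

# eigh returns the eigenvalues and an orthogonal matrix of eigenvectors.
eigvals, Q = np.linalg.eigh(A)
Lam = np.diag(eigvals)

print(np.allclose(Q.T @ Q, np.eye(n)))   # True: Q is orthogonal
print(np.allclose(Q @ Lam @ Q.T, A))     # True: A = Q Lambda Q^T
```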

## 54.18 Unitary Diagonalization

A complex normal matrix \(A\) can be diagonalized by a unitary matrix:

$$
A=U\Lambda U^*,
$$

where \(U\) is unitary and \(\Lambda\) is diagonal.

Equivalently,

$$
U^*AU=\Lambda.
$$

The columns of \(U\) are orthonormal eigenvectors of \(A\).

This is the complex spectral theorem. It includes Hermitian matrices, skew-Hermitian matrices, and unitary matrices themselves as important special cases.

Unitary diagonalization is the natural complex analogue of orthogonal diagonalization.
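
A small concrete case (a NumPy sketch): the rotation by \(90^\circ\) is real orthogonal, hence normal, and because its eigenvalues \(\pm i\) are distinct, the unit-norm eigenvectors returned by `numpy.linalg.eig` form a unitary matrix that diagonalizes it.

```python
import numpy as np

# Rotation by 90 degrees: real orthogonal, hence normal.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

eigvals, V = np.linalg.eig(A)   # eigenvalues are +i and -i

# For this normal matrix the unit-norm eigenvectors are orthogonal,
# so V is unitary and V* A V is diagonal.
print(np.allclose(V.conj().T @ V, np.eye(2)))             # True
print(np.allclose(V.conj().T @ A @ V, np.diag(eigvals)))  # True
```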

## 54.19 Eigenvalues of Orthogonal and Unitary Matrices

If \(U\) is unitary and

$$
Ux=\lambda x
$$

for some nonzero vector \(x\), then

$$
\|Ux\|_2=\|\lambda x\|_2.
$$

Since \(U\) preserves norm,

$$
\|x\|_2=|\lambda|\|x\|_2.
$$

Because \(x\ne 0\),

$$
|\lambda|=1.
$$

Thus every eigenvalue of a unitary matrix has modulus one.

For real orthogonal matrices, complex eigenvalues may occur, but they also have modulus one. Real eigenvalues of an orthogonal matrix must be

$$
1
\quad
\text{or}
\quad
-1.
$$

This matches the geometric interpretation. Orthogonal and unitary matrices do not expand or contract eigenvectors.
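
A quick check (NumPy sketch): the eigenvalues of a plane rotation are \(e^{\pm i\theta}\), and all of them have modulus one.

```python
import numpy as np

theta = 0.9
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

eigvals = np.linalg.eigvals(R)          # complex eigenvalues e^{±i*theta}
print(np.allclose(np.abs(eigvals), 1))  # True: every eigenvalue has modulus one
```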

## 54.20 Numerical Importance

Orthogonal and unitary matrices are fundamental in numerical linear algebra.

If \(Q\) is orthogonal, then

$$
\|Qx\|_2=\|x\|_2.
$$

Thus multiplying by \(Q\) does not amplify vector errors in the Euclidean norm. The condition number of an orthogonal matrix in the \(2\)-norm is

$$
\kappa_2(Q)=1.
$$

This is the smallest possible condition number for an invertible matrix. For this reason, stable algorithms often rely on orthogonal or unitary transformations.
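
A direct computation (a sketch assuming NumPy) confirms that the \(2\)-norm condition number of a random orthogonal matrix is \(1\):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 6

# A random orthogonal matrix from a QR factorization.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))

# Its 2-norm condition number is 1, the smallest possible value.
print(np.isclose(np.linalg.cond(Q, 2), 1.0))  # True
```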

Examples include:

| Algorithm | Orthogonal or unitary ingredient |
|---|---|
| QR factorization | Householder reflections or Givens rotations |
| Least squares | Orthogonal reduction to triangular form |
| Eigenvalue algorithms | QR iteration |
| SVD | Orthogonal or unitary singular vector matrices |
| Fourier methods | Unitary Fourier transforms |

Orthogonal transformations are numerically safe because they preserve length and avoid artificial error growth.

## 54.21 Summary

Orthogonal matrices are real square matrices satisfying

$$
Q^TQ=I.
$$

Unitary matrices are complex square matrices satisfying

$$
U^*U=I.
$$

They satisfy

$$
Q^{-1}=Q^T,
\qquad
U^{-1}=U^*.
$$

They preserve inner products:

$$
\langle Qx,Qy\rangle=\langle x,y\rangle,
$$

and

$$
\langle Ux,Uy\rangle=\langle x,y\rangle.
$$

Consequently, they preserve norms, distances, angles, and orthogonality.

Geometrically, orthogonal matrices represent rotations, reflections, and combinations of them. Algebraically, they are changes between orthonormal bases. Computationally, they are the preferred transformations for stable algorithms.
