# Chapter 39. Reflection Operators

A reflection operator is a linear operator that mirrors vectors across a subspace. In Euclidean geometry, a reflection fixes the mirror subspace and reverses the perpendicular direction. Applying the same reflection twice returns every vector to its original position. Thus a reflection is an involution:

$$
R^2=I.
$$

In an inner product space, the most common reflection is across a hyperplane through the origin. A Householder reflection has the form

$$
H=I-2uu^T
$$

where \(u\) is a unit normal vector. It reflects vectors across the hyperplane perpendicular to \(u\). Reflections are orthogonal transformations, so they preserve lengths and angles.

## 39.1 Reflection Across a Line

Consider the reflection of \(\mathbb{R}^2\) across the \(x\)-axis. It sends

$$
\begin{bmatrix}
x\\
y
\end{bmatrix}
$$

to

$$
\begin{bmatrix}
x\\
-y
\end{bmatrix}.
$$

The corresponding matrix is

$$
R=
\begin{bmatrix}
1 & 0\\
0 & -1
\end{bmatrix}.
$$

The \(x\)-axis is fixed pointwise. Every vector on the \(x\)-axis has the form

$$
\begin{bmatrix}
x\\
0
\end{bmatrix},
$$

and

$$
R
\begin{bmatrix}
x\\
0
\end{bmatrix} =
\begin{bmatrix}
x\\
0
\end{bmatrix}.
$$

The perpendicular direction is reversed:

$$
R
\begin{bmatrix}
0\\
y
\end{bmatrix} =
\begin{bmatrix}
0\\
-y
\end{bmatrix}.
$$

Thus a reflection keeps one subspace and negates a complementary direction.
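The behavior above is easy to verify numerically. The following sketch, assuming NumPy is available, checks that the matrix fixes the \(x\)-axis, negates the \(y\)-component, and undoes itself when applied twice:

```python
import numpy as np

# Reflection of R^2 across the x-axis.
R = np.array([[1.0, 0.0],
              [0.0, -1.0]])

v = np.array([3.0, 4.0])
Rv = R @ v                         # mirror image of v

# Vectors on the x-axis are fixed; the y-component is negated.
on_axis = np.array([5.0, 0.0])
assert np.allclose(R @ on_axis, on_axis)
assert np.allclose(Rv, [3.0, -4.0])
assert np.allclose(R @ R @ v, v)   # applying the mirror twice restores v
```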

## 39.2 Reflection as a Linear Operator

Let \(V\) be a vector space. In its most general algebraic form, a reflection is an operator

$$
R:V\to V
$$

with two structural properties.

First, it is linear:

$$
R(u+v)=R(u)+R(v),
$$

and

$$
R(cv)=cR(v).
$$

Second, it is its own inverse:

$$
R^2=I.
$$

The equation \(R^2=I\) means

$$
R(R(v))=v
$$

for every \(v\in V\). Applying the same mirror operation twice restores the original vector.

However, not every operator satisfying \(R^2=I\) is called a geometric reflection. In Euclidean geometry, a reflection also has a fixed subspace and reverses an orthogonal complementary direction.

## 39.3 Fixed Subspace

The fixed subspace of a linear operator \(R:V\to V\) is

$$
\operatorname{Fix}(R)=\{v\in V:R(v)=v\}.
$$

For a reflection, this is the mirror.

For reflection across the \(x\)-axis,

$$
\operatorname{Fix}(R) =
\left\{
\begin{bmatrix}
x\\
0
\end{bmatrix}
:x\in\mathbb{R}
\right\}.
$$

This is the \(x\)-axis.

The fixed subspace is the eigenspace for eigenvalue \(1\):

$$
\operatorname{Fix}(R)=\ker(R-I).
$$

Thus the geometry of the mirror is encoded algebraically by the equation

$$
(R-I)v=0.
$$
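The kernel \(\ker(R-I)\) can be computed numerically. In this sketch (assuming NumPy), a basis for the kernel is read off from the singular value decomposition of \(R-I\): the right-singular vectors whose singular values vanish span the kernel.

```python
import numpy as np

# Reflection across the x-axis.
R = np.array([[1.0, 0.0],
              [0.0, -1.0]])

# Fix(R) = ker(R - I): right-singular vectors of R - I with
# (numerically) zero singular value span the kernel.
_, s, Vt = np.linalg.svd(R - np.eye(2))
kernel = Vt[s < 1e-12]             # rows spanning ker(R - I)

# The kernel is one-dimensional and lies along the x-axis.
assert kernel.shape == (1, 2)
assert abs(kernel[0][1]) < 1e-12
```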

## 39.4 Reversed Subspace

The reversed subspace is

$$
\{v\in V:R(v)=-v\}.
$$

This is the eigenspace for eigenvalue \(-1\):

$$
\ker(R+I).
$$

For reflection across the \(x\)-axis,

$$
\ker(R+I) =
\left\{
\begin{bmatrix}
0\\
y
\end{bmatrix}
:y\in\mathbb{R}
\right\}.
$$

This is the \(y\)-axis.

Thus the reflection decomposes the plane as

$$
\mathbb{R}^2=\ker(R-I)\oplus\ker(R+I).
$$

On the first subspace, \(R\) acts as the identity. On the second subspace, \(R\) acts as multiplication by \(-1\).

## 39.5 Reflections and Direct Sums

Let \(V\) be a vector space over a field where \(2\neq 0\). Suppose

$$
V=U\oplus W.
$$

Define

$$
R(u+w)=u-w
$$

for \(u\in U\) and \(w\in W\).

Then \(R\) is linear. Also,

$$
R^2(u+w)=R(u-w)=u+w.
$$

Hence

$$
R^2=I.
$$

This operator fixes \(U\) and reverses \(W\). It is a reflection across \(U\) along \(W\).

Conversely, if a linear operator satisfies

$$
R^2=I,
$$

then every vector can be decomposed into a fixed part and a reversed part:

$$
v=\frac{1}{2}(v+R(v))+\frac{1}{2}(v-R(v)).
$$

The first term lies in \(\ker(R-I)\), and the second lies in \(\ker(R+I)\). Therefore

$$
V=\ker(R-I)\oplus\ker(R+I).
$$

This shows that involutions are diagonalizable with eigenvalues \(1\) and \(-1\), provided the field does not have characteristic \(2\).
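The decomposition \(v=\tfrac12(v+R(v))+\tfrac12(v-R(v))\) can be checked directly. The sketch below (assuming NumPy) uses the reflection across the line \(y=x\) as the involution and verifies that the two pieces land in the right eigenspaces:

```python
import numpy as np

rng = np.random.default_rng(0)

# An involution: reflection across the line y = x.
R = np.array([[0.0, 1.0],
              [1.0, 0.0]])
v = rng.standard_normal(2)

# Split v into a fixed part and a reversed part:
#   v = (v + Rv)/2 + (v - Rv)/2.
fixed = 0.5 * (v + R @ v)           # lies in ker(R - I)
reversed_part = 0.5 * (v - R @ v)   # lies in ker(R + I)

assert np.allclose(R @ fixed, fixed)
assert np.allclose(R @ reversed_part, -reversed_part)
assert np.allclose(fixed + reversed_part, v)
```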

## 39.6 Orthogonal Reflections

In an inner product space, the usual geometric reflection is orthogonal.

Let \(U\) be a subspace of a real inner product space \(V\). Every vector \(v\) can be written as

$$
v=u+w,
$$

where

$$
u\in U,
\qquad
w\in U^\perp.
$$

The orthogonal reflection across \(U\) is

$$
R(v)=u-w.
$$

It fixes the subspace \(U\) and reverses the orthogonal complement \(U^\perp\).

This reflection preserves lengths. Indeed,

$$
\|R(v)\|^2=\|u-w\|^2.
$$

Since \(u\perp w\),

$$
\|u-w\|^2=\|u\|^2+\|w\|^2.
$$

Also,

$$
\|u+w\|^2=\|u\|^2+\|w\|^2.
$$

Therefore

$$
\|R(v)\|=\|v\|.
$$

So orthogonal reflections are orthogonal transformations.

## 39.7 Reflection from Projection

Reflections are closely related to projections.

Let \(P\) be the projection onto \(U\) along \(W\), so

$$
P(u+w)=u.
$$

Then the reflection fixing \(U\) and reversing \(W\) is

$$
R=2P-I.
$$

Indeed,

$$
(2P-I)(u+w)=2u-(u+w)=u-w.
$$

Conversely,

$$
P=\frac{1}{2}(I+R).
$$

Thus projections and reflections determine each other whenever the field has characteristic not equal to \(2\).

In the orthogonal case, if \(P_U\) is the orthogonal projection onto \(U\), then the orthogonal reflection across \(U\) is

$$
R=2P_U-I.
$$
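The formula \(R=2P_U-I\) gives a concrete way to build an orthogonal reflection across any subspace. In this sketch (assuming NumPy), \(U\) is a random two-dimensional subspace of \(\mathbb{R}^4\), \(P_U=QQ^T\) for an orthonormal basis \(Q\) of \(U\), and the resulting \(R\) is checked to be an involution that preserves lengths:

```python
import numpy as np

rng = np.random.default_rng(1)

# Orthogonal reflection across a 2-dimensional subspace U of R^4.
A = rng.standard_normal((4, 2))
Q, _ = np.linalg.qr(A)          # orthonormal basis of U (columns of Q)
P = Q @ Q.T                     # orthogonal projection onto U
R = 2 * P - np.eye(4)           # reflection across U

v = rng.standard_normal(4)
assert np.isclose(np.linalg.norm(R @ v), np.linalg.norm(v))  # lengths preserved
assert np.allclose(R @ R, np.eye(4))                         # involution
```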

## 39.8 Reflection Across a Hyperplane

Let \(u\in\mathbb{R}^n\) be a unit vector. The hyperplane perpendicular to \(u\) is

$$
u^\perp=\{x\in\mathbb{R}^n:u^Tx=0\}.
$$

Every vector \(x\in\mathbb{R}^n\) decomposes as

$$
x=(x-(u^Tx)u)+(u^Tx)u.
$$

The first part lies in \(u^\perp\). The second part lies on the line spanned by \(u\).

Reflection across the hyperplane \(u^\perp\) keeps the first part and negates the second:

$$
H(x)=x-2(u^Tx)u.
$$

In matrix form,

$$
H=I-2uu^T.
$$

This is the Householder reflection. It fixes every vector perpendicular to \(u\) and sends \(u\) to \(-u\).
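These two defining properties can be confirmed numerically. The sketch below (assuming NumPy) builds a Householder reflection from a random unit vector and checks that it sends \(u\) to \(-u\) and fixes the component of a vector lying in \(u^\perp\):

```python
import numpy as np

rng = np.random.default_rng(2)

# Householder reflection across the hyperplane perpendicular to u.
u = rng.standard_normal(5)
u /= np.linalg.norm(u)          # u must be a unit vector
H = np.eye(5) - 2.0 * np.outer(u, u)

# H sends u to -u.
assert np.allclose(H @ u, -u)

# H fixes any vector orthogonal to u.
x = rng.standard_normal(5)
x_perp = x - (u @ x) * u        # component of x in the hyperplane u^perp
assert np.allclose(H @ x_perp, x_perp)
```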

## 39.9 Checking the Householder Formula

Let

$$
H=I-2uu^T,
$$

where

$$
u^Tu=1.
$$

First,

$$
H^T=(I-2uu^T)^T=I-2uu^T=H.
$$

So \(H\) is symmetric.

Next compute \(H^2\):

$$
H^2=(I-2uu^T)(I-2uu^T).
$$

Expand:

$$
H^2=I-4uu^T+4uu^Tuu^T.
$$

Since

$$
u^Tu=1,
$$

we have

$$
uu^Tuu^T=u(u^Tu)u^T=uu^T.
$$

Therefore

$$
H^2=I-4uu^T+4uu^T=I.
$$

Thus \(H\) is its own inverse.

Also,

$$
Hu=(I-2uu^T)u=u-2u(u^Tu)=u-2u=-u.
$$

If \(x\in u^\perp\), then

$$
u^Tx=0,
$$

so

$$
Hx=x-2u(u^Tx)=x.
$$

Hence \(H\) fixes the hyperplane \(u^\perp\) and reverses the normal direction.
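The algebraic identities just derived can be spot-checked numerically. A minimal sketch, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(3)

u = rng.standard_normal(4)
u /= np.linalg.norm(u)
H = np.eye(4) - 2.0 * np.outer(u, u)

assert np.allclose(H, H.T)            # symmetric: H^T = H
assert np.allclose(H @ H, np.eye(4))  # involution: H^2 = I
```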

## 39.10 Reflection Across a Line in \(\mathbb{R}^2\)

Let \(L\) be the line through the origin making angle \(\theta\) with the positive \(x\)-axis. A unit vector along the line is

$$
q=
\begin{bmatrix}
\cos\theta\\
\sin\theta
\end{bmatrix}.
$$

The orthogonal projection onto \(L\) is

$$
P=qq^T.
$$

The reflection across \(L\) is

$$
R=2P-I.
$$

Compute:

$$
qq^T=
\begin{bmatrix}
\cos^2\theta & \cos\theta\sin\theta\\
\cos\theta\sin\theta & \sin^2\theta
\end{bmatrix}.
$$

Thus

$$
R=
\begin{bmatrix}
2\cos^2\theta-1 & 2\cos\theta\sin\theta\\
2\cos\theta\sin\theta & 2\sin^2\theta-1
\end{bmatrix}.
$$

Using double-angle identities,

$$
R=
\begin{bmatrix}
\cos 2\theta & \sin 2\theta\\
\sin 2\theta & -\cos 2\theta
\end{bmatrix}.
$$

This is the standard matrix for reflection across a line through the origin at angle \(\theta\).
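The equality of the two forms \(2qq^T-I\) and the double-angle matrix can be verified for an arbitrary angle. The following sketch (assuming NumPy) also checks that the line is fixed and its normal is reversed:

```python
import numpy as np

theta = 0.7                     # arbitrary angle, in radians
q = np.array([np.cos(theta), np.sin(theta)])
R = 2 * np.outer(q, q) - np.eye(2)          # R = 2qq^T - I

# The double-angle form of the same matrix.
R2 = np.array([[np.cos(2*theta),  np.sin(2*theta)],
               [np.sin(2*theta), -np.cos(2*theta)]])
assert np.allclose(R, R2)

# The line itself is fixed; its unit normal is reversed.
n = np.array([-np.sin(theta), np.cos(theta)])
assert np.allclose(R @ q, q)
assert np.allclose(R @ n, -n)
```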

## 39.11 Examples in the Plane

Reflection across the \(x\)-axis corresponds to \(\theta=0\):

$$
R=
\begin{bmatrix}
1 & 0\\
0 & -1
\end{bmatrix}.
$$

Reflection across the \(y\)-axis has matrix

$$
R=
\begin{bmatrix}
-1 & 0\\
0 & 1
\end{bmatrix}.
$$

Reflection across the line \(y=x\) has \(\theta=\pi/4\), so

$$
R=
\begin{bmatrix}
0 & 1\\
1 & 0
\end{bmatrix}.
$$

It swaps coordinates:

$$
R
\begin{bmatrix}
x\\
y
\end{bmatrix} =
\begin{bmatrix}
y\\
x
\end{bmatrix}.
$$

Reflection across the line \(y=-x\) has matrix

$$
R=
\begin{bmatrix}
0 & -1\\
-1 & 0
\end{bmatrix}.
$$

It sends

$$
\begin{bmatrix}
x\\
y
\end{bmatrix}
$$

to

$$
\begin{bmatrix}
-y\\
-x
\end{bmatrix}.
$$
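All four matrices in this section are instances of the double-angle formula from 39.10. A short sketch, assuming NumPy, recovers each of them from the corresponding angle:

```python
import numpy as np

def line_reflection(theta):
    """Reflection across the line through the origin at angle theta."""
    return np.array([[np.cos(2*theta),  np.sin(2*theta)],
                     [np.sin(2*theta), -np.cos(2*theta)]])

assert np.allclose(line_reflection(0), [[1, 0], [0, -1]])          # x-axis
assert np.allclose(line_reflection(np.pi/2), [[-1, 0], [0, 1]])    # y-axis
assert np.allclose(line_reflection(np.pi/4), [[0, 1], [1, 0]])     # y = x
assert np.allclose(line_reflection(-np.pi/4), [[0, -1], [-1, 0]])  # y = -x
```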

## 39.12 Reflection Across a Plane in \(\mathbb{R}^3\)

Let \(u\in\mathbb{R}^3\) be a unit normal vector to a plane through the origin. The reflection across the plane is

$$
H=I-2uu^T.
$$

For example, take

$$
u=
\begin{bmatrix}
0\\
0\\
1
\end{bmatrix}.
$$

Then

$$
uu^T=
\begin{bmatrix}
0&0&0\\
0&0&0\\
0&0&1
\end{bmatrix}.
$$

Thus

$$
H=I-2uu^T =
\begin{bmatrix}
1&0&0\\
0&1&0\\
0&0&-1
\end{bmatrix}.
$$

This reflects across the \(xy\)-plane:

$$
H
\begin{bmatrix}
x\\
y\\
z
\end{bmatrix} =
\begin{bmatrix}
x\\
y\\
-z
\end{bmatrix}.
$$

## 39.13 Determinant of a Reflection

A hyperplane reflection reverses one normal direction and fixes all directions in the hyperplane, so its determinant is \(-1\).

For the Householder reflection

$$
H=I-2uu^T,
$$

the eigenvalue \(-1\) occurs in the direction \(u\), and the eigenvalue \(1\) occurs with multiplicity \(n-1\) on the hyperplane \(u^\perp\).

Therefore

$$
\det(H)=(-1)\cdot 1^{n-1}=-1.
$$

This agrees with the geometric interpretation: a reflection reverses orientation.

By contrast, a rotation in \(\mathbb{R}^2\) has determinant \(1\). The determinant distinguishes orientation-preserving and orientation-reversing orthogonal transformations.
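The determinant contrast can be checked in a few dimensions at once. A minimal sketch, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hyperplane (Householder) reflections reverse orientation: det(H) = -1.
for n in (2, 3, 6):
    u = rng.standard_normal(n)
    u /= np.linalg.norm(u)
    H = np.eye(n) - 2.0 * np.outer(u, u)
    assert np.isclose(np.linalg.det(H), -1.0)

# A rotation in R^2, by contrast, has determinant +1.
theta = 0.3
Rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
assert np.isclose(np.linalg.det(Rot), 1.0)
```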

## 39.14 Trace of a Hyperplane Reflection

For a Householder reflection in \(\mathbb{R}^n\), the eigenvalues are

$$
1,\ldots,1,-1.
$$

There are \(n-1\) eigenvalues equal to \(1\) and one eigenvalue equal to \(-1\). Hence the trace is

$$
\operatorname{tr}(H)=(n-1)-1=n-2.
$$

The same result follows from the formula

$$
H=I-2uu^T.
$$

Since

$$
\operatorname{tr}(I)=n
$$

and

$$
\operatorname{tr}(uu^T)=u^Tu=1,
$$

we get

$$
\operatorname{tr}(H)=n-2.
$$
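The trace formula holds in every dimension, which a quick loop confirms. A sketch, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(5)

# tr(H) = n - 2 for a Householder reflection in R^n.
for n in (2, 3, 5, 8):
    u = rng.standard_normal(n)
    u /= np.linalg.norm(u)
    H = np.eye(n) - 2.0 * np.outer(u, u)
    assert np.isclose(np.trace(H), n - 2)
```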

## 39.15 Reflections and Orthogonal Matrices

A real matrix \(Q\) is orthogonal if

$$
Q^TQ=I.
$$

Householder reflections are orthogonal. Since

$$
H^T=H
$$

and

$$
H^2=I,
$$

we have

$$
H^TH=H^2=I.
$$

Thus \(H\) preserves inner products:

$$
(Hx)^T(Hy)=x^TH^THy=x^Ty.
$$

It follows that \(H\) preserves lengths and angles.

Geometrically, this is expected. A reflection moves vectors without changing their lengths or the angles between them. Orthogonal transformations preserve inner products and therefore preserve lengths and angles.
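Inner-product preservation is a one-line numerical check. A sketch, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(6)

u = rng.standard_normal(4)
u /= np.linalg.norm(u)
H = np.eye(4) - 2.0 * np.outer(u, u)

x, y = rng.standard_normal(4), rng.standard_normal(4)
# (Hx)^T (Hy) = x^T y: inner products, hence lengths and angles, survive.
assert np.isclose((H @ x) @ (H @ y), x @ y)
assert np.isclose(np.linalg.norm(H @ x), np.linalg.norm(x))
```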

## 39.16 Reflections and Eigenvalues

A reflection \(R\) satisfying

$$
R^2=I
$$

has eigenvalues only \(1\) and \(-1\), assuming the field has characteristic not equal to \(2\).

If

$$
Rv=\lambda v
$$

with \(v\neq 0\), then

$$
R^2v=R(\lambda v)=\lambda Rv=\lambda^2v.
$$

Since

$$
R^2v=v,
$$

we get

$$
\lambda^2v=v.
$$

Thus

$$
\lambda^2=1,
$$

so \(\lambda=1\) or \(\lambda=-1\).

The eigenspace for \(1\) is the fixed subspace. The eigenspace for \(-1\) is the reversed subspace.
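For a Householder reflection in \(\mathbb{R}^5\), the spectrum should be one \(-1\) and four \(1\)s. A sketch, assuming NumPy (since \(H\) is symmetric, `eigvalsh` applies and returns eigenvalues in ascending order):

```python
import numpy as np

rng = np.random.default_rng(7)

u = rng.standard_normal(5)
u /= np.linalg.norm(u)
H = np.eye(5) - 2.0 * np.outer(u, u)

# Eigenvalues of a symmetric matrix, sorted ascending.
eigvals = np.linalg.eigvalsh(H)
assert np.allclose(eigvals, [-1, 1, 1, 1, 1])   # one -1, the rest +1
```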

## 39.17 Reflections and Diagonalization

A reflection is diagonalizable when the field has characteristic not equal to \(2\).

The reason is that the polynomial

$$
x^2-1=(x-1)(x+1)
$$

has distinct roots. Since a reflection satisfies

$$
R^2-I=0,
$$

its minimal polynomial divides \(x^2-1\). Therefore it has no repeated factor, and the operator is diagonalizable.

In a basis adapted to the decomposition

$$
V=\ker(R-I)\oplus\ker(R+I),
$$

the matrix of \(R\) is

$$
\begin{bmatrix}
I_r & 0\\
0 & -I_s
\end{bmatrix}.
$$

Here

$$
r=\dim\ker(R-I),
\qquad
s=\dim\ker(R+I).
$$

For a hyperplane reflection in \(\mathbb{R}^n\), this becomes

$$
\begin{bmatrix}
I_{n-1} & 0\\
0 & -1
\end{bmatrix}.
$$
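The diagonal form can be exhibited concretely. In this sketch (assuming NumPy), the reflection across \(y=x\) is conjugated into its eigenbasis, where it becomes \(\operatorname{diag}(-1,1)\):

```python
import numpy as np

# Reflection across y = x; basis adapted to ker(R-I) and ker(R+I).
R = np.array([[0.0, 1.0],
              [1.0, 0.0]])
eigvals, V = np.linalg.eigh(R)   # columns of V: orthonormal eigenvectors

# In the eigenbasis, R is diagonal with entries -1 and 1.
D = V.T @ R @ V
assert np.allclose(D, np.diag(eigvals))
assert np.allclose(np.sort(eigvals), [-1.0, 1.0])
```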

## 39.18 Reflections and Projections

Let \(P\) be a projection. Then

$$
R=2P-I
$$

is an involution:

$$
R^2=(2P-I)^2=4P^2-4P+I.
$$

Since

$$
P^2=P,
$$

we get

$$
R^2=4P-4P+I=I.
$$

Conversely, if \(R^2=I\), then

$$
P=\frac{1}{2}(I+R)
$$

is a projection:

$$
P^2=
\frac{1}{4}(I+2R+R^2) =
\frac{1}{4}(2I+2R) =
\frac{1}{2}(I+R)=P.
$$

Thus idempotent operators and involutive operators are closely related.

Projection keeps one part and kills the other. Reflection keeps one part and negates the other.
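The correspondence \(P=\tfrac12(I+R)\) can be tested on the swap reflection. A sketch, assuming NumPy:

```python
import numpy as np

R = np.array([[0.0, 1.0],
              [1.0, 0.0]])       # an involution: R^2 = I

P = 0.5 * (np.eye(2) + R)
assert np.allclose(P @ P, P)               # idempotent: P is a projection
assert np.allclose(2 * P - np.eye(2), R)   # and 2P - I recovers R

v = np.array([3.0, 1.0])
# P keeps the fixed part and kills the reversed part.
assert np.allclose(P @ v, [2.0, 2.0])
```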

## 39.19 Composition of Reflections

Compositions of reflections produce important transformations.

In \(\mathbb{R}^2\), the composition of two reflections across lines through the origin is a rotation. If the angle from the first mirror line to the second is \(\alpha\), then the composition is a rotation by \(2\alpha\).

This can be checked using matrices. Reflection across a line at angle \(\theta\) has matrix

$$
R_\theta=
\begin{bmatrix}
\cos 2\theta & \sin 2\theta\\
\sin 2\theta & -\cos 2\theta
\end{bmatrix}.
$$

Then

$$
R_\phi R_\theta
$$

is the rotation matrix through angle

$$
2(\phi-\theta).
$$

Thus rotations can be built from reflections.
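The matrix identity \(R_\phi R_\theta=\) rotation by \(2(\phi-\theta)\) is easy to verify for concrete angles. A sketch, assuming NumPy:

```python
import numpy as np

def line_reflection(theta):
    """Reflection across the line through the origin at angle theta."""
    return np.array([[np.cos(2*theta),  np.sin(2*theta)],
                     [np.sin(2*theta), -np.cos(2*theta)]])

def rotation(alpha):
    """Counterclockwise rotation by alpha."""
    return np.array([[np.cos(alpha), -np.sin(alpha)],
                     [np.sin(alpha),  np.cos(alpha)]])

theta, phi = 0.3, 1.1
# Reflect across the line at theta, then across the line at phi:
# the composition is a rotation by 2(phi - theta).
assert np.allclose(line_reflection(phi) @ line_reflection(theta),
                   rotation(2 * (phi - theta)))
```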

More generally, reflection groups are groups generated by reflections. They occur in geometry, Lie theory, root systems, and the study of symmetry.

## 39.20 Householder Reflections in Computation

Householder reflections are important in numerical linear algebra. They are used to transform vectors and matrices while preserving lengths.

Given a nonzero vector \(x\), one can choose a unit vector \(u\) so that

$$
H=I-2uu^T
$$

maps \(x\) to a scalar multiple of the first coordinate vector. This is the basis of Householder QR factorization.

Because \(H\) is orthogonal, it is numerically stable for many algorithms. Multiplying by \(H\) does not magnify Euclidean lengths. Householder transformations are standard tools in QR decomposition and related matrix reductions.
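The core step can be sketched as follows (assuming NumPy). The standard numerically stable choice is \(v=x+\operatorname{sign}(x_1)\|x\|e_1\), normalized to give \(u\); the resulting \(H\) zeroes out every entry of \(x\) below the first:

```python
import numpy as np

rng = np.random.default_rng(8)

# Build a Householder reflection mapping x to a multiple of e1.
x = rng.standard_normal(4)
v = x.copy()
v[0] += np.copysign(np.linalg.norm(x), x[0])   # sign choice avoids cancellation
u = v / np.linalg.norm(v)
H = np.eye(4) - 2.0 * np.outer(u, u)

Hx = H @ x
# Entries below the first are annihilated; the length is preserved.
assert np.allclose(Hx[1:], 0.0)
assert np.isclose(abs(Hx[0]), np.linalg.norm(x))
```

Repeating this on successive trailing columns of a matrix is exactly the Householder QR factorization mentioned above.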

## 39.21 Reflection Versus Projection

Reflection and projection are related but different.

| Operator | Defining equation | Action |
|---|---|---|
| Projection | \(P^2=P\) | Keeps one part and sends the other to zero |
| Reflection | \(R^2=I\) | Keeps one part and negates the other |

For a decomposition

$$
V=U\oplus W,
$$

the projection onto \(U\) along \(W\) is

$$
P(u+w)=u.
$$

The reflection across \(U\) along \(W\) is

$$
R(u+w)=u-w.
$$

They are connected by

$$
R=2P-I
$$

and

$$
P=\frac{1}{2}(I+R).
$$

## 39.22 Summary

A reflection operator is a linear operator that fixes one subspace and reverses a complementary subspace.

In its simplest algebraic form, a reflection satisfies

$$
R^2=I.
$$

Its eigenspaces are

$$
\ker(R-I)
$$

and

$$
\ker(R+I).
$$

The first is the fixed subspace. The second is the reversed subspace.

In an inner product space, the orthogonal reflection across a subspace \(U\) sends

$$
u+w
$$

to

$$
u-w,
$$

where

$$
u\in U,
\qquad
w\in U^\perp.
$$

For a unit normal vector \(u\), the reflection across the hyperplane perpendicular to \(u\) has matrix

$$
H=I-2uu^T.
$$

Reflections are orthogonal transformations. They preserve lengths and angles, have determinant \(-1\) in the hyperplane case, and are their own inverses.

They are fundamental in geometry, matrix factorization, numerical linear algebra, and the theory of symmetry.
