Orthogonality is the inner product version of perpendicularity. In Euclidean geometry, two nonzero vectors are perpendicular when they meet at a right angle. In an inner product space, the same idea is expressed algebraically: two vectors are orthogonal when their inner product is zero. This definition extends the geometry of right angles from the plane and three-dimensional space to arbitrary finite-dimensional and infinite-dimensional vector spaces.
47.1 Orthogonal Vectors
Let $V$ be an inner product space. Two vectors $u, v \in V$ are orthogonal if
$$\langle u, v \rangle = 0.$$
We write $u \perp v$.
In $\mathbb{R}^n$ with the standard dot product, this means
$$u \cdot v = u_1 v_1 + u_2 v_2 + \cdots + u_n v_n = 0.$$
For example, let
$$u = \begin{bmatrix} 1 \\ 2 \end{bmatrix}, \qquad v = \begin{bmatrix} -2 \\ 1 \end{bmatrix}.$$
Then
$$u \cdot v = (1)(-2) + (2)(1) = 0.$$
Hence $u$ and $v$ are orthogonal.
Orthogonality is symmetric in real inner product spaces. If $\langle u, v \rangle = 0$, then $\langle v, u \rangle = 0$. In complex inner product spaces, conjugate symmetry gives
$$\langle v, u \rangle = \overline{\langle u, v \rangle}.$$
Thus $\langle u, v \rangle = 0$ still implies $\langle v, u \rangle = 0$.
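To see the definition in action numerically, here is a minimal sketch assuming NumPy is available, using the example vectors above:

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([-2.0, 1.0])

# Orthogonality test: the dot product is zero.
print(np.dot(u, v))  # 0.0
```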
47.2 The Zero Vector
The zero vector is orthogonal to every vector.
Indeed, for any $v \in V$,
$$\langle 0, v \rangle = \langle 0 + 0, v \rangle = \langle 0, v \rangle + \langle 0, v \rangle,$$
so $\langle 0, v \rangle = 0$. Thus
$$0 \perp v$$
for every $v \in V$.
This fact is sometimes useful, but it must be interpreted carefully. In geometry, perpendicularity usually refers to nonzero directions. In inner product spaces, orthogonality is defined algebraically, so the zero vector is orthogonal to all vectors.
47.3 Orthogonal Sets
A set of vectors
$$\{v_1, v_2, \ldots, v_k\}$$
is orthogonal if every pair of distinct vectors is orthogonal:
$$\langle v_i, v_j \rangle = 0 \quad \text{whenever } i \neq j.$$
The set is orthonormal if it is orthogonal and each vector has norm one:
$$\|v_i\| = 1 \quad \text{for all } i.$$
This condition is often written using the Kronecker delta:
$$\langle v_i, v_j \rangle = \delta_{ij}.$$
Orthogonal sets separate directions. Orthonormal sets do more: they separate directions and normalize scale.
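Numerically, orthonormality can be tested in one step: stack the vectors as the columns of a matrix $Q$ and check that $Q^T Q$ is the identity, which is exactly the Kronecker delta condition. A minimal sketch assuming NumPy, with an arbitrary orthonormal pair in $\mathbb{R}^3$:

```python
import numpy as np

# Columns of Q: an orthonormal pair in R^3.
Q = np.array([[1.0,  1.0],
              [1.0, -1.0],
              [0.0,  0.0]]) / np.sqrt(2)

# Entry (i, j) of Q.T @ Q is <v_i, v_j>; orthonormality means it equals I.
print(np.allclose(Q.T @ Q, np.eye(2)))  # True
```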
47.4 Orthogonal Sets Are Linearly Independent
An orthogonal set of nonzero vectors is linearly independent.
Let
$$\{v_1, v_2, \ldots, v_k\}$$
be an orthogonal set, and suppose each $v_i$ is nonzero. Assume
$$c_1 v_1 + c_2 v_2 + \cdots + c_k v_k = 0.$$
Take the inner product with $v_j$. Then
$$\langle c_1 v_1 + \cdots + c_k v_k, v_j \rangle = \langle 0, v_j \rangle = 0.$$
By linearity,
$$c_1 \langle v_1, v_j \rangle + c_2 \langle v_2, v_j \rangle + \cdots + c_k \langle v_k, v_j \rangle = 0.$$
All terms vanish except the $j$-th term. Hence
$$c_j \langle v_j, v_j \rangle = 0.$$
Since $v_j \neq 0$,
$$\langle v_j, v_j \rangle = \|v_j\|^2 > 0.$$
Therefore
$$c_j = 0.$$
This holds for every $j$, so all coefficients are zero. The set is linearly independent.
47.5 Orthogonal Bases
An orthogonal basis is a basis whose vectors are pairwise orthogonal. An orthonormal basis is a basis whose vectors are pairwise orthogonal and have norm one.
If
$$\{v_1, v_2, \ldots, v_n\}$$
is an orthogonal basis for $V$, then every vector $x \in V$ has a unique expansion
$$x = c_1 v_1 + c_2 v_2 + \cdots + c_n v_n.$$
The coefficients are easy to compute. Taking the inner product with $v_j$ gives
$$\langle x, v_j \rangle = c_j \langle v_j, v_j \rangle.$$
Thus
$$c_j = \frac{\langle x, v_j \rangle}{\langle v_j, v_j \rangle}.$$
Therefore
$$x = \sum_{j=1}^{n} \frac{\langle x, v_j \rangle}{\langle v_j, v_j \rangle} \, v_j.$$
If the basis is orthonormal, then $\langle v_j, v_j \rangle = 1$, so the formula becomes
$$c_j = \langle x, v_j \rangle.$$
This is one of the main advantages of orthonormal bases. Coordinates are obtained directly by inner products.
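The following sketch, assuming NumPy and using an arbitrary rotated basis of $\mathbb{R}^2$, shows coordinates obtained directly by inner products:

```python
import numpy as np

# An orthonormal basis of R^2: the standard basis rotated by 0.3 radians.
t = 0.3
v1 = np.array([np.cos(t), np.sin(t)])
v2 = np.array([-np.sin(t), np.cos(t)])

x = np.array([2.0, -1.0])

# Coordinates are plain inner products: c_j = <x, v_j>.
c1, c2 = np.dot(x, v1), np.dot(x, v2)

# The expansion c1 v1 + c2 v2 recovers x.
print(np.allclose(c1 * v1 + c2 * v2, x))  # True
```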
47.6 Orthogonal Subspaces
Let $U$ and $W$ be subspaces of an inner product space $V$. The subspaces $U$ and $W$ are orthogonal if every vector in $U$ is orthogonal to every vector in $W$:
$$\langle u, w \rangle = 0 \quad \text{for all } u \in U,\ w \in W.$$
We write
$$U \perp W.$$
For example, in $\mathbb{R}^3$, the $x$-axis and the $yz$-plane are orthogonal. Every vector on the $x$-axis has the form
$$\begin{bmatrix} a \\ 0 \\ 0 \end{bmatrix},$$
and every vector in the $yz$-plane has the form
$$\begin{bmatrix} 0 \\ b \\ c \end{bmatrix}.$$
Their dot product is
$$(a)(0) + (0)(b) + (0)(c) = 0.$$
Thus the two subspaces are orthogonal.
47.7 Orthogonal Complement
Let $S$ be a subset of an inner product space $V$. The orthogonal complement of $S$, denoted $S^\perp$, is the set of all vectors in $V$ that are orthogonal to every vector in $S$:
$$S^\perp = \{\, v \in V : \langle v, s \rangle = 0 \text{ for all } s \in S \,\}.$$
If $S$ is a subspace, then $S^\perp$ consists of all vectors perpendicular to the entire subspace.
For example, in $\mathbb{R}^3$, let
$$S = \left\{ \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} \right\}.$$
Then $\operatorname{span}(S)$ is the $x$-axis, and
$$S^\perp = \left\{ \begin{bmatrix} 0 \\ y \\ z \end{bmatrix} : y, z \in \mathbb{R} \right\}.$$
Thus $S^\perp$ is the $yz$-plane.
47.8 The Orthogonal Complement Is a Subspace
For any subset $S \subseteq V$, the orthogonal complement $S^\perp$ is a subspace of $V$.
First, $0 \in S^\perp$, since
$$\langle 0, s \rangle = 0$$
for every $s \in S$.
Now let $u, v \in S^\perp$, and let $a, b$ be scalars. For every $s \in S$,
$$\langle a u + b v, s \rangle = a \langle u, s \rangle + b \langle v, s \rangle = 0.$$
Therefore
$$a u + b v \in S^\perp.$$
So $S^\perp$ is closed under linear combinations, and hence it is a subspace.
This is worth noting: even when $S$ itself is not a subspace, its orthogonal complement $S^\perp$ automatically is.
47.9 Orthogonal Complement of a Span
The orthogonal complement of a set is the same as the orthogonal complement of its span:
$$S^\perp = (\operatorname{span} S)^\perp.$$
Indeed, if a vector is orthogonal to every vector in $S$, then by linearity it is orthogonal to every linear combination of vectors in $S$. Conversely, since $S \subseteq \operatorname{span} S$, any vector orthogonal to the span is orthogonal to $S$.
This fact allows one to compute orthogonal complements using a spanning set rather than every vector in a subspace.
For example, if
$$W = \operatorname{span}\{w_1, w_2, \ldots, w_k\},$$
then $v \in W^\perp$ if and only if
$$\langle v, w_i \rangle = 0 \quad \text{for } i = 1, 2, \ldots, k.$$
Thus finding $W^\perp$ amounts to solving a homogeneous system of linear equations.
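Concretely, $W^\perp$ is the null space of the matrix whose rows are the spanning vectors, so it can be computed from a singular value decomposition. A sketch assuming NumPy; the tolerance 1e-10 is an arbitrary numerical cutoff:

```python
import numpy as np

# Rows of A span W, so W-perp is the null space of A.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 1.0]])

# Right singular vectors with zero singular value span the null space.
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
basis = Vt[rank:]              # rows form an orthonormal basis of W-perp

print(basis)                        # spans the line through (0, 1, -1)
print(np.allclose(A @ basis.T, 0))  # True: orthogonal to all of W
# Note dim W + dim W-perp = 2 + 1 = 3 = dim R^3.
```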
47.10 Dimension Formula
If $W$ is a subspace of a finite-dimensional inner product space $V$, then
$$\dim W + \dim W^\perp = \dim V.$$
Also,
$$W \cap W^\perp = \{0\}.$$
The intersection is trivial because if $v \in W \cap W^\perp$, then $v$ is in $W$ and $v$ is orthogonal to every vector in $W$. In particular, $v$ is orthogonal to itself:
$$\langle v, v \rangle = 0.$$
Positive definiteness gives $v = 0$.
In finite dimensions, these facts imply that every vector in $V$ can be written uniquely as a sum of a vector in $W$ and a vector in $W^\perp$:
$$V = W \oplus W^\perp.$$
47.11 Orthogonal Decomposition
The equation
$$V = W \oplus W^\perp$$
means that each vector $v \in V$ has a unique decomposition
$$v = w + w^\perp,$$
where
$$w \in W, \qquad w^\perp \in W^\perp.$$
The vector $w$ is the component of $v$ inside $W$. The vector $w^\perp$ is the component of $v$ perpendicular to $W$.
This decomposition is central to projection and approximation. It separates a vector into an explained part and a residual part.
In finite-dimensional Euclidean space, this is the familiar operation of dropping a perpendicular from a point to a line, plane, or higher-dimensional subspace.
47.12 Projection onto a One-Dimensional Subspace
Let $a \neq 0$, and let
$$W = \operatorname{span}\{a\}.$$
The orthogonal projection of $v$ onto $W$ is
$$\operatorname{proj}_W(v) = \frac{\langle v, a \rangle}{\langle a, a \rangle}\, a.$$
The residual vector is
$$r = v - \operatorname{proj}_W(v).$$
This residual is orthogonal to $a$. Indeed,
$$\langle r, a \rangle = \langle v, a \rangle - \frac{\langle v, a \rangle}{\langle a, a \rangle} \langle a, a \rangle = 0.$$
Thus
$$v = \operatorname{proj}_W(v) + r$$
is an orthogonal decomposition.
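A numerical version of this decomposition, as a sketch assuming NumPy with arbitrary example vectors:

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])
v = np.array([3.0, 0.0, 1.0])

# proj_W(v) = (<v, a> / <a, a>) a  for W = span{a}.
p = (np.dot(v, a) / np.dot(a, a)) * a
r = v - p

print(np.dot(r, a))   # 0.0: the residual is orthogonal to a
print(p + r)          # recovers v
```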
47.13 Projection onto an Orthogonal Basis
Suppose
$$W = \operatorname{span}\{w_1, w_2, \ldots, w_k\},$$
where $w_1, \ldots, w_k$ are nonzero orthogonal vectors. Then the projection of $v$ onto $W$ is
$$\operatorname{proj}_W(v) = \sum_{i=1}^{k} \frac{\langle v, w_i \rangle}{\langle w_i, w_i \rangle}\, w_i.$$
If the vectors are orthonormal, this simplifies to
$$\operatorname{proj}_W(v) = \sum_{i=1}^{k} \langle v, w_i \rangle\, w_i.$$
The residual
$$r = v - \operatorname{proj}_W(v)$$
lies in $W^\perp$.
This formula is computationally simple because each basis direction can be handled independently.
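A sketch of the formula in NumPy, using an arbitrary orthogonal (not normalized) pair spanning a plane in $\mathbb{R}^3$:

```python
import numpy as np

w1 = np.array([1.0,  1.0, 0.0])
w2 = np.array([1.0, -1.0, 0.0])   # orthogonal to w1
v  = np.array([2.0,  3.0, 4.0])

# Each basis direction is handled independently.
p = sum((np.dot(v, w) / np.dot(w, w)) * w for w in (w1, w2))

print(p)                                      # [2. 3. 0.]
print(np.dot(v - p, w1), np.dot(v - p, w2))   # 0.0 0.0
```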
47.14 Projection Matrices
A projection matrix is a square matrix $P$ satisfying
$$P^2 = P.$$
This equation says that applying the projection twice has the same effect as applying it once. A real projection matrix is an orthogonal projection matrix when, in addition,
$$P^T = P.$$
Equivalently, an orthogonal projection matrix satisfies the single condition
$$P^T P = P.$$
For complex matrices, the transpose is replaced by the conjugate transpose:
$$P^* = P.$$
Projection matrices formalize the geometric idea of projecting vectors onto subspaces. Orthogonal projection matrices are self-adjoint idempotent operators.
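Both conditions are easy to check mechanically. The sketch below, assuming NumPy, contrasts the orthogonal projection onto the $x$-axis of $\mathbb{R}^2$ with a hypothetical oblique projection that is idempotent but not symmetric:

```python
import numpy as np

# Orthogonal projection onto the x-axis.
P = np.array([[1.0, 0.0],
              [0.0, 0.0]])
print(np.allclose(P @ P, P), np.allclose(P.T, P))  # True True

# Oblique projection: idempotent but not symmetric,
# so it is a projection, but not an orthogonal one.
Q = np.array([[1.0, 1.0],
              [0.0, 0.0]])
print(np.allclose(Q @ Q, Q), np.allclose(Q.T, Q))  # True False
```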
47.15 Projection onto a Column Space
Let $A$ be an $m \times n$ matrix with linearly independent columns. The column space of $A$ is
$$C(A) = \{\, A x : x \in \mathbb{R}^n \,\}.$$
The orthogonal projection of $b$ onto $C(A)$ has the form
$$p = A \hat{x}.$$
The residual
$$r = b - A \hat{x}$$
must be orthogonal to every column of $A$. This condition is
$$A^T (b - A \hat{x}) = 0.$$
Therefore
$$A^T A \hat{x} = A^T b.$$
These are the normal equations.
Since the columns of $A$ are linearly independent, $A^T A$ is invertible, so
$$\hat{x} = (A^T A)^{-1} A^T b.$$
Thus the projection is
$$p = A (A^T A)^{-1} A^T b.$$
The projection matrix onto $C(A)$ is
$$P = A (A^T A)^{-1} A^T.$$
This formula is fundamental in least squares.
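A minimal sketch assuming NumPy, with an arbitrary $3 \times 2$ matrix, verifying the defining properties of $P$:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])   # linearly independent columns

# Projection matrix onto C(A): P = A (A^T A)^{-1} A^T.
P = A @ np.linalg.inv(A.T @ A) @ A.T

print(np.allclose(P @ P, P))   # idempotent
print(np.allclose(P.T, P))     # symmetric
print(np.allclose(P @ A, A))   # fixes vectors already in C(A)
```

In practice one avoids forming the explicit inverse and instead solves the normal equations directly or uses a QR factorization, which is numerically more stable.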
47.16 Orthogonality and Least Squares
A system
$$A x = b$$
may have no exact solution when $b$ is not in the column space of $A$. In that case, one seeks an approximate solution $\hat{x}$ such that
$$A \hat{x}$$
is as close as possible to $b$.
The error is
$$e = b - A \hat{x}.$$
The least squares principle chooses $\hat{x}$ so that
$$\|e\|^2 = \|b - A \hat{x}\|^2$$
is minimized.
The geometric condition for the minimum is
$$e \perp C(A).$$
That is,
$$A^T e = A^T (b - A \hat{x}) = 0.$$
This again gives
$$A^T A \hat{x} = A^T b.$$
Thus least squares is an orthogonality problem. The best approximation is obtained when the residual is perpendicular to the approximation space.
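The sketch below, assuming NumPy and arbitrary small data, solves the normal equations directly and confirms both agreement with the library least squares routine and orthogonality of the residual:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Normal equations: A^T A x = A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# The library solver should agree.
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_hat, x_ls))   # True

# The residual is orthogonal to the columns of A.
print(A.T @ (b - A @ x_hat))      # [0. 0.]
```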
47.17 Orthogonal Decomposition of Row, Column, and Null Spaces
For a real $m \times n$ matrix $A$, the four fundamental subspaces are related by orthogonality.
The null space of $A$ is orthogonal to the row space of $A$:
$$N(A) = C(A^T)^\perp.$$
Indeed, $A x = 0$ means every row of $A$ has dot product zero with $x$.
Similarly, the null space of $A^T$, also called the left null space of $A$, is orthogonal to the column space of $A$:
$$N(A^T) = C(A)^\perp.$$
These relationships are a central part of the fundamental theorem of linear algebra.
They explain how solutions, constraints, residuals, and images fit together geometrically.
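A small numerical check of the first relation, as a sketch assuming NumPy with an arbitrary rank-one matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])   # rank 1

# x is in the null space: A x = 0.
x = np.array([3.0, 0.0, -1.0])
print(A @ x)                      # [0. 0.]

# Hence x is orthogonal to every row, and so to the whole row space.
print(np.dot(A[0], x), np.dot(A[1], x))  # 0.0 0.0
```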
47.18 Pythagorean Theorem
If $u \perp v$, then
$$\|u + v\|^2 = \|u\|^2 + \|v\|^2.$$
Proof:
$$\|u + v\|^2 = \langle u + v, u + v \rangle.$$
By linearity and symmetry,
$$\langle u + v, u + v \rangle = \|u\|^2 + 2 \langle u, v \rangle + \|v\|^2.$$
Since $\langle u, v \rangle = 0$, the middle term vanishes. Therefore
$$\|u + v\|^2 = \|u\|^2 + \|v\|^2.$$
More generally, if $v_1, v_2, \ldots, v_k$ are pairwise orthogonal, then
$$\|v_1 + v_2 + \cdots + v_k\|^2 = \|v_1\|^2 + \|v_2\|^2 + \cdots + \|v_k\|^2.$$
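A one-line numerical confirmation, assuming NumPy:

```python
import numpy as np

u = np.array([3.0, 0.0])
v = np.array([0.0, 4.0])   # orthogonal to u

# ||u + v||^2 = ||u||^2 + ||v||^2 = 9 + 16 = 25.
print(np.linalg.norm(u + v) ** 2)   # 25.0
```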
47.19 Orthogonality in Function Spaces
Orthogonality is not limited to coordinate vectors.
Let $V$ be a space of real-valued functions on $[a, b]$ with inner product
$$\langle f, g \rangle = \int_a^b f(x)\, g(x)\, dx.$$
Then $f$ and $g$ are orthogonal if
$$\int_a^b f(x)\, g(x)\, dx = 0.$$
For example, on $[-\pi, \pi]$,
$$\sin x \perp \cos x,$$
because
$$\int_{-\pi}^{\pi} \sin x \cos x \, dx = \frac{1}{2} \int_{-\pi}^{\pi} \sin 2x \, dx = 0.$$
Orthogonal functions are central in Fourier series, approximation theory, differential equations, and signal processing.
In this setting, projection becomes approximation by functions. A function can be projected onto a subspace spanned by simpler functions, such as polynomials or trigonometric functions.
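Such integrals can also be checked by numerical quadrature. A sketch assuming NumPy and SciPy are available:

```python
import numpy as np
from scipy.integrate import quad

# <sin, cos> on [-pi, pi] via numerical integration.
inner, _ = quad(lambda x: np.sin(x) * np.cos(x), -np.pi, np.pi)
print(inner)   # ~ 0 (up to quadrature error): sin and cos are orthogonal
```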
47.20 Summary
Orthogonality generalizes perpendicularity to inner product spaces. Two vectors are orthogonal when their inner product is zero. Orthogonal sets of nonzero vectors are linearly independent. Orthogonal and orthonormal bases give simple coordinate formulas.
The orthogonal complement $S^\perp$ is the set of all vectors orthogonal to a set $S$. It is always a subspace, and in finite-dimensional inner product spaces a subspace $W$ satisfies
$$V = W \oplus W^\perp.$$
This decomposition leads directly to projection. Projection gives the closest vector in a subspace, and the residual is orthogonal to that subspace. This principle underlies least squares, approximation, Fourier methods, and much of numerical linear algebra.