An orthonormal basis is a basis made from unit vectors that are mutually orthogonal. It is one of the most useful structures in linear algebra because it combines two properties at once: every vector can be expressed uniquely in the basis, and the coefficients are found directly by inner products.
In an ordinary basis, coordinates may require solving a linear system. In an orthonormal basis, coordinates are obtained by taking dot products. This makes orthonormal bases central in projection, least squares, Fourier analysis, numerical linear algebra, signal processing, and spectral theory. Standard references define an orthonormal set as an orthogonal set of unit vectors, and such a set is linearly independent.
49.1 Unit Vectors
Let $V$ be an inner product space. A vector $v \in V$ is a unit vector if
$$\|v\| = 1.$$
Since the norm is induced by the inner product,
$$\|v\| = \sqrt{\langle v, v \rangle}.$$
Thus $v$ is a unit vector exactly when
$$\langle v, v \rangle = 1.$$
For example, in $\mathbb{R}^2$, the standard basis vectors
$$e_1 = (1, 0), \qquad e_2 = (0, 1)$$
are unit vectors.
The vector
$$w = (1, 1)$$
is not a unit vector because
$$\|w\| = \sqrt{1^2 + 1^2} = \sqrt{2} \neq 1.$$
To make it a unit vector, divide by its length:
$$\frac{w}{\|w\|} = \left( \tfrac{1}{\sqrt{2}}, \tfrac{1}{\sqrt{2}} \right).$$
This process is called normalization.
49.2 Normalization
If $v \neq 0$, its normalization is
$$\hat{v} = \frac{v}{\|v\|}.$$
Then
$$\|\hat{v}\| = \frac{\|v\|}{\|v\|} = 1.$$
Thus $\hat{v}$ is a unit vector in the same direction as $v$.
Normalization changes the length of a vector but not its direction. It is the operation that converts an orthogonal basis into an orthonormal basis. If $v_1, \dots, v_k$ are nonzero orthogonal vectors, then
$$u_i = \frac{v_i}{\|v_i\|}, \qquad i = 1, \dots, k,$$
form an orthonormal set. This follows because scaling nonzero orthogonal vectors preserves orthogonality and gives each vector length one.
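This can be checked numerically. The following sketch (using NumPy, with two illustrative orthogonal vectors in $\mathbb{R}^2$) normalizes each vector and verifies the orthonormality conditions:

```python
import numpy as np

# Two orthogonal, but not unit-length, vectors in R^2 (illustrative choice).
v1 = np.array([1.0, 1.0])
v2 = np.array([1.0, -1.0])

# Normalize each vector: u_i = v_i / ||v_i||.
u1 = v1 / np.linalg.norm(v1)
u2 = v2 / np.linalg.norm(v2)

# Check <u_i, u_j> = delta_ij: unit lengths and mutual orthogonality.
print(np.dot(u1, u1))  # ≈ 1.0
print(np.dot(u2, u2))  # ≈ 1.0
print(np.dot(u1, u2))  # ≈ 0.0
```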
49.3 Orthonormal Sets
A set of vectors
$$u_1, u_2, \dots, u_k$$
is orthonormal if
$$\langle u_i, u_j \rangle = \begin{cases} 1 & i = j, \\ 0 & i \neq j. \end{cases}$$
Equivalently,
$$\langle u_i, u_j \rangle = \delta_{ij},$$
where $\delta_{ij}$ is the Kronecker delta.
The condition contains two requirements. When $i = j$,
$$\langle u_i, u_i \rangle = \|u_i\|^2 = 1,$$
so every vector has length one. When $i \neq j$,
$$\langle u_i, u_j \rangle = 0,$$
so distinct vectors are orthogonal.
For example,
$$u_1 = \tfrac{1}{\sqrt{2}}(1, 1), \qquad u_2 = \tfrac{1}{\sqrt{2}}(1, -1)$$
form an orthonormal set in $\mathbb{R}^2$. Indeed,
$$\langle u_1, u_1 \rangle = \tfrac{1}{2}(1 + 1) = 1, \qquad \langle u_2, u_2 \rangle = \tfrac{1}{2}(1 + 1) = 1,$$
and
$$\langle u_1, u_2 \rangle = \tfrac{1}{2}(1 - 1) = 0.$$
49.4 Orthonormal Bases
An orthonormal basis is an orthonormal set that is also a basis.
Thus $u_1, \dots, u_n$ form an orthonormal basis for $V$ if:
| Property | Meaning |
|---|---|
| Spanning | Every vector in $V$ is a linear combination of the $u_i$ |
| Linear independence | The representation is unique |
| Orthogonality | Distinct basis vectors have inner product zero |
| Normalization | Each basis vector has length one |
In a finite-dimensional inner product space of dimension $n$, any orthonormal set with $n$ vectors is automatically an orthonormal basis. This follows because every orthonormal set is linearly independent, and a linearly independent set whose size equals the dimension of the space is a basis.
49.5 Orthonormal Sets Are Linearly Independent
Every orthonormal set is linearly independent.
Let
$$u_1, \dots, u_k$$
be an orthonormal set, and suppose
$$c_1 u_1 + c_2 u_2 + \cdots + c_k u_k = 0.$$
Take the inner product with $u_j$. Then
$$\langle c_1 u_1 + \cdots + c_k u_k, u_j \rangle = \langle 0, u_j \rangle = 0.$$
By linearity,
$$c_1 \langle u_1, u_j \rangle + c_2 \langle u_2, u_j \rangle + \cdots + c_k \langle u_k, u_j \rangle = 0.$$
All terms vanish except the $j$-th term. Since $\langle u_j, u_j \rangle = 1$, we get
$$c_j = 0.$$
This holds for every $j$. Therefore all coefficients are zero, so the set is linearly independent.
This proof is one reason orthonormal systems are algebraically clean. The inner product isolates one coordinate at a time.
49.6 Coordinates in an Orthonormal Basis
Let
$$u_1, \dots, u_n$$
be an orthonormal basis for $V$. Every vector $v \in V$ has a unique representation
$$v = c_1 u_1 + c_2 u_2 + \cdots + c_n u_n.$$
To find $c_j$, take the inner product with $u_j$:
$$\langle v, u_j \rangle = \langle c_1 u_1 + \cdots + c_n u_n, u_j \rangle.$$
Using orthonormality,
$$\langle v, u_j \rangle = c_j \langle u_j, u_j \rangle = c_j.$$
Therefore
$$c_j = \langle v, u_j \rangle.$$
Hence
$$v = \sum_{j=1}^{n} \langle v, u_j \rangle \, u_j.$$
This is the coordinate formula for an orthonormal basis. It replaces solving a system by taking inner products. Coordinate formulas of this form are a primary advantage of orthogonal and orthonormal bases.
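The contrast can be seen directly in code. In this sketch (NumPy, with a random orthonormal basis built via QR factorization for illustration), coordinates obtained by solving a linear system agree with coordinates obtained from inner products:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random orthonormal basis of R^4: the columns of Q.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
v = rng.standard_normal(4)

# Coordinates the general way: solve the linear system Q c = v.
c_solve = np.linalg.solve(Q, v)

# Coordinates the orthonormal way: c_j = <v, u_j>, i.e. c = Q^T v.
c_inner = Q.T @ v

print(np.allclose(c_solve, c_inner))  # True
```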
49.7 Coordinate Vector
If $B = \{u_1, \dots, u_n\}$ is an orthonormal basis, then the coordinate vector of $v$ relative to $B$ is
$$[v]_B = \big( \langle v, u_1 \rangle, \langle v, u_2 \rangle, \dots, \langle v, u_n \rangle \big).$$
For example, let
$$u_1 = \tfrac{1}{\sqrt{2}}(1, 1), \qquad u_2 = \tfrac{1}{\sqrt{2}}(1, -1),$$
and let
$$v = (3, 1).$$
Then
$$\langle v, u_1 \rangle = \tfrac{1}{\sqrt{2}}(3 + 1) = 2\sqrt{2}$$
and
$$\langle v, u_2 \rangle = \tfrac{1}{\sqrt{2}}(3 - 1) = \sqrt{2}.$$
Thus
$$[v]_B = (2\sqrt{2}, \sqrt{2}).$$
Therefore
$$v = 2\sqrt{2}\, u_1 + \sqrt{2}\, u_2.$$
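A coordinate-vector computation of this kind is easy to verify numerically. This sketch (NumPy, taking the illustrative basis $u_1 = \tfrac{1}{\sqrt{2}}(1,1)$, $u_2 = \tfrac{1}{\sqrt{2}}(1,-1)$ and $v = (3,1)$) computes the coordinates by dot products and reconstructs $v$:

```python
import numpy as np

u1 = np.array([1.0, 1.0]) / np.sqrt(2)
u2 = np.array([1.0, -1.0]) / np.sqrt(2)
v = np.array([3.0, 1.0])

# Coordinates relative to the orthonormal basis {u1, u2}.
c1 = np.dot(v, u1)  # 2*sqrt(2)
c2 = np.dot(v, u2)  # sqrt(2)

# Reconstruct v from its coordinates.
print(np.allclose(c1 * u1 + c2 * u2, v))  # True
```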
49.8 Matrix Form
Let $Q$ be the matrix whose columns are the orthonormal vectors
$$u_1, u_2, \dots, u_k.$$
Then
$$Q = \begin{bmatrix} u_1 & u_2 & \cdots & u_k \end{bmatrix}.$$
The condition that the columns are orthonormal is
$$Q^{\mathsf{T}} Q = I$$
in the real case.
In the complex case, the condition is
$$Q^{*} Q = I,$$
where $Q^{*}$ is the conjugate transpose.
If $Q$ is square, then
$$Q^{-1} = Q^{\mathsf{T}}$$
in the real case, and
$$Q^{-1} = Q^{*}$$
in the complex case.
A real square matrix with orthonormal columns is called an orthogonal matrix. A complex square matrix with orthonormal columns is called a unitary matrix.
49.9 Orthogonal Matrices
A real square matrix $Q$ is orthogonal if
$$Q^{\mathsf{T}} Q = I.$$
Since $Q$ is square, this also implies
$$Q Q^{\mathsf{T}} = I.$$
Thus
$$Q^{-1} = Q^{\mathsf{T}}.$$
Orthogonal matrices preserve inner products:
$$\langle Q x, Q y \rangle = (Qx)^{\mathsf{T}} (Qy) = x^{\mathsf{T}} Q^{\mathsf{T}} Q \, y = \langle x, y \rangle.$$
They also preserve norms:
$$\|Q x\| = \|x\|.$$
Therefore orthogonal matrices represent rigid linear transformations: rotations, reflections, and combinations of them.
They do not stretch or shrink Euclidean length.
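A rotation matrix makes these properties concrete. The sketch below (NumPy, with an arbitrary illustrative angle and vectors) confirms that a plane rotation satisfies $Q^{\mathsf{T}}Q = I$ and preserves inner products and norms:

```python
import numpy as np

theta = 0.7  # illustrative rotation angle
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

x = np.array([3.0, -4.0])
y = np.array([1.0, 2.0])

print(np.allclose(Q.T @ Q, np.eye(2)))                        # Q^T Q = I
print(np.allclose(np.dot(Q @ x, Q @ y), np.dot(x, y)))        # inner products preserved
print(np.allclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))  # norms preserved
```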
49.10 Unitary Matrices
In complex vector spaces, the analogue of an orthogonal matrix is a unitary matrix.
A square complex matrix $U$ is unitary if
$$U^{*} U = I.$$
Then
$$U^{-1} = U^{*} \quad \text{and} \quad U U^{*} = I.$$
Unitary matrices preserve complex inner products:
$$\langle U x, U y \rangle = \langle x, y \rangle.$$
They also preserve Euclidean norm:
$$\|U x\| = \|x\|.$$
Unitary matrices appear throughout spectral theory, quantum mechanics, Fourier analysis, and numerical linear algebra. The discrete Fourier transform matrix, after proper normalization, is unitary.
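This can be checked for the DFT directly. The sketch below (NumPy, using the standard normalization $F_{jk} = e^{-2\pi i jk/n}/\sqrt{n}$ at an illustrative size $n = 8$) verifies unitarity and norm preservation:

```python
import numpy as np

n = 8
j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")

# Normalized DFT matrix: F[j, k] = exp(-2*pi*i*j*k/n) / sqrt(n).
F = np.exp(-2j * np.pi * j * k / n) / np.sqrt(n)

# Unitarity: F* F = I, where F* is the conjugate transpose.
print(np.allclose(F.conj().T @ F, np.eye(n)))  # True

# Norm preservation for an arbitrary vector.
x = np.arange(n, dtype=float)
print(np.allclose(np.linalg.norm(F @ x), np.linalg.norm(x)))  # True
```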
49.11 Projection onto an Orthonormal Basis
Let $W \subseteq V$ be a subspace with orthonormal basis
$$u_1, \dots, u_k.$$
The orthogonal projection of $v$ onto $W$ is
$$\operatorname{proj}_W(v) = \sum_{i=1}^{k} \langle v, u_i \rangle \, u_i.$$
This formula follows from the coordinate formula, applied only to the subspace $W$.
The residual
$$r = v - \operatorname{proj}_W(v)$$
is orthogonal to every basis vector $u_j$:
$$\langle r, u_j \rangle = \langle v, u_j \rangle - \langle v, u_j \rangle = 0.$$
Therefore
$$r \perp W.$$
Projection with an orthonormal basis avoids the matrix inverse appearing in the general projection formula.
49.12 Projection Matrix
Let $Q$ be an $n \times k$ matrix with orthonormal columns. Thus
$$Q^{\mathsf{T}} Q = I_k.$$
The projection of $v$ onto $\operatorname{col}(Q)$ is
$$\operatorname{proj}(v) = Q Q^{\mathsf{T}} v.$$
Hence the projection matrix is
$$P = Q Q^{\mathsf{T}}.$$
This is simpler than the general formula
$$P = A (A^{\mathsf{T}} A)^{-1} A^{\mathsf{T}}.$$
The simplification occurs because $Q^{\mathsf{T}} Q = I$. Orthonormal columns remove the need to invert the Gram matrix.
The matrix $P = Q Q^{\mathsf{T}}$ satisfies
$$P^2 = P$$
and
$$P^{\mathsf{T}} = P.$$
Thus it is an orthogonal projection matrix.
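Both properties, and the orthogonality of the residual, can be verified numerically. This sketch (NumPy, with an orthonormal basis for a random illustrative plane in $\mathbb{R}^3$ obtained from a QR factorization) checks them:

```python
import numpy as np

rng = np.random.default_rng(0)

# Orthonormal basis for a random 2-dimensional subspace of R^3.
Q, _ = np.linalg.qr(rng.standard_normal((3, 2)))

P = Q @ Q.T  # projection matrix onto col(Q)

print(np.allclose(P @ P, P))  # idempotent: P^2 = P
print(np.allclose(P.T, P))    # symmetric: P^T = P

v = np.array([1.0, 2.0, 3.0])
r = v - P @ v                 # residual of the projection
print(np.allclose(Q.T @ r, 0))  # residual orthogonal to the subspace
```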
49.13 Parseval Identity
Let $u_1, \dots, u_n$ be an orthonormal basis for $V$. If
$$v = \sum_{i=1}^{n} c_i u_i,$$
then
$$\|v\|^2 = \sum_{i=1}^{n} |c_i|^2.$$
Since
$$\|v\|^2 = \Big\langle \sum_i c_i u_i, \; \sum_j c_j u_j \Big\rangle = \sum_{i,j} c_i \overline{c_j} \, \langle u_i, u_j \rangle,$$
we have
$$\|v\|^2 = \sum_{i=1}^{n} |c_i|^2.$$
This is Parseval’s identity in finite-dimensional form.
It says that the squared length of a vector equals the sum of the squared magnitudes of its orthonormal coordinates.
In $\mathbb{R}^n$, this generalizes the usual formula
$$\|v\|^2 = v_1^2 + v_2^2 + \cdots + v_n^2.$$
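Parseval's identity is easy to test numerically. This sketch (NumPy, with a random illustrative orthonormal basis from a QR factorization) compares the squared norm of a vector with the sum of its squared coordinates:

```python
import numpy as np

rng = np.random.default_rng(1)

# Random orthonormal basis of R^5: the columns of Q.
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))
v = rng.standard_normal(5)

c = Q.T @ v  # orthonormal coordinates of v

# Parseval: ||v||^2 equals the sum of squared coordinates.
print(np.allclose(np.linalg.norm(v) ** 2, np.sum(c ** 2)))  # True
```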
49.14 Bessel Inequality
If $u_1, \dots, u_k$ is an orthonormal set, but not necessarily a basis for all of $V$, then
$$\sum_{i=1}^{k} |\langle v, u_i \rangle|^2 \leq \|v\|^2.$$
This is Bessel’s inequality.
It says that the energy captured by projection onto the span of $u_1, \dots, u_k$ cannot exceed the total energy of $v$.
The inequality becomes an equality exactly when $v$ lies in the span of the orthonormal set. In that case, the set captures all of $v$.
49.15 Distance to a Subspace
Let $W$ have orthonormal basis $u_1, \dots, u_k$. The closest vector in $W$ to $v$ is
$$\hat{v} = \operatorname{proj}_W(v) = \sum_{i=1}^{k} \langle v, u_i \rangle \, u_i.$$
The distance from $v$ to $W$ is
$$d(v, W) = \|v - \hat{v}\|.$$
Using orthogonal decomposition,
$$v = \hat{v} + r, \qquad r \perp W.$$
Then
$$\|v\|^2 = \|\hat{v}\|^2 + \|r\|^2.$$
Since
$$\|\hat{v}\|^2 = \sum_{i=1}^{k} |\langle v, u_i \rangle|^2,$$
we obtain
$$d(v, W)^2 = \|v\|^2 - \sum_{i=1}^{k} |\langle v, u_i \rangle|^2.$$
This formula is useful in least squares and approximation.
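The distance formula can be checked against the direct computation. This sketch (NumPy, with a random illustrative two-dimensional subspace of $\mathbb{R}^5$) compares the two:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 2))
Q, _ = np.linalg.qr(A)          # orthonormal basis for W = col(A)
v = rng.standard_normal(5)

# Distance directly: ||v - proj_W(v)||.
proj = Q @ (Q.T @ v)
d_direct = np.linalg.norm(v - proj)

# Distance via the formula: d^2 = ||v||^2 - sum |<v, u_i>|^2.
d_formula = np.sqrt(np.linalg.norm(v) ** 2 - np.sum((Q.T @ v) ** 2))

print(np.allclose(d_direct, d_formula))  # True
```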
49.16 Change of Orthonormal Basis
Let $u_1, \dots, u_n$ and $w_1, \dots, w_n$ be two orthonormal bases of a real inner product space. The change-of-basis matrix from $u$-coordinates to $w$-coordinates is orthogonal.
Indeed, both coordinate systems preserve inner products and lengths. Therefore the transformation between them also preserves inner products and lengths.
In matrix form, if $U$ and $W$ are the matrices with columns $u_1, \dots, u_n$ and $w_1, \dots, w_n$, then
$$U^{\mathsf{T}} U = I \quad \text{and} \quad W^{\mathsf{T}} W = I.$$
The change-of-basis matrix is
$$S = W^{\mathsf{T}} U.$$
It is orthogonal because
$$S^{\mathsf{T}} S = U^{\mathsf{T}} W W^{\mathsf{T}} U = U^{\mathsf{T}} U = I.$$
Thus moving between orthonormal coordinate systems is numerically stable and geometrically rigid.
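The sketch below (NumPy, with two random illustrative orthonormal bases; the change-of-basis matrix is taken as $S = W^{\mathsf{T}} U$, mapping $U$-coordinates to $W$-coordinates) checks that $S$ is orthogonal and converts coordinates correctly:

```python
import numpy as np

rng = np.random.default_rng(3)
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # orthonormal basis u_1..u_4
W, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # orthonormal basis w_1..w_4

# Change-of-basis matrix from U-coordinates to W-coordinates.
S = W.T @ U

print(np.allclose(S.T @ S, np.eye(4)))  # S is orthogonal

# Consistency: if v = U c, then its W-coordinates are S c.
c = rng.standard_normal(4)
v = U @ c
print(np.allclose(W.T @ v, S @ c))      # True
```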
49.17 Orthonormal Bases and Least Squares
In least squares, one often wants to approximate a vector $b$ by a vector in the column space of a matrix.
If the columns of the matrix $A$ are not orthonormal, the projection formula is
$$\hat{b} = A (A^{\mathsf{T}} A)^{-1} A^{\mathsf{T}} b.$$
If the columns are orthonormal, write the matrix as $Q$. Then
$$\hat{b} = Q Q^{\mathsf{T}} b.$$
The least squares coefficients are
$$\hat{x} = Q^{\mathsf{T}} b.$$
This is simpler and more stable than solving the normal equations.
This observation motivates the QR factorization. Instead of working directly with $A$, we factor it as
$$A = Q R,$$
where $Q$ has orthonormal columns and $R$ is upper triangular. The orthonormal factor carries the geometry of the column space.
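A minimal sketch of QR-based least squares (NumPy, on a random illustrative overdetermined system) compares the QR solution with the normal equations:

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((20, 3))
b = rng.standard_normal(20)

# QR factorization: A = Q R with orthonormal columns in Q.
Q, R = np.linalg.qr(A)

# Least squares via QR: A x ≈ b reduces to R x = Q^T b.
x_qr = np.linalg.solve(R, Q.T @ b)

# Compare with the normal equations A^T A x = A^T b.
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

print(np.allclose(x_qr, x_normal))  # True
```

In practice the QR route is preferred because forming $A^{\mathsf{T}}A$ squares the condition number of the problem.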
49.18 Orthonormal Bases in Function Spaces
Orthonormal bases also occur in spaces of functions.
For functions on an interval $[a, b]$, an inner product may be defined by
$$\langle f, g \rangle = \int_a^b f(x)\, g(x)\, dx.$$
A sequence of functions $\phi_1, \phi_2, \dots$ is orthonormal if
$$\langle \phi_m, \phi_n \rangle = \delta_{mn}.$$
For example, trigonometric functions form orthogonal and, after scaling, orthonormal systems on intervals such as $[-\pi, \pi]$. In Fourier analysis, a function $f$ is represented by coefficients obtained from inner products:
$$c_n = \langle f, \phi_n \rangle.$$
This is the infinite-dimensional analogue of coordinates in an orthonormal basis.
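As a rough numerical sketch (NumPy, approximating the integral inner product on $[-\pi, \pi]$ by a Riemann sum; the two scaled trigonometric functions and the test function are illustrative choices), the coefficients of a function really are recovered by inner products:

```python
import numpy as np

# Uniform grid on [-pi, pi] for a Riemann-sum inner product.
n = 4096
x = np.linspace(-np.pi, np.pi, n, endpoint=False)
dx = x[1] - x[0]

def inner(f, g):
    # Approximates the integral of f*g over [-pi, pi].
    return np.sum(f * g) * dx

# Orthonormal trigonometric functions on [-pi, pi] (after scaling by 1/sqrt(pi)).
phi1 = np.sin(x) / np.sqrt(np.pi)
phi2 = np.cos(2 * x) / np.sqrt(np.pi)

# A function built with known coefficients 3 and -2.
f = 3.0 * phi1 - 2.0 * phi2

# The inner products recover the coefficients.
c1 = inner(f, phi1)
c2 = inner(f, phi2)
print(round(c1, 6), round(c2, 6))  # ≈ 3.0  -2.0
```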
49.19 Numerical Importance
Orthonormal bases are important in numerical computation because they control error.
When a matrix $Q$ has orthonormal columns,
$$\|Q x\| = \|x\|,$$
so multiplying by $Q$ does not amplify Euclidean length. This gives stable algorithms for projections, least squares, eigenvalue computations, and matrix factorizations.
In contrast, a poorly conditioned basis may distort coordinates. Small errors in a vector can become large errors in its coordinate representation.
For this reason, numerical linear algebra often replaces arbitrary bases by orthonormal bases. The Gram-Schmidt process, Householder reflections, and Givens rotations are standard methods for doing this.
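The Gram-Schmidt process mentioned above can be sketched in a few lines. This is a minimal classical Gram-Schmidt (NumPy; in production code the modified variant or a QR routine is preferred for numerical stability):

```python
import numpy as np

def gram_schmidt(A):
    """Classical Gram-Schmidt: orthonormalize the columns of A."""
    n, k = A.shape
    Q = np.zeros((n, k))
    for j in range(k):
        q = A[:, j].copy()
        for i in range(i_start := 0, j):
            # Subtract the component of column j along each earlier direction.
            q -= np.dot(Q[:, i], A[:, j]) * Q[:, i]
        Q[:, j] = q / np.linalg.norm(q)  # normalize the remainder
    return Q

rng = np.random.default_rng(5)
A = rng.standard_normal((5, 3))
Q = gram_schmidt(A)

print(np.allclose(Q.T @ Q, np.eye(3)))  # columns are orthonormal
```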
49.20 Summary
An orthonormal basis is a basis $u_1, \dots, u_n$ satisfying
$$\langle u_i, u_j \rangle = \delta_{ij}.$$
It gives the expansion
$$v = \sum_{j=1}^{n} \langle v, u_j \rangle \, u_j.$$
Thus coordinates are obtained by inner products.
If $Q$ is the matrix whose columns are an orthonormal basis, then
$$Q^{\mathsf{T}} Q = I.$$
When $Q$ is square,
$$Q^{-1} = Q^{\mathsf{T}}.$$
Orthonormal bases simplify projection, least squares, coordinate changes, and norm computations. They preserve geometry and improve numerical stability.