An inner product is a rule that assigns a scalar to each pair of vectors in a way that generalizes the geometric notions of length, angle, and orthogonality. An inner product turns a vector space into a geometric space: once an inner product is available, one can define norms, distances, orthogonal projections, orthonormal bases, and many geometric decompositions.
For real vector spaces, the inner product is usually written
⟨u,v⟩.
In Rn, the standard inner product is the dot product
⟨u,v⟩ = uᵀv.
30.1 Definition
Let V be a vector space over R. An inner product on V is a function
⟨⋅,⋅⟩:V×V→R
satisfying the following properties for all u,v,w∈V and all scalars c∈R.
Symmetry
⟨u,v⟩=⟨v,u⟩.
Linearity in the First Variable
⟨u+v,w⟩=⟨u,w⟩+⟨v,w⟩,⟨cu,v⟩=c⟨u,v⟩.
Positive Definiteness
⟨v,v⟩≥0,
with equality only when
v=0.
A vector space together with an inner product is called an inner product space.
30.2 Complex Inner Products
For complex vector spaces, the definition changes slightly.
An inner product
⟨⋅,⋅⟩:V×V→C
must satisfy:
Conjugate Symmetry
⟨u,v⟩ = ⟨v,u⟩*,
where * denotes complex conjugation.
Sesquilinearity
⟨u+v,w⟩=⟨u,w⟩+⟨v,w⟩,⟨cu,v⟩=c⟨u,v⟩,
and
⟨u,cv⟩ = c*⟨u,v⟩.
Positive Definiteness
⟨v,v⟩≥0,
with equality only for v=0.
With the convention above, complex inner products are linear in the first variable and conjugate-linear in the second.
30.3 Standard Inner Product on Rn
For vectors
u = (u1, u2, …, un)ᵀ,  v = (v1, v2, …, vn)ᵀ,
the standard inner product is
⟨u,v⟩=u1v1+⋯+unvn.
Equivalently,
⟨u,v⟩ = uᵀv.
This is the ordinary dot product.
For example,
⟨(1,2,3), (4,−1,2)⟩ = 1·4 + 2·(−1) + 3·2 = 8.
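As a quick numerical sketch (NumPy is an illustration here, not part of the text), the example above is a single dot product:

```python
import numpy as np

u = np.array([1, 2, 3])
v = np.array([4, -1, 2])

# Standard inner product on R^n: u1*v1 + ... + un*vn
ip = np.dot(u, v)
print(ip)  # 8
```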
30.4 Norms from Inner Products
The length, or norm, of a vector is defined by
∥v∥ = √⟨v,v⟩.
For the standard inner product on Rn,
∥v∥ = √(v1² + ⋯ + vn²).
This is the Euclidean length.
For example,
∥(3,4)∥ = √(3² + 4²) = 5.
The norm measures magnitude.
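A sketch of the same computation in NumPy (an assumption for illustration), showing that the induced norm agrees with NumPy's built-in Euclidean norm:

```python
import numpy as np

v = np.array([3.0, 4.0])

# Norm induced by the inner product: ||v|| = sqrt(<v, v>)
norm_from_ip = np.sqrt(np.dot(v, v))
print(norm_from_ip)       # 5.0
print(np.linalg.norm(v))  # 5.0, computed by NumPy directly
```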
30.5 Distance
The distance between vectors u and v is
d(u,v)=∥u−v∥.
Thus the geometry of the space comes entirely from the inner product.
In Rn,
d(u,v) = √((u1−v1)² + ⋯ + (un−vn)²).
This is the ordinary Euclidean distance formula.
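A minimal sketch (NumPy assumed for illustration): distance is simply the norm of the difference.

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([4.0, 6.0])

# d(u, v) = ||u - v||
d = np.linalg.norm(u - v)
print(d)  # 5.0, since u - v = (-3, -4)
```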
30.6 Orthogonality
Vectors u and v are orthogonal if
⟨u,v⟩=0.
Orthogonality generalizes perpendicularity.
For example,
u = (1,2),  v = (2,−1)
satisfy
⟨u,v⟩=1⋅2+2(−1)=0.
Thus u and v are orthogonal.
30.7 Orthogonal Sets
A set of vectors
{v1,…,vk}
is orthogonal if
⟨vi,vj⟩=0
whenever
i ≠ j.
If additionally
∥vi∥=1
for every i, then the set is orthonormal.
Orthogonal sets are automatically linearly independent if none of the vectors is zero.
Indeed, suppose
c1v1+⋯+ckvk=0.
Take the inner product with vi:
ci⟨vi,vi⟩=0.
Since
⟨vi,vi⟩>0,
we get
ci=0.
Thus all coefficients vanish.
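The argument above can be checked numerically. In this sketch (a hypothetical example using NumPy), the Gram matrix of an orthogonal set is diagonal, and the set has full rank:

```python
import numpy as np

# An orthogonal set in R^3; rows are v1, v2, v3, none of them zero
V = np.array([[1.0,  1.0, 0.0],
              [1.0, -1.0, 0.0],
              [0.0,  0.0, 2.0]])

G = V @ V.T  # G[i, j] = <v_i, v_j>
print(np.allclose(G, np.diag(np.diag(G))))  # True: off-diagonal entries vanish
print(np.linalg.matrix_rank(V))             # 3: the vectors are independent
```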
30.8 Pythagorean Theorem
If u and v are orthogonal, then
∥u+v∥2=∥u∥2+∥v∥2.
Proof:
∥u+v∥2=⟨u+v,u+v⟩.
Expand:
=⟨u,u⟩+⟨u,v⟩+⟨v,u⟩+⟨v,v⟩.
Orthogonality gives
⟨u,v⟩=0.
Thus
∥u+v∥2=∥u∥2+∥v∥2.
This generalizes the classical Pythagorean theorem.
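A quick numerical sanity check of the identity (NumPy assumed, vectors chosen for illustration):

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([2.0, -1.0])  # orthogonal to u, since 1*2 + 2*(-1) = 0

lhs = np.linalg.norm(u + v) ** 2
rhs = np.linalg.norm(u) ** 2 + np.linalg.norm(v) ** 2
print(np.isclose(lhs, rhs))  # True: 10 = 5 + 5
```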
30.9 Cauchy-Schwarz Inequality
For all vectors u,v,
∣⟨u,v⟩∣≤∥u∥∥v∥.
Equality holds exactly when u and v are linearly dependent.
This inequality is fundamental. It bounds the size of the inner product in terms of vector lengths.
In Rn, it becomes
∣uᵀv∣ ≤ √(uᵀu) √(vᵀv).
30.10 Triangle Inequality
For all vectors u,v,
∥u+v∥≤∥u∥+∥v∥.
This follows from the Cauchy-Schwarz inequality.
It expresses the geometric fact that the direct path is shortest.
Norms derived from inner products always satisfy the triangle inequality.
30.11 Angle Between Vectors
For nonzero vectors u,v, define the angle θ by
cos θ = ⟨u,v⟩ / (∥u∥∥v∥).
The Cauchy-Schwarz inequality ensures that the fraction lies between −1 and 1.
If
⟨u,v⟩=0,
then
cosθ=0,
so
θ = π/2.
Thus orthogonality corresponds to a right angle.
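A small sketch of the angle formula (NumPy assumed; the clip guards against roundoff pushing the cosine slightly outside [−1, 1]):

```python
import numpy as np

u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])

cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
print(np.degrees(theta))  # ≈ 45 degrees
```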
30.12 Inner Products on Function Spaces
Inner products are not limited to coordinate vectors.
For continuous functions on an interval [a,b], define
⟨f,g⟩ = ∫ₐᵇ f(x) g(x) dx.
This is an inner product on suitable function spaces.
For example, on [−1,1],
⟨1, x⟩ = ∫₋₁¹ x dx = 0.
Thus the functions 1 and x are orthogonal.
Function-space inner products are central in Fourier analysis, differential equations, and approximation theory.
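The function-space inner product can be approximated numerically. This sketch (NumPy assumed, with a hand-rolled composite trapezoid rule on a fine grid) confirms that 1 and x are orthogonal on [−1, 1]:

```python
import numpy as np

# <f, g> = integral of f(x) g(x) over [-1, 1], approximated on a grid
x = np.linspace(-1.0, 1.0, 100001)
dx = x[1] - x[0]

def ip(fvals, gvals):
    # composite trapezoid rule: half weight on the endpoints
    h = fvals * gvals
    return dx * (h[0] / 2 + h[1:-1].sum() + h[-1] / 2)

f = np.ones_like(x)  # the constant function 1
g = x                # the identity function x

print(abs(ip(f, g)) < 1e-10)  # True: 1 and x are orthogonal on [-1, 1]
print(ip(f, f))               # ≈ 2, the length of the interval
```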
30.13 Weighted Inner Products
Different inner products can exist on the same vector space.
For example, on Rn,
⟨u,v⟩A=uTAv,
where A is a symmetric positive definite matrix.
If
A = [ 2 0 ]
    [ 0 1 ],
then
⟨(x1,x2), (y1,y2)⟩A = 2x1y1 + x2y2.
This changes the geometry of the space. Lengths and angles are measured differently.
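A sketch of the weighted inner product in NumPy (the matrix A and test vectors are illustrative assumptions):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 1.0]])  # symmetric positive definite

def ip_A(u, v):
    # <u, v>_A = u^T A v
    return u @ A @ v

u = np.array([1.0, 2.0])
v = np.array([3.0, 4.0])
print(ip_A(u, v))  # 2*1*3 + 2*4 = 14.0
```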
30.14 Orthogonal Complements
Let U⊆V. The orthogonal complement of U is
U⊥={v∈V:⟨v,u⟩=0 for all u∈U}.
It is a subspace of V.
For example, if
U = span{(1,1)} ⊆ R²,
then
U⊥ = span{(1,−1)}.
The orthogonal complement contains all vectors perpendicular to the subspace.
30.15 Orthogonal Decomposition
If U is a finite-dimensional subspace of an inner product space V, then every vector v∈V can be written uniquely as
v=u+w,
where
u∈U,w∈U⊥.
Thus
V=U⊕U⊥.
This decomposition separates a vector into a part inside the subspace and a part perpendicular to it.
30.16 Projection onto a Vector
Let u ≠ 0. The orthogonal projection of v onto u is
proju(v) = (⟨v,u⟩ / ⟨u,u⟩) u.
This is the component of v in the direction of u.
For example, let
u = (1,1),  v = (3,1).
Then
⟨v,u⟩=4,⟨u,u⟩=2.
Therefore
proju(v) = 2·(1,1) = (2,2).
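The worked example can be reproduced in a few lines (NumPy assumed; `proj` is a hypothetical helper name):

```python
import numpy as np

def proj(v, u):
    # proj_u(v) = (<v, u> / <u, u>) u, for u != 0
    return (np.dot(v, u) / np.dot(u, u)) * u

u = np.array([1.0, 1.0])
v = np.array([3.0, 1.0])
p = proj(v, u)
print(p)                 # [2. 2.]
print(np.dot(v - p, u))  # 0.0: the error is orthogonal to u
```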
30.17 Projection onto a Subspace
If
{u1,…,uk}
is an orthonormal basis of a subspace U, then the projection of v onto U is
projU(v)=⟨v,u1⟩u1+⋯+⟨v,uk⟩uk.
The error vector
v−projU(v)
lies in U⊥.
Orthogonal projection is the basis of least-squares approximation.
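A sketch of projection onto a subspace (NumPy assumed; the orthonormal basis of a coordinate plane in R³ is a deliberately simple illustrative choice):

```python
import numpy as np

# Orthonormal basis of the xy-plane in R^3
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 0.0])

def proj_U(v, basis):
    # Sum of components <v, u_i> u_i for an orthonormal basis
    return sum(np.dot(v, u) * u for u in basis)

v = np.array([3.0, -2.0, 5.0])
p = proj_U(v, [u1, u2])
print(p)      # [ 3. -2.  0.]
print(v - p)  # [0. 0. 5.], the error vector, which lies in U-perp
```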
30.18 Orthonormal Bases
A basis
{u1,…,un}
is orthonormal if
⟨ui,uj⟩=δij.
In an orthonormal basis, coordinates are especially simple.
If
v=c1u1+⋯+cnun,
then
ci=⟨v,ui⟩.
Thus the coordinates are obtained directly by inner products.
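As a sketch (NumPy assumed; the basis is a 45-degree rotation of the standard one), coordinates in an orthonormal basis really are just inner products:

```python
import numpy as np

# An orthonormal basis of R^2 obtained by rotating by 45 degrees
u1 = np.array([1.0, 1.0]) / np.sqrt(2)
u2 = np.array([-1.0, 1.0]) / np.sqrt(2)

v = np.array([2.0, 0.0])

# Coordinates are read off by inner products: c_i = <v, u_i>
c1, c2 = np.dot(v, u1), np.dot(v, u2)
print(np.allclose(c1 * u1 + c2 * u2, v))  # True: the expansion recovers v
```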
30.19 Gram Matrix
Let
v1,…,vk
be vectors in an inner product space. The Gram matrix is
G=(⟨vi,vj⟩).
It records all pairwise inner products.
The Gram matrix is symmetric in the real case and Hermitian in the complex case.
The vectors are linearly independent exactly when the Gram matrix is invertible.
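This invertibility criterion is easy to demonstrate numerically (NumPy assumed; `gram` is a hypothetical helper, and both vector sets are illustrative):

```python
import numpy as np

def gram(vectors):
    V = np.array(vectors, dtype=float)  # rows are the vectors
    return V @ V.T                      # G[i, j] = <v_i, v_j>

independent = gram([[1, 0], [1, 1]])
dependent = gram([[1, 2], [2, 4]])      # second vector is twice the first

print(np.linalg.det(independent))  # nonzero: vectors independent
print(np.linalg.det(dependent))    # 0 (up to roundoff): vectors dependent
```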
30.20 Summary
An inner product gives a vector space geometric structure by defining lengths, angles, and orthogonality.
The key ideas are:
Inner product: scalar-valued pairing, bilinear in the real case, sesquilinear in the complex case
Norm: ∥v∥ = √⟨v,v⟩
Distance: d(u,v) = ∥u−v∥
Orthogonality: ⟨u,v⟩ = 0
Orthonormal set: orthogonal vectors of unit length
Cauchy-Schwarz inequality: ∣⟨u,v⟩∣ ≤ ∥u∥∥v∥
Orthogonal complement: U⊥, the vectors orthogonal to every vector of U
Projection: closest vector in a subspace
Orthonormal basis: basis with ⟨ui,uj⟩ = δij
Gram matrix: matrix of pairwise inner products
Inner products connect algebra and geometry. They allow vector spaces to support geometric reasoning about perpendicularity, approximation, decomposition, and distance.