
Chapter 30. Inner Products

An inner product is a rule that assigns a scalar to a pair of vectors in a way that generalizes the geometric notions of length, angle, and orthogonality. It turns a vector space into a geometric space: once an inner product is available, one can define norms, distances, orthogonal projections, orthonormal bases, and many geometric decompositions.

For real vector spaces, the inner product is usually written

$$\langle u,v\rangle.$$

In $\mathbb{R}^n$, the standard inner product is the dot product

$$\langle u,v\rangle = u^T v.$$

30.1 Definition

Let $V$ be a vector space over $\mathbb{R}$. An inner product on $V$ is a function

$$\langle \cdot,\cdot\rangle : V\times V\to\mathbb{R}$$

satisfying the following properties for all $u,v,w\in V$ and all scalars $c\in\mathbb{R}$.

Symmetry

$$\langle u,v\rangle=\langle v,u\rangle.$$

Linearity in the First Variable

$$\langle u+v,w\rangle = \langle u,w\rangle+\langle v,w\rangle, \qquad \langle cu,v\rangle = c\langle u,v\rangle.$$

Positive Definiteness

$$\langle v,v\rangle\geq 0,$$

with equality only when

$$v=0.$$

A vector space together with an inner product is called an inner product space.

30.2 Complex Inner Products

For complex vector spaces, the definition changes slightly.

An inner product

$$\langle \cdot,\cdot\rangle : V\times V\to\mathbb{C}$$

must satisfy:

Conjugate Symmetry

$$\langle u,v\rangle=\overline{\langle v,u\rangle}.$$

Sesquilinearity

$$\langle u+v,w\rangle = \langle u,w\rangle+\langle v,w\rangle, \qquad \langle cu,v\rangle = c\langle u,v\rangle,$$

and

$$\langle u,cv\rangle = \overline{c}\,\langle u,v\rangle.$$

Positive Definiteness

$$\langle v,v\rangle\geq 0,$$

with equality only for $v=0$.

With this convention, the inner product is linear in the first variable and conjugate-linear in the second; some texts adopt the opposite convention.

30.3 Standard Inner Product on $\mathbb{R}^n$

For vectors

$$u=\begin{bmatrix} u_1\\ u_2\\ \vdots\\ u_n \end{bmatrix}, \qquad v=\begin{bmatrix} v_1\\ v_2\\ \vdots\\ v_n \end{bmatrix},$$

the standard inner product is

$$\langle u,v\rangle = u_1v_1+\cdots+u_nv_n.$$

Equivalently,

$$\langle u,v\rangle=u^Tv.$$

This is the ordinary dot product.

For example,

$$\left\langle \begin{bmatrix} 1\\ 2\\ 3 \end{bmatrix}, \begin{bmatrix} 4\\ -1\\ 2 \end{bmatrix} \right\rangle = 1\cdot 4+2\cdot(-1)+3\cdot 2 = 8.$$
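
As a quick numerical check of this example, here is a minimal sketch using NumPy (the choice of library is ours, not part of the chapter):

```python
import numpy as np

u = np.array([1, 2, 3])
v = np.array([4, -1, 2])

# Standard inner product <u, v> = u^T v, computed three equivalent ways.
print(np.dot(u, v))   # 8
print(u @ v)          # 8
print(np.sum(u * v))  # 8
```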

30.4 Norms from Inner Products

The length, or norm, of a vector is defined by

$$\|v\|=\sqrt{\langle v,v\rangle}.$$

For the standard inner product on $\mathbb{R}^n$,

$$\|v\| = \sqrt{v_1^2+\cdots+v_n^2}.$$

This is the Euclidean length.

For example,

$$\left\| \begin{bmatrix} 3\\ 4 \end{bmatrix} \right\| = \sqrt{3^2+4^2} = 5.$$

The norm measures magnitude.

30.5 Distance

The distance between vectors $u$ and $v$ is

$$d(u,v)=\|u-v\|.$$

Thus the geometry of the space comes entirely from the inner product.

In $\mathbb{R}^n$,

$$d(u,v) = \sqrt{(u_1-v_1)^2+\cdots+(u_n-v_n)^2}.$$

This is the ordinary Euclidean distance formula.
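
The norm and distance formulas of this and the previous subsection are easy to spot-check numerically; a minimal NumPy sketch with arbitrarily chosen vectors:

```python
import numpy as np

v = np.array([3.0, 4.0])
print(np.sqrt(v @ v))       # 5.0, the norm computed from the inner product
print(np.linalg.norm(v))    # 5.0, NumPy's built-in Euclidean norm

u = np.array([1.0, 1.0])
w = np.array([4.0, 5.0])
# Distance d(u, w) = ||u - w||; here u - w = [-3, -4], so the distance is 5.
print(np.linalg.norm(u - w))
print(np.sqrt(np.sum((u - w) ** 2)))  # same value, straight from the formula
```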

30.6 Orthogonality

Vectors $u$ and $v$ are orthogonal if

$$\langle u,v\rangle=0.$$

Orthogonality generalizes perpendicularity.

For example,

$$u=\begin{bmatrix} 1\\ 2 \end{bmatrix}, \qquad v=\begin{bmatrix} 2\\ -1 \end{bmatrix}$$

satisfy

$$\langle u,v\rangle = 1\cdot 2+2\cdot(-1) = 0.$$

Thus $u$ and $v$ are orthogonal.

30.7 Orthogonal Sets

A set of vectors

$$\{v_1,\ldots,v_k\}$$

is orthogonal if

$$\langle v_i,v_j\rangle=0$$

whenever

$$i\neq j.$$

If additionally

$$\|v_i\|=1$$

for every $i$, then the set is orthonormal.

Orthogonal sets are automatically linearly independent if none of the vectors is zero.

Indeed, suppose

$$c_1v_1+\cdots+c_kv_k=0.$$

Take the inner product of both sides with $v_i$. By orthogonality, every term with $j\neq i$ vanishes, leaving

$$c_i\langle v_i,v_i\rangle=0.$$

Since

$$\langle v_i,v_i\rangle>0,$$

we get

$$c_i=0.$$

Thus all coefficients vanish.
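
The same argument can be verified numerically: for an orthogonal set of nonzero vectors, the matrix of pairwise inner products is diagonal with positive diagonal entries, so the set has full rank. A small NumPy sketch with an arbitrarily chosen orthogonal set in $\mathbb{R}^3$:

```python
import numpy as np

# An orthogonal set in R^3, stored as the columns of V; none of the vectors is zero.
V = np.column_stack([[1.0, 1.0, 0.0],
                     [1.0, -1.0, 0.0],
                     [0.0, 0.0, 1.0]])

G = V.T @ V                        # pairwise inner products <v_i, v_j>
print(G)                           # diagonal matrix: off-diagonal entries are 0
print(np.linalg.matrix_rank(V))    # 3: the columns are linearly independent
```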

30.8 Pythagorean Theorem

If $u$ and $v$ are orthogonal, then

$$\|u+v\|^2=\|u\|^2+\|v\|^2.$$

Proof:

$$\|u+v\|^2 = \langle u+v,u+v\rangle.$$

Expand:

$$= \langle u,u\rangle + \langle u,v\rangle + \langle v,u\rangle + \langle v,v\rangle.$$

Orthogonality, together with symmetry, gives

$$\langle u,v\rangle=\langle v,u\rangle=0.$$

Thus

$$\|u+v\|^2 = \|u\|^2+\|v\|^2.$$

This generalizes the classical Pythagorean theorem.
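
A quick numerical confirmation of the identity, reusing the orthogonal pair from Section 30.6:

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([2.0, -1.0])    # orthogonal to u

lhs = np.linalg.norm(u + v) ** 2
rhs = np.linalg.norm(u) ** 2 + np.linalg.norm(v) ** 2
print(lhs, rhs, np.isclose(lhs, rhs))   # both sides are 10, so the check prints True
```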

30.9 Cauchy-Schwarz Inequality

For all vectors $u,v$,

$$|\langle u,v\rangle| \leq \|u\|\,\|v\|.$$

Equality holds exactly when $u$ and $v$ are linearly dependent.

This inequality is fundamental. It bounds the size of the inner product in terms of vector lengths.

In $\mathbb{R}^n$, it becomes

$$|u^Tv| \leq \sqrt{u^Tu}\,\sqrt{v^Tv}.$$
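
The inequality can be spot-checked on random vectors; a minimal NumPy sketch (the seed and dimension are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
for _ in range(5):
    u = rng.standard_normal(4)
    v = rng.standard_normal(4)
    lhs = abs(u @ v)
    rhs = np.linalg.norm(u) * np.linalg.norm(v)
    print(lhs <= rhs + 1e-12)   # True in every trial (tolerance guards rounding)
```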

30.10 Triangle Inequality

For all vectors $u,v$,

$$\|u+v\| \leq \|u\|+\|v\|.$$

This follows from the Cauchy-Schwarz inequality.

It expresses the geometric fact that the direct path is shortest.

Norms derived from inner products always satisfy the triangle inequality.

30.11 Angle Between Vectors

For nonzero vectors $u,v$, define the angle $\theta$ by

$$\cos\theta = \frac{\langle u,v\rangle}{\|u\|\,\|v\|}.$$

The Cauchy-Schwarz inequality ensures that the fraction lies between $-1$ and $1$.

If

$$\langle u,v\rangle=0,$$

then

$$\cos\theta=0,$$

so

$$\theta=\frac{\pi}{2}.$$

Thus orthogonality corresponds to a right angle.
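
A small NumPy sketch of the angle formula; the helper name angle is ours, and the orthogonal pair from Section 30.6 comes out at $\pi/2$:

```python
import numpy as np

def angle(u, v):
    # cos(theta) = <u, v> / (||u|| ||v||); clip guards against tiny rounding errors
    c = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(c, -1.0, 1.0))

u = np.array([1.0, 2.0])
v = np.array([2.0, -1.0])
print(angle(u, v), np.pi / 2)   # both approximately 1.5708, a right angle
```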

30.12 Inner Products on Function Spaces

Inner products are not limited to coordinate vectors.

For continuous functions on an interval $[a,b]$, define

$$\langle f,g\rangle = \int_a^b f(x)g(x)\,dx.$$

This is an inner product on suitable function spaces.

For example, on $[-1,1]$,

$$\langle 1,x\rangle = \int_{-1}^1 x\,dx = 0.$$

Thus the functions $1$ and $x$ are orthogonal.

Function-space inner products are central in Fourier analysis, differential equations, and approximation theory.
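
A numerical sketch of this function-space inner product, assuming SciPy's quad routine is available for the integral; the helper name inner is ours:

```python
from scipy.integrate import quad

def inner(f, g, a=-1.0, b=1.0):
    # <f, g> = integral of f(x) g(x) over [a, b]
    value, _ = quad(lambda x: f(x) * g(x), a, b)
    return value

print(inner(lambda x: 1.0, lambda x: x))   # approximately 0: 1 and x are orthogonal
print(inner(lambda x: x, lambda x: x))     # approximately 0.6667, i.e. <x, x> = 2/3
```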

30.13 Weighted Inner Products

Different inner products can exist on the same vector space.

For example, on $\mathbb{R}^n$,

$$\langle u,v\rangle_A=u^TAv,$$

where $A$ is a symmetric positive definite matrix.

If

$$A=\begin{bmatrix} 2&0\\ 0&1 \end{bmatrix},$$

then

$$\left\langle \begin{bmatrix} x_1\\ x_2 \end{bmatrix}, \begin{bmatrix} y_1\\ y_2 \end{bmatrix} \right\rangle_A = 2x_1y_1+x_2y_2.$$

This changes the geometry of the space. Lengths and angles are measured differently.
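
A minimal NumPy sketch of the weighted inner product with the matrix $A$ above (the helper name inner_A and the test vectors are ours):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 1.0]])    # symmetric positive definite weight matrix

def inner_A(u, v):
    # <u, v>_A = u^T A v
    return u @ A @ v

x = np.array([1.0, 3.0])
y = np.array([2.0, -1.0])
print(inner_A(x, y))             # 2*1*2 + 3*(-1) = 1.0
print(np.sqrt(inner_A(x, x)))    # length of x in the A-geometry: sqrt(11)
```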

30.14 Orthogonal Complements

Let $U\subseteq V$. The orthogonal complement of $U$ is

$$U^\perp = \{\,v\in V : \langle v,u\rangle=0 \text{ for all } u\in U\,\}.$$

It is a subspace of $V$.

For example, if

$$U= \operatorname{span}\left( \begin{bmatrix} 1\\ 1 \end{bmatrix} \right) \subseteq\mathbb{R}^2,$$

then

$$U^\perp = \operatorname{span}\left( \begin{bmatrix} 1\\ -1 \end{bmatrix} \right).$$

The orthogonal complement contains all vectors perpendicular to the subspace.
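
For this example, the complement can also be computed numerically as the null space of a matrix whose rows span $U$; a sketch assuming SciPy's null_space helper is available:

```python
import numpy as np
from scipy.linalg import null_space

B = np.array([[1.0, 1.0]])   # its single row spans U = span([1, 1]) in R^2

W = null_space(B)            # orthonormal basis of U-perp, stored as columns
print(W)                     # one column, proportional to [1, -1]
print(B @ W)                 # approximately 0: the column is orthogonal to U
```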

30.15 Orthogonal Decomposition

If $U$ is a finite-dimensional subspace of an inner product space $V$, then every vector $v\in V$ can be written uniquely as

$$v=u+w,$$

where

$$u\in U, \qquad w\in U^\perp.$$

Thus

$$V=U\oplus U^\perp.$$

This decomposition separates a vector into a part inside the subspace and a part perpendicular to it.

30.16 Projection onto a Vector

Let $u\neq 0$. The orthogonal projection of $v$ onto $u$ is

$$\operatorname{proj}_u(v) = \frac{\langle v,u\rangle}{\langle u,u\rangle}\,u.$$

This is the component of $v$ in the direction of $u$.

For example, let

$$u=\begin{bmatrix} 1\\ 1 \end{bmatrix}, \qquad v=\begin{bmatrix} 3\\ 1 \end{bmatrix}.$$

Then

$$\langle v,u\rangle=4, \qquad \langle u,u\rangle=2.$$

Therefore

$$\operatorname{proj}_u(v) = 2\begin{bmatrix} 1\\ 1 \end{bmatrix} = \begin{bmatrix} 2\\ 2 \end{bmatrix}.$$
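
The worked example translates directly into a short NumPy function (the helper name proj is ours):

```python
import numpy as np

def proj(v, u):
    # Orthogonal projection of v onto the line spanned by u (u must be nonzero).
    return (v @ u) / (u @ u) * u

u = np.array([1.0, 1.0])
v = np.array([3.0, 1.0])
p = proj(v, u)
print(p)              # [2. 2.]
print((v - p) @ u)    # 0.0: the residual is orthogonal to u
```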

30.17 Projection onto a Subspace

If

$$\{u_1,\ldots,u_k\}$$

is an orthonormal basis of a subspace $U$, then the projection of $v$ onto $U$ is

$$\operatorname{proj}_U(v) = \langle v,u_1\rangle u_1 +\cdots+ \langle v,u_k\rangle u_k.$$

The error vector

$$v-\operatorname{proj}_U(v)$$

lies in $U^\perp$.

Orthogonal projection is the basis of least-squares approximation.
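
A minimal NumPy sketch of projection onto a subspace whose orthonormal basis is stored as the columns of a matrix $Q$ (the names and the example basis are ours):

```python
import numpy as np

def proj_subspace(v, Q):
    # Columns of Q are an orthonormal basis u_1, ..., u_k of U; the projection
    # sum_i <v, u_i> u_i is Q (Q^T v) in matrix form.
    return Q @ (Q.T @ v)

# Orthonormal basis of the xy-plane inside R^3, chosen for illustration.
Q = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

v = np.array([3.0, -2.0, 5.0])
p = proj_subspace(v, Q)
print(p)                # [ 3. -2.  0.]
print(Q.T @ (v - p))    # [0. 0.]: the error vector lies in the orthogonal complement
```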

30.18 Orthonormal Bases

A basis

$$\{u_1,\ldots,u_n\}$$

is orthonormal if

$$\langle u_i,u_j\rangle=\delta_{ij}.$$

In an orthonormal basis, coordinates are especially simple.

If

$$v=c_1u_1+\cdots+c_nu_n,$$

then

$$c_i=\langle v,u_i\rangle.$$

Thus the coordinates are obtained directly by inner products.
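
A quick NumPy check that the coordinates really are recovered by inner products, using an arbitrarily chosen orthonormal basis of $\mathbb{R}^2$:

```python
import numpy as np

# An orthonormal basis of R^2: rotate the standard basis by 0.3 radians.
u1 = np.array([np.cos(0.3), np.sin(0.3)])
u2 = np.array([-np.sin(0.3), np.cos(0.3)])

v = 2.0 * u1 - 5.0 * u2      # build v with known coordinates 2 and -5
print(v @ u1, v @ u2)        # approximately 2 and -5: the coordinates are inner products
```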

30.19 Gram Matrix

Let

$$v_1,\ldots,v_k$$

be vectors in an inner product space. The Gram matrix is

$$G=\bigl(\langle v_i,v_j\rangle\bigr).$$

It records all pairwise inner products.

The Gram matrix is symmetric in the real case and Hermitian in the complex case.

The vectors are linearly independent exactly when the Gram matrix is invertible.
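
A minimal NumPy sketch of the Gram matrix and the invertibility test (the example vectors are arbitrary):

```python
import numpy as np

# Three vectors in R^3, stored as the columns of V, chosen for illustration.
V = np.column_stack([[1.0, 0.0, 1.0],
                     [1.0, 1.0, 0.0],
                     [0.0, 1.0, 1.0]])

G = V.T @ V                  # Gram matrix G_ij = <v_i, v_j>
print(G)
print(np.linalg.det(G))      # about 4.0, nonzero: the vectors are independent

# A dependent pair gives a singular Gram matrix.
W = np.column_stack([[1.0, 2.0], [2.0, 4.0]])
print(np.isclose(np.linalg.det(W.T @ W), 0.0))   # True
```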

30.20 Summary

An inner product gives a vector space geometric structure by defining lengths, angles, and orthogonality.

The key ideas are:

Inner product: scalar-valued pairing, bilinear in the real case (sesquilinear in the complex case)
Norm: $\|v\|=\sqrt{\langle v,v\rangle}$
Distance: $d(u,v)=\|u-v\|$
Orthogonality: $\langle u,v\rangle=0$
Orthonormal set: orthogonal vectors of unit length
Cauchy-Schwarz inequality: $|\langle u,v\rangle|\leq\|u\|\,\|v\|$
Orthogonal complement: $U^\perp$, the set of vectors orthogonal to every vector in $U$
Projection: closest vector in a subspace
Orthonormal basis: basis with $\langle u_i,u_j\rangle=\delta_{ij}$
Gram matrix: matrix of pairwise inner products

Inner products connect algebra and geometry. They allow vector spaces to support geometric reasoning about perpendicularity, approximation, decomposition, and distance.