# Chapter 30. Inner Products

An inner product is a rule that assigns a scalar to a pair of vectors in a way that generalizes the geometric notions of length, angle, and orthogonality. An inner product turns a vector space into a geometric space. Once an inner product is available, one can define norms, distances, orthogonal projections, orthonormal bases, and many geometric decompositions. ([en.wikipedia.org](https://en.wikipedia.org/wiki/Inner_product_space))

For real vector spaces, the inner product is usually written

$$
\langle u,v\rangle.
$$

In \(\mathbb{R}^n\), the standard inner product is the dot product

$$
\langle u,v\rangle=u^Tv.
$$

## 30.1 Definition

Let \(V\) be a vector space over \(\mathbb{R}\). An inner product on \(V\) is a function

$$
\langle \cdot,\cdot\rangle:V\times V\to\mathbb{R}
$$

satisfying the following properties for all \(u,v,w\in V\) and all scalars \(c\in\mathbb{R}\).

### Symmetry

$$
\langle u,v\rangle=\langle v,u\rangle.
$$

### Linearity in the First Variable

$$
\langle u+v,w\rangle =
\langle u,w\rangle+\langle v,w\rangle,
$$

$$
\langle cu,v\rangle =
c\langle u,v\rangle.
$$

### Positive Definiteness

$$
\langle v,v\rangle\geq0,
$$

with equality only when

$$
v=0.
$$

A vector space together with an inner product is called an inner product space. ([en.wikipedia.org](https://en.wikipedia.org/wiki/Inner_product_space))

## 30.2 Complex Inner Products

For complex vector spaces, the definition changes slightly.

An inner product

$$
\langle \cdot,\cdot\rangle:V\times V\to\mathbb{C}
$$

must satisfy:

### Conjugate Symmetry

$$
\langle u,v\rangle=\overline{\langle v,u\rangle}.
$$

### Sesquilinearity

$$
\langle u+v,w\rangle =
\langle u,w\rangle+\langle v,w\rangle,
$$

$$
\langle cu,v\rangle =
c\langle u,v\rangle,
$$

and

$$
\langle u,cv\rangle =
\overline{c}\langle u,v\rangle.
$$

### Positive Definiteness

$$
\langle v,v\rangle\geq0,
$$

with equality only for \(v=0\).

With this convention, a complex inner product is linear in the first variable and conjugate-linear in the second. (Many texts, especially in physics, adopt the opposite convention.)
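As a numerical sketch, the conjugate-symmetry and positivity properties can be checked with NumPy; the helper `cip` and the sample vectors below are illustrative, not from the text:

```python
import numpy as np

# Complex inner product in the chapter's convention:
# linear in the first slot, conjugate-linear in the second.
def cip(u, v):
    return np.sum(u * np.conj(v))

u = np.array([1 + 1j, 2j])
v = np.array([1 - 1j, 3 + 0j])

# Conjugate symmetry: <u,v> equals the conjugate of <v,u>.
print(np.isclose(cip(u, v), np.conj(cip(v, u))))  # True
# <v,v> is real and nonnegative: (1-1j)(1+1j) + 3*3 = 11.
print(cip(v, v))
```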

## 30.3 Standard Inner Product on \(\mathbb{R}^n\)

For vectors

$$
u=
\begin{bmatrix}
u_1\\
u_2\\
\vdots\\
u_n
\end{bmatrix},
\qquad
v=
\begin{bmatrix}
v_1\\
v_2\\
\vdots\\
v_n
\end{bmatrix},
$$

the standard inner product is

$$
\langle u,v\rangle =
u_1v_1+\cdots+u_nv_n.
$$

Equivalently,

$$
\langle u,v\rangle=u^Tv.
$$

This is the ordinary dot product.

For example,

$$
\left\langle
\begin{bmatrix}
1\\
2\\
3
\end{bmatrix},
\begin{bmatrix}
4\\
-1\\
2
\end{bmatrix}
\right\rangle =
1\cdot4+2(-1)+3\cdot2 =
8.
$$
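The same computation can be reproduced in NumPy (an illustrative choice of library, not part of the text):

```python
import numpy as np

u = np.array([1, 2, 3])
v = np.array([4, -1, 2])

# <u, v> = u^T v, the ordinary dot product.
ip = u @ v
print(ip)  # 8
```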

## 30.4 Norms from Inner Products

The length, or norm, of a vector is defined by

$$
\|v\|=\sqrt{\langle v,v\rangle}.
$$

For the standard inner product on \(\mathbb{R}^n\),

$$
\|v\| =
\sqrt{v_1^2+\cdots+v_n^2}.
$$

This is the Euclidean length.

For example,

$$
\left\|
\begin{bmatrix}
3\\
4
\end{bmatrix}
\right\| =
\sqrt{3^2+4^2} =
5.
$$

The norm measures magnitude.
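A quick check of the worked example, taking the norm directly from the inner product:

```python
import numpy as np

v = np.array([3.0, 4.0])

# ||v|| = sqrt(<v, v>); np.linalg.norm computes the same value.
norm = np.sqrt(v @ v)
print(norm)  # 5.0
```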

## 30.5 Distance

The distance between vectors \(u\) and \(v\) is

$$
d(u,v)=\|u-v\|.
$$

Thus the geometry of the space comes entirely from the inner product.

In \(\mathbb{R}^n\),

$$
d(u,v) =
\sqrt{(u_1-v_1)^2+\cdots+(u_n-v_n)^2}.
$$

This is the ordinary Euclidean distance formula.
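A small example of the distance formula (the vectors are illustrative):

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([4.0, 6.0])

# d(u, v) = ||u - v||
d = np.linalg.norm(u - v)
print(d)  # 5.0
```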

## 30.6 Orthogonality

Vectors \(u\) and \(v\) are orthogonal if

$$
\langle u,v\rangle=0.
$$

Orthogonality generalizes perpendicularity.

For example,

$$
u=
\begin{bmatrix}
1\\
2
\end{bmatrix},
\qquad
v=
\begin{bmatrix}
2\\
-1
\end{bmatrix}
$$

satisfy

$$
\langle u,v\rangle =
1\cdot2+2(-1) =
0.
$$

Thus \(u\) and \(v\) are orthogonal.
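Verifying the example numerically:

```python
import numpy as np

u = np.array([1, 2])
v = np.array([2, -1])

# Vectors are orthogonal exactly when the inner product vanishes.
print(u @ v)  # 0
```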

## 30.7 Orthogonal Sets

A set of vectors

$$
\{v_1,\ldots,v_k\}
$$

is orthogonal if

$$
\langle v_i,v_j\rangle=0
$$

whenever

$$
i\neq j.
$$

If additionally

$$
\|v_i\|=1
$$

for every \(i\), then the set is orthonormal.

An orthogonal set of nonzero vectors is automatically linearly independent.

Indeed, suppose

$$
c_1v_1+\cdots+c_kv_k=0.
$$

Take the inner product of both sides with \(v_i\). Orthogonality makes every cross term \(\langle v_j,v_i\rangle\) with \(j\neq i\) vanish, leaving

$$
c_i\langle v_i,v_i\rangle=0.
$$

Since

$$
\langle v_i,v_i\rangle>0,
$$

we get

$$
c_i=0.
$$

Thus all coefficients vanish.

## 30.8 Pythagorean Theorem

If \(u\) and \(v\) are orthogonal, then

$$
\|u+v\|^2=\|u\|^2+\|v\|^2.
$$

Proof:

$$
\|u+v\|^2 =
\langle u+v,u+v\rangle.
$$

Expand:

$$
=
\langle u,u\rangle
+
\langle u,v\rangle
+
\langle v,u\rangle
+
\langle v,v\rangle.
$$

Orthogonality gives

$$
\langle u,v\rangle=0.
$$

Thus

$$
\|u+v\|^2 =
\|u\|^2+\|v\|^2.
$$

This generalizes the classical Pythagorean theorem.
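The identity can be verified on a concrete orthogonal pair (an illustrative example):

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([2.0, -1.0])  # orthogonal to u

lhs = np.linalg.norm(u + v) ** 2
rhs = np.linalg.norm(u) ** 2 + np.linalg.norm(v) ** 2
print(np.isclose(lhs, rhs))  # True
```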

## 30.9 Cauchy-Schwarz Inequality

For all vectors \(u,v\),

$$
|\langle u,v\rangle|
\leq
\|u\|\|v\|.
$$

Equality holds exactly when \(u\) and \(v\) are linearly dependent. ([en.wikipedia.org](https://en.wikipedia.org/wiki/Cauchy%E2%80%93Schwarz_inequality))

This inequality is fundamental. It bounds the size of the inner product in terms of vector lengths.

In \(\mathbb{R}^n\), it becomes

$$
|u^Tv|
\leq
\sqrt{u^Tu}\sqrt{v^Tv}.
$$
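A spot check of the inequality on a random pair of vectors (seeded for reproducibility; the vectors are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(5)
v = rng.standard_normal(5)

# |<u, v>| <= ||u|| ||v|| for every pair of vectors.
print(abs(u @ v) <= np.linalg.norm(u) * np.linalg.norm(v))  # True
```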

## 30.10 Triangle Inequality

For all vectors \(u,v\),

$$
\|u+v\|
\leq
\|u\|+\|v\|.
$$

This follows from the Cauchy-Schwarz inequality.

It expresses the geometric fact that the direct path is shortest.

Norms derived from inner products always satisfy the triangle inequality.
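A one-line check on sample vectors (illustrative):

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([3.0, -1.0])

# ||u + v|| <= ||u|| + ||v||
print(np.linalg.norm(u + v) <= np.linalg.norm(u) + np.linalg.norm(v))  # True
```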

## 30.11 Angle Between Vectors

For nonzero vectors \(u,v\), define the angle \(\theta\) by

$$
\cos\theta =
\frac{\langle u,v\rangle}{\|u\|\|v\|}.
$$

The Cauchy-Schwarz inequality ensures that the fraction lies between \(-1\) and \(1\).

If

$$
\langle u,v\rangle=0,
$$

then

$$
\cos\theta=0,
$$

so

$$
\theta=\frac{\pi}{2}.
$$

Thus orthogonality corresponds to a right angle.
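Computing an angle from the definition (the vectors are illustrative; the `clip` is a standard guard against round-off pushing the ratio just outside \([-1,1]\)):

```python
import numpy as np

u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])

cos_theta = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
print(np.degrees(theta))  # approximately 45
```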

## 30.12 Inner Products on Function Spaces

Inner products are not limited to coordinate vectors.

For continuous functions on an interval \([a,b]\), define

$$
\langle f,g\rangle =
\int_a^b f(x)g(x)\,dx.
$$

This is an inner product on suitable function spaces.

For example, on \([-1,1]\),

$$
\langle 1,x\rangle =
\int_{-1}^1 x\,dx =
0.
$$

Thus the functions \(1\) and \(x\) are orthogonal.

Function-space inner products are central in Fourier analysis, differential equations, and approximation theory.
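The integral inner product can be approximated numerically; the midpoint-rule helper `inner` below is an illustrative sketch, not a library routine:

```python
import numpy as np

# Approximate <f, g> = integral of f(x) g(x) over [a, b]
# with a simple midpoint rule.
def inner(f, g, a=-1.0, b=1.0, n=10_000):
    x = a + (np.arange(n) + 0.5) * (b - a) / n
    return np.sum(f(x) * g(x)) * (b - a) / n

# <1, x> on [-1, 1] vanishes, so 1 and x are orthogonal.
print(abs(inner(np.ones_like, lambda x: x)) < 1e-9)  # True
```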

## 30.13 Weighted Inner Products

Different inner products can exist on the same vector space.

For example, on \(\mathbb{R}^n\),

$$
\langle u,v\rangle_A=u^TAv,
$$

where \(A\) is a symmetric positive definite matrix.

If

$$
A=
\begin{bmatrix}
2&0\\
0&1
\end{bmatrix},
$$

then

$$
\left\langle
\begin{bmatrix}
x_1\\
x_2
\end{bmatrix},
\begin{bmatrix}
y_1\\
y_2
\end{bmatrix}
\right\rangle_A =
2x_1y_1+x_2y_2.
$$

This changes the geometry of the space. Lengths and angles are measured differently.

## 30.14 Orthogonal Complements

Let \(U\subseteq V\). The orthogonal complement of \(U\) is

$$
U^\perp =
\{v\in V:\langle v,u\rangle=0
\text{ for all }u\in U\}.
$$

It is a subspace of \(V\).

For example, if

$$
U=
\operatorname{span}
\left(
\begin{bmatrix}
1\\
1
\end{bmatrix}
\right)
\subseteq\mathbb{R}^2,
$$

then

$$
U^\perp =
\operatorname{span}
\left(
\begin{bmatrix}
1\\
-1
\end{bmatrix}
\right).
$$

The orthogonal complement contains all vectors perpendicular to the subspace.
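Numerically, \(U^\perp\) can be found as the null space of a matrix whose rows span \(U\); one standard way (assumed here, not prescribed by the text) uses the SVD:

```python
import numpy as np

# U = span([1, 1]) in R^2; U-perp is the null space of the row [1 1].
A = np.array([[1.0, 1.0]])
_, _, Vt = np.linalg.svd(A)
w = Vt[-1]  # right singular vector for the zero singular value

print(w)      # proportional to [1, -1] (sign may differ)
print(A @ w)  # approximately [0]
```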

## 30.15 Orthogonal Decomposition

If \(U\) is a finite-dimensional subspace of an inner product space \(V\), then every vector \(v\in V\) can be written uniquely as

$$
v=u+w,
$$

where

$$
u\in U,
\qquad
w\in U^\perp.
$$

Thus

$$
V=U\oplus U^\perp.
$$

This decomposition separates a vector into a part inside the subspace and a part perpendicular to it.

## 30.16 Projection onto a Vector

Let \(u\neq0\). The orthogonal projection of \(v\) onto \(u\) is

$$
\operatorname{proj}_u(v) =
\frac{\langle v,u\rangle}{\langle u,u\rangle}u.
$$

This is the component of \(v\) in the direction of \(u\).

For example, let

$$
u=
\begin{bmatrix}
1\\
1
\end{bmatrix},
\qquad
v=
\begin{bmatrix}
3\\
1
\end{bmatrix}.
$$

Then

$$
\langle v,u\rangle=4,
\qquad
\langle u,u\rangle=2.
$$

Therefore

$$
\operatorname{proj}_u(v) =
2
\begin{bmatrix}
1\\
1
\end{bmatrix} =
\begin{bmatrix}
2\\
2
\end{bmatrix}.
$$
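The same projection, computed directly from the formula:

```python
import numpy as np

u = np.array([1.0, 1.0])
v = np.array([3.0, 1.0])

# proj_u(v) = (<v, u> / <u, u>) u
proj = (v @ u) / (u @ u) * u
print(proj)  # [2. 2.]
```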

## 30.17 Projection onto a Subspace

If

$$
\{u_1,\ldots,u_k\}
$$

is an orthonormal basis of a subspace \(U\), then the projection of \(v\) onto \(U\) is

$$
\operatorname{proj}_U(v) =
\langle v,u_1\rangle u_1
+\cdots+
\langle v,u_k\rangle u_k.
$$

The error vector

$$
v-\operatorname{proj}_U(v)
$$

lies in \(U^\perp\).

Orthogonal projection is the basis of least-squares approximation.
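A small example with an orthonormal basis of the \(xy\)-plane in \(\mathbb{R}^3\) (an illustrative subspace), confirming that the error is orthogonal to both basis vectors:

```python
import numpy as np

# Orthonormal basis {u1, u2} of the xy-plane in R^3.
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 0.0])
v = np.array([3.0, 4.0, 5.0])

# proj_U(v) = <v,u1> u1 + <v,u2> u2
proj = (v @ u1) * u1 + (v @ u2) * u2
err = v - proj

print(proj)                # [3. 4. 0.]
print(err @ u1, err @ u2)  # 0.0 0.0 -> the error lies in U-perp
```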

## 30.18 Orthonormal Bases

A basis

$$
\{u_1,\ldots,u_n\}
$$

is orthonormal if

$$
\langle u_i,u_j\rangle=\delta_{ij}.
$$

In an orthonormal basis, coordinates are especially simple.

If

$$
v=c_1u_1+\cdots+c_nu_n,
$$

then

$$
c_i=\langle v,u_i\rangle.
$$

Thus the coordinates are obtained directly by inner products.
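A check that coordinates recovered by inner products reassemble the vector, using a rotated orthonormal basis of \(\mathbb{R}^2\) (an illustrative choice):

```python
import numpy as np

# Orthonormal basis of R^2: the standard basis rotated by 45 degrees.
u1 = np.array([1.0, 1.0]) / np.sqrt(2)
u2 = np.array([-1.0, 1.0]) / np.sqrt(2)
v = np.array([2.0, 0.0])

c1, c2 = v @ u1, v @ u2  # c_i = <v, u_i>
print(np.allclose(c1 * u1 + c2 * u2, v))  # True
```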

## 30.19 Gram Matrix

Let

$$
v_1,\ldots,v_k
$$

be vectors in an inner product space. The Gram matrix is

$$
G=(\langle v_i,v_j\rangle).
$$

It records all pairwise inner products.

The Gram matrix is symmetric in the real case and Hermitian in the complex case.

The vectors are linearly independent exactly when the Gram matrix is invertible. ([en.wikipedia.org](https://en.wikipedia.org/wiki/Gram_matrix))
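For column vectors stored in a matrix \(V\), the Gram matrix is simply \(V^TV\); a quick independence test on illustrative data:

```python
import numpy as np

V = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])  # columns v1, v2 in R^3

G = V.T @ V  # G[i, j] = <v_i, v_j>
print(G)
print(np.linalg.det(G))  # approximately 24 (nonzero -> columns independent)
```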

## 30.20 Summary

An inner product gives a vector space geometric structure by defining lengths, angles, and orthogonality.

The key ideas are:

| Concept | Meaning |
|---|---|
| Inner product | Positive-definite symmetric bilinear pairing |
| Norm | \(\|v\|=\sqrt{\langle v,v\rangle}\) |
| Distance | \(d(u,v)=\|u-v\|\) |
| Orthogonality | \(\langle u,v\rangle=0\) |
| Orthonormal set | Orthogonal vectors of unit length |
| Cauchy-Schwarz inequality | \(|\langle u,v\rangle|\leq\|u\|\|v\|\) |
| Orthogonal complement | \(U^\perp\) |
| Projection | Closest vector in a subspace |
| Orthonormal basis | Basis with \(\langle u_i,u_j\rangle=\delta_{ij}\) |
| Gram matrix | Matrix of pairwise inner products |

Inner products connect algebra and geometry. They allow vector spaces to support geometric reasoning about perpendicularity, approximation, decomposition, and distance.
