# Chapter 19. Span and Linear Combination

A linear combination is a vector built from other vectors by scalar multiplication and addition. The span of a set of vectors is the set of all vectors that can be built in this way. These two ideas connect vector arithmetic with geometry, systems of equations, subspaces, bases, rank, and dimension.

If \(v_1,\ldots,v_k\) are vectors in a vector space \(V\) over a field \(F\), then a linear combination has the form

$$
c_1v_1 + c_2v_2 + \cdots + c_kv_k,
$$

where \(c_1,\ldots,c_k \in F\). The span of \(v_1,\ldots,v_k\) is the set of all such linear combinations.

## 19.1 Linear Combinations

Let \(V\) be a vector space over a field \(F\). Let

$$
v_1, v_2, \ldots, v_k \in V.
$$

A vector \(v \in V\) is called a linear combination of \(v_1,\ldots,v_k\) if there exist scalars

$$
c_1,c_2,\ldots,c_k \in F
$$

such that

$$
v = c_1v_1 + c_2v_2 + \cdots + c_kv_k.
$$

The scalars \(c_1,\ldots,c_k\) are called coefficients.

Only two operations are used: scalar multiplication and vector addition. No products of vectors appear. No nonlinear functions appear. This is why the expression is called linear.

## 19.2 First Examples

In \(\mathbb{R}^2\), let

$$
v_1 =
\begin{bmatrix}
1 \\
0
\end{bmatrix},
\qquad
v_2 =
\begin{bmatrix}
0 \\
1
\end{bmatrix}.
$$

Then

$$
3v_1 - 2v_2 =
3
\begin{bmatrix}
1 \\
0
\end{bmatrix} -
2
\begin{bmatrix}
0 \\
1
\end{bmatrix} =
\begin{bmatrix}
3 \\
-2
\end{bmatrix}.
$$

Thus

$$
\begin{bmatrix}
3 \\
-2
\end{bmatrix}
$$

is a linear combination of \(v_1\) and \(v_2\).

More generally,

$$
a
\begin{bmatrix}
1 \\
0
\end{bmatrix}
+
b
\begin{bmatrix}
0 \\
1
\end{bmatrix} =
\begin{bmatrix}
a \\
b
\end{bmatrix}.
$$

Every vector in \(\mathbb{R}^2\) can be written as a linear combination of these two vectors.
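As a quick numerical check, the computation above can be reproduced with NumPy. This is only an illustrative sketch; the variable names are chosen to match the text.

```python
import numpy as np

# The two standard basis vectors of R^2 from the text.
v1 = np.array([1.0, 0.0])
v2 = np.array([0.0, 1.0])

# The linear combination 3*v1 - 2*v2 computed in the section.
combo = 3 * v1 - 2 * v2
print(combo)  # [ 3. -2.]

# More generally, a*v1 + b*v2 recovers the vector [a, b].
a, b = 5.0, -4.0
assert np.allclose(a * v1 + b * v2, np.array([a, b]))
```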

## 19.3 Linear Combinations in \(\mathbb{R}^n\)

Let

$$
v_1 =
\begin{bmatrix}
1 \\
2 \\
0
\end{bmatrix},
\qquad
v_2 =
\begin{bmatrix}
-1 \\
1 \\
3
\end{bmatrix}.
$$

A general linear combination is

$$
av_1 + bv_2 =
a
\begin{bmatrix}
1 \\
2 \\
0
\end{bmatrix}
+
b
\begin{bmatrix}
-1 \\
1 \\
3
\end{bmatrix}.
$$

Computing component by component,

$$
av_1 + bv_2 =
\begin{bmatrix}
a-b \\
2a+b \\
3b
\end{bmatrix}.
$$

As \(a\) and \(b\) vary over \(\mathbb{R}\), this expression produces a plane through the origin in \(\mathbb{R}^3\), not all of \(\mathbb{R}^3\): since \(v_1\) and \(v_2\) are not parallel, they contribute exactly two independent directions.
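Membership in this plane can be tested numerically with a least-squares solve: a target vector lies in the span exactly when some coefficients reproduce it. The helper `in_span` below is a hypothetical name introduced for this sketch.

```python
import numpy as np

v1 = np.array([1.0, 2.0, 0.0])
v2 = np.array([-1.0, 1.0, 3.0])

# Stack the vectors as columns, so that a*v1 + b*v2 = M @ [a, b].
M = np.column_stack([v1, v2])

def in_span(target, M, tol=1e-9):
    """Check whether target is a linear combination of M's columns."""
    coeffs, *_ = np.linalg.lstsq(M, target, rcond=None)
    return np.allclose(M @ coeffs, target, atol=tol)

# a = 1, b = 1 gives [a-b, 2a+b, 3b] = [0, 3, 3], which lies in the plane.
assert in_span(np.array([0.0, 3.0, 3.0]), M)

# [1, 0, 0] does not lie in the plane spanned by v1 and v2.
assert not in_span(np.array([1.0, 0.0, 0.0]), M)
```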

## 19.4 Span

The span of vectors \(v_1,\ldots,v_k\) is the set of all their linear combinations:

$$
\operatorname{span}(v_1,\ldots,v_k) =
\{c_1v_1 + \cdots + c_kv_k : c_1,\ldots,c_k \in F\}.
$$

The span is also written as

$$
\operatorname{span}\{v_1,\ldots,v_k\}.
$$

Both notations mean the same thing.

If

$$
W = \operatorname{span}(v_1,\ldots,v_k),
$$

then the vectors \(v_1,\ldots,v_k\) are said to span \(W\), or generate \(W\).

The word generate is useful. A spanning set is a collection of vectors from which the whole subspace can be generated by linear combinations.

## 19.5 Span as a Subspace

The span of any set of vectors is a subspace.

Let

$$
W = \operatorname{span}(v_1,\ldots,v_k).
$$

Then \(W\) contains all vectors of the form

$$
c_1v_1 + \cdots + c_kv_k.
$$

First, \(W\) contains the zero vector, because

$$
0v_1 + 0v_2 + \cdots + 0v_k = 0.
$$

Next, let

$$
u = a_1v_1 + \cdots + a_kv_k
$$

and

$$
w = b_1v_1 + \cdots + b_kv_k.
$$

Then

$$
u+w =
(a_1+b_1)v_1 + \cdots + (a_k+b_k)v_k.
$$

This is again a linear combination of \(v_1,\ldots,v_k\). Hence

$$
u+w \in W.
$$

For a scalar \(c\),

$$
cu =
(ca_1)v_1 + \cdots + (ca_k)v_k.
$$

This is also a linear combination of the same vectors. Hence

$$
cu \in W.
$$

Therefore \(W\) is a subspace.

## 19.6 The Smallest Subspace Containing a Set

The span of a set is the smallest subspace containing that set.

Let

$$
S = \{v_1,\ldots,v_k\}.
$$

The span contains every \(v_i\), since

$$
v_i = 0v_1 + \cdots + 1v_i + \cdots + 0v_k.
$$

Now suppose \(U\) is any subspace containing \(v_1,\ldots,v_k\). Since \(U\) is closed under scalar multiplication and addition, it must contain every linear combination

$$
c_1v_1 + \cdots + c_kv_k.
$$

Therefore

$$
\operatorname{span}(v_1,\ldots,v_k) \subseteq U.
$$

Thus the span is contained in every subspace that contains the original vectors. Since the span is itself a subspace containing those vectors, it is the smallest such subspace.

## 19.7 Geometric Meaning of Span

In \(\mathbb{R}^2\), the span of one nonzero vector is a line through the origin.

If

$$
v =
\begin{bmatrix}
2 \\
1
\end{bmatrix},
$$

then

$$
\operatorname{span}(v) =
\left\{
c
\begin{bmatrix}
2 \\
1
\end{bmatrix}
: c \in \mathbb{R}
\right\}.
$$

This set is the line through the origin in the direction of \(v\).

In \(\mathbb{R}^3\), the span of one nonzero vector is also a line through the origin. The span of two nonparallel vectors is a plane through the origin. The span of three suitable vectors may be all of \(\mathbb{R}^3\).

The word suitable means linearly independent. This will be made precise in the next chapter.

## 19.8 Spanning the Plane

Let

$$
v_1 =
\begin{bmatrix}
1 \\
1
\end{bmatrix},
\qquad
v_2 =
\begin{bmatrix}
1 \\
-1
\end{bmatrix}.
$$

We ask whether these vectors span \(\mathbb{R}^2\).

A general linear combination is

$$
av_1 + bv_2 =
a
\begin{bmatrix}
1 \\
1
\end{bmatrix}
+
b
\begin{bmatrix}
1 \\
-1
\end{bmatrix} =
\begin{bmatrix}
a+b \\
a-b
\end{bmatrix}.
$$

Given an arbitrary vector

$$
\begin{bmatrix}
x \\
y
\end{bmatrix},
$$

we need

$$
a+b = x
$$

and

$$
a-b = y.
$$

Adding these equations gives

$$
2a = x+y,
$$

so

$$
a = \frac{x+y}{2}.
$$

Subtracting gives

$$
2b = x-y,
$$

so

$$
b = \frac{x-y}{2}.
$$

For every \(x,y \in \mathbb{R}\), such scalars \(a\) and \(b\) exist. Therefore

$$
\operatorname{span}(v_1,v_2) = \mathbb{R}^2.
$$

## 19.9 Failing to Span the Plane

Let

$$
v_1 =
\begin{bmatrix}
1 \\
2
\end{bmatrix},
\qquad
v_2 =
\begin{bmatrix}
2 \\
4
\end{bmatrix}.
$$

A general linear combination is

$$
av_1 + bv_2 =
a
\begin{bmatrix}
1 \\
2
\end{bmatrix}
+
b
\begin{bmatrix}
2 \\
4
\end{bmatrix} =
\begin{bmatrix}
a+2b \\
2a+4b
\end{bmatrix}.
$$

The second component is always twice the first component:

$$
2a+4b = 2(a+2b).
$$

Thus every vector in the span lies on the line

$$
y = 2x.
$$

Therefore

$$
\operatorname{span}(v_1,v_2)
\neq
\mathbb{R}^2.
$$

The second vector gives no new direction, since

$$
v_2 = 2v_1.
$$

It is redundant.
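The redundancy shows up numerically as a rank deficiency: stacking the two vectors as columns gives a matrix of rank \(1\), so the columns span only a line. A brief sketch:

```python
import numpy as np

v1 = np.array([1.0, 2.0])
v2 = np.array([2.0, 4.0])  # v2 = 2*v1, so it adds no new direction
M = np.column_stack([v1, v2])

# Rank 1 means the columns span a line, not all of R^2.
rank = np.linalg.matrix_rank(M)
print(rank)  # 1
```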

## 19.10 Span and Systems of Equations

Span questions are equivalent to systems of linear equations.

Let

$$
A =
\begin{bmatrix}
1 & 2 \\
3 & 4 \\
5 & 6
\end{bmatrix}.
$$

The columns are

$$
a_1 =
\begin{bmatrix}
1 \\
3 \\
5
\end{bmatrix},
\qquad
a_2 =
\begin{bmatrix}
2 \\
4 \\
6
\end{bmatrix}.
$$

A vector \(b \in \mathbb{R}^3\) belongs to the column space of \(A\) exactly when there exists \(x \in \mathbb{R}^2\) such that

$$
Ax = b.
$$

Writing

$$
x =
\begin{bmatrix}
x_1 \\
x_2
\end{bmatrix},
$$

we get

$$
Ax = x_1a_1 + x_2a_2.
$$

Thus solving

$$
Ax = b
$$

means deciding whether \(b\) is a linear combination of the columns of \(A\).

This is one of the central interpretations of matrix multiplication.
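This interpretation is easy to confirm numerically: the product \(Ax\) agrees with the explicit column combination \(x_1a_1 + x_2a_2\). A minimal sketch with the matrix from the text:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
x = np.array([2.0, -1.0])

# Ax equals x1 * (first column) + x2 * (second column).
by_columns = x[0] * A[:, 0] + x[1] * A[:, 1]
assert np.allclose(A @ x, by_columns)
print(A @ x)  # [0. 2. 4.]
```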

## 19.11 Column Space

The column space of a matrix \(A\) is the span of its columns:

$$
\operatorname{Col}(A) =
\operatorname{span}(a_1,\ldots,a_n).
$$

If \(A\) is an \(m \times n\) matrix, then

$$
\operatorname{Col}(A) \subseteq \mathbb{R}^m.
$$

The column space is the set of all possible outputs of the transformation

$$
x \mapsto Ax.
$$

Therefore

$$
Ax = b
$$

has a solution if and only if

$$
b \in \operatorname{Col}(A).
$$

This gives a geometric interpretation of consistency for linear systems.
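One standard way to test this condition numerically is to compare ranks: appending \(b\) as an extra column leaves the rank unchanged exactly when \(b \in \operatorname{Col}(A)\). The helper `is_consistent` is a hypothetical name for this sketch.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

def is_consistent(A, b):
    """Ax = b is solvable iff appending b does not raise the rank."""
    return (np.linalg.matrix_rank(np.column_stack([A, b]))
            == np.linalg.matrix_rank(A))

b_in = A @ np.array([1.0, 1.0])    # in Col(A) by construction
b_out = np.array([1.0, 0.0, 0.0])  # not a combination of the columns

assert is_consistent(A, b_in)
assert not is_consistent(A, b_out)
```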

## 19.12 Spanning Sets

A set \(S\) spans a vector space \(V\) if

$$
\operatorname{span}(S) = V.
$$

In this case, every vector in \(V\) can be expressed as a linear combination of vectors from \(S\).

For example,

$$
\left\{
\begin{bmatrix}
1 \\
0
\end{bmatrix},
\begin{bmatrix}
0 \\
1
\end{bmatrix}
\right\}
$$

spans \(\mathbb{R}^2\).

The set

$$
\left\{
\begin{bmatrix}
1 \\
0
\end{bmatrix},
\begin{bmatrix}
0 \\
1
\end{bmatrix},
\begin{bmatrix}
1 \\
1
\end{bmatrix}
\right\}
$$

also spans \(\mathbb{R}^2\), but it contains a redundant vector because

$$
\begin{bmatrix}
1 \\
1
\end{bmatrix} =
\begin{bmatrix}
1 \\
0
\end{bmatrix}
+
\begin{bmatrix}
0 \\
1
\end{bmatrix}.
$$

A spanning set can have redundant vectors. A basis cannot.

## 19.13 Redundancy

A vector in a spanning set is redundant if it can be written as a linear combination of the other vectors.

Suppose

$$
v_k \in \operatorname{span}(v_1,\ldots,v_{k-1}).
$$

Then

$$
\operatorname{span}(v_1,\ldots,v_{k-1},v_k) =
\operatorname{span}(v_1,\ldots,v_{k-1}).
$$

Adding \(v_k\) does not enlarge the span.

This observation is the bridge from spanning sets to bases. A basis is a spanning set with all redundancy removed.
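The fact that a redundant vector does not enlarge the span can be checked with ranks: adding a column that is already a combination of the others leaves the rank unchanged. A small sketch:

```python
import numpy as np

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = v1 + v2  # redundant: already in span(v1, v2)

rank_without = np.linalg.matrix_rank(np.column_stack([v1, v2]))
rank_with = np.linalg.matrix_rank(np.column_stack([v1, v2, v3]))

# The span did not grow: both ranks equal 2.
assert rank_without == rank_with == 2
```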

## 19.14 Linear Combinations of Polynomials

Let

$$
p_1(x)=1,
\qquad
p_2(x)=x,
\qquad
p_3(x)=x^2.
$$

A general linear combination is

$$
a p_1(x) + b p_2(x) + c p_3(x) =
a + bx + cx^2.
$$

Thus

$$
\operatorname{span}(1,x,x^2) =
P_2,
$$

the vector space of all polynomials of degree at most \(2\).

Now consider

$$
1+x,
\qquad
1-x.
$$

Their span contains all polynomials of the form

$$
a(1+x)+b(1-x) =
(a+b)+(a-b)x.
$$

This is every polynomial of degree at most \(1\). Therefore

$$
\operatorname{span}(1+x,1-x) = P_1.
$$

## 19.15 Linear Combinations of Matrices

Matrices of the same size form a vector space.

Let

$$
A =
\begin{bmatrix}
1 & 0 \\
0 & 0
\end{bmatrix},
\quad
B =
\begin{bmatrix}
0 & 1 \\
0 & 0
\end{bmatrix},
\quad
C =
\begin{bmatrix}
0 & 0 \\
1 & 0
\end{bmatrix},
\quad
D =
\begin{bmatrix}
0 & 0 \\
0 & 1
\end{bmatrix}.
$$

A general linear combination is

$$
aA+bB+cC+dD =
\begin{bmatrix}
a & b \\
c & d
\end{bmatrix}.
$$

Therefore

$$
\operatorname{span}(A,B,C,D) =
M_{2 \times 2}(\mathbb{R}).
$$

These four matrices behave like the standard basis vectors for \(2 \times 2\) matrices.
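The identity \(aA + bB + cC + dD = \begin{bmatrix} a & b \\ c & d \end{bmatrix}\) can be checked directly, since NumPy treats matrices of the same shape with exactly the vector-space operations used here:

```python
import numpy as np

# The four "standard basis" matrices for 2x2 matrices.
A = np.array([[1.0, 0.0], [0.0, 0.0]])
B = np.array([[0.0, 1.0], [0.0, 0.0]])
C = np.array([[0.0, 0.0], [1.0, 0.0]])
D = np.array([[0.0, 0.0], [0.0, 1.0]])

a, b, c, d = 1.0, 2.0, 3.0, 4.0
M = a * A + b * B + c * C + d * D
assert np.allclose(M, np.array([[1.0, 2.0], [3.0, 4.0]]))
```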

## 19.16 Linear Combinations of Functions

Function spaces also use the same idea.

Let

$$
f_1(x)=1,
\qquad
f_2(x)=\cos x,
\qquad
f_3(x)=\sin x.
$$

A general linear combination is

$$
a f_1(x)+b f_2(x)+c f_3(x) =
a+b\cos x+c\sin x.
$$

The span is

$$
\operatorname{span}(1,\cos x,\sin x) =
\{a+b\cos x+c\sin x : a,b,c \in \mathbb{R}\}.
$$

This is a three-dimensional subspace of the vector space of all real-valued functions.

The same formal definition applies even though the vectors are functions instead of columns.

## 19.17 Finite Linear Combinations

A linear combination always uses finitely many vectors.

If \(S\) is an infinite subset of a vector space, then

$$
\operatorname{span}(S)
$$

means the set of all finite linear combinations of vectors from \(S\).

For example, the space of all polynomials is spanned by

$$
1,x,x^2,x^3,\ldots.
$$

Every individual polynomial uses only finitely many powers of \(x\). For instance,

$$
3 - 2x + 7x^5
$$

uses only

$$
1,\ x,\ x^5.
$$

Infinite sums require additional notions of convergence. Those belong to analysis, not elementary vector space theory.

## 19.18 Span and Dimension

The dimension of a span is the number of independent directions generated by the vectors.

A single nonzero vector spans a one-dimensional subspace.

Two nonparallel vectors span a two-dimensional subspace.

Three vectors in \(\mathbb{R}^3\) may span:

| Case | Span |
|---|---|
| All zero | \(\{0\}\) |
| All multiples of one nonzero vector | A line |
| Contained in one plane through the origin | A plane |
| Independent directions | \(\mathbb{R}^3\) |

Thus the number of vectors alone does not determine the dimension of the span. Their relationships determine it.
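The cases in the table correspond to the rank of the matrix whose columns are the three vectors; the rank is the dimension of the span. The example vectors below are chosen for this sketch and are not the only representatives of each case.

```python
import numpy as np

# Three vectors in R^3 for each nontrivial case from the table.
cases = {
    "line":  [[1, 0, 0], [2, 0, 0], [3, 0, 0]],  # all multiples of one vector
    "plane": [[1, 0, 0], [0, 1, 0], [1, 1, 0]],  # all in the xy-plane
    "space": [[1, 0, 0], [0, 1, 0], [0, 0, 1]],  # independent directions
}

# Rank of the matrix with these vectors as columns = dimension of the span.
dims = {name: int(np.linalg.matrix_rank(np.array(vs, dtype=float).T))
        for name, vs in cases.items()}
print(dims)  # {'line': 1, 'plane': 2, 'space': 3}
```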

## 19.19 Span and Linear Independence

Span and linear independence are complementary ideas.

Span asks whether the vectors are enough.

Linear independence asks whether the vectors are necessary.

A list of vectors can fail in two ways:

| Property | Failure |
|---|---|
| Spanning | The list misses some directions |
| Linear independence | The list contains redundant directions |

A basis has both properties. It spans the space and has no redundancy.

This makes span the first half of the basis concept.

## 19.20 Summary

A linear combination is a sum of scalar multiples of vectors. The span of a set of vectors is the collection of all such linear combinations.

The key ideas are:

| Concept | Meaning |
|---|---|
| Linear combination | \(c_1v_1+\cdots+c_kv_k\) |
| Coefficients | Scalars used in a linear combination |
| Span | Set of all linear combinations |
| Spanning set | A set whose span is the whole space |
| Redundant vector | A vector already in the span of the others |
| Column space | Span of matrix columns |
| Generated subspace | Another name for a span |

Span gives the language for describing what vectors can be built from a given set. It turns linear algebra into a study of generation, redundancy, dimension, and structure.
