
Chapter 19. Span and Linear Combination

A linear combination is a vector built from other vectors by scalar multiplication and addition. The span of a set of vectors is the set of all vectors that can be built in this way. These two ideas connect vector arithmetic with geometry, systems of equations, subspaces, bases, rank, and dimension.

If $v_1,\ldots,v_k$ are vectors in a vector space $V$ over a field $F$, then a linear combination has the form

$$ c_1v_1 + c_2v_2 + \cdots + c_kv_k, $$

where $c_1,\ldots,c_k \in F$. The span of $v_1,\ldots,v_k$ is the set of all such linear combinations.

19.1 Linear Combinations

Let $V$ be a vector space over a field $F$. Let

$$ v_1, v_2, \ldots, v_k \in V. $$

A vector $v \in V$ is called a linear combination of $v_1,\ldots,v_k$ if there exist scalars

$$ c_1, c_2, \ldots, c_k \in F $$

such that

$$ v = c_1v_1 + c_2v_2 + \cdots + c_kv_k. $$

The scalars $c_1,\ldots,c_k$ are called coefficients.

Only two operations are used: scalar multiplication and vector addition. No products of vectors appear. No nonlinear functions appear. This is why the expression is called linear.
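Because only these two operations are involved, a linear combination is a few lines of code. A minimal Python sketch (the function name and list-of-lists representation are illustrative choices, not notation from the text):

```python
def linear_combination(coeffs, vectors):
    """Compute c1*v1 + ... + ck*vk for vectors given as equal-length lists."""
    n = len(vectors[0])
    result = [0.0] * n
    for c, v in zip(coeffs, vectors):
        for i in range(n):
            result[i] += c * v[i]
    return result

# 3*e1 - 2*e2 in R^2
print(linear_combination([3, -2], [[1, 0], [0, 1]]))  # [3.0, -2.0]
```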

19.2 First Examples

In $\mathbb{R}^2$, let

$$ v_1 = \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \qquad v_2 = \begin{bmatrix} 0 \\ 1 \end{bmatrix}. $$

Then

$$ 3v_1 - 2v_2 = 3 \begin{bmatrix} 1 \\ 0 \end{bmatrix} - 2 \begin{bmatrix} 0 \\ 1 \end{bmatrix} = \begin{bmatrix} 3 \\ -2 \end{bmatrix}. $$

Thus

$$ \begin{bmatrix} 3 \\ -2 \end{bmatrix} $$

is a linear combination of $v_1$ and $v_2$.

More generally,

$$ a \begin{bmatrix} 1 \\ 0 \end{bmatrix} + b \begin{bmatrix} 0 \\ 1 \end{bmatrix} = \begin{bmatrix} a \\ b \end{bmatrix}. $$

Every vector in $\mathbb{R}^2$ can be written as a linear combination of these two vectors.

19.3 Linear Combinations in $\mathbb{R}^n$

Let

$$ v_1 = \begin{bmatrix} 1 \\ 2 \\ 0 \end{bmatrix}, \qquad v_2 = \begin{bmatrix} -1 \\ 1 \\ 3 \end{bmatrix}. $$

A general linear combination is

$$ av_1 + bv_2 = a \begin{bmatrix} 1 \\ 2 \\ 0 \end{bmatrix} + b \begin{bmatrix} -1 \\ 1 \\ 3 \end{bmatrix}. $$

Computing component by component,

$$ av_1 + bv_2 = \begin{bmatrix} a-b \\ 2a+b \\ 3b \end{bmatrix}. $$

As $a$ and $b$ vary over $\mathbb{R}$, this expression produces many vectors in $\mathbb{R}^3$, but not all of them: it produces a plane through the origin.
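One way to verify the plane claim numerically: all the combinations satisfy a single linear equation. Taking the cross product $v_1 \times v_2 = (6, -3, 3)$ (a tool from outside this chapter) gives the plane $2x - y + z = 0$, and a short Python check confirms that every combination lies on it:

```python
def combo(a, b):
    """a*v1 + b*v2 for v1 = (1, 2, 0), v2 = (-1, 1, 3)."""
    v1, v2 = (1, 2, 0), (-1, 1, 3)
    return tuple(a*p + b*q for p, q in zip(v1, v2))

# Every combination lies on the plane 2x - y + z = 0.
for a, b in [(1, 0), (0, 1), (2.5, -4.0), (-7, 3)]:
    x, y, z = combo(a, b)
    assert 2*x - y + z == 0
```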

19.4 Span

The span of vectors $v_1,\ldots,v_k$ is the set of all their linear combinations:

$$ \operatorname{span}(v_1,\ldots,v_k) = \{c_1v_1 + \cdots + c_kv_k : c_1,\ldots,c_k \in F\}. $$

The span is also written as

$$ \operatorname{span}\{v_1,\ldots,v_k\}. $$

Both notations mean the same thing.

If

$$ W = \operatorname{span}(v_1,\ldots,v_k), $$

then the vectors $v_1,\ldots,v_k$ are said to span $W$, or to generate $W$.

The word generate is useful. A spanning set is a collection of vectors from which the whole subspace can be generated by linear combinations.

19.5 Span as a Subspace

The span of any set of vectors is a subspace.

Let

$$ W = \operatorname{span}(v_1,\ldots,v_k). $$

Then $W$ contains all vectors of the form

$$ c_1v_1 + \cdots + c_kv_k. $$

First, $W$ contains the zero vector, because

$$ 0v_1 + 0v_2 + \cdots + 0v_k = 0. $$

Next, let

$$ u = a_1v_1 + \cdots + a_kv_k $$

and

$$ w = b_1v_1 + \cdots + b_kv_k. $$

Then

$$ u+w = (a_1+b_1)v_1 + \cdots + (a_k+b_k)v_k. $$

This is again a linear combination of $v_1,\ldots,v_k$. Hence

$$ u+w \in W. $$

For a scalar $c$,

$$ cu = (ca_1)v_1 + \cdots + (ca_k)v_k. $$

This is also a linear combination of the same vectors. Hence

$$ cu \in W. $$

Therefore $W$ is a subspace.
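The two closure computations above can be spot-checked numerically. A small Python sketch, using an arbitrary illustrative choice of vectors in $\mathbb{R}^2$:

```python
vs = [[1, 2], [0, 1], [3, -1]]  # v1, v2, v3 in R^2 (illustrative choice)

def combo(coeffs):
    """c1*v1 + c2*v2 + c3*v3 as a 2-component list."""
    return [sum(c * v[i] for c, v in zip(coeffs, vs)) for i in range(2)]

a, b = [1, 2, -1], [0.5, -3, 4]
u, w = combo(a), combo(b)

# u + w is the combination with coefficients a_i + b_i
assert [u[i] + w[i] for i in range(2)] == combo([x + y for x, y in zip(a, b)])
# c*u is the combination with coefficients c*a_i
c = 2.5
assert [c * u[i] for i in range(2)] == combo([c * x for x in a])
```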

19.6 The Smallest Subspace Containing a Set

The span of a set is the smallest subspace containing that set.

Let

$$ S = \{v_1,\ldots,v_k\}. $$

The span contains every $v_i$, since

$$ v_i = 0v_1 + \cdots + 1v_i + \cdots + 0v_k. $$

Now suppose $U$ is any subspace containing $v_1,\ldots,v_k$. Since $U$ is closed under scalar multiplication and addition, it must contain every linear combination

$$ c_1v_1 + \cdots + c_kv_k. $$

Therefore

$$ \operatorname{span}(v_1,\ldots,v_k) \subseteq U. $$

Thus the span is contained in every subspace that contains the original vectors.

19.7 Geometric Meaning of Span

In $\mathbb{R}^2$, the span of one nonzero vector is a line through the origin.

If

$$ v = \begin{bmatrix} 2 \\ 1 \end{bmatrix}, $$

then

$$ \operatorname{span}(v) = \left\{ c \begin{bmatrix} 2 \\ 1 \end{bmatrix} : c \in \mathbb{R} \right\}. $$

This set is the line through the origin in the direction of $v$.

In $\mathbb{R}^3$, the span of one nonzero vector is also a line through the origin. The span of two nonparallel vectors is a plane through the origin. The span of three suitable vectors may be all of $\mathbb{R}^3$.

The word suitable means linearly independent. This will be made precise in the next chapter.

19.8 Spanning the Plane

Let

$$ v_1 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}, \qquad v_2 = \begin{bmatrix} 1 \\ -1 \end{bmatrix}. $$

We ask whether these vectors span $\mathbb{R}^2$.

A general linear combination is

$$ av_1 + bv_2 = a \begin{bmatrix} 1 \\ 1 \end{bmatrix} + b \begin{bmatrix} 1 \\ -1 \end{bmatrix} = \begin{bmatrix} a+b \\ a-b \end{bmatrix}. $$

Given an arbitrary vector

$$ \begin{bmatrix} x \\ y \end{bmatrix}, $$

we need

$$ a+b = x $$

and

$$ a-b = y. $$

Adding these equations gives

$$ 2a = x+y, $$

so

$$ a = \frac{x+y}{2}. $$

Subtracting gives

$$ 2b = x-y, $$

so

$$ b = \frac{x-y}{2}. $$

For every $x,y \in \mathbb{R}$, such scalars $a$ and $b$ exist. Therefore

$$ \operatorname{span}(v_1,v_2) = \mathbb{R}^2. $$
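The formulas for $a$ and $b$ amount to a tiny solver, which can be checked in Python (the function name is an illustrative choice):

```python
def coefficients(x, y):
    """Return (a, b) with a*v1 + b*v2 = (x, y), where v1 = (1,1), v2 = (1,-1)."""
    return (x + y) / 2, (x - y) / 2

a, b = coefficients(5, -3)
assert (a + b, a - b) == (5, -3)  # the combination reproduces the target vector
```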

19.9 Failing to Span the Plane

Let

$$ v_1 = \begin{bmatrix} 1 \\ 2 \end{bmatrix}, \qquad v_2 = \begin{bmatrix} 2 \\ 4 \end{bmatrix}. $$

A general linear combination is

$$ av_1 + bv_2 = a \begin{bmatrix} 1 \\ 2 \end{bmatrix} + b \begin{bmatrix} 2 \\ 4 \end{bmatrix} = \begin{bmatrix} a+2b \\ 2a+4b \end{bmatrix}. $$

The second component is always twice the first:

$$ 2a+4b = 2(a+2b). $$

Thus every vector in the span lies on the line

$$ y = 2x. $$

Therefore

$$ \operatorname{span}(v_1,v_2) \neq \mathbb{R}^2. $$

The second vector gives no new direction, since

$$ v_2 = 2v_1. $$

It is redundant.
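A short numerical check: every combination of these two vectors lands on the line $y = 2x$, so a vector off that line, such as $(1, 0)$, is never produced. In Python:

```python
def combo(a, b):
    """a*(1, 2) + b*(2, 4)."""
    return (a + 2*b, 2*a + 4*b)

for a, b in [(1, 0), (0, 1), (3, -2), (0.5, 0.25)]:
    x, y = combo(a, b)
    assert y == 2*x          # stuck on the line y = 2x
    assert (x, y) != (1, 0)  # (1, 0) is not among these outputs
```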

19.10 Span and Systems of Equations

Span questions are equivalent to systems of linear equations.

Let

$$ A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{bmatrix}. $$

The columns are

$$ a_1 = \begin{bmatrix} 1 \\ 3 \\ 5 \end{bmatrix}, \qquad a_2 = \begin{bmatrix} 2 \\ 4 \\ 6 \end{bmatrix}. $$

A vector $b \in \mathbb{R}^3$ belongs to the column space of $A$ exactly when there exists $x \in \mathbb{R}^2$ such that

$$ Ax = b. $$

Writing

$$ x = \begin{bmatrix} x_1 \\ x_2 \end{bmatrix}, $$

we get

$$ Ax = x_1a_1 + x_2a_2. $$

Thus solving

$$ Ax = b $$

means deciding whether $b$ is a linear combination of the columns of $A$.

This is one of the central interpretations of matrix multiplication.
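This identity is easy to confirm for the matrix above. A Python sketch computing $Ax$ both ways, row by row and as a column combination:

```python
A = [[1, 2], [3, 4], [5, 6]]
x = [2, -1]

# dot-product-per-row definition of Ax
Ax = [sum(A[i][j] * x[j] for j in range(2)) for i in range(3)]

# column picture: x1*a1 + x2*a2
a1 = [A[i][0] for i in range(3)]
a2 = [A[i][1] for i in range(3)]
combo = [x[0]*a1[i] + x[1]*a2[i] for i in range(3)]

assert Ax == combo == [0, 2, 4]
```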

19.11 Column Space

The column space of a matrix $A$ is the span of its columns:

$$ \operatorname{Col}(A) = \operatorname{span}(a_1,\ldots,a_n). $$

If $A$ is an $m \times n$ matrix, then

$$ \operatorname{Col}(A) \subseteq \mathbb{R}^m. $$

The column space is the set of all possible outputs of the transformation

$$ x \mapsto Ax. $$

Therefore

$$ Ax = b $$

has a solution if and only if

$$ b \in \operatorname{Col}(A). $$

This gives a geometric interpretation of consistency for linear systems.
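For the $3 \times 2$ example of the previous section, consistency can be tested directly: solve the first two equations (their $2 \times 2$ coefficient block happens to be invertible, a feature of this particular $A$) and check whether the third holds. A sketch, with `in_column_space` an illustrative name:

```python
def in_column_space(b):
    """Is b in Col(A) for A = [[1,2],[3,4],[5,6]]?  Solve rows 1-2, test row 3."""
    det = 1*4 - 2*3               # determinant of the top 2x2 block (= -2)
    x1 = (4*b[0] - 2*b[1]) / det  # Cramer's rule on the first two equations
    x2 = (1*b[1] - 3*b[0]) / det
    return abs(5*x1 + 6*x2 - b[2]) < 1e-9

assert in_column_space([1, 2, 3])      # reachable: (1, 2, 3) = 0.5 * a2
assert not in_column_space([0, 0, 1])  # the system Ax = b is inconsistent
```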

19.12 Spanning Sets

A set $S$ spans a vector space $V$ if

$$ \operatorname{span}(S) = V. $$

In this case, every vector in $V$ can be expressed as a linear combination of vectors from $S$.

For example,

$$ \left\{ \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \end{bmatrix} \right\} $$

spans $\mathbb{R}^2$.

The set

$$ \left\{ \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \end{bmatrix}, \begin{bmatrix} 1 \\ 1 \end{bmatrix} \right\} $$

also spans $\mathbb{R}^2$, but it contains a redundant vector because

$$ \begin{bmatrix} 1 \\ 1 \end{bmatrix} = \begin{bmatrix} 1 \\ 0 \end{bmatrix} + \begin{bmatrix} 0 \\ 1 \end{bmatrix}. $$

A spanning set can have redundant vectors. A basis cannot.

19.13 Redundancy

A vector in a spanning set is redundant if it can be written as a linear combination of the other vectors.

Suppose

$$ v_k \in \operatorname{span}(v_1,\ldots,v_{k-1}). $$

Then

$$ \operatorname{span}(v_1,\ldots,v_{k-1},v_k) = \operatorname{span}(v_1,\ldots,v_{k-1}). $$

Adding $v_k$ does not enlarge the span.

This observation is the bridge from spanning sets to bases. A basis is a spanning set with all redundancy removed.

19.14 Linear Combinations of Polynomials

Let

$$ p_1(x) = 1, \qquad p_2(x) = x, \qquad p_3(x) = x^2. $$

A general linear combination is

$$ a p_1(x) + b p_2(x) + c p_3(x) = a + bx + cx^2. $$

Thus

$$ \operatorname{span}(1, x, x^2) = P_2, $$

the vector space of all polynomials of degree at most $2$.

Now consider

$$ 1+x, \qquad 1-x. $$

Their span contains all polynomials of the form

$$ a(1+x) + b(1-x) = (a+b) + (a-b)x. $$

This is every polynomial of degree at most $1$. Therefore

$$ \operatorname{span}(1+x, 1-x) = P_1. $$
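Encoding a polynomial in $P_1$ by its coefficient pair makes this calculation executable. A Python sketch (the list encoding is an illustrative choice):

```python
p_plus, p_minus = [1, 1], [1, -1]  # coefficient pairs for 1 + x and 1 - x

def combo(a, b):
    """Coefficients of a*(1+x) + b*(1-x)."""
    return [a*u + b*v for u, v in zip(p_plus, p_minus)]

# Any c0 + c1*x is reached with a = (c0+c1)/2, b = (c0-c1)/2.
c0, c1 = 4, -6
a, b = (c0 + c1) / 2, (c0 - c1) / 2
assert combo(a, b) == [c0, c1]
```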

19.15 Linear Combinations of Matrices

Matrices of the same size form a vector space.

Let

$$ A = \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}, \quad B = \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}, \quad C = \begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix}, \quad D = \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix}. $$

A general linear combination is

$$ aA + bB + cC + dD = \begin{bmatrix} a & b \\ c & d \end{bmatrix}. $$

Therefore

$$ \operatorname{span}(A, B, C, D) = M_{2 \times 2}(\mathbb{R}). $$

These four matrices behave like the standard basis vectors for $2 \times 2$ matrices.
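The same entrywise check can be run in code. A sketch assembling $aA + bB + cC + dD$ from the four matrices above:

```python
E = [
    [[1, 0], [0, 0]],  # A
    [[0, 1], [0, 0]],  # B
    [[0, 0], [1, 0]],  # C
    [[0, 0], [0, 1]],  # D
]

def combo(a, b, c, d):
    """Entrywise sum a*A + b*B + c*C + d*D."""
    out = [[0, 0], [0, 0]]
    for coeff, M in zip((a, b, c, d), E):
        for i in range(2):
            for j in range(2):
                out[i][j] += coeff * M[i][j]
    return out

assert combo(1, 2, 3, 4) == [[1, 2], [3, 4]]  # each coefficient lands in its slot
```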

19.16 Linear Combinations of Functions

Function spaces also use the same idea.

Let

$$ f_1(x) = 1, \qquad f_2(x) = \cos x, \qquad f_3(x) = \sin x. $$

A general linear combination is

$$ a f_1(x) + b f_2(x) + c f_3(x) = a + b\cos x + c\sin x. $$

The span is

$$ \operatorname{span}(1, \cos x, \sin x) = \{a + b\cos x + c\sin x : a, b, c \in \mathbb{R}\}. $$

This is a three-dimensional subspace of the vector space of all real-valued functions.

The same formal definition applies even though the vectors are functions instead of columns.

19.17 Finite Linear Combinations

A linear combination always uses finitely many vectors.

If $S$ is an infinite subset of a vector space, then

$$ \operatorname{span}(S) $$

means the set of all finite linear combinations of vectors from $S$.

For example, the space of all polynomials is spanned by

$$ 1, x, x^2, x^3, \ldots. $$

Every individual polynomial uses only finitely many powers of $x$. For instance,

$$ 3 - 2x + 7x^5 $$

uses only

$$ 1,\ x,\ x^5. $$

Infinite sums require additional notions of convergence. Those belong to analysis, not elementary vector space theory.

19.18 Span and Dimension

The dimension of a span is the number of independent directions generated by the vectors.

A single nonzero vector spans a one-dimensional subspace.

Two nonparallel vectors span a two-dimensional subspace.

Three vectors in $\mathbb{R}^3$ may span:

| Case | Span |
| --- | --- |
| All zero | $\{0\}$ |
| All multiples of one nonzero vector | A line |
| Contained in one plane through the origin | A plane |
| Independent directions | $\mathbb{R}^3$ |

Thus the number of vectors alone does not determine the dimension of the span. Their relationships determine it.
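These cases can be distinguished in code: the dimension of the span equals the rank of the matrix whose rows are the given vectors. A minimal Gaussian-elimination sketch (`rank` here is an illustrative implementation, not library code):

```python
def rank(vectors, tol=1e-9):
    """Dimension of span(vectors), by Gaussian elimination on row copies."""
    rows = [list(map(float, v)) for v in vectors]
    r = 0  # number of pivots found so far
    for col in range(len(rows[0])):
        pivot = next((i for i in range(r, len(rows)) if abs(rows[i][col]) > tol), None)
        if pivot is None:
            continue  # no new direction in this coordinate
        rows[r], rows[pivot] = rows[pivot], rows[r]
        for i in range(len(rows)):
            if i != r:
                f = rows[i][col] / rows[r][col]
                rows[i] = [x - f*y for x, y in zip(rows[i], rows[r])]
        r += 1
    return r

assert rank([[0, 0, 0], [0, 0, 0], [0, 0, 0]]) == 0    # all zero
assert rank([[2, 1, 0], [4, 2, 0], [-2, -1, 0]]) == 1  # a line
assert rank([[1, 2, 0], [-1, 1, 3], [0, 3, 3]]) == 2   # a plane
assert rank([[1, 0, 0], [0, 1, 0], [0, 0, 1]]) == 3    # all of R^3
```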

19.19 Span and Linear Independence

Span and linear independence are complementary ideas.

Span asks whether the vectors are enough.

Linear independence asks whether the vectors are necessary.

A list of vectors can fail in two ways:

| Property | Failure |
| --- | --- |
| Spanning | The list misses some directions |
| Linear independence | The list contains redundant directions |

A basis has both properties. It spans the space and has no redundancy.

This makes span the first half of the basis concept.

19.20 Summary

A linear combination is a sum of scalar multiples of vectors. The span of a set of vectors is the collection of all such linear combinations.

The key ideas are:

| Concept | Meaning |
| --- | --- |
| Linear combination | $c_1v_1 + \cdots + c_kv_k$ |
| Coefficients | Scalars used in a linear combination |
| Span | Set of all linear combinations |
| Spanning set | A set whose span is the whole space |
| Redundant vector | A vector already in the span of the others |
| Column space | Span of matrix columns |
| Generated subspace | Another name for a span |

Span gives the language for describing what vectors can be built from a given set. It turns linear algebra into a study of generation, redundancy, dimension, and structure.