
Chapter 20. Linear Independence

Linear independence measures whether a list of vectors contains redundancy. A list is linearly independent when no vector in the list can be built from the others. It is linearly dependent when at least one vector is unnecessary for generating the same span. Equivalently, vectors $v_1,\ldots,v_k$ are linearly independent when the only solution of $c_1v_1+\cdots+c_kv_k=0$ is the trivial solution $c_1=\cdots=c_k=0$.

Linear independence is the second half of the basis concept. Span asks whether the vectors are enough to generate a space. Linear independence asks whether any of them can be removed.

20.1 Linear Relations

Let $V$ be a vector space over a field $F$. Let

$$v_1,\ldots,v_k \in V.$$

A linear relation among these vectors is an equation of the form

$$c_1v_1+\cdots+c_kv_k=0,$$

where

$$c_1,\ldots,c_k \in F.$$

There is always at least one such relation, namely

$$0v_1+\cdots+0v_k=0.$$

This is called the trivial relation.

A nontrivial relation is a relation in which at least one coefficient is not zero.

Linear dependence means that a nontrivial relation exists. Linear independence means that no nontrivial relation exists.

20.2 Definition of Linear Independence

The vectors

$$v_1,\ldots,v_k$$

are linearly independent if the equation

$$c_1v_1+\cdots+c_kv_k=0$$

implies

$$c_1=\cdots=c_k=0.$$

They are linearly dependent if there exist scalars, not all zero, such that

$$c_1v_1+\cdots+c_kv_k=0.$$

Thus the test for independence is always a homogeneous equation.

The right-hand side is the zero vector because independence concerns internal relations among the listed vectors. It does not ask whether an outside vector can be produced. That is a question about span.

20.3 Dependence as Redundancy

Suppose

$$c_1v_1+\cdots+c_kv_k=0$$

is a nontrivial relation. Assume $c_j \neq 0$. Then

$$c_jv_j = -\sum_{i \neq j} c_iv_i.$$

Dividing by $c_j$,

$$v_j = -\sum_{i \neq j} \frac{c_i}{c_j}v_i.$$

Thus $v_j$ is a linear combination of the other vectors.

This proves the basic meaning of dependence:

A list of vectors is linearly dependent exactly when one vector in the list can be written as a linear combination of the others.

This is the formal version of redundancy.

20.4 First Examples

In $\mathbb{R}^2$, the vectors

$$e_1=\begin{bmatrix} 1\\ 0 \end{bmatrix}, \qquad e_2=\begin{bmatrix} 0\\ 1 \end{bmatrix}$$

are linearly independent.

To see this, suppose

$$c_1e_1+c_2e_2=0.$$

Then

$$c_1\begin{bmatrix} 1\\ 0 \end{bmatrix} + c_2\begin{bmatrix} 0\\ 1 \end{bmatrix} = \begin{bmatrix} c_1\\ c_2 \end{bmatrix} = \begin{bmatrix} 0\\ 0 \end{bmatrix}.$$

Therefore

$$c_1=0, \qquad c_2=0.$$

Only the trivial relation exists.

Now consider

$$v_1=\begin{bmatrix} 1\\ 2 \end{bmatrix}, \qquad v_2=\begin{bmatrix} 2\\ 4 \end{bmatrix}.$$

These vectors are linearly dependent because

$$v_2=2v_1.$$

Equivalently,

$$2v_1-v_2=0.$$

This is a nontrivial relation.
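
Both examples can be checked numerically. The following is a minimal sketch in NumPy, not part of the chapter's development: it uses the rank criterion made precise in Section 20.11, namely that $k$ column vectors are independent exactly when the matrix they form has rank $k$.

```python
import numpy as np

# Columns of each matrix are the vectors under test.
E = np.array([[1, 0],
              [0, 1]])   # e1, e2
V = np.array([[1, 2],
              [2, 4]])   # v1, v2 = 2*v1

# k column vectors are independent exactly when the rank equals k.
print(np.linalg.matrix_rank(E))  # 2: independent
print(np.linalg.matrix_rank(V))  # 1: dependent
```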

20.5 One Vector

A list containing one vector,

$$(v),$$

is linearly independent exactly when

$$v \neq 0.$$

Indeed, the independence equation is

$$cv=0.$$

If $v \neq 0$, then this forces

$$c=0.$$

If $v=0$, then

$$1v=0,$$

so a nontrivial relation exists.

Thus a one-vector list is dependent precisely when the vector is the zero vector.

20.6 Two Vectors

Two vectors $u$ and $v$ are linearly dependent exactly when one is a scalar multiple of the other.

If

$$v=cu$$

for some scalar $c$, then

$$cu-v=0,$$

so the two vectors are dependent.

Conversely, suppose

$$au+bv=0$$

with at least one coefficient nonzero. If $b \neq 0$, then

$$v=-\frac{a}{b}u.$$

If $a \neq 0$, then

$$u=-\frac{b}{a}v.$$

Thus dependence of two vectors means that they lie on a common line through the origin.

In $\mathbb{R}^2$ or $\mathbb{R}^3$, two nonzero vectors are independent exactly when they are not parallel.
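
In $\mathbb{R}^2$ the parallel test reduces to a single determinant: $u$ and $v$ are dependent exactly when $u_1v_2-u_2v_1=0$. A short sketch (the function name is my own):

```python
import numpy as np

def dependent_2d(u, v):
    # In R^2, u and v are dependent exactly when det[u v] = 0.
    return np.isclose(u[0] * v[1] - u[1] * v[0], 0.0)

print(dependent_2d(np.array([1.0, 2.0]), np.array([2.0, 4.0])))  # True
print(dependent_2d(np.array([1.0, 0.0]), np.array([0.0, 1.0])))  # False
```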

20.7 Three Vectors in $\mathbb{R}^3$

Three vectors in $\mathbb{R}^3$ are independent when they do not lie in a common plane through the origin.

For example,

$$e_1=\begin{bmatrix} 1\\ 0\\ 0 \end{bmatrix}, \qquad e_2=\begin{bmatrix} 0\\ 1\\ 0 \end{bmatrix}, \qquad e_3=\begin{bmatrix} 0\\ 0\\ 1 \end{bmatrix}$$

are linearly independent.

If

$$c_1e_1+c_2e_2+c_3e_3=0,$$

then

$$\begin{bmatrix} c_1\\ c_2\\ c_3 \end{bmatrix} = \begin{bmatrix} 0\\ 0\\ 0 \end{bmatrix}.$$

Therefore

$$c_1=c_2=c_3=0.$$

But the vectors

$$\begin{bmatrix} 1\\ 0\\ 0 \end{bmatrix}, \qquad \begin{bmatrix} 0\\ 1\\ 0 \end{bmatrix}, \qquad \begin{bmatrix} 1\\ 1\\ 0 \end{bmatrix}$$

are dependent, since

$$\begin{bmatrix} 1\\ 1\\ 0 \end{bmatrix} = \begin{bmatrix} 1\\ 0\\ 0 \end{bmatrix} + \begin{bmatrix} 0\\ 1\\ 0 \end{bmatrix}.$$

The third vector lies in the plane spanned by the first two.

20.8 The Zero Vector in a List

Any list containing the zero vector is linearly dependent.

Suppose

$$v_j=0.$$

Choose

$$c_j=1$$

and set all other coefficients equal to zero. Then

$$0v_1+\cdots+1v_j+\cdots+0v_k=0.$$

This is a nontrivial relation because $c_j=1$.

Therefore no linearly independent list can contain the zero vector.

This observation is often the fastest way to identify dependence.

20.9 Repeated Vectors

Any list containing the same vector twice is linearly dependent.

Suppose

$$v_i=v_j$$

with $i \neq j$. Then

$$v_i-v_j=0.$$

This is a nontrivial relation.

More generally, if one vector is a scalar multiple of another, then the list is dependent.

Repeated vectors create immediate redundancy because one copy gives no new direction.

20.10 More Vectors Than Dimension

In a finite-dimensional vector space of dimension $n$, any list of more than $n$ vectors is linearly dependent.

For example, any three vectors in $\mathbb{R}^2$ are dependent. Any four vectors in $\mathbb{R}^3$ are dependent.

The reason is that a space of dimension $n$ has only $n$ independent directions. Once $n$ independent directions have been chosen, every additional vector lies in their span.

This theorem is one of the basic consequences of dimension.
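
For instance, three vectors in $\mathbb{R}^2$ give a homogeneous system with more unknowns than equations, so a nontrivial relation must exist. A sketch that finds one numerically (the choice of vectors is mine; the last right singular vector spans the null space here):

```python
import numpy as np

# Columns: e1, e2, e1 + e2 -- three vectors in R^2.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

# A 2x3 homogeneous system has a free variable, so a nontrivial
# solution exists; the last right singular vector provides one.
c = np.linalg.svd(A)[2][-1]
print(c)      # proportional to (1, 1, -1), up to sign and scale
print(A @ c)  # numerically the zero vector
```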

20.11 Testing Independence by Row Reduction

Suppose vectors in $\mathbb{R}^m$ are given as columns of a matrix:

$$A = \begin{bmatrix} | & | & & | \\ v_1 & v_2 & \cdots & v_k \\ | & | & & | \end{bmatrix}.$$

The vectors are linearly independent exactly when the homogeneous system

$$Ac=0$$

has only the trivial solution

$$c=0.$$

Here

$$c=\begin{bmatrix} c_1\\ \vdots\\ c_k \end{bmatrix}.$$

To test independence, row reduce $A$. If every column is a pivot column, then the vectors are independent. If at least one column is free, then the vectors are dependent.

This follows because free variables in the homogeneous system produce nontrivial solutions.
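
This test is easy to automate with exact arithmetic. A minimal sketch using SymPy, whose rref method returns the reduced matrix together with the tuple of pivot column indices (the helper name are_independent is my own):

```python
from sympy import Matrix

def are_independent(vectors):
    """True when the given column vectors are linearly independent."""
    A = Matrix.hstack(*[Matrix(v) for v in vectors])
    # rref() returns (reduced matrix, pivot column indices); the
    # columns are independent exactly when every column is a pivot.
    _, pivots = A.rref()
    return len(pivots) == A.cols
```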

20.12 Example by Row Reduction

Consider

$$v_1=\begin{bmatrix} 1\\ 2\\ 1 \end{bmatrix}, \qquad v_2=\begin{bmatrix} 0\\ 1\\ 3 \end{bmatrix}, \qquad v_3=\begin{bmatrix} 2\\ 5\\ 7 \end{bmatrix}.$$

Place them as columns:

$$A=\begin{bmatrix} 1 & 0 & 2\\ 2 & 1 & 5\\ 1 & 3 & 7 \end{bmatrix}.$$

We solve

$$Ac=0.$$

Row reduce:

$$\begin{bmatrix} 1 & 0 & 2\\ 2 & 1 & 5\\ 1 & 3 & 7 \end{bmatrix} \to \begin{bmatrix} 1 & 0 & 2\\ 0 & 1 & 1\\ 0 & 3 & 5 \end{bmatrix} \to \begin{bmatrix} 1 & 0 & 2\\ 0 & 1 & 1\\ 0 & 0 & 2 \end{bmatrix}.$$

There is a pivot in every column. Therefore the homogeneous system has only the trivial solution.

The vectors are linearly independent.
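
Applying the are_independent sketch from Section 20.11 to this example confirms the hand computation:

```python
# Assumes the are_independent helper sketched in Section 20.11.
print(are_independent([[1, 2, 1], [0, 1, 3], [2, 5, 7]]))  # True
```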

20.13 Example of Dependence by Row Reduction

Consider

$$v_1=\begin{bmatrix} 1\\ 2\\ 3 \end{bmatrix}, \qquad v_2=\begin{bmatrix} 2\\ 4\\ 6 \end{bmatrix}, \qquad v_3=\begin{bmatrix} 1\\ 0\\ 1 \end{bmatrix}.$$

Since

$$v_2=2v_1,$$

the list is already dependent. Row reduction shows the same fact.

Place the vectors as columns:

$$A=\begin{bmatrix} 1 & 2 & 1\\ 2 & 4 & 0\\ 3 & 6 & 1 \end{bmatrix}.$$

The second column is twice the first column. Hence not every column can be a pivot column.

Therefore

$$Ac=0$$

has a nontrivial solution, for example

$$c=\begin{bmatrix} -2\\ 1\\ 0 \end{bmatrix}.$$

Indeed,

$$-2v_1+v_2+0v_3=0.$$
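
SymPy can produce such a solution directly: nullspace() returns a basis of the solution set of $Ac=0$, and here it recovers the relation above.

```python
from sympy import Matrix

A = Matrix([[1, 2, 1],
            [2, 4, 0],
            [3, 6, 1]])

# nullspace() returns a list of column vectors spanning {c : Ac = 0}.
print(A.nullspace())  # [Matrix([[-2], [1], [0]])]
```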

20.14 Independence and Span Growth

A useful way to understand independence is by span growth.

A list

$$v_1,\ldots,v_k$$

is linearly independent exactly when each new vector enlarges the span of the previous vectors. Equivalently,

$$v_j \notin \operatorname{span}(v_1,\ldots,v_{j-1})$$

for every

$$j=1,\ldots,k.$$

For $j=1$, the span of the empty list is $\{0\}$, so the condition means

$$v_1 \neq 0.$$

If some vector lies in the span of earlier vectors, then adding it does not create a new direction. The list is dependent.

This viewpoint is often more intuitive than the coefficient equation.
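
This viewpoint also suggests an algorithm: append the vectors one at a time and check whether the rank grows. A sketch in NumPy (the helper name is mine):

```python
import numpy as np

def first_redundant(vectors):
    """Index of the first vector lying in the span of its
    predecessors, or None when the whole list is independent."""
    for j in range(len(vectors)):
        A = np.column_stack(vectors[: j + 1])
        # The rank stays below j + 1 exactly when vectors[j] failed
        # to enlarge the span (for j = 0 this catches the zero vector).
        if np.linalg.matrix_rank(A) < j + 1:
            return j
    return None

print(first_redundant([[1, 0, 0], [0, 1, 0], [1, 1, 0]]))  # 2
```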

20.15 Removing Redundant Vectors

If a list is linearly dependent, then at least one vector can be removed without changing the span.

Suppose

$$v_j = c_1v_1+\cdots+c_{j-1}v_{j-1}+c_{j+1}v_{j+1}+\cdots+c_kv_k.$$

Then every linear combination using $v_j$ can be rewritten using the other vectors.

Therefore

$$\operatorname{span}(v_1,\ldots,v_k) = \operatorname{span}(v_1,\ldots,\widehat{v_j},\ldots,v_k),$$

where the hat means that $v_j$ is omitted.

This process removes redundancy from a spanning set. Repeating it eventually produces a basis in finite-dimensional spaces.
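
The removal step can be automated the same way: keep a vector only when it enlarges the span of the vectors kept so far. A minimal sketch (helper name mine); the survivors are independent and span the same space, so they form a basis of that span.

```python
import numpy as np

def prune_to_basis(vectors):
    """Drop each vector that lies in the span of those already kept."""
    kept = []
    for v in vectors:
        # Keeping v is justified exactly when it raises the rank.
        if np.linalg.matrix_rank(np.column_stack(kept + [v])) > len(kept):
            kept.append(v)
    return kept

print(prune_to_basis([[1, 2, 3], [2, 4, 6], [1, 0, 1]]))
# [[1, 2, 3], [1, 0, 1]] -- the redundant v2 = 2*v1 is dropped
```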

20.16 Adding Independent Vectors

If a list is linearly independent and a vector $w$ does not lie in its span, then the enlarged list

$$v_1,\ldots,v_k,w$$

is linearly independent.

To prove this, suppose

$$c_1v_1+\cdots+c_kv_k+dw=0.$$

If $d \neq 0$, then

$$w=-\frac{1}{d}(c_1v_1+\cdots+c_kv_k),$$

so $w$ lies in

$$\operatorname{span}(v_1,\ldots,v_k).$$

This contradicts the assumption. Hence

$$d=0.$$

Then

$$c_1v_1+\cdots+c_kv_k=0.$$

Since the original list is independent,

$$c_1=\cdots=c_k=0.$$

Thus the enlarged list is independent.
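
In computational terms, $w$ extends an independent list exactly when appending it as a column raises the rank by one. A brief sketch (helper name mine):

```python
import numpy as np

def extends_independently(vectors, w):
    # w lies outside span(vectors) exactly when appending it
    # as a column increases the rank.
    before = np.linalg.matrix_rank(np.column_stack(vectors))
    after = np.linalg.matrix_rank(np.column_stack(vectors + [w]))
    return after == before + 1

print(extends_independently([[1, 0, 0], [0, 1, 0]], [0, 0, 1]))  # True
print(extends_independently([[1, 0, 0], [0, 1, 0]], [1, 1, 0]))  # False
```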

20.17 Independence of Polynomials

Consider the polynomials

$$1,\ x,\ x^2,\ \ldots,\ x^n.$$

They are linearly independent in $P_n$.

Suppose

$$c_0 + c_1x + c_2x^2+\cdots+c_nx^n = 0$$

as a polynomial.

The zero polynomial has all coefficients equal to zero. Therefore

$$c_0=c_1=\cdots=c_n=0.$$

So the list is linearly independent.

This fact gives the standard basis of $P_n$:

$$(1,x,x^2,\ldots,x^n).$$
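
The pivot test reduces polynomial questions to column vectors: identify each polynomial in $P_n$ with its coefficient vector relative to this standard basis. A hypothetical example in $P_2$, checking $1+x$, $1-x$, $x^2$ (my own choice of polynomials, not from the text):

```python
from sympy import Matrix

# Columns: coefficient vectors (constant, x, x^2) of 1+x, 1-x, x^2.
A = Matrix([[1,  1, 0],
            [1, -1, 0],
            [0,  0, 1]])

# Every column is a pivot column, so the three polynomials
# are linearly independent in P_2.
print(A.rref()[1])  # (0, 1, 2)
```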

20.18 Independence of Functions

Functions can also be linearly independent.

For example, $\sin x$ and $\cos x$ are linearly independent as real-valued functions.

Suppose

$$a\sin x+b\cos x=0$$

for every real $x$.

Set $x=0$. Then

$$a\sin 0+b\cos 0=0$$

gives

$$b=0.$$

Set $x=\frac{\pi}{2}$. Then

$$a\sin \frac{\pi}{2}+b\cos \frac{\pi}{2}=0$$

gives

$$a=0.$$

Thus only the trivial relation exists.

Therefore $\sin x$ and $\cos x$ are linearly independent.
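
The two evaluation points amount to a $2\times 2$ linear system in $a$ and $b$; the proof works because the evaluation matrix is invertible. A numeric restatement of that observation:

```python
import numpy as np

# Rows evaluate [sin x, cos x] at the sample points x = 0 and x = pi/2.
E = np.array([[np.sin(0.0),       np.cos(0.0)],
              [np.sin(np.pi / 2), np.cos(np.pi / 2)]])

# det(E) is (numerically) nonzero, so E @ [a, b] = 0 forces a = b = 0.
print(np.linalg.det(E))  # approximately -1.0
```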

20.19 Infinite Sets

An infinite set of vectors is linearly independent if every finite subset is linearly independent. It is linearly dependent if at least one finite subset is linearly dependent.

For example, the infinite set

$$\{1,x,x^2,x^3,\ldots\}$$

is linearly independent in the vector space of all real polynomials.

Any finite linear relation has the form

$$c_0+c_1x+\cdots+c_nx^n=0.$$

As before, all coefficients must be zero.

Thus no finite nontrivial relation exists.

20.20 Summary

Linear independence formalizes the absence of redundancy. A list of vectors is independent when the zero vector has only the trivial representation as a linear combination of the list.

The key ideas are:

Linear relation: $c_1v_1+\cdots+c_kv_k=0$
Trivial relation: all coefficients are zero
Nontrivial relation: at least one coefficient is nonzero
Linearly independent: only the trivial relation exists
Linearly dependent: a nontrivial relation exists
Redundant vector: a vector in the span of the others
Pivot column test: independent exactly when every column is a pivot column
Dimension limit: more than $n$ vectors in an $n$-dimensional space are dependent

Span and independence work together. Span says the vectors generate enough. Independence says they contain no excess. A basis is precisely a list that satisfies both conditions.