# Chapter 20. Linear Independence

Linear independence measures whether a list of vectors contains redundancy. A list is linearly independent when no vector in the list can be built from the others. It is linearly dependent when at least one vector is unnecessary for generating the same span. Equivalently, vectors \(v_1,\ldots,v_k\) are linearly independent when the only solution of \(c_1v_1+\cdots+c_kv_k=0\) is the trivial solution \(c_1=\cdots=c_k=0\).

Linear independence is the second half of the basis concept. Span asks whether the vectors are enough to generate a space. Linear independence asks whether any of them can be removed without changing the span.

## 20.1 Linear Relations

Let \(V\) be a vector space over a field \(F\). Let

$$
v_1,\ldots,v_k \in V.
$$

A linear relation among these vectors is an equation of the form

$$
c_1v_1+\cdots+c_kv_k=0,
$$

where

$$
c_1,\ldots,c_k \in F.
$$

There is always at least one such relation, namely

$$
0v_1+\cdots+0v_k=0.
$$

This is called the trivial relation.

A nontrivial relation is a relation in which at least one coefficient is not zero.

Linear dependence means that a nontrivial relation exists. Linear independence means that no nontrivial relation exists.

## 20.2 Definition of Linear Independence

The vectors

$$
v_1,\ldots,v_k
$$

are linearly independent if the equation

$$
c_1v_1+\cdots+c_kv_k=0
$$

implies

$$
c_1=\cdots=c_k=0.
$$

They are linearly dependent if there exist scalars, not all zero, such that

$$
c_1v_1+\cdots+c_kv_k=0.
$$

Thus the test for independence is always a homogeneous equation.

The right-hand side is the zero vector because independence concerns internal relations among the listed vectors. It does not ask whether an outside vector can be produced. That is a question about span.

## 20.3 Dependence as Redundancy

Suppose

$$
c_1v_1+\cdots+c_kv_k=0
$$

is a nontrivial relation. Assume \(c_j \neq 0\). Then

$$
c_jv_j =
-\sum_{i \neq j} c_iv_i.
$$

Dividing by \(c_j\),

$$
v_j =
-\sum_{i \neq j} \frac{c_i}{c_j}v_i.
$$

Thus \(v_j\) is a linear combination of the other vectors.

This proves the basic meaning of dependence:

A list of vectors is linearly dependent exactly when one vector in the list can be written as a linear combination of the others.

This is the formal version of redundancy.

## 20.4 First Examples

In \(\mathbb{R}^2\), the vectors

$$
e_1=
\begin{bmatrix}
1\\
0
\end{bmatrix},
\qquad
e_2=
\begin{bmatrix}
0\\
1
\end{bmatrix}
$$

are linearly independent.

To see this, suppose

$$
c_1e_1+c_2e_2=0.
$$

Then

$$
c_1
\begin{bmatrix}
1\\
0
\end{bmatrix}
+
c_2
\begin{bmatrix}
0\\
1
\end{bmatrix} =
\begin{bmatrix}
c_1\\
c_2
\end{bmatrix} =
\begin{bmatrix}
0\\
0
\end{bmatrix}.
$$

Therefore

$$
c_1=0,
\qquad
c_2=0.
$$

Only the trivial relation exists.

Now consider

$$
v_1=
\begin{bmatrix}
1\\
2
\end{bmatrix},
\qquad
v_2=
\begin{bmatrix}
2\\
4
\end{bmatrix}.
$$

These vectors are linearly dependent because

$$
v_2=2v_1.
$$

Equivalently,

$$
2v_1-v_2=0.
$$

This is a nontrivial relation.
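
For readers who want to check such examples numerically, here is a minimal sketch in Python with NumPy (not part of the text itself); it anticipates the matrix test of Section 20.11 by comparing the rank of the column matrix with the number of columns.

```python
import numpy as np

# Columns of the first matrix are e1, e2; columns of the second are v1, v2 = 2*v1.
E = np.array([[1.0, 0.0],
              [0.0, 1.0]])
V = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# Independent exactly when the rank equals the number of columns (Section 20.11).
print(np.linalg.matrix_rank(E) == E.shape[1])   # True  -> e1, e2 independent
print(np.linalg.matrix_rank(V) == V.shape[1])   # False -> v1, v2 dependent
```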

## 20.5 One Vector

A list containing one vector,

$$
(v),
$$

is linearly independent exactly when

$$
v \neq 0.
$$

Indeed, the independence equation is

$$
cv=0.
$$

If \(v \neq 0\), then this forces

$$
c=0.
$$

If \(v=0\), then

$$
1v=0,
$$

so a nontrivial relation exists.

Thus a one-vector list is dependent precisely when the vector is the zero vector.

## 20.6 Two Vectors

Two vectors \(u\) and \(v\) are linearly dependent exactly when one is a scalar multiple of the other.

If

$$
v=cu
$$

for some scalar \(c\), then

$$
cu-v=0,
$$

so the two vectors are dependent.

Conversely, suppose

$$
au+bv=0
$$

with at least one coefficient nonzero. If \(b \neq 0\), then

$$
v=-\frac{a}{b}u.
$$

If \(a \neq 0\), then

$$
u=-\frac{b}{a}v.
$$

Thus dependence of two vectors means that they lie on a common line through the origin.

In \(\mathbb{R}^2\) or \(\mathbb{R}^3\), two nonzero vectors are independent exactly when they are not parallel.
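
A direct computational version of this test is sketched below (assuming NumPy; the helper name is my own). For nonzero vectors, all the cross terms \(u_iv_j-u_jv_i\) vanish exactly when one vector is a scalar multiple of the other.

```python
import numpy as np

def is_scalar_multiple(u, v, tol=1e-12):
    """True when the pair (u, v) is linearly dependent, i.e. when one vector
    is a scalar multiple of the other (a list containing 0 is always dependent)."""
    u = np.asarray(u, dtype=float)
    v = np.asarray(v, dtype=float)
    if np.allclose(u, 0, atol=tol) or np.allclose(v, 0, atol=tol):
        return True
    # All cross terms u_i*v_j - u_j*v_i vanish exactly when u and v are parallel.
    return np.allclose(np.outer(u, v), np.outer(v, u), atol=tol)

print(is_scalar_multiple([1, 2], [2, 4]))   # True  -> dependent (v = 2u)
print(is_scalar_multiple([1, 0], [0, 1]))   # False -> independent
```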

## 20.7 Three Vectors in \(\mathbb{R}^3\)

Three vectors in \(\mathbb{R}^3\) are linearly independent exactly when they do not lie in a common plane through the origin.

For example,

$$
e_1=
\begin{bmatrix}
1\\
0\\
0
\end{bmatrix},
\qquad
e_2=
\begin{bmatrix}
0\\
1\\
0
\end{bmatrix},
\qquad
e_3=
\begin{bmatrix}
0\\
0\\
1
\end{bmatrix}
$$

are linearly independent.

If

$$
c_1e_1+c_2e_2+c_3e_3=0,
$$

then

$$
\begin{bmatrix}
c_1\\
c_2\\
c_3
\end{bmatrix} =
\begin{bmatrix}
0\\
0\\
0
\end{bmatrix}.
$$

Therefore

$$
c_1=c_2=c_3=0.
$$

But the vectors

$$
\begin{bmatrix}
1\\
0\\
0
\end{bmatrix},
\qquad
\begin{bmatrix}
0\\
1\\
0
\end{bmatrix},
\qquad
\begin{bmatrix}
1\\
1\\
0
\end{bmatrix}
$$

are dependent, since

$$
\begin{bmatrix}
1\\
1\\
0
\end{bmatrix} =
\begin{bmatrix}
1\\
0\\
0
\end{bmatrix}
+
\begin{bmatrix}
0\\
1\\
0
\end{bmatrix}.
$$

The third vector lies in the plane spanned by the first two.
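
Both conclusions can be confirmed numerically (a sketch assuming NumPy, again anticipating the column test of Section 20.11): the first list has rank 3, the second only rank 2.

```python
import numpy as np

E = np.column_stack(([1, 0, 0], [0, 1, 0], [0, 0, 1]))   # e1, e2, e3 as columns
B = np.column_stack(([1, 0, 0], [0, 1, 0], [1, 1, 0]))   # third column = first + second

print(np.linalg.matrix_rank(E))   # 3 -> independent
print(np.linalg.matrix_rank(B))   # 2 -> dependent
```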

## 20.8 The Zero Vector in a List

Any list containing the zero vector is linearly dependent.

Suppose

$$
v_j=0.
$$

Choose

$$
c_j=1
$$

and set all other coefficients equal to zero. Then

$$
0v_1+\cdots+1v_j+\cdots+0v_k=0.
$$

This is a nontrivial relation because \(c_j=1\).

Therefore no linearly independent list can contain the zero vector.

This observation is often the fastest way to identify dependence.

## 20.9 Repeated Vectors

Any list containing the same vector twice is linearly dependent.

Suppose

$$
v_i=v_j
$$

with \(i \neq j\). Then

$$
v_i-v_j=0.
$$

This is a nontrivial relation.

More generally, if one vector is a scalar multiple of another, then the list is dependent.

Repeated vectors create immediate redundancy because the second copy contributes no new direction.

## 20.10 More Vectors Than Dimension

In a finite-dimensional vector space of dimension \(n\), any list of more than \(n\) vectors is linearly dependent.

For example, any three vectors in \(\mathbb{R}^2\) are dependent. Any four vectors in \(\mathbb{R}^3\) are dependent.

The reason is that a space of dimension \(n\) has only \(n\) independent directions. Once \(n\) independent directions have been chosen, every additional vector lies in their span. For vectors in \(\mathbb{R}^n\), the same conclusion also follows from the row-reduction test of the next section: a matrix with more columns than rows cannot have a pivot in every column.

This theorem is one of the basic consequences of dimension.
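
A small sketch (assuming NumPy) illustrates the statement for three specific vectors in \(\mathbb{R}^2\): since the first two vectors chosen here already span the plane, the third is forced to be a combination of them, which yields a nontrivial relation.

```python
import numpy as np

# Three vectors in R^2: the first two are independent and already span R^2,
# so the third must be a combination of them; a 2x2 system exhibits the relation.
v1, v2, v3 = np.array([1.0, 2.0]), np.array([0.0, 1.0]), np.array([3.0, 1.0])
a, b = np.linalg.solve(np.column_stack((v1, v2)), v3)   # v3 = a*v1 + b*v2
print(a, b)                                             # 3.0 -5.0
print(np.allclose(a * v1 + b * v2 - v3, 0))             # True: 3*v1 - 5*v2 - v3 = 0
```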

## 20.11 Testing Independence by Row Reduction

Suppose vectors in \(\mathbb{R}^m\) are given as columns of a matrix:

$$
A =
\begin{bmatrix}
| & | & & | \\
v_1 & v_2 & \cdots & v_k \\
| & | & & |
\end{bmatrix}.
$$

The vectors are linearly independent exactly when the homogeneous system

$$
Ac=0
$$

has only the trivial solution

$$
c=0.
$$

Here

$$
c=
\begin{bmatrix}
c_1\\
\vdots\\
c_k
\end{bmatrix}.
$$

To test independence, row reduce \(A\). If every column is a pivot column, then the vectors are independent. If at least one column is free, then the vectors are dependent.

This follows because free variables in the homogeneous system produce nontrivial solutions.
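
In practice this test is easy to automate. The sketch below (assuming NumPy; `numpy.linalg.matrix_rank` counts nonzero singular values rather than literally row reducing, but the count agrees with the number of pivot columns) reports independence exactly when the rank of \(A\) equals the number of columns.

```python
import numpy as np

def columns_independent(A, tol=None):
    """Columns of A are linearly independent exactly when rank(A) equals the
    number of columns, i.e. when every column is a pivot column."""
    A = np.asarray(A, dtype=float)
    return np.linalg.matrix_rank(A, tol=tol) == A.shape[1]

# Example: the standard basis vectors of R^3 as columns of the identity matrix.
print(columns_independent(np.eye(3)))                # True
print(columns_independent(np.array([[1.0, 2.0],
                                    [2.0, 4.0],
                                    [3.0, 6.0]])))   # False: second column = 2 * first
```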

## 20.12 Example by Row Reduction

Consider

$$
v_1=
\begin{bmatrix}
1\\
2\\
1
\end{bmatrix},
\qquad
v_2=
\begin{bmatrix}
0\\
1\\
3
\end{bmatrix},
\qquad
v_3=
\begin{bmatrix}
2\\
5\\
7
\end{bmatrix}.
$$

Place them as columns:

$$
A=
\begin{bmatrix}
1 & 0 & 2\\
2 & 1 & 5\\
1 & 3 & 7
\end{bmatrix}.
$$

We solve

$$
Ac=0.
$$

Row reduce:

$$
\begin{bmatrix}
1 & 0 & 2\\
2 & 1 & 5\\
1 & 3 & 7
\end{bmatrix}
\to
\begin{bmatrix}
1 & 0 & 2\\
0 & 1 & 1\\
0 & 3 & 5
\end{bmatrix}
\to
\begin{bmatrix}
1 & 0 & 2\\
0 & 1 & 1\\
0 & 0 & 2
\end{bmatrix}.
$$

There is a pivot in every column. Therefore the homogeneous system has only the trivial solution.

The vectors are linearly independent.
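
The same conclusion can be confirmed numerically, for example with `numpy.linalg.matrix_rank` (a sketch assuming NumPy).

```python
import numpy as np

A = np.array([[1.0, 0.0, 2.0],
              [2.0, 1.0, 5.0],
              [1.0, 3.0, 7.0]])

# Rank 3 means a pivot in every column, so Ac = 0 has only c = 0.
print(np.linalg.matrix_rank(A) == 3)   # True -> v1, v2, v3 independent
```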

## 20.13 Example of Dependence by Row Reduction

Consider

$$
v_1=
\begin{bmatrix}
1\\
2\\
3
\end{bmatrix},
\qquad
v_2=
\begin{bmatrix}
2\\
4\\
6
\end{bmatrix},
\qquad
v_3=
\begin{bmatrix}
1\\
0\\
1
\end{bmatrix}.
$$

Since

$$
v_2=2v_1,
$$

the list is already dependent. Row reduction shows the same fact.

Place the vectors as columns:

$$
A=
\begin{bmatrix}
1 & 2 & 1\\
2 & 4 & 0\\
3 & 6 & 1
\end{bmatrix}.
$$

The second column is twice the first column, and row operations preserve linear relations among the columns, so the second column can never become a pivot column.

Therefore

$$
Ac=0
$$

has a nontrivial solution, for example

$$
c=
\begin{bmatrix}
-2\\
1\\
0
\end{bmatrix}.
$$

Indeed,

$$
-2v_1+v_2+0v_3=0.
$$
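
A numerical check of this example (a sketch assuming NumPy) confirms both the rank deficiency and the specific relation.

```python
import numpy as np

A = np.array([[1.0, 2.0, 1.0],
              [2.0, 4.0, 0.0],
              [3.0, 6.0, 1.0]])
c = np.array([-2.0, 1.0, 0.0])

print(np.linalg.matrix_rank(A))   # 2: not every column is a pivot column
print(np.allclose(A @ c, 0))      # True: -2*v1 + v2 + 0*v3 = 0
```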

## 20.14 Independence and Span Growth

A useful way to understand independence is by span growth.

A list

$$
v_1,\ldots,v_k
$$

is linearly independent exactly when each new vector enlarges the span of the previous vectors. Equivalently,

$$
v_j \notin \operatorname{span}(v_1,\ldots,v_{j-1})
$$

for every

$$
j=1,\ldots,k.
$$

For \(j=1\), the span of the empty list is the zero subspace \(\{0\}\), so the condition means

$$
v_1 \neq 0.
$$

If some vector lies in the span of earlier vectors, then adding it does not create a new direction. The list is dependent.

This viewpoint is often more intuitive than the coefficient equation.

## 20.15 Removing Redundant Vectors

If a list is linearly dependent, then at least one vector can be removed without changing the span.

Suppose

$$
v_j =
c_1v_1+\cdots+c_{j-1}v_{j-1}+c_{j+1}v_{j+1}+\cdots+c_kv_k.
$$

Then every linear combination using \(v_j\) can be rewritten using the other vectors.

Therefore

$$
\operatorname{span}(v_1,\ldots,v_k) =
\operatorname{span}(v_1,\ldots,\widehat{v_j},\ldots,v_k),
$$

where the hat means that \(v_j\) is omitted.

This process removes redundancy from a spanning set. Repeating it eventually produces a basis in finite-dimensional spaces.
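
This pruning procedure translates directly into a short routine. The sketch below (assuming NumPy; the function name is my own) keeps a vector only when it enlarges the span of the vectors kept so far, so the surviving list is independent and spans the same subspace.

```python
import numpy as np

def prune_to_independent(vectors, tol=None):
    """Keep each vector only when it enlarges the span of those already kept.
    The result is an independent list with the same span as the input."""
    kept = []
    for v in vectors:
        v = np.asarray(v, dtype=float)
        candidate = np.column_stack(kept + [v])
        if np.linalg.matrix_rank(candidate, tol=tol) == len(kept) + 1:
            kept.append(v)
    return kept

# The dependent list from Section 20.13: v2 = 2*v1 is discarded, v1 and v3 remain.
print(prune_to_independent([[1, 2, 3], [2, 4, 6], [1, 0, 1]]))
```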

## 20.16 Adding Independent Vectors

If a list is linearly independent and a vector \(w\) does not lie in its span, then the enlarged list

$$
v_1,\ldots,v_k,w
$$

is linearly independent.

To prove this, suppose

$$
c_1v_1+\cdots+c_kv_k+dw=0.
$$

If \(d \neq 0\), then

$$
w=-\frac{1}{d}(c_1v_1+\cdots+c_kv_k),
$$

so \(w\) lies in

$$
\operatorname{span}(v_1,\ldots,v_k).
$$

This contradicts the assumption. Hence

$$
d=0.
$$

Then

$$
c_1v_1+\cdots+c_kv_k=0.
$$

Since the original list is independent,

$$
c_1=\cdots=c_k=0.
$$

Thus the enlarged list is independent.

## 20.17 Independence of Polynomials

Consider the polynomials

$$
1,\ x,\ x^2,\ \ldots,\ x^n.
$$

They are linearly independent in \(P_n\).

Suppose

$$
c_0 + c_1x + c_2x^2+\cdots+c_nx^n = 0
$$

as a polynomial.

The zero polynomial has all coefficients equal to zero. Therefore

$$
c_0=c_1=\cdots=c_n=0.
$$

So the list is linearly independent.

This fact gives the standard basis of \(P_n\):

$$
(1,x,x^2,\ldots,x^n).
$$
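
One way to check this computationally (a sketch assuming NumPy, and using a fact not proved here: a nonzero polynomial of degree at most \(n\) cannot vanish at \(n+1\) distinct points) is to evaluate the monomials at \(n+1\) distinct sample points and test the resulting Vandermonde matrix.

```python
import numpy as np

n = 4
x = np.linspace(0.0, 1.0, n + 1)                # n + 1 distinct sample points
V = np.vander(x, N=n + 1, increasing=True)      # columns: 1, x, x^2, ..., x^n at the samples

# Full column rank means the only way c0 + c1*x + ... + cn*x^n can vanish
# at all sample points (hence as a polynomial) is c0 = ... = cn = 0.
print(np.linalg.matrix_rank(V) == n + 1)        # True -> monomials independent
```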

## 20.18 Independence of Functions

Functions can also be linearly independent.

For example,

$$
\sin x
$$

and

$$
\cos x
$$

are linearly independent as real-valued functions.

Suppose

$$
a\sin x+b\cos x=0
$$

for every real \(x\).

Set \(x=0\). Then

$$
a\sin 0+b\cos 0=0
$$

gives

$$
b=0.
$$

Set

$$
x=\frac{\pi}{2}.
$$

Then

$$
a\sin \frac{\pi}{2}+b\cos \frac{\pi}{2}=0
$$

gives

$$
a=0.
$$

Thus only the trivial relation exists.

Therefore \(\sin x\) and \(\cos x\) are linearly independent.
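
The argument above only uses the relation at two sample points, and the same idea can be checked numerically (a sketch assuming NumPy): the \(2\times 2\) matrix of sampled values has full rank, so \(a=b=0\) is the only solution.

```python
import numpy as np

# If a*sin(x) + b*cos(x) = 0 for every x, it holds in particular at x = 0 and x = pi/2.
# Those two equations say M @ [a, b] = 0 for the matrix below, which has rank 2,
# so a = b = 0 is the only possibility.
M = np.array([[np.sin(0.0),       np.cos(0.0)],
              [np.sin(np.pi / 2), np.cos(np.pi / 2)]])
print(np.linalg.matrix_rank(M) == 2)   # True -> sin and cos are independent
```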

## 20.19 Infinite Sets

An infinite set of vectors is linearly independent if every finite subset is linearly independent. It is linearly dependent if at least one finite subset is linearly dependent.

For example, the infinite set

$$
\{1,x,x^2,x^3,\ldots\}
$$

is linearly independent in the vector space of all real polynomials.

Any finite linear relation has the form

$$
c_0+c_1x+\cdots+c_nx^n=0.
$$

As before, all coefficients must be zero.

Thus no finite nontrivial relation exists.

## 20.20 Summary

Linear independence formalizes the absence of redundancy. A list of vectors is independent when the zero vector has only the trivial representation as a linear combination of the list.

The key ideas are:

| Concept | Meaning |
|---|---|
| Linear relation | \(c_1v_1+\cdots+c_kv_k=0\) |
| Trivial relation | All coefficients are zero |
| Nontrivial relation | At least one coefficient is nonzero |
| Linearly independent | Only the trivial relation exists |
| Linearly dependent | A nontrivial relation exists |
| Redundant vector | A vector in the span of the others |
| Pivot column test | Independent exactly when every column is a pivot column |
| Dimension limit | More than \(n\) vectors in an \(n\)-dimensional space are dependent |

Span and independence work together. Span says the vectors generate enough. Independence says they contain no excess. A basis is precisely a list that satisfies both conditions.
