Linear independence measures whether a list of vectors contains redundancy. A list is linearly independent when no vector in the list can be built from the others. It is linearly dependent when at least one vector is unnecessary for generating the same span. Equivalently, vectors $v_1, \dots, v_n$ are linearly independent when the only solution of $c_1 v_1 + \cdots + c_n v_n = 0$ is the trivial solution $c_1 = \cdots = c_n = 0$.
Linear independence is the second half of the basis concept. Span asks whether the vectors are enough to generate a space. Linear independence asks whether any of them can be removed.
20.1 Linear Relations
Let $V$ be a vector space over a field $F$. Let
$$v_1, v_2, \dots, v_n \in V.$$
A linear relation among these vectors is an equation of the form
$$c_1 v_1 + c_2 v_2 + \cdots + c_n v_n = 0,$$
where
$$c_1, c_2, \dots, c_n \in F.$$
There is always at least one such relation, namely
$$0 v_1 + 0 v_2 + \cdots + 0 v_n = 0.$$
This is called the trivial relation.
Linear dependence means that a nontrivial relation exists. Linear independence means that no nontrivial relation exists.
20.2 Definition of Linear Independence
The vectors
$$v_1, v_2, \dots, v_n$$
are linearly independent if the equation
$$c_1 v_1 + c_2 v_2 + \cdots + c_n v_n = 0$$
implies
$$c_1 = c_2 = \cdots = c_n = 0.$$
They are linearly dependent if there exist scalars $c_1, \dots, c_n$, not all zero, such that
$$c_1 v_1 + c_2 v_2 + \cdots + c_n v_n = 0.$$
Thus the test for independence is always a homogeneous equation.
The right-hand side is the zero vector because independence concerns internal relations among the listed vectors. It does not ask whether an outside vector can be produced. That is a question about span.
20.3 Dependence as Redundancy
Suppose
$$c_1 v_1 + c_2 v_2 + \cdots + c_n v_n = 0$$
is a nontrivial relation. Assume $c_k \neq 0$. Then
$$c_k v_k = -c_1 v_1 - \cdots - c_{k-1} v_{k-1} - c_{k+1} v_{k+1} - \cdots - c_n v_n.$$
Dividing by $c_k$,
$$v_k = -\frac{c_1}{c_k} v_1 - \cdots - \frac{c_{k-1}}{c_k} v_{k-1} - \frac{c_{k+1}}{c_k} v_{k+1} - \cdots - \frac{c_n}{c_k} v_n.$$
Thus $v_k$ is a linear combination of the other vectors.
This proves the basic meaning of dependence:
A list of vectors is linearly dependent exactly when one vector in the list can be written as a linear combination of the others.
This is the formal version of redundancy.
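This redundancy reading can be checked numerically. The sketch below uses a hypothetical dependent list in $\mathbb{R}^3$ and least squares to recover the coefficients that express one vector in terms of the others:

```python
import numpy as np

# Hypothetical dependent list: v3 = v1 + v2, so v3 is redundant.
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = np.array([1.0, 1.0, 0.0])

# Solve a*v1 + b*v2 = v3 in the least-squares sense; since v3 lies in the
# span of v1 and v2, the fit is exact.
A = np.column_stack([v1, v2])
coeffs, *_ = np.linalg.lstsq(A, v3, rcond=None)
print(coeffs)        # approximately [1., 1.]
print(A @ coeffs)    # reproduces v3
```

A nonzero residual in the least-squares fit would instead signal that the vector does not lie in the span of the others.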
20.4 First Examples
In $\mathbb{R}^2$, the vectors
$$e_1 = (1, 0), \qquad e_2 = (0, 1)$$
are linearly independent.
To see this, suppose
$$c_1 (1, 0) + c_2 (0, 1) = (0, 0).$$
Then
$$(c_1, c_2) = (0, 0).$$
Therefore
$$c_1 = 0, \qquad c_2 = 0.$$
Only the trivial relation exists.
Now consider
$$v_1 = (1, 2), \qquad v_2 = (2, 4).$$
These vectors are linearly dependent because
$$2 v_1 - v_2 = (0, 0).$$
Equivalently,
$$v_2 = 2 v_1.$$
This is a nontrivial relation.
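Both examples can be verified numerically. The sketch below assumes the standard choices $(1,0), (0,1)$ for independence and $(1,2), (2,4)$ for dependence; for two vectors in $\mathbb{R}^2$, independence is equivalent to a nonzero determinant of the $2 \times 2$ matrix they form:

```python
import numpy as np

# Columns are the vectors of each example (assumed standard choices).
indep = np.column_stack([(1.0, 0.0), (0.0, 1.0)])
dep = np.column_stack([(1.0, 2.0), (2.0, 4.0)])

# Nonzero determinant -> independent; zero determinant -> dependent.
print(np.linalg.det(indep))  # 1.0
print(np.linalg.det(dep))    # 0.0
```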
20.5 One Vector
A list containing one vector,
$$v,$$
is linearly independent exactly when
$$v \neq 0.$$
Indeed, the independence equation is
$$c v = 0.$$
If $v \neq 0$, then this forces
$$c = 0.$$
If $v = 0$, then
$$1 \cdot v = 0,$$
so a nontrivial relation exists.
Thus a one-vector list is dependent precisely when the vector is the zero vector.
20.6 Two Vectors
Two vectors $u$ and $v$ are linearly dependent exactly when one is a scalar multiple of the other.
If
$$u = \lambda v$$
for some scalar $\lambda$, then
$$1 \cdot u - \lambda v = 0,$$
so the two vectors are dependent.
Conversely, suppose
$$a u + b v = 0$$
with at least one coefficient nonzero. If $a \neq 0$, then
$$u = -\frac{b}{a} v.$$
If $b \neq 0$, then
$$v = -\frac{a}{b} u.$$
Thus dependence of two vectors means that they point in the same one-dimensional direction.
In $\mathbb{R}^2$ or $\mathbb{R}^3$, two nonzero vectors are independent exactly when they are not parallel.
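In $\mathbb{R}^3$, "parallel" has a convenient numeric test: two nonzero vectors are parallel exactly when their cross product is the zero vector. A minimal sketch:

```python
import numpy as np

# Two nonzero vectors in R^3 are parallel (hence dependent)
# exactly when their cross product vanishes.
def parallel(u, v):
    return np.allclose(np.cross(u, v), 0.0)

print(parallel([1, 2, 3], [2, 4, 6]))   # True  -> dependent
print(parallel([1, 0, 0], [0, 1, 0]))   # False -> independent
```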
20.7 Three Vectors in $\mathbb{R}^3$
Three vectors in $\mathbb{R}^3$ are independent exactly when they do not lie in a common plane through the origin.
For example,
$$e_1 = (1, 0, 0), \qquad e_2 = (0, 1, 0), \qquad e_3 = (0, 0, 1)$$
are linearly independent.
If
$$c_1 e_1 + c_2 e_2 + c_3 e_3 = (0, 0, 0),$$
then
$$(c_1, c_2, c_3) = (0, 0, 0).$$
Therefore
$$c_1 = c_2 = c_3 = 0.$$
But the vectors
$$(1, 0, 0), \qquad (0, 1, 0), \qquad (1, 1, 0)$$
are dependent, since
$$(1, 1, 0) = (1, 0, 0) + (0, 1, 0).$$
The third vector lies in the plane spanned by the first two.
20.8 The Zero Vector in a List
Any list containing the zero vector is linearly dependent.
Suppose the list
$$v_1, \dots, v_n$$
contains the zero vector, say $v_j = 0$. Choose
$$c_j = 1$$
and set all other coefficients equal to zero. Then
$$c_1 v_1 + \cdots + c_n v_n = 1 \cdot 0 = 0.$$
This is a nontrivial relation because $c_j = 1 \neq 0$.
Therefore no linearly independent list can contain the zero vector.
This observation is often the fastest way to identify dependence.
20.9 Repeated Vectors
Any list containing the same vector twice is linearly dependent.
Suppose
$$v_i = v_j$$
with $i \neq j$. Then
$$1 \cdot v_i + (-1) \cdot v_j = 0.$$
This is a nontrivial relation.
More generally, if one vector is a scalar multiple of another, then the list is dependent.
Repeated vectors create immediate redundancy because one copy gives no new direction.
20.10 More Vectors Than Dimension
In a finite-dimensional vector space of dimension $n$, any list of more than $n$ vectors is linearly dependent.
For example, any three vectors in $\mathbb{R}^2$ are dependent. Any four vectors in $\mathbb{R}^3$ are dependent.
The reason is that a space of dimension $n$ has only $n$ independent directions. Once $n$ independent directions have been chosen, every additional vector lies in their span.
This theorem is one of the basic consequences of dimension.
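The dimension limit shows up numerically as a rank bound. In the sketch below (with arbitrarily chosen vectors), three vectors in $\mathbb{R}^2$ form a $2 \times 3$ matrix whose rank is at most $2$, so the homogeneous system must have a free column:

```python
import numpy as np

# Three vectors in R^2 as columns of a 2x3 matrix (example vectors chosen
# arbitrarily): rank <= 2 < 3, so the list is dependent.
A = np.column_stack([(1.0, 0.0), (0.0, 1.0), (3.0, 4.0)])
print(A.shape)                   # (2, 3)
print(np.linalg.matrix_rank(A))  # 2
```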
20.11 Testing Independence by Row Reduction
Suppose vectors $v_1, \dots, v_n$ in $\mathbb{R}^m$ are given as columns of a matrix:
$$A = \begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix}.$$
The vectors are linearly independent exactly when the homogeneous system
$$A c = 0$$
has only the trivial solution
$$c = 0.$$
Here
$$c = (c_1, c_2, \dots, c_n).$$
To test independence, row reduce $A$. If every column is a pivot column, then the vectors are independent. If at least one column is free, then the vectors are dependent.
This follows because free variables in the homogeneous system produce nontrivial solutions.
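The pivot-column test can be sketched in code. Counting pivot columns is the same as computing the rank, so a floating-point rank computation serves as a numeric stand-in for exact row reduction (with the usual caveat that floating-point rank decisions can be delicate near-singular matrices):

```python
import numpy as np

# n columns are independent exactly when the matrix has rank n,
# i.e. every column is a pivot column.
def is_independent(vectors):
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

print(is_independent([(1, 0, 1), (1, 1, 0), (0, 1, 1)]))  # True
print(is_independent([(1, 2, 3), (2, 4, 6)]))             # False
```

For exact arithmetic over the rationals, a computer algebra system's row reduction (for example, a `rref` routine) would replace the rank call.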
20.12 Example by Row Reduction
Consider
$$v_1 = (1, 0, 1), \qquad v_2 = (1, 1, 0), \qquad v_3 = (0, 1, 1).$$
Place them as columns:
$$A = \begin{bmatrix} 1 & 1 & 0 \\ 0 & 1 & 1 \\ 1 & 0 & 1 \end{bmatrix}.$$
We solve
$$A c = 0.$$
Row reduce:
$$\begin{bmatrix} 1 & 1 & 0 \\ 0 & 1 & 1 \\ 1 & 0 & 1 \end{bmatrix} \to \begin{bmatrix} 1 & 1 & 0 \\ 0 & 1 & 1 \\ 0 & -1 & 1 \end{bmatrix} \to \begin{bmatrix} 1 & 1 & 0 \\ 0 & 1 & 1 \\ 0 & 0 & 2 \end{bmatrix}.$$
There is a pivot in every column. Therefore the homogeneous system has only the trivial solution.
The vectors are linearly independent.
20.13 Example of Dependence by Row Reduction
Consider
$$v_1 = (1, 2, 3), \qquad v_2 = (2, 4, 6).$$
Since
$$v_2 = 2 v_1,$$
the list is already dependent. Row reduction shows the same fact.
Place the vectors as columns:
$$A = \begin{bmatrix} 1 & 2 \\ 2 & 4 \\ 3 & 6 \end{bmatrix}.$$
The second column is twice the first column. Hence not every column can be a pivot column.
Therefore
$$A c = 0$$
has a nontrivial solution, for example
$$c = (2, -1).$$
Indeed,
$$2 (1, 2, 3) - (2, 4, 6) = (0, 0, 0).$$
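The nontrivial relation can be checked directly: with the dependent pair assumed here to be $v_1 = (1,2,3)$ and $v_2 = (2,4,6)$, the coefficient vector $c = (2, -1)$ sends $Ac$ to the zero vector:

```python
import numpy as np

# Columns are the assumed dependent pair; c = (2, -1) is a nontrivial
# solution of the homogeneous system A c = 0.
A = np.column_stack([(1.0, 2.0, 3.0), (2.0, 4.0, 6.0)])
c = np.array([2.0, -1.0])
print(A @ c)   # [0. 0. 0.]
```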
20.14 Independence and Span Growth
A useful way to understand independence is by span growth.
A list
$$v_1, v_2, \dots, v_n$$
is linearly independent exactly when each new vector enlarges the span of the previous vectors. Equivalently,
$$v_k \notin \operatorname{span}(v_1, \dots, v_{k-1})$$
for every
$$k = 1, 2, \dots, n.$$
For $k = 1$, this means
$$v_1 \neq 0,$$
since the span of the empty list is $\{0\}$.
If some vector lies in the span of earlier vectors, then adding it does not create a new direction. The list is dependent.
This viewpoint is often more intuitive than the coefficient equation.
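Span growth can be watched numerically by tracking the rank as vectors are appended: an independent vector raises the rank by one, while a dependent one leaves it unchanged. A sketch with a hypothetical list whose third vector lies in the span of the first two:

```python
import numpy as np

# Rank of the first k columns = dimension of their span.
vectors = [(1, 0, 0), (0, 1, 0), (1, 1, 0)]  # third is v1 + v2
ranks = [int(np.linalg.matrix_rank(np.column_stack(vectors[:k])))
         for k in range(1, len(vectors) + 1)]
print(ranks)   # [1, 2, 2] -- the span stops growing at the third vector
```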
20.15 Removing Redundant Vectors
If a list is linearly dependent, then at least one vector can be removed without changing the span.
Suppose
$$v_k = a_1 v_1 + \cdots + a_{k-1} v_{k-1} + a_{k+1} v_{k+1} + \cdots + a_n v_n.$$
Then every linear combination using $v_k$ can be rewritten using the other vectors.
Therefore
$$\operatorname{span}(v_1, \dots, v_n) = \operatorname{span}(v_1, \dots, \hat{v}_k, \dots, v_n),$$
where the hat means that $v_k$ is omitted.
This process removes redundancy from a spanning set. Repeating it eventually produces a basis in finite-dimensional spaces.
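The pruning process can be sketched as a greedy pass: keep a vector only if it raises the rank, i.e. only if it adds a new direction. The survivors are independent and span the same space. (This is a numeric sketch; the vectors below are hypothetical.)

```python
import numpy as np

# Greedy redundancy removal: keep v only if appending it increases the rank.
def prune(vectors):
    kept = []
    for v in vectors:
        candidate = kept + [v]
        if np.linalg.matrix_rank(np.column_stack(candidate)) == len(candidate):
            kept.append(v)
    return kept

vectors = [(1, 0, 0), (2, 0, 0), (0, 1, 0), (1, 1, 0)]
print(prune(vectors))   # [(1, 0, 0), (0, 1, 0)]
```

Applied to a finite spanning set, this pass is exactly the "spanning set contains a basis" argument in computational form.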
20.16 Adding Independent Vectors
If a list $v_1, \dots, v_n$ is linearly independent and a vector $w$ does not lie in its span, then the enlarged list
$$v_1, \dots, v_n, w$$
is linearly independent.
To prove this, suppose
$$c_1 v_1 + \cdots + c_n v_n + c w = 0.$$
If $c \neq 0$, then
$$w = -\frac{c_1}{c} v_1 - \cdots - \frac{c_n}{c} v_n,$$
so $w$ lies in
$$\operatorname{span}(v_1, \dots, v_n).$$
This contradicts the assumption. Hence
$$c = 0.$$
Then
$$c_1 v_1 + \cdots + c_n v_n = 0.$$
Since the original list is independent,
$$c_1 = \cdots = c_n = 0.$$
Thus the enlarged list is independent.
20.17 Independence of Polynomials
Consider the polynomials
$$1, x, x^2, \dots, x^n.$$
They are linearly independent in $\mathcal{P}_n$, the space of polynomials of degree at most $n$.
Suppose
$$c_0 + c_1 x + c_2 x^2 + \cdots + c_n x^n = 0$$
as a polynomial.
The zero polynomial has all coefficients equal to zero. Therefore
$$c_0 = c_1 = \cdots = c_n = 0.$$
So the list is linearly independent.
This fact gives the standard basis of $\mathcal{P}_n$:
$$1, x, x^2, \dots, x^n.$$
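Polynomial independence also has a numeric check: if a combination $c_0 + c_1 x + c_2 x^2$ is the zero function, it must vanish at any three distinct points, and evaluating $1, x, x^2$ at three points gives a Vandermonde matrix of full rank, forcing all coefficients to zero. A sketch for $n = 2$:

```python
import numpy as np

# Evaluate 1, x, x^2 at three distinct points; the resulting
# Vandermonde matrix is invertible, so only the trivial relation exists.
points = np.array([0.0, 1.0, 2.0])
A = np.column_stack([points**0, points**1, points**2])
print(np.linalg.matrix_rank(A))   # 3
```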
20.18 Independence of Functions
Functions can also be linearly independent.
For example,
$$f(x) = \sin x$$
and
$$g(x) = \cos x$$
are linearly independent as real-valued functions.
Suppose
$$a \sin x + b \cos x = 0$$
for every real $x$.
Set $x = 0$. Then
$$a \sin 0 + b \cos 0 = 0$$
gives
$$b = 0.$$
Set
$$x = \frac{\pi}{2}.$$
Then
$$a \sin \frac{\pi}{2} + b \cos \frac{\pi}{2} = 0$$
gives
$$a = 0.$$
Thus only the trivial relation exists.
Therefore $\sin$ and $\cos$ are linearly independent.
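Assuming the functions here are $\sin$ and $\cos$ with evaluation points $x = 0$ and $x = \pi/2$, the same argument can be phrased as a small linear system: the $2 \times 2$ matrix of function values is invertible, so the only coefficients are $a = b = 0$.

```python
import numpy as np

# Rows: function values of sin and cos at the two evaluation points.
A = np.array([[np.sin(0.0), np.cos(0.0)],
              [np.sin(np.pi / 2), np.cos(np.pi / 2)]])
print(np.linalg.matrix_rank(A))   # 2 -> only the trivial relation
```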
20.19 Infinite Sets
An infinite set of vectors is linearly independent if every finite subset is linearly independent. It is linearly dependent if at least one finite subset is linearly dependent.
For example, the infinite set
$$\{1, x, x^2, x^3, \dots\}$$
is linearly independent in the vector space of all real polynomials.
Any finite linear relation has the form
$$c_0 + c_1 x + \cdots + c_n x^n = 0.$$
As before, all coefficients must be zero.
Thus no finite nontrivial relation exists.
20.20 Summary
Linear independence formalizes the absence of redundancy. A list of vectors is independent when the zero vector has only the trivial representation as a linear combination of the list.
The key ideas are:
| Concept | Meaning |
|---|---|
| Linear relation | $c_1 v_1 + \cdots + c_n v_n = 0$ |
| Trivial relation | All coefficients are zero |
| Nontrivial relation | At least one coefficient is nonzero |
| Linearly independent | Only the trivial relation exists |
| Linearly dependent | A nontrivial relation exists |
| Redundant vector | A vector in the span of the others |
| Pivot column test | Independent exactly when every column is a pivot column |
| Dimension limit | More than $n$ vectors in an $n$-dimensional space are dependent |
Span and independence work together. Span says the vectors generate enough. Independence says they contain no excess. A basis is precisely a list that satisfies both conditions.