A vector space is a set whose elements can be added together and multiplied by scalars. The scalars usually come from $\mathbb{R}$ or $\mathbb{C}$, but the same definition works over any field. The two operations must satisfy the vector space axioms: closure, associativity, commutativity of addition, a zero vector, additive inverses, scalar identity, scalar associativity, and distributivity.
The purpose of the definition is to isolate the algebraic behavior shared by many different objects. Arrows in the plane, columns of numbers, polynomials, matrices, and many spaces of functions can all be vector spaces. Once the axioms are known to hold, the same theorems apply to all of them.
17.1 The Definition
Let $F$ be a field. A vector space over $F$ is a set $V$ together with two operations:

$$u + v \in V$$

for vectors $u, v \in V$, and

$$c v \in V$$

for a scalar $c \in F$ and a vector $v \in V$.

The elements of $V$ are called vectors. The elements of $F$ are called scalars.
The word vector no longer means only an arrow or a column of numbers. It means any element of a set that satisfies the vector space rules.
17.2 Axioms
For all $u, v, w \in V$ and all $a, b \in F$, the following properties must hold.
| Axiom | Statement |
|---|---|
| Closure under addition | $u + v \in V$ |
| Closure under scalar multiplication | $a v \in V$ |
| Associativity of addition | $(u + v) + w = u + (v + w)$ |
| Commutativity of addition | $u + v = v + u$ |
| Additive identity | There exists $0 \in V$ such that $v + 0 = v$ |
| Additive inverse | For each $v \in V$, there exists $-v \in V$ such that $v + (-v) = 0$ |
| Scalar identity | $1 v = v$ |
| Compatibility of scalar multiplication | $a (b v) = (a b) v$ |
| Distributivity over vector addition | $a (u + v) = a u + a v$ |
| Distributivity over scalar addition | $(a + b) v = a v + b v$ |
These axioms are not arbitrary. They say that addition behaves like ordinary addition and that scalar multiplication interacts correctly with addition.
17.3 First Examples
The space $\mathbb{R}^n$ is the standard example. Its elements are columns

$$v = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix},$$

where each $v_i$ is a real number.

Addition and scalar multiplication are defined component by component:

$$u + v = \begin{bmatrix} u_1 + v_1 \\ u_2 + v_2 \\ \vdots \\ u_n + v_n \end{bmatrix}$$

and

$$c v = \begin{bmatrix} c v_1 \\ c v_2 \\ \vdots \\ c v_n \end{bmatrix}.$$

The zero vector is

$$0 = \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix}.$$

The additive inverse of $v$ is

$$-v = \begin{bmatrix} -v_1 \\ -v_2 \\ \vdots \\ -v_n \end{bmatrix}.$$

Thus $\mathbb{R}^n$ is a vector space over $\mathbb{R}$.
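The componentwise definitions are easy to check numerically. The sketch below uses NumPy arrays to stand in for vectors in $\mathbb{R}^3$ and spot-checks several axioms; the particular vectors are illustrative, not part of the formal development.

```python
# Spot-checking the R^n operations with NumPy arrays (an illustrative
# sketch; a numerical check is not a proof of the axioms).
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])
c = 2.0

# Addition and scalar multiplication are componentwise.
assert np.array_equal(u + v, np.array([5.0, 7.0, 9.0]))
assert np.array_equal(c * v, np.array([8.0, 10.0, 12.0]))

# The zero vector and the additive inverse behave as required.
zero = np.zeros(3)
assert np.array_equal(v + zero, v)
assert np.array_equal(v + (-v), zero)

# Commutativity and distributivity also hold componentwise.
assert np.array_equal(u + v, v + u)
assert np.array_equal(c * (u + v), c * u + c * v)
```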
17.4 Polynomials as Vectors
Let $P_n$ be the set of all real polynomials of degree at most $n$:

$$P_n = \{ a_0 + a_1 x + \cdots + a_n x^n : a_0, a_1, \dots, a_n \in \mathbb{R} \}.$$

This set is a vector space over $\mathbb{R}$. The vectors are polynomials. The scalars are real numbers.

If

$$p(x) = a_0 + a_1 x + \cdots + a_n x^n$$

and

$$q(x) = b_0 + b_1 x + \cdots + b_n x^n,$$

then

$$(p + q)(x) = (a_0 + b_0) + (a_1 + b_1) x + \cdots + (a_n + b_n) x^n.$$

If $c \in \mathbb{R}$, then

$$(c p)(x) = c a_0 + c a_1 x + \cdots + c a_n x^n.$$

The zero vector is the zero polynomial:

$$0 = 0 + 0 x + \cdots + 0 x^n.$$
This example shows why the abstract definition is useful. A polynomial does not look like an arrow, but it behaves like a vector under addition and scalar multiplication.
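A minimal sketch of this idea in code, representing a polynomial in $P_2$ by its coefficient list $[a_0, a_1, a_2]$; the helper names `poly_add` and `poly_scale` are hypothetical, chosen for illustration.

```python
# Polynomials as coefficient lists: the two vector space operations
# act coefficient by coefficient.
def poly_add(p, q):
    """Add two polynomials given as equal-length coefficient lists."""
    return [a + b for a, b in zip(p, q)]

def poly_scale(c, p):
    """Multiply a polynomial by the scalar c."""
    return [c * a for a in p]

p = [1.0, 2.0, 3.0]   # 1 + 2x + 3x^2
q = [4.0, 0.0, -1.0]  # 4 - x^2

assert poly_add(p, q) == [5.0, 2.0, 2.0]      # 5 + 2x + 2x^2
assert poly_scale(2.0, p) == [2.0, 4.0, 6.0]  # 2 + 4x + 6x^2

# The zero polynomial is the list of all-zero coefficients.
zero = [0.0, 0.0, 0.0]
assert poly_add(p, zero) == p
```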
17.5 Matrices as Vectors
Let $M_{m \times n}(\mathbb{R})$ be the set of all real $m \times n$ matrices. This set is a vector space over $\mathbb{R}$.

Matrix addition and scalar multiplication are defined entry by entry. If

$$A = [a_{ij}] \quad \text{and} \quad B = [b_{ij}],$$

then

$$A + B = [a_{ij} + b_{ij}].$$

If $c \in \mathbb{R}$, then

$$c A = [c\, a_{ij}].$$
Here the vectors are matrices. The matrix shape must be fixed. The set of all $m \times n$ matrices, for one fixed pair $(m, n)$, forms a vector space. A union of matrices of two different shapes does not, because addition between different shapes is undefined.
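The entry-by-entry operations, and the failure of mixed shapes, can be illustrated with NumPy; the specific matrices below are illustrative.

```python
# Entrywise matrix operations with NumPy. Shapes must match, which is
# why only matrices of one fixed size form a vector space.
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

assert np.array_equal(A + B, np.array([[6.0, 8.0], [10.0, 12.0]]))
assert np.array_equal(2.0 * A, np.array([[2.0, 4.0], [6.0, 8.0]]))

# Adding matrices of different shapes is undefined; NumPy rejects it.
C = np.zeros((3, 3))
mismatch_rejected = False
try:
    A + C
except ValueError:
    mismatch_rejected = True
assert mismatch_rejected
```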
17.6 Function Spaces
Let $F(\mathbb{R}, \mathbb{R})$ be the set of all functions from $\mathbb{R}$ to $\mathbb{R}$. This is a vector space over $\mathbb{R}$.

For functions $f$ and $g$, define

$$(f + g)(x) = f(x) + g(x)$$

and

$$(c f)(x) = c\, f(x).$$

The zero vector is the zero function:

$$z(x) = 0$$

for every $x \in \mathbb{R}$.
For example, if

$$f(x) = x^2$$

and

$$g(x) = \sin x,$$

then

$$(f + g)(x) = x^2 + \sin x.$$

This space is infinite-dimensional. It contains far more vectors than $\mathbb{R}^n$. Still, it satisfies the same vector space axioms.
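The pointwise operations can be sketched directly, representing vectors by Python functions; `add` and `scale` are hypothetical helper names, and the particular functions are illustrative.

```python
# Functions as vectors: addition and scalar multiplication are defined
# pointwise and produce new functions.
import math

def f(x):
    return x ** 2

def g(x):
    return math.sin(x)

def add(f, g):
    """Pointwise sum: (f + g)(x) = f(x) + g(x)."""
    return lambda x: f(x) + g(x)

def scale(c, f):
    """Pointwise scalar multiple: (c f)(x) = c * f(x)."""
    return lambda x: c * f(x)

h = add(f, g)
assert h(0.0) == 0.0
assert h(2.0) == 4.0 + math.sin(2.0)
assert scale(3.0, f)(2.0) == 12.0

zero = lambda x: 0.0  # the zero function
assert add(f, zero)(3.0) == f(3.0)
```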
17.7 Non-Examples
A set can fail to be a vector space even when it looks similar to one.
The set

$$S = \{ (x, y) \in \mathbb{R}^2 : x \ge 0 \text{ and } y \ge 0 \}$$

is not a vector space over $\mathbb{R}$. It is closed under addition, but not under scalar multiplication. For example,

$$(1, 1) \in S,$$

but

$$(-1)(1, 1) = (-1, -1) \notin S.$$

The set

$$L = \{ (x, y) \in \mathbb{R}^2 : y = x + 1 \}$$

is not a vector space. It does not contain the zero vector, since

$$(0, 0) \notin L.$$

It is an affine line, not a vector space.

The set of polynomials of exact degree $n$ is not a vector space. For example,

$$x^n + (-x^n) = 0,$$

and the zero polynomial does not have exact degree $n$.
These examples show that closure and the zero vector are often the fastest tests.
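These closure tests translate directly into code. A sketch, assuming the two non-example sets above are the closed first quadrant of $\mathbb{R}^2$ and the line $y = x + 1$:

```python
# Fast non-example tests: check closure and the zero vector directly.
def in_first_quadrant(p):
    """Membership in S = {(x, y) : x >= 0 and y >= 0}."""
    x, y = p
    return x >= 0 and y >= 0

# S is closed under addition...
assert in_first_quadrant((1, 1)) and in_first_quadrant((2, 3))
assert in_first_quadrant((1 + 2, 1 + 3))

# ...but not under scalar multiplication: (-1)(1, 1) leaves S.
assert not in_first_quadrant((-1, -1))

# The affine line y = x + 1 fails the zero-vector test.
on_line = lambda p: p[1] == p[0] + 1
assert not on_line((0, 0))
```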
17.8 Linear Combinations
A linear combination of vectors $v_1, v_2, \dots, v_k$ is a vector of the form

$$c_1 v_1 + c_2 v_2 + \cdots + c_k v_k,$$

where $c_1, c_2, \dots, c_k$ are scalars.
Linear combinations are the basic expressions of linear algebra. They are built using only the two vector space operations: addition and scalar multiplication.
For example, in $\mathbb{R}^2$,

$$2 \begin{bmatrix} 1 \\ 0 \end{bmatrix} + 3 \begin{bmatrix} 0 \\ 1 \end{bmatrix} = \begin{bmatrix} 2 \\ 3 \end{bmatrix}.$$

The result is again a vector in $\mathbb{R}^2$.
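The same computation in NumPy, built from only the two vector space operations; the vectors and coefficients are illustrative.

```python
# A linear combination uses only addition and scalar multiplication.
import numpy as np

v1 = np.array([1.0, 0.0])
v2 = np.array([0.0, 1.0])

w = 2.0 * v1 + 3.0 * v2
assert np.array_equal(w, np.array([2.0, 3.0]))
```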
17.9 Span
The span of a set of vectors is the set of all linear combinations of those vectors.
If $S = \{ v_1, \dots, v_k \}$, then

$$\operatorname{span}(S) = \{ c_1 v_1 + \cdots + c_k v_k : c_1, \dots, c_k \in F \}.$$

The span is always a vector space inside $V$. More precisely, it is a subspace of $V$.
For example, in $\mathbb{R}^2$, let

$$v = \begin{bmatrix} 1 \\ 2 \end{bmatrix}.$$

Then

$$\operatorname{span}(v) = \left\{ c \begin{bmatrix} 1 \\ 2 \end{bmatrix} : c \in \mathbb{R} \right\}.$$

This is the line through the origin in the direction of $v$.
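Membership in a span can also be tested numerically: $w \in \operatorname{span}(v)$ exactly when the best-fitting scalar $c$ in $c\,v = w$ reproduces $w$. A sketch, assuming $v = (1, 2)$ as in the example; `in_span` is a hypothetical helper.

```python
# Span membership via least squares: w is in span(v) iff w = c*v
# for some scalar c.
import numpy as np

v = np.array([1.0, 2.0])

def in_span(w, v, tol=1e-9):
    """Check whether w is a scalar multiple of v."""
    c, *_ = np.linalg.lstsq(v.reshape(-1, 1), w, rcond=None)
    return np.allclose(c[0] * v, w, atol=tol)

assert in_span(np.array([3.0, 6.0]), v)      # 3 * v lies on the line
assert not in_span(np.array([1.0, 0.0]), v)  # off the line
```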
17.10 Subspaces
A subspace of a vector space is a subset that is itself a vector space under the same operations.
To prove that a subset $W \subseteq V$ is a subspace, it is enough to check three conditions:
| Condition | Meaning |
|---|---|
| Nonempty | $0 \in W$ |
| Closed under addition | If $u, v \in W$, then $u + v \in W$ |
| Closed under scalar multiplication | If $v \in W$ and $c \in F$, then $c v \in W$ |
For example,

$$W = \{ (x, y, z) \in \mathbb{R}^3 : x + y + z = 0 \}$$

is a subspace of $\mathbb{R}^3$.

It contains the zero vector because $0 + 0 + 0 = 0$. If $u$ and $v$ satisfy the equation, then so does $u + v$. If $v$ satisfies the equation, then so does $c v$. Therefore $W$ is a subspace.
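The three conditions can be spot-checked numerically. A sketch, assuming the example plane $x + y + z = 0$; the sample vectors are illustrative.

```python
# Spot-checking the subspace conditions for W = {(x,y,z) : x+y+z = 0}.
import numpy as np

def in_W(v, tol=1e-12):
    """Membership test: the coordinates must sum to zero."""
    return abs(v.sum()) < tol

u = np.array([1.0, -1.0, 0.0])
v = np.array([2.0, 3.0, -5.0])

assert in_W(np.zeros(3))   # contains the zero vector
assert in_W(u) and in_W(v)
assert in_W(u + v)         # closed under addition
assert in_W(4.0 * v)       # closed under scalar multiplication
```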
17.11 Linear Independence
Vectors $v_1, v_2, \dots, v_k$ are linearly independent if the equation

$$c_1 v_1 + c_2 v_2 + \cdots + c_k v_k = 0$$

has only the trivial solution

$$c_1 = c_2 = \cdots = c_k = 0.$$
They are linearly dependent if there is a nontrivial solution. That means at least one coefficient is not zero.
Linear independence means that no vector in the list is redundant. Each vector contributes a new direction that cannot be built from the others.
For example,

$$\begin{bmatrix} 1 \\ 0 \end{bmatrix} \quad \text{and} \quad \begin{bmatrix} 0 \\ 1 \end{bmatrix}$$

are linearly independent in $\mathbb{R}^2$. But

$$\begin{bmatrix} 1 \\ 2 \end{bmatrix} \quad \text{and} \quad \begin{bmatrix} 2 \\ 4 \end{bmatrix}$$

are linearly dependent, since

$$2 \begin{bmatrix} 1 \\ 2 \end{bmatrix} - \begin{bmatrix} 2 \\ 4 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}.$$
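Numerically, independence can be tested with the matrix rank: vectors are independent exactly when the rank of the matrix having them as columns equals the number of vectors. The specific vectors below are illustrative.

```python
# Rank test for linear independence.
import numpy as np

# Columns (1, 0) and (0, 1): independent, rank 2.
A = np.column_stack([[1.0, 0.0], [0.0, 1.0]])
assert np.linalg.matrix_rank(A) == 2

# Columns (1, 2) and (2, 4): dependent, since the second is 2x the first.
B = np.column_stack([[1.0, 2.0], [2.0, 4.0]])
assert np.linalg.matrix_rank(B) == 1
```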
17.12 Basis
A basis of a vector space is a list of vectors that is both spanning and linearly independent.
The list spans the space, so every vector can be built from it. The list is linearly independent, so no vector in the list is redundant.
In $\mathbb{R}^n$, the standard basis is

$$e_1 = \begin{bmatrix} 1 \\ 0 \\ \vdots \\ 0 \end{bmatrix}, \quad e_2 = \begin{bmatrix} 0 \\ 1 \\ \vdots \\ 0 \end{bmatrix}, \quad \dots, \quad e_n = \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 1 \end{bmatrix}.$$

Every vector

$$v = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix}$$

can be written uniquely as

$$v = v_1 e_1 + v_2 e_2 + \cdots + v_n e_n.$$
This uniqueness is the main reason bases are important. A basis turns abstract vectors into coordinates.
17.13 Dimension
The dimension of a vector space is the number of vectors in any basis.
For example,

$$\dim \mathbb{R}^n = n.$$

The vector space $P_n$ of polynomials of degree at most $n$ has dimension $n + 1$, because a basis is

$$1, \; x, \; x^2, \; \dots, \; x^n.$$

The space $M_{m \times n}(\mathbb{R})$ has dimension $mn$, because each matrix has $mn$ independent entries.
Dimension measures the number of independent parameters needed to describe a vector in the space.
17.14 Coordinates
Once a basis is chosen, every vector has coordinates.
Let

$$B = (b_1, b_2, \dots, b_n)$$

be a basis of $V$. If

$$v = c_1 b_1 + c_2 b_2 + \cdots + c_n b_n,$$

then the coordinate vector of $v$ with respect to $B$ is

$$[v]_B = \begin{bmatrix} c_1 \\ c_2 \\ \vdots \\ c_n \end{bmatrix}.$$
Coordinates depend on the basis. The vector itself does not.
This distinction is important. A vector is an object in a vector space. A coordinate vector is a representation of that object after choosing a basis.
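Coordinates can be computed by solving a linear system: with the basis vectors as the columns of a matrix $M$, the coordinate vector satisfies $M [v]_B = v$. A sketch in $\mathbb{R}^2$ with a hypothetical basis.

```python
# Coordinates with respect to a basis, via a linear solve.
import numpy as np

b1 = np.array([1.0, 1.0])
b2 = np.array([1.0, -1.0])
M = np.column_stack([b1, b2])  # basis vectors as columns

v = np.array([3.0, 1.0])
coords = np.linalg.solve(M, v)  # the coordinate vector [v]_B

# The coordinates rebuild v, and here v = 2*b1 + 1*b2.
assert np.allclose(coords[0] * b1 + coords[1] * b2, v)
assert np.allclose(coords, [2.0, 1.0])
```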
17.15 The Zero Vector
Every vector space has exactly one zero vector.
The zero vector satisfies

$$v + 0 = v$$

for every $v \in V$.

Although the symbol $0$ is used in many spaces, its meaning depends on the space.
| Space | Zero vector |
|---|---|
| $P_n$ | The zero polynomial |
| $M_{m \times n}(\mathbb{R})$ | The zero matrix |
| Functions from $\mathbb{R}$ to $\mathbb{R}$ | The zero function |
The zero vector is required for a vector space. Any candidate set that lacks it cannot be a vector space.
17.16 Additive Inverses
Every vector $v$ in a vector space has an additive inverse $-v$, satisfying

$$v + (-v) = 0.$$

In $\mathbb{R}^n$, this means changing the sign of every component. For polynomials, it means changing the sign of every coefficient. For functions, it means defining

$$(-f)(x) = -f(x).$$

Additive inverses allow subtraction to be defined:

$$u - v = u + (-v).$$

Thus subtraction is not a primitive operation in the definition. It is derived from addition and additive inverse.
17.17 Consequences of the Axioms
Several useful facts follow from the axioms.
For every scalar $c$ and vector $v$,

$$0 v = 0 \quad \text{and} \quad c\, 0 = 0.$$

Also, if

$$c v = 0,$$

then either

$$c = 0$$

or

$$v = 0.$$
These facts are not separate axioms. They are theorems proved from the vector space axioms.
As an example, prove that $0 v = 0$. Since $0 + 0 = 0$ in the scalar field,

$$0 v = (0 + 0) v.$$

By distributivity,

$$(0 + 0) v = 0 v + 0 v.$$

Thus

$$0 v = 0 v + 0 v.$$

Add the additive inverse of $0 v$ to both sides. The result is

$$0 = 0 v.$$

Therefore

$$0 v = 0.$$
This proof shows how the axioms control all ordinary algebraic behavior.
17.18 Vector Spaces over Different Fields
The field of scalars matters.
The space $\mathbb{R}^n$ is naturally a vector space over $\mathbb{R}$. The space $\mathbb{C}^n$ is naturally a vector space over $\mathbb{C}$. It can also be viewed as a vector space over $\mathbb{R}$, but then its dimension changes.

For example, $\mathbb{C}$ has dimension $1$ over $\mathbb{C}$, with basis

$$\{ 1 \}.$$

But $\mathbb{C}$ has dimension $2$ over $\mathbb{R}$, with basis

$$\{ 1, i \}.$$

Every complex number can be written as

$$z = a + b i.$$

Here $a$ and $b$ are real coordinates.
Thus the same set can define different vector spaces depending on the field of scalars.
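The real coordinates of a complex number with respect to the basis $\{1, i\}$ can be sketched directly in Python, whose built-in complex type exposes the real and imaginary parts.

```python
# A complex number z = a + bi has real coordinates (a, b)
# with respect to the basis {1, i}.
z = 3.0 + 4.0j
a, b = z.real, z.imag

assert a == 3.0 and b == 4.0
assert a * 1 + b * 1j == z  # z = a*1 + b*i
```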
17.19 Finite and Infinite Dimensional Spaces
A vector space is finite-dimensional if it has a finite basis. It is infinite-dimensional if no finite basis spans it.
The spaces

$$\mathbb{R}^n, \quad P_n, \quad M_{m \times n}(\mathbb{R})$$

are finite-dimensional.
The space of all real polynomials is infinite-dimensional. A finite list of polynomials has a maximum degree. It cannot span polynomials of higher degree.
The space of all functions from $\mathbb{R}$ to $\mathbb{R}$ is also infinite-dimensional.
Finite-dimensional spaces are the main subject of elementary linear algebra. Infinite-dimensional spaces appear naturally in analysis, differential equations, Fourier theory, and functional analysis.
17.20 Summary
A vector space is a set with addition and scalar multiplication satisfying precise algebraic laws. The definition includes many objects that at first look different: coordinate vectors, matrices, polynomials, and functions.
The main ideas introduced in this chapter are:
| Concept | Meaning |
|---|---|
| Vector space | A set closed under vector addition and scalar multiplication |
| Scalar | An element of the underlying field |
| Linear combination | A sum of scalar multiples of vectors |
| Span | The set of all linear combinations |
| Subspace | A subset that is itself a vector space |
| Linear independence | No vector is redundant |
| Basis | A linearly independent spanning list |
| Dimension | The number of vectors in a basis |
| Coordinates | Scalars describing a vector relative to a basis |
Vector spaces provide the language for the rest of linear algebra. Once this structure is fixed, matrices become representations of linear maps, systems of equations become questions about subspaces, and dimension becomes the measure of independent direction.