Linear algebra is the study of vectors, vector spaces, and linear transformations.
A vector is an object that can be added to another vector and multiplied by a scalar. In elementary geometry, a vector may be drawn as an arrow. In computation, a vector is often written as a list of numbers. In abstract mathematics, a vector may be a polynomial, a function, a matrix, or another object that obeys the same algebraic rules.
The central idea is linearity. A rule is linear when it preserves addition and scalar multiplication. If a transformation is called $T$, then linearity means

$$T(u + v) = T(u) + T(v)$$

and

$$T(c\,u) = c\,T(u).$$
These two identities say that the transformation respects the structure of the space. It does not bend, tear, multiply coordinates together, or introduce nonlinear behavior. It sends linear combinations to linear combinations.
1.1 Vectors
A vector in $\mathbb{R}^n$ is an ordered list of $n$ real numbers:

$$v = (v_1, v_2, \ldots, v_n).$$

The numbers $v_1, v_2, \ldots, v_n$ are called the components of $v$. The set $\mathbb{R}^n$ contains all such vectors with real components.

For example,

$$v = (2, -1, 3)$$

is a vector in $\mathbb{R}^3$.
Vectors may represent many kinds of data. In geometry, they represent displacement. In physics, they represent velocity, force, or momentum. In machine learning, they represent features. In numerical computation, they represent unknowns, measurements, or coefficients.
The same algebra applies in each case.
1.2 Vector Addition
Two vectors of the same size can be added component by component:

$$u + v = (u_1 + v_1, u_2 + v_2, \ldots, u_n + v_n).$$

For example,

$$(1, 2) + (3, 4) = (4, 6).$$
Vector addition combines displacements, quantities, or data records of the same type. The operation is simple, but it is one of the basic operations on which the rest of linear algebra is built.
1.3 Scalar Multiplication
A scalar is a single number. In real linear algebra, scalars are real numbers. In complex linear algebra, scalars are complex numbers.
A scalar multiplies a vector by multiplying each component:

$$c\,v = (c v_1, c v_2, \ldots, c v_n).$$

For example,

$$3\,(2, -1) = (6, -3).$$
Scalar multiplication changes the size of a vector. If the scalar is negative, it also reverses direction in the geometric interpretation.
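Both operations are easy to sketch in code. A minimal Python illustration, representing vectors as plain lists (the function names `add` and `scale` are ours, not from any library):

```python
def add(u, v):
    """Componentwise sum of two vectors of the same size."""
    assert len(u) == len(v), "vectors must have the same size"
    return [ui + vi for ui, vi in zip(u, v)]

def scale(c, v):
    """Multiply every component of v by the scalar c."""
    return [c * vi for vi in v]

print(add([1, 2], [3, 4]))   # [4, 6]
print(scale(3, [2, -1]))     # [6, -3]
```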
1.4 Linear Combinations
A linear combination is an expression built from vector addition and scalar multiplication.
If $v_1, \ldots, v_k$ are vectors and $c_1, \ldots, c_k$ are scalars, then

$$c_1 v_1 + c_2 v_2 + \cdots + c_k v_k$$

is a linear combination of the vectors.
Linear combinations are one of the most important objects in the subject. They describe all vectors that can be built from a given collection of vectors. They also lead to span, basis, dimension, rank, and many other central ideas.
For example, if

$$e_1 = (1, 0) \quad \text{and} \quad e_2 = (0, 1),$$

then every vector in $\mathbb{R}^2$ can be written as

$$(x, y) = x\,e_1 + y\,e_2.$$

Thus $e_1$ and $e_2$ generate the whole plane.
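A linear combination is computed by repeated scaling and addition. A small Python sketch (the helper name `linear_combination` is illustrative, not a standard function):

```python
def linear_combination(coeffs, vectors):
    """Return c1*v1 + c2*v2 + ... + ck*vk, where coeffs = [c1, ..., ck]
    and vectors = [v1, ..., vk] are lists of equal-length vectors."""
    n = len(vectors[0])
    result = [0] * n
    for c, v in zip(coeffs, vectors):
        result = [r + c * vi for r, vi in zip(result, v)]
    return result

# Every (x, y) is x*e1 + y*e2 for the standard basis of the plane.
e1, e2 = [1, 0], [0, 1]
print(linear_combination([5, -2], [e1, e2]))  # [5, -2]
```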
1.5 Matrices
A matrix is a rectangular array of numbers:

$$A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{pmatrix}.$$

A matrix with $m$ rows and $n$ columns is called an $m \times n$ matrix.

Matrices are used to store coefficients, represent systems of equations, and describe linear transformations. A matrix can act on a vector by matrix-vector multiplication. If $A$ is an $m \times n$ matrix and $x$ is a vector in $\mathbb{R}^n$, then $Ax$ is a vector in $\mathbb{R}^m$.

For example,

$$\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \begin{pmatrix} 5 \\ 6 \end{pmatrix} = \begin{pmatrix} 17 \\ 39 \end{pmatrix}.$$
This operation combines the rows of the matrix with the entries of the vector.
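Row by row, each output entry is the dot product of a row of the matrix with the vector. A minimal Python sketch of this rule, with the matrix stored as a list of rows (the helper name `matvec` is ours):

```python
def matvec(A, x):
    """Matrix-vector product: each output entry is the dot product
    of one row of A with x."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[1, 2],
     [3, 4]]
print(matvec(A, [5, 6]))  # [17, 39]
```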
1.6 Linear Equations
Linear algebra begins historically with systems of linear equations. A linear equation in $n$ variables $x_1, \ldots, x_n$ has the form

$$a_1 x_1 + a_2 x_2 + \cdots + a_n x_n = b.$$

The variables appear only to the first power. There are no products such as $x_1 x_2$, no powers such as $x_1^2$, and no nonlinear functions such as $\sin x_1$.

A system of linear equations is a collection of such equations:

$$\begin{aligned} a_{11} x_1 + a_{12} x_2 + \cdots + a_{1n} x_n &= b_1 \\ a_{21} x_1 + a_{22} x_2 + \cdots + a_{2n} x_n &= b_2 \\ &\;\;\vdots \\ a_{m1} x_1 + a_{m2} x_2 + \cdots + a_{mn} x_n &= b_m. \end{aligned}$$

This system can be written compactly as

$$Ax = b.$$

Here $A$ is the coefficient matrix, $x$ is the unknown vector, and $b$ is the right-hand side vector.
Much of linear algebra studies when this equation has a solution, whether that solution is unique, and how to compute it.
1.7 Linear Transformations
A linear transformation is a function between vector spaces that preserves vector addition and scalar multiplication.
A function

$$T : \mathbb{R}^n \to \mathbb{R}^m$$

is linear if for all vectors $u, v$ and all scalars $c$,

$$T(u + v) = T(u) + T(v)$$

and

$$T(c\,u) = c\,T(u).$$

Every matrix defines a linear transformation. If $A$ is an $m \times n$ matrix, then

$$T(x) = Ax$$

defines a linear transformation from $\mathbb{R}^n$ to $\mathbb{R}^m$.
This gives matrices their conceptual meaning. A matrix is more than a table of numbers. It is a rule for transforming vectors.
Some matrices rotate vectors. Some stretch them. Some project them onto a line or plane. Some collapse dimensions. Some change coordinates. Linear algebra studies these transformations through their algebraic and geometric properties.
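The two defining identities can be spot-checked numerically for any particular matrix. A small Python sketch, using a sample matrix and sample vectors of our own choosing:

```python
def matvec(A, x):
    """Matrix-vector product, rows dotted with x."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def add(u, v):
    return [ui + vi for ui, vi in zip(u, v)]

def scale(c, v):
    return [c * vi for vi in v]

# Sample data for the check (any matrix and vectors would do).
A = [[2, 0], [1, 3]]
u, v, c = [1, 2], [3, -1], 5

# T(u + v) == T(u) + T(v)
assert matvec(A, add(u, v)) == add(matvec(A, u), matvec(A, v))
# T(c*u) == c*T(u)
assert matvec(A, scale(c, u)) == scale(c, matvec(A, u))
print("linearity holds for this example")
```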
1.8 The Geometric View
In two and three dimensions, linear algebra has a direct geometric interpretation.
Vectors are points or arrows. Lines and planes are sets of vectors. Matrices transform space. A matrix may stretch, rotate, reflect, shear, or project.
For example, the matrix

$$\begin{pmatrix} 2 & 0 \\ 0 & 1 \end{pmatrix}$$

doubles the first coordinate and leaves the second coordinate unchanged. It stretches the plane horizontally.

The matrix

$$\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$$

rotates the plane by the angle $\theta$ counterclockwise.
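The rotation matrix can be built directly from `math.cos` and `math.sin`. A sketch (the `rotation` helper is illustrative; floating-point results are exact only up to rounding):

```python
import math

def rotation(theta):
    """2x2 matrix rotating the plane by theta radians counterclockwise."""
    return [[math.cos(theta), -math.sin(theta)],
            [math.sin(theta),  math.cos(theta)]]

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

# Rotating (1, 0) by 90 degrees should give (0, 1), up to rounding.
x, y = matvec(rotation(math.pi / 2), [1, 0])
print(round(x, 10), round(y, 10))  # 0.0 1.0
```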
The geometric view makes many ideas easier to understand. Rank measures the dimension of the output space. The null space describes directions that are collapsed to zero. Eigenvectors describe directions that keep their line under a transformation.
1.9 The Algebraic View
The algebraic view treats vectors and matrices as objects governed by rules.
These rules include associativity, distributivity, identities, inverses, and compatibility with scalar multiplication. They allow us to manipulate equations symbolically and prove general theorems.
For example, if $A$ is invertible, then the equation

$$Ax = b$$

has the unique solution

$$x = A^{-1} b.$$
This statement is algebraic. It depends on the existence of an inverse matrix and the rules of matrix multiplication.
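For a $2 \times 2$ matrix the inverse has a well-known closed form, $A^{-1} = \frac{1}{ad - bc}\begin{pmatrix} d & -b \\ -c & a \end{pmatrix}$, which gives a direct sketch of $x = A^{-1}b$ in Python (the `solve2` helper is illustrative, and this formula does not scale to larger matrices):

```python
def solve2(A, rhs):
    """Solve a 2x2 system A x = rhs via the closed-form inverse.
    Valid only when det(A) = a*d - b*c is nonzero."""
    a, b = A[0]
    c, d = A[1]
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is not invertible")
    # A^{-1} = (1/det) * [[d, -b], [-c, a]], applied to rhs:
    x = (d * rhs[0] - b * rhs[1]) / det
    y = (-c * rhs[0] + a * rhs[1]) / det
    return [x, y]

print(solve2([[1, 2], [3, 4]], [5, 11]))  # [1.0, 2.0]
```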
The algebraic view is especially important in higher dimensions, where geometric intuition becomes limited.
1.10 The Computational View
Linear algebra is also a computational subject.
Many practical problems reduce to solving equations, computing decompositions, estimating eigenvalues, or approximating high-dimensional data. Numerical linear algebra studies how to do these tasks efficiently and accurately on computers.
The equation

$$Ax = b$$

may involve millions or billions of unknowns. In such cases, we do not compute $x$ by hand. We use algorithms that exploit structure, sparsity, approximation, and stability.
Important computational problems include:
| Problem | Typical method |
|---|---|
| Solve $Ax = b$ | Gaussian elimination, LU decomposition, iterative methods |
| Approximate inconsistent systems | Least squares |
| Find principal directions | Eigenvalue methods, singular value decomposition |
| Compress data | Low-rank approximation |
| Solve large sparse systems | Krylov subspace methods |
| Analyze graphs | Spectral methods |
This computational view connects linear algebra with scientific computing, statistics, optimization, machine learning, graphics, and engineering.
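As a small taste of the iterative methods in the table, here is a Jacobi iteration sketch (a toy illustration, not production code; it converges for strictly diagonally dominant matrices, and the sample system is ours):

```python
def jacobi(A, b, iterations=50):
    """Jacobi iteration: repeatedly solve each equation for its own
    unknown, using the previous estimate for the other unknowns."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iterations):
        x = [(b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
             for i in range(n)]
    return x

# Sample diagonally dominant system: 4x + y = 6, x + 3y = 7,
# whose exact solution is x = 1, y = 2.
x = jacobi([[4, 1], [1, 3]], [6, 7])
print([round(xi, 6) for xi in x])  # close to [1.0, 2.0]
```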
1.11 Why Linearity Matters
Linear problems are simpler than nonlinear problems because they preserve structure.
If we understand the behavior of a linear transformation on a basis, then we understand its behavior everywhere. This is a powerful reduction. Instead of studying infinitely many vectors separately, we study a finite set of basis vectors and extend by linearity.
Suppose $v$ can be written as

$$v = c_1 v_1 + c_2 v_2 + \cdots + c_k v_k.$$

If $T$ is linear, then

$$T(v) = c_1 T(v_1) + c_2 T(v_2) + \cdots + c_k T(v_k).$$
Thus the transformation is completely determined by what it does to the basis vectors.
This principle explains why matrices are so effective. The columns of a matrix are the images of the standard basis vectors. Once those columns are known, the entire transformation is known.
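This fact is easy to confirm in code: applying a matrix to the standard basis vectors returns its columns. A minimal Python check (the `matvec` helper is ours):

```python
def matvec(A, x):
    """Matrix-vector product, rows dotted with x."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[1, 2],
     [3, 4]]

# Applying A to the standard basis vectors recovers the columns of A.
print(matvec(A, [1, 0]))  # [1, 3]  (first column)
print(matvec(A, [0, 1]))  # [2, 4]  (second column)
```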
1.12 Main Questions of Linear Algebra
Linear algebra repeatedly asks a small number of fundamental questions.
Given a system $Ax = b$:
| Question | Meaning |
|---|---|
| Does a solution exist? | Is $b$ in the column space of $A$? |
| Is the solution unique? | Is the null space of $A$ trivial? |
| How can we compute a solution? | Which algorithm is stable and efficient? |
| How sensitive is the solution? | How does error in $A$ or $b$ affect $x$? |
Given a matrix $A$:
| Question | Meaning |
|---|---|
| What does $A$ do geometrically? | Transformation view |
| What is its rank? | Dimension of its image |
| What is its null space? | Directions sent to zero |
| Is it invertible? | Can the transformation be undone? |
| What are its eigenvalues? | Fundamental scaling factors |
| Can it be decomposed? | Factorization into simpler pieces |
These questions appear throughout the book in increasingly precise forms.
1.13 The Scope of the Subject
Linear algebra has several layers.
At the first layer, it studies systems of equations and matrices. This layer is concrete and computational.
At the second layer, it studies vector spaces and linear transformations. This layer explains why matrix methods work.
At the third layer, it studies structure: dimension, rank, determinant, eigenvalues, inner products, orthogonality, and canonical forms.
At the fourth layer, it studies computation and applications: decompositions, numerical stability, data analysis, optimization, geometry, and differential equations.
A reference book must cover all of these layers. The elementary material gives the language. The abstract material gives the theory. The numerical material gives practical methods. The applications show why the subject appears so widely.
1.14 A Small Example
Consider the system

$$\begin{aligned} x + y &= 3 \\ x - y &= 1. \end{aligned}$$

In matrix form, this is

$$\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 3 \\ 1 \end{pmatrix}.$$

The first equation says that $(x, y)$ lies on the line $x + y = 3$. The second equation says that $(x, y)$ lies on the line $x - y = 1$. Solving the system means finding the intersection of these two lines.
Add the two equations:

$$(x + y) + (x - y) = 3 + 1.$$

Adding gives

$$2x = 4,$$

so

$$x = 2.$$

Substitute into $x + y = 3$:

$$2 + y = 3,$$

so

$$y = 1.$$

The solution is

$$(x, y) = (2, 1).$$
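The same elimination can be replayed mechanically in code. A Python sketch for a general $2 \times 2$ system $a_1 x + b_1 y = c_1$, $a_2 x + b_2 y = c_2$ (the helper name and sample coefficients are ours, and the pivot $a_1$ is assumed nonzero):

```python
def eliminate(a1, b1, c1, a2, b2, c2):
    """Solve a1*x + b1*y = c1 and a2*x + b2*y = c2 by eliminating x
    from the second equation, then back-substituting (assumes a1 != 0
    and the system has a unique solution)."""
    # Subtract (a2/a1) times the first equation from the second.
    m = a2 / a1
    b2p = b2 - m * b1
    c2p = c2 - m * c1
    y = c2p / b2p          # solve the reduced second equation for y
    x = (c1 - b1 * y) / a1  # back-substitute into the first equation
    return x, y

# Sample system: x + y = 3 and x - y = 1.
print(eliminate(1, 1, 3, 1, -1, 1))  # (2.0, 1.0)
```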
This example contains several themes. The system has equations. The equations define geometric objects. The coefficients form a matrix. The solution is a vector. The matrix maps the unknown vector to the right-hand side. Later chapters develop these ideas systematically.
1.15 Summary
Linear algebra studies vectors, matrices, vector spaces, and linear transformations. Its basic operations are vector addition and scalar multiplication. Its central objects are linear combinations, systems of linear equations, matrices, subspaces, and transformations.
The subject has three major viewpoints:
| Viewpoint | Main idea |
|---|---|
| Geometric | Vectors and transformations describe space |
| Algebraic | Matrices and vector spaces obey precise rules |
| Computational | Algorithms solve large linear problems |
These viewpoints support one another. Geometry gives intuition. Algebra gives proofs. Computation gives methods. Together they make linear algebra one of the basic languages of modern mathematics, science, and engineering.