Appendix G. Linear Algebra Review

G.1 Vector Spaces

A vector space over a field $F$ is a set $V$ equipped with addition and scalar multiplication satisfying the usual algebraic rules.

The scalars belong to $F$, while the vectors belong to $V$.

Examples:

$$\mathbb{R}^n,\qquad \mathbb{C}^n,\qquad \mathbb{Q}^n.$$

Polynomial spaces such as

$$F[x]$$

are also vector spaces over $F$.

In number theory, vector spaces appear in:

  • modular forms,
  • representation theory,
  • cohomology,
  • field extensions,
  • lattice theory,
  • Galois representations.

G.2 Subspaces

A subset $W\subseteq V$ is a subspace if:

  1. $0\in W$,
  2. $u,v\in W \implies u+v\in W$,
  3. $c\in F,\ u\in W \implies cu\in W$.

Examples in $\mathbb{R}^3$:

  • lines through the origin,
  • planes through the origin.

Subspaces organize solutions of linear systems and invariant algebraic structures.

G.3 Linear Combinations

Given vectors

$$v_1,\ldots,v_n,$$

a linear combination is an expression

$$a_1v_1+\cdots+a_nv_n$$

with coefficients $a_i\in F$.

The span of the vectors is the set of all linear combinations:

$$\operatorname{span}(v_1,\ldots,v_n).$$

Spanning sets describe how complicated vectors are built from simpler ones.
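
Whether a given vector lies in a span can be tested numerically by comparing ranks: appending the vector to the spanning set does not raise the rank exactly when it is already a linear combination. A minimal NumPy sketch, with illustrative vectors:

```python
import numpy as np

# Illustrative vectors in R^3.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
w  = np.array([2.0, 3.0, 7.0])   # equals 2*v1 + 3*v2

V = np.column_stack([v1, v2])    # columns are the spanning vectors
in_span = np.linalg.matrix_rank(np.column_stack([V, w])) == np.linalg.matrix_rank(V)
print(in_span)                   # True: w lies in span(v1, v2)
```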

G.4 Linear Independence

Vectors

$$v_1,\ldots,v_n$$

are linearly independent if

$$a_1v_1+\cdots+a_nv_n=0$$

implies

$$a_1=\cdots=a_n=0.$$

Otherwise they are linearly dependent.

Linear independence measures whether vectors contain genuinely new information.
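
The same rank comparison gives a quick numerical test of independence: $n$ vectors are independent exactly when the matrix having them as columns has rank $n$. A short sketch with illustrative data:

```python
import numpy as np

# Columns are the vectors v_1, v_2, v_3 (illustrative values).
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0]])

independent = np.linalg.matrix_rank(A) == A.shape[1]
print(independent)   # False: the third column is the sum of the first two
```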

In number theory, independence appears in:

  • algebraic number fields,
  • transcendence theory,
  • lattices,
  • character theory.

G.5 Bases and Dimension

A basis of a vector space is a linearly independent spanning set.

Every vector in the space has a unique representation as a linear combination of basis vectors.

The number of basis vectors is the dimension of the space.

Examples:

$$(1,0),\qquad (0,1)$$

form a basis of $\mathbb{R}^2$.

The polynomials

$$1,x,x^2,\ldots,x^n$$

form a basis for polynomials of degree at most $n$.

Field extensions are vector spaces. For example,

$$\mathbb{Q}(\sqrt{2})$$

has basis

$$1,\sqrt{2}$$

over $\mathbb{Q}$, so its dimension is $2$.
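
In coordinates with respect to the basis $1,\sqrt{2}$, an element is a pair $(a,b)$ of rationals standing for $a+b\sqrt{2}$. The sketch below (exact arithmetic with Python's fractions module, values chosen only for illustration) shows that even multiplication lands back in this two-dimensional space:

```python
from fractions import Fraction

def multiply(x, y):
    """Multiply a + b*sqrt(2) by c + d*sqrt(2); in the basis {1, sqrt(2)}
    the product has coordinates (a*c + 2*b*d, a*d + b*c)."""
    a, b = x
    c, d = y
    return (a * c + 2 * b * d, a * d + b * c)

x = (Fraction(1, 2), Fraction(3))    # 1/2 + 3*sqrt(2)
y = (Fraction(2), Fraction(-1, 4))   # 2 - (1/4)*sqrt(2)
print(multiply(x, y))                # (Fraction(-1, 2), Fraction(47, 8)): still a pair of rationals
```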

G.6 Matrices

An $m\times n$ matrix over a field $F$ is a rectangular array

$$A=(a_{ij}).$$

Matrices represent linear transformations and systems of equations.

Matrix addition and multiplication obey associative and distributive laws.

The identity matrix satisfies

$$IA=AI=A.$$

Matrices are central in computational number theory, representation theory, and arithmetic geometry.

G.7 Systems of Linear Equations

A linear system may be written:

$$Ax=b.$$

Solutions are found using row reduction or matrix inversion.

Gaussian elimination systematically transforms matrices into simpler forms.
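
A concrete (purely illustrative) system, solved with NumPy, which carries out an elimination-style LU factorization internally:

```python
import numpy as np

# Illustrative 3x3 system Ax = b with a unique solution.
A = np.array([[ 2.0,  1.0, -1.0],
              [-3.0, -1.0,  2.0],
              [-2.0,  1.0,  2.0]])
b = np.array([8.0, -11.0, -3.0])

x = np.linalg.solve(A, b)
print(x)                      # [ 2.  3. -1.]
print(np.allclose(A @ x, b))  # True: the solution satisfies the system
```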

Linear systems appear in:

  • lattice reduction,
  • modular arithmetic,
  • coding theory,
  • elliptic curve algorithms,
  • algebraic geometry.

G.8 Rank

The rank of a matrix is the dimension of its row space or column space.

It measures the number of independent linear constraints.

For an $m\times n$ matrix $A$,

$$\operatorname{rank}(A)\le \min(m,n).$$

A square matrix is invertible exactly when its rank equals its size.

Rank is fundamental in solving linear systems and studying bilinear forms.
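
A minimal NumPy check on an illustrative matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],   # twice the first row
              [1.0, 0.0, 1.0]])

print(np.linalg.matrix_rank(A))  # 2: this square matrix is not invertible
```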

G.9 Determinants

The determinant of an $n\times n$ matrix $A$ is a scalar denoted

$$\det(A).$$

For a $2\times 2$ matrix,

$$\det \begin{pmatrix} a & b \\ c & d \end{pmatrix} = ad-bc.$$

A matrix is invertible exactly when

$$\det(A)\ne 0.$$

Determinants measure volume scaling and orientation.
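
The $2\times 2$ formula and the invertibility criterion can be checked directly; the numbers below are illustrative:

```python
import numpy as np

A = np.array([[3.0, 5.0],
              [1.0, 2.0]])

d = np.linalg.det(A)
print(d)                        # ad - bc = 3*2 - 5*1 = 1 (up to floating-point rounding)
print(not np.isclose(d, 0.0))   # True: nonzero determinant, so A is invertible
print(np.linalg.inv(A))         # [[ 2. -5.]
                                #  [-1.  3.]]
```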

In number theory they appear in:

  • discriminants,
  • lattice covolumes,
  • Jacobians,
  • algebraic geometry.

G.10 Eigenvalues and Eigenvectors

A nonzero vector $v$ is an eigenvector of $A$ if

$$Av=\lambda v$$

for some scalar $\lambda$, called the eigenvalue.

Eigenvalues are roots of the characteristic polynomial:

$$\det(A-\lambda I)=0.$$
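
For a concrete matrix, NumPy computes eigenvalues and eigenvectors numerically; a brief sketch with illustrative entries:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)          # 3 and 1: roots of (2 - lambda)^2 - 1 = 0 (order may vary)

v = eigenvectors[:, 0]      # eigenvector paired with eigenvalues[0]
print(np.allclose(A @ v, eigenvalues[0] * v))   # True: Av = lambda v
```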

Spectral methods appear throughout modern number theory:

  • Hecke operators,
  • automorphic forms,
  • graph theory,
  • random matrices,
  • arithmetic dynamics.

G.11 Inner Products

An inner product on a vector space $V$ is a function

$$\langle \cdot,\cdot\rangle:V\times V\to F$$

satisfying linearity, symmetry, and positivity (over $\mathbb{C}$, symmetry is replaced by conjugate symmetry).

For vectors in $\mathbb{R}^n$,

$$\langle x,y\rangle = x_1y_1+\cdots+x_ny_n.$$

The associated norm is

$$\|x\|=\sqrt{\langle x,x\rangle}.$$

Inner products define geometry inside vector spaces.

Orthogonality is essential in Fourier analysis and modular form theory.
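
The dot product and norm above translate directly into code (illustrative vectors):

```python
import numpy as np

x = np.array([1.0, 2.0, 2.0])
y = np.array([2.0, -1.0, 0.0])

inner = np.dot(x, y)              # x1*y1 + ... + xn*yn = 0.0
norm_x = np.sqrt(np.dot(x, x))    # sqrt(1 + 4 + 4) = 3.0
print(inner, norm_x)
```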

G.12 Orthogonality

Vectors $u$ and $v$ are orthogonal if

$$\langle u,v\rangle=0.$$

Orthogonal bases simplify computations and expansions.

Fourier series arise from orthogonal exponentials:

$$e^{2\pi i n x}.$$
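
A numerical sketch of the orthogonality relation $\int_0^1 e^{2\pi i m x}\,\overline{e^{2\pi i n x}}\,dx=0$ for $m\ne n$, approximated by averaging over equally spaced sample points (illustrative only):

```python
import numpy as np

def inner(m, n, N=10_000):
    """Approximate the L^2 inner product of e^{2 pi i m x} and e^{2 pi i n x} on [0, 1]."""
    x = np.arange(N) / N
    return np.mean(np.exp(2j * np.pi * m * x) * np.conj(np.exp(2j * np.pi * n * x)))

print(abs(inner(3, 5)))   # essentially 0: distinct frequencies are orthogonal
print(abs(inner(4, 4)))   # 1.0: each exponential has norm 1
```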

Orthogonality relations also appear in:

  • Dirichlet characters,
  • representation theory,
  • harmonic analysis.

G.13 Linear Transformations

A linear transformation

$$T:V\to W$$

satisfies

$$T(u+v)=T(u)+T(v),\qquad T(cv)=cT(v).$$

Every matrix defines a linear transformation.

The kernel is

$$\ker(T)=\{v:T(v)=0\}.$$

The image is

$$\operatorname{im}(T)=\{T(v):v\in V\}.$$

The rank-nullity theorem states:

$$\dim(V) = \dim(\ker T) + \dim(\operatorname{im} T).$$
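
The theorem can be checked numerically for a matrix viewed as a map $T:\mathbb{R}^4\to\mathbb{R}^3$ (illustrative entries), with the kernel read off from the singular value decomposition:

```python
import numpy as np

A = np.array([[1.0, 0.0, 2.0, 1.0],
              [0.0, 1.0, 1.0, 1.0],
              [1.0, 1.0, 3.0, 2.0]])   # third row = first + second

dim_V = A.shape[1]                     # dimension of the domain R^4
rank = np.linalg.matrix_rank(A)        # dim(im T)

# Rows of vh whose singular value is (numerically) zero span ker T.
_, s, vh = np.linalg.svd(A)
kernel_basis = vh[np.count_nonzero(s > 1e-10):]
nullity = kernel_basis.shape[0]        # dim(ker T)

print(rank, nullity, rank + nullity == dim_V)   # 2 2 True
```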

G.14 Dual Spaces

The dual space $V^*$ consists of linear maps

$$f:V\to F.$$

Elements of $V^*$ are called linear functionals.

Duality appears naturally in:

  • representation theory,
  • cohomology,
  • harmonic analysis,
  • automorphic forms.

Modern arithmetic often studies spaces together with their dual structures.

G.15 Bilinear Forms

A bilinear form on $V$ is a function

$$B:V\times V\to F$$

that is linear in each variable.

Examples include dot products and trace pairings.

Quadratic forms arise from bilinear forms and are central in classical number theory.

A quadratic form in variables $x_1,\ldots,x_n$ has the form

$$Q(x)=\sum_{i,j}a_{ij}x_ix_j.$$

The study of integer solutions to quadratic equations is a major branch of arithmetic.
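
In matrix language, a quadratic form is $Q(x)=x^{\mathsf T}Ax$ for its coefficient matrix $A=(a_{ij})$; a small sketch with illustrative integer coefficients:

```python
import numpy as np

# Coefficient matrix of Q(x, y) = x^2 + 4xy + 3y^2.
A = np.array([[1, 2],
              [2, 3]])

def Q(x):
    """Evaluate the quadratic form x^T A x."""
    x = np.asarray(x)
    return x @ A @ x

print(Q([1, 0]), Q([0, 1]), Q([1, 1]))   # 1 3 8
```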

G.16 Lattices

A lattice in $\mathbb{R}^n$ is a discrete subgroup generated by linearly independent vectors.

For example,

$$\mathbb{Z}^n$$

is the standard lattice.

Lattices are central in:

  • geometry of numbers,
  • quadratic forms,
  • cryptography,
  • modular forms,
  • sphere packing.

The determinant (covolume) of a lattice is the volume of a fundamental domain; the smaller it is, the denser the lattice.
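
Concretely, if the rows of a matrix $B$ form a basis of the lattice, the determinant is $|\det B|$; a one-line NumPy check with an illustrative basis:

```python
import numpy as np

# Rows are basis vectors of a lattice in R^2 (illustrative values).
B = np.array([[2.0, 0.0],
              [1.0, 3.0]])

covolume = abs(np.linalg.det(B))   # volume of the fundamental parallelogram
print(covolume)                    # 6.0: a smaller covolume means a denser lattice
```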

G.17 Tensor Products

Tensor products combine vector spaces into larger multilinear structures.

Given vector spaces $V$ and $W$, their tensor product is written

$$V\otimes W.$$

Tensor products appear in:

  • representation theory,
  • cohomology,
  • motives,
  • automorphic forms.

They encode bilinear operations in linear form.
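
At the level of matrices, the tensor product of two linear maps is represented by the Kronecker product; a brief NumPy illustration, not tied to any arithmetic application:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 0]])

# A and B each act on a 2-dimensional space; their tensor product
# acts on the 4-dimensional tensor product space.
print(np.kron(A, B).shape)   # (4, 4)
print(np.kron(A, B))
```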

G.18 Linear Algebra over Finite Fields

Finite fields such as

$$\mathbb{F}_p=\mathbb{Z}/p\mathbb{Z}$$

support linear algebra just like $\mathbb{R}$ or $\mathbb{C}$.

Vector spaces over finite fields are fundamental in:

  • coding theory,
  • cryptography,
  • finite geometry,
  • arithmetic combinatorics.

Many arithmetic problems simplify when reduced modulo primes.
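
A minimal sketch of Gaussian elimination over $\mathbb{F}_p$, using plain Python integers reduced mod a prime; solve_mod_p is a hypothetical helper written for illustration, not a library routine:

```python
def solve_mod_p(A, b, p):
    """Solve Ax = b over F_p by Gaussian elimination, assuming A is square
    and invertible mod p (illustrative sketch, no error handling)."""
    n = len(A)
    M = [[A[i][j] % p for j in range(n)] + [b[i] % p] for i in range(n)]
    for col in range(n):
        pivot = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[pivot] = M[pivot], M[col]
        inv = pow(M[col][col], -1, p)          # modular inverse (Python 3.8+)
        M[col] = [(inv * x) % p for x in M[col]]
        for r in range(n):
            if r != col and M[r][col]:
                factor = M[r][col]
                M[r] = [(M[r][j] - factor * M[col][j]) % p for j in range(n + 1)]
    return [row[n] for row in M]

# Illustrative system over F_7: x + 2y = 3, 3x + y = 4; solution x = y = 1.
print(solve_mod_p([[1, 2], [3, 1]], [3, 4], 7))   # [1, 1]
```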

G.19 Spectral Methods in Number Theory

Linear algebra enters modern arithmetic through operators and spectra.

Examples include:

Operator              Arithmetic Role
Hecke operators       modular forms
Frobenius maps        finite fields and cohomology
Laplacians            automorphic spectra
Adjacency matrices    expander graphs
Transfer operators    dynamical zeta functions

Eigenvalues frequently encode arithmetic information.

G.20 Linear Algebraic Language

Concept                        Number-Theoretic Role
Vector space                   algebraic structure
Basis                          coordinate systems
Matrix                         linear transformation
Determinant                    discriminants and volume
Eigenvalue                     spectral arithmetic data
Inner product                  orthogonality and harmonic analysis
Lattice                        discrete arithmetic geometry
Finite field linear algebra    coding and cryptography

Linear algebra provides the language of structure and symmetry. Classical number theory studies integers directly. Modern number theory studies spaces generated by arithmetic objects and the linear operators acting on them.