# Appendix G. Linear Algebra Review

## G.1 Vector Spaces

A vector space over a field $F$ is a set $V$ equipped with addition and scalar multiplication satisfying the usual algebraic rules: addition is associative and commutative with a zero vector and additive inverses, and scalar multiplication is associative, unital, and distributes over both kinds of addition.

The scalars belong to $F$, while the vectors belong to $V$.

Examples:

$$
\mathbb{R}^n,\qquad \mathbb{C}^n,\qquad \mathbb{Q}^n.
$$

Polynomial spaces such as

$$
F[x]
$$

are also vector spaces over $F$.

In number theory, vector spaces appear in:

- modular forms,
- representation theory,
- cohomology,
- field extensions,
- lattice theory,
- Galois representations.

## G.2 Subspaces

A subset $W\subseteq V$ is a subspace if:

1. $0\in W$,
2. $u,v\in W \implies u+v\in W$,
3. $c\in F,\ u\in W \implies cu\in W$.

Examples in $\mathbb{R}^3$:

- lines through the origin,
- planes through the origin.

Subspaces organize the solution sets of homogeneous linear systems and the algebraic structures left invariant by linear maps.

## G.3 Linear Combinations

Given vectors

$$
v_1,\ldots,v_n,
$$

a linear combination is an expression

$$
a_1v_1+\cdots+a_nv_n
$$

with coefficients $a_i\in F$.

The span of the vectors is the set of all linear combinations:

$$
\operatorname{span}(v_1,\ldots,v_n).
$$

Spanning sets describe how complicated vectors are built from simpler ones.
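Forming a linear combination is a direct componentwise computation. A minimal Python sketch (the helper name `linear_combination` is ours, not standard):

```python
def linear_combination(coeffs, vectors):
    """Form a_1 v_1 + ... + a_n v_n componentwise."""
    return [sum(a * v[i] for a, v in zip(coeffs, vectors))
            for i in range(len(vectors[0]))]

v1, v2 = [1, 0], [1, 1]
print(linear_combination([2, 3], [v1, v2]))  # [5, 3]
```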

## G.4 Linear Independence

Vectors

$$
v_1,\ldots,v_n
$$

are linearly independent if

$$
a_1v_1+\cdots+a_nv_n=0
$$

implies

$$
a_1=\cdots=a_n=0.
$$

Otherwise they are linearly dependent.

Linear independence measures whether vectors contain genuinely new information.
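Over $\mathbb{Q}$, independence can be tested exactly by row reduction: the vectors are independent precisely when the matrix of rows has full rank. A sketch using exact rational arithmetic (helper names `rank` and `independent` are ours):

```python
from fractions import Fraction

def rank(rows):
    """Row-reduce a list of row vectors over Q and count the pivots."""
    rows = [[Fraction(x) for x in r] for r in rows]
    r = 0  # index of the next pivot row
    for c in range(len(rows[0])):
        pivot = next((i for i in range(r, len(rows)) if rows[i][c] != 0), None)
        if pivot is None:
            continue
        rows[r], rows[pivot] = rows[pivot], rows[r]
        for i in range(len(rows)):
            if i != r and rows[i][c] != 0:
                f = rows[i][c] / rows[r][c]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

def independent(vectors):
    """Vectors are independent iff the rank equals their number."""
    return rank(vectors) == len(vectors)

print(independent([[1, 0, 1], [0, 1, 1]]))  # True
print(independent([[1, 2], [2, 4]]))        # False: the second is twice the first
```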

In number theory, independence appears in:

- algebraic number fields,
- transcendence theory,
- lattices,
- character theory.

## G.5 Bases and Dimension

A basis of a vector space is a linearly independent spanning set.

Every vector in the space has a unique representation as a linear combination of basis vectors.

Any two bases of a finite-dimensional space have the same number of elements; this common number is the dimension of the space.

Examples:

$$
(1,0),(0,1)
$$

form a basis of $\mathbb{R}^2$.

The polynomials

$$
1,x,x^2,\ldots,x^n
$$

form a basis for polynomials of degree at most $n$.

Field extensions are vector spaces. For example,

$$
\mathbb{Q}(\sqrt{2})
$$

has basis

$$
1,\sqrt{2}
$$

over $\mathbb{Q}$, so its dimension is $2$.
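The vector-space point of view makes $\mathbb{Q}(\sqrt{2})$ computable: every element is a coordinate pair $(a,b)$ in the basis $\{1,\sqrt{2}\}$, and multiplication uses $(\sqrt{2})^2=2$. An illustrative sketch (the class name `QSqrt2` is ours):

```python
from fractions import Fraction

class QSqrt2:
    """Element a + b*sqrt(2) of Q(sqrt 2), stored in the basis {1, sqrt 2}."""
    def __init__(self, a, b):
        self.a, self.b = Fraction(a), Fraction(b)
    def __add__(self, o):
        return QSqrt2(self.a + o.a, self.b + o.b)
    def __mul__(self, o):
        # (a + b sqrt 2)(c + d sqrt 2) = (ac + 2bd) + (ad + bc) sqrt 2
        return QSqrt2(self.a * o.a + 2 * self.b * o.b,
                      self.a * o.b + self.b * o.a)
    def __repr__(self):
        return f"{self.a} + {self.b}*sqrt(2)"

x = QSqrt2(1, 1)  # 1 + sqrt(2)
print(x * x)      # 3 + 2*sqrt(2)
```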

## G.6 Matrices

An $m\times n$ matrix over a field $F$ is a rectangular array

$$
A=(a_{ij}).
$$

Matrices represent linear transformations and systems of equations.

Matrix addition and multiplication obey the associative and distributive laws, although matrix multiplication is not commutative in general.

The identity matrix satisfies

$$
IA=AI=A.
$$

Matrices are central in computational number theory, representation theory, and arithmetic geometry.
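The identity law $IA=AI=A$ can be checked directly from the row-by-column multiplication rule. A minimal sketch (the helper name `matmul` is ours):

```python
def matmul(A, B):
    """Multiply an m x n matrix by an n x p matrix, row by column."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

I = [[1, 0], [0, 1]]
A = [[1, 2], [3, 4]]
print(matmul(I, A) == A and matmul(A, I) == A)  # True
```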

## G.7 Systems of Linear Equations

A linear system may be written:

$$
Ax=b.
$$

Solutions are found using row reduction or matrix inversion.

Gaussian elimination systematically transforms matrices into simpler forms.
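For an invertible square system, elimination can be carried out exactly over $\mathbb{Q}$. An illustrative sketch of Gauss-Jordan elimination on the augmented matrix (the helper name `solve` is ours):

```python
from fractions import Fraction

def solve(A, b):
    """Solve Ax = b for an invertible square A by Gauss-Jordan elimination over Q."""
    n = len(A)
    M = [[Fraction(x) for x in row] + [Fraction(bi)]
         for row, bi in zip(A, b)]                      # augmented matrix [A | b]
    for c in range(n):
        pivot = next(i for i in range(c, n) if M[i][c] != 0)
        M[c], M[pivot] = M[pivot], M[c]                 # move a nonzero pivot up
        for i in range(n):
            if i != c and M[i][c] != 0:
                f = M[i][c] / M[c][c]
                M[i] = [u - f * v for u, v in zip(M[i], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

print(solve([[2, 1], [1, 3]], [5, 10]))  # x = 1, y = 3
```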

Linear systems appear in:

- lattice reduction,
- modular arithmetic,
- coding theory,
- elliptic curve algorithms,
- algebraic geometry.

## G.8 Rank

The rank of a matrix is the dimension of its row space or column space.

It measures the number of independent linear constraints.

For an $m\times n$ matrix $A$,

$$
\operatorname{rank}(A)\le \min(m,n).
$$

A square matrix is invertible exactly when its rank equals its size.

Rank is fundamental in solving linear systems and studying bilinear forms.

## G.9 Determinants

The determinant of an $n\times n$ matrix $A$ is a scalar denoted

$$
\det(A).
$$

For a $2\times2$ matrix,

$$
\det
\begin{pmatrix}
a & b \\
c & d
\end{pmatrix} =
ad-bc.
$$

A matrix is invertible exactly when

$$
\det(A)\ne0.
$$

Determinants measure volume scaling and orientation.
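In the $2\times2$ case the formula $ad-bc$ also produces the inverse explicitly when it is nonzero. A sketch in exact rational arithmetic (helper names `det2` and `inverse2` are ours):

```python
from fractions import Fraction

def det2(A):
    """Determinant ad - bc of a 2x2 matrix."""
    (a, b), (c, d) = A
    return a * d - b * c

def inverse2(A):
    """Inverse of a 2x2 matrix, defined exactly when det2(A) != 0."""
    (a, b), (c, d) = A
    det = Fraction(det2(A))
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[1, 2], [3, 4]]
print(det2(A))       # -2, nonzero, so A is invertible
print(inverse2(A))
```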

In number theory they appear in:

- discriminants,
- lattice covolumes,
- Jacobians,
- algebraic geometry.

## G.10 Eigenvalues and Eigenvectors

A nonzero vector $v$ is an eigenvector of $A$ if

$$
Av=\lambda v
$$

for some scalar $\lambda$, called the eigenvalue.

Eigenvalues are roots of the characteristic polynomial:

$$
\det(A-\lambda I)=0.
$$
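For a $2\times2$ matrix the characteristic polynomial is $\lambda^2-\operatorname{tr}(A)\lambda+\det(A)$, so the eigenvalues follow from the quadratic formula. A sketch for the real-eigenvalue case (the helper name `eigenvalues_2x2` is ours):

```python
import math

def eigenvalues_2x2(A):
    """Real eigenvalues of a 2x2 matrix, from det(A - lambda I) = 0."""
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4 * det
    if disc < 0:
        raise ValueError("eigenvalues are complex")
    r = math.sqrt(disc)
    return sorted([(tr - r) / 2, (tr + r) / 2])

A = [[2, 1], [1, 2]]
print(eigenvalues_2x2(A))  # [1.0, 3.0]; eigenvectors (1, -1) and (1, 1)
```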

Spectral methods appear throughout modern number theory:

- Hecke operators,
- automorphic forms,
- graph theory,
- random matrices,
- arithmetic dynamics.

## G.11 Inner Products

An inner product on a real vector space $V$ is a function

$$
\langle \cdot,\cdot\rangle:V\times V\to \mathbb{R}
$$

satisfying bilinearity, symmetry, and positive-definiteness. (Over $\mathbb{C}$, symmetry is replaced by conjugate symmetry.)

For vectors in $\mathbb{R}^n$,

$$
\langle x,y\rangle =
x_1y_1+\cdots+x_ny_n.
$$

The associated norm is

$$
\|x\|=\sqrt{\langle x,x\rangle}.
$$

Inner products define geometry inside vector spaces.

Orthogonality is essential in Fourier analysis and modular form theory.
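The standard inner product and its norm are one-line computations; note the classical $3$-$4$-$5$ triangle appearing as $\|(3,4)\|=5$. A minimal sketch (helper names `inner` and `norm` are ours):

```python
import math

def inner(x, y):
    """Standard inner product x_1 y_1 + ... + x_n y_n on R^n."""
    return sum(a * b for a, b in zip(x, y))

def norm(x):
    """Norm induced by the inner product."""
    return math.sqrt(inner(x, x))

print(inner([1, 2], [3, 4]))  # 11
print(norm([3, 4]))           # 5.0
```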

## G.12 Orthogonality

Vectors $u$ and $v$ are orthogonal if

$$
\langle u,v\rangle=0.
$$

Orthogonal bases simplify computations and expansions.

Fourier series arise from orthogonal exponentials:

$$
e^{2\pi i n x}.
$$

Orthogonality relations also appear in:

- Dirichlet characters,
- representation theory,
- harmonic analysis.
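The orthogonality of the exponentials can be observed in its discrete form: sampling $e^{2\pi i n x}$ at $N$ equally spaced points gives vectors whose Hermitian inner products vanish for distinct frequencies (up to floating-point error). An illustrative sketch (the helper name `dft_inner` is ours):

```python
import cmath
from math import pi

def dft_inner(m, n, N):
    """Normalized Hermitian inner product of the sampled exponentials
    e^{2 pi i m k/N} and e^{2 pi i n k/N}, k = 0, ..., N-1."""
    return sum(cmath.exp(2j * pi * m * k / N) *
               cmath.exp(2j * pi * n * k / N).conjugate()
               for k in range(N)) / N

print(abs(dft_inner(1, 2, 8)))  # ~0: distinct frequencies are orthogonal
print(abs(dft_inner(3, 3, 8)))  # ~1: each exponential has unit norm
```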

## G.13 Linear Transformations

A linear transformation

$$
T:V\to W
$$

satisfies

$$
T(u+v)=T(u)+T(v),
$$

$$
T(cv)=cT(v).
$$

Every matrix defines a linear transformation.

The kernel is

$$
\ker(T)=\{v:T(v)=0\}.
$$

The image is

$$
\operatorname{im}(T)=\{T(v):v\in V\}.
$$

The rank-nullity theorem states:

$$
\dim(V) =
\dim(\ker T)
+
\dim(\operatorname{im}T).
$$
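The theorem can be verified by brute force over a small finite field, where the kernel and image are finite sets of sizes $p^{\dim\ker T}$ and $p^{\dim\operatorname{im}T}$. An illustration over $\mathbb{F}_5$ with a rank-one matrix (all helper names are ours):

```python
from itertools import product

p = 5
A = [[1, 2, 0], [2, 4, 0]]  # rank 1 over F_5: the second row is twice the first

def apply_map(A, v):
    """Apply the matrix A to the vector v, reducing mod p."""
    return tuple(sum(a * x for a, x in zip(row, v)) % p for row in A)

def p_log(n, p):
    """Exact base-p logarithm of a power of p."""
    d = 0
    while n > 1:
        n //= p
        d += 1
    return d

domain = list(product(range(p), repeat=3))            # all of F_5^3
kernel = [v for v in domain if apply_map(A, v) == (0, 0)]
image = {apply_map(A, v) for v in domain}

dim_ker, dim_im = p_log(len(kernel), p), p_log(len(image), p)
print(dim_ker + dim_im)  # equals dim V = 3, as rank-nullity predicts
```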

## G.14 Dual Spaces

The dual space $V^*$ consists of linear maps

$$
f:V\to F.
$$

Elements of $V^*$ are called linear functionals.

Duality appears naturally in:

- representation theory,
- cohomology,
- harmonic analysis,
- automorphic forms.

Modern arithmetic often studies spaces together with their dual structures.

## G.15 Bilinear Forms

A bilinear form on $V$ is a function

$$
B:V\times V\to F
$$

that is linear in each variable.

Examples include dot products and trace pairings.

Quadratic forms arise from bilinear forms and are central in classical number theory.

A quadratic form in variables $x_1,\ldots,x_n$ has the form

$$
Q(x)=\sum_{i,j}a_{ij}x_ix_j.
$$

The study of integer solutions to quadratic equations is a major branch of arithmetic.
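Evaluating $Q(x)=\sum_{i,j}a_{ij}x_ix_j$ from its coefficient matrix is a double sum; the form $x^2+y^2$ below asks which integers are sums of two squares, a classical problem. A minimal sketch (the helper name `quadratic_form` is ours):

```python
def quadratic_form(A, x):
    """Evaluate Q(x) = sum_{i,j} a_ij x_i x_j for a coefficient matrix A."""
    n = len(x)
    return sum(A[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

A = [[1, 0], [0, 1]]               # the form x^2 + y^2
print(quadratic_form(A, (3, 4)))   # 25 = 3^2 + 4^2
```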

## G.16 Lattices

A lattice in $\mathbb{R}^n$ is a discrete subgroup generated by linearly independent vectors.

For example,

$$
\mathbb{Z}^n
$$

is the standard lattice.

Lattices are central in:

- geometry of numbers,
- quadratic forms,
- cryptography,
- modular forms,
- sphere packing.

The determinant of a lattice is the volume of a fundamental domain (its covolume); the smaller the determinant, the denser the lattice.
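For a planar lattice $\mathbb{Z}u+\mathbb{Z}v$ the covolume is the absolute determinant of the basis matrix. A minimal sketch (the helper name `lattice_covolume_2d` is ours):

```python
def lattice_covolume_2d(u, v):
    """Covolume of the lattice Zu + Zv in R^2: |det| of the basis matrix."""
    return abs(u[0] * v[1] - u[1] * v[0])

print(lattice_covolume_2d((1, 0), (0, 1)))  # 1: the standard lattice Z^2
print(lattice_covolume_2d((2, 0), (1, 1)))  # 2: an index-2 sublattice
```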

## G.17 Tensor Products

Tensor products combine vector spaces into larger multilinear structures.

Given vector spaces $V$ and $W$, their tensor product is written

$$
V\otimes W.
$$

Tensor products appear in:

- representation theory,
- cohomology,
- motives,
- automorphic forms.

They encode bilinear operations in linear form.

## G.18 Linear Algebra over Finite Fields

Finite fields such as

$$
\mathbb{F}_p=\mathbb{Z}/p\mathbb{Z}
$$

support linear algebra just like $\mathbb{R}$ or $\mathbb{C}$.

Vector spaces over finite fields are fundamental in:

- coding theory,
- cryptography,
- finite geometry,
- arithmetic combinatorics.

Many arithmetic problems simplify when reduced modulo primes.
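Over $\mathbb{F}_p$ the usual formulas apply verbatim, with division replaced by the modular inverse. A sketch solving a $2\times2$ system by Cramer's rule (the helper name `solve2_mod_p` is ours; `pow(det, -1, p)` computes the inverse mod $p$):

```python
def solve2_mod_p(A, b, p):
    """Solve a 2x2 system Ax = b over F_p by Cramer's rule."""
    (a, a12), (c, d) = A
    det = (a * d - a12 * c) % p
    if det == 0:
        raise ValueError("matrix not invertible mod p")
    inv = pow(det, -1, p)               # modular inverse of the determinant
    x = (b[0] * d - a12 * b[1]) * inv % p
    y = (a * b[1] - b[0] * c) * inv % p
    return x, y

print(solve2_mod_p([[1, 2], [3, 4]], [5, 6], 7))  # (3, 1)
```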

## G.19 Spectral Methods in Number Theory

Linear algebra enters modern arithmetic through operators and spectra.

Examples include:

| Operator | Arithmetic Role |
|---|---|
| Hecke operators | modular forms |
| Frobenius maps | finite fields and cohomology |
| Laplacians | automorphic spectra |
| Adjacency matrices | expander graphs |
| Transfer operators | dynamical zeta functions |

Eigenvalues frequently encode arithmetic information.

## G.20 Linear Algebraic Language

| Concept | Number-Theoretic Role |
|---|---|
| Vector space | algebraic structure |
| Basis | coordinate systems |
| Matrix | linear transformation |
| Determinant | discriminants and volume |
| Eigenvalue | spectral arithmetic data |
| Inner product | orthogonality and harmonic analysis |
| Lattice | discrete arithmetic geometry |
| Finite field linear algebra | coding and cryptography |

Linear algebra provides the language of structure and symmetry. Classical number theory studies integers directly. Modern number theory studies spaces generated by arithmetic objects and the linear operators acting on them.

