A tridiagonal matrix is a matrix whose nonzero entries lie only on the main diagonal, the first subdiagonal, and the first superdiagonal. Tridiagonal form is central in numerical linear algebra because many eigenvalue and linear system algorithms become much faster and simpler on tridiagonal matrices.
An $n \times n$ matrix $A$ is tridiagonal if

$$ a_{ij} = 0 \quad \text{whenever } |i - j| > 1. $$

Thus $A$ has the structure

$$ A = \begin{bmatrix} a_1 & b_1 & & & \\ c_1 & a_2 & b_2 & & \\ & c_2 & \ddots & \ddots & \\ & & \ddots & \ddots & b_{n-1} \\ & & & c_{n-1} & a_n \end{bmatrix}. $$

If the matrix is symmetric or Hermitian, then

$$ c_i = b_i \quad \text{for all } i, $$

or

$$ c_i = \overline{b_i} \quad \text{for all } i, $$

respectively.
Tridiagonal matrices arise naturally in discretized differential equations, orthogonal polynomial recurrences, Krylov methods, and symmetric eigenvalue algorithms. Dense symmetric matrices are typically reduced to tridiagonal form before eigenvalue iteration.
83.1 Definition
A tridiagonal matrix has nonzero entries only on three diagonals:
| Diagonal | Allowed entries |
|---|---|
| Main diagonal | $a_{ii}$ |
| First subdiagonal | $a_{i+1,i}$ |
| First superdiagonal | $a_{i,i+1}$ |
All other entries must be zero.
For example,

$$ \begin{bmatrix} 2 & -1 & 0 & 0 \\ -1 & 2 & -1 & 0 \\ 0 & -1 & 2 & -1 \\ 0 & 0 & -1 & 2 \end{bmatrix} $$

is tridiagonal.

The matrix

$$ \begin{bmatrix} 1 & 2 & 5 \\ 3 & 4 & 1 \\ 0 & 2 & 6 \end{bmatrix} $$

is not tridiagonal because the entry in position $(1,3)$ is nonzero.
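The definition translates directly into a membership test. The sketch below, assuming numpy, uses a hypothetical helper name `is_tridiagonal`; it simply checks that every entry with $|i - j| > 1$ vanishes.

```python
import numpy as np

def is_tridiagonal(A, tol=0.0):
    """Return True if every entry with |i - j| > 1 is zero (up to tol)."""
    A = np.asarray(A)
    i, j = np.indices(A.shape)
    # collect all entries outside the three allowed diagonals
    return bool(np.all(np.abs(A[np.abs(i - j) > 1]) <= tol))
```

Applied to the two example matrices above, the first passes the test and the second fails because of its $(1,3)$ entry.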
83.2 Relation to Hessenberg Form
Every tridiagonal matrix is both upper Hessenberg and lower Hessenberg.
Indeed, if

$$ |i - j| > 1, $$

then the entry $a_{ij}$ is zero. In particular:
- if $i > j + 1$, the entry is zero, so the matrix is upper Hessenberg;
- if $j > i + 1$, the entry is zero, so the matrix is lower Hessenberg.
Thus tridiagonal form is more restrictive than Hessenberg form.
For symmetric matrices, Hessenberg reduction automatically produces tridiagonal form. Symmetry forces the upper and lower structures to mirror each other.
83.3 Symmetric Tridiagonal Matrices
A symmetric tridiagonal matrix has the form

$$ T = \begin{bmatrix} \alpha_1 & \beta_1 & & \\ \beta_1 & \alpha_2 & \ddots & \\ & \ddots & \ddots & \beta_{n-1} \\ & & \beta_{n-1} & \alpha_n \end{bmatrix}. $$
This form is extremely important in eigenvalue computation.
A dense real symmetric matrix $A$ can be reduced by orthogonal similarity to symmetric tridiagonal form:

$$ T = Q^T A Q, \qquad Q^T Q = I. $$

The reduction preserves eigenvalues. Since tridiagonal matrices are much cheaper to process than dense matrices, eigenvalue algorithms usually operate on $T$ instead of directly on $A$.
83.4 Reduction by Householder Transformations
Dense symmetric matrices are reduced to tridiagonal form using Householder reflections.
A Householder reflector has the form

$$ H = I - 2\,\frac{vv^T}{v^T v}, $$

where the vector $v$ is chosen so that $H$ maps a given column to a multiple of a unit vector.

At step $k$, the reflector zeros entries below the first subdiagonal in column $k$. Since the matrix is symmetric, applying the reflector from both sides also zeros the corresponding row entries:

$$ A \leftarrow H_k A H_k. $$
After $n - 2$ steps, the matrix becomes symmetric tridiagonal.

The final result satisfies

$$ T = Q^T A Q, $$

where $Q = H_1 H_2 \cdots H_{n-2}$ is the product of the Householder reflectors.
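The reduction above can be sketched in a few lines of numpy. This is a minimal dense prototype, not the packed LAPACK routine (`sytrd`) that production libraries use; `tridiagonalize` is a hypothetical helper name.

```python
import numpy as np

def tridiagonalize(A):
    """Reduce a real symmetric matrix to tridiagonal form T = Q^T A Q
    using full-size Householder reflectors (dense teaching sketch)."""
    T = np.array(A, dtype=float)
    n = T.shape[0]
    Q = np.eye(n)
    for k in range(n - 2):
        x = T[k + 1:, k].copy()
        nx = np.linalg.norm(x)
        if nx == 0.0:
            continue                       # column already in tridiagonal form
        # choose the sign of alpha to avoid cancellation in v[0]
        alpha = -np.sign(x[0]) * nx if x[0] != 0 else -nx
        v = x.copy()
        v[0] -= alpha
        v /= np.linalg.norm(v)
        H = np.eye(n)
        H[k + 1:, k + 1:] -= 2.0 * np.outer(v, v)  # reflector on trailing block
        T = H @ T @ H                      # two-sided update preserves symmetry
        Q = Q @ H                          # accumulate Q = H_1 H_2 ... H_{n-2}
    return T, Q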
83.5 A Four by Four Example
A symmetric matrix
$$ A = \begin{bmatrix}
* & * & * & * \\
* & * & * & * \\
* & * & * & * \\
* & * & * & * \end{bmatrix} $$

is reduced to

$$ T = \begin{bmatrix}
* & * & 0 & 0 \\
* & * & * & 0 \\
0 & * & * & * \\
0 & 0 & * & * \end{bmatrix}. $$
Only the main diagonal and the first neighboring diagonals remain nonzero.
This sparse structure dramatically reduces storage and computational cost.
83.6 Why Tridiagonal Form Matters
The QR algorithm preserves tridiagonal structure.
Suppose $T_k$ is symmetric tridiagonal. If

$$ T_k = Q_k R_k \quad \text{(QR factorization)} $$

and

$$ T_{k+1} = R_k Q_k, $$

then $T_{k+1}$ is also symmetric tridiagonal.
Thus the QR algorithm can operate entirely within the tridiagonal structure. This makes each iteration much cheaper than a dense QR iteration.
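This structure preservation is easy to observe numerically. A small demonstration, assuming numpy, of one unshifted QR step on a symmetric tridiagonal matrix:

```python
import numpy as np

# One unshifted QR step on a symmetric tridiagonal matrix:
# T_next = R @ Q = Q^T T Q stays tridiagonal (off-band entries
# remain at roundoff level) and has the same eigenvalues.
n = 6
T = (np.diag(np.full(n, 2.0))
     + np.diag(np.full(n - 1, -1.0), 1)
     + np.diag(np.full(n - 1, -1.0), -1))
Q, R = np.linalg.qr(T)
T_next = R @ Q
i, j = np.indices(T.shape)
off_band = np.max(np.abs(T_next[np.abs(i - j) > 1]))  # ~ machine epsilon
```

In exact arithmetic `T_next` is exactly tridiagonal: it is Hessenberg because `R` is upper triangular and `Q` is Hessenberg, and it is symmetric because it is orthogonally similar to `T`.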
For dense symmetric eigenvalue problems, the standard pipeline is:
| Stage | Cost |
|---|---|
| Reduce dense symmetric matrix to tridiagonal form | $O(n^3)$ |
| QR iteration on tridiagonal matrix | $O(n^2)$ total, or better in practice |
The expensive dense work occurs only once.
83.7 Storage Efficiency
A dense $n \times n$ matrix requires

$$ n^2 $$

storage locations.

A tridiagonal matrix requires only

$$ 3n - 2 $$

entries:
- $n$ diagonal entries,
- $n - 1$ subdiagonal entries,
- $n - 1$ superdiagonal entries.

For a symmetric tridiagonal matrix, only

$$ 2n - 1 $$

entries are needed, since the superdiagonal and subdiagonal are identical.
This storage reduction is important for large problems.
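The compact representation is also convenient to compute with. A sketch of storing a symmetric tridiagonal matrix as two arrays and applying it to a vector (`symtri_matvec` is a hypothetical helper name):

```python
def symtri_matvec(diag, off, x):
    """y = T x for a symmetric tridiagonal T stored compactly:
    diag holds the n diagonal entries, off the n-1 off-diagonal entries."""
    n = len(diag)
    y = [diag[i] * x[i] for i in range(n)]
    for i in range(n - 1):
        y[i] += off[i] * x[i + 1]      # superdiagonal contribution
        y[i + 1] += off[i] * x[i]      # subdiagonal contribution
    return y

symtri_matvec([2.0, 2.0, 2.0], [-1.0, -1.0], [1.0, 1.0, 1.0])  # → [1.0, 0.0, 1.0]
```

The product costs $O(n)$ operations and never materializes the $n^2$ dense matrix.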
83.8 Solving Tridiagonal Systems
Linear systems

$$ Ax = b $$

with tridiagonal $A$ can be solved very efficiently.

A specialized form of Gaussian elimination, called the Thomas algorithm, solves such systems in linear time, using $O(n)$ operations. This is much cheaper than the $O(n^3)$ cost of dense elimination.
The efficiency comes from the sparse structure. Elimination creates no additional fill outside the tridiagonal pattern.
83.9 The Thomas Algorithm
Suppose

$$ \begin{bmatrix} d_1 & e_1 & & \\ f_1 & d_2 & \ddots & \\ & \ddots & \ddots & e_{n-1} \\ & & f_{n-1} & d_n \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix} = \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{bmatrix}. $$
The Thomas algorithm performs forward elimination and backward substitution using only neighboring entries.
The forward sweep computes modified coefficients recursively. The backward sweep recovers the solution vector.
Because each step touches only a constant number of entries, the total cost grows linearly with $n$.
83.10 Algorithm
The following algorithm solves a tridiagonal system.
```
thomas_algorithm(f, d, e, b):
    n = length(d)
    for i = 2 to n:
        m = f[i-1] / d[i-1]
        d[i] = d[i] - m * e[i-1]
        b[i] = b[i] - m * b[i-1]
    x = array of length n
    x[n] = b[n] / d[n]
    for i = n-1 down to 1:
        x[i] = (b[i] - e[i] * x[i+1]) / d[i]
    return x
```

Here:
- $d_1, \dots, d_n$ are diagonal entries,
- $e_1, \dots, e_{n-1}$ are superdiagonal entries,
- $f_1, \dots, f_{n-1}$ are subdiagonal entries.
The algorithm assumes no zero pivots appear during elimination.
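A direct Python transcription of the pseudocode (0-based indexing, inputs copied so the caller's arrays survive):

```python
def thomas(f, d, e, b):
    """Solve a tridiagonal system in O(n).
    d: diagonal (length n), e: superdiagonal (n-1),
    f: subdiagonal (n-1), b: right-hand side (n).
    No pivoting, so a zero pivot raises ZeroDivisionError."""
    n = len(d)
    d = list(d)                                # work on copies
    b = list(b)
    for i in range(1, n):                      # forward elimination
        m = f[i - 1] / d[i - 1]
        d[i] -= m * e[i - 1]
        b[i] -= m * b[i - 1]
    x = [0.0] * n
    x[-1] = b[-1] / d[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = (b[i] - e[i] * x[i + 1]) / d[i]
    return x

thomas([1.0, 1.0], [2.0, 2.0, 2.0], [1.0, 1.0], [1.0, 2.0, 3.0])
# → approximately [0.5, 0.0, 1.5]
```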
83.11 Example
Consider

$$ A = \begin{bmatrix} 2 & 1 & 0 \\ 1 & 2 & 1 \\ 0 & 1 & 2 \end{bmatrix}, \qquad b = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}. $$

The system is

$$ \begin{aligned} 2x_1 + x_2 &= 1, \\ x_1 + 2x_2 + x_3 &= 2, \\ x_2 + 2x_3 &= 3. \end{aligned} $$

Forward elimination gives

$$ d = \left(2,\ \tfrac{3}{2},\ \tfrac{4}{3}\right), \qquad b = \left(1,\ \tfrac{3}{2},\ 2\right). $$

Backward substitution yields

$$ x_3 = \tfrac{3}{2}, \qquad x_2 = 0, \qquad x_1 = \tfrac{1}{2}. $$

Thus

$$ x = \left(\tfrac{1}{2},\ 0,\ \tfrac{3}{2}\right)^T. $$
83.12 Eigenvalues of Symmetric Tridiagonal Matrices
Symmetric tridiagonal matrices have especially favorable eigenvalue properties.
Their eigenvalues are real because the matrix is symmetric.
Moreover, eigenvalue algorithms become much faster because each QR step preserves tridiagonal structure. Specialized algorithms for symmetric tridiagonal matrices can compute eigenvalues with very high efficiency and accuracy.
This is why dense symmetric eigenvalue computation nearly always begins with tridiagonal reduction.
83.13 Jacobi Matrices
A symmetric tridiagonal matrix with positive subdiagonal entries is often called a Jacobi matrix.
These matrices appear naturally in orthogonal polynomial theory. If

$$ p_0(x),\ p_1(x),\ p_2(x),\ \dots $$

are orthogonal polynomials satisfying a three-term recurrence relation, then multiplication by $x$ in that basis is represented by a Jacobi matrix.
This connection links tridiagonal matrices to Gaussian quadrature, spectral theory, and approximation theory.
83.14 Discrete Differential Operators
Finite difference discretizations of differential equations often produce tridiagonal matrices.
For example, consider the centered second derivative approximation

$$ u''(x_i) \approx \frac{u(x_{i-1}) - 2u(x_i) + u(x_{i+1})}{h^2}. $$

Applied to a boundary value problem such as $-u'' = f$ on a uniform grid, the resulting linear system has matrix

$$ \frac{1}{h^2} \begin{bmatrix} 2 & -1 & & \\ -1 & 2 & \ddots & \\ & \ddots & \ddots & -1 \\ & & -1 & 2 \end{bmatrix}. $$

This matrix is symmetric tridiagonal.
Such matrices arise in heat equations, Poisson equations, wave equations, quantum mechanics, and many other discretized physical systems.
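This particular matrix is a useful test case because its eigenvalues are known in closed form: $\lambda_k = \left(2 - 2\cos\frac{k\pi}{n+1}\right)/h^2$ for $k = 1, \dots, n$. A quick numerical check, assuming numpy:

```python
import numpy as np

# Build the n x n matrix (1/h^2) * tridiag(-1, 2, -1) and compare its
# computed eigenvalues with the closed-form values.
n = 8
h = 1.0 / (n + 1)
T = (np.diag(np.full(n, 2.0))
     + np.diag(np.full(n - 1, -1.0), 1)
     + np.diag(np.full(n - 1, -1.0), -1)) / h**2
computed = np.linalg.eigvalsh(T)                      # ascending order
k = np.arange(1, n + 1)
exact = (2.0 - 2.0 * np.cos(k * np.pi / (n + 1))) / h**2
```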
83.15 Tridiagonalization and Lanczos Iteration
Lanczos iteration is a Krylov subspace method for symmetric matrices.
Given a symmetric matrix $A$, it constructs orthonormal vectors

$$ q_1, q_2, \dots, q_k $$

such that

$$ Q_k^T A Q_k = T_k, $$

where $Q_k = [\,q_1 \mid \cdots \mid q_k\,]$ and $T_k$ is symmetric tridiagonal.
The tridiagonal structure appears because symmetry shortens the recurrence. Each new vector depends only on the previous two basis vectors.
This is analogous to how Arnoldi iteration produces Hessenberg matrices for general matrices.
83.16 Three-Term Recurrence
Lanczos iteration satisfies the recurrence

$$ \beta_k q_{k+1} = A q_k - \alpha_k q_k - \beta_{k-1} q_{k-1}. $$

Only three neighboring basis vectors appear.

This recurrence corresponds exactly to the tridiagonal structure:

$$ T_k = \begin{bmatrix} \alpha_1 & \beta_1 & & \\ \beta_1 & \alpha_2 & \ddots & \\ & \ddots & \ddots & \beta_{k-1} \\ & & \beta_{k-1} & \alpha_k \end{bmatrix}. $$
The symmetry of $A$ is what allows the Hessenberg structure to collapse into tridiagonal structure.
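The recurrence can be sketched directly in numpy. This is a bare-bones version with no reorthogonalization (in floating point, orthogonality degrades for large $k$); `lanczos` is a hypothetical helper name.

```python
import numpy as np

def lanczos(A, q1, k):
    """k steps of the Lanczos three-term recurrence on symmetric A.
    Returns Q (n x k), alpha (k), beta (k-1) with Q^T A Q = T_k
    in exact arithmetic. No reorthogonalization."""
    n = len(q1)
    Q = np.zeros((n, k))
    alpha = np.zeros(k)
    beta = np.zeros(k - 1)
    Q[:, 0] = q1 / np.linalg.norm(q1)
    for j in range(k):
        w = A @ Q[:, j]
        alpha[j] = Q[:, j] @ w
        w = w - alpha[j] * Q[:, j]
        if j > 0:
            w = w - beta[j - 1] * Q[:, j - 1]   # three-term recurrence
        if j < k - 1:
            beta[j] = np.linalg.norm(w)
            Q[:, j + 1] = w / beta[j]
    return Q, alpha, beta
```

Checking `Q.T @ A @ Q` against `tridiag(beta, alpha, beta)` for a few steps confirms the collapse to tridiagonal form.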
83.17 Positive Definite Tridiagonal Matrices
Many tridiagonal matrices are also positive definite.
For example,

$$ \begin{bmatrix} 2 & -1 & & \\ -1 & 2 & \ddots & \\ & \ddots & \ddots & -1 \\ & & -1 & 2 \end{bmatrix} $$

is symmetric positive definite.
Such matrices arise in discretized elliptic partial differential equations.
Positive definiteness allows the use of Cholesky factorization. For tridiagonal matrices, the Cholesky factors are bidiagonal, so factorization and solves remain linear-time operations.
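The no-fill claim is easy to verify numerically, assuming numpy: the Cholesky factor of a symmetric positive definite tridiagonal matrix has nonzeros only on its diagonal and first subdiagonal.

```python
import numpy as np

# Cholesky factor of an SPD tridiagonal matrix is lower bidiagonal:
# elimination creates no fill outside the original band.
n = 5
T = (np.diag(np.full(n, 2.0))
     + np.diag(np.full(n - 1, -1.0), 1)
     + np.diag(np.full(n - 1, -1.0), -1))
L = np.linalg.cholesky(T)               # lower triangular, T = L @ L.T
i, j = np.indices(L.shape)
fill = np.max(np.abs(L[(i - j > 1) | (j > i)]))  # entries off the bidiagonal
```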
83.18 Comparison with Other Structured Forms
| Form | Nonzero pattern | Main use |
|---|---|---|
| Diagonal | Main diagonal only | Fully decoupled systems |
| Tridiagonal | Three diagonals | Symmetric eigenproblems, PDEs |
| Hessenberg | Upper (or lower) triangle plus first subdiagonal | General eigenproblems |
| Bidiagonal | Main diagonal plus one adjacent diagonal | SVD computation |
| Band matrix | Limited-width diagonal band | Sparse structured systems |
Tridiagonal matrices are among the simplest structured sparse matrices while still capturing important coupling between neighboring variables.
83.19 Numerical Stability
Algorithms for symmetric tridiagonal matrices are usually highly stable.
Orthogonal similarity transformations preserve symmetry and eigenvalues while controlling roundoff growth. QR iteration on symmetric tridiagonal matrices is particularly reliable and efficient.
The Thomas algorithm, however, may require pivoting if the matrix is not diagonally dominant or positive definite. Without pivoting, zero or very small pivots can cause instability.
For symmetric positive definite tridiagonal matrices, Cholesky-based methods are especially robust.
83.20 Summary
A tridiagonal matrix has nonzero entries only on the main diagonal and the first neighboring diagonals:

$$ a_{ij} = 0 \quad \text{for } |i - j| > 1. $$

Symmetric matrices can be reduced by orthogonal similarity to symmetric tridiagonal form:

$$ T = Q^T A Q. $$
Tridiagonal structure greatly reduces storage and computational cost. Linear systems can be solved in linear time, and eigenvalue algorithms become much more efficient.
Tridiagonal matrices arise naturally in finite difference methods, orthogonal polynomials, Lanczos iteration, and symmetric eigenvalue computation. They form one of the most important structured matrix classes in numerical linear algebra.