A bidiagonal matrix is a matrix whose nonzero entries lie only on the main diagonal and one adjacent diagonal. Bidiagonal form is fundamental in singular value computation because dense matrices are typically reduced to bidiagonal form before applying iterative SVD algorithms.
A matrix is upper bidiagonal if its only possible nonzero entries are on the main diagonal and the first superdiagonal. It is lower bidiagonal if its only possible nonzero entries are on the main diagonal and the first subdiagonal.
An upper bidiagonal matrix has the form
$$ B = \begin{bmatrix} b_{11} & b_{12} & & \\ & b_{22} & b_{23} & \\ & & \ddots & \ddots \\ & & & b_{nn} \end{bmatrix}. $$
Dense singular value algorithms typically reduce a matrix to bidiagonal form using Householder transformations, then apply iterative diagonalization to the bidiagonal matrix.
84.1 Definition
A bidiagonal matrix has nonzero entries on only two diagonals.
For an upper bidiagonal matrix,
$$ b_{ij} = 0 \quad \text{unless } j = i \text{ or } j = i + 1. $$
For a lower bidiagonal matrix,
$$ b_{ij} = 0 \quad \text{unless } j = i \text{ or } j = i - 1. $$
Examples:
Upper bidiagonal:
$$ \begin{bmatrix} 1 & 4 & 0 \\ 0 & 2 & 5 \\ 0 & 0 & 3 \end{bmatrix} $$
Lower bidiagonal:
$$ \begin{bmatrix} 1 & 0 & 0 \\ 4 & 2 & 0 \\ 0 & 5 & 3 \end{bmatrix} $$
Both are sparse matrix forms with only $2n - 1$ potentially nonzero entries in the $n \times n$ square case.
84.2 Relation to Tridiagonal Form
Bidiagonal matrices are analogous to tridiagonal matrices but even sparser.
| Form | Allowed diagonals |
|---|---|
| Diagonal | Main diagonal |
| Bidiagonal | Main diagonal plus one adjacent diagonal |
| Tridiagonal | Main diagonal plus both adjacent diagonals |
Tridiagonal matrices arise naturally in symmetric eigenvalue problems. Bidiagonal matrices arise naturally in singular value problems.
The connection is deeper than analogy. If $B$ is bidiagonal, then
$$ B^{T} B $$
and
$$ B B^{T} $$
are tridiagonal. Since singular values are related to eigenvalues of these products, bidiagonal structure leads naturally to tridiagonal structure.
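This is easy to confirm numerically. A small NumPy check (the matrix entries are arbitrary illustrative values):

```python
import numpy as np

# A small upper bidiagonal matrix: main diagonal and first superdiagonal only.
B = np.diag([1.0, 2.0, 3.0, 4.0]) + np.diag([5.0, 6.0, 7.0], k=1)

BtB = B.T @ B   # tridiagonal
BBt = B @ B.T   # tridiagonal

def is_tridiagonal(M):
    """True if all entries outside the three central diagonals are zero."""
    i, j = np.indices(M.shape)
    return np.all(M[np.abs(i - j) > 1] == 0)

print(is_tridiagonal(BtB), is_tridiagonal(BBt))  # both True
```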
84.3 Why Bidiagonal Form Matters
Singular value decomposition algorithms are expensive on general dense matrices.
A standard dense SVD pipeline therefore has two stages:
| Stage | Goal |
|---|---|
| Bidiagonal reduction | Reduce dense matrix to bidiagonal form |
| Iterative diagonalization | Compute singular values from bidiagonal matrix |
The reduction stage costs
$$ O(mn^{2}) $$
for an $m \times n$ matrix with $m \ge n$. After reduction, the iterative stage becomes much cheaper because bidiagonal structure is preserved during iteration.
This mirrors the role of Hessenberg and tridiagonal forms in eigenvalue computation.
84.4 Reduction by Householder Transformations
Bidiagonal reduction uses alternating left and right Householder transformations.
Suppose $A$ is an $m \times n$ matrix with $m \ge n$.
At step $k$:
- A left Householder reflector zeros entries below the diagonal in column $k$.
- A right Householder reflector zeros entries to the right of the first superdiagonal in row $k$.
The process alternates until the matrix becomes upper bidiagonal.
The final decomposition has the form
$$ U^{T} A V = B, $$
where $U$ and $V$ are orthogonal matrices and $B$ is upper bidiagonal.
Equivalently,
$$ A = U B V^{T}. $$
This already resembles the structure of the singular value decomposition. The bidiagonal matrix is an intermediate step between the original matrix and the diagonal singular value matrix.
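The alternating reduction can be sketched in NumPy as follows. This is a minimal dense implementation that forms each reflector explicitly; production codes such as LAPACK's `dgebrd` apply reflectors implicitly and store them compactly.

```python
import numpy as np

def householder_vector(x):
    """Return v, beta with (I - beta v v^T) x = -s ||x|| e_1 for a sign s."""
    v = x.astype(float).copy()
    norm_x = np.linalg.norm(x)
    if norm_x == 0.0:
        return v, 0.0
    s = 1.0 if x[0] >= 0 else -1.0   # sign choice avoids cancellation
    v[0] += s * norm_x
    return v, 2.0 / (v @ v)

def bidiagonalize(A):
    """Reduce A (m x n, m >= n) to upper bidiagonal B with A = U @ B @ V.T,
    U and V orthogonal, by alternating left/right Householder reflectors."""
    m, n = A.shape
    B = A.astype(float).copy()
    U, V = np.eye(m), np.eye(n)
    for k in range(n):
        # Left reflector: zero entries below the diagonal in column k.
        v, beta = householder_vector(B[k:, k])
        H = np.eye(m - k) - beta * np.outer(v, v)
        B[k:, :] = H @ B[k:, :]
        U[:, k:] = U[:, k:] @ H
        if k < n - 2:
            # Right reflector: zero entries right of the superdiagonal in row k.
            v, beta = householder_vector(B[k, k+1:])
            H = np.eye(n - k - 1) - beta * np.outer(v, v)
            B[:, k+1:] = B[:, k+1:] @ H
            V[:, k+1:] = V[:, k+1:] @ H
    return U, B, V

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 4))
U, B, V = bidiagonalize(A)
assert np.allclose(U @ B @ V.T, A)                  # exact factorization
assert np.allclose(np.tril(B, -1), 0, atol=1e-12)   # zeros below the diagonal
assert np.allclose(np.triu(B, 2), 0, atol=1e-12)    # zeros above the superdiagonal
```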
84.5 Householder Reflections
A Householder reflector has the form
$$ H = I - \beta v v^{T}, $$
where
$$ \beta = \frac{2}{v^{T} v}. $$
It is orthogonal and symmetric:
$$ H^{T} = H, \qquad H^{T} H = H^{2} = I. $$
Left multiplication by $H$ changes rows. Right multiplication by $H$ changes columns.
In bidiagonal reduction:
- left reflectors introduce zeros in columns,
- right reflectors introduce zeros in rows.
Orthogonal transformations preserve singular values and maintain good numerical stability.
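A concrete NumPy example of a single reflector, chosen so that it zeros all entries of a vector below its first component:

```python
import numpy as np

x = np.array([3.0, 4.0, 0.0, 12.0])        # ||x|| = 13
v = x.copy()
v[0] += np.sign(x[0]) * np.linalg.norm(x)  # sign choice avoids cancellation
H = np.eye(4) - 2.0 * np.outer(v, v) / (v @ v)

# H is orthogonal and symmetric, and H @ x is a multiple of e_1.
assert np.allclose(H @ H.T, np.eye(4))
assert np.allclose(H, H.T)
print(H @ x)   # approximately [-13, 0, 0, 0]
```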
84.6 A Rectangular Example Pattern
Suppose $m = 5$ and $n = 4$.
A dense matrix begins as
$$ \begin{bmatrix}
* & * & * & * \\
* & * & * & * \\
* & * & * & * \\
* & * & * & * \\
* & * & * & * \end{bmatrix}. $$
After bidiagonal reduction, it becomes
$$ B = \begin{bmatrix}
* & * & 0 & 0 \\ 0 & * & * & 0 \\ 0 & 0 & * & * \\ 0 & 0 & 0 & * \\ 0 & 0 & 0 & 0 \end{bmatrix}. $$
Only the main diagonal and the first superdiagonal remain potentially nonzero.
84.7 Golub-Kahan Bidiagonalization
Golub-Kahan bidiagonalization is one of the central procedures in numerical linear algebra.
Starting with a unit vector $v_1$, the method constructs orthonormal sequences
$$ u_1, u_2, u_3, \dots $$
and
$$ v_1, v_2, v_3, \dots $$
satisfying
$$ \alpha_1 u_1 = A v_1, \qquad \beta_k v_{k+1} = A^{T} u_k - \alpha_k v_k, \qquad \alpha_{k+1} u_{k+1} = A v_{k+1} - \beta_k u_k. $$
The resulting projected matrix is bidiagonal:
$$ U_k^{T} A V_k = B_k = \begin{bmatrix} \alpha_1 & \beta_1 & & \\ & \alpha_2 & \ddots & \\ & & \ddots & \beta_{k-1} \\ & & & \alpha_k \end{bmatrix}. $$
This process is closely related to Lanczos iteration and forms the basis of many iterative SVD methods.
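A sketch of the iteration in NumPy, using the standard recurrences $\alpha_1 u_1 = A v_1$, $\beta_k v_{k+1} = A^T u_k - \alpha_k v_k$, $\alpha_{k+1} u_{k+1} = A v_{k+1} - \beta_k u_k$; full reorthogonalization is added here for floating-point stability, which textbook presentations often omit:

```python
import numpy as np

def golub_kahan(A, v1, k):
    """Run k steps of Golub-Kahan bidiagonalization from unit vector v1.
    Returns U (m x k), V (n x k) with orthonormal columns and the
    k x k upper bidiagonal projection B = U.T @ A @ V."""
    m, n = A.shape
    U, V = np.zeros((m, k)), np.zeros((n, k))
    alpha, beta = np.zeros(k), np.zeros(k - 1)
    V[:, 0] = v1
    u = A @ v1
    alpha[0] = np.linalg.norm(u)
    U[:, 0] = u / alpha[0]
    for j in range(k - 1):
        w = A.T @ U[:, j] - alpha[j] * V[:, j]
        w -= V @ (V.T @ w)                  # reorthogonalize against earlier v's
        beta[j] = np.linalg.norm(w)
        V[:, j + 1] = w / beta[j]
        u = A @ V[:, j + 1] - beta[j] * U[:, j]
        u -= U @ (U.T @ u)                  # reorthogonalize against earlier u's
        alpha[j + 1] = np.linalg.norm(u)
        U[:, j + 1] = u / alpha[j + 1]
    B = np.diag(alpha) + np.diag(beta, 1)
    return U, V, B

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 6))
v1 = np.zeros(6)
v1[0] = 1.0
U, V, B = golub_kahan(A, v1, 4)
assert np.allclose(U.T @ A @ V, B, atol=1e-10)   # projection is bidiagonal
```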
84.8 Singular Values of a Bidiagonal Matrix
If $B$ is bidiagonal, then its singular values are the square roots of the eigenvalues of
$$ B^{T} B. $$
Because $B$ is bidiagonal, the matrix
$$ B^{T} B $$
is symmetric tridiagonal.
For example, if
$$ B = \begin{bmatrix} 1 & 4 & 0 \\ 0 & 2 & 5 \\ 0 & 0 & 3 \end{bmatrix}, $$
then
$$ B^{T} B = \begin{bmatrix} 1 & 4 & 0 \\ 4 & 20 & 10 \\ 0 & 10 & 34 \end{bmatrix}. $$
This matrix is symmetric tridiagonal.
Thus singular value computation for bidiagonal matrices is closely tied to symmetric tridiagonal eigenvalue computation.
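A NumPy check of this equivalence, using an arbitrary small upper bidiagonal matrix. Note that forming $B^T B$ explicitly squares the condition number, so careful implementations avoid it when tiny singular values matter; this is only an illustration.

```python
import numpy as np

# A small upper bidiagonal matrix.
B = np.array([[1.0, 4.0, 0.0],
              [0.0, 2.0, 5.0],
              [0.0, 0.0, 3.0]])

T = B.T @ B                               # symmetric tridiagonal
sigma = np.sqrt(np.linalg.eigvalsh(T))    # square roots of eigenvalues of B^T B
assert np.allclose(np.sort(sigma), np.sort(np.linalg.svd(B, compute_uv=False)))
```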
84.9 Storage Efficiency
A dense $m \times n$ matrix requires
$$ mn $$
storage locations.
An upper bidiagonal matrix requires only
$$ 2n - 1 $$
entries when square, or approximately
$$ 2\min(m, n) $$
essential entries in the rectangular case.
This dramatic reduction in storage improves cache efficiency and lowers computational cost in iterative algorithms.
84.10 Bidiagonal QR Iteration
After bidiagonal reduction, singular values are computed using iterative methods that preserve bidiagonal structure.
One important method is implicit zero-shift QR iteration on the associated tridiagonal matrices
$$ B^{T} B $$
or
$$ B B^{T}. $$
Modern SVD algorithms often use variants specifically adapted to bidiagonal structure, such as the Golub-Reinsch algorithm or divide-and-conquer methods.
The iterative process gradually drives the superdiagonal entries toward zero. When convergence occurs, the diagonal entries approximate the singular values.
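The flavor of this process can be illustrated with plain unshifted QR iteration applied explicitly to $T = B^T B$. Real bidiagonal SVD codes (e.g. Demmel-Kahan) work on $B$ itself, implicitly and with shifts; this NumPy sketch is only a demonstration of off-diagonal decay.

```python
import numpy as np

B = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 1.0]])
T = B.T @ B                       # symmetric tridiagonal

# Unshifted QR iteration: T_{k+1} = R_k Q_k is orthogonally similar to T_k,
# and its off-diagonal entries decay toward zero.
for _ in range(100):
    Q, R = np.linalg.qr(T)
    T = R @ Q

sigma = np.sqrt(np.sort(np.diag(T))[::-1])   # diagonal -> eigenvalues, descending
assert np.allclose(sigma, np.linalg.svd(B, compute_uv=False), atol=1e-6)
```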
84.11 Example
Consider the upper bidiagonal matrix
$$ B = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}. $$
Compute
$$ B^{T} B = \begin{bmatrix} 1 & 1 \\ 1 & 2 \end{bmatrix}. $$
The characteristic polynomial is
$$ \det(B^{T} B - \lambda I) = (1 - \lambda)(2 - \lambda) - 1. $$
Expanding,
$$ \lambda^{2} - 3\lambda + 1 = 0. $$
Thus the eigenvalues are
$$ \lambda = \frac{3 \pm \sqrt{5}}{2}. $$
Therefore the singular values of $B$ are
$$ \sigma = \sqrt{\frac{3 \pm \sqrt{5}}{2}} = \frac{\sqrt{5} \pm 1}{2}. $$
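A numerical spot-check with NumPy, taking $B$ to be the $2 \times 2$ upper bidiagonal matrix with both stored entries equal to 1; its singular values are the golden ratio $\varphi = (1 + \sqrt{5})/2$ and its reciprocal:

```python
import numpy as np

B = np.array([[1.0, 1.0],
              [0.0, 1.0]])
sigma = np.linalg.svd(B, compute_uv=False)   # descending order

phi = (1.0 + np.sqrt(5.0)) / 2.0             # golden ratio
assert np.allclose(sigma, [phi, 1.0 / phi])
```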
84.12 Bidiagonal Matrices and Linear Systems
Bidiagonal systems are extremely easy to solve.
For an upper bidiagonal matrix $B$, the system
$$ B x = b $$
can be solved by backward substitution.
For a lower bidiagonal matrix $L$, the system
$$ L x = b $$
can be solved by forward substitution.
Since each equation involves at most two unknowns, the total computational cost is linear in the dimension.
84.13 Algorithm for Upper Bidiagonal Solve
Suppose $B$ is upper bidiagonal with diagonal entries $d_1, \dots, d_n$ and superdiagonal entries $e_1, \dots, e_{n-1}$.
The system
$$ B x = b $$
can be solved as follows.

```
upper_bidiagonal_solve(d, e, b):
    n = length(d)
    x[n] = b[n] / d[n]
    for i = n-1 down to 1:
        x[i] = (b[i] - e[i] * x[i+1]) / d[i]
    return x
```

Only neighboring unknowns interact.
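The pseudocode translates directly into runnable Python (the 1-based indices become 0-based):

```python
def upper_bidiagonal_solve(d, e, b):
    """Solve B x = b where B has diagonal d (length n) and
    superdiagonal e (length n-1), by backward substitution in O(n)."""
    n = len(d)
    x = [0.0] * n
    x[n - 1] = b[n - 1] / d[n - 1]
    for i in range(n - 2, -1, -1):
        x[i] = (b[i] - e[i] * x[i + 1]) / d[i]
    return x

# B = [[2, 1, 0], [0, 3, 1], [0, 0, 4]] with x = [1, 1, 1] gives b = [3, 4, 4].
print(upper_bidiagonal_solve([2.0, 3.0, 4.0], [1.0, 1.0], [3.0, 4.0, 4.0]))
# → [1.0, 1.0, 1.0]
```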
84.14 Bidiagonalization and Krylov Methods
Krylov subspace methods for singular values naturally produce bidiagonal matrices.
Lanczos bidiagonalization constructs orthonormal bases for the Krylov subspaces
$$ \mathcal{K}_k(A^{T} A, v_1) $$
and
$$ \mathcal{K}_k(A A^{T}, u_1). $$
The projected matrix becomes bidiagonal.
This is the basis of iterative methods such as:
| Method | Purpose |
|---|---|
| LSQR | Least squares problems |
| LSMR | Regularized least squares |
| PROPACK | Partial SVD computation |
| Lanczos bidiagonalization | Sparse singular value estimation |
The bidiagonal structure captures the short recurrence relations underlying these algorithms.
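As a sketch of how the projected bidiagonal matrix is used for partial SVD (in the spirit of PROPACK, not its actual implementation; the test matrix sizes and the rank-one boost below are arbitrary choices made so the top singular value converges in a few steps):

```python
import numpy as np

def lanczos_top_singular_values(A, k, seed=0):
    """Estimate extreme singular values of A from the k x k bidiagonal
    projection built by k steps of Lanczos (Golub-Kahan) bidiagonalization,
    with full reorthogonalization for stability."""
    m, n = A.shape
    U, V = np.zeros((m, k)), np.zeros((n, k))
    alpha, beta = np.zeros(k), np.zeros(k - 1)
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(n)
    V[:, 0] = v / np.linalg.norm(v)
    u = A @ V[:, 0]
    alpha[0] = np.linalg.norm(u)
    U[:, 0] = u / alpha[0]
    for j in range(k - 1):
        w = A.T @ U[:, j] - alpha[j] * V[:, j]
        w -= V @ (V.T @ w)
        beta[j] = np.linalg.norm(w)
        V[:, j + 1] = w / beta[j]
        u = A @ V[:, j + 1] - beta[j] * U[:, j]
        u -= U @ (U.T @ u)
        alpha[j + 1] = np.linalg.norm(u)
        U[:, j + 1] = u / alpha[j + 1]
    Bk = np.diag(alpha) + np.diag(beta, 1)     # small bidiagonal projection
    return np.linalg.svd(Bk, compute_uv=False)

# Gaussian noise plus a strong rank-one component: the largest singular
# value is well separated, so a few Lanczos steps suffice to capture it.
rng = np.random.default_rng(42)
A = rng.standard_normal((100, 30)) + 50.0 * np.outer(np.ones(100) / 10.0,
                                                     np.ones(30) / np.sqrt(30))
est = lanczos_top_singular_values(A, 10)
true = np.linalg.svd(A, compute_uv=False)
assert abs(est[0] - true[0]) < 1e-6
```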
84.15 Relation to Orthogonal Polynomials
Bidiagonal matrices also appear in recurrence relations.
For example, certain orthogonal polynomial constructions lead naturally to bidiagonal operators after suitable transformations.
In numerical quadrature and approximation theory, bidiagonal and tridiagonal forms often appear together because recurrence relations couple neighboring basis functions.
84.16 Positive Bidiagonal Matrices
A bidiagonal matrix with positive diagonal entries is invertible. Its determinant is simply the product of the diagonal entries:
$$ \det B = b_{11} b_{22} \cdots b_{nn}. $$
The inverse of a bidiagonal matrix is generally dense, but structured formulas exist. In many applications, one avoids forming the inverse explicitly and instead solves systems by substitution.
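A quick numerical confirmation of the determinant formula (entries arbitrary):

```python
import numpy as np

# A bidiagonal matrix is triangular, so its determinant is the product
# of its diagonal entries regardless of the off-diagonal.
B = np.diag([2.0, 3.0, 5.0]) + np.diag([7.0, 11.0], k=1)
assert np.isclose(np.linalg.det(B), 2.0 * 3.0 * 5.0)
```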
84.17 Comparison with Other Matrix Forms
| Form | Nonzero structure | Main use |
|---|---|---|
| Diagonal | Main diagonal only | Fully decoupled systems |
| Bidiagonal | Main diagonal plus one adjacent diagonal | Singular value algorithms |
| Tridiagonal | Main diagonal plus two adjacent diagonals | Symmetric eigenvalue problems |
| Hessenberg | Upper triangle plus first subdiagonal | General eigenvalue problems |
| Band matrix | Limited-width diagonal band | Structured sparse systems |
Bidiagonal form is the singular-value analogue of tridiagonal form.
84.18 Numerical Stability
Bidiagonal reduction using Householder transformations is numerically stable because orthogonal transformations preserve Euclidean norms.
The reduction stage does not significantly amplify rounding errors. Modern SVD algorithms are among the most reliable dense matrix algorithms in numerical linear algebra.
The singular values themselves are especially stable quantities. Small perturbations in the matrix cause only small perturbations in the singular values.
This stability is one reason singular value decomposition is so widely trusted in scientific computing.
84.19 Compact Representation
In practice, bidiagonal matrices are usually stored as two vectors:
| Vector | Meaning |
|---|---|
| $d = (d_1, \dots, d_n)$ | Diagonal entries |
| $e = (e_1, \dots, e_{n-1})$ | Off-diagonal entries |
This storage model is extremely compact and cache-friendly.
Many SVD algorithms operate directly on these vectors without ever forming a full matrix representation.
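A minimal sketch of operating on the two-vector representation directly, here a matrix-vector product for the upper bidiagonal case (the function name is illustrative, not a library API):

```python
import numpy as np

def bidiag_matvec(d, e, x):
    """y = B @ x where B is upper bidiagonal with diagonal d and
    superdiagonal e, stored as two vectors (no full matrix is formed)."""
    y = d * x
    y[:-1] += e * x[1:]
    return y

d = np.array([1.0, 2.0, 3.0])
e = np.array([4.0, 5.0])
x = np.array([1.0, 1.0, 1.0])

# Compare against the explicitly assembled matrix.
B = np.diag(d) + np.diag(e, k=1)
assert np.allclose(bidiag_matvec(d, e, x), B @ x)
```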
84.20 Summary
A bidiagonal matrix has nonzero entries only on the main diagonal and one adjacent diagonal.
Upper bidiagonal matrices satisfy
$$ b_{ij} = 0 \quad \text{unless } j = i \text{ or } j = i + 1. $$
Dense matrices are commonly reduced to bidiagonal form before singular value computation:
$$ A = U B V^{T}. $$
The bidiagonal structure dramatically reduces computational cost and storage requirements. Since products such as $B^{T} B$ and $B B^{T}$
are symmetric tridiagonal, bidiagonal matrices connect naturally to symmetric eigenvalue algorithms.
Bidiagonal form appears in dense SVD computation, Krylov subspace methods, least squares algorithms, and sparse iterative methods. It is one of the central structured forms in numerical linear algebra.