Chapter 84. Bidiagonal Form

A bidiagonal matrix is a matrix whose nonzero entries lie only on the main diagonal and one adjacent diagonal. Bidiagonal form is fundamental in singular value computation because dense matrices are typically reduced to bidiagonal form before applying iterative SVD algorithms.

A matrix is upper bidiagonal if its only possible nonzero entries are on the main diagonal and the first superdiagonal. It is lower bidiagonal if its only possible nonzero entries are on the main diagonal and the first subdiagonal.

An upper bidiagonal matrix has the form

B = \begin{bmatrix} d_1 & e_1 & 0 & \cdots & 0 \\ 0 & d_2 & e_2 & \cdots & 0 \\ 0 & 0 & d_3 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & e_{n-1} \\ 0 & 0 & 0 & \cdots & d_n \end{bmatrix}.

Dense singular value algorithms typically reduce a matrix to bidiagonal form using Householder transformations, then apply iterative diagonalization to the bidiagonal matrix.

84.1 Definition

A bidiagonal matrix has nonzero entries on only two diagonals.

For an upper bidiagonal matrix,

b_{ij}=0 \qquad \text{unless} \qquad i=j \quad \text{or} \quad j=i+1.

For a lower bidiagonal matrix,

b_{ij}=0 \qquad \text{unless} \qquad i=j \quad \text{or} \quad i=j+1.

Examples:

Upper bidiagonal:

\begin{bmatrix} 2 & 5 & 0 & 0 \\ 0 & 3 & 1 & 0 \\ 0 & 0 & 4 & 7 \\ 0 & 0 & 0 & 6 \end{bmatrix}.

Lower bidiagonal:

\begin{bmatrix} 2 & 0 & 0 & 0 \\ 5 & 3 & 0 & 0 \\ 0 & 1 & 4 & 0 \\ 0 & 0 & 7 & 6 \end{bmatrix}.

Both are sparse matrix forms with only 2n-1 potentially nonzero entries in the square case.

84.2 Relation to Tridiagonal Form

Bidiagonal matrices are analogous to tridiagonal matrices but even sparser.

| Form | Allowed diagonals |
| --- | --- |
| Diagonal | Main diagonal |
| Bidiagonal | Main diagonal plus one adjacent diagonal |
| Tridiagonal | Main diagonal plus both adjacent diagonals |

Tridiagonal matrices arise naturally in symmetric eigenvalue problems. Bidiagonal matrices arise naturally in singular value problems.

The connection is deeper than analogy. If B is bidiagonal, then

B^TB

and

BB^T

are tridiagonal. Since singular values are related to eigenvalues of these products, bidiagonal structure leads naturally to tridiagonal structure.
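This is easy to verify numerically. A minimal NumPy sketch (the entries are arbitrary, chosen only for illustration):

```python
import numpy as np

# An upper bidiagonal matrix built from its two defining vectors.
d = np.array([2.0, 3.0, 4.0, 6.0])   # main diagonal
e = np.array([5.0, 1.0, 7.0])        # first superdiagonal
B = np.diag(d) + np.diag(e, k=1)

def is_tridiagonal(T, tol=1e-12):
    # True when every entry more than one diagonal away from the main one vanishes.
    i, j = np.indices(T.shape)
    return bool(np.all(np.abs(T[np.abs(i - j) > 1]) < tol))

print(is_tridiagonal(B.T @ B))   # True
print(is_tridiagonal(B @ B.T))   # True
```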

84.3 Why Bidiagonal Form Matters

Singular value decomposition algorithms are expensive on general dense matrices.

A standard dense SVD pipeline therefore has two stages:

| Stage | Goal |
| --- | --- |
| Bidiagonal reduction | Reduce dense matrix to bidiagonal form |
| Iterative diagonalization | Compute singular values from bidiagonal matrix |

The reduction stage costs

O(mn^2)

for an m \times n matrix with m \ge n. After reduction, the iterative stage becomes much cheaper because bidiagonal structure is preserved during iteration.

This mirrors the role of Hessenberg and tridiagonal forms in eigenvalue computation.

84.4 Reduction by Householder Transformations

Bidiagonal reduction uses alternating left and right Householder transformations.

Suppose

A \in \mathbb{R}^{m \times n}, \qquad m \ge n.

At step k:

  1. A left Householder reflector zeros entries below the diagonal in column k.
  2. A right Householder reflector zeros entries to the right of the first superdiagonal in row k.

The process alternates until the matrix becomes upper bidiagonal.

The final decomposition has the form

B = U^TAV,

where U and V are orthogonal matrices and B is upper bidiagonal.

Equivalently,

A = UBV^T.

This already resembles the structure of the singular value decomposition. The bidiagonal matrix is an intermediate step between the original matrix and the diagonal singular value matrix.
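The alternating process can be sketched compactly in NumPy. This is an unoptimized illustration of the idea (library routines such as LAPACK's `gebrd` do the same work far more efficiently); `householder_vector` and `bidiagonalize` are names chosen here, not from any library:

```python
import numpy as np

def householder_vector(x):
    # Unit vector v such that (I - 2 v v^T) x is a multiple of e_1.
    v = x.astype(float).copy()
    v[0] += np.sign(v[0] if v[0] != 0 else 1.0) * np.linalg.norm(x)
    norm = np.linalg.norm(v)
    return v / norm if norm > 0 else v

def bidiagonalize(A):
    # Reduce A (m >= n) to upper bidiagonal B with A = U B V^T, B = U^T A V.
    A = A.astype(float).copy()
    m, n = A.shape
    U, V = np.eye(m), np.eye(n)
    for k in range(n):
        # Left reflector: zero entries below the diagonal in column k.
        v = householder_vector(A[k:, k])
        A[k:, :] -= 2.0 * np.outer(v, v @ A[k:, :])
        U[:, k:] -= 2.0 * np.outer(U[:, k:] @ v, v)
        # Right reflector: zero entries right of the superdiagonal in row k.
        if k < n - 2:
            w = householder_vector(A[k, k+1:])
            A[:, k+1:] -= 2.0 * np.outer(A[:, k+1:] @ w, w)
            V[:, k+1:] -= 2.0 * np.outer(V[:, k+1:] @ w, w)
    return U, A, V

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 4))
U, B, V = bidiagonalize(A)
# B is upper bidiagonal, U and V are orthogonal, and U @ B @ V.T reconstructs A.
```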

84.5 Householder Reflections

A Householder reflector has the form

P = I - 2vv^T,

where

\|v\|_2 = 1.

It is orthogonal and symmetric:

P^TP = I, \qquad P^T = P.

Left multiplication by P changes rows. Right multiplication changes columns.

In bidiagonal reduction:

  • left reflectors introduce zeros in columns,
  • right reflectors introduce zeros in rows.

Orthogonal transformations preserve singular values and maintain good numerical stability.
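A quick numerical check of these properties (the vector x is arbitrary, chosen for illustration):

```python
import numpy as np

# Build a reflector that maps x onto a multiple of e_1.
x = np.array([3.0, 4.0, 0.0])
v = x.copy()
v[0] += np.sign(x[0]) * np.linalg.norm(x)   # sign choice avoids cancellation
v /= np.linalg.norm(v)
P = np.eye(3) - 2.0 * np.outer(v, v)

print(P @ x)                  # ≈ [-5, 0, 0]: entries below the first are zeroed
print(np.allclose(P.T @ P, np.eye(3)))   # True: orthogonal
print(np.allclose(P, P.T))               # True: symmetric
```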

84.6 A Rectangular Example Pattern

Suppose

A \in \mathbb{R}^{5 \times 4}.

A dense matrix begins as

\begin{bmatrix} * & * & * & * \\ * & * & * & * \\ * & * & * & * \\ * & * & * & * \\ * & * & * & * \end{bmatrix}.

After bidiagonal reduction, it becomes

B = \begin{bmatrix} * & * & 0 & 0 \\ 0 & * & * & 0 \\ 0 & 0 & * & * \\ 0 & 0 & 0 & * \\ 0 & 0 & 0 & 0 \end{bmatrix}.

Only the main diagonal and the first superdiagonal remain potentially nonzero.

84.7 Golub-Kahan Bidiagonalization

Golub-Kahan bidiagonalization is one of the central procedures in numerical linear algebra.

Starting with a unit vector v_1, the method constructs orthonormal sequences

u_1, u_2, \ldots

and

v_1, v_2, \ldots

satisfying

Av_k = \alpha_k u_k + \beta_{k-1} u_{k-1}, \qquad A^Tu_k = \alpha_k v_k + \beta_k v_{k+1},

with the convention \beta_0 u_0 = 0.

The resulting projected matrix is bidiagonal:

B_k = \begin{bmatrix} \alpha_1 & \beta_1 & 0 & \cdots \\ 0 & \alpha_2 & \beta_2 & \cdots \\ 0 & 0 & \alpha_3 & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{bmatrix}.

This process is closely related to Lanczos iteration and forms the basis of many iterative SVD methods.
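The recurrences can be sketched directly. The following NumPy illustration uses the upper bidiagonal convention Av_k = \alpha_k u_k + \beta_{k-1} u_{k-1}, A^Tu_k = \alpha_k v_k + \beta_k v_{k+1}, and adds full reorthogonalization for robustness, a simplification relative to the selective schemes used in production codes; `golub_kahan` is a name chosen here:

```python
import numpy as np

def golub_kahan(A, v1, k):
    # k steps of Golub-Kahan bidiagonalization from the unit vector v1.
    # Returns U (m x k), V (n x k) with orthonormal columns and the
    # k x k upper bidiagonal projection B = U^T A V.
    m, n = A.shape
    U = np.zeros((m, k)); V = np.zeros((n, k))
    alpha = np.zeros(k); beta = np.zeros(max(k - 1, 0))
    V[:, 0] = v1
    for j in range(k):
        u = A @ V[:, j]
        if j > 0:
            u -= beta[j - 1] * U[:, j - 1]
        u -= U[:, :j] @ (U[:, :j].T @ u)            # full reorthogonalization
        alpha[j] = np.linalg.norm(u)
        U[:, j] = u / alpha[j]
        if j < k - 1:
            v = A.T @ U[:, j] - alpha[j] * V[:, j]
            v -= V[:, :j + 1] @ (V[:, :j + 1].T @ v)
            beta[j] = np.linalg.norm(v)
            V[:, j + 1] = v / beta[j]
    return U, V, np.diag(alpha) + np.diag(beta, k=1)

# With k = n full steps, B is orthogonally equivalent to A: same singular values.
rng = np.random.default_rng(1)
A = rng.standard_normal((6, 4))
e1 = np.zeros(4); e1[0] = 1.0
U, V, B = golub_kahan(A, e1, 4)
```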

84.8 Singular Values of a Bidiagonal Matrix

If B is bidiagonal, then its singular values are the square roots of the eigenvalues of

B^TB.

Because B is bidiagonal, the matrix

B^TB

is symmetric tridiagonal.

For example, if

B = \begin{bmatrix} d_1 & e_1 & 0 \\ 0 & d_2 & e_2 \\ 0 & 0 & d_3 \end{bmatrix},

then

B^TB = \begin{bmatrix} d_1^2 & d_1e_1 & 0 \\ d_1e_1 & d_2^2+e_1^2 & d_2e_2 \\ 0 & d_2e_2 & d_3^2+e_2^2 \end{bmatrix}.

This matrix is symmetric tridiagonal.

Thus singular value computation for bidiagonal matrices is closely tied to symmetric tridiagonal eigenvalue computation.
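This connection gives an immediate (if numerically naive, since forming B^TB squares the condition number) way to compute singular values; a small NumPy sketch with arbitrary entries:

```python
import numpy as np

d = np.array([2.0, 3.0, 4.0])    # diagonal of B
e = np.array([5.0, 1.0])         # superdiagonal of B
B = np.diag(d) + np.diag(e, k=1)

T = B.T @ B                                # symmetric tridiagonal
sigma = np.sqrt(np.linalg.eigvalsh(T))     # eigvalsh returns ascending eigenvalues

reference = np.sort(np.linalg.svd(B, compute_uv=False))
print(np.allclose(sigma, reference))   # True
```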

84.9 Storage Efficiency

A dense m \times n matrix requires

mn

storage locations.

An upper bidiagonal matrix requires only

2n-1

entries when square, or approximately

2\min(m,n)

essential entries in the rectangular case.

This dramatic reduction in storage improves cache efficiency and lowers computational cost in iterative algorithms.

84.10 Bidiagonal QR Iteration

After bidiagonal reduction, singular values are computed using iterative methods that preserve bidiagonal structure.

One important method is implicit zero-shift QR iteration on the associated tridiagonal matrices

B^TB

or

BB^T.

Modern SVD algorithms often use variants specifically adapted to bidiagonal structure, such as the Golub-Reinsch algorithm or divide-and-conquer methods.

The iterative process gradually drives the superdiagonal entries toward zero. When convergence occurs, the diagonal entries approximate the singular values.
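For illustration only, here is unshifted QR iteration applied to the tridiagonal T = B^TB; practical codes instead use implicit shifted or zero-shift variants that act on B directly, but the same decay of off-diagonal entries is visible:

```python
import numpy as np

B = np.array([[3.0, 1.0],
              [0.0, 2.0]])
T = B.T @ B          # [[9, 3], [3, 5]], symmetric tridiagonal

# T <- RQ is an orthogonal similarity; off-diagonal entries decay toward zero.
for _ in range(60):
    Q, R = np.linalg.qr(T)
    T = R @ Q

# The diagonal now approximates the eigenvalues 7 ± sqrt(13),
# whose square roots are the singular values of B.
```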

84.11 Example

Consider the upper bidiagonal matrix

B = \begin{bmatrix} 3 & 1 \\ 0 & 2 \end{bmatrix}.

Compute

B^TB = \begin{bmatrix} 3 & 0 \\ 1 & 2 \end{bmatrix} \begin{bmatrix} 3 & 1 \\ 0 & 2 \end{bmatrix} = \begin{bmatrix} 9 & 3 \\ 3 & 5 \end{bmatrix}.

The characteristic polynomial is

\det \begin{bmatrix} 9-\lambda & 3 \\ 3 & 5-\lambda \end{bmatrix} = (9-\lambda)(5-\lambda)-9.

Expanding,

\lambda^2 - 14\lambda + 36.

Thus the eigenvalues are

7+\sqrt{13}, \qquad 7-\sqrt{13}.

Therefore the singular values of BB are

\sigma_1 = \sqrt{7+\sqrt{13}}, \qquad \sigma_2 = \sqrt{7-\sqrt{13}}.
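This hand computation agrees with a library SVD:

```python
import numpy as np

B = np.array([[3.0, 1.0],
              [0.0, 2.0]])
sigma = np.linalg.svd(B, compute_uv=False)   # returned in descending order

expected = np.sqrt(np.array([7 + np.sqrt(13), 7 - np.sqrt(13)]))
print(np.allclose(sigma, expected))   # True
```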

84.12 Bidiagonal Matrices and Linear Systems

Bidiagonal systems are extremely easy to solve.

For an upper bidiagonal matrix,

Bx = b

can be solved by backward substitution.

For a lower bidiagonal matrix,

Bx = b

can be solved by forward substitution.

Since each equation involves at most two unknowns, the total computational cost is linear in the dimension.

84.13 Algorithm for Upper Bidiagonal Solve

Suppose

B = \begin{bmatrix} d_1 & e_1 & 0 & \cdots \\ 0 & d_2 & e_2 & \cdots \\ 0 & 0 & d_3 & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{bmatrix}.

The system

Bx=b Bx=b

can be solved as follows.

def upper_bidiagonal_solve(d, e, b):
    # d: diagonal entries (length n); e: superdiagonal entries (length n-1).
    n = len(d)
    x = [0.0] * n

    # The last equation involves only the last unknown.
    x[n-1] = b[n-1] / d[n-1]

    # Each remaining equation couples x[i] to the already-known x[i+1].
    for i in range(n - 2, -1, -1):
        x[i] = (b[i] - e[i] * x[i+1]) / d[i]

    return x

Only neighboring unknowns interact.
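The backward substitution can be checked against a dense solve; a small self-contained NumPy sketch (entries arbitrary):

```python
import numpy as np

d = np.array([2.0, 3.0, 4.0, 6.0])   # diagonal
e = np.array([5.0, 1.0, 7.0])        # superdiagonal
b = np.array([1.0, 2.0, 3.0, 4.0])   # right-hand side

# Backward substitution: the last equation fixes x[-1]; each earlier
# equation couples x[i] only to the already-known x[i+1].
n = len(d)
x = np.zeros(n)
x[-1] = b[-1] / d[-1]
for i in range(n - 2, -1, -1):
    x[i] = (b[i] - e[i] * x[i + 1]) / d[i]

B = np.diag(d) + np.diag(e, k=1)
print(np.allclose(B @ x, b))   # True
```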

84.14 Bidiagonalization and Krylov Methods

Krylov subspace methods for singular values naturally produce bidiagonal matrices.

Lanczos bidiagonalization constructs orthonormal bases for the Krylov subspaces

\mathcal{K}_k(A^TA, v_1)

and

\mathcal{K}_k(AA^T, u_1).

The projected matrix becomes bidiagonal.

This is the basis of iterative methods such as:

| Method | Purpose |
| --- | --- |
| LSQR | Least squares problems |
| LSMR | Regularized least squares |
| PROPACK | Partial SVD computation |
| Lanczos bidiagonalization | Sparse singular value estimation |

The bidiagonal structure captures the short recurrence relations underlying these algorithms.
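As one concrete instance, SciPy's `lsqr` implements the LSQR method, which is built on Golub-Kahan bidiagonalization and never forms A^TA explicitly; a sketch, assuming SciPy is available:

```python
import numpy as np
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)

# LSQR builds a bidiagonal projection of A step by step and solves
# min ||Ax - b||_2 from it; tight tolerances for a clean comparison.
x = lsqr(A, b, atol=1e-12, btol=1e-12)[0]

x_ref = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.allclose(x, x_ref, atol=1e-6))   # True
```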

84.15 Relation to Orthogonal Polynomials

Bidiagonal matrices also appear in recurrence relations.

For example, certain orthogonal polynomial constructions lead naturally to bidiagonal operators after suitable transformations.

In numerical quadrature and approximation theory, bidiagonal and tridiagonal forms often appear together because recurrence relations couple neighboring basis functions.

84.16 Positive Bidiagonal Matrices

A bidiagonal matrix is invertible exactly when all of its diagonal entries are nonzero; in particular, positive diagonal entries guarantee invertibility. Its determinant is simply the product of the diagonal entries:

\det(B)=\prod_{i=1}^{n} d_i.

The inverse of a bidiagonal matrix is generally dense, but structured formulas exist. In many applications, one avoids forming the inverse explicitly and instead solves systems by substitution.
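A quick numerical confirmation of the determinant formula (entries arbitrary):

```python
import numpy as np

d = np.array([2.0, 3.0, 4.0, 6.0])
e = np.array([5.0, 1.0, 7.0])
B = np.diag(d) + np.diag(e, k=1)

# B is triangular, so the determinant is the product of the diagonal entries.
print(np.isclose(np.linalg.det(B), d.prod()))   # True: both equal 144
```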

84.17 Comparison with Other Matrix Forms

| Form | Nonzero structure | Main use |
| --- | --- | --- |
| Diagonal | Main diagonal only | Fully decoupled systems |
| Bidiagonal | Main diagonal plus one adjacent diagonal | Singular value algorithms |
| Tridiagonal | Main diagonal plus two adjacent diagonals | Symmetric eigenvalue problems |
| Hessenberg | Triangular part plus one adjacent diagonal | General eigenvalue problems |
| Band matrix | Limited-width diagonal band | Structured sparse systems |

Bidiagonal form is the singular-value analogue of tridiagonal form.

84.18 Numerical Stability

Bidiagonal reduction using Householder transformations is numerically stable because orthogonal transformations preserve Euclidean norms.

The reduction stage does not significantly amplify rounding errors. Modern SVD algorithms are among the most reliable dense matrix algorithms in numerical linear algebra.

The singular values themselves are especially stable quantities: by Weyl's inequality, a perturbation E changes each singular value by at most \|E\|_2.

This stability is one reason singular value decomposition is so widely trusted in scientific computing.

84.19 Compact Representation

In practice, bidiagonal matrices are usually stored as two vectors:

| Vector | Meaning |
| --- | --- |
| d_i | Diagonal entries |
| e_i | Off-diagonal entries |

This storage model is extremely compact and cache-friendly.

Many SVD algorithms operate directly on these vectors without ever forming a full matrix representation.
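For example, a matrix-vector product needs only the two vectors; `bidiag_matvec` is a name chosen here for illustration:

```python
import numpy as np

def bidiag_matvec(d, e, x):
    # y = Bx for upper bidiagonal B stored as diagonal d and superdiagonal e.
    y = d * x
    y[:-1] += e * x[1:]
    return y

d = np.array([2.0, 3.0, 4.0, 6.0])
e = np.array([5.0, 1.0, 7.0])
x = np.array([1.0, 2.0, 3.0, 4.0])

B = np.diag(d) + np.diag(e, k=1)    # dense version, for comparison only
print(np.allclose(bidiag_matvec(d, e, x), B @ x))   # True
```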

84.20 Summary

A bidiagonal matrix has nonzero entries only on the main diagonal and one adjacent diagonal.

Upper bidiagonal matrices satisfy

b_{ij}=0 \qquad \text{unless} \qquad i=j \quad \text{or} \quad j=i+1.

Dense matrices are commonly reduced to bidiagonal form before singular value computation:

B = U^TAV.

The bidiagonal structure dramatically reduces computational cost and storage requirements. Since products such as

B^TB

are symmetric tridiagonal, bidiagonal matrices connect naturally to symmetric eigenvalue algorithms.

Bidiagonal form appears in dense SVD computation, Krylov subspace methods, least squares algorithms, and sparse iterative methods. It is one of the central structured forms in numerical linear algebra.