QR factorization writes a matrix $A$ as a product
$$A = QR,$$
where $Q$ has orthonormal columns and $R$ is upper triangular. It is the matrix form of Gram-Schmidt orthogonalization. The columns of $Q$ give an orthonormal basis for the column space of $A$, while $R$ records the coefficients needed to reconstruct the original columns. Standard references define QR decomposition as a factorization of a matrix into an orthogonal or orthonormal factor and an upper triangular factor.
QR factorization is important because it replaces a possibly awkward basis by an orthonormal basis. This improves geometric clarity and numerical stability. It is used for least squares, eigenvalue algorithms, rank estimation, and many stable matrix computations.
53.1 The Factorization
Let
$$A \in \mathbb{R}^{m \times n}, \qquad m \ge n.$$
Assume the columns of $A$ are linearly independent. A reduced QR factorization of $A$ is
$$A = QR,$$
where
$$Q \in \mathbb{R}^{m \times n}, \qquad R \in \mathbb{R}^{n \times n}.$$
The columns of $Q$ are orthonormal:
$$Q^T Q = I_n.$$
The matrix $R$ is upper triangular:
$$r_{ij} = 0 \quad \text{for } i > j.$$
When the diagonal entries of $R$ are chosen positive, the reduced QR factorization is unique for a full-column-rank matrix.
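A quick numerical check of these properties, assuming NumPy (whose `np.linalg.qr` returns the reduced factorization by default):

```python
import numpy as np

# A 4x3 matrix with linearly independent columns.
A = np.array([[1., 0., 1.],
              [1., 1., 0.],
              [0., 1., 1.],
              [1., 1., 1.]])

# Reduced QR: Q is 4x3 with orthonormal columns, R is 3x3 upper triangular.
Q, R = np.linalg.qr(A)

print(Q.shape, R.shape)                 # (4, 3) (3, 3)
print(np.allclose(Q.T @ Q, np.eye(3)))  # orthonormal columns
print(np.allclose(Q @ R, A))            # reconstructs A
print(np.allclose(R, np.triu(R)))       # upper triangular
```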
53.2 Full and Reduced QR
There are two common forms.
The reduced QR factorization is
$$A = QR,$$
where
$$Q \in \mathbb{R}^{m \times n}, \qquad R \in \mathbb{R}^{n \times n}.$$
The columns of $Q$ form an orthonormal basis for $\operatorname{col}(A)$.
The full QR factorization is
$$A = \tilde{Q}\tilde{R},$$
where
$$\tilde{Q} \in \mathbb{R}^{m \times m}$$
is orthogonal and
$$\tilde{R} \in \mathbb{R}^{m \times n}$$
has the block form
$$\tilde{R} = \begin{bmatrix} R \\ 0 \end{bmatrix}.$$
Thus
$$A = \tilde{Q}\tilde{R} = \begin{bmatrix} Q & Q_\perp \end{bmatrix} \begin{bmatrix} R \\ 0 \end{bmatrix} = QR,$$
where the columns of $Q_\perp$ extend those of $Q$ to an orthonormal basis of $\mathbb{R}^m$.
The reduced factorization is usually enough for least squares and column-space computations. The full factorization is useful when a complete orthonormal basis of $\mathbb{R}^m$ is needed.
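Both forms are available through the `mode` argument of NumPy's `np.linalg.qr`:

```python
import numpy as np

A = np.random.default_rng(0).standard_normal((5, 3))

# Reduced: Q_hat is 5x3, R_hat is 3x3.
Q_hat, R_hat = np.linalg.qr(A, mode="reduced")

# Full: Q is 5x5 orthogonal, R is 5x3 with a zero block below the top 3x3.
Q, R = np.linalg.qr(A, mode="complete")

print(Q_hat.shape, Q.shape)             # (5, 3) (5, 5)
print(np.allclose(Q @ Q.T, np.eye(5)))  # full Q is orthogonal
print(np.allclose(R[3:], 0.0))          # trailing block of zeros
print(np.allclose(Q @ R, A))            # both forms reconstruct A
```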
53.3 Column Interpretation
Write the columns of $A$ as
$$a_1, a_2, \ldots, a_n.$$
Write the columns of $Q$ as
$$q_1, q_2, \ldots, q_n.$$
The equation
$$A = QR$$
means that each column $a_j$ is a linear combination of the first $j$ columns of $Q$:
$$a_j = r_{1j} q_1 + r_{2j} q_2 + \cdots + r_{jj} q_j.$$
There are no terms involving $q_{j+1}, \ldots, q_n$ because $R$ is upper triangular.
Thus $R$ records how the original columns of $A$ are expressed in the orthonormal basis $q_1, \ldots, q_n$.
53.4 Relation to Gram-Schmidt
QR factorization is obtained by applying Gram-Schmidt to the columns of $A$.
Let
$$a_1, a_2, \ldots, a_n$$
be the columns of $A$. Gram-Schmidt produces orthonormal vectors
$$q_1, q_2, \ldots, q_n.$$
For each $j$, the vector $a_j$ lies in
$$\operatorname{span}\{q_1, \ldots, q_j\}.$$
Therefore
$$a_j = \sum_{i=1}^{j} r_{ij} q_i.$$
The coefficient
$$r_{ij}$$
is given by
$$r_{ij} = q_i^T a_j$$
for $i < j$. The diagonal coefficient is
$$r_{jj} = \|v_j\|,$$
where
$$v_j = a_j - \sum_{i=1}^{j-1} (q_i^T a_j)\, q_i$$
is the orthogonal residual produced by Gram-Schmidt at step $j$.
Putting these column equations together gives
$$A = QR.$$
53.5 Computing $R$ from $Q$
If
$$A = QR$$
and
$$Q^T Q = I,$$
then multiplying on the left by $Q^T$ gives
$$Q^T A = Q^T Q R.$$
Since
$$Q^T Q = I,$$
we obtain
$$R = Q^T A.$$
Thus, once $Q$ is known, $R$ is obtained by taking inner products between the columns of $Q$ and the columns of $A$.
Entrywise,
$$r_{ij} = q_i^T a_j.$$
For a proper QR factorization, these entries vanish below the diagonal:
$$q_i^T a_j = 0 \quad \text{for } i > j.$$
This expresses the same orthogonality created by Gram-Schmidt.
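The identity $R = Q^T A$ is easy to confirm numerically with NumPy:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 4))
Q, R = np.linalg.qr(A)

# With Q in hand, R is recovered as Q^T A ...
R_from_Q = Q.T @ A
print(np.allclose(R_from_Q, R))

# ... and its strictly lower triangular part vanishes: q_i^T a_j = 0 for i > j.
print(np.allclose(np.tril(R_from_Q, k=-1), 0.0))
```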
53.6 Example
Let
$$A = \begin{bmatrix} 3 & 1 \\ 4 & 3 \end{bmatrix}.$$
The columns are
$$a_1 = \begin{bmatrix} 3 \\ 4 \end{bmatrix}, \qquad a_2 = \begin{bmatrix} 1 \\ 3 \end{bmatrix}.$$
First,
$$r_{11} = \|a_1\| = 5,$$
so
$$q_1 = \frac{a_1}{\|a_1\|} = \frac{1}{5}\begin{bmatrix} 3 \\ 4 \end{bmatrix}.$$
Next,
$$r_{12} = q_1^T a_2 = \frac{3 \cdot 1 + 4 \cdot 3}{5} = 3.$$
Remove the component of $a_2$ in the $q_1$ direction:
$$v_2 = a_2 - r_{12} q_1 = \begin{bmatrix} 1 \\ 3 \end{bmatrix} - \frac{3}{5}\begin{bmatrix} 3 \\ 4 \end{bmatrix} = \begin{bmatrix} -4/5 \\ 3/5 \end{bmatrix}.$$
Thus
$$r_{22} = \|v_2\| = 1.$$
Then
$$q_2 = \frac{v_2}{\|v_2\|} = \begin{bmatrix} -4/5 \\ 3/5 \end{bmatrix}$$
and the factors are assembled columnwise.
Therefore
$$Q = \frac{1}{5}\begin{bmatrix} 3 & -4 \\ 4 & 3 \end{bmatrix}$$
and
$$R = \begin{bmatrix} 5 & 3 \\ 0 & 1 \end{bmatrix}.$$
Indeed,
$$QR = \frac{1}{5}\begin{bmatrix} 3 & -4 \\ 4 & 3 \end{bmatrix}\begin{bmatrix} 5 & 3 \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} 3 & 1 \\ 4 & 3 \end{bmatrix} = A.$$
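A numerical check of a small Gram-Schmidt factorization of this kind, using the matrix $A$ with columns $(3,4)^T$ and $(1,3)^T$:

```python
import numpy as np

A = np.array([[3., 1.],
              [4., 3.]])
# Factors obtained by Gram-Schmidt on the columns of A.
Q = np.array([[3., -4.],
              [4.,  3.]]) / 5.0
R = np.array([[5., 3.],
              [0., 1.]])

print(np.allclose(Q @ R, A))            # A = QR
print(np.allclose(Q.T @ Q, np.eye(2)))  # orthonormal columns
```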
53.7 QR and Least Squares
QR factorization gives a stable way to solve least squares problems.
Consider the least squares problem
$$\min_x \|Ax - b\|_2, \qquad A \in \mathbb{R}^{m \times n}, \quad m \ge n, \quad b \in \mathbb{R}^m.$$
Assume
$$A = QR,$$
where $Q$ has orthonormal columns and $R$ is invertible upper triangular.
Then
$$A^T A = R^T Q^T Q R = R^T R \qquad \text{and} \qquad A^T b = R^T Q^T b.$$
The least squares solution satisfies the normal equations $A^T A x = A^T b$; cancelling the invertible factor $R^T$ from both sides leaves
$$R x = Q^T b.$$
Thus
$$x = R^{-1} Q^T b.$$
In computation, one solves
$$R x = Q^T b$$
by back substitution. One does not form $R^{-1}$ explicitly.
This method avoids explicitly forming and solving the normal equations
$$A^T A x = A^T b.$$
That matters because forming $A^T A$ squares the condition number in the Euclidean norm,
$$\kappa_2(A^T A) = \kappa_2(A)^2,$$
while QR methods avoid this condition-number-squaring effect.
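The QR-based solve can be sketched as follows; `lstsq_qr` is an illustrative helper written here, not a library routine, and the back substitution is spelled out explicitly:

```python
import numpy as np

def lstsq_qr(A, b):
    """Least squares via reduced QR: solve R x = Q^T b by back substitution."""
    Q, R = np.linalg.qr(A)
    y = Q.T @ b
    n = R.shape[1]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        # Back substitution: peel off the already-computed components.
        x[i] = (y[i] - R[i, i + 1:] @ x[i + 1:]) / R[i, i]
    return x

rng = np.random.default_rng(2)
A = rng.standard_normal((8, 3))
b = rng.standard_normal(8)

x = lstsq_qr(A, b)
x_ref = np.linalg.lstsq(A, b, rcond=None)[0]   # reference solution
print(np.allclose(x, x_ref))
```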
53.8 Projection Using QR
If $A = QR$ and $A$ has full column rank, then
$$\operatorname{col}(A) = \operatorname{col}(Q).$$
The orthogonal projection of $b$ onto $\operatorname{col}(A)$ is therefore
$$Pb = Q Q^T b.$$
The residual is
$$b - Q Q^T b.$$
The least squares fitted vector is
$$\hat{b} = Q Q^T b.$$
Because $Q$ has orthonormal columns, this projection formula is simpler than
$$P = A (A^T A)^{-1} A^T.$$
QR factorization replaces the original column basis by an orthonormal basis, making projection direct.
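With NumPy, the projection and its residual can be checked directly; the formula $A(A^TA)^{-1}A^T b$ is formed below only for comparison:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((6, 2))
b = rng.standard_normal(6)

Q, _ = np.linalg.qr(A)
p = Q @ (Q.T @ b)   # projection of b onto col(A)
r = b - p           # residual

print(np.allclose(A.T @ r, 0.0))   # residual is orthogonal to col(A)

# Compare with the unfactored formula A (A^T A)^{-1} A^T b.
p_ref = A @ np.linalg.solve(A.T @ A, A.T @ b)
print(np.allclose(p, p_ref))
```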
53.9 Solving Square Linear Systems
If $A \in \mathbb{R}^{n \times n}$ is nonsingular and
$$A = QR,$$
then the system
$$Ax = b$$
becomes
$$QRx = b.$$
Multiply by $Q^T$:
$$Rx = Q^T b.$$
Since $R$ is upper triangular, $x$ is found by back substitution.
This gives an alternative to Gaussian elimination. For square systems, Householder QR costs roughly twice as many floating point operations as LU factorization, but it has better orthogonality properties and is useful when stability is important.
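A minimal sketch of this solve with NumPy; for brevity the triangular system is handed to `np.linalg.solve`, though a dedicated triangular solver would exploit the structure of $R$:

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((4, 4))
b = rng.standard_normal(4)

Q, R = np.linalg.qr(A)
# Solve R x = Q^T b (R is upper triangular).
x = np.linalg.solve(R, Q.T @ b)

print(np.allclose(A @ x, b))
```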
53.10 Classical Gram-Schmidt QR
Classical Gram-Schmidt computes $Q$ and $R$ by projecting each column of $A$ onto all previously computed $q_i$.
For $j = 1$, set
$$r_{11} = \|a_1\|, \qquad q_1 = \frac{a_1}{r_{11}}.$$
For $j > 1$ and $i = 1, \ldots, j-1$, compute
$$r_{ij} = q_i^T a_j.$$
Then subtract
$$v_j = a_j - \sum_{i=1}^{j-1} r_{ij} q_i.$$
Finally,
$$r_{jj} = \|v_j\|, \qquad q_j = \frac{v_j}{r_{jj}}.$$
This method is conceptually simple. In exact arithmetic it produces a valid QR factorization. In floating point arithmetic it can lose orthogonality when the columns of are nearly linearly dependent.
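The steps above can be sketched as a minimal NumPy implementation, assuming full column rank:

```python
import numpy as np

def classical_gram_schmidt(A):
    """Reduced QR by classical Gram-Schmidt (assumes full column rank)."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        R[:j, j] = Q[:, :j].T @ A[:, j]    # r_ij = q_i^T a_j for i < j
        v = A[:, j] - Q[:, :j] @ R[:j, j]  # subtract projections from a_j
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R

A = np.array([[1., 1., 0.],
              [1., 0., 1.],
              [0., 1., 1.],
              [1., 1., 1.]])
Q, R = classical_gram_schmidt(A)
print(np.allclose(Q @ R, A), np.allclose(Q.T @ Q, np.eye(3)))
```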
53.11 Modified Gram-Schmidt QR
Modified Gram-Schmidt improves numerical behavior by subtracting projections from the current residual rather than from the original column.
For each column $j$, begin with
$$v = a_j.$$
For $i = 1, \ldots, j-1$, compute
$$r_{ij} = q_i^T v,$$
then update
$$v \leftarrow v - r_{ij} q_i.$$
After all previous directions have been removed,
$$r_{jj} = \|v\|, \qquad q_j = \frac{v}{r_{jj}}.$$
Classical and modified Gram-Schmidt are equivalent in exact arithmetic. In finite precision arithmetic, modified Gram-Schmidt usually preserves orthogonality better.
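The update rule can be implemented directly. The test matrix below, with nearly dependent columns, is the kind of case where modified Gram-Schmidt retains far better orthogonality than the classical version:

```python
import numpy as np

def modified_gram_schmidt(A):
    """Reduced QR by modified Gram-Schmidt: project out of the current residual."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ v      # coefficient against the *updated* residual
            v -= R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R

# Nearly dependent columns expose the difference in orthogonality loss.
eps = 1e-8
A = np.array([[1.,  1.,  1.],
              [eps, 0.,  0.],
              [0.,  eps, 0.],
              [0.,  0.,  eps]])
Q, R = modified_gram_schmidt(A)
err = np.linalg.norm(Q.T @ Q - np.eye(3))
print(err)   # small for MGS; classical GS loses much more orthogonality here
```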
53.12 Householder QR
Householder QR uses orthogonal reflections to zero out subdiagonal entries of a matrix.
A Householder reflector has the form
$$H = I - 2\,\frac{v v^T}{v^T v},$$
where
$$v \ne 0.$$
It is orthogonal:
$$H^T H = I.$$
It is also symmetric:
$$H^T = H.$$
By choosing $v$ appropriately (for a target vector $x$, the standard choice is $v = x \pm \|x\| e_1$), a Householder reflector transforms a vector into a multiple of a coordinate vector. Applied successively to the columns of $A$, these reflectors produce an upper triangular matrix $R$. The product of the reflectors gives the orthogonal factor $Q$. Householder transformations are a standard stable method for computing QR factorization.
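A minimal NumPy sketch of Householder QR, not production code; the sign in $v = x \pm \|x\| e_1$ is chosen to avoid cancellation:

```python
import numpy as np

def householder_qr(A):
    """Full QR by Householder reflections: Q is m x m, R is m x n."""
    m, n = A.shape
    R = A.astype(float).copy()
    Q = np.eye(m)
    for k in range(min(m - 1, n)):
        x = R[k:, k]
        # v = x + sign(x_0) ||x|| e_1 avoids cancellation; then normalize.
        v = x.copy()
        v[0] += np.copysign(np.linalg.norm(x), x[0])
        v /= np.linalg.norm(v)
        # Apply H = I - 2 v v^T to the trailing block of R,
        # and accumulate Q <- Q H on the right.
        R[k:, k:] -= 2.0 * np.outer(v, v @ R[k:, k:])
        Q[:, k:] -= 2.0 * np.outer(Q[:, k:] @ v, v)
    return Q, R

rng = np.random.default_rng(5)
A = rng.standard_normal((5, 3))
Q, R = householder_qr(A)
print(np.allclose(Q @ R, A))
print(np.allclose(np.tril(R, -1), 0.0))
```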
53.13 Givens QR
Givens rotations compute QR factorization by zeroing individual subdiagonal entries.
A Givens rotation acts on two coordinates at a time. In the plane of those two coordinates, it has the form
$$G = \begin{bmatrix} c & s \\ -s & c \end{bmatrix},$$
where
$$c = \cos\theta, \qquad s = \sin\theta, \qquad c^2 + s^2 = 1.$$
By choosing $c$ and $s$ correctly, one can eliminate a chosen entry while preserving the Euclidean norm.
Givens rotations are especially useful for sparse matrices and for problems where entries are introduced or removed incrementally. They can zero selected entries without applying a dense transformation to the whole matrix. QR factorization can be computed by a sequence of Givens rotations.
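A minimal sketch of a single Givens rotation in NumPy; `givens` is an illustrative helper defined here:

```python
import numpy as np

def givens(a, b):
    """Return (c, s) so that [[c, s], [-s, c]] @ [a, b] = [r, 0]."""
    r = np.hypot(a, b)
    return (1.0, 0.0) if r == 0 else (a / r, b / r)

# Zero the second entry of a 2-vector while preserving its norm.
x = np.array([3.0, 4.0])
c, s = givens(x[0], x[1])
G = np.array([[c, s],
              [-s, c]])
y = G @ x
print(y)                                          # approximately [5, 0]
print(np.isclose(np.linalg.norm(y), np.linalg.norm(x)))
```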
53.14 Comparison of QR Methods
| Method | Main idea | Strength | Weakness |
|---|---|---|---|
| Classical Gram-Schmidt | Subtract projections from original columns | Simple theory | Can lose orthogonality |
| Modified Gram-Schmidt | Subtract projections from current residuals | Better numerical behavior | Still less stable than Householder |
| Householder reflections | Zero whole column tails by reflections | Stable dense QR | Less convenient for selective updates |
| Givens rotations | Zero one entry at a time | Good for sparse and incremental problems | More rotations for dense matrices |
All methods produce the same mathematical kind of factorization. Their differences matter in floating point arithmetic and large-scale computation.
53.15 Rank-Deficient Matrices
If the columns of $A$ are linearly dependent, then the reduced QR factorization with invertible $R$ does not exist.
Gram-Schmidt will produce a zero residual at some step:
$$v_j = 0.$$
This means the current column $a_j$ adds no new direction beyond the previous columns.
For rank-deficient matrices, one may use QR factorization with column pivoting:
$$AP = QR,$$
where $P$ is a permutation matrix. Pivoting reorders the columns so that the more independent columns appear earlier.
This form is useful for detecting numerical rank and constructing rank-revealing factorizations. QR methods can be used for numerical rank estimation at lower cost than a full singular value decomposition in some settings.
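A minimal illustration of rank detection, assuming SciPy is available (pivoted QR lives in `scipy.linalg.qr`, which returns the permutation as an index array):

```python
import numpy as np
from scipy.linalg import qr   # pivoted QR lives in SciPy

# A rank-2 matrix: the third column is the sum of the first two.
rng = np.random.default_rng(6)
B = rng.standard_normal((5, 2))
A = np.column_stack([B, B[:, 0] + B[:, 1]])

Q, R, piv = qr(A, pivoting=True)   # A[:, piv] = Q R
print(np.allclose(A[:, piv], Q @ R))

# The diagonal of R decays; a tiny |r_kk| signals numerical rank k.
print(np.abs(np.diag(R)))          # last entry near zero -> rank 2
```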
53.16 Positive Diagonal Convention
The signs of the columns of $Q$ are not fixed by orthonormality alone.
If
$$A = QR,$$
then changing the sign of a column $q_j$ and the sign of the corresponding row of $R$ gives another valid factorization.
To remove this ambiguity, one often requires
$$r_{jj} > 0$$
for all $j$. Under full column rank, this convention gives uniqueness for the reduced QR factorization.
53.17 Complex QR Factorization
For a complex matrix
$$A \in \mathbb{C}^{m \times n},$$
QR factorization has the form
$$A = QR,$$
where $Q$ has orthonormal columns in the complex inner product and $R$ is upper triangular.
The orthonormality condition becomes
$$Q^* Q = I,$$
where $Q^*$ is the conjugate transpose.
The coefficients are
$$r_{ij} = q_i^* a_j.$$
If $Q$ is square, it is unitary:
$$Q^* Q = Q Q^* = I.$$
Complex QR appears in spectral theory, signal processing, quantum mechanics, and numerical algorithms for complex matrices.
53.18 QR and Orthogonal Changes of Coordinates
The equation
$$A = QR$$
can be rewritten as
$$Q^T A = R.$$
Thus $Q^T$ applies an orthogonal change of coordinates to the columns of $A$, turning the matrix into upper triangular form.
Because $Q$ is orthogonal, this change of coordinates preserves lengths and angles:
$$\|Q^T x\| = \|x\|, \qquad (Q^T x)^T (Q^T y) = x^T y.$$
This explains the numerical value of QR. It transforms the problem while preserving Euclidean geometry.
53.19 QR Algorithm Preview
QR factorization is also the basis of the QR algorithm for eigenvalues.
Given a square matrix $A$, one sets $A_0 = A$ and forms the factorization
$$A_k = Q_k R_k,$$
then reverses the factors:
$$A_{k+1} = R_k Q_k.$$
Under suitable hypotheses and with shifts, this iteration reveals eigenvalue information. The QR algorithm is one of the central methods for computing eigenvalues of dense matrices. QR decomposition is widely used both for least squares and as the basis of the QR eigenvalue algorithm.
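A bare-bones sketch of the unshifted iteration on a small symmetric matrix (practical implementations first reduce to Hessenberg form and add shifts for fast convergence):

```python
import numpy as np

# Symmetric tridiagonal test matrix with distinct eigenvalues.
A = np.array([[4., 1., 0.],
              [1., 3., 1.],
              [0., 1., 2.]])

Ak = A.copy()
for _ in range(200):
    Q, R = np.linalg.qr(Ak)   # factor ...
    Ak = R @ Q                # ... then reverse the factors

# Each iterate R_k Q_k = Q_k^T A_k Q_k is similar to A, and here the
# iterates converge to a diagonal matrix holding the eigenvalues.
print(np.allclose(np.sort(np.diag(Ak)),
                  np.sort(np.linalg.eigvalsh(A))))
```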
53.20 Summary
QR factorization writes a matrix as
$$A = QR.$$
The matrix $Q$ has orthonormal columns:
$$Q^T Q = I.$$
The matrix $R$ is upper triangular.
The columns of $Q$ form an orthonormal basis for the column space of $A$. The matrix $R$ records the coordinates of the original columns of $A$ in that basis.
QR factorization is the matrix form of Gram-Schmidt. It gives stable methods for least squares, projections, and linear systems. It can be computed by Gram-Schmidt, Householder reflections, or Givens rotations. Its central advantage is that it uses orthogonal transformations, which preserve Euclidean geometry and avoid the conditioning problems caused by forming $A^T A$.