QR decomposition is a factorization of a matrix into an orthogonal factor and an upper triangular factor. It is one of the central decompositions in numerical linear algebra, especially for least squares problems, orthogonalization, eigenvalue algorithms, and stable solution methods.
For a real matrix $A$, a QR decomposition has the form

$$A = QR,$$

where $Q$ has orthonormal columns and $R$ is upper triangular. If $A$ is a square matrix and has full rank, then $Q$ is an orthogonal matrix and $R$ is an invertible upper triangular matrix. The factorization is commonly computed by Gram-Schmidt orthogonalization, Householder reflections, or Givens rotations.
78.1 Orthogonal Matrices
A real square matrix $Q$ is orthogonal if

$$Q^T Q = Q Q^T = I.$$

This condition means that the columns of $Q$ form an orthonormal set. Each column has length $1$, and distinct columns are perpendicular.

If

$$Q = \begin{bmatrix} q_1 & q_2 & \cdots & q_n \end{bmatrix},$$

then orthogonality means

$$q_i^T q_j = \begin{cases} 1 & \text{if } i = j, \\ 0 & \text{if } i \neq j. \end{cases}$$

An orthogonal matrix preserves lengths and angles. For any vector $x$,

$$\|Qx\| = \|x\|.$$

This follows from

$$\|Qx\|^2 = (Qx)^T (Qx) = x^T Q^T Q x = x^T x = \|x\|^2.$$
This property makes orthogonal transformations numerically stable. They do not magnify errors merely by changing coordinates.
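The length preservation is easy to check numerically. The sketch below uses plain Python and an illustrative $2 \times 2$ rotation matrix to verify $\|Qx\| = \|x\|$:

```python
import math

# An illustrative rotation matrix: orthogonal for any angle theta.
theta = 0.7
Q = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

def matvec(M, x):
    # Matrix-vector product with M stored as a list of rows.
    return [sum(M[i][k] * x[k] for k in range(len(x))) for i in range(len(M))]

def norm(x):
    return math.sqrt(sum(t * t for t in x))

x = [3.0, 4.0]
print(norm(matvec(Q, x)))  # equals norm(x) = 5 up to rounding
```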
78.2 Upper Triangular Matrices
An upper triangular matrix $R$ has zero entries below the diagonal, $r_{ij} = 0$ whenever $i > j$:

$$R = \begin{bmatrix} r_{11} & r_{12} & \cdots & r_{1n} \\ 0 & r_{22} & \cdots & r_{2n} \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & r_{nn} \end{bmatrix}.$$

Triangular systems are easy to solve by substitution. This is one reason QR decomposition is useful. It converts many problems involving $A$ into problems involving $R$, while $Q$ handles an orthogonal change of coordinates.
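Backward substitution for an upper triangular system takes only a few lines of Python (the matrix and right-hand side below are illustrative):

```python
def back_substitute(R, b):
    # Solve R x = b for upper triangular R, working from the last row up.
    n = len(b)
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(R[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / R[i][i]
    return x

# R x = b with R = [[2, 1], [0, 3]] and b = [4, 6] gives x = [1, 2].
print(back_substitute([[2.0, 1.0], [0.0, 3.0]], [4.0, 6.0]))
```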
78.3 Full QR Decomposition
Let $A$ be an $m \times n$ real matrix with $m \ge n$. A full QR decomposition writes

$$A = QR,$$

where $Q$ is an $m \times m$ orthogonal matrix and $R$ is an $m \times n$ upper trapezoidal matrix.

The matrix $R$ has the form

$$R = \begin{bmatrix} \hat{R} \\ 0 \end{bmatrix},$$

where $\hat{R}$ is an $n \times n$ upper triangular matrix.

Thus

$$A = Q \begin{bmatrix} \hat{R} \\ 0 \end{bmatrix}.$$
The full form is useful theoretically, but it stores more information than is often needed.
78.4 Reduced QR Decomposition
For many applications, the reduced QR decomposition is preferred.
If $A$ is an $m \times n$ matrix with $m \ge n$ and linearly independent columns, then

$$A = \hat{Q} \hat{R},$$

where $\hat{Q}$ is $m \times n$ with orthonormal columns and $\hat{R}$ is $n \times n$ upper triangular.

The matrix $\hat{Q}$ satisfies

$$\hat{Q}^T \hat{Q} = I_n.$$

However,

$$\hat{Q} \hat{Q}^T$$

is generally not the identity matrix. Instead, it is the orthogonal projection onto the column space of $A$.

The reduced decomposition stores only the orthonormal basis needed for the column space of $A$.
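The difference between the two products is easy to see on a small example. The tall matrix below is an illustrative choice with orthonormal columns:

```python
# A tall matrix with orthonormal columns (illustrative choice):
Q = [[1.0, 0.0],
     [0.0, 1.0],
     [0.0, 0.0]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

print(matmul(transpose(Q), Q))  # 2x2 identity
print(matmul(Q, transpose(Q)))  # 3x3 projection matrix, not the identity
```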
78.5 Geometric Meaning
The QR decomposition separates a matrix into two parts. The factor $Q$ gives an orthonormal coordinate system for the column space of $A$. The factor $R$ gives the coordinates of the columns of $A$ in that orthonormal system.

Let

$$A = \begin{bmatrix} a_1 & a_2 & \cdots & a_n \end{bmatrix}$$

and

$$Q = \begin{bmatrix} q_1 & q_2 & \cdots & q_n \end{bmatrix}.$$

The columns $q_1, \dots, q_n$ form an orthonormal basis for the same column space as $a_1, \dots, a_n$. Each original column can be written as a combination of the $q_i$:

$$a_j = r_{1j} q_1 + r_{2j} q_2 + \cdots + r_{jj} q_j.$$

Only the first $j$ vectors appear because $R$ is upper triangular.
78.6 QR from Gram-Schmidt
QR decomposition is closely related to Gram-Schmidt orthogonalization.
Let the columns of $A$ be

$$a_1, a_2, \dots, a_n.$$

The Gram-Schmidt process constructs orthonormal vectors

$$q_1, q_2, \dots, q_n$$

from these columns.

First,

$$r_{11} = \|a_1\|, \qquad q_1 = \frac{a_1}{r_{11}}.$$

For the second column, subtract its projection onto $q_1$:

$$r_{12} = q_1^T a_2, \qquad v_2 = a_2 - r_{12} q_1.$$

Then

$$r_{22} = \|v_2\|, \qquad q_2 = \frac{v_2}{r_{22}}.$$

In general, for column $j$,

$$r_{ij} = q_i^T a_j \quad (i < j), \qquad v_j = a_j - \sum_{i=1}^{j-1} r_{ij} q_i, \qquad r_{jj} = \|v_j\|, \qquad q_j = \frac{v_j}{r_{jj}}.$$

These formulas produce

$$A = QR.$$

The entries of $R$ are the projection coefficients and normalizing constants from the orthogonalization process.
78.7 A Two Column Example
Let

$$A = \begin{bmatrix} 3 & 1 \\ 4 & 3 \end{bmatrix}.$$

Its columns are

$$a_1 = \begin{bmatrix} 3 \\ 4 \end{bmatrix}, \qquad a_2 = \begin{bmatrix} 1 \\ 3 \end{bmatrix}.$$

First compute

$$r_{11} = \|a_1\| = \sqrt{9 + 16} = 5.$$

Thus

$$q_1 = \frac{a_1}{5} = \begin{bmatrix} 3/5 \\ 4/5 \end{bmatrix}.$$

Next,

$$r_{12} = q_1^T a_2 = \frac{3 \cdot 1 + 4 \cdot 3}{5} = 3.$$

Subtract the projection:

$$v_2 = a_2 - r_{12} q_1.$$

Since

$$r_{12} q_1 = \begin{bmatrix} 9/5 \\ 12/5 \end{bmatrix},$$

we get

$$v_2 = \begin{bmatrix} -4/5 \\ 3/5 \end{bmatrix}.$$

Now

$$r_{22} = \|v_2\| = 1.$$

Therefore

$$q_2 = v_2 = \begin{bmatrix} -4/5 \\ 3/5 \end{bmatrix}.$$

Thus the reduced QR decomposition is

$$A = QR,$$

where

$$Q = \begin{bmatrix} 3/5 & -4/5 \\ 4/5 & 3/5 \end{bmatrix}$$

and

$$R = \begin{bmatrix} 5 & 3 \\ 0 & 1 \end{bmatrix}.$$

The columns of $Q$ are orthonormal, and $R$ is upper triangular.
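A factorization like this can be checked by multiplying the factors back together. The sketch below verifies $QR = A$ for an illustrative $2 \times 2$ example with exact rational factors:

```python
# Illustrative reduced QR factors of A = [[3, 1], [4, 3]]:
Q = [[0.6, -0.8], [0.8, 0.6]]
R = [[5.0, 3.0], [0.0, 1.0]]

# Multiply Q by R and confirm the product reproduces A.
A = [[sum(Q[i][k] * R[k][j] for k in range(2)) for j in range(2)]
     for i in range(2)]
print(A)  # recovers [[3, 1], [4, 3]] up to rounding
```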
78.8 Solving Square Linear Systems
Suppose $A$ is a square invertible matrix and

$$A = QR.$$

To solve

$$Ax = b,$$

substitute the factorization:

$$QRx = b.$$

Multiply both sides by $Q^T$:

$$Rx = Q^T b.$$

Since $R$ is upper triangular, the system

$$Rx = Q^T b$$

can be solved by backward substitution.
This method is usually more expensive than LU for a general square dense system, but it has strong stability properties because $Q$ is orthogonal.
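The two-step solve above can be sketched directly. The factors and right-hand side below are illustrative (they factor $A = \begin{bmatrix} 3 & 1 \\ 4 & 3 \end{bmatrix}$):

```python
def solve_via_qr(Q, R, b):
    # Given A = Q R: form y = Q^T b, then back-substitute R x = y.
    n = len(R)
    y = [sum(Q[k][i] * b[k] for k in range(len(b))) for i in range(n)]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(R[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (y[i] - s) / R[i][i]
    return x

# Illustrative factors of A = [[3, 1], [4, 3]]:
Q = [[0.6, -0.8], [0.8, 0.6]]
R = [[5.0, 3.0], [0.0, 1.0]]
b = [5.0, 10.0]                 # A x = b has solution x = [1, 2]
print(solve_via_qr(Q, R, b))    # approximately [1.0, 2.0]
```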
78.9 Least Squares Problems
QR decomposition is one of the standard methods for solving least squares problems.
Suppose $A$ is an $m \times n$ matrix with $m > n$, and suppose the system

$$Ax = b$$

has more equations than unknowns. The least squares problem asks for the $x$ minimizing

$$\|Ax - b\|.$$

Let

$$A = \hat{Q} \hat{R}$$

be a reduced QR decomposition. Then

$$\|Ax - b\| = \|\hat{Q} \hat{R} x - b\|.$$

Since multiplication by an orthogonal matrix preserves length, applying $Q^T$ from a full QR decomposition shows that minimizing this norm is equivalent to minimizing $\|\hat{R} x - \hat{Q}^T b\|$; the remaining components of $Q^T b$ do not depend on $x$.

Thus the least squares solution satisfies the triangular system

$$\hat{R} x = \hat{Q}^T b.$$

This gives a stable method that avoids forming the normal equations

$$A^T A x = A^T b.$$

Avoiding $A^T A$ is important because forming it squares the condition number of $A$.
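With these formulas, a least squares solve is one QR factorization followed by one triangular solve. A self-contained Python sketch (modified Gram-Schmidt plus back substitution; the data fits a line to three illustrative points):

```python
import math

def mgs_qr(A):
    # Reduced QR by modified Gram-Schmidt (illustrative helper).
    m, n = len(A), len(A[0])
    Q = [[0.0] * n for _ in range(m)]
    R = [[0.0] * n for _ in range(n)]
    for j in range(n):
        v = [A[k][j] for k in range(m)]
        for i in range(j):
            R[i][j] = sum(Q[k][i] * v[k] for k in range(m))
            v = [v[k] - R[i][j] * Q[k][i] for k in range(m)]
        R[j][j] = math.sqrt(sum(t * t for t in v))
        for k in range(m):
            Q[k][j] = v[k] / R[j][j]
    return Q, R

def lstsq_qr(A, b):
    # Least squares via R x = Q^T b and back substitution.
    Q, R = mgs_qr(A)
    n = len(R)
    y = [sum(Q[k][i] * b[k] for k in range(len(b))) for i in range(n)]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(R[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (y[i] - s) / R[i][i]
    return x

# Fit a line c0 + c1 * t to the points (0, 0), (1, 1), (2, 1):
A = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]]
b = [0.0, 1.0, 1.0]
print(lstsq_qr(A, b))  # approximately [1/6, 1/2]
```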
78.10 Projection Interpretation
The least squares problem seeks the vector in the column space of $A$ closest to $b$.

If

$$A = \hat{Q} \hat{R}$$

is the reduced QR decomposition, then the columns of $\hat{Q}$ form an orthonormal basis for the column space of $A$.

The orthogonal projection of $b$ onto this column space is

$$\hat{Q} \hat{Q}^T b.$$

The least squares fitted vector is therefore

$$Ax = \hat{Q} \hat{Q}^T b.$$

Since

$$A = \hat{Q} \hat{R},$$

we obtain

$$\hat{Q} \hat{R} x = \hat{Q} \hat{Q}^T b.$$

Multiplying by $\hat{Q}^T$ gives

$$\hat{R} x = \hat{Q}^T b.$$

This is the same triangular system obtained by the norm-preservation argument.
78.11 Classical and Modified Gram-Schmidt
There are two common Gram-Schmidt algorithms.
Classical Gram-Schmidt computes all projection coefficients against the original vector and then subtracts the combined projection.
Modified Gram-Schmidt subtracts projections one at a time, updating the vector after each subtraction.
In exact arithmetic, both methods produce the same result. In floating-point arithmetic, modified Gram-Schmidt is usually more stable because it reduces the loss of orthogonality caused by rounding error.
| Method | Main idea | Numerical behavior |
|---|---|---|
| Classical Gram-Schmidt | Subtract combined projection | Less stable |
| Modified Gram-Schmidt | Subtract projections sequentially | More stable |
| Householder QR | Use reflections | More stable for dense matrices |
| Givens QR | Use plane rotations | Useful for sparse or structured updates |
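The stability gap shows up on nearly dependent columns. The sketch below compares both methods in plain Python on a classic test case with a tiny perturbation $\varepsilon$:

```python
import math

def dot(u, v): return sum(a * b for a, b in zip(u, v))
def norm(u): return math.sqrt(dot(u, u))

def cgs(cols):
    # Classical GS: all coefficients taken against the ORIGINAL column.
    qs = []
    for a in cols:
        coeffs = [dot(q, a) for q in qs]
        v = [a[k] - sum(c * q[k] for c, q in zip(coeffs, qs))
             for k in range(len(a))]
        qs.append([t / norm(v) for t in v])
    return qs

def mgs(cols):
    # Modified GS: each projection is subtracted from the UPDATED vector.
    qs = []
    for a in cols:
        v = list(a)
        for q in qs:
            c = dot(q, v)
            v = [v[k] - c * q[k] for k in range(len(v))]
        qs.append([t / norm(v) for t in v])
    return qs

# Nearly dependent columns (a classic test case): eps is tiny.
eps = 1e-8
cols = [[1.0, eps, 0.0, 0.0],
        [1.0, 0.0, eps, 0.0],
        [1.0, 0.0, 0.0, eps]]

q_c, q_m = cgs(cols), mgs(cols)
print(abs(dot(q_c[1], q_c[2])))  # large: orthogonality is lost
print(abs(dot(q_m[1], q_m[2])))  # tiny: orthogonality survives
```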
78.12 Householder Reflections
A Householder reflection is an orthogonal transformation that reflects vectors across a hyperplane.
It has the form

$$H = I - 2 \frac{v v^T}{v^T v},$$

where $v$ is a nonzero vector normal to the hyperplane of reflection.

The matrix $H$ is symmetric and orthogonal:

$$H^T = H, \qquad H^T H = H^2 = I.$$
Householder QR uses a sequence of reflections to introduce zeros below the diagonal. At each step, a reflection is chosen to map part of a column to a multiple of a coordinate vector.
After applying these reflections, the matrix becomes upper triangular:

$$H_n \cdots H_2 H_1 A = R.$$

Since each $H_k$ is orthogonal and symmetric,

$$Q = H_1 H_2 \cdots H_n.$$

Thus

$$A = QR.$$
Householder QR is a standard dense QR method because it is stable and efficient.
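A single reflection step can be sketched as follows. The sign of $\alpha$ is chosen opposite to the first entry of $x$ to avoid cancellation; the input vector is illustrative:

```python
import math

def householder_apply(x):
    # Reflect x onto a multiple of the first coordinate vector e1.
    alpha = -math.copysign(math.sqrt(sum(t * t for t in x)), x[0])
    v = list(x)
    v[0] -= alpha                     # v = x - alpha * e1
    vv = sum(t * t for t in v)
    c = 2.0 * sum(v[k] * x[k] for k in range(len(x))) / vv
    return [x[k] - c * v[k] for k in range(len(x))]  # H x

print(householder_apply([3.0, 4.0]))  # [-5.0, 0.0] up to rounding
```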
78.13 Givens Rotations
A Givens rotation is an orthogonal transformation that acts on only two coordinates at a time.
In two dimensions, it has the form

$$G = \begin{bmatrix} c & -s \\ s & c \end{bmatrix},$$

where

$$c = \cos\theta, \qquad s = \sin\theta.$$
A Givens rotation can be chosen to zero one selected entry of a vector. Applied repeatedly, these rotations can transform a matrix into upper triangular form.
Givens rotations are especially useful when the matrix is sparse. Since each rotation affects only two rows, the method can preserve more structure than a dense Householder transformation.
They are also useful for updating QR decompositions when rows or columns are added or changed.
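Choosing the rotation that zeros the second entry of a pair $(a, b)$ takes only a few lines (the numbers below are illustrative):

```python
import math

def givens(a, b):
    # Return (c, s) such that applying [[c, s], [-s, c]] to (a, b) gives (r, 0).
    r = math.hypot(a, b)
    return (a / r, b / r) if r != 0 else (1.0, 0.0)

c, s = givens(3.0, 4.0)
rotated = [c * 3.0 + s * 4.0, -s * 3.0 + c * 4.0]
print(rotated)  # [5.0, 0.0] up to rounding
```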
78.14 Uniqueness
QR decomposition is not completely unique unless a sign convention is imposed.
If

$$A = QR,$$

then one may multiply a column of $Q$ by $-1$ and the corresponding row of $R$ by $-1$. The product remains unchanged.

To make the reduced QR decomposition unique for a full column rank matrix, one usually requires the diagonal entries of $R$ to be positive:

$$r_{jj} > 0 \quad \text{for all } j.$$
Under this convention, the reduced QR decomposition is unique.
78.15 Rank Deficiency
If the columns of $A$ are linearly dependent, then some diagonal entry of $R$ becomes zero in exact arithmetic.

For a full column rank matrix, $\hat{R}$ is nonsingular. For a rank deficient matrix, $\hat{R}$ is singular.
Rank deficiency creates difficulties in solving least squares problems because the minimizer may fail to be unique. In such cases, pivoted QR decomposition is often used.
A column-pivoted QR decomposition has the form

$$AP = QR,$$

where $P$ is a permutation matrix that reorders the columns of $A$. The pivoting helps reveal numerical rank.
78.16 Determinants from QR
For a square matrix

$$A = QR,$$

the determinant satisfies

$$\det A = \det Q \, \det R.$$

For an orthogonal matrix,

$$\det Q = \pm 1.$$

Since $R$ is triangular,

$$\det R = \prod_{i=1}^{n} r_{ii}.$$

Therefore

$$\det A = \pm \prod_{i=1}^{n} r_{ii}.$$

If only the absolute value is needed,

$$|\det A| = \prod_{i=1}^{n} |r_{ii}|.$$

For large matrices, one often computes

$$\log |\det A| = \sum_{i=1}^{n} \log |r_{ii}|$$

to avoid overflow and underflow.
78.17 QR Algorithm for Eigenvalues
QR decomposition also appears in the QR algorithm for computing eigenvalues.
The basic iteration starts with

$$A_0 = A.$$

At step $k$, compute the QR decomposition

$$A_k = Q_k R_k.$$

Then form

$$A_{k+1} = R_k Q_k.$$

Since

$$A_{k+1} = Q_k^T A_k Q_k,$$

the matrices $A_k$ are orthogonally similar. Therefore they all have the same eigenvalues.
Under suitable conditions and with practical enhancements, the sequence $A_k$ tends toward a form from which eigenvalues can be read. This is one of the major uses of QR decomposition in numerical linear algebra.
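For a symmetric $2 \times 2$ matrix, the iteration can be sketched with a single Givens rotation per step; the diagonal approaches the eigenvalues (the matrix below is illustrative, with eigenvalues $3$ and $1$):

```python
import math

def qr_step_2x2(A):
    # One unshifted QR iteration A -> R Q for a 2x2 matrix,
    # using a single Givens rotation to triangularize.
    a, b = A[0][0], A[1][0]
    r = math.hypot(a, b)
    c, s = a / r, b / r
    # R = G^T A, where G = [[c, -s], [s, c]] zeros the (2,1) entry.
    R = [[c * A[0][0] + s * A[1][0], c * A[0][1] + s * A[1][1]],
         [-s * A[0][0] + c * A[1][0], -s * A[0][1] + c * A[1][1]]]
    # Next iterate: A_next = R Q with Q = G.
    return [[R[0][0] * c + R[0][1] * s, -R[0][0] * s + R[0][1] * c],
            [R[1][0] * c + R[1][1] * s, -R[1][0] * s + R[1][1] * c]]

A = [[2.0, 1.0], [1.0, 2.0]]     # eigenvalues 3 and 1
for _ in range(30):
    A = qr_step_2x2(A)
print(A[0][0], A[1][1])  # approaches 3 and 1
```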
78.18 Algorithm: Modified Gram-Schmidt QR
The following algorithm computes a reduced QR decomposition using modified Gram-Schmidt.
In Python, with the matrix stored as a list of rows:

```python
import math

def modified_gram_schmidt_qr(A):
    m, n = len(A), len(A[0])
    Q = [[0.0] * n for _ in range(m)]
    R = [[0.0] * n for _ in range(n)]
    for j in range(n):
        v = [A[k][j] for k in range(m)]              # j-th column of A
        for i in range(j):
            R[i][j] = sum(Q[k][i] * v[k] for k in range(m))
            v = [v[k] - R[i][j] * Q[k][i] for k in range(m)]
        R[j][j] = math.sqrt(sum(t * t for t in v))   # norm of the residual
        if R[j][j] == 0:
            raise ValueError("columns are linearly dependent")
        for k in range(m):
            Q[k][j] = v[k] / R[j][j]
    return Q, R
```

This algorithm is easy to understand and useful for teaching. For high-quality dense numerical software, Householder QR is often preferred.
78.19 Storage in Practice
QR decomposition is often stored implicitly.
A dense implementation using Householder reflections may store the upper triangular part of $R$ in the upper triangular part of the original matrix. The Householder vectors are stored below the diagonal. The matrix $Q$ is not always formed explicitly.
| Object | Practical storage |
|---|---|
| $R$ | Upper triangular part of the array |
| $Q$ | Product of stored Householder reflectors |
| Householder vectors | Lower triangular part of the array |
| Reduced $\hat{Q}$ | Formed only when needed |
This storage model saves memory and improves performance. Many algorithms only need to multiply by $Q$ or $Q^T$, so forming $Q$ explicitly would be unnecessary work.
78.20 Comparison with LU and Cholesky
| Decomposition | Form | Applies to | Main advantage |
|---|---|---|---|
| LU | $A = LU$ | General square matrices | Fast direct solves |
| Cholesky | $A = LL^T$ | Symmetric positive definite matrices | Faster SPD solves |
| QR | $A = QR$ | Square or rectangular matrices | Orthogonality and least squares |
| SVD | $A = U \Sigma V^T$ | General matrices | Rank and conditioning information |
QR is usually more expensive than LU for a square dense solve, but it is more stable in important settings and naturally handles rectangular matrices. It is the standard decomposition behind many reliable least squares methods.
78.21 Summary
QR decomposition factors a matrix as

$$A = QR.$$

The factor $Q$ has orthonormal columns. The factor $R$ is upper triangular. Geometrically, $Q$ gives an orthonormal basis for the column space of $A$, while $R$ gives the coordinates of the original columns in that basis.
QR decomposition is used to solve least squares problems, orthogonalize vectors, solve square systems, update factorizations, estimate rank, and compute eigenvalues. Its strength comes from orthogonality. Since orthogonal transformations preserve lengths and angles, QR-based algorithms tend to have good numerical behavior.
The main computational forms are Gram-Schmidt, Householder reflections, and Givens rotations. Gram-Schmidt explains the idea. Householder reflections are standard for dense matrices. Givens rotations are useful when sparsity or incremental updates matter.