# Chapter 11. Invertible Matrices

An invertible matrix is a square matrix that has a matrix inverse. The inverse reverses the action of the original matrix. If \(A\) sends \(x\) to \(Ax\), then \(A^{-1}\) sends \(Ax\) back to \(x\). Formally, an \(n\times n\) matrix \(A\) is invertible if there exists an \(n\times n\) matrix \(B\) such that \(AB=BA=I_n\). In that case, \(B\) is unique and is written \(A^{-1}\).

## 11.1 Definition

Let \(A\) be an \(n\times n\) matrix over a field \(F\). The matrix \(A\) is invertible if there exists an \(n\times n\) matrix \(B\) such that

$$
AB=I_n
$$

and

$$
BA=I_n.
$$

The matrix \(B\) is called the inverse of \(A\). It is denoted by

$$
A^{-1}.
$$

Thus

$$
AA^{-1}=I_n
$$

and

$$
A^{-1}A=I_n.
$$

The identity matrix plays the role of \(1\) in matrix multiplication. An invertible matrix is therefore a matrix that has a multiplicative inverse.

## 11.2 Why the Matrix Must Be Square

The two-sided inverse defined above exists only for square matrices.

Suppose \(A\) is \(m\times n\) with \(m\ne n\). For the products to be defined, a candidate inverse \(B\) must be \(n\times m\), so that \(AB\) is \(m\times m\) and \(BA\) is \(n\times n\). But the rank of a product never exceeds the rank of either factor, so

$$
\operatorname{rank}(AB)\le\min(m,n)
\quad\text{and}\quad
\operatorname{rank}(BA)\le\min(m,n),
$$

while \(I_m\) has rank \(m\) and \(I_n\) has rank \(n\). Since \(m\ne n\), one of these identity matrices has rank larger than \(\min(m,n)\), so the two equations

$$
AB=I_m
$$

and

$$
BA=I_n
$$

cannot both hold. Non-square matrices may have one-sided inverses in special cases, but they never have the ordinary two-sided inverses defined for square matrices.

## 11.3 The Meaning of an Inverse

If \(A\) is invertible, then the transformation

$$
x\mapsto Ax
$$

can be undone.

Starting with a vector \(x\), apply \(A\):

$$
x\mapsto Ax.
$$

Then apply \(A^{-1}\):

$$
Ax\mapsto A^{-1}(Ax).
$$

By associativity,

$$
A^{-1}(Ax)=(A^{-1}A)x=I_nx=x.
$$

Thus \(A^{-1}\) recovers the original vector.

Likewise,

$$
A(A^{-1}y)=y
$$

for every vector \(y\in F^n\).
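
As a quick numerical illustration, the following sketch (NumPy is assumed here; the chapter itself uses no code) applies \(A\) and then \(A^{-1}\) to a vector and recovers the original, using the \(2\times 2\) matrix that reappears later in this chapter.

```python
import numpy as np

# A small invertible matrix (the same one used later in this chapter).
A = np.array([[1.0, 2.0],
              [3.0, 5.0]])
A_inv = np.linalg.inv(A)

x = np.array([4.0, -1.0])   # an arbitrary starting vector
y = A @ x                   # apply A:      x |-> Ax
x_back = A_inv @ y          # apply A^{-1}: Ax |-> x

print(x_back)               # [ 4. -1.], the original x up to rounding
```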

## 11.4 Uniqueness of the Inverse

If a matrix has an inverse, that inverse is unique.

Suppose \(B\) and \(C\) are both inverses of \(A\). Then

$$
AB=BA=I
$$

and

$$
AC=CA=I.
$$

We show that \(B=C\):

$$
B=BI.
$$

Since \(AC=I\),

$$
B=B(AC).
$$

By associativity,

$$
B=(BA)C.
$$

Since \(BA=I\),

$$
B=IC=C.
$$

Therefore the inverse is unique.

This justifies the notation \(A^{-1}\).

## 11.5 Invertibility and Linear Systems

Invertibility gives the simplest solution rule for square systems.

If

$$
Ax=b
$$

and \(A\) is invertible, then multiply both sides by \(A^{-1}\):

$$
A^{-1}Ax=A^{-1}b.
$$

Since

$$
A^{-1}A=I,
$$

we get

$$
x=A^{-1}b.
$$

Thus an invertible coefficient matrix gives a unique solution for every right-hand side \(b\). Conversely, if every system \(Ax=b\) has a unique solution, then \(A\) is invertible. This equivalence is one form of the invertible matrix theorem.
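
A short NumPy sketch of this solution rule (the choice of NumPy is an assumption; any linear-algebra library would do): it computes \(x=A^{-1}b\) directly and compares it with `np.linalg.solve`, which solves the system without forming the inverse. Section 11.19 returns to why the second form is usually preferred.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 5.0]])
b = np.array([1.0, 4.0])

x_via_inverse = np.linalg.inv(A) @ b    # x = A^{-1} b, as in the formula above
x_via_solve = np.linalg.solve(A, b)     # solves Ax = b without forming A^{-1}

print(x_via_inverse)                            # [ 3. -1.]
print(np.allclose(x_via_inverse, x_via_solve))  # True
```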

## 11.6 Singular Matrices

A square matrix that is not invertible is called singular.

A singular matrix cannot be undone. The transformation \(x\mapsto Ax\) loses information. In geometric terms, it collapses at least one nonzero direction. In algebraic terms, the homogeneous equation

$$
Ax=0
$$

has a nonzero solution.

For example,

$$
A=
\begin{bmatrix}
1&2\\
2&4
\end{bmatrix}
$$

is singular. Its second row is twice its first row, so the two rows do not give two independent constraints.

Indeed,

$$
A
\begin{bmatrix}
-2\\
1
\end{bmatrix} =
\begin{bmatrix}
1(-2)+2(1)\\
2(-2)+4(1)
\end{bmatrix} =
\begin{bmatrix}
0\\
0
\end{bmatrix}.
$$

A nonzero vector has been sent to zero, so \(A\) cannot be invertible.
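
The same computation can be checked numerically. In this NumPy sketch (an illustration, not part of the argument above), the determinant is zero, the nonzero vector \((-2,1)\) is sent to zero, and asking for the inverse raises an error.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

print(np.linalg.det(A))                  # 0.0 (up to floating-point rounding)
print(A @ np.array([-2.0, 1.0]))         # [0. 0.], a nonzero vector collapsed to zero

try:
    np.linalg.inv(A)
except np.linalg.LinAlgError:
    print("A is singular: no inverse exists")
```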

## 11.7 The Invertible Matrix Theorem

For an \(n\times n\) matrix \(A\) over a field, the following statements are equivalent:

| Statement | Meaning |
|---|---|
| \(A\) is invertible | There exists \(A^{-1}\) |
| \(A\) is row equivalent to \(I_n\) | Row reduction gives the identity |
| \(A\) has a pivot in every column | No free variables occur |
| \(A\) has rank \(n\) | Full rank |
| \(Ax=0\) has only the trivial solution | No nonzero vector is collapsed |
| \(Ax=b\) has a unique solution for every \(b\in F^n\) | Every right-hand side is reachable exactly once |
| The columns of \(A\) are linearly independent | No column is redundant |
| The columns of \(A\) span \(F^n\) | Every vector can be produced |
| The linear map \(x\mapsto Ax\) is bijective | One-to-one and onto |
| \(\det(A)\ne 0\) | Nonzero determinant |

These conditions are either all true or all false for a given square matrix.
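
The equivalences can be spot-checked numerically. The sketch below is a rough illustration only: floating-point rank and determinant tests need a tolerance and can mislead for ill-conditioned matrices. It evaluates two of the conditions for the invertible and singular examples used in this chapter.

```python
import numpy as np

def looks_invertible(A, tol=1e-10):
    """Numerically test two equivalent conditions: full rank and nonzero determinant."""
    n = A.shape[0]
    full_rank = np.linalg.matrix_rank(A) == n
    nonzero_det = abs(np.linalg.det(A)) > tol
    return full_rank, nonzero_det

print(looks_invertible(np.array([[1.0, 2.0], [3.0, 5.0]])))  # (True, True)
print(looks_invertible(np.array([[1.0, 2.0], [2.0, 4.0]])))  # (False, False)
```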

## 11.8 Invertibility and Row Reduction

A square matrix \(A\) is invertible exactly when its reduced row echelon form is the identity matrix.

For example,

$$
A=
\begin{bmatrix}
1&2\\
3&5
\end{bmatrix}.
$$

Row reduce:

$$
\begin{bmatrix}
1&2\\
3&5
\end{bmatrix}
\longrightarrow
\begin{bmatrix}
1&2\\
0&-1
\end{bmatrix}
$$

using

$$
R_2\leftarrow R_2-3R_1.
$$

Then

$$
\begin{bmatrix}
1&2\\
0&-1
\end{bmatrix}
\longrightarrow
\begin{bmatrix}
1&2\\
0&1
\end{bmatrix}
$$

using

$$
R_2\leftarrow -R_2.
$$

Then

$$
\begin{bmatrix}
1&2\\
0&1
\end{bmatrix}
\longrightarrow
\begin{bmatrix}
1&0\\
0&1
\end{bmatrix}
$$

using

$$
R_1\leftarrow R_1-2R_2.
$$

Since \(A\) row reduces to \(I_2\), it is invertible.

## 11.9 Computing the Inverse by Row Reduction

To compute \(A^{-1}\), form the augmented matrix

$$
[A\mid I_n].
$$

Then row reduce. If the left side becomes \(I_n\), the right side becomes \(A^{-1}\):

$$
[A\mid I_n]\longrightarrow [I_n\mid A^{-1}].
$$

If the left side cannot be reduced to \(I_n\), then \(A\) is singular.

This method works because each row operation is the same as multiplying on the left by an elementary matrix. If the operations \(E_1,\dots,E_k\) reduce \(A\) to the identity, then \(E_k\cdots E_1A=I_n\), so \(E_k\cdots E_1=A^{-1}\). Applying the same operations to the identity side builds up exactly this product, so the right half of the augmented matrix ends as \(A^{-1}\).
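
The procedure can also be written out directly. The following sketch is a minimal Gauss-Jordan implementation in NumPy with partial pivoting, offered for illustration only (not a production routine): it row reduces \([A\mid I_n]\) and returns the right half.

```python
import numpy as np

def inverse_by_row_reduction(A, tol=1e-12):
    """Invert A by row reducing the augmented matrix [A | I], as described above."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    M = np.hstack([A, np.eye(n)])                    # the augmented matrix [A | I]
    for col in range(n):
        pivot = col + np.argmax(np.abs(M[col:, col]))  # partial pivoting
        if abs(M[pivot, col]) < tol:
            raise ValueError("matrix is singular: no usable pivot in this column")
        M[[col, pivot]] = M[[pivot, col]]            # swap the pivot row into place
        M[col] /= M[col, col]                        # scale so the pivot equals 1
        for row in range(n):
            if row != col:
                M[row] -= M[row, col] * M[col]       # clear the rest of the column
    return M[:, n:]                                  # left half is now I, right half is A^{-1}

print(inverse_by_row_reduction([[1, 2], [3, 5]]))    # [[-5.  2.] [ 3. -1.]]
```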

## 11.10 Example: Computing an Inverse

Let

$$
A=
\begin{bmatrix}
1&2\\
3&5
\end{bmatrix}.
$$

Form

$$
[A\mid I] =
\left[
\begin{array}{cc|cc}
1&2&1&0\\
3&5&0&1
\end{array}
\right].
$$

Use

$$
R_2\leftarrow R_2-3R_1.
$$

Then

$$
\left[
\begin{array}{cc|cc}
1&2&1&0\\
0&-1&-3&1
\end{array}
\right].
$$

Use

$$
R_2\leftarrow -R_2.
$$

Then

$$
\left[
\begin{array}{cc|cc}
1&2&1&0\\
0&1&3&-1
\end{array}
\right].
$$

Use

$$
R_1\leftarrow R_1-2R_2.
$$

Then

$$
\left[
\begin{array}{cc|cc}
1&0&-5&2\\
0&1&3&-1
\end{array}
\right].
$$

Therefore

$$
A^{-1} =
\begin{bmatrix}
-5&2\\
3&-1
\end{bmatrix}.
$$

Check:

$$
AA^{-1} =
\begin{bmatrix}
1&2\\
3&5
\end{bmatrix}
\begin{bmatrix}
-5&2\\
3&-1
\end{bmatrix} =
\begin{bmatrix}
1&0\\
0&1
\end{bmatrix}.
$$

## 11.11 Formula for a \(2\times 2\) Inverse

Let

$$
A=
\begin{bmatrix}
a&b\\
c&d
\end{bmatrix}.
$$

Then \(A\) is invertible exactly when

$$
ad-bc\ne 0.
$$

In that case,

$$
A^{-1} =
\frac{1}{ad-bc}
\begin{bmatrix}
d&-b\\
-c&a
\end{bmatrix}.
$$

The scalar \(ad-bc\) is the determinant of \(A\).

For example,

$$
A=
\begin{bmatrix}
1&2\\
3&5
\end{bmatrix}
$$

has determinant

$$
1\cdot 5-2\cdot 3=-1.
$$

Therefore

$$
A^{-1} =
\frac{1}{-1}
\begin{bmatrix}
5&-2\\
-3&1
\end{bmatrix} =
\begin{bmatrix}
-5&2\\
3&-1
\end{bmatrix}.
$$
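
The closed-form formula is easy to encode. A small sketch (Python/NumPy assumed, as in the earlier examples):

```python
import numpy as np

def inverse_2x2(a, b, c, d):
    """Return the inverse of [[a, b], [c, d]] via the formula above."""
    det = a * d - b * c
    if det == 0:
        raise ValueError("ad - bc = 0, so the matrix is not invertible")
    return np.array([[d, -b], [-c, a]], dtype=float) / det

print(inverse_2x2(1, 2, 3, 5))   # [[-5.  2.]
                                 #  [ 3. -1.]]
```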

## 11.12 Invertibility and Determinants

For a square matrix over a field,

$$
A \text{ is invertible}
$$

if and only if

$$
\det(A)\ne 0.
$$

If

$$
\det(A)=0,
$$

then \(A\) is singular.

The determinant criterion is often useful theoretically. In computation, however, solving systems by row reduction or decomposition is usually preferred to explicitly computing an inverse. Matrix inversion is often unnecessary when the goal is only to solve \(Ax=b\).

## 11.13 Products of Invertible Matrices

If \(A\) and \(B\) are invertible \(n\times n\) matrices, then \(AB\) is invertible, and

$$
(AB)^{-1}=B^{-1}A^{-1}.
$$

The order reverses.

To verify this, compute:

$$
(AB)(B^{-1}A^{-1}) =
A(BB^{-1})A^{-1} =
AI A^{-1} =
AA^{-1} =
I.
$$

Also,

$$
(B^{-1}A^{-1})(AB) =
B^{-1}(A^{-1}A)B =
B^{-1}IB =
B^{-1}B =
I.
$$

Thus \(B^{-1}A^{-1}\) is the inverse of \(AB\).
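
A quick numerical check of the reversal rule (random matrices are invertible with probability one, so this is an illustration rather than a proof):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))   # random matrices: almost surely invertible
B = rng.standard_normal((3, 3))

lhs = np.linalg.inv(A @ B)
rhs = np.linalg.inv(B) @ np.linalg.inv(A)   # note the reversed order

print(np.allclose(lhs, rhs))      # True
```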

## 11.14 Powers of an Invertible Matrix

If \(A\) is invertible, then every positive power

$$
A^k
$$

is invertible, and

$$
(A^k)^{-1}=(A^{-1})^k.
$$

We also define negative powers by

$$
A^{-k}=(A^{-1})^k.
$$

Thus

$$
A^{-2}=A^{-1}A^{-1}.
$$

This notation is meaningful only when \(A\) is invertible.
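
NumPy's `np.linalg.matrix_power` follows this convention: a negative exponent is handled by inverting first and then taking the positive power. A quick check, using the same \(2\times 2\) matrix as before:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 5.0]])

A_minus_2 = np.linalg.matrix_power(A, -2)      # defined as (A^{-1})^2
check = np.linalg.inv(A) @ np.linalg.inv(A)

print(np.allclose(A_minus_2, check))           # True
```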

## 11.15 Transposes of Invertible Matrices

If \(A\) is invertible, then \(A^T\) is invertible, and

$$
(A^T)^{-1}=(A^{-1})^T.
$$

To see this, transpose the identity

$$
AA^{-1}=I.
$$

Using the rule

$$
(AB)^T=B^TA^T,
$$

we get

$$
(A^{-1})^TA^T=I.
$$

Similarly, transposing

$$
A^{-1}A=I
$$

gives

$$
A^T(A^{-1})^T=I.
$$

Therefore \((A^{-1})^T\) is the inverse of \(A^T\).
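
A one-line numerical check of this identity (NumPy, as above):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 5.0]])

print(np.allclose(np.linalg.inv(A.T), np.linalg.inv(A).T))  # True
```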

## 11.16 Elementary Matrices and Inverses

Every elementary row operation corresponds to multiplication by an elementary matrix.

Each elementary matrix is invertible because each elementary row operation is reversible.

For example, the row operation

$$
R_2\leftarrow R_2-3R_1
$$

on a \(2\times 2\) matrix is represented by

$$
E=
\begin{bmatrix}
1&0\\
-3&1
\end{bmatrix}.
$$

The inverse operation is

$$
R_2\leftarrow R_2+3R_1,
$$

represented by

$$
E^{-1} =
\begin{bmatrix}
1&0\\
3&1
\end{bmatrix}.
$$

Indeed,

$$
EE^{-1} =
\begin{bmatrix}
1&0\\
-3&1
\end{bmatrix}
\begin{bmatrix}
1&0\\
3&1
\end{bmatrix} =
\begin{bmatrix}
1&0\\
0&1
\end{bmatrix}.
$$
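
The correspondence between row operations and left multiplication can be checked directly. In this sketch, `E @ A` reproduces the first reduction step from Section 11.8, and `E @ E_inv` gives the identity.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 5.0]])

E = np.array([[1.0, 0.0],
              [-3.0, 1.0]])     # encodes R2 <- R2 - 3*R1
E_inv = np.array([[1.0, 0.0],
                  [3.0, 1.0]])  # encodes R2 <- R2 + 3*R1

print(E @ A)        # [[ 1.  2.] [ 0. -1.]], the row operation applied to A
print(E @ E_inv)    # the 2x2 identity matrix
```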

## 11.17 Invertible Matrices as a Group

The set of all invertible \(n\times n\) matrices over a field \(F\) is denoted

$$
GL_n(F).
$$

It is called the general linear group.

This set is closed under matrix multiplication. It contains the identity matrix. Every element has an inverse. Matrix multiplication is associative. Thus \(GL_n(F)\) forms a group under multiplication.

This group is important because it represents all invertible linear changes of coordinates in \(F^n\).

## 11.18 Geometric Meaning

An invertible matrix represents a transformation that preserves dimension.

It may rotate, reflect, shear, scale, or combine these actions. But it cannot collapse a line to a point, a plane to a line, or a three-dimensional region to a plane.

For example,

$$
A=
\begin{bmatrix}
2&0\\
0&3
\end{bmatrix}
$$

is invertible. It stretches the \(x\)-direction by \(2\) and the \(y\)-direction by \(3\). Its inverse is

$$
A^{-1} =
\begin{bmatrix}
1/2&0\\
0&1/3
\end{bmatrix}.
$$

The inverse reverses the stretching.

By contrast,

$$
B=
\begin{bmatrix}
1&0\\
0&0
\end{bmatrix}
$$

is singular. It sends

$$
\begin{bmatrix}
x\\
y
\end{bmatrix}
$$

to

$$
\begin{bmatrix}
x\\
0
\end{bmatrix}.
$$

It collapses the plane onto the \(x\)-axis, losing the \(y\)-coordinate. No inverse can recover information that has been lost.
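
The loss of information is easy to see numerically: two different inputs have the same image under \(B\), so no map, linear or otherwise, can send that image back to a single well-defined input.

```python
import numpy as np

B = np.array([[1.0, 0.0],
              [0.0, 0.0]])

# Two different vectors with the same image: the y-coordinate is unrecoverable.
print(B @ np.array([3.0, 7.0]))   # [3. 0.]
print(B @ np.array([3.0, -4.0]))  # [3. 0.]
```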

## 11.19 Solving Without Explicit Inversion

Although the formula

$$
x=A^{-1}b
$$

is mathematically correct, it is often not the best computational method.

In numerical computation, one usually solves

$$
Ax=b
$$

by elimination or factorization, such as LU decomposition, rather than computing \(A^{-1}\) explicitly. Computing an inverse can cost extra work and may introduce additional numerical error.

Thus the inverse is conceptually important, but direct inversion is not always computationally preferred.
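
A sketch of the factor-once, solve-many pattern, assuming SciPy is available (an assumption; the chapter itself names no library): the LU factorization is computed once and reused for several right-hand sides.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

A = np.array([[1.0, 2.0],
              [3.0, 5.0]])

lu, piv = lu_factor(A)            # factor A once
for b in (np.array([1.0, 4.0]), np.array([0.0, 1.0])):
    x = lu_solve((lu, piv), b)    # reuse the factorization for each b
    print(x, np.allclose(A @ x, b))
```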

## 11.20 Common Mistakes

| Mistake | Correction |
|---|---|
| Assuming every square matrix is invertible | A square matrix may be singular |
| Trying to invert a non-square matrix | Ordinary two-sided inverses require square matrices |
| Writing \((AB)^{-1}=A^{-1}B^{-1}\) | The correct order is \((AB)^{-1}=B^{-1}A^{-1}\) |
| Using \(x=A^{-1}b\) when \(A^{-1}\) does not exist | First check invertibility |
| Treating \(A^{-1}\) as entrywise reciprocal | Matrix inversion is not entrywise inversion |
| Computing an inverse just to solve one system | Use elimination or factorization when appropriate |

## 11.21 Summary

An invertible matrix is a square matrix whose action can be reversed. Its inverse \(A^{-1}\) satisfies

$$
AA^{-1}=A^{-1}A=I.
$$

Invertibility has many equivalent forms:

| Viewpoint | Equivalent condition |
|---|---|
| Algebraic | \(A^{-1}\) exists |
| Row-reduction | \(A\) row reduces to \(I_n\) |
| Systems | \(Ax=b\) has a unique solution for every \(b\) |
| Homogeneous systems | \(Ax=0\) has only the zero solution |
| Rank | \(\operatorname{rank}(A)=n\) |
| Columns | Columns are linearly independent and span \(F^n\) |
| Geometry | The transformation does not collapse dimension |
| Determinants | \(\det(A)\ne 0\) |

Invertible matrices are the algebraic model of reversible linear transformations. They are central to solving systems, changing coordinates, understanding dimension, and studying linear maps.
