Matrix operations are the algebraic rules used to combine and transform matrices. The basic operations are equality, addition, subtraction, scalar multiplication, matrix-vector multiplication, matrix multiplication, transpose, and powers. Each operation has size conditions. A matrix expression is meaningful only when these size conditions are satisfied. Standard matrix algebra defines addition entry by entry, scalar multiplication entry by entry, matrix multiplication by row-column dot products, and transpose by interchanging rows and columns.
7.1 Equality
Two matrices are equal when they have the same size and the same entries in corresponding positions.
If $A = [a_{ij}]$ and $B = [b_{ij}]$ are $m \times n$ matrices, then
$$A = B$$
means
$$a_{ij} = b_{ij}$$
for every valid pair $(i, j)$.
For example,
$$\begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}.$$
But
$$\begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \neq \begin{bmatrix} 1 & 3 \\ 2 & 4 \end{bmatrix}.$$
The entries are the same numbers, but they occupy different positions.
7.2 Addition
Matrices of the same size can be added. The sum is formed by adding corresponding entries.
If $A = [a_{ij}]$ and $B = [b_{ij}]$ are both $m \times n$ matrices, then
$$A + B = [a_{ij} + b_{ij}].$$
For example,
$$\begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} + \begin{bmatrix} 5 & 6 \\ 7 & 8 \end{bmatrix} = \begin{bmatrix} 6 & 8 \\ 10 & 12 \end{bmatrix}.$$
Matrix addition is defined only when the two matrices have the same number of rows and the same number of columns.
7.3 Subtraction
Matrix subtraction is defined by adding the additive inverse:
$$A - B = A + (-1)B.$$
If $A = [a_{ij}]$ and $B = [b_{ij}]$ have the same size, then
$$A - B = [a_{ij} - b_{ij}].$$
For example,
$$\begin{bmatrix} 5 & 6 \\ 7 & 8 \end{bmatrix} - \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} = \begin{bmatrix} 4 & 4 \\ 4 & 4 \end{bmatrix}.$$
Like addition, subtraction requires equal sizes.
7.4 Scalar Multiplication
A scalar multiplies a matrix by multiplying each entry.
If $c$ is a scalar and $A = [a_{ij}]$, then
$$cA = [c\,a_{ij}].$$
For example,
$$3 \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} = \begin{bmatrix} 3 & 6 \\ 9 & 12 \end{bmatrix}.$$
Scalar multiplication preserves the size of the matrix. If $A$ is $m \times n$, then $cA$ is also $m \times n$.
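The entrywise definitions of addition, subtraction, and scalar multiplication can be checked directly in NumPy; this is a minimal sketch with arbitrary $2 \times 2$ matrices:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

S = A + B   # entrywise sum
D = B - A   # entrywise difference
T = 3 * A   # each entry multiplied by the scalar 3

print(S.tolist())  # [[6, 8], [10, 12]]
print(D.tolist())  # [[4, 4], [4, 4]]
print(T.tolist())  # [[3, 6], [9, 12]]
```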
7.5 Algebraic Laws for Addition and Scaling
For matrices $A$, $B$, $C$ of the same size and scalars $c$, $d$, the following laws hold:

| Law | Formula |
|---|---|
| Commutativity of addition | $A + B = B + A$ |
| Associativity of addition | $(A + B) + C = A + (B + C)$ |
| Additive identity | $A + 0 = A$ |
| Additive inverse | $A + (-A) = 0$ |
| Scalar distributivity over matrix addition | $c(A + B) = cA + cB$ |
| Scalar distributivity over scalar addition | $(c + d)A = cA + dA$ |
| Compatibility of scalar multiplication | $c(dA) = (cd)A$ |
These laws follow from the corresponding laws for scalar arithmetic because addition and scalar multiplication are defined entry by entry.
7.6 Matrix-Vector Multiplication
Let $A$ be an $m \times n$ matrix and let $\mathbf{x} \in \mathbb{R}^n$. Then $A\mathbf{x}$ is a vector in $\mathbb{R}^m$.
If
$$A = \begin{bmatrix} a_{11} & \cdots & a_{1n} \\ \vdots & & \vdots \\ a_{m1} & \cdots & a_{mn} \end{bmatrix}, \qquad \mathbf{x} = \begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix},$$
then
$$A\mathbf{x} = \begin{bmatrix} a_{11}x_1 + \cdots + a_{1n}x_n \\ \vdots \\ a_{m1}x_1 + \cdots + a_{mn}x_n \end{bmatrix}.$$
Each entry of $A\mathbf{x}$ is the dot product of one row of $A$ with $\mathbf{x}$.
For example,
$$\begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \begin{bmatrix} 5 \\ 6 \end{bmatrix} = \begin{bmatrix} 1 \cdot 5 + 2 \cdot 6 \\ 3 \cdot 5 + 4 \cdot 6 \end{bmatrix} = \begin{bmatrix} 17 \\ 39 \end{bmatrix}.$$
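The row-dot-product description of $A\mathbf{x}$ can be verified numerically; this sketch uses an arbitrary $2 \times 2$ matrix:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])   # 2x2 matrix
x = np.array([5, 6])             # vector in R^2

y = A @ x                                    # matrix-vector product
row_dots = [int(A[i] @ x) for i in range(2)] # each entry: row of A dotted with x

print(y.tolist())  # [17, 39]
print(row_dots)    # [17, 39], entry by entry
```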
7.7 Column Combination Form
Matrix-vector multiplication can also be read by columns.
If $A$ has columns $\mathbf{a}_1, \dots, \mathbf{a}_n$ and $\mathbf{x} = (x_1, \dots, x_n)$, then
$$A\mathbf{x} = x_1\mathbf{a}_1 + x_2\mathbf{a}_2 + \cdots + x_n\mathbf{a}_n.$$
Thus $A\mathbf{x}$ is a linear combination of the columns of $A$.
This view is often more important than the row view. It shows that the equation
$$A\mathbf{x} = \mathbf{b}$$
asks whether $\mathbf{b}$ lies in the span of the columns of $A$.
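The column view can be checked by forming the linear combination explicitly; a minimal sketch with an arbitrary $3 \times 2$ matrix:

```python
import numpy as np

A = np.array([[1, 0], [2, 1], [0, 3]])  # 3x2, columns a1 and a2
x = np.array([4, 5])

# Ax as a linear combination of the columns of A
combo = x[0] * A[:, 0] + x[1] * A[:, 1]

assert (A @ x == combo).all()
print(combo.tolist())  # [4, 13, 15]
```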
7.8 Matrix Multiplication
Let $A$ be an $m \times n$ matrix and let $B$ be an $n \times p$ matrix. Then the product $AB$ is an $m \times p$ matrix.
The entry in row $i$ and column $j$ of $AB$ is
$$(AB)_{ij} = \sum_{k=1}^{n} a_{ik} b_{kj}.$$
Equivalently,
$$(AB)_{ij} = a_{i1}b_{1j} + a_{i2}b_{2j} + \cdots + a_{in}b_{nj}.$$
This is the dot product of row $i$ of $A$ with column $j$ of $B$. Matrix multiplication is defined only when the number of columns of the left factor equals the number of rows of the right factor.
For example, let
$$A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}, \qquad B = \begin{bmatrix} 5 & 6 \\ 7 & 8 \end{bmatrix}.$$
Then
$$AB = \begin{bmatrix} 1 \cdot 5 + 2 \cdot 7 & 1 \cdot 6 + 2 \cdot 8 \\ 3 \cdot 5 + 4 \cdot 7 & 3 \cdot 6 + 4 \cdot 8 \end{bmatrix} = \begin{bmatrix} 19 & 22 \\ 43 & 50 \end{bmatrix}.$$
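The row-column rule can be replayed entry by entry in NumPy; this sketch compares the built-in product against the dot products computed by hand:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

C = A @ B  # (i, j) entry = row i of A dotted with column j of B
manual = [[int(A[i] @ B[:, j]) for j in range(2)] for i in range(2)]

print(C.tolist())  # [[19, 22], [43, 50]]
print(manual)      # same entries, one dot product at a time
```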
7.9 Size Rule for Matrix Products
The product $AB$ is defined when $A$ is $m \times n$ and $B$ is $n \times p$. The inner dimensions must match:
$$(m \times n)(n \times p).$$
The outer dimensions give the size of the result:
$$AB \text{ is } m \times p.$$
For example, the product of a $2 \times 3$ matrix and a $3 \times 4$ matrix produces a $2 \times 4$ matrix. But the product of a $2 \times 3$ matrix and a $4 \times 2$ matrix is undefined, because the inner dimensions $3$ and $4$ do not match.
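The size rule is exactly what NumPy enforces at runtime; a sketch with matrices of ones (the entries are irrelevant, only the shapes matter):

```python
import numpy as np

A = np.ones((2, 3))   # 2x3
B = np.ones((3, 4))   # 3x4: inner dimensions 3 and 3 match

print((A @ B).shape)  # (2, 4): the outer dimensions

C = np.ones((4, 2))   # inner dimensions 3 and 4 do not match
try:
    A @ C
    raised = False
except ValueError as e:
    raised = True
    print("undefined product:", e)
```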
7.10 Columns of a Product
The columns of $AB$ can be computed by multiplying $A$ by each column of $B$.
If
$$B = \begin{bmatrix} \mathbf{b}_1 & \mathbf{b}_2 & \cdots & \mathbf{b}_p \end{bmatrix},$$
then
$$AB = \begin{bmatrix} A\mathbf{b}_1 & A\mathbf{b}_2 & \cdots & A\mathbf{b}_p \end{bmatrix}.$$
This interpretation is useful because it reduces matrix multiplication to repeated matrix-vector multiplication.
It also shows that each column of $AB$ is a linear combination of the columns of $A$.
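The column-by-column description can be confirmed by rebuilding the product from matrix-vector multiplications; a minimal sketch:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

# column j of AB equals A times column j of B
cols = np.column_stack([A @ B[:, j] for j in range(B.shape[1])])

assert (cols == A @ B).all()
print(cols.tolist())  # [[19, 22], [43, 50]]
```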
7.11 Rows of a Product
The rows of $AB$ can also be read from the rows of $A$.
If the rows of $A$ are $\mathbf{r}_1, \dots, \mathbf{r}_m$, then the rows of $AB$ are
$$\mathbf{r}_1 B, \ \mathbf{r}_2 B, \ \dots, \ \mathbf{r}_m B.$$
Thus left multiplication by $A$ combines the rows of $B$, while right multiplication by $B$ combines the columns of $A$. This distinction matters in computation and in proofs.
7.12 Matrix Multiplication as Composition
Matrix multiplication represents composition of linear transformations.
Suppose $B$ is $n \times p$, representing a transformation
$$T_B : \mathbb{R}^p \to \mathbb{R}^n,$$
and $A$ is $m \times n$, representing
$$T_A : \mathbb{R}^n \to \mathbb{R}^m.$$
Then $AB$ represents the transformation that first applies $B$, then applies $A$:
$$(AB)\mathbf{x} = A(B\mathbf{x}).$$
Thus
$$T_{AB} = T_A \circ T_B.$$
This explains the size rule. The output of $B$ must live in the input space of $A$. It also explains the order: the right factor acts first. Matrix multiplication corresponds to composition of the linear transformations represented by the matrices.
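The composition rule, with the right factor acting first, can be checked numerically; a sketch with two arbitrary $2 \times 2$ matrices:

```python
import numpy as np

A = np.array([[1, 0], [1, 1]])   # acts second
B = np.array([[0, 1], [1, 0]])   # acts first
x = np.array([2, 3])

step_by_step = A @ (B @ x)   # apply B, then A
at_once = (A @ B) @ x        # apply the single matrix AB

assert (step_by_step == at_once).all()
print(at_once.tolist())  # [3, 5]
```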
7.13 Noncommutativity
Matrix multiplication is generally not commutative. Usually,
$$AB \neq BA.$$
Sometimes one product is defined and the other is not. Even when both are defined, they may differ.
For example, let
$$A = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}, \qquad B = \begin{bmatrix} 1 & 0 \\ 1 & 1 \end{bmatrix}.$$
Then
$$AB = \begin{bmatrix} 2 & 1 \\ 1 & 1 \end{bmatrix},$$
while
$$BA = \begin{bmatrix} 1 & 1 \\ 1 & 2 \end{bmatrix}.$$
Thus
$$AB \neq BA.$$
The order of multiplication must be preserved.
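A quick numerical check of noncommutativity, using two arbitrary $2 \times 2$ matrices:

```python
import numpy as np

A = np.array([[1, 1], [0, 1]])
B = np.array([[1, 0], [1, 1]])

print((A @ B).tolist())  # [[2, 1], [1, 1]]
print((B @ A).tolist())  # [[1, 1], [1, 2]]
assert not (A @ B == B @ A).all()  # the two products differ
```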
7.14 Associativity
Matrix multiplication is associative when all products are defined:
$$(AB)C = A(BC).$$
This law allows the expression
$$ABC$$
to be written without ambiguity.
Associativity is essential for linear transformations. If $A$, $B$, and $C$ represent transformations, then composing them in a fixed order gives the same result no matter how the products are grouped.
7.15 Distributive Laws
Matrix multiplication distributes over matrix addition:
$$A(B + C) = AB + AC$$
and
$$(A + B)C = AC + BC.$$
These formulas are valid when the sizes make all products and sums meaningful.
For example,
$$A(B + C) = AB + AC$$
requires $B$ and $C$ to have the same size, and $A$ must be compatible with both.
7.16 Identity Matrices
The identity matrix $I$ acts as a multiplicative identity.
If $A$ is $m \times n$, then
$$I_m A = A$$
and
$$A I_n = A.$$
The identity on the left has size $m \times m$. The identity on the right has size $n \times n$.
For example,
$$\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} a & b \\ c & d \end{bmatrix} = \begin{bmatrix} a & b \\ c & d \end{bmatrix}.$$
The size of the identity matrix is determined by context.
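For a rectangular matrix, the left and right identities have different sizes; a minimal sketch with a $2 \times 3$ matrix:

```python
import numpy as np

A = np.array([[1, 2, 3], [4, 5, 6]])  # 2x3

left = np.eye(2, dtype=int) @ A    # I_2 A: identity sized to the rows
right = A @ np.eye(3, dtype=int)   # A I_3: identity sized to the columns

assert (left == A).all() and (right == A).all()
print(left.tolist())  # [[1, 2, 3], [4, 5, 6]]
```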
7.17 Zero Products
If $A$ is $m \times n$, then
$$0A = 0$$
and
$$A0 = 0,$$
where each zero matrix has the size required for the product to be defined.
A product may be zero even when neither factor is zero.
For example,
$$\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}.$$
This behavior differs from ordinary scalar multiplication. It is one reason matrix algebra must be handled carefully.
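A zero product with nonzero factors is easy to exhibit numerically; a sketch with two arbitrary nonzero $2 \times 2$ matrices:

```python
import numpy as np

A = np.array([[1, 0], [0, 0]])
B = np.array([[0, 0], [0, 1]])

P = A @ B
print(P.tolist())  # [[0, 0], [0, 0]]: zero product from nonzero factors
assert (P == 0).all() and A.any() and B.any()
```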
7.18 Transpose
The transpose of an $m \times n$ matrix $A$ is the $n \times m$ matrix $A^T$ obtained by interchanging rows and columns.
If $A = [a_{ij}]$, then
$$(A^T)_{ij} = a_{ji}.$$
For example,
$$\begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix}^T = \begin{bmatrix} 1 & 4 \\ 2 & 5 \\ 3 & 6 \end{bmatrix}.$$
The transpose changes the shape of a rectangular matrix. It leaves a square matrix square.
7.19 Laws of Transpose
The transpose satisfies the following identities:
| Law | Formula |
|---|---|
| Double transpose | $(A^T)^T = A$ |
| Transpose of a sum | $(A + B)^T = A^T + B^T$ |
| Transpose of a scalar multiple | $(cA)^T = cA^T$ |
| Transpose of a product | $(AB)^T = B^T A^T$ |
The last formula is especially important. The order of multiplication reverses under transpose.
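The reversal of order under transpose is easy to confirm numerically; this sketch also shows that the naive, non-reversed order fails:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

# the transpose of a product reverses the order of the factors
assert ((A @ B).T == B.T @ A.T).all()
# keeping the original order is generally wrong
assert not ((A @ B).T == A.T @ B.T).all()
print((A @ B).T.tolist())  # [[19, 43], [22, 50]]
```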
7.20 Powers of a Matrix
Powers are defined only for square matrices.
If $A$ is $n \times n$, then
$$A^2 = AA, \qquad A^3 = AAA,$$
and in general,
$$A^k = \underbrace{AA \cdots A}_{k \text{ factors}}.$$
Also,
$$A^0 = I.$$
For example, if
$$A = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix},$$
then
$$A^2 = \begin{bmatrix} 1 & 2 \\ 0 & 1 \end{bmatrix}, \qquad A^3 = \begin{bmatrix} 1 & 3 \\ 0 & 1 \end{bmatrix}.$$
Matrix powers occur in difference equations, graph theory, Markov chains, dynamical systems, and matrix functions.
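Matrix powers can be computed by repeated multiplication or with `np.linalg.matrix_power`; a minimal sketch using an arbitrary $2 \times 2$ matrix:

```python
import numpy as np

A = np.array([[1, 1], [0, 1]])

A2 = A @ A                          # A^2 by repeated multiplication
A3 = np.linalg.matrix_power(A, 3)   # A^3 via the library routine

print(A2.tolist())  # [[1, 2], [0, 1]]
print(A3.tolist())  # [[1, 3], [0, 1]]
assert (np.linalg.matrix_power(A, 0) == np.eye(2)).all()  # A^0 = I
```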
7.21 Polynomial Expressions in a Matrix
If $A$ is square, then one may form polynomial expressions such as
$$p(A) = A^2 - 3A + 2I.$$
For example, if
$$A = \begin{bmatrix} 2 & 0 \\ 0 & 3 \end{bmatrix},$$
then
$$p(A) = \begin{bmatrix} 4 & 0 \\ 0 & 9 \end{bmatrix} - \begin{bmatrix} 6 & 0 \\ 0 & 9 \end{bmatrix} + \begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix} = \begin{bmatrix} 0 & 0 \\ 0 & 2 \end{bmatrix}.$$
The identity matrix appears in the constant term because each term must be an $n \times n$ matrix.
Matrix polynomials are used in the Cayley-Hamilton theorem, minimal polynomials, diagonalization, and matrix functions.
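Evaluating a matrix polynomial is a direct translation of the formula; this sketch uses the illustrative polynomial $p(A) = A^2 - 3A + 2I$ on an arbitrary diagonal matrix:

```python
import numpy as np

A = np.array([[2, 0], [0, 3]])
I = np.eye(2, dtype=int)

# p(A) = A^2 - 3A + 2I; the constant term multiplies I, not the scalar 2 alone
pA = A @ A - 3 * A + 2 * I
print(pA.tolist())  # [[0, 0], [0, 2]]
```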
7.22 Trace
The trace of a square matrix is the sum of its diagonal entries.
If $A = [a_{ij}]$ is $n \times n$, then
$$\operatorname{tr}(A) = a_{11} + a_{22} + \cdots + a_{nn} = \sum_{i=1}^{n} a_{ii}.$$
For example,
$$\operatorname{tr}\begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} = 1 + 4 = 5.$$
The trace is defined only for square matrices. It becomes important in eigenvalue theory and inner products on matrix spaces.
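NumPy provides the trace directly; a minimal sketch with an arbitrary $2 \times 2$ matrix:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])

t = np.trace(A)  # sum of the diagonal entries: 1 + 4
print(int(t))    # 5
assert t == A[0, 0] + A[1, 1]
```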
7.23 Hadamard Product
The Hadamard product is the entrywise product of two matrices of the same size.
If $A = [a_{ij}]$ and $B = [b_{ij}]$ have the same size, then
$$A \circ B = [a_{ij} b_{ij}].$$
For example,
$$\begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \circ \begin{bmatrix} 5 & 6 \\ 7 & 8 \end{bmatrix} = \begin{bmatrix} 5 & 12 \\ 21 & 32 \end{bmatrix}.$$
This operation differs from matrix multiplication. It does not represent composition of linear transformations. It is useful in statistics, numerical analysis, optimization, and elementwise computation.
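In NumPy the `*` operator is the entrywise (Hadamard) product, while `@` is matrix multiplication; confusing the two is a common bug. A minimal sketch contrasting them:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

H = A * B   # entrywise (Hadamard) product: NOT matrix multiplication
M = A @ B   # row-column matrix product, for contrast

print(H.tolist())  # [[5, 12], [21, 32]]
print(M.tolist())  # [[19, 22], [43, 50]]
```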
7.24 Common Errors
Many errors in matrix algebra come from ignoring size or order.
| Error | Correction |
|---|---|
| Adding different-sized matrices | Addition requires equal sizes |
| Multiplying when inner dimensions do not match | $A$ must be $m \times n$, $B$ must be $n \times p$ |
| Assuming $AB = BA$ | Matrix multiplication is generally noncommutative |
| Treating $A^2$ as entrywise squaring | $A^2 = AA$, not $[a_{ij}^2]$ |
| Forgetting transpose reverses products | $(AB)^T = B^T A^T$ |
| Using the wrong identity size | $I_m A = A$, $A I_n = A$ |
Matrix notation is compact, but the size constraints must always be checked.
7.25 Summary
Matrix operations turn matrices into algebraic objects. Addition and scalar multiplication are entrywise. Matrix multiplication uses row-column dot products and represents composition of linear transformations. Transpose interchanges rows and columns. Powers and polynomial expressions are defined for square matrices.
The main operations are:
| Operation | Condition | Result |
|---|---|---|
| $A + B$ | Same size | Same size |
| $A - B$ | Same size | Same size |
| $cA$ | Scalar $c$, any matrix $A$ | Same size as $A$ |
| $A\mathbf{x}$ | $A$ is $m \times n$, $\mathbf{x} \in \mathbb{R}^n$ | Vector in $\mathbb{R}^m$ |
| $AB$ | $A$ is $m \times n$, $B$ is $n \times p$ | $m \times p$ matrix |
| $A^T$ | $A$ is $m \times n$ | $n \times m$ matrix |
| $A^k$ | $A$ is square | Same size as $A$ |
These operations form the computational language used throughout the rest of linear algebra.