Differential equations describe quantities that change continuously.
A differential equation relates an unknown function to one or more of its derivatives. The unknown is usually a function of time, position, or both. The derivative expresses rate of change.
Linear algebra enters differential equations because many differential equations can be written as vector equations. Systems of first-order linear differential equations are governed by matrices. Their solutions depend on eigenvalues, eigenvectors, matrix exponentials, diagonalization, Jordan form, and numerical linear algebra.
The central linear system has the form

$$\frac{d\mathbf{x}}{dt} = A\mathbf{x}.$$

Here $\mathbf{x}(t)$ is a vector-valued function and $A$ is a square matrix. The matrix $A$ determines how the state changes over time.
113.1 Scalar Differential Equations
A scalar differential equation involves one unknown function.
For example,

$$\frac{dx}{dt} = ax$$

describes exponential growth or decay.

If $a > 0$, the solution grows. If $a < 0$, the solution decays.

The solution is

$$x(t) = e^{at}x(0).$$

This equation says that the rate of change of $x$ is proportional to $x$ itself.
Many physical and mathematical models begin with this principle.
| Equation | Interpretation |
|---|---|
| $x' = ax$ | Growth or decay |
| $x' = a(x - c)$ | Relaxation toward or away from $c$ |
| $x'' = -\omega^2 x$ | Harmonic oscillation |
| $x' = ax + f(t)$ | Forced first-order system |
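The scalar solution $x(t) = e^{at}x(0)$ can be checked numerically; a minimal sketch in Python (the function name `solve_scalar` is illustrative, not from the text):

```python
import math

# Exact solution of the scalar equation x' = a x: x(t) = e^{a t} x(0).
# A minimal sketch; the function name is illustrative.
def solve_scalar(a, x0, t):
    return math.exp(a * t) * x0

growth = solve_scalar(0.5, 1.0, 2.0)   # a > 0: the solution grows
decay = solve_scalar(-0.5, 1.0, 2.0)   # a < 0: the solution decays
```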
The scalar case gives the basic idea. The vector case shows where linear algebra becomes essential.
113.2 Systems of Differential Equations
A system of differential equations has several unknown functions.
Let

$$\mathbf{x}(t) = \begin{pmatrix} x_1(t) \\ x_2(t) \\ \vdots \\ x_n(t) \end{pmatrix}.$$

A first-order linear homogeneous system has the form

$$\mathbf{x}' = A\mathbf{x},$$

where $A$ is an $n \times n$ matrix of constants.

Written componentwise, this means

$$x_i'(t) = a_{i1}x_1(t) + a_{i2}x_2(t) + \cdots + a_{in}x_n(t), \qquad i = 1, \dots, n.$$

The matrix $A$ encodes all interactions among the components.
113.3 Initial Value Problems
An initial value problem specifies both the differential equation and the initial state:

$$\mathbf{x}' = A\mathbf{x}, \qquad \mathbf{x}(0) = \mathbf{x}_0.$$

The goal is to find the function $\mathbf{x}(t)$ satisfying both conditions.

The matrix $A$ determines the dynamics. The vector $\mathbf{x}_0$ determines which particular trajectory occurs.

For a scalar equation, the solution is $x(t) = e^{at}x_0$. For a matrix equation, the analogous solution is

$$\mathbf{x}(t) = e^{tA}\mathbf{x}_0.$$

The object $e^{tA}$ is the matrix exponential.
113.4 Matrix Exponential
The matrix exponential is defined by the power series

$$e^{tA} = I + tA + \frac{(tA)^2}{2!} + \frac{(tA)^3}{3!} + \cdots.$$

This definition is valid for every square matrix $A$.

The solution of

$$\mathbf{x}' = A\mathbf{x}, \qquad \mathbf{x}(0) = \mathbf{x}_0$$

is

$$\mathbf{x}(t) = e^{tA}\mathbf{x}_0.$$

This is the exact vector analogue of the scalar solution $x(t) = e^{at}x_0$. Matrix exponentials are standard tools for solving systems of linear differential equations and for describing linear time evolution.
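The defining series can be truncated to compute $e^{tA}$ numerically; the sketch below (assuming NumPy and SciPy are available) compares a partial sum with `scipy.linalg.expm`:

```python
import numpy as np
from scipy.linalg import expm

def expm_series(A, t, terms=30):
    """Partial sum I + tA + (tA)^2/2! + ... of the defining series."""
    result = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ (t * A) / k   # next term (tA)^k / k!
        result = result + term
    return result

A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
approx = expm_series(A, 1.0)
reference = expm(1.0 * A)   # SciPy's matrix exponential
```

In practice `expm` uses more robust algorithms than direct series summation, but for small matrices the truncated series already agrees to high accuracy.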
113.5 Why the Matrix Exponential Works
Differentiate the power series term by term:

$$\frac{d}{dt}e^{tA} = A + tA^2 + \frac{t^2 A^3}{2!} + \cdots.$$

Factor out $A$:

$$\frac{d}{dt}e^{tA} = A\left(I + tA + \frac{(tA)^2}{2!} + \cdots\right).$$

Therefore

$$\frac{d}{dt}e^{tA} = Ae^{tA}.$$

If

$$\mathbf{x}(t) = e^{tA}\mathbf{x}_0,$$

then

$$\mathbf{x}'(t) = Ae^{tA}\mathbf{x}_0 = A\mathbf{x}(t).$$

Also,

$$\mathbf{x}(0) = e^{0}\mathbf{x}_0 = I\mathbf{x}_0 = \mathbf{x}_0.$$

Thus $\mathbf{x}(t) = e^{tA}\mathbf{x}_0$ solves the initial value problem.
113.6 Diagonal Matrices
Matrix exponentials are easy when $A$ is diagonal.

Let

$$A = \begin{pmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{pmatrix}.$$

Then

$$e^{tA} = \begin{pmatrix} e^{\lambda_1 t} & & \\ & \ddots & \\ & & e^{\lambda_n t} \end{pmatrix}.$$

Each coordinate evolves independently:

$$x_i(t) = e^{\lambda_i t} x_i(0).$$

A diagonal system is therefore a collection of uncoupled scalar equations.
113.7 Diagonalization
Suppose $A$ is diagonalizable. Then

$$A = PDP^{-1},$$

where $D$ is diagonal.

The columns of $P$ are eigenvectors of $A$, and the diagonal entries of $D$ are eigenvalues.

Since powers of $A$ satisfy

$$A^k = PD^kP^{-1},$$

the matrix exponential satisfies

$$e^{tA} = Pe^{tD}P^{-1}.$$
Thus diagonalization reduces a coupled system to independent scalar equations.
This is one of the main reasons eigenvalues and eigenvectors are central in differential equations.
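The identity $e^{tA} = Pe^{tD}P^{-1}$ can be verified numerically; the matrix below is an illustrative choice, not taken from the text:

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])            # diagonalizable: distinct eigenvalues 1 and 3
t = 0.7

eigenvalues, P = np.linalg.eig(A)     # columns of P are eigenvectors
expD = np.diag(np.exp(t * eigenvalues))   # e^{tD} exponentiates the diagonal
via_diagonalization = P @ expD @ np.linalg.inv(P)

direct = expm(t * A)                  # reference value of e^{tA}
```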
113.8 Eigenvector Solutions
If $\mathbf{v}$ is an eigenvector of $A$ with eigenvalue $\lambda$, then

$$A\mathbf{v} = \lambda\mathbf{v}.$$

Consider

$$\mathbf{x}(t) = e^{\lambda t}\mathbf{v}.$$

Then

$$\mathbf{x}'(t) = \lambda e^{\lambda t}\mathbf{v},$$

and

$$A\mathbf{x}(t) = e^{\lambda t}A\mathbf{v} = \lambda e^{\lambda t}\mathbf{v}.$$

Thus

$$\mathbf{x}'(t) = A\mathbf{x}(t).$$

Each eigenvector gives a special solution.

If $A$ has a basis of eigenvectors $\mathbf{v}_1, \dots, \mathbf{v}_n$ with eigenvalues $\lambda_1, \dots, \lambda_n$, then the general solution is

$$\mathbf{x}(t) = c_1 e^{\lambda_1 t}\mathbf{v}_1 + \cdots + c_n e^{\lambda_n t}\mathbf{v}_n.$$

The constants $c_1, \dots, c_n$ are determined by the initial condition.
113.9 Example: A Diagonalizable System
Let

$$A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}.$$

The eigenvalues are $\lambda_1 = 3$ and $\lambda_2 = 1$.

For $\lambda_1 = 3$, an eigenvector is

$$\mathbf{v}_1 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}.$$

For $\lambda_2 = 1$, an eigenvector is

$$\mathbf{v}_2 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}.$$

Therefore the general solution of

$$\mathbf{x}' = A\mathbf{x}$$

is

$$\mathbf{x}(t) = c_1 e^{3t}\begin{pmatrix} 1 \\ 1 \end{pmatrix} + c_2 e^{t}\begin{pmatrix} 1 \\ -1 \end{pmatrix}.$$

This expression separates the motion into two independent eigen-directions.

The term with eigenvalue $3$ grows faster than the term with eigenvalue $1$.
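A numerical check of the eigenvector form of the solution, using the symmetric matrix $\begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ as an illustrative diagonalizable example (eigenvalues $3$ and $1$):

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])            # eigenvalues 3 and 1
x0 = np.array([2.0, 0.0])

eigenvalues, V = np.linalg.eig(A)
c = np.linalg.solve(V, x0)            # expand x0 in the eigenvector basis

t = 0.5
# Sum of modes c_i e^{λ_i t} v_i, compared against e^{tA} x0.
by_modes = sum(c[i] * np.exp(eigenvalues[i] * t) * V[:, i] for i in range(2))
by_exponential = expm(t * A) @ x0
```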
113.10 Stability
Stability concerns the behavior of solutions as $t \to \infty$.

For the linear homogeneous system

$$\mathbf{x}' = A\mathbf{x},$$

the eigenvalues of $A$ determine stability.

If all eigenvalues have negative real parts, then

$$\mathbf{x}(t) \to \mathbf{0} \quad \text{as } t \to \infty$$

for every initial condition.

If some eigenvalue has positive real part, then there are solutions that grow without bound.

If eigenvalues lie on the imaginary axis, the behavior may involve oscillation, neutral stability, or instability, depending on the matrix structure.
| Eigenvalue condition | Typical behavior |
|---|---|
| $\operatorname{Re}\lambda < 0$ for all eigenvalues | Decay |
| $\operatorname{Re}\lambda > 0$ for some eigenvalue | Growth |
| $\operatorname{Im}\lambda \neq 0$ | Oscillation |
| Repeated eigenvalue with defective structure | Polynomial factors may appear |
Thus spectral information gives qualitative information about the differential equation.
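Spectral stability can be tested directly from the eigenvalues; a small helper sketch (the function name and test matrices are assumptions, not from the text):

```python
import numpy as np

def classify_stability(A, tol=1e-9):
    """Classify x' = Ax by the real parts of the eigenvalues of A."""
    real_parts = np.linalg.eigvals(A).real
    if np.all(real_parts < -tol):
        return "asymptotically stable"
    if np.any(real_parts > tol):
        return "unstable"
    return "marginal"

stable = classify_stability(np.array([[-1.0, 0.0], [0.0, -2.0]]))
unstable = classify_stability(np.array([[1.0, 0.0], [0.0, -2.0]]))
marginal = classify_stability(np.array([[0.0, -1.0], [1.0, 0.0]]))  # eigenvalues ±i
```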
113.11 Complex Eigenvalues
Real matrices may have complex eigenvalues.

Suppose

$$\lambda = \alpha + i\beta$$

is an eigenvalue.

Then the corresponding exponential is

$$e^{\lambda t} = e^{(\alpha + i\beta)t} = e^{\alpha t}e^{i\beta t}.$$

Using Euler’s formula,

$$e^{i\beta t} = \cos(\beta t) + i\sin(\beta t).$$

Thus complex eigenvalues produce oscillation.

The real part $\alpha$ controls growth or decay. The imaginary part $\beta$ controls angular frequency.
| Eigenvalue | Behavior |
|---|---|
| $\alpha < 0$, $\beta \neq 0$ | Decaying oscillation |
| $\alpha = 0$, $\beta \neq 0$ | Sustained oscillation |
| $\alpha > 0$, $\beta \neq 0$ | Growing oscillation |
This explains spirals in planar systems.
113.12 Planar Systems
A planar linear system has the form

$$\mathbf{x}' = A\mathbf{x}, \qquad A \in \mathbb{R}^{2 \times 2}.$$

The phase plane shows trajectories in the $(x_1, x_2)$-plane.

The eigenvalues of $A$ classify many common behaviors.
| Eigenvalues | Phase portrait |
|---|---|
| Two negative real eigenvalues | Stable node |
| Two positive real eigenvalues | Unstable node |
| Opposite signs | Saddle |
| Complex with negative real part | Stable spiral |
| Complex with positive real part | Unstable spiral |
| Pure imaginary | Center |
For example,

$$A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$$

has eigenvalues

$$\lambda = \pm i.$$

The system is

$$x_1' = -x_2, \qquad x_2' = x_1.$$

Its solutions rotate around the origin.
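For the rotation generator $\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$ with eigenvalues $\pm i$, the matrix exponential is an exact rotation, which is why trajectories circle the origin; a short numerical check:

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, -1.0],
              [1.0, 0.0]])            # eigenvalues ±i

t = 0.9
R = expm(t * A)                       # should equal rotation by angle t
rotation = np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

x0 = np.array([1.0, 0.0])
x = R @ x0                            # stays on the unit circle
```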
113.13 Second-Order Equations as First-Order Systems
Many differential equations involve second derivatives. Linear algebra handles them by rewriting them as first-order systems.
Consider

$$x'' + bx' + cx = 0.$$

Set

$$x_1 = x, \qquad x_2 = x'.$$

Then

$$x_1' = x_2,$$

and

$$x_2' = x'' = -cx_1 - bx_2.$$

Thus

$$\begin{pmatrix} x_1 \\ x_2 \end{pmatrix}' = \begin{pmatrix} 0 & 1 \\ -c & -b \end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \end{pmatrix}.$$

This converts one second-order scalar equation into a first-order vector system.

The matrix

$$A = \begin{pmatrix} 0 & 1 \\ -c & -b \end{pmatrix}$$

then determines the behavior.
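The reduction can be checked on the harmonic oscillator $x'' = -x$, whose companion matrix with state $(x, x')$ is $\begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}$ and whose solution with $x(0) = 1$, $x'(0) = 0$ is $x(t) = \cos t$:

```python
import numpy as np
from scipy.linalg import expm

# Companion matrix for x'' = -x with state (x, x'):
# x1' = x2 and x2' = -x1.
A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
state0 = np.array([1.0, 0.0])         # x(0) = 1, x'(0) = 0

t = 1.3
state = expm(t * A) @ state0          # (x(t), x'(t)) = (cos t, -sin t)
```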
113.14 Forced Linear Systems
A nonhomogeneous linear system has the form

$$\mathbf{x}' = A\mathbf{x} + \mathbf{f}(t).$$

Here $\mathbf{f}(t)$ is an external forcing term.

The solution is given by variation of constants:

$$\mathbf{x}(t) = e^{tA}\mathbf{x}_0 + \int_0^t e^{(t-s)A}\mathbf{f}(s)\,ds.$$
The first term is the natural response. The integral term is the forced response.
This formula shows how the matrix exponential propagates both the initial condition and the external input.
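For constant forcing $\mathbf{f}(s) = \mathbf{b}$ with invertible $A$, the integral has the closed form $A^{-1}(e^{tA} - I)\mathbf{b}$, which makes the formula easy to check (the matrices below are an illustrative choice):

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[-1.0, 0.0],
              [0.0, -2.0]])
b = np.array([1.0, 1.0])              # constant forcing f(s) = b
x0 = np.array([0.0, 0.0])
t = 2.0

natural = expm(t * A) @ x0                                  # e^{tA} x0
forced = np.linalg.solve(A, (expm(t * A) - np.eye(2)) @ b)  # A^{-1}(e^{tA} - I) b
x = natural + forced
```

Because both eigenvalues are negative, $x$ approaches the equilibrium $-A^{-1}\mathbf{b}$ as $t$ grows.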
113.15 Equilibrium Points
An equilibrium point is a constant solution.
For the system

$$\mathbf{x}' = A\mathbf{x},$$

the vector $\mathbf{x} = \mathbf{0}$ is always an equilibrium.

For an affine system

$$\mathbf{x}' = A\mathbf{x} + \mathbf{b},$$

an equilibrium satisfies

$$A\mathbf{x}^* + \mathbf{b} = \mathbf{0}.$$

If $A$ is invertible, then

$$\mathbf{x}^* = -A^{-1}\mathbf{b}.$$

The stability of this equilibrium is determined by the eigenvalues of $A$.

By shifting variables,

$$\mathbf{y} = \mathbf{x} - \mathbf{x}^*,$$

the affine system becomes

$$\mathbf{y}' = A\mathbf{y}.$$
Thus the study of affine systems reduces to homogeneous linear systems.
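Finding the equilibrium of an affine system is a single linear solve; a minimal sketch with an illustrative matrix:

```python
import numpy as np

# Equilibrium of the affine system x' = Ax + b: solve A x* = -b.
A = np.array([[-2.0, 1.0],
              [0.0, -1.0]])
b = np.array([1.0, 2.0])

x_star = np.linalg.solve(A, -b)
residual = A @ x_star + b             # vanishes at the equilibrium
```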
113.16 Systems with Constant Coefficients
The matrix equation

$$\mathbf{x}' = A\mathbf{x}$$

is called a constant-coefficient linear system because $A$ does not depend on $t$.

If the matrix depends on time,

$$\mathbf{x}' = A(t)\mathbf{x},$$

then the problem is more complicated. In general,

$$\mathbf{x}(t) = \exp\!\left(\int_0^t A(s)\,ds\right)\mathbf{x}_0$$

does not give the solution unless the matrices $A(t_1)$ and $A(t_2)$ commute at different times $t_1$ and $t_2$.

For constant coefficients, all powers of $A$ commute with each other, and the exponential formula is exact.
This makes constant-coefficient systems a fundamental class.
113.17 Defective Matrices and Jordan Form
A matrix may fail to have enough eigenvectors for diagonalization.
In that case, Jordan form gives the replacement.
A Jordan block has the form

$$J = \begin{pmatrix} \lambda & 1 & & \\ & \lambda & \ddots & \\ & & \ddots & 1 \\ & & & \lambda \end{pmatrix}.$$

Jordan form records eigenvalues and the structure of generalized eigenvectors.

For a Jordan block,

$$J = \lambda I + N,$$

where $N$ is nilpotent. Therefore

$$e^{tJ} = e^{\lambda t}e^{tN}.$$

Since $N^k = 0$ for some $k$, the exponential $e^{tN}$ is a finite polynomial in $t$.

Thus defective matrices produce terms such as

$$te^{\lambda t}, \quad t^2 e^{\lambda t}, \quad \dots$$

in solutions.
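The polynomial factor appears already for a $2 \times 2$ Jordan block, where $N^2 = 0$ gives $e^{tJ} = e^{\lambda t}(I + tN)$; a quick numerical check:

```python
import numpy as np
from scipy.linalg import expm

lam, t = -1.0, 2.0
J = np.array([[lam, 1.0],
              [0.0, lam]])            # Jordan block: λI + N with N nilpotent

computed = expm(t * J)
expected = np.exp(lam * t) * np.array([[1.0, t],
                                       [0.0, 1.0]])   # e^{λt}(I + tN)
```

The off-diagonal entry $te^{\lambda t}$ is exactly the polynomial-times-exponential term described above.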
113.18 Discretization
Differential equations are often solved numerically.
A simple method is Euler’s method. For

$$\mathbf{x}' = A\mathbf{x},$$

choose a step size $h$. Approximate

$$\mathbf{x}'(t)$$

by

$$\frac{\mathbf{x}(t+h) - \mathbf{x}(t)}{h}.$$

Thus

$$\mathbf{x}_{k+1} = (I + hA)\mathbf{x}_k.$$

This turns a continuous differential equation into a discrete linear recurrence.

After $k$ steps,

$$\mathbf{x}_k = (I + hA)^k \mathbf{x}_0.$$

This approximation should be compared with the exact solution

$$\mathbf{x}(kh) = e^{khA}\mathbf{x}_0.$$
Numerical methods for differential equations therefore depend on matrix powers, spectral stability, conditioning, and approximation.
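Euler’s recurrence $(I + hA)^k \mathbf{x}_0$ can be compared with $e^{khA}\mathbf{x}_0$ directly; since Euler’s method is first-order, halving the step size should roughly halve the error (the test matrix is an illustrative choice):

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[-1.0, 1.0],
              [0.0, -2.0]])
x0 = np.array([1.0, 1.0])
T = 1.0

def euler_error(h):
    """Error of Euler's method x_{k+1} = (I + hA) x_k at time T."""
    x = x0.copy()
    for _ in range(int(round(T / h))):
        x = (np.eye(2) + h * A) @ x
    return np.linalg.norm(x - expm(T * A) @ x0)

error_coarse = euler_error(0.1)
error_fine = euler_error(0.05)        # roughly half the coarse error
```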
113.19 Linear Differential Equations in Applications
Linear differential equations appear in many areas.
| Field | Model |
|---|---|
| Mechanics | Coupled springs and masses |
| Electrical engineering | Circuits |
| Control theory | State-space systems |
| Biology | Linear population models |
| Chemistry | Reaction networks near equilibrium |
| Economics | Linear dynamic systems |
| Quantum mechanics | Schrödinger equation |
| Heat flow | Discretized diffusion equations |
| Vibrations | Normal modes |
| Signal processing | Linear filters |
Many nonlinear systems are also studied by linearization near equilibrium points. This means replacing a nonlinear system by its derivative matrix at a point.
The resulting matrix describes local behavior.
113.20 Summary
Differential equations describe change. Linear algebra describes coupled change.
A first-order homogeneous linear system has the form

$$\mathbf{x}' = A\mathbf{x}.$$

Its solution is

$$\mathbf{x}(t) = e^{tA}\mathbf{x}(0).$$
When $A$ is diagonalizable, the system decomposes into independent modes determined by eigenvalues and eigenvectors. Eigenvalues describe growth, decay, oscillation, and stability.
Second-order equations can be rewritten as first-order systems. Forced systems are solved using matrix exponentials and integrals. Numerical methods convert differential equations into matrix recurrences.
The main lesson is that a linear differential equation is a dynamic form of a matrix problem. The matrix determines how the state evolves, and the tools of linear algebra reveal the structure of that evolution.