Eigenvalues are numbers that describe how a linear transformation stretches or compresses space along special directions.
Most vectors change direction when a matrix acts on them. A few vectors may keep their direction. These vectors are called eigenvectors. The factors by which they are stretched are called eigenvalues.
Eigenvalues are among the most important objects in linear algebra. They appear in differential equations, quantum mechanics, numerical analysis, graph theory, optimization, statistics, machine learning, and dynamical systems.
The study of eigenvalues connects algebra, geometry, and computation.
59.1 Motivation
Consider the matrix
A=\begin{pmatrix}2&1\\1&2\end{pmatrix}
Apply $A$ to the vector
v_1=\begin{pmatrix}1\\1\end{pmatrix}
Then
Av_1=\begin{pmatrix}3\\3\end{pmatrix}=3v_1
The vector keeps its direction. Only its length changes.
Now apply $A$ to
v_2=\begin{pmatrix}1\\-1\end{pmatrix}
Then
Av_2=\begin{pmatrix}1\\-1\end{pmatrix}=1\cdot v_2
Again, the direction is preserved.
The vectors $v_1$ and $v_2$ are eigenvectors. The numbers $3$ and $1$ are eigenvalues.
Most vectors do not behave this way. For example,
A\begin{pmatrix}1\\0\end{pmatrix}=\begin{pmatrix}2\\1\end{pmatrix}
which is not a scalar multiple of the original vector.
Eigenvectors identify the intrinsic directions of a transformation.
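These claims are easy to check numerically. A minimal sketch in pure Python, using A = [[2, 1], [1, 2]] as an illustrative matrix whose eigenvectors are (1, 1) and (1, -1) (an assumption made for this sketch):

```python
def matvec(A, v):
    """Multiply a 2x2 matrix A by a 2-vector v."""
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

def is_multiple(u, v):
    """Check whether u is a scalar multiple of v (2D cross product is zero)."""
    return u[0] * v[1] - u[1] * v[0] == 0

A = [[2, 1], [1, 2]]

# (1, 1) is an eigenvector: A maps it to (3, 3) = 3 * (1, 1).
print(matvec(A, [1, 1]))    # [3, 3]
# (1, -1) is an eigenvector with eigenvalue 1: the vector is unchanged.
print(matvec(A, [1, -1]))   # [1, -1]
# A generic vector such as (1, 0) does not keep its direction.
print(is_multiple(matvec(A, [1, 0]), [1, 0]))  # False
```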
59.2 Definition of Eigenvalue
Let $V$ be a vector space and let
T:V\to V
be a linear transformation.
A nonzero vector $v\in V$ is called an eigenvector of $T$ if there exists a scalar $\lambda$ such that
T(v)=\lambda v
The scalar $\lambda$ is called the eigenvalue associated with $v$.
For matrices, the definition becomes
Av=\lambda v
The vector $v$ must be nonzero. Otherwise every scalar $\lambda$ would satisfy $A0=\lambda 0$.
The eigenvalue equation says that the action of $A$ on $v$ only rescales the vector.
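The definition translates directly into a predicate that tests a candidate pair. A sketch in pure Python (the matrix and pairs below are illustrative choices, not from the text):

```python
def is_eigenpair(A, lam, v):
    """Return True if Av = lam * v for a nonzero 2-vector v."""
    if v == [0, 0]:
        return False  # the zero vector is excluded by definition
    Av = [A[0][0] * v[0] + A[0][1] * v[1],
          A[1][0] * v[0] + A[1][1] * v[1]]
    return Av == [lam * v[0], lam * v[1]]

A = [[2, 0], [0, 5]]
print(is_eigenpair(A, 2, [1, 0]))  # True: e1 is rescaled by 2
print(is_eigenpair(A, 3, [1, 0]))  # False: wrong scalar
print(is_eigenpair(A, 2, [0, 0]))  # False: zero vector is not an eigenvector
```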
59.3 Geometric Interpretation
Geometrically, eigenvectors are directions that remain invariant under the transformation.
The matrix may:
- stretch the vector,
- shrink the vector,
- reverse the vector,
- leave the vector unchanged.
If
\lambda>1
the vector is stretched.
If
0<\lambda<1
the vector is compressed.
If
\lambda<0
the vector reverses direction.
If
\lambda=1
the vector remains unchanged.
If
\lambda=0
the vector is mapped to zero.
For example, reflection across the $x$-axis has matrix
\begin{pmatrix}1&0\\0&-1\end{pmatrix}
Vectors on the $x$-axis have eigenvalue $1$. Vectors on the $y$-axis have eigenvalue $-1$.
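A quick numerical check of the reflection example, assuming the matrix [[1, 0], [0, -1]] for reflection across the x-axis:

```python
R = [[1, 0], [0, -1]]  # reflection across the x-axis

def matvec(A, v):
    """Multiply a 2x2 matrix A by a 2-vector v."""
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

# A vector on the x-axis is fixed: eigenvalue 1.
print(matvec(R, [3, 0]))   # [3, 0]
# A vector on the y-axis is reversed: eigenvalue -1.
print(matvec(R, [0, 2]))   # [0, -2]
```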
59.4 Rearranging the Eigenvalue Equation
Starting from
Av=\lambda v
move all terms to one side:
Av-\lambda v=0
Factor out $v$:
(A-\lambda I)v=0
Here $I$ is the identity matrix.
This is a homogeneous system of linear equations. A nonzero solution exists only if the matrix
A-\lambda I
is singular.
Therefore,
\det(A-\lambda I)=0
This equation determines the eigenvalues.
59.5 Characteristic Polynomial
The polynomial
p(\lambda)=\det(A-\lambda I)
is called the characteristic polynomial of $A$.
Its roots are the eigenvalues of the matrix.
For an $n\times n$ matrix, the characteristic polynomial has degree $n$.
Example
Let
A=\begin{pmatrix}2&1\\1&2\end{pmatrix}
Then
A-\lambda I=\begin{pmatrix}2-\lambda&1\\1&2-\lambda\end{pmatrix}
Compute the determinant:
\det(A-\lambda I)=(2-\lambda)^2-1
Expand:
\lambda^2-4\lambda+3
Solve
\lambda^2-4\lambda+3=0
Factor:
(\lambda-1)(\lambda-3)=0
The eigenvalues are
\lambda_1=3,\qquad \lambda_2=1
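For any 2×2 matrix the characteristic polynomial is $\lambda^2-(\operatorname{tr}A)\lambda+\det A$, so the eigenvalues follow from the quadratic formula. A sketch, assuming the example matrix A = [[2, 1], [1, 2]]:

```python
import math

A = [[2, 1], [1, 2]]
tr = A[0][0] + A[1][1]                       # trace = 4
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]  # determinant = 3

# Roots of lambda^2 - tr*lambda + det = 0.
disc = math.sqrt(tr * tr - 4 * det)
eigenvalues = sorted([(tr - disc) / 2, (tr + disc) / 2])
print(eigenvalues)  # [1.0, 3.0]
```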
59.6 Finding Eigenvectors
After computing an eigenvalue, substitute it into
(A-\lambda I)v=0
and solve for $v$.
Example
For
A=\begin{pmatrix}2&1\\1&2\end{pmatrix}
consider the eigenvalue
\lambda_1=3
Then
A-3I=\begin{pmatrix}-1&1\\1&-1\end{pmatrix}
Solve
\begin{pmatrix}-1&1\\1&-1\end{pmatrix}\begin{pmatrix}x\\y\end{pmatrix}=\begin{pmatrix}0\\0\end{pmatrix}
The equations reduce to
x=y
Thus every nonzero multiple of
v_1=\begin{pmatrix}1\\1\end{pmatrix}
is an eigenvector corresponding to eigenvalue $3$.
Now consider $\lambda_2=1$:
A-I=\begin{pmatrix}1&1\\1&1\end{pmatrix}
The equations become
x+y=0
Thus
v_2=\begin{pmatrix}1\\-1\end{pmatrix}
is an eigenvector.
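Each claimed eigenvector can be verified by checking that $(A-\lambda I)v=0$. A sketch, assuming the illustrative matrix A = [[2, 1], [1, 2]] with eigenpairs (3, (1, 1)) and (1, (1, -1)):

```python
def residual(A, lam, v):
    """Compute (A - lam*I)v for a 2x2 matrix; [0, 0] means v is an eigenvector."""
    return [(A[0][0] - lam) * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + (A[1][1] - lam) * v[1]]

A = [[2, 1], [1, 2]]
print(residual(A, 3, [1, 1]))    # [0, 0]
print(residual(A, 1, [1, -1]))   # [0, 0]
# Scalar multiples of an eigenvector are also eigenvectors.
print(residual(A, 3, [5, 5]))    # [0, 0]
```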
59.7 Eigenspaces
The set of all eigenvectors associated with an eigenvalue $\lambda$, together with the zero vector, forms a subspace.
This subspace is called the eigenspace corresponding to $\lambda$.
The eigenspace is
E_\lambda=\{v: Av=\lambda v\}=\ker(A-\lambda I)
Thus eigenspaces are null spaces: $E_\lambda$ is the null space of $A-\lambda I$.
For the previous example:
E_3=\operatorname{span}\left\{\begin{pmatrix}1\\1\end{pmatrix}\right\}
and
E_1=\operatorname{span}\left\{\begin{pmatrix}1\\-1\end{pmatrix}\right\}
Each eigenspace is a line through the origin.
59.8 Algebraic and Geometric Multiplicity
An eigenvalue may appear more than once as a root of the characteristic polynomial.
The number of times it appears is called its algebraic multiplicity.
The dimension of its eigenspace is called its geometric multiplicity.
For every eigenvalue,
1\le \text{geometric multiplicity}\le \text{algebraic multiplicity}
Example
Consider
A=\begin{pmatrix}2&1\\0&2\end{pmatrix}
The characteristic polynomial is
(2-\lambda)^2
Thus $\lambda=2$ has algebraic multiplicity $2$.
Now solve
(A-2I)v=0
We obtain
\begin{pmatrix}0&1\\0&0\end{pmatrix}\begin{pmatrix}x\\y\end{pmatrix}=\begin{pmatrix}0\\0\end{pmatrix}
Thus
y=0
The eigenspace is
E_2=\operatorname{span}\left\{\begin{pmatrix}1\\0\end{pmatrix}\right\}
Its dimension is $1$. Therefore the geometric multiplicity is $1$.
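The gap between the two multiplicities can be seen numerically. A sketch, assuming the Jordan-type matrix A = [[2, 1], [0, 2]], where λ = 2 is a double root but only vectors of the form (x, 0) solve (A − 2I)v = 0:

```python
A = [[2, 1], [0, 2]]
lam = 2

def residual(v):
    """Compute (A - lam*I)v; here A - lam*I = [[0, 1], [0, 0]]."""
    return [(A[0][0] - lam) * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + (A[1][1] - lam) * v[1]]

print(residual([1, 0]))  # [0, 0]: (1, 0) spans the eigenspace
print(residual([0, 1]))  # [1, 0]: (0, 1) is NOT an eigenvector
```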
59.9 Diagonal Matrices
Diagonal matrices provide the simplest example of eigenvalues.
If
D=\begin{pmatrix}d_1&&\\&\ddots&\\&&d_n\end{pmatrix}
then the eigenvalues are exactly the diagonal entries:
\lambda_i=d_i
The standard basis vectors $e_1,\dots,e_n$ are eigenvectors.
For example,
De_i=d_ie_i
Diagonal matrices are easy to understand because each coordinate acts independently.
Much of spectral theory attempts to reduce matrices to diagonal form.
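The independence of coordinates is easy to see in code. A sketch with an illustrative diagonal matrix D = diag(3, -1):

```python
D = [[3, 0], [0, -1]]  # illustrative diagonal matrix

def matvec(A, v):
    """Multiply a 2x2 matrix A by a 2-vector v."""
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

# Each standard basis vector is an eigenvector with its diagonal entry.
print(matvec(D, [1, 0]))  # [3, 0]  = 3 * e1
print(matvec(D, [0, 1]))  # [0, -1] = -1 * e2
```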
59.10 Triangular Matrices
For triangular matrices, the eigenvalues are also the diagonal entries.
If
T=\begin{pmatrix}t_{11}&*&*\\0&t_{22}&*\\0&0&t_{33}\end{pmatrix}
then
\det(T-\lambda I)=(t_{11}-\lambda)(t_{22}-\lambda)(t_{33}-\lambda)
Therefore the eigenvalues are
t_{11},\ t_{22},\ t_{33}
This fact is important in numerical linear algebra because many algorithms reduce matrices to triangular form.
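Since det(T − λI) factors along the diagonal, it vanishes exactly at the diagonal entries. A sketch with an illustrative 2×2 upper-triangular matrix:

```python
T = [[4, 7], [0, 2]]  # illustrative upper-triangular matrix

def char_poly(T, lam):
    """Evaluate det(T - lam*I) for a 2x2 matrix."""
    return (T[0][0] - lam) * (T[1][1] - lam) - T[0][1] * T[1][0]

# The determinant vanishes exactly at the diagonal entries 4 and 2.
print(char_poly(T, 4))  # 0
print(char_poly(T, 2))  # 0
# Any other value gives a nonzero determinant.
print(char_poly(T, 1))  # 3 = (4-1)*(2-1)
```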
59.11 Complex Eigenvalues
Real matrices may have complex eigenvalues.
Consider
A=\begin{pmatrix}0&-1\\1&0\end{pmatrix}
This matrix rotates vectors by $90^\circ$.
No nonzero real vector keeps its direction under this rotation.
Compute the characteristic polynomial:
\det(A-\lambda I)=\lambda^2+1
The roots are
\lambda=\pm i
Thus the eigenvalues are complex.
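Python's built-in complex numbers make this easy to check. A sketch, assuming the 90° rotation matrix [[0, -1], [1, 0]], for which the complex vector (1, -i) is an eigenvector with eigenvalue i:

```python
A = [[0, -1], [1, 0]]   # rotation by 90 degrees
v = [1, -1j]            # complex eigenvector (an assumption of this sketch)
lam = 1j                # eigenvalue i

Av = [A[0][0] * v[0] + A[0][1] * v[1],
      A[1][0] * v[0] + A[1][1] * v[1]]
# Av equals lam * v, so (lam, v) is a genuine (complex) eigenpair.
print(Av == [lam * v[0], lam * v[1]])  # True
# The characteristic polynomial lambda^2 + 1 vanishes at lambda = i.
print(lam ** 2 + 1 == 0)               # True
```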
Complex eigenvalues are essential in oscillatory systems, wave equations, quantum mechanics, and control theory.
59.12 Determinant and Trace
The eigenvalues are closely related to the determinant and trace.
If $A$ has eigenvalues
\lambda_1,\lambda_2,\dots,\lambda_n
counted with multiplicity, then
\det(A)=\lambda_1\lambda_2\cdots\lambda_n
and
\operatorname{tr}(A)=\lambda_1+\lambda_2+\cdots+\lambda_n
These identities follow from the characteristic polynomial.
They connect local geometric scaling with global algebraic quantities.
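Both identities can be confirmed numerically. A sketch, assuming the illustrative matrix A = [[2, 1], [1, 2]] with eigenvalues 3 and 1:

```python
A = [[2, 1], [1, 2]]
eigenvalues = [3, 1]  # assumed known for this sketch

det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
trace = A[0][0] + A[1][1]

print(det == eigenvalues[0] * eigenvalues[1])    # True: 3 = 3 * 1
print(trace == eigenvalues[0] + eigenvalues[1])  # True: 4 = 3 + 1
```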
59.13 Eigenvalues and Dynamical Systems
Repeated application of a matrix reveals the importance of eigenvalues.
Suppose
x_{k+1}=Ax_k
Then
x_k=A^kx_0
If $x_0=v$ is an eigenvector with eigenvalue $\lambda$, then
A^kv=\lambda^kv
Thus:
- if $|\lambda|>1$, growth occurs,
- if $|\lambda|<1$, decay occurs,
- if $|\lambda|=1$, oscillation or stability occurs.
Eigenvalues therefore determine long-term behavior.
This principle appears in population models, differential equations, Markov chains, iterative algorithms, and neural networks.
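Iteration makes the role of eigenvalues concrete. A sketch, assuming x_0 = (1, 1) is an eigenvector of the illustrative matrix A = [[2, 1], [1, 2]] with eigenvalue 3, so each step scales the vector by 3:

```python
def matvec(A, v):
    """Multiply a 2x2 matrix A by a 2-vector v."""
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

A = [[2, 1], [1, 2]]
x = [1, 1]  # eigenvector with eigenvalue 3 (assumption of this sketch)

for k in range(1, 4):
    x = matvec(A, x)
    print(k, x)  # each application multiplies the vector by 3

# After 3 steps: 3^3 * (1, 1) = (27, 27), i.e. A^k v = lambda^k v.
print(x == [27, 27])  # True
```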
59.14 Spectral Perspective
The collection of eigenvalues of a matrix is called its spectrum.
Spectral theory studies how operators behave through their eigenvalues and eigenvectors.
Many difficult problems become simpler in spectral coordinates.
Examples include:
| Problem | Spectral interpretation |
|---|---|
| Heat equation | Modes decay exponentially |
| Vibrating systems | Natural frequencies |
| Principal component analysis | Largest variance directions |
| Quantum mechanics | Energy levels |
| Graph analysis | Connectivity structure |
| Markov chains | Long-term probability behavior |
The spectral viewpoint is one of the unifying themes of modern mathematics.
59.15 Summary
An eigenvector of a matrix $A$ is a nonzero vector $v$ satisfying
Av=\lambda v
The scalar $\lambda$ is the eigenvalue.
Eigenvalues are found from
\det(A-\lambda I)=0
Eigenvectors are obtained by solving
(A-\lambda I)v=0
Eigenvalues describe invariant directions and scaling behavior of linear transformations. They connect algebraic structure, geometric behavior, and dynamical evolution.
The next chapter studies eigenvectors and eigenspaces in greater detail, including independence, bases of eigenvectors, and diagonalization.