Eigenvectors are the nonzero vectors whose directions are preserved by a linear transformation. If a matrix sends a vector to a scalar multiple of itself, then that vector is an eigenvector.
The eigenvalue tells how much the vector is scaled. The eigenvector tells which direction is being scaled.
For a square matrix $A$, the eigenvector equation is

$$Av = \lambda v.$$

Here $v$ is a nonzero vector and $\lambda$ is a scalar. The vector $v$ is an eigenvector of $A$, and $\lambda$ is its associated eigenvalue. This formulation is standard in linear algebra: eigenvectors are nonzero vectors transformed into scalar multiples of themselves.
60.1 Direction Preserved by a Matrix
A matrix usually changes both the length and direction of a vector. Eigenvectors are special because their direction does not change.
Let

$$A = \begin{pmatrix} 3 & 0 \\ 0 & 2 \end{pmatrix}.$$

Then

$$A\begin{pmatrix} 1 \\ 0 \end{pmatrix} = \begin{pmatrix} 3 \\ 0 \end{pmatrix} = 3\begin{pmatrix} 1 \\ 0 \end{pmatrix}.$$

The vector

$$v_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$$

is an eigenvector with eigenvalue $3$.

Also,

$$A\begin{pmatrix} 0 \\ 1 \end{pmatrix} = \begin{pmatrix} 0 \\ 2 \end{pmatrix} = 2\begin{pmatrix} 0 \\ 1 \end{pmatrix}.$$

The vector

$$v_2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$$

is an eigenvector with eigenvalue $2$.

The coordinate axes are invariant directions for this transformation. Vectors on the first coordinate axis are stretched by $3$. Vectors on the second coordinate axis are stretched by $2$.
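This behavior is easy to confirm numerically. The sketch below, which assumes NumPy is available, checks that the standard basis vectors are eigenvectors of the diagonal matrix $\operatorname{diag}(3, 2)$:

```python
import numpy as np

# Diagonal matrix: stretches the first axis by 3 and the second by 2
A = np.diag([3.0, 2.0])

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

# A maps each axis vector to a scalar multiple of itself
same_direction_1 = np.allclose(A @ e1, 3 * e1)
same_direction_2 = np.allclose(A @ e2, 2 * e2)
print(same_direction_1, same_direction_2)  # True True
```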
60.2 Eigenvectors Are Not Unique
If $v$ is an eigenvector of $A$ with eigenvalue $\lambda$, then every nonzero scalar multiple of $v$ is also an eigenvector with the same eigenvalue.

Indeed, suppose

$$Av = \lambda v.$$

Let $c$ be a nonzero scalar. Then

$$A(cv) = c(Av) = c(\lambda v) = \lambda(cv).$$

Thus $cv$ is also an eigenvector.

Eigenvectors therefore describe directions, not individual arrows. The vectors

$$v, \quad 2v, \quad -v, \quad \tfrac{1}{2}v$$

all lie on the same line through the origin. If one of them is an eigenvector, all nonzero vectors on that line are eigenvectors with the same eigenvalue.
60.3 The Zero Vector Is Excluded
The zero vector is not an eigenvector.
This exclusion is part of the definition. The reason is simple. For every scalar $\lambda$,

$$A\mathbf{0} = \mathbf{0} = \lambda\mathbf{0}.$$
If zero were allowed, every scalar would appear to be an eigenvalue for every matrix. The definition would lose its meaning.
Eigenvectors must therefore be nonzero.
60.4 Eigenspaces
For a fixed eigenvalue $\lambda$, all eigenvectors with eigenvalue $\lambda$, together with the zero vector, form a subspace.

This subspace is called the eigenspace of $A$ corresponding to $\lambda$.

It is written as

$$E_\lambda = \{\, v : Av = \lambda v \,\}.$$

Equivalently,

$$E_\lambda = \ker(A - \lambda I).$$

This identity is central. It says that finding eigenvectors is the same as finding the null space of $A - \lambda I$. Eigenspaces are commonly described as kernels of $A - \lambda I$.
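Membership in an eigenspace reduces to a null-space test. A minimal numerical sketch, assuming NumPy and using the matrix $\begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ with eigenvalue $1$ purely as an illustration:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam = 1.0

# A candidate eigenvector for eigenvalue 1
v = np.array([1.0, -1.0])

# v lies in the eigenspace exactly when (A - lam*I) v = 0,
# i.e. when v belongs to the null space of A - lam*I
residual = (A - lam * np.eye(2)) @ v
in_eigenspace = np.allclose(residual, 0)
print(in_eigenspace)  # True
```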
60.5 Computing an Eigenvector
To compute eigenvectors, first find an eigenvalue $\lambda$. Then solve

$$(A - \lambda I)v = 0.$$
The nonzero solutions are the eigenvectors.
Example
Let

$$A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}.$$

The characteristic polynomial is

$$\det(A - \lambda I) = \det\begin{pmatrix} 2-\lambda & 1 \\ 1 & 2-\lambda \end{pmatrix}.$$

Compute:

$$(2-\lambda)^2 - 1 = \lambda^2 - 4\lambda + 3.$$

Thus

$$\det(A - \lambda I) = \lambda^2 - 4\lambda + 3.$$

Factor:

$$\lambda^2 - 4\lambda + 3 = (\lambda - 1)(\lambda - 3).$$

The eigenvalues are

$$\lambda_1 = 1, \qquad \lambda_2 = 3.$$

Now find the eigenvectors for $\lambda_1 = 1$.

Solve

$$(A - I)v = 0.$$

The equation is

$$\begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}.$$

Hence

$$x + y = 0, \qquad y = -x.$$

So the eigenspace for $\lambda_1 = 1$ is

$$E_1 = \operatorname{span}\left\{ \begin{pmatrix} 1 \\ -1 \end{pmatrix} \right\}.$$

Now find the eigenvectors for $\lambda_2 = 3$.

Solve

$$(A - 3I)v = 0.$$

The equation is

$$\begin{pmatrix} -1 & 1 \\ 1 & -1 \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}.$$

Hence

$$y = x.$$

So

$$E_3 = \operatorname{span}\left\{ \begin{pmatrix} 1 \\ 1 \end{pmatrix} \right\}.$$
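As a cross-check on a hand computation like this, `numpy.linalg.eig` returns eigenvalues and eigenvectors at once. A sketch assuming NumPy, using the symmetric matrix $\begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ whose eigenvalues are $1$ and $3$; note that `eig` returns unit-length eigenvectors, which are scalar multiples of the ones found by hand:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# Eigenvalues come back in no particular order; sort before comparing
sorted_vals = np.sort(eigenvalues)
print(sorted_vals)  # [1. 3.]

# Each column of `eigenvectors` satisfies A v = lambda v
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)
```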
60.6 Checking an Eigenvector
A proposed eigenvector should be checked directly.
For

$$A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix},$$

check

$$v_1 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}.$$

Then

$$Av_1 = \begin{pmatrix} 2 - 1 \\ 1 - 2 \end{pmatrix} = \begin{pmatrix} 1 \\ -1 \end{pmatrix} = 1 \cdot v_1.$$

Thus $v_1$ is an eigenvector with eigenvalue $1$.

Now check

$$v_2 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}.$$

Then

$$Av_2 = \begin{pmatrix} 2 + 1 \\ 1 + 2 \end{pmatrix} = \begin{pmatrix} 3 \\ 3 \end{pmatrix} = 3v_2.$$

Thus $v_2$ is an eigenvector with eigenvalue $3$.
60.7 Eigenvectors and Linear Independence
Eigenvectors belonging to distinct eigenvalues are linearly independent.
For example, if $v_1$ and $v_2$ are eigenvectors with distinct eigenvalues $\lambda_1$ and $\lambda_2$, then $v_1$ and $v_2$ cannot lie on the same line.

More generally, if

$$v_1, v_2, \ldots, v_k$$

are eigenvectors corresponding to distinct eigenvalues

$$\lambda_1, \lambda_2, \ldots, \lambda_k,$$

then the list

$$v_1, v_2, \ldots, v_k$$

is linearly independent.
This theorem is one of the main reasons eigenvectors are useful. A matrix with enough independent eigenvectors can be described in a simpler coordinate system.
60.8 Proof for Two Eigenvectors
Let $v_1$ and $v_2$ be eigenvectors of $A$ with distinct eigenvalues $\lambda_1$ and $\lambda_2$.

Suppose

$$c_1 v_1 + c_2 v_2 = 0.$$

Apply $A$ to both sides:

$$A(c_1 v_1 + c_2 v_2) = 0.$$

Using linearity,

$$c_1 A v_1 + c_2 A v_2 = 0.$$

Since $Av_1 = \lambda_1 v_1$ and $Av_2 = \lambda_2 v_2$,

$$c_1 \lambda_1 v_1 + c_2 \lambda_2 v_2 = 0.$$

Now multiply the original equation by $\lambda_1$:

$$c_1 \lambda_1 v_1 + c_2 \lambda_1 v_2 = 0.$$

Subtract this equation from the previous one:

$$c_2 (\lambda_2 - \lambda_1) v_2 = 0.$$

Since $\lambda_1 \neq \lambda_2$ and $v_2 \neq 0$, we must have

$$c_2 = 0.$$

Then the original equation gives

$$c_1 v_1 = 0.$$

Since $v_1 \neq 0$, we also have

$$c_1 = 0.$$

Thus $v_1$ and $v_2$ are linearly independent.
60.9 Eigenvectors as a Basis
If an $n \times n$ matrix has $n$ linearly independent eigenvectors, then those eigenvectors form a basis of the space.

In that basis, the matrix acts diagonally.

Suppose

$$A v_i = \lambda_i v_i$$

for

$$i = 1, 2, \ldots, n.$$

Any vector $x$ can be written as

$$x = c_1 v_1 + c_2 v_2 + \cdots + c_n v_n.$$

Then

$$Ax = A(c_1 v_1 + c_2 v_2 + \cdots + c_n v_n).$$

By linearity,

$$Ax = c_1 A v_1 + c_2 A v_2 + \cdots + c_n A v_n.$$

Using the eigenvector equations,

$$Ax = c_1 \lambda_1 v_1 + c_2 \lambda_2 v_2 + \cdots + c_n \lambda_n v_n.$$

Thus, in an eigenvector basis, the transformation simply rescales each coordinate.
This is the idea behind diagonalization.
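The coordinate-rescaling picture can be verified numerically. A sketch assuming NumPy, using the matrix $\begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ with eigenvectors $(1,-1)$ and $(1,1)$ (eigenvalues $1$ and $3$) as an illustration:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
v1 = np.array([1.0, -1.0])   # eigenvalue 1
v2 = np.array([1.0, 1.0])    # eigenvalue 3

# Expand an arbitrary vector x in the eigenvector basis: x = c1 v1 + c2 v2
x = np.array([5.0, -1.0])
c1, c2 = np.linalg.solve(np.column_stack([v1, v2]), x)

# Applying A just rescales each eigen-coordinate by its eigenvalue
rescaled = c1 * 1.0 * v1 + c2 * 3.0 * v2
acts_diagonally = np.allclose(A @ x, rescaled)
print(acts_diagonally)  # True
```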
60.10 Eigenvectors and Diagonalization
Let

$$P = \begin{pmatrix} v_1 & v_2 & \cdots & v_n \end{pmatrix}$$

be the matrix whose columns are eigenvectors of $A$.

Let

$$D = \begin{pmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{pmatrix}.$$

Then

$$AP = PD.$$

If the eigenvectors are linearly independent, then $P$ is invertible. Hence

$$A = P D P^{-1}.$$

This representation decomposes $A$ into a change of basis, a diagonal scaling, and a change back to the original coordinates.
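The factorization can be checked by rebuilding a matrix from its eigenvector and eigenvalue matrices. A sketch assuming NumPy, using the symmetric matrix $\begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$, whose eigenvalues are $1$ and $3$, as an illustration:

```python
import numpy as np

# Eigenvectors as columns of P, eigenvalues on the diagonal of D
P = np.array([[1.0, 1.0],
              [-1.0, 1.0]])
D = np.diag([1.0, 3.0])

# Reassemble A = P D P^{-1}
A = P @ D @ np.linalg.inv(P)
reconstructed_ok = np.allclose(A, [[2.0, 1.0], [1.0, 2.0]])
print(reconstructed_ok)  # True
```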
60.11 Repeated Eigenvalues
A repeated eigenvalue may have more than one independent eigenvector, or it may have only one.
Consider

$$A = \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix}.$$

Every nonzero vector is an eigenvector with eigenvalue $2$, since

$$Av = 2v \quad \text{for all } v.$$

The eigenspace is all of $\mathbb{R}^2$.

Now consider

$$B = \begin{pmatrix} 2 & 1 \\ 0 & 2 \end{pmatrix}.$$

Again, the only eigenvalue is $2$. But

$$B - 2I = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}.$$

Solving

$$(B - 2I)v = 0$$

gives

$$y = 0.$$

Thus the eigenspace is only

$$E_2 = \operatorname{span}\left\{ \begin{pmatrix} 1 \\ 0 \end{pmatrix} \right\}.$$
Both matrices have the same repeated eigenvalue. Their eigenvectors behave differently.
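The difference shows up numerically as the rank of $M - 2I$. A sketch assuming NumPy, comparing the scaling matrix $\operatorname{diag}(2,2)$ with the shear-like matrix $\begin{pmatrix} 2 & 1 \\ 0 & 2 \end{pmatrix}$:

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 2.0]])   # eigenspace for 2 is all of R^2
B = np.array([[2.0, 1.0],
              [0.0, 2.0]])   # eigenspace for 2 is a single line

# By rank-nullity in R^2: dim of eigenspace = 2 - rank(M - 2I)
dim_A = 2 - np.linalg.matrix_rank(A - 2 * np.eye(2))
dim_B = 2 - np.linalg.matrix_rank(B - 2 * np.eye(2))
print(dim_A, dim_B)  # 2 1
```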
60.12 Defective Matrices
A matrix is called defective if it does not have enough linearly independent eigenvectors to form a basis.
The matrix

$$B = \begin{pmatrix} 2 & 1 \\ 0 & 2 \end{pmatrix}$$

is defective. It is a $2 \times 2$ matrix, but it has only one independent eigenvector.
Defective matrices cannot be diagonalized. They require a more general form, such as Jordan canonical form.
Defectiveness occurs when the geometric multiplicity of an eigenvalue (the dimension of its eigenspace) falls short of its algebraic multiplicity (its multiplicity as a root of the characteristic polynomial).
60.13 Symmetric Matrices
Real symmetric matrices have especially good eigenvector behavior.
If

$$A = A^{\mathsf T},$$

then eigenvectors corresponding to distinct eigenvalues are orthogonal.
Moreover, a real symmetric matrix has an orthonormal basis of eigenvectors.
This is the content of the spectral theorem for real symmetric matrices.
For example,

$$A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$$

has eigenvectors

$$v_1 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}, \qquad v_2 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}.$$

Their dot product is

$$v_1 \cdot v_2 = (1)(1) + (-1)(1) = 0.$$

They are orthogonal.
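For symmetric matrices, `numpy.linalg.eigh` returns an orthonormal set of eigenvectors directly. A sketch assuming NumPy, using the symmetric matrix $\begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is specialized for symmetric (Hermitian) matrices
eigenvalues, Q = np.linalg.eigh(A)

# The columns of Q are orthonormal: Q^T Q = I
orthonormal = np.allclose(Q.T @ Q, np.eye(2))
print(orthonormal)  # True
```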
60.14 Left and Right Eigenvectors
For a matrix $A$, a right eigenvector satisfies

$$Av = \lambda v.$$

A left eigenvector satisfies

$$w^{\mathsf T} A = \lambda w^{\mathsf T}.$$

Equivalently,

$$A^{\mathsf T} w = \lambda w.$$

For symmetric matrices, left and right eigenvectors coincide. For general matrices, they may differ.
Left eigenvectors are important in Markov chains, sensitivity analysis, nonnormal matrices, and numerical algorithms.
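Left eigenvectors can be computed as the right eigenvectors of $A^{\mathsf T}$. A sketch assuming NumPy, using the deliberately nonsymmetric triangular matrix $\begin{pmatrix} 2 & 1 \\ 0 & 3 \end{pmatrix}$ as an illustration:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])   # nonsymmetric; eigenvalues 2 and 3

# Right eigenvector for eigenvalue 2
v = np.array([1.0, 0.0])
right_ok = np.allclose(A @ v, 2 * v)

# Left eigenvector for eigenvalue 2: w^T A = 2 w^T, i.e. A^T w = 2 w
w = np.array([1.0, -1.0])
left_ok = np.allclose(A.T @ w, 2 * w)

# For this nonsymmetric matrix, the left and right eigenvectors differ
print(right_ok, left_ok)  # True True
```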
60.15 Normalizing Eigenvectors
Because any nonzero scalar multiple of an eigenvector is also an eigenvector, it is often useful to choose a standard length.
For real vectors, one common normalization is

$$\|v\| = 1.$$

If

$$v \neq 0,$$

then the normalized eigenvector is

$$u = \frac{v}{\|v\|}.$$

For example,

$$v = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$$

has norm

$$\|v\| = \sqrt{1^2 + 1^2} = \sqrt{2}.$$

The normalized vector is

$$u = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 \\ 1 \end{pmatrix}.$$

Both $v$ and $u$ point in the same direction. If $v$ is an eigenvector, then $u$ is also an eigenvector.
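Normalization is a one-liner in code. A sketch assuming NumPy, using the vector $(1, 1)$:

```python
import numpy as np

v = np.array([1.0, 1.0])

# Divide by the Euclidean norm to get a unit-length vector
u = v / np.linalg.norm(v)

unit_length = np.isclose(np.linalg.norm(u), 1.0)
same_direction = np.allclose(u, v / np.sqrt(2))
print(unit_length, same_direction)  # True True
```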
60.16 Complex Eigenvectors
When a real matrix has complex eigenvalues, its eigenvectors are usually complex.
Consider

$$A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}.$$

This matrix rotates the plane by $90^\circ$. Its eigenvalues are

$$\lambda = \pm i.$$

For $\lambda = i$, solve

$$(A - iI)v = 0.$$

That is,

$$\begin{pmatrix} -i & -1 \\ 1 & -i \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}.$$

The first equation gives

$$-ix - y = 0,$$

so

$$y = -ix.$$

Taking $x = 1$, one eigenvector is

$$v = \begin{pmatrix} 1 \\ -i \end{pmatrix}.$$

Although the matrix has real entries, its eigenvectors belong to $\mathbb{C}^2$.
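NumPy handles the complex case transparently: calling `eig` on a real rotation matrix returns complex eigenvalues and eigenvectors. A sketch assuming NumPy, using the $90^\circ$ rotation matrix:

```python
import numpy as np

# Rotation of the plane by 90 degrees
A = np.array([[0.0, -1.0],
              [1.0, 0.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# The eigenvalues are +i and -i: zero real parts, imaginary parts -1 and 1
has_i = (np.allclose(eigenvalues.real, 0)
         and np.allclose(np.sort(eigenvalues.imag), [-1.0, 1.0]))

# Each (complex) eigenvector still satisfies A v = lambda v
all_check = all(np.allclose(A @ v, lam * v)
                for lam, v in zip(eigenvalues, eigenvectors.T))
print(has_i, all_check)  # True True
```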
60.17 Eigenvectors in Applications
Eigenvectors identify stable directions, dominant modes, and preferred coordinate systems.
| Area | Meaning of eigenvectors |
|---|---|
| Differential equations | Modes of exponential growth or decay |
| Mechanics | Modes of vibration |
| Markov chains | Long-term distributions and transient modes |
| Graph theory | Structural directions of graphs |
| Statistics | Principal component directions |
| Machine learning | Low-dimensional feature directions |
| Quantum mechanics | States with definite measured values |
| Numerical analysis | Directions controlling convergence |
In many applications, the eigenvalues give scale or frequency, while the eigenvectors give shape or direction.
60.18 Summary
An eigenvector is a nonzero vector whose direction is preserved by a linear transformation.
For a matrix $A$, an eigenvector satisfies

$$Av = \lambda v, \qquad v \neq 0.$$

The scalar $\lambda$ is the corresponding eigenvalue.

For each eigenvalue $\lambda$, the eigenspace is

$$E_\lambda = \ker(A - \lambda I).$$
Eigenvectors belonging to distinct eigenvalues are linearly independent. If a matrix has enough independent eigenvectors to form a basis, then it can be diagonalized.
Eigenvectors reveal the directions in which a linear transformation acts in the simplest possible way.