Chapter 74. Perron-Frobenius Theory

Perron-Frobenius theory studies eigenvalues and eigenvectors of matrices whose entries are nonnegative.

Most spectral theory allows arbitrary signs and complex entries. Perron-Frobenius theory is different. It uses the order structure of the nonnegative cone:

$$\mathbb{R}_{\geq 0}^n = \{x\in\mathbb{R}^n : x_i\geq 0\}.$$

If a matrix has nonnegative entries, then it maps nonnegative vectors to nonnegative vectors. This simple fact produces strong conclusions. Under suitable hypotheses, such a matrix has a distinguished eigenvalue, a positive eigenvector, and a spectral structure controlled by the geometry of a directed graph. This distinguished eigenvalue is called the Perron root or Perron-Frobenius eigenvalue.

74.1 Nonnegative and Positive Matrices

A real matrix $A=[a_{ij}]$ is nonnegative if

$$a_{ij}\geq 0$$

for all $i,j$. We write

$$A\geq 0.$$

A real matrix is positive if

$$a_{ij}>0$$

for all $i,j$. We write

$$A>0.$$

The distinction matters. A positive matrix has every entry strictly positive. A nonnegative matrix may contain zero entries.

For example,

$$A= \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}$$

is positive, while

$$B= \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}$$

is nonnegative but not positive.

74.2 The Nonnegative Cone

The nonnegative cone is the set

$$\mathbb{R}_{\geq 0}^n = \{x : x_i\geq 0\text{ for all }i\}.$$

If

$$A\geq 0$$

and

$$x\geq 0,$$

then

$$Ax\geq 0.$$

Thus nonnegative matrices preserve the nonnegative cone.

If

$$A>0$$

and

$$x\geq 0,\quad x\neq 0,$$

then

$$Ax>0.$$

A positive matrix sends every nonzero nonnegative vector into the interior of the cone.

This order-preserving property is the source of Perron-Frobenius theory.

74.3 Spectral Radius

The spectral radius of a square matrix $A$ is

$$\rho(A)=\max\{|\lambda| : \lambda\text{ is an eigenvalue of }A\}.$$

It is the largest absolute value among the eigenvalues.

Perron-Frobenius theory says that, for nonnegative matrices, the spectral radius is not merely a number. Under suitable conditions, it is itself an eigenvalue with a nonnegative or positive eigenvector.

This is special. For a general real matrix, the eigenvalue of largest modulus may be complex and may have no real eigenvector.

74.4 Perron Theorem for Positive Matrices

Let $A$ be a positive square matrix:

$$A>0.$$

Then:

| Property | Statement |
| --- | --- |
| Positive eigenvalue | $\rho(A)>0$ is an eigenvalue of $A$ |
| Positive eigenvector | There exists $v>0$ such that $Av=\rho(A)v$ |
| Simplicity | $\rho(A)$ is a simple eigenvalue |
| Dominance | Every other eigenvalue $\lambda$ satisfies $|\lambda|<\rho(A)$ |
| Uniqueness in cone | Every nonnegative eigenvector is a scalar multiple of $v$ |

This is the classical Perron theorem. It applies to strictly positive matrices. Frobenius extended the theory to broader classes of nonnegative matrices.

74.5 Example of a Positive Matrix

Let

$$A= \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}.$$

This matrix is positive.

Its eigenvalues are

$$3 \qquad \text{and} \qquad 1.$$

The spectral radius is

$$\rho(A)=3.$$

A corresponding eigenvector is

$$v= \begin{bmatrix} 1\\ 1 \end{bmatrix}.$$

This vector is positive. The other eigenvalue has smaller modulus:

$$|1|<3.$$

Thus the Perron root is the unique dominant eigenvalue.
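The computation above can be verified directly. A minimal pure-Python check (matrix and vectors hard-coded for this example):

```python
# Verify the Perron data of A = [[2, 1], [1, 2]] by direct arithmetic.
A = [[2, 1], [1, 2]]
v = [1, 1]  # candidate Perron eigenvector

# A v should equal 3 v, so the Perron root is rho(A) = 3.
Av = [A[0][0]*v[0] + A[0][1]*v[1],
      A[1][0]*v[0] + A[1][1]*v[1]]
assert Av == [3, 3]

# The other eigenvector (1, -1) has eigenvalue 1, strictly smaller in modulus.
u = [1, -1]
Au = [A[0][0]*u[0] + A[0][1]*u[1],
      A[1][0]*u[0] + A[1][1]*u[1]]
assert Au == [1, -1]
```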

74.6 Nonnegative Matrices

For a general nonnegative matrix, the conclusions are weaker.

Let

$$A= \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}.$$

This matrix is nonnegative. Its eigenvalues are

$$1 \qquad \text{and} \qquad -1.$$

The spectral radius is

$$\rho(A)=1.$$

The vector

$$\begin{bmatrix} 1\\ 1 \end{bmatrix}$$

is a positive eigenvector for eigenvalue $1$. However, there is another eigenvalue with the same modulus:

$$|-1|=1.$$

Thus dominance fails.

This example shows why additional conditions are needed.

74.7 Reducibility

A nonnegative matrix can be studied through its directed graph.

Given an $n\times n$ nonnegative matrix $A$, define a directed graph $G_A$ with vertices

$$1,2,\ldots,n.$$

There is a directed edge

$$i\to j$$

when

$$a_{ji}>0$$

if the matrix acts on column vectors. Some authors use the transposed convention. The choice is not important as long as it is used consistently.

The matrix is irreducible if its directed graph is strongly connected. This means that for every pair of vertices $i,j$, there is a directed path from $i$ to $j$. The graph characterization of irreducibility is a standard formulation in Perron-Frobenius theory.

A matrix that is not irreducible is called reducible.
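Irreducibility can be tested mechanically: $A$ is irreducible exactly when $(I+A)^{n-1}$ has no zero entries, since the powers of $I+A$ record walks of length up to $n-1$ in $G_A$. A small pure-Python sketch using boolean reachability:

```python
def is_irreducible(A):
    """Irreducibility test for a nonnegative n x n matrix:
    A is irreducible iff (I + A)^(n-1) has no zero entries,
    i.e. every vertex of G_A reaches every other vertex."""
    n = len(A)
    # Boolean version of I + A: an edge, or staying in place.
    base = [[(A[i][j] > 0) or (i == j) for j in range(n)] for i in range(n)]
    reach = base
    for _ in range(n - 1):  # boolean matrix products: reach = (I + A)^k
        reach = [[any(reach[i][k] and base[k][j] for k in range(n))
                  for j in range(n)] for i in range(n)]
    return all(all(entry for entry in row) for row in reach)

assert is_irreducible([[0, 1], [1, 0]])       # the 2-cycle: strongly connected
assert not is_irreducible([[1, 1], [0, 1]])   # upper triangular: reducible
```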

74.8 Meaning of Irreducibility

Irreducibility means that no part of the system is isolated from the rest.

If $A$ is reducible, then after a suitable permutation of coordinates, it can be placed into block upper triangular form:

$$P^{-1}AP= \begin{bmatrix} B & C \\ 0 & D \end{bmatrix}.$$

This means that one group of coordinates evolves without receiving influence from another group.

If $A$ is irreducible, no such nontrivial decomposition exists. Every coordinate can eventually influence every other coordinate through powers of $A$.

74.9 Perron-Frobenius Theorem for Irreducible Nonnegative Matrices

Let $A\geq 0$ be irreducible. Let

$$r=\rho(A).$$

Then:

| Property | Statement |
| --- | --- |
| Positive Perron root | $r>0$ and $r$ is an eigenvalue |
| Positive eigenvectors | There exist right and left eigenvectors $v>0$, $w>0$ with $Av=rv$, $A^Tw=rw$ |
| Simplicity | $r$ is a simple root of the characteristic polynomial |
| Cone uniqueness | The only eigenvectors with all positive entries correspond to $r$ |
| Peripheral spectrum | Other eigenvalues may satisfy $|\lambda|=r$ |

For irreducible nonnegative matrices, the Perron root is still positive and simple, but it may not be the only eigenvalue on the spectral circle.

74.10 Left and Right Perron Eigenvectors

The right Perron eigenvector satisfies

$$Av=rv.$$

The left Perron eigenvector satisfies

$$w^TA=rw^T.$$

Equivalently,

$$A^Tw=rw.$$

Both can be chosen positive when $A$ is irreducible.

It is common to normalize them by

$$w^Tv=1.$$

Then the rank-one matrix

$$vw^T$$

is the spectral projection associated with the Perron root in the primitive case.

74.11 Primitive Matrices

A nonnegative matrix $A$ is primitive if some power of it is positive:

$$A^m>0$$

for some positive integer $m$.

Every positive matrix is primitive. Some matrices with zero entries are also primitive.

For example,

$$A= \begin{bmatrix} 1 & 1 \\ 1 & 0 \end{bmatrix}$$

is nonnegative and primitive, since

$$A^2= \begin{bmatrix} 2 & 1 \\ 1 & 1 \end{bmatrix}>0.$$

Primitive matrices behave like positive matrices after enough iterations.
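This can be confirmed by a direct computation, here in plain Python:

```python
# Check primitivity of A = [[1, 1], [1, 0]] by squaring it.
A = [[1, 1], [1, 0]]

def matmul(X, Y):
    """Product of two square matrices given as lists of rows."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A2 = matmul(A, A)
assert A2 == [[2, 1], [1, 1]]                  # matches the computation above
assert all(e > 0 for row in A2 for e in row)   # A^2 > 0, so A is primitive
```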

74.12 Perron-Frobenius Theorem for Primitive Matrices

Let $A\geq 0$ be primitive. Let

$$r=\rho(A).$$

Then:

| Property | Statement |
| --- | --- |
| Perron root | $r>0$ is an eigenvalue |
| Positive eigenvectors | There are $v>0$, $w>0$ such that $Av=rv$, $A^Tw=rw$ |
| Simplicity | $r$ is algebraically simple |
| Strict dominance | Every other eigenvalue $\lambda$ satisfies $|\lambda|<r$ |
| Limit theorem | $A^k/r^k$ converges to $vw^T$, after normalization $w^Tv=1$ |

Thus primitive matrices recover the strong dominance conclusion of positive matrices. In Perron-Frobenius theory, primitive matrices are exactly the irreducible aperiodic nonnegative matrices.

74.13 Period

For an irreducible nonnegative matrix, the period is a positive integer that measures cyclic behavior.

In graph terms, the period is the greatest common divisor of the lengths of all directed cycles in the associated strongly connected graph.

If the period is

$$h=1,$$

the matrix is aperiodic and primitive.

If

$$h>1,$$

the matrix is imprimitive. Then there are several eigenvalues on the spectral circle.

For an irreducible nonnegative matrix with period $h$, the eigenvalues satisfying

$$|\lambda|=\rho(A)$$

are

$$\rho(A)\zeta,$$

where $\zeta$ ranges over the $h$-th roots of unity.

74.14 Example of Period Two

Let

$$A= \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}.$$

The graph has two vertices and two directed edges:

$$1\to 2, \qquad 2\to 1.$$

All closed walks have even length. The period is

$$h=2.$$

The eigenvalues are

$$1 \qquad \text{and} \qquad -1.$$

Both lie on the spectral circle:

$$|\lambda|=1.$$

The matrix is irreducible but not primitive.

74.15 Long-Term Behavior in the Primitive Case

Let $A\geq 0$ be primitive. Let

$$r=\rho(A).$$

Choose positive right and left Perron eigenvectors $v,w$ such that

$$w^Tv=1.$$

Then

$$\lim_{k\to\infty}\frac{A^k}{r^k}=vw^T.$$

This means that, after scaling by the dominant growth rate, the powers of $A$ converge to a rank-one matrix.

For any vector $x$,

$$\frac{A^kx}{r^k}\to v(w^Tx).$$

Thus the long-term direction is controlled by the Perron eigenvector $v$, while the coefficient $w^Tx$ is determined by the left eigenvector $w$. This convergence statement is a standard primitive-matrix form of Perron-Frobenius theory.
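For the earlier example $A=\begin{bmatrix}2&1\\1&2\end{bmatrix}$ the limit can be observed numerically: $r=3$, $v=(1,1)^T$, and $w=(1/2,1/2)^T$ gives $w^Tv=1$, so every entry of $vw^T$ is $1/2$. A plain-Python sketch:

```python
# Convergence of A^k / r^k to the rank-one projection v w^T
# for A = [[2, 1], [1, 2]], r = 3, v = (1, 1), w = (1/2, 1/2).
A = [[2.0, 1.0], [1.0, 2.0]]
r = 3.0

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

P = [[1.0, 0.0], [0.0, 1.0]]
for _ in range(30):
    P = matmul(P, A)
    P = [[e / r for e in row] for row in P]   # keep A^k / r^k at each step

# The other eigenvalue contributes only (1/3)^k, which is negligible by now.
target = [[0.5, 0.5], [0.5, 0.5]]
assert all(abs(P[i][j] - target[i][j]) < 1e-12
           for i in range(2) for j in range(2))
```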

74.16 Markov Chains

Perron-Frobenius theory is closely related to finite Markov chains.

A column-stochastic matrix has nonnegative entries and columns summing to $1$:

$$\sum_i p_{ij}=1.$$

It maps probability vectors to probability vectors.

For such a matrix, $1$ is an eigenvalue. If the Markov chain is irreducible and aperiodic, then the transition matrix is primitive in the appropriate convention.

The Perron eigenvector gives the stationary distribution:

$$P\pi=\pi.$$

The convergence theorem says that, under irreducibility and aperiodicity, repeated transitions converge to the stationary distribution.
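A small numerical illustration (the two-state transition matrix below is a hypothetical example):

```python
# Iterating a column-stochastic matrix toward its stationary distribution.
P = [[0.9, 0.5],
     [0.1, 0.5]]   # each column sums to 1

pi = [1.0, 0.0]    # start from a point mass on state 1
for _ in range(100):
    pi = [P[0][0]*pi[0] + P[0][1]*pi[1],
          P[1][0]*pi[0] + P[1][1]*pi[1]]

# Exact stationary distribution: pi = (5/6, 1/6),
# balancing the flows 0.1 * (5/6) = 0.5 * (1/6).
assert abs(pi[0] - 5/6) < 1e-10 and abs(pi[1] - 1/6) < 1e-10
assert abs(pi[0] + pi[1] - 1.0) < 1e-12   # total probability is preserved
```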

74.17 PageRank and Network Centrality

Perron-Frobenius theory also explains eigenvector centrality.

In a directed network, let $A$ be a nonnegative adjacency matrix. A centrality vector $x$ may be defined by

$$Ax=\lambda x.$$

A node is important if it receives weight from important nodes.

When the matrix satisfies the right positivity or primitivity conditions, the Perron-Frobenius theorem guarantees a positive dominant eigenvector. This gives a mathematically stable centrality score.

PageRank modifies a link matrix by adding a positive teleportation component. This makes the matrix primitive, ensuring a unique positive stationary distribution.
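A minimal sketch of this construction, with a hypothetical three-page link graph and the common damping factor $d=0.85$:

```python
# PageRank sketch: S[i][j] = probability of following a link from page j to i.
S = [[0.0, 0.5, 1.0],
     [0.5, 0.0, 0.0],
     [0.5, 0.5, 0.0]]   # column-stochastic link matrix
n, d = 3, 0.85

# Teleportation makes every entry of the effective matrix positive,
# so the matrix is primitive and the stationary vector is unique and positive.
G = [[d * S[i][j] + (1 - d) / n for j in range(n)] for i in range(n)]

rank = [1.0 / n] * n
for _ in range(200):
    rank = [sum(G[i][j] * rank[j] for j in range(n)) for i in range(n)]

assert all(r > 0 for r in rank)        # positive Perron eigenvector
assert abs(sum(rank) - 1.0) < 1e-9     # still a probability vector
```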

74.18 Population Models

In population dynamics, a nonnegative matrix may describe transitions between age classes or stages.

If

$$x_{k+1}=Ax_k,$$

then $x_k$ records the population distribution at time $k$.

If $A$ is primitive, then

$$x_k\approx r^k c v$$

for large $k$, where:

| Symbol | Meaning |
| --- | --- |
| $r$ | asymptotic growth factor |
| $v$ | stable population distribution |
| $c$ | coefficient depending on the initial population |

If

$$r>1,$$

the population grows asymptotically.

If

$$r<1,$$

the population decays asymptotically.

If

$$r=1,$$

the population stabilizes in scale.
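A toy two-stage model illustrates this growth behavior (the matrix below is a hypothetical example: stage-1 individuals reproduce, and half of them survive into stage 2):

```python
# Two-stage population model x_{k+1} = A x_k.
A = [[1.0, 2.0],
     [0.5, 0.0]]

x = [1.0, 0.0]
growth = 0.0
for _ in range(60):
    x_new = [A[0][0]*x[0] + A[0][1]*x[1],
             A[1][0]*x[0] + A[1][1]*x[1]]
    growth = (x_new[0] + x_new[1]) / (x[0] + x[1])  # one-step growth factor
    x = x_new

# The characteristic polynomial is r^2 - r - 1, so the Perron root
# is the golden ratio, and r > 1 means asymptotic growth.
r = (1 + 5 ** 0.5) / 2
assert abs(growth - r) < 1e-9
assert r > 1
```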

74.19 Collatz-Wielandt Formula

For a positive matrix $A$, the Perron root can be characterized by variational formulas.

One form is

$$\rho(A) = \min_{x>0} \max_i \frac{(Ax)_i}{x_i}.$$

Another form is

$$\rho(A) = \max_{x>0} \min_i \frac{(Ax)_i}{x_i}.$$

These are called Collatz-Wielandt formulas.

They show that the Perron root is determined by how $A$ stretches positive vectors componentwise. They are useful in analysis, comparison arguments, and numerical algorithms.
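These formulas are easy to test numerically for the running example $A=\begin{bmatrix}2&1\\1&2\end{bmatrix}$, whose Perron root is $3$:

```python
# Collatz-Wielandt bounds for A = [[2, 1], [1, 2]].
A = [[2.0, 1.0], [1.0, 2.0]]

def cw_bounds(x):
    """min_i and max_i of (Ax)_i / x_i for a positive test vector x."""
    Ax = [A[0][0]*x[0] + A[0][1]*x[1],
          A[1][0]*x[0] + A[1][1]*x[1]]
    ratios = [Ax[i] / x[i] for i in range(2)]
    return min(ratios), max(ratios)

# Any positive x brackets the Perron root: min ratio <= rho(A) <= max ratio.
lo, hi = cw_bounds([1.0, 2.0])
assert lo <= 3.0 <= hi

# At the Perron eigenvector the two bounds collapse to rho(A) itself.
lo, hi = cw_bounds([1.0, 1.0])
assert lo == hi == 3.0
```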

74.20 Monotonicity

Perron roots are monotone for nonnegative matrices.

If

$$0\leq A\leq B$$

entrywise, then

$$\rho(A)\leq \rho(B).$$

If

$$0\leq A<B$$

and $B$ is irreducible, then the inequality is strict:

$$\rho(A)<\rho(B).$$

This monotonicity is another order-theoretic feature absent from general matrices. It says that increasing entries of a nonnegative irreducible system increases its dominant growth rate.
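A quick numerical check, estimating each Perron root by power iteration (a sketch; the $2\times 2$ matrices are chosen so the exact roots are $2$ and $3$):

```python
# Monotonicity: 0 <= A <= B entrywise implies rho(A) <= rho(B).
def perron_root(M, iters=200):
    """Estimate rho(M) for a primitive nonnegative 2x2 matrix by power iteration."""
    x = [1.0, 1.0]
    r = 0.0
    for _ in range(iters):
        y = [M[0][0]*x[0] + M[0][1]*x[1],
             M[1][0]*x[0] + M[1][1]*x[1]]
        r = max(y)                  # growth of the largest component
        x = [y[0] / r, y[1] / r]
    return r

A = [[1.0, 1.0], [1.0, 1.0]]        # rho(A) = 2
B = [[2.0, 1.0], [1.0, 2.0]]        # rho(B) = 3; A <= B entrywise, B irreducible
rA, rB = perron_root(A), perron_root(B)
assert abs(rA - 2.0) < 1e-9 and abs(rB - 3.0) < 1e-9
assert rA < rB                      # strict, as the theorem predicts
```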

74.21 Power Iteration

Perron-Frobenius theory gives a foundation for power iteration.

Given a primitive nonnegative matrix $A$, start with a vector

$$x_0>0.$$

Define

$$x_{k+1}=\frac{Ax_k}{\|Ax_k\|}.$$

Then $x_k$ tends toward the normalized Perron eigenvector.

This works because the Perron root strictly dominates all other eigenvalues in modulus.

Power iteration is simple and widely used, but convergence speed depends on the spectral gap:

$$\frac{|\lambda_2|}{\rho(A)}.$$

The smaller this ratio, the faster the convergence.
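A minimal pure-Python implementation of this scheme, using the sup-norm for scaling; the test matrix is the primitive matrix $\begin{bmatrix}1&1\\1&0\end{bmatrix}$ used earlier, whose Perron root is the golden ratio:

```python
# Power iteration for a primitive nonnegative matrix (sup-norm scaling).
def power_iteration(A, steps=100):
    n = len(A)
    x = [1.0] * n                    # any positive start vector works
    norm = 1.0
    for _ in range(steps):
        y = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        norm = max(abs(e) for e in y)
        x = [e / norm for e in y]    # renormalize each step
    return x, norm                   # eigenvector and Perron-root estimates

A = [[1.0, 1.0], [1.0, 0.0]]         # primitive, since A^2 > 0
v, r = power_iteration(A)
phi = (1 + 5 ** 0.5) / 2             # Perron root of this matrix
assert abs(r - phi) < 1e-9
assert all(e > 0 for e in v)         # the limit vector is positive
```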

74.22 Reducible Case

If $A\geq 0$ is reducible, Perron-Frobenius theory still gives some information, but the structure is less rigid.

The spectral radius is still an eigenvalue of $A$, and there is a nonnegative eigenvector corresponding to it.

However:

| Property | Reducible behavior |
| --- | --- |
| Positive eigenvector | May fail |
| Simplicity | May fail |
| Strict dominance | May fail |
| Uniqueness | May fail |
| Limit behavior | May depend on blocks |

Reducible matrices decompose into communicating classes or block triangular components. The dominant behavior comes from the blocks with largest spectral radius.

74.23 Common Errors

The first common error is to confuse nonnegative with positive. A nonnegative matrix may have zero entries. A positive matrix has all entries strictly positive.

The second common error is to assume every nonnegative matrix has a strictly positive eigenvector. Reducible matrices may only have eigenvectors with zeros.

The third common error is to assume irreducible implies strict spectral dominance. Irreducible matrices may have several eigenvalues on the spectral circle if they are periodic.

The fourth common error is to ignore the period. Periodic matrices produce cyclic behavior rather than convergence of $A^k/r^k$.

The fifth common error is to apply Perron-Frobenius conclusions to matrices with negative entries. The theorem depends on preservation of the nonnegative cone.

74.24 Summary

Perron-Frobenius theory studies spectral properties of nonnegative matrices.

For a positive or primitive matrix $A$, the spectral radius

$$\rho(A)$$

is a positive, simple, strictly dominant eigenvalue. It has positive right and left eigenvectors. After normalization,

$$\frac{A^k}{\rho(A)^k}\to vw^T.$$

For irreducible nonnegative matrices, the Perron root is still positive and simple, and positive eigenvectors still exist, but other eigenvalues may lie on the same spectral circle. Their structure is controlled by the period.

The theory connects linear algebra with directed graphs, Markov chains, population models, network centrality, and positive dynamical systems.