
6.5 Recursion and Induction

Defining objects step by step and proving properties by following the same construction.

Recursion and induction are paired patterns. Recursion defines objects step by step. Induction proves statements about objects built step by step.

The natural numbers give the basic model. Start with 0. Then repeatedly apply the successor operation:

0,\quad 1,\quad 2,\quad 3,\quad \dots

Each number is either the starting value or obtained from an earlier number. This recursive construction supports the principle of induction.

To prove a statement P(n) for all natural numbers n, it is enough to prove two things:

P(0)

and

P(n) \Rightarrow P(n+1).

The first part is the base case. The second part is the induction step. Together, they cover every natural number because every natural number is reached by repeated successor steps.

Recursion defines. Induction verifies.

For example, the factorial function is defined recursively by

0! = 1

and

(n+1)! = (n+1)\,n!.

The value at each stage depends on the previous value. This gives a finite procedure for computing n! for any natural number n.
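The two clauses translate directly into code. A minimal Python sketch (the function name `factorial` is mine, not from the text):

```python
def factorial(n: int) -> int:
    """Compute n! from the recursive definition."""
    if n == 0:                      # base case: 0! = 1
        return 1
    return n * factorial(n - 1)     # step case: (n+1)! = (n+1) * n!
```

Each recursive call strictly decreases n, so the computation always reaches the base case.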

One can then prove properties of factorials by induction. For instance, to show that n! \geq 1 for all n, prove it for 0, then show that if n! \geq 1, then (n+1)! \geq 1.

The same pattern appears beyond numbers. Lists are recursive objects. A list is either empty or formed by adding an element to the front of a smaller list. Trees are recursive objects. A tree may consist of a root together with smaller subtrees. Formal expressions are recursive objects built from variables and operations.

Because these objects are recursively formed, induction is the natural proof method for them.

For lists, one proves a property by checking the empty list and then proving that adding one element preserves the property. For trees, one proves the base case for a leaf and then proves the step for a node assuming the property for its subtrees. For formulas, one proves the result for atomic formulas and then checks each way of building larger formulas.

This is called structural induction. The proof follows the construction of the object.
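Structural recursion mirrors structural induction: a function on trees follows the same formation rules a proof would. A small Python sketch, with trees encoded as nested `(value, left, right)` triples or `None` (this encoding is my own choice for illustration):

```python
def size(tree):
    """Number of nodes in a binary tree, by structural recursion.

    A tree is either None (empty) or a (value, left, right) triple
    whose left and right components are smaller trees.
    """
    if tree is None:                        # base case: empty tree
        return 0
    value, left, right = tree
    return 1 + size(left) + size(right)     # step case: root plus subtrees
```

A structural-induction proof that `size` terminates, or that `size(t) >= 0`, would check the `None` case and then the triple case assuming the property for both subtrees, exactly following this code.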

Recursion is also central in algorithms. Many algorithms solve a problem by reducing it to smaller instances. To sort a list, one may sort smaller lists and merge them. To search a tree, one searches subtrees. To compute powers, one may reduce the exponent step by step.

The proof of correctness usually mirrors the recursion. If the algorithm is correct on smaller inputs and the combining step is correct, then the algorithm is correct on the larger input.
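The sorting example can be sketched concretely. Assuming the standard merge-sort scheme (the function names here are mine), the recursion reduces to smaller lists and the combining step is the merge:

```python
def merge(a, b):
    """Merge two sorted lists into one sorted list."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    return out + a[i:] + b[j:]              # append whichever side remains

def merge_sort(xs):
    """Sort by solving smaller instances, then combining."""
    if len(xs) <= 1:                        # base case: already sorted
        return xs
    mid = len(xs) // 2
    return merge(merge_sort(xs[:mid]),      # smaller instance
                 merge_sort(xs[mid:]))      # smaller instance
```

The correctness proof follows the recursion: lists of length at most 1 are sorted, and if both halves are correctly sorted, `merge` produces a sorted whole.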

For example, a recursive definition of powers is

a^0 = 1

and

a^{n+1} = a^n a.

Induction proves the laws that follow from this definition, such as

a^{m+n} = a^m a^n.

The proof proceeds by induction on n or m, depending on how the recursion is arranged.
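The recursive definition of powers, written as code (a sketch; the name `power` is mine), makes the law easy to spot-check:

```python
def power(a, n):
    """a**n from the definition a^0 = 1, a^(n+1) = a^n * a."""
    if n == 0:                      # base case
        return 1
    return power(a, n - 1) * a      # step case
```

Checking `power(a, m + n) == power(a, m) * power(a, n)` on sample values is not a proof, but it tracks exactly the identity the induction establishes.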

Recursion must be controlled. A recursive definition is valid only when each step reduces the problem toward a base case. Without this decrease, the definition may not determine an object.

For natural numbers, the decrease is clear: n+1 depends on n. For lists, a nonempty list depends on a shorter list. For trees, a tree depends on smaller subtrees.

This condition is called well-foundedness. It prevents infinite descent.

Induction also depends on well-foundedness. A proof works because there are no endlessly descending chains of smaller cases. Every case eventually reaches a base case.

There are several forms of induction.

Ordinary induction proves P(n) from P(n-1). Strong induction proves P(n) assuming P(k) for all k < n. Structural induction follows the formation rules of an object. Well-founded induction works over any relation where infinite descent is impossible.

These are not separate tricks. They are variations of the same principle: prove the property at minimal cases, then prove it is preserved by construction.

Recursion and induction also appear in definitions of syntax. A formal language is often defined recursively. Terms are built from variables and function symbols. Formulas are built from atomic formulas using logical connectives and quantifiers.

To prove a property of all formulas, one uses induction on formula structure. Check atomic formulas, then show the property is preserved by negation, conjunction, implication, and quantification.
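A tiny formula language makes this concrete. The sketch below (class and function names are mine, and only a few connectives are included) defines formulas recursively and then computes over them by structural recursion, the same case split a structural-induction proof would use:

```python
from dataclasses import dataclass

@dataclass
class Atom:
    """Atomic formula, e.g. a propositional variable."""
    name: str

@dataclass
class Not:
    """Negation of a subformula."""
    sub: object

@dataclass
class And:
    """Conjunction of two subformulas."""
    left: object
    right: object

def atoms(f):
    """Collect the atom names in a formula, by structural recursion."""
    if isinstance(f, Atom):                     # base case: atomic formula
        return {f.name}
    if isinstance(f, Not):                      # step case: one subformula
        return atoms(f.sub)
    return atoms(f.left) | atoms(f.right)       # step case: two subformulas
```

A proof that `atoms` returns a finite set for every formula checks the atomic case, then each connective assuming the result for the subformulas.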

This is why recursion and induction are fundamental in logic, programming languages, and proof assistants.

They also appear in combinatorics. Many counting sequences are recursively defined. The Fibonacci numbers satisfy

F_0 = 0,\quad F_1 = 1,

and

F_{n+2} = F_{n+1} + F_n.

The definition builds each term from previous terms. Induction proves identities about the sequence.
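The Fibonacci definition transcribes directly (a sketch; note that this direct form recomputes subproblems and is exponential, which is fine for illustration):

```python
def fib(n):
    """F_0 = 0, F_1 = 1, F_{n+2} = F_{n+1} + F_n."""
    if n == 0:                          # first base case
        return 0
    if n == 1:                          # second base case
        return 1
    return fib(n - 1) + fib(n - 2)      # step case: two earlier terms
```

Because each term depends on the two previous terms, identities about the sequence are typically proved by strong induction, or by ordinary induction on a statement covering two consecutive terms.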

Recursion gives an object by rules. Induction gives a proof by the same rules. This pairing is one of the most important design patterns in mathematics.

A useful habit is to ask: how is the object built? The answer usually tells you how to prove things about it.

If the object is built by cases, prove by cases. If it is built step by step, prove by induction. If it is built from subobjects, prove by structural induction.

Recursion and induction make infinite families manageable. They reduce an infinite task to a finite base case plus a stable transition step.