Find a root of a continuous function by repeatedly halving an interval where the function changes sign.
The bisection method finds an approximate root of a continuous function on an interval. It applies when the function has opposite signs at the two endpoints.
If $f(l) \cdot f(r) < 0$ and $f$ is continuous on $[l, r]$, then a root exists somewhere in $[l, r]$.
Problem
Given a continuous function $f$ and an interval $[l, r]$ with $f(l) \cdot f(r) < 0$, find a value $x \in [l, r]$ such that $f(x) = 0$.
In practice, the algorithm returns an approximation whose error is bounded by a chosen tolerance $\varepsilon$.
Key Idea
Use the intermediate value theorem.
At each step:
- Compute the midpoint $m = (l + r) / 2$
- Evaluate $f(m)$
- Keep the half interval where the sign change still occurs
This is binary search over a continuous domain.
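The halving step can be sketched in a few lines of Python (an illustrative sketch; the sample function $x^2 - 2$ is an arbitrary choice):

```python
def bisect_step(f, l, r):
    """Perform one bisection step: return the half of [l, r] that
    still contains a sign change of f."""
    m = (l + r) / 2
    if f(l) * f(m) <= 0:   # sign change in the left half (or f(m) == 0)
        return l, m
    return m, r            # otherwise the sign change is in the right half

# f(x) = x*x - 2 changes sign on [1, 2]; one step narrows it to [1, 1.5]
l, r = bisect_step(lambda x: x * x - 2, 1.0, 2.0)
```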
Algorithm
```
bisection_method(f, l, r, eps):
    if f(l) == 0:
        return l
    if f(r) == 0:
        return r
    while r - l > eps:
        m = (l + r) / 2
        if f(m) == 0:
            return m
        if f(l) * f(m) <= 0:
            r = m
        else:
            l = m
    return (l + r) / 2
```
Example
Find a root of:
on interval .
Since:
and
there is a root in the interval. Repeated bisection converges to:
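As a concrete stand-in (the specific function here is an assumption, since the original example is not shown), take $f(x) = x^2 - 2$ on $[1, 2]$: $f(1) = -1 < 0$ and $f(2) = 2 > 0$, so bisection converges to $\sqrt{2} \approx 1.41421356$:

```python
def bisection(f, l, r, eps=1e-9):
    # Minimal bisection loop: keep the half with the sign change.
    while r - l > eps:
        m = (l + r) / 2
        if f(l) * f(m) <= 0:
            r = m
        else:
            l = m
    return (l + r) / 2

root = bisection(lambda x: x * x - 2, 1.0, 2.0)
print(root)  # close to sqrt(2) ≈ 1.41421356
```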
Correctness
The algorithm maintains the invariant that the current interval contains a root.
Initially, the signs of $f(l)$ and $f(r)$ are opposite, or one endpoint is already a root. By continuity, a root exists inside the interval.
At each step, the midpoint $m$ divides the interval into two halves. If $f(l)$ and $f(m)$ have opposite signs, then a root lies in $[l, m]$. Otherwise, it lies in $[m, r]$.
The chosen half still satisfies the sign-change condition, so the invariant is preserved.
When the interval length is at most $\varepsilon$, any point inside it approximates a root within that tolerance.
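This invariant can be checked empirically by asserting the sign-change condition on every iteration (a sketch; the cubic test function is an arbitrary choice):

```python
def bisection_checked(f, l, r, eps=1e-9):
    assert f(l) * f(r) <= 0, "no sign change on the initial interval"
    while r - l > eps:
        # Invariant: the current interval still brackets a root.
        assert f(l) * f(r) <= 0
        m = (l + r) / 2
        if f(l) * f(m) <= 0:
            r = m
        else:
            l = m
    return (l + r) / 2

# f(1) = -2 and f(2) = 4, so [1, 2] brackets a root of x^3 - x - 2
root = bisection_checked(lambda x: x ** 3 - x - 2, 1.0, 2.0)
```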
Complexity
Let the initial interval length be $L = r - l$.
After $k$ iterations, the interval length is $L / 2^k$.
To make this at most $\varepsilon$, we need $k \ge \log_2(L / \varepsilon)$ iterations.
| cost | value |
|---|---|
| iterations | $O(\log(L / \varepsilon))$ |
| function evaluations | $O(\log(L / \varepsilon))$ |
| space | $O(1)$ |
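As a sanity check on the bound $k \ge \log_2(L / \varepsilon)$, the iteration count can be measured directly (the test function and tolerance are arbitrary choices):

```python
import math

def bisection_count(f, l, r, eps):
    """Run bisection and return the number of halving steps."""
    k = 0
    while r - l > eps:
        m = (l + r) / 2
        if f(l) * f(m) <= 0:
            r = m
        else:
            l = m
        k += 1
    return k

L, eps = 1.0, 1e-6
k = bisection_count(lambda x: x * x - 2, 1.0, 2.0, eps)
bound = math.ceil(math.log2(L / eps))
# k matches the predicted ceil(log2(L / eps)) = 20
```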
When to Use
Use the bisection method when:
- the function is continuous
- the interval endpoints have opposite signs
- guaranteed convergence matters more than speed
- derivative information is unavailable or unreliable
Comparison
| method | requires derivative | convergence |
|---|---|---|
| bisection | no | guaranteed with sign change |
| Newton's method | yes | fast, but may diverge |
| secant method | no | faster but less robust |
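The speed difference shows up even on a simple example (a sketch; the quadratic and its derivative are assumed here, and `newton_iters` is a minimal textbook Newton iteration, not a robust implementation):

```python
def bisection_iters(f, l, r, eps=1e-9):
    """Count bisection steps until the interval shrinks to eps."""
    k = 0
    while r - l > eps:
        m = (l + r) / 2
        if f(l) * f(m) <= 0:
            r = m
        else:
            l = m
        k += 1
    return k

def newton_iters(f, df, x, eps=1e-9):
    """Count Newton steps until |f(x)| drops below eps."""
    k = 0
    while abs(f(x)) > eps:
        x = x - f(x) / df(x)
        k += 1
    return k

f = lambda x: x * x - 2
b = bisection_iters(f, 1.0, 2.0)           # 30 halvings for eps = 1e-9
n = newton_iters(f, lambda x: 2 * x, 2.0)  # quadratic convergence: far fewer
```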
Notes
- Bisection is robust but slower than Newton-style methods.
- It finds one root in the interval, not necessarily all roots.
- If the function touches zero without changing sign, bisection may not detect it from endpoint signs.
- Floating point comparisons to zero should usually use a tolerance.
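The last point can be made concrete: an exact `== 0` comparison rarely fires in floating point, so a small absolute tolerance is commonly used instead (a sketch; the name `is_root` and the tolerance value are arbitrary):

```python
import math

def is_root(fm, tol=1e-12):
    """Treat values within tol of zero as a root, instead of fm == 0."""
    return math.isclose(fm, 0.0, abs_tol=tol)

# 0.1 + 0.2 - 0.3 is not exactly 0 in binary floating point
residual = 0.1 + 0.2 - 0.3
exact = (residual == 0)    # False: the residual is a tiny nonzero value
close = is_root(residual)  # True: it is within the tolerance
```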
Implementation
```python
def bisection_method(f, l, r, eps=1e-9):
    fl = f(l)
    fr = f(r)
    if fl == 0:
        return l
    if fr == 0:
        return r
    if fl * fr > 0:
        raise ValueError("f(l) and f(r) must have opposite signs")
    while r - l > eps:
        m = (l + r) / 2
        fm = f(m)
        if fm == 0:
            return m
        if fl * fm <= 0:
            # Sign change in [l, m]: keep the left half.
            r = m
            fr = fm
        else:
            # Sign change in [m, r]: keep the right half.
            l = m
            fl = fm
    return (l + r) / 2
```

```go
import "errors"

func BisectionMethod(f func(float64) float64, l, r, eps float64) (float64, error) {
	fl := f(l)
	fr := f(r)
	if fl == 0 {
		return l, nil
	}
	if fr == 0 {
		return r, nil
	}
	if fl*fr > 0 {
		return 0, errors.New("f(l) and f(r) must have opposite signs")
	}
	for r-l > eps {
		m := (l + r) / 2
		fm := f(m)
		if fm == 0 {
			return m, nil
		}
		if fl*fm <= 0 {
			// Sign change in [l, m]: keep the left half.
			r = m
			fr = fm
		} else {
			// Sign change in [m, r]: keep the right half.
			l = m
			fl = fm
		}
	}
	return (l + r) / 2, nil
}
```
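A quick usage check of the Python implementation (condensed here to stay self-contained; the target function $\cos x - x$ is an arbitrary choice, with its root at the Dottie number, about 0.739):

```python
import math

def bisection_method(f, l, r, eps=1e-9):
    # Condensed version of the implementation above.
    if f(l) * f(r) > 0:
        raise ValueError("f(l) and f(r) must have opposite signs")
    while r - l > eps:
        m = (l + r) / 2
        if f(l) * f(m) <= 0:
            r = m
        else:
            l = m
    return (l + r) / 2

# f(0) = 1 > 0 and f(1) = cos(1) - 1 < 0, so [0, 1] brackets the root
root = bisection_method(lambda x: math.cos(x) - x, 0.0, 1.0)
print(root)  # ≈ 0.7390851
```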