Newton's Method Convergence Theorem
Newton's method exhibits quadratic convergence near simple roots, making it one of the most efficient root-finding algorithms when derivatives are available and the initial guess is sufficiently accurate.
Let $f \in C^2$ have a root $r$ with $f(r) = 0$ and $f'(r) \neq 0$. Then there exists $\delta > 0$ such that if $|x_0 - r| \leq \delta$, Newton's iteration

$$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$$

converges quadratically to $r$, satisfying:

$$|x_{n+1} - r| \leq C\,|x_n - r|^2, \qquad C = \frac{\max_{|x - r| \leq \delta} |f''(x)|}{2\,\min_{|x - r| \leq \delta} |f'(x)|}.$$
Moreover, if $f''$ is continuous and $f''(r) \neq 0$, then for $e_n = x_n - r$ sufficiently small:

$$e_{n+1} \approx \frac{f''(r)}{2 f'(r)}\, e_n^2.$$
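The quadratic error recurrence can be checked numerically. A minimal sketch, using the illustrative function $f(x) = e^x - 2$ with root $r = \ln 2$ (chosen here because its asymptotic constant $f''(r)/(2f'(r)) = 1/2$ is simple; this example is not from the text):

```python
import math

# Newton iteration for f(x) = exp(x) - 2, root r = ln 2 (illustrative choice).
# The ratio |e_{n+1}| / |e_n|^2 should approach f''(r) / (2 f'(r)) = 1/2.
f = lambda x: math.exp(x) - 2.0
df = lambda x: math.exp(x)
r = math.log(2.0)

x = 1.0                          # initial guess inside the basin
for n in range(5):
    e_n = abs(x - r)
    x -= f(x) / df(x)            # Newton step
    e_next = abs(x - r)
    if e_n > 0 and e_next > 0:
        print(f"n={n}: |e_n| = {e_n:.3e}, ratio = {e_next / e_n**2:.4f}")
```

The printed ratios climb toward 0.5, confirming the recurrence once the iterates are close to the root.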
The theorem guarantees that once Newton's method enters a neighborhood of the root, errors decrease quadratically. This means the number of accurate digits roughly doubles with each iteration, leading to very rapid convergence.
Comparison with Linear Convergence: A linearly convergent method with rate $\rho \in (0, 1)$ satisfies $|e_{n+1}| \leq \rho\,|e_n|$. To achieve error $\epsilon$ from initial error $e_0$:
- Linear convergence requires $n \approx \log(\epsilon/e_0)/\log\rho$ iterations
- Quadratic convergence requires $n \approx \log_2(\log\epsilon/\log e_0)$ iterations
For example, achieving 16-digit accuracy from 2-digit accuracy:
- Linear with $\rho = 0.5$: about 47 iterations
- Quadratic: about 4 iterations
For $f(x) = x^2 - 2$ on $[1, 2]$, seeking $r = \sqrt{2} \approx 1.41421356237310$:
Bisection (50 iterations): error bounded by $2^{-51} \approx 4.4 \times 10^{-16}$
Newton from $x_0 = 1.5$ (4 iterations):
- $x_1 = 1.41666667$ (error $\approx 2.5 \times 10^{-3}$)
- $x_2 = 1.41421569$ (error $\approx 2.1 \times 10^{-6}$)
- $x_3 = 1.41421356237469$ (error $\approx 1.6 \times 10^{-12}$)
- $x_4 = 1.41421356237310$ (machine precision)
Newton achieves machine precision in 4 iterations while bisection requires 50.
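The comparison can be reproduced directly. A sketch assuming the starting bracket $[1, 2]$ for bisection and the interval midpoint $x_0 = 1.5$ as Newton's starting guess:

```python
import math

f = lambda x: x * x - 2.0
df = lambda x: 2.0 * x
r = math.sqrt(2.0)

# Bisection: the bracket halves each step, so 50 steps leave an
# interval of width 2**-50 around the root.
a, b = 1.0, 2.0
for _ in range(50):
    m = 0.5 * (a + b)
    if f(a) * f(m) <= 0:
        b = m
    else:
        a = m
print(f"bisection error: {abs(0.5 * (a + b) - r):.1e}")

# Newton from the midpoint: quadratic convergence, four steps suffice.
x = 1.5
for n in range(1, 5):
    x -= f(x) / df(x)
    print(f"newton step {n}, error: {abs(x - r):.1e}")
```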
The condition $f'(r) \neq 0$ is essential; when $f'(r) = 0$ (multiple root), Newton's method degrades to linear convergence, with asymptotic rate $1 - 1/m$ for a root of multiplicity $m$. The basin of attraction depends on the ratio $|f''|/|f'|$ near the root: larger ratios require closer initial guesses for convergence.
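The degradation at a multiple root is easy to observe. A sketch using the double root $r = 1$ of $f(x) = (x - 1)^2$ (an illustrative choice, not from the text): the error is halved each step, linear convergence at rate $1/2$, rather than squared.

```python
# Newton on f(x) = (x - 1)**2, which has a double root at r = 1.
# The step simplifies to x - (x - 1)/2, so each error is exactly halved.
f = lambda x: (x - 1.0) ** 2
df = lambda x: 2.0 * (x - 1.0)

x = 2.0
for n in range(6):
    e = abs(x - 1.0)
    x -= f(x) / df(x)
    print(f"|e_{n + 1}| / |e_{n}| = {abs(x - 1.0) / e:.3f}")   # -> 0.500 each step
```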
In practice, hybrid methods combine global convergence guarantees (bisection or false position) with Newton's local quadratic convergence, using Newton steps when safe and falling back to bisection otherwise.
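Such a safeguard can be sketched in a few lines. The function below (`safe_newton`, a hypothetical helper, not from the text) maintains a sign-change bracket $[a, b]$, attempts a Newton step, and substitutes a bisection step whenever the derivative vanishes or the Newton step leaves the bracket:

```python
def safe_newton(f, df, a, b, tol=1e-12, max_iter=100):
    """Newton's method safeguarded by bisection on a sign-change bracket."""
    fa, fb = f(a), f(b)
    assert fa * fb <= 0, "root must be bracketed"
    x = 0.5 * (a + b)
    for _ in range(max_iter):
        fx, dfx = f(x), df(x)
        # Tentative Newton step; fall back to bisection if the derivative
        # vanishes or the step exits the current bracket.
        if dfx != 0:
            x_new = x - fx / dfx
            if not (a < x_new < b):
                x_new = 0.5 * (a + b)
        else:
            x_new = 0.5 * (a + b)
        # Shrink the bracket, preserving the sign change.
        if fa * f(x_new) <= 0:
            b = x_new
        else:
            a, fa = x_new, f(x_new)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Demo: x**3 - 2*x - 5 = 0 has a simple root near 2.0946 bracketed by [2, 3].
print(safe_newton(lambda x: x**3 - 2*x - 5, lambda x: 3*x**2 - 2, 2.0, 3.0))
```

This is a minimal sketch of the idea; production implementations (Brent-style solvers) add further safeguards such as step-size and stagnation checks.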