Newton's Method and Secant Method
Newton's method achieves quadratic convergence using derivative information, while the secant method approximates derivatives using finite differences, offering a practical alternative when derivatives are expensive to compute.
Given a differentiable function $f$ with root $r$, Newton's method is:

$$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$$

Geometrically, $x_{n+1}$ is where the tangent line to $f$ at $x_n$ intersects the $x$-axis. The method derives from linearization: $f(x) \approx f(x_n) + f'(x_n)(x - x_n)$; setting this to zero and solving for $x$ gives the update.
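The update rule can be sketched in a few lines of Python; the function name `newton`, the tolerance, and the cubic test equation are illustrative choices, not from the text:

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton's method: follow the tangent line at x_n down to the x-axis."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        x = x - fx / fprime(x)  # root of the tangent-line linearization
    return x

# Illustrative example: the real root of x^3 - x - 2 near x0 = 1.5
root = newton(lambda x: x**3 - x - 2, lambda x: 3 * x**2 - 1, 1.5)
print(root)
```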
Newton's method exhibits quadratic convergence near simple roots: if $f(r) = 0$ and $f'(r) \neq 0$, then for the error $e_n = x_n - r$:

$$e_{n+1} \approx \frac{f''(r)}{2 f'(r)}\, e_n^2$$

This means the number of correct digits approximately doubles with each iteration once $x_n$ is sufficiently close to $r$.
To find $\sqrt{a}$, solve $f(x) = x^2 - a = 0$. Newton's method gives:

$$x_{n+1} = \frac{1}{2}\left(x_n + \frac{a}{x_n}\right)$$

For $a = 2$ with $x_0 = 1$:
- $x_1 = 1.5$
- $x_2 \approx 1.4166667$
- $x_3 \approx 1.4142157$
- $x_4 \approx 1.414213562$ (correct to 9 decimal places)
Four iterations achieve high precision from a rough initial guess.
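The iterates above can be reproduced directly (the helper name `sqrt_newton` is chosen here for illustration):

```python
import math

def sqrt_newton(a, x0, n):
    """Run n steps of x_{k+1} = (x_k + a/x_k)/2, Newton's method for x^2 - a = 0."""
    x = x0
    for _ in range(n):
        x = 0.5 * (x + a / x)
    return x

# Watch the error shrink quadratically for a = 2, x0 = 1
for n in range(1, 5):
    x = sqrt_newton(2.0, 1.0, n)
    print(n, x, abs(x - math.sqrt(2)))
```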
The secant method approximates $f'(x_n)$ using a finite difference:

$$x_{n+1} = x_n - f(x_n)\,\frac{x_n - x_{n-1}}{f(x_n) - f(x_{n-1})}$$

This requires two initial guesses $x_0, x_1$ but avoids computing derivatives. Geometrically, $x_{n+1}$ is where the secant line through $(x_{n-1}, f(x_{n-1}))$ and $(x_n, f(x_n))$ intersects the $x$-axis.
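A minimal sketch of this update, including the bookkeeping for the two previous iterates (the function name and the test equation are illustrative choices):

```python
def secant(f, x0, x1, tol=1e-12, max_iter=50):
    """Secant method: Newton's method with f'(x_n) replaced by a finite difference."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if abs(f1) < tol:
            return x1
        # x-intercept of the secant line through (x0, f(x0)) and (x1, f(x1))
        x0, x1 = x1, x1 - f1 * (x1 - x0) / (f1 - f0)
    return x1

# Illustrative example: sqrt(2) as the root of x^2 - 2
root = secant(lambda x: x**2 - 2, 1.0, 2.0)
print(root)
```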
Convergence Rates:
- Bisection: Linear, rate $\tfrac{1}{2}$ (one bit per iteration)
- Fixed-point: Linear, rate depends on $|g'(r)|$
- Secant: Superlinear, order $\varphi = \tfrac{1+\sqrt{5}}{2} \approx 1.618$ (golden ratio)
- Newton: Quadratic, order 2 (near simple roots)
The secant method converges faster than linear but slower than quadratic. It requires one function evaluation per iteration versus Newton's two (function and derivative), making it competitive in practice.
Newton's method can fail: it can diverge if started far from a root, requires $f'(x_n) \neq 0$, and converges only linearly at multiple roots (where $f'(r) = 0$). Modified Newton's method addresses multiple roots:

$$x_{n+1} = x_n - m\,\frac{f(x_n)}{f'(x_n)}$$

where $m$ is the multiplicity, restoring quadratic convergence.
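The slowdown at a multiple root, and the modified fix, can be seen on the double root of $f(x) = (x-1)^2$ (an example chosen here for illustration; the single modified step is exact only because this $f$ is exactly quadratic):

```python
def newton_step(x, m=1):
    """One (modified) Newton step for f(x) = (x - 1)**2, a double root at r = 1."""
    f, fp = (x - 1)**2, 2 * (x - 1)
    return x - m * f / fp

x = 2.0
for _ in range(5):
    x = newton_step(x)       # plain Newton: the error only halves each step (linear)
err_plain = abs(x - 1)       # 0.03125 after 5 steps

# One modified step with m = 2 (the multiplicity) lands on the root exactly,
# because f is exactly its own quadratic Taylor expansion.
err_mod = abs(newton_step(2.0, m=2) - 1)
print(err_plain, err_mod)
```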
The secant method is more robust than Newton's regarding divergence but can also fail if $f(x_n) \approx f(x_{n-1})$, causing division by near-zero. Hybrid methods combine bisection's reliability with Newton's speed: start with bisection to enter a convergence basin, then switch to Newton for rapid final convergence.
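One way to sketch such a hybrid (a simplified illustration assuming a sign-changing bracket $[a, b]$; production solvers such as Brent's method safeguard every step rather than switching once):

```python
def hybrid_root(f, fprime, a, b, tol=1e-12, bisect_steps=5, max_iter=50):
    """Shrink the bracket with a few bisection steps, then run Newton from the midpoint.
    Assumes f(a) and f(b) have opposite signs."""
    for _ in range(bisect_steps):        # reliable but linear
        m = 0.5 * (a + b)
        if f(a) * f(m) <= 0:
            b = m
        else:
            a = m
    x = 0.5 * (a + b)
    for _ in range(max_iter):            # fast once inside the convergence basin
        fx = f(x)
        if abs(fx) < tol:
            return x
        x = x - fx / fprime(x)
    return x

# Illustrative example: the real root of x^3 - 2x - 5 bracketed by [2, 3]
root = hybrid_root(lambda x: x**3 - 2*x - 5, lambda x: 3*x**2 - 2, 2.0, 3.0)
print(root)
```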