
Newton's Method and Secant Method

Newton's method achieves quadratic convergence using derivative information, while the secant method approximates derivatives using finite differences, offering a practical alternative when derivatives are expensive to compute.

Definition: Newton's Method

Given $f: \mathbb{R} \to \mathbb{R}$ differentiable with root $r$, Newton's method is: $$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$$

Geometrically, $x_{n+1}$ is where the tangent line to $f$ at $(x_n, f(x_n))$ intersects the $x$-axis. The method derives from linearization: $f(x) \approx f(x_n) + f'(x_n)(x - x_n)$; setting this approximation to zero and solving for $x$ yields the iteration.

Newton's method exhibits quadratic convergence near simple roots: if $f(r) = 0$ and $f'(r) \neq 0$, then $$\lim_{n \to \infty} \frac{|x_{n+1} - r|}{|x_n - r|^2} = \frac{|f''(r)|}{2|f'(r)|}$$

This means the number of correct digits approximately doubles with each iteration once xnx_n is sufficiently close to rr.
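The iteration above can be sketched in a few lines; the function names, tolerance, and iteration cap below are illustrative choices, not part of the text:

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton's method: x_{n+1} = x_n - f(x_n) / f'(x_n)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:          # residual small enough: accept x as the root
            return x
        x = x - fx / fprime(x)     # tangent-line step
    return x

# Example: the positive root of x^2 - 2 is sqrt(2)
root = newton(lambda x: x*x - 2, lambda x: 2*x, x0=1.0)
```

Starting from $x_0 = 1$, the residual drops below $10^{-12}$ within a handful of iterations, consistent with the doubling of correct digits described above.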

Example: Computing $\sqrt[3]{a}$

To find $\sqrt[3]{a}$, solve $f(x) = x^3 - a = 0$. Newton's method gives: $$x_{n+1} = x_n - \frac{x_n^3 - a}{3x_n^2} = \frac{2x_n^3 + a}{3x_n^2} = \frac{2x_n + a/x_n^2}{3}$$

For $a = 7$ with $x_0 = 2$:

  • $x_1 = (2 \cdot 2 + 7/4)/3 = 1.916\overline{6}$
  • $x_2 = 1.912938$
  • $x_3 = 1.912931183$ (correct to 9 decimal places)

Only three iterations achieve high precision from a rough initial guess.
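The recurrence can be checked numerically; this sketch reproduces the iterates above (the helper name is illustrative):

```python
def cbrt_newton(a, x0, iters=3):
    """Iterate x <- (2x + a/x^2)/3, the Newton step for f(x) = x^3 - a."""
    x = x0
    for _ in range(iters):
        x = (2*x + a / (x*x)) / 3
    return x

# Three iterations from x0 = 2 agree with 7**(1/3) to roughly 9 decimals
approx = cbrt_newton(7, 2.0)
```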

Definition: Secant Method

The secant method approximates $f'(x_n)$ using a finite difference: $$x_{n+1} = x_n - f(x_n)\,\frac{x_n - x_{n-1}}{f(x_n) - f(x_{n-1})}$$

This requires two initial guesses $x_0, x_1$ but avoids computing derivatives. Geometrically, $x_{n+1}$ is where the secant line through $(x_{n-1}, f(x_{n-1}))$ and $(x_n, f(x_n))$ intersects the $x$-axis.
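A minimal secant iteration, sketched with illustrative names and tolerances:

```python
def secant(f, x0, x1, tol=1e-12, max_iter=50):
    """Secant method: replaces f'(x_n) by a finite-difference slope."""
    f0, f1 = f(x0), f(x1)
    for _ in range(max_iter):
        if abs(f1) < tol:
            return x1
        if f1 == f0:               # zero slope: the secant step is undefined
            break
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        x0, f0 = x1, f1            # keep only the two most recent points
        x1, f1 = x2, f(x2)
    return x1

# Same cube-root problem as the Newton example: f(x) = x^3 - 7
r = secant(lambda x: x**3 - 7, 1.0, 2.0)
```

Note that each pass evaluates $f$ only once ($f(x_2)$); the previous value is reused, which is what makes the secant method cheap per iteration.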

Remark

Convergence Rates:

  • Bisection: Linear, rate $1/2$ (one bit per iteration)
  • Fixed-point: Linear, rate depends on $|g'(r)|$
  • Secant: Superlinear, order $\phi = (1+\sqrt{5})/2 \approx 1.618$ (golden ratio)
  • Newton: Quadratic, order 2 (near simple roots)

The secant method converges faster than linear but slower than quadratic. It requires one function evaluation per iteration versus Newton's two (function and derivative), making it competitive in practice.

Newton's method can fail: it diverges if started far from a root, requires $f'(x_n) \neq 0$, and converges only linearly at multiple roots (where $f'(r) = 0$). Modified Newton's method addresses multiple roots: $$x_{n+1} = x_n - m\,\frac{f(x_n)}{f'(x_n)}$$ where $m$ is the multiplicity, restoring quadratic convergence.
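A sketch of the modified iteration, under the assumption that the multiplicity $m$ is known in advance:

```python
def modified_newton(f, fprime, x0, m, tol=1e-10, max_iter=100):
    """x_{n+1} = x_n - m * f(x_n)/f'(x_n) for a root of multiplicity m."""
    x = x0
    for _ in range(max_iter):
        dfx = fprime(x)
        if dfx == 0:               # at the multiple root itself, f' vanishes
            break
        step = m * f(x) / dfx
        x = x - step
        if abs(step) < tol:
            break
    return x

# f(x) = (x - 1)^2 has a double root at x = 1. With m = 2 the step is
# 2 * (x-1)^2 / (2(x-1)) = x - 1, so a single iteration lands exactly on the root.
r = modified_newton(lambda x: (x - 1)**2, lambda x: 2*(x - 1), x0=3.0, m=2)
```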

The secant method is more robust than Newton's method regarding divergence, but it can also fail if $f(x_n) \approx f(x_{n-1})$, causing division by a near-zero quantity. Hybrid methods combine bisection's reliability with Newton's speed: start with bisection to enter a convergence basin, then switch to Newton for rapid final convergence.
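One way to realize such a hybrid, sketched under the assumption that $[a, b]$ brackets a sign change of $f$:

```python
import math

def hybrid_root(f, fprime, a, b, bisect_iters=10, tol=1e-12):
    """Bisection to shrink the bracket, then Newton from its midpoint."""
    fa = f(a)
    for _ in range(bisect_iters):       # phase 1: reliable but only linear
        mid = (a + b) / 2
        fm = f(mid)
        if fa * fm <= 0:                # sign change in left half: keep [a, mid]
            b = mid
        else:
            a, fa = mid, fm
    x = (a + b) / 2                     # phase 2: quadratic Newton polishing
    for _ in range(50):
        fx = f(x)
        if abs(fx) < tol:
            break
        x -= fx / fprime(x)
    return x

# Root of cos(x) = x in [0, 1]
r = hybrid_root(lambda x: math.cos(x) - x, lambda x: -math.sin(x) - 1, 0.0, 1.0)
```

After ten bisection steps the bracket has width $2^{-10}$, comfortably inside Newton's convergence basin for this problem, so the Newton phase converges in a few iterations.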