
Taylor's Theorem (Multivariable)

Taylor's theorem extends to multivariable functions, approximating $f(\mathbf{x})$ near $\mathbf{a}$ using derivatives. The first-order approximation uses the gradient, and the second-order approximation uses the Hessian. This is essential for optimization (Newton's method), error analysis, and differential geometry.


Statement

Theorem 9.1 (Taylor's Theorem, Second Order)

Let $f : \mathbb{R}^n \to \mathbb{R}$ be twice continuously differentiable. Then for $\mathbf{x}$ near $\mathbf{a}$,

$$f(\mathbf{x}) = f(\mathbf{a}) + \nabla f(\mathbf{a}) \cdot (\mathbf{x} - \mathbf{a}) + \frac{1}{2}(\mathbf{x} - \mathbf{a})^T Hf(\mathbf{a}) (\mathbf{x} - \mathbf{a}) + o(\|\mathbf{x} - \mathbf{a}\|^2),$$

where $Hf(\mathbf{a})$ is the Hessian matrix of second partial derivatives.
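The remainder bound can be checked numerically. The sketch below (not from the source; the function $f(x, y) = e^x \sin y$ and the evaluation point are illustrative choices) compares $f$ against its second-order expansion and watches the residual shrink faster than $\|\mathbf{h}\|^2$:

```python
import numpy as np

# Illustrative test function f(x, y) = exp(x) * sin(y), whose gradient and
# Hessian are known in closed form.
def f(v):
    x, y = v
    return np.exp(x) * np.sin(y)

def grad(v):
    x, y = v
    return np.array([np.exp(x) * np.sin(y), np.exp(x) * np.cos(y)])

def hessian(v):
    x, y = v
    return np.array([[np.exp(x) * np.sin(y),  np.exp(x) * np.cos(y)],
                     [np.exp(x) * np.cos(y), -np.exp(x) * np.sin(y)]])

a = np.array([0.3, 0.7])  # arbitrary expansion point

def taylor2(x):
    """Second-order Taylor approximation of f about a."""
    h = x - a
    return f(a) + grad(a) @ h + 0.5 * h @ hessian(a) @ h

# The remainder is o(||h||^2): dividing it by ||h||^2 should tend to 0
# as h shrinks.
for t in [1e-1, 5e-2, 2.5e-2]:
    h = t * np.array([1.0, -1.0])
    print(t, abs(f(a + h) - taylor2(a + h)) / (h @ h))
```

The printed ratios decrease roughly linearly in $t$, consistent with a cubic remainder for a smooth $f$.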


Applications

Example (Newton's method for optimization)

To minimize $f(\mathbf{x})$, Newton's method iterates

$$\mathbf{x}_{n+1} = \mathbf{x}_n - [Hf(\mathbf{x}_n)]^{-1} \nabla f(\mathbf{x}_n).$$

Each step replaces $f$ by its second-order Taylor approximation at $\mathbf{x}_n$ and jumps to the critical point of that quadratic; when the Hessian is positive definite, this is the quadratic's minimizer.
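The iteration above can be sketched in a few lines. The test function $f(x_1, x_2) = (x_1 - 1)^4 + (x_2 + 2)^2$ and the starting point are illustrative assumptions, not from the source:

```python
import numpy as np

# Gradient and Hessian of f(v) = (v0 - 1)^4 + (v1 + 2)^2, minimized at (1, -2).
def grad(v):
    return np.array([4 * (v[0] - 1) ** 3, 2 * (v[1] + 2)])

def hessian(v):
    return np.array([[12 * (v[0] - 1) ** 2, 0.0],
                     [0.0,                  2.0]])

x = np.array([2.0, 0.0])  # starting point
for _ in range(30):
    # x_{n+1} = x_n - [Hf(x_n)]^{-1} grad f(x_n); solve the linear system
    # rather than forming the inverse explicitly.
    x = x - np.linalg.solve(hessian(x), grad(x))

print(x)  # converges toward the minimizer (1, -2)
```

Solving $Hf(\mathbf{x}_n)\,\mathbf{d} = \nabla f(\mathbf{x}_n)$ instead of inverting the Hessian is the standard numerical practice: it is cheaper and better conditioned.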


Summary

Multivariable Taylor's theorem:

  • First-order: $f(\mathbf{x}) \approx f(\mathbf{a}) + \nabla f(\mathbf{a}) \cdot (\mathbf{x} - \mathbf{a})$.
  • Second-order: adds the Hessian term $\frac{1}{2}(\mathbf{x} - \mathbf{a})^T Hf(\mathbf{a})(\mathbf{x} - \mathbf{a})$, capturing curvature.
  • Applications: optimization (Newton's method), error analysis.

See Extrema for optimization.