Extrema of Multivariable Functions
Finding extrema (maxima and minima) of multivariable functions is central to optimization, economics, and physics. Critical points occur where the gradient vanishes, and the second derivative test (using the Hessian matrix) classifies them as local maxima, minima, or saddle points. Constrained optimization uses Lagrange multipliers.
Critical points
A point $(a, b)$ is a critical point of $f$ if $\nabla f(a, b) = \mathbf{0}$ or $\nabla f(a, b)$ does not exist.
If $f$ has a local extremum (maximum or minimum) at an interior point $(a, b)$ and $f$ is differentiable at $(a, b)$, then $\nabla f(a, b) = \mathbf{0}$.
Example: $f(x, y) = x^2 - y^2$ gives $\nabla f = (2x, -2y) = \mathbf{0}$ only at $(0, 0)$, so $(0, 0)$ is the only critical point. However, $(0, 0)$ is a saddle point, not an extremum.
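The critical-point computation can be checked symbolically; a minimal sketch with SymPy (an illustration using the saddle example above, not part of the original text):

```python
# Sketch: finding critical points symbolically with sympy.
import sympy as sp

x, y = sp.symbols("x y", real=True)
f = x**2 - y**2  # the saddle example above

# Critical points: solve grad f = 0.
grad = [sp.diff(f, v) for v in (x, y)]
crit = sp.solve(grad, (x, y), dict=True)
print(crit)  # [{x: 0, y: 0}]
```

The same pattern works for any smooth $f$: differentiate with respect to each variable and solve the resulting system.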
Second derivative test
The Hessian matrix of $f$ at $(a, b)$ is
$$H_f(a, b) = \begin{pmatrix} f_{xx}(a, b) & f_{xy}(a, b) \\ f_{yx}(a, b) & f_{yy}(a, b) \end{pmatrix}.$$
Let $(a, b)$ be a critical point of $f$ with $\nabla f(a, b) = \mathbf{0}$ and continuous second partial derivatives. Let $H = H_f(a, b)$.
- If $H$ is positive definite, then $(a, b)$ is a local minimum.
- If $H$ is negative definite, then $(a, b)$ is a local maximum.
- If $H$ has both positive and negative eigenvalues ($H$ is indefinite), then $(a, b)$ is a saddle point.
- If $H$ is singular ($\det H = 0$), the test is inconclusive and higher-order tests are needed.
Example: $f(x, y) = x^2 + y^2$ at $(0, 0)$. The Hessian is
$$H = \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix},$$
which is positive definite (eigenvalues $2, 2$). Thus $(0, 0)$ is a local minimum.
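The eigenvalue-based classification can be sketched numerically; `classify` below is a hypothetical helper, assuming NumPy and a Hessian already evaluated at the critical point:

```python
# Sketch: classifying a critical point from the eigenvalues of its Hessian.
import numpy as np

def classify(hessian):
    """Classify a critical point from the (symmetric) Hessian evaluated there."""
    eig = np.linalg.eigvalsh(hessian)  # eigenvalues of a symmetric matrix
    if np.all(eig > 0):
        return "local minimum"
    if np.all(eig < 0):
        return "local maximum"
    if np.any(eig > 0) and np.any(eig < 0):
        return "saddle point"
    return "inconclusive"  # a zero eigenvalue: the test fails

H = np.array([[2.0, 0.0], [0.0, 2.0]])  # Hessian of x^2 + y^2 at (0, 0)
print(classify(H))  # local minimum
```

Checking eigenvalue signs rather than leading principal minors generalizes directly beyond two variables.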
Lagrange multipliers
To optimize $f(x, y)$ subject to the constraint $g(x, y) = c$, solve
$$\nabla f(x, y) = \lambda \, \nabla g(x, y), \qquad g(x, y) = c$$
for $x$, $y$, and $\lambda$ (the Lagrange multiplier). The solutions are candidates for constrained extrema.
Example: optimize $f(x, y) = xy$ subject to $g(x, y) = x^2 + y^2 = 1$. Lagrange condition: $\nabla f = \lambda \nabla g$. This gives $y = 2\lambda x$ and $x = 2\lambda y$, so $y = 4\lambda^2 y$. If $y \neq 0$, then $\lambda = \pm \tfrac{1}{2}$, so $y = \pm x$.
For $y = x$: $2x^2 = 1$, so $x = \pm \tfrac{1}{\sqrt{2}}$, giving $f = \tfrac{1}{2}$ at $(x, y) = \pm\bigl(\tfrac{1}{\sqrt{2}}, \tfrac{1}{\sqrt{2}}\bigr)$ (maximum).
For $y = -x$: $2x^2 = 1$, giving $f = -\tfrac{1}{2}$ (minimum).
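The Lagrange system can also be solved symbolically; a sketch with SymPy, assuming the example $f(x, y) = xy$ on the unit circle:

```python
# Sketch: solving grad f = lam * grad g together with g = 0 using sympy.
import sympy as sp

x, y, lam = sp.symbols("x y lam", real=True)
f = x * y                # objective
g = x**2 + y**2 - 1      # constraint, written as g = 0

eqs = [sp.diff(f, x) - lam * sp.diff(g, x),
       sp.diff(f, y) - lam * sp.diff(g, y),
       g]
sols = sp.solve(eqs, (x, y, lam), dict=True)
vals = sorted({f.subs(s) for s in sols})
print(vals)  # [-1/2, 1/2]
```

Comparing $f$ at every candidate point identifies which are the constrained minimum and maximum.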
Summary
Extrema of multivariable functions:
- Critical points: $\nabla f = \mathbf{0}$ (or $\nabla f$ does not exist).
- Second derivative test: Hessian determines local behavior.
- Lagrange multipliers for constrained optimization.
See Implicit Function Theorem for related results.