Optimization in Several Variables
Finding maxima and minima of multivariable functions requires extending the critical point analysis of single-variable calculus to higher dimensions, using the gradient and Hessian matrix.
Critical Points
A point $\mathbf{x}_0$ is a critical point of $f$ if $\nabla f(\mathbf{x}_0) = \mathbf{0}$ (i.e., all partial derivatives vanish). A critical point is a local minimum if $f(\mathbf{x}) \ge f(\mathbf{x}_0)$ for all $\mathbf{x}$ near $\mathbf{x}_0$, a local maximum if $f(\mathbf{x}) \le f(\mathbf{x}_0)$ for all $\mathbf{x}$ near $\mathbf{x}_0$, and a saddle point if it is neither.
The Hessian matrix of $f$ at $\mathbf{x}_0$ is the $n \times n$ matrix of second partial derivatives: $H(\mathbf{x}_0) = \left[\dfrac{\partial^2 f}{\partial x_i \, \partial x_j}(\mathbf{x}_0)\right]_{i,j}$. By Clairaut's theorem, $H$ is symmetric when $f$ is $C^2$.
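When second partials are tedious to derive by hand, the Hessian can be approximated numerically. The sketch below (assuming NumPy; the function `hessian_fd` and the step size `h` are illustrative choices, not part of the text) uses a four-point central difference for each mixed partial and checks Clairaut symmetry:

```python
import numpy as np

def hessian_fd(f, x, h=1e-5):
    """Approximate the Hessian of f at x by central finite differences.
    f is any scalar-valued function of a NumPy vector."""
    n = len(x)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            e_i = np.zeros(n); e_i[i] = h
            e_j = np.zeros(n); e_j[j] = h
            # four-point central formula for d^2 f / (dx_i dx_j)
            H[i, j] = (f(x + e_i + e_j) - f(x + e_i - e_j)
                       - f(x - e_i + e_j) + f(x - e_i - e_j)) / (4 * h**2)
    return H

# Example: f(x, y) = x^2 + 3xy + y^2 has constant Hessian [[2, 3], [3, 2]]
f = lambda v: v[0]**2 + 3*v[0]*v[1] + v[1]**2
H = hessian_fd(f, np.array([0.0, 0.0]))
# By Clairaut's theorem, H should come out (numerically) symmetric.
```

For a quadratic like this the finite-difference formula is exact up to rounding, so the result matches the analytic Hessian closely.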
The Second Derivative Test
Let $f$ be $C^2$ and $\mathbf{x}_0$ a critical point of $f$. Then:
- If $H(\mathbf{x}_0)$ is positive definite (all eigenvalues $> 0$), then $\mathbf{x}_0$ is a local minimum.
- If $H(\mathbf{x}_0)$ is negative definite (all eigenvalues $< 0$), then $\mathbf{x}_0$ is a local maximum.
- If $H(\mathbf{x}_0)$ is indefinite (has both positive and negative eigenvalues), then $\mathbf{x}_0$ is a saddle point.
- If $H(\mathbf{x}_0)$ is semidefinite (has zero eigenvalues), the test is inconclusive.
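The eigenvalue cases above translate directly into code. A minimal sketch (assuming NumPy; the function name and tolerance are illustrative):

```python
import numpy as np

def classify_critical_point(H, tol=1e-8):
    """Classify a critical point from the eigenvalues of its Hessian H.
    tol guards against treating rounding noise as a nonzero eigenvalue."""
    eig = np.linalg.eigvalsh(H)   # eigvalsh: for symmetric matrices
    if np.any(np.abs(eig) < tol):
        return "inconclusive"     # semidefinite: a (near-)zero eigenvalue
    if np.all(eig > 0):
        return "local minimum"    # positive definite
    if np.all(eig < 0):
        return "local maximum"    # negative definite
    return "saddle point"         # indefinite: eigenvalues of both signs

print(classify_critical_point(np.array([[2.0, 0.0], [0.0, -3.0]])))  # saddle point
```

Note that `eigvalsh` is the right routine here because Clairaut's theorem guarantees the Hessian is symmetric.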
For $f(x, y)$ with critical point $(a, b)$, let $D = f_{xx}(a,b)\,f_{yy}(a,b) - f_{xy}(a,b)^2$ (the determinant of the Hessian):
- $D > 0$ and $f_{xx}(a,b) > 0$: local minimum
- $D > 0$ and $f_{xx}(a,b) < 0$: local maximum
- $D < 0$: saddle point
- $D = 0$: inconclusive
For example, for $f(x, y) = x^2 + y^2$: the only critical point is $(0, 0)$, with $f_{xx} = 2 > 0$ and $D = (2)(2) - 0^2 = 4 > 0$, so $(0, 0)$ is a local minimum with $f(0, 0) = 0$.
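The two-variable test is short enough to state as a helper function. A minimal sketch (the function name and the saddle example $f(x,y) = x^2 - y^2$ are illustrative, not from the text):

```python
def second_derivative_test(fxx, fyy, fxy):
    """Classify a critical point of f(x, y) from its second partials
    via the two-variable second derivative test."""
    D = fxx * fyy - fxy**2    # determinant of the 2x2 Hessian
    if D > 0:
        return "local minimum" if fxx > 0 else "local maximum"
    if D < 0:
        return "saddle point"
    return "inconclusive"     # D = 0: the test gives no information

# e.g. f(x, y) = x**2 - y**2 at (0, 0): fxx = 2, fyy = -2, fxy = 0
print(second_derivative_test(2, -2, 0))  # saddle point
```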
Lagrange Multipliers
To optimize $f(\mathbf{x})$ subject to a constraint $g(\mathbf{x}) = c$, the method of Lagrange multipliers states that at a constrained extremum, $\nabla f = \lambda \nabla g$ for some scalar $\lambda$. This elegant condition says the gradient of $f$ is parallel to the gradient of $g$ at the optimum, meaning $f$ can only change in directions that would violate the constraint.
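For a concrete (hypothetical) instance, consider maximizing $f(x, y) = xy$ subject to $x + y = 10$. The Lagrange conditions $\nabla f = \lambda \nabla g$ together with the constraint give $y = \lambda$, $x = \lambda$, $x + y = 10$, which here happen to be linear in $(x, y, \lambda)$, so they can be solved as a linear system (sketch assuming NumPy):

```python
import numpy as np

# Lagrange conditions for f(x, y) = x*y subject to g(x, y) = x + y = 10:
#   df/dx = λ dg/dx  →  y - λ = 0
#   df/dy = λ dg/dy  →  x - λ = 0
#   constraint       →  x + y = 10
A = np.array([[0.0, 1.0, -1.0],   # y - λ = 0
              [1.0, 0.0, -1.0],   # x - λ = 0
              [1.0, 1.0,  0.0]])  # x + y = 10
b = np.array([0.0, 0.0, 10.0])
x, y, lam = np.linalg.solve(A, b)
print(x, y, lam)  # the constrained extremum is x = y = 5, with f = 25
```

For nonlinear $f$ or $g$ the same conditions generally form a nonlinear system, which would need a symbolic or iterative solver rather than `np.linalg.solve`.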