
Optimization

Active Calculus · CC BY-SA · activecalculus.org

Find the best point. Critical points are where the gradient vanishes. The second derivative test classifies them as maxima, minima, or saddle points. Lagrange multipliers handle optimization with constraints: find the best point on a surface, not just in open space.

Critical points in several variables

A critical point of f(x, y) is a point where both partial derivatives are zero, grad(f) = (0, 0), or where at least one partial derivative fails to exist. These are the only candidates for local maxima and minima in the interior of the domain.
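Finding critical points amounts to solving the system f_x = 0, f_y = 0. As a sketch, here is that computation for a sample function of our own choosing (not one from the text), f(x, y) = x³ − 3xy + y³, using sympy:

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
f = x**3 - 3*x*y + y**3  # example objective, chosen for illustration

# Critical points: solve grad(f) = (0, 0)
grad = [sp.diff(f, v) for v in (x, y)]
critical_points = sp.solve(grad, [x, y], dict=True)
print(critical_points)  # two real critical points: (0, 0) and (1, 1)
```

Solving f_x = 3x² − 3y = 0 and f_y = −3x + 3y² = 0 by hand gives y = x² and x = y², so x = x⁴ and the real solutions are (0, 0) and (1, 1), matching the symbolic result.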


Second derivative test

At a critical point, compute the Hessian determinant D = f_xx*f_yy − (f_xy)^2. If D > 0 and f_xx > 0, it is a local minimum. If D > 0 and f_xx < 0, a local maximum. If D < 0, a saddle point. If D = 0, the test is inconclusive.
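The test can be sketched in a few lines. Continuing with the illustrative function f(x, y) = x³ − 3xy + y³ (our choice, not from the text), whose critical points are (0, 0) and (1, 1):

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
f = x**3 - 3*x*y + y**3  # example function with critical points (0,0) and (1,1)

fxx = sp.diff(f, x, 2)
fyy = sp.diff(f, y, 2)
fxy = sp.diff(f, x, y)
D = sp.simplify(fxx*fyy - fxy**2)  # Hessian determinant

def classify(px, py):
    """Second derivative test at the critical point (px, py)."""
    d = D.subs({x: px, y: py})
    if d > 0:
        return 'min' if fxx.subs({x: px, y: py}) > 0 else 'max'
    if d < 0:
        return 'saddle'
    return 'inconclusive'

print(classify(0, 0))  # saddle
print(classify(1, 1))  # min
```

Here D = 36xy − 9, so D(0, 0) = −9 < 0 (saddle) and D(1, 1) = 27 > 0 with f_xx(1, 1) = 6 > 0 (local minimum).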

[Figure: contour plot near a local max with gradient arrows ∇f]

Lagrange multipliers

To optimize f(x, y) subject to a constraint g(x, y) = 0, find points where grad(f) = lambda * grad(g) and g = 0. At the optimum, the gradient of the objective is parallel to the gradient of the constraint, so f cannot be improved by moving along the constraint curve. Lambda is the Lagrange multiplier: it measures the sensitivity of the optimal value to relaxing the constraint (if the constraint is g(x, y) = c, then lambda is the rate of change of the optimal f with respect to c).
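As a worked sketch with an example of our own choosing: maximize f(x, y) = xy on the unit circle g(x, y) = x² + y² − 1 = 0. Solving the Lagrange system symbolically:

```python
import sympy as sp

x, y, lam = sp.symbols('x y lambda', real=True)
f = x*y                   # objective (illustrative choice)
g = x**2 + y**2 - 1       # constraint g(x, y) = 0: the unit circle

# Lagrange conditions: grad(f) = lambda * grad(g), plus the constraint itself
eqs = [sp.diff(f, x) - lam*sp.diff(g, x),
       sp.diff(f, y) - lam*sp.diff(g, y),
       g]
sols = sp.solve(eqs, [x, y, lam], dict=True)
values = [f.subs(s) for s in sols]
print(max(values), min(values))  # 1/2 -1/2
```

The four candidate points are (±1/√2, ±1/√2); the maximum 1/2 occurs where x and y share a sign (with λ = 1/2), and the minimum −1/2 where they differ.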


Notation reference

Symbol                      Meaning
∇f = 0                      Critical point condition
D = f_xx f_yy − (f_xy)²     Hessian determinant
∇f = λ∇g                    Lagrange condition
λ                           Lagrange multiplier (shadow price of the constraint)