Optimization is one of the most practical applications of calculus. The fundamental insight is that at a local maximum or minimum, the derivative of the function is zero (or undefined). By finding these critical points and classifying them, you can solve a wide range of real-world problems.
The process follows a standard framework. First, translate the word problem into a mathematical function. Identify the quantity to optimize and express it in terms of a single variable. Second, find the derivative and set it equal to zero. Third, solve for the critical points. Fourth, classify each critical point as a maximum, minimum, or neither using the first or second derivative test. Fifth, verify that the result makes sense in the context of the problem.
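The five steps above can be sketched concretely. A minimal worked example, using the classic fixed-perimeter rectangle (all names here are illustrative, not from the original text):

```python
# Five-step framework on a hypothetical example: maximize the area of a
# rectangle with fixed perimeter P.
# Step 1: with width x, the length is P/2 - x, so A(x) = x * (P/2 - x).
# Steps 2-3: A'(x) = P/2 - 2x; setting A'(x) = 0 gives x = P/4.
# Step 4: A''(x) = -2 < 0, so x = P/4 is a local maximum.
# Step 5: sanity check — width equals length, i.e. the rectangle is a square.

def optimal_width(P: float) -> float:
    """Critical point of A(x) = x*(P/2 - x), solved from A'(x) = P/2 - 2x = 0."""
    return P / 4

def area(P: float, x: float) -> float:
    return x * (P / 2 - x)

P = 20.0
x_star = optimal_width(P)
print(x_star, area(P, x_star))  # 5.0 25.0 — a 5-by-5 square
```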
The first derivative test examines sign changes of the derivative around a critical point. If the derivative changes from positive to negative, the function has a local maximum. If it changes from negative to positive, the function has a local minimum. If the sign does not change, the point is neither.
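The sign-change check can be automated numerically. A sketch, assuming a smooth function and using f(x) = x³ − 3x (with f′(x) = 3x² − 3 and critical points at x = ±1) as a hypothetical example:

```python
# First derivative test by numeric sign check, assuming f is smooth near c.
# Example: f(x) = x**3 - 3*x has f'(x) = 3*x**2 - 3, critical points x = -1, 1.

def fprime(x: float) -> float:
    return 3 * x**2 - 3

def classify(c: float, h: float = 1e-3) -> str:
    """Classify critical point c by the sign change of f' across it."""
    left, right = fprime(c - h), fprime(c + h)
    if left > 0 > right:
        return "local max"   # f' changes + to -
    if left < 0 < right:
        return "local min"   # f' changes - to +
    return "neither"

print(classify(-1.0))  # local max
print(classify(1.0))   # local min
```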
The second derivative test is often quicker. Evaluate the second derivative at the critical point. If the second derivative is negative, the function is concave down, giving a local maximum. If positive, concave up, giving a local minimum. If zero, the test is inconclusive and you must use the first derivative test.
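The three cases of the second derivative test map directly to a small decision function. A sketch, again using f(x) = x³ − 3x (so f″(x) = 6x) as the assumed example:

```python
# Second derivative test as a decision function. For f(x) = x**3 - 3*x,
# f''(x) = 6*x, so f''(-1) = -6 and f''(1) = 6.

def second_derivative_test(fpp_at_c: float) -> str:
    if fpp_at_c < 0:
        return "local max"    # concave down at the critical point
    if fpp_at_c > 0:
        return "local min"    # concave up at the critical point
    return "inconclusive"     # fall back to the first derivative test

print(second_derivative_test(6 * -1))  # local max at x = -1
print(second_derivative_test(6 * 1))   # local min at x = 1
print(second_derivative_test(0))       # inconclusive, e.g. f(x) = x**4 at x = 0
```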
Classic optimization problems include: finding the dimensions of a rectangle with maximum area for a fixed perimeter (a square); minimizing the surface area of a cylinder for a fixed volume (height equals diameter); and finding the minimum distance from a point to a curve. Each requires setting up the objective function, finding its derivative, and solving.
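The cylinder problem illustrates the full setup-and-solve pattern. A sketch, with the substitution worked out in the comments (the volume value is an arbitrary assumption):

```python
import math

# Cylinder with fixed volume V: surface area S = 2*pi*r**2 + 2*pi*r*h.
# The constraint V = pi*r**2*h gives h = V/(pi*r**2), so
# S(r) = 2*pi*r**2 + 2*V/r.  Setting S'(r) = 4*pi*r - 2*V/r**2 = 0
# gives r = (V/(2*pi))**(1/3), and then h = V/(pi*r**2) = 2*r:
# the optimal height equals the diameter.

def optimal_radius(V: float) -> float:
    return (V / (2 * math.pi)) ** (1 / 3)

def height(V: float, r: float) -> float:
    return V / (math.pi * r**2)

V = 100.0  # arbitrary fixed volume for illustration
r = optimal_radius(V)
print(height(V, r) / r)  # ratio of height to radius at the optimum
```

For any positive volume the ratio h/r at the critical point is exactly 2, confirming the height-equals-diameter result stated above.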
Applied problems add complexity. A manufacturer might want to maximize profit, where profit equals revenue minus cost, and cost typically involves fixed costs plus variable costs that depend on the production quantity. Setting the derivative of profit to zero shows that the optimal production level occurs where marginal cost equals marginal revenue.
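A small numeric illustration of the marginal-cost-equals-marginal-revenue condition, with entirely hypothetical price and cost figures:

```python
# Hypothetical firm: revenue R(q) = 50*q, cost C(q) = 200 + 2*q + 0.5*q**2.
# Profit P(q) = R(q) - C(q); P'(q) = 50 - (2 + q) = 0 at q = 48,
# exactly where marginal revenue R'(q) equals marginal cost C'(q).

def marginal_revenue(q: float) -> float:
    return 50.0              # R'(q) for R(q) = 50*q

def marginal_cost(q: float) -> float:
    return 2.0 + q           # C'(q) for C(q) = 200 + 2*q + 0.5*q**2

def profit(q: float) -> float:
    return 50.0 * q - (200.0 + 2.0 * q + 0.5 * q**2)

q_star = 48.0                # solves marginal_revenue(q) == marginal_cost(q)
print(marginal_revenue(q_star) == marginal_cost(q_star))  # True
print(profit(q_star))                                     # 952.0
```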
Constrained optimization adds a constraint on the variables. For problems with two variables and one constraint, you can often eliminate one variable using the constraint and reduce to a single-variable problem. The quadratic formula helps when the resulting derivative is a quadratic expression.
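The quadratic-derivative case comes up naturally in box problems. A hypothetical example: cutting corner squares of side x from a 12-by-12 sheet and folding an open box reduces the problem to one variable, and the derivative is quadratic:

```python
import math

# Open box from a 12-by-12 sheet with corner squares of side x:
# V(x) = x*(12 - 2*x)**2 = 4*x**3 - 48*x**2 + 144*x.
# V'(x) = 12*x**2 - 96*x + 144 is quadratic, so the quadratic formula
# yields the critical points directly.

def critical_points(a: float, b: float, c: float) -> tuple:
    """Roots of a*x**2 + b*x + c = 0 via the quadratic formula."""
    disc = math.sqrt(b**2 - 4 * a * c)
    return ((-b - disc) / (2 * a), (-b + disc) / (2 * a))

roots = critical_points(12, -96, 144)
print(roots)  # (2.0, 6.0): x = 2 maximizes V, x = 6 gives a degenerate box
```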
The extreme value theorem guarantees that a continuous function on a closed bounded interval attains both a maximum and a minimum. To find the global extrema, evaluate the function at all critical points within the interval and at both endpoints. The largest value is the global maximum; the smallest is the global minimum.
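The closed-interval procedure is a short comparison loop. A sketch using f(x) = x³ − 3x on the assumed interval [0, 2], whose only interior critical point is x = 1:

```python
# Closed-interval method: evaluate f at interior critical points and at
# both endpoints, then compare. Example: f(x) = x**3 - 3*x on [0, 2];
# f'(x) = 3*x**2 - 3 = 0 gives x = 1 inside the interval.

def f(x: float) -> float:
    return x**3 - 3 * x

candidates = [0.0, 1.0, 2.0]           # endpoints plus the critical point
values = {x: f(x) for x in candidates}
global_max = max(values, key=values.get)
global_min = min(values, key=values.get)
print(global_max, values[global_max])  # 2.0 2.0  (at the right endpoint)
print(global_min, values[global_min])  # 1.0 -2.0 (at the critical point)
```

Note that here the global maximum falls at an endpoint rather than a critical point, which is why the endpoints must always be checked.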
Multivariable optimization uses partial derivatives. Set the gradient equal to the zero vector to find critical points. The second derivative test for two variables involves the Hessian determinant. If the determinant is positive and the second partial with respect to x is positive, the point is a local minimum. If the determinant is positive and the second partial with respect to x is negative, it is a local maximum. If the determinant is negative, it is a saddle point.
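The two-variable classification above reduces to a small decision function once the second partials at the critical point are known. A sketch, with the example functions chosen for illustration:

```python
# Two-variable second derivative test, assuming the second partials
# fxx, fyy, fxy are already evaluated at the critical point.

def classify_2d(fxx: float, fyy: float, fxy: float) -> str:
    D = fxx * fyy - fxy**2      # Hessian determinant at the critical point
    if D > 0:
        return "local min" if fxx > 0 else "local max"
    if D < 0:
        return "saddle point"
    return "inconclusive"       # the test gives no information when D = 0

# f(x, y) = x**2 + y**2 at (0, 0): fxx = fyy = 2, fxy = 0
print(classify_2d(2, 2, 0))     # local min
# f(x, y) = x**2 - y**2 at (0, 0): fxx = 2, fyy = -2, fxy = 0
print(classify_2d(2, -2, 0))    # saddle point
```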