Nonlinear Optimization Examples
Kuhn-Tucker Conditions
The nonlinear programming (NLP) problem with one objective function $f$ and $m$ constraint functions $c_i$, which are continuously differentiable, is defined as follows:
\[
\min f(x), \quad x \in \mathbb{R}^n, \quad \text{subject to} \quad
c_i(x) = 0,\; i = 1,\dots,m_e, \qquad
c_i(x) \ge 0,\; i = m_e+1,\dots,m .
\]
In the preceding notation, $n$ is the dimension of the function $f(x)$, and $m_e$ is the number of equality constraints. The linear combination of objective and constraint functions
\[
L(x,\lambda) = f(x) - \sum_{i=1}^{m} \lambda_i\, c_i(x)
\]
is the Lagrange function, and the coefficients $\lambda_i$ are the Lagrange multipliers.
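For a concrete illustration (a hypothetical problem chosen here, not one defined in this section), take $n = 2$ with a single equality constraint ($m = m_e = 1$),
\[
f(x) = (x_1-1)^2 + (x_2-2)^2, \qquad c_1(x) = x_1 + x_2 - 1,
\]
so that the Lagrange function is
\[
L(x,\lambda) = (x_1-1)^2 + (x_2-2)^2 - \lambda_1\,(x_1 + x_2 - 1).
\]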
If the functions $f$ and $c_i$ are twice differentiable, the point $x^*$ is an isolated local minimizer of the NLP problem if there exists a vector $\lambda^* = (\lambda_1^*,\dots,\lambda_m^*)$ that meets the following conditions:
Kuhn-Tucker conditions:
\[
c_i(x^*) = 0, \quad i = 1,\dots,m_e,
\]
\[
c_i(x^*) \ge 0, \quad \lambda_i^* \ge 0, \quad \lambda_i^*\, c_i(x^*) = 0, \quad i = m_e+1,\dots,m,
\]
\[
\nabla_x L(x^*,\lambda^*) = 0 .
\]
Second-order condition: each nonzero vector $y \in \mathbb{R}^n$ with
\[
y^{T}\, \nabla_x c_i(x^*) = 0, \quad i = 1,\dots,m_e, \ \text{and all } i \in \{m_e+1,\dots,m\} \text{ with } \lambda_i^* > 0,
\]
satisfies
\[
y^{T}\, \nabla_x^2 L(x^*,\lambda^*)\, y > 0 .
\]
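The following SAS/IML fragment is a minimal sketch of checking these conditions numerically for the hypothetical problem above, with the candidate point and multiplier worked out by hand; the variable names are illustrative and are not part of any SAS routine.

proc iml;
   /* hypothetical problem:  minimize f(x) = (x1-1)##2 + (x2-2)##2
      subject to  c1(x) = x1 + x2 - 1 = 0                              */
   xs     = {0, 1};          /* candidate minimizer x* (derived by hand) */
   lambda = -2;              /* Lagrange multiplier lambda_1*            */

   /* first-order Kuhn-Tucker check: grad_x L(x*,lambda*) should be 0   */
   gradF = 2*(xs - {1, 2});  /* gradient of f at x*                      */
   gradC = {1, 1};           /* gradient of c1 (constant)                */
   gradL = gradF - lambda*gradC;
   print gradL;              /* zero vector => first-order conditions hold */

   /* second-order check: for nonzero y with y`*gradC = 0,
      require y`*hessL*y > 0                                             */
   hessL = 2*i(2);           /* Hessian of L at (x*,lambda*) is 2*I      */
   y = {1, -1};              /* spans the null space of gradC`           */
   print (y`*hessL*y)[label="curvature"];   /* = 4 > 0                   */
quit;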
In practice, you cannot expect the constraint functions $c_i(x^*)$ to vanish within machine precision, and determining the set of active constraints at the solution $x^*$ might not be simple.
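As a rough illustration of one workaround (an assumption about common practice, not a prescription from this section), an inequality constraint is often treated as active when its value lies within a small, problem-dependent tolerance of zero:

proc iml;
   /* hypothetical constraint values c_i(x*) and an arbitrary tolerance */
   c   = 1e-9 // 0.5 // -2e-10;
   tol = 1e-8;
   active = ( abs(c) < tol );   /* 1 = treated as active, 0 = inactive  */
   print active;
quit;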