The NLP Procedure
PROC NLP solves the general nonlinear programming problem

$$\min_{x \in \mathbb{R}^n} f(x) \quad \text{subject to} \quad c_i(x) = 0,\ i = 1,\dots,m_e, \qquad c_i(x) \ge 0,\ i = m_e+1,\dots,m$$

where $f$ is the objective function and the $c_i$'s are the constraint functions.
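For example, a small instance of this problem (the specific objective and constraints here are illustrative, not taken from this chapter) can be stated in PROC NLP as follows:

   proc nlp;
      min f;                     /* objective function f(x)               */
      decvar x1 x2 = 0 0;        /* decision variables and starting point */
      lincon x1 + x2 = 1;        /* equality constraint  c1(x) = 0        */
      nlincon c2 >= 0;           /* inequality constraint c2(x) >= 0      */
      f  = x1**2 + x2**2;
      c2 = 1 - x1**2 - x2**2;    /* keeps the solution in the unit disk   */
   run;

The MIN, DECVAR, LINCON, and NLINCON statements declare the pieces of the problem; the program statements after them compute the objective and constraint values.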
A point $x \in \mathbb{R}^n$ is feasible if it satisfies all the constraints. The feasible region $\mathcal{G}$ is the set of all feasible points. A feasible point $x^*$ is a global solution of the preceding problem if no point in $\mathcal{G}$ has a smaller function value than $f(x^*)$. A feasible point $x^*$ is a local solution of the problem if there exists some open neighborhood surrounding $x^*$ in which no point has a smaller function value than $f(x^*)$.
Nonlinear programming algorithms cannot consistently find global minima. All the algorithms in PROC NLP find a local minimum for this problem. If you need to check whether the obtained solution is a global minimum, you may have to run PROC NLP with different starting points, obtained either at random or by selecting a point on a grid that contains $\mathcal{G}$.
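As a sketch of this strategy (the objective here is a hypothetical function with two distinct local minima), you can submit the same problem from several starting points and compare the objective values reported in the OUTEST= data sets:

   proc nlp outest=run1;
      min f;
      decvar x1 x2 = -2 0;       /* first starting point  */
      f = x1**4 - 4*x1**2 + x1 + x2**2;
   run;

   proc nlp outest=run2;
      min f;
      decvar x1 x2 = 2 0;        /* second starting point */
      f = x1**4 - 4*x1**2 + x1 + x2**2;
   run;

The two runs converge to different local minima; the smaller of the two objective values is the better candidate for the global solution.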
Every local minimizer $x^*$ of this problem satisfies the following local optimality conditions:

1. The gradient (vector of first derivatives) $g(x^*) = \nabla f(x^*)$ of the objective function (projected toward the feasible region if the problem is constrained) is zero.
2. The Hessian (matrix of second derivatives) $G(x^*) = \nabla^2 f(x^*)$ of the objective function (projected toward the feasible region if the problem is constrained) is positive definite.
Most of the optimization algorithms in PROC NLP use iterative techniques that result in a sequence of points $x^0, \dots, x^k, \dots$ that converges to a local solution $x^*$. At the solution, PROC NLP performs tests to confirm that the (projected) gradient is close to zero and that the (projected) Hessian matrix is positive definite.
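For example (a sketch; GCONV= names the relative gradient convergence criterion, and the objective is the well-known Rosenbrock function), you can tighten the tolerance used in these tests:

   proc nlp tech=newrap gconv=1e-10;
      min f;
      decvar x1 x2 = -1.2 1;
      f = 100*(x2 - x1**2)**2 + (1 - x1)**2;   /* Rosenbrock function */
   run;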
An important tool in the analysis and design of algorithms in constrained optimization is the Lagrangian function, a linear combination of the objective function and the constraints:

$$L(x, \lambda) = f(x) - \sum_{i=1}^{m} \lambda_i c_i(x)$$

The coefficients $\lambda_i$ are called Lagrange multipliers.
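For instance, with $f(x) = x_1^2 + x_2^2$ and the single equality constraint $c_1(x) = x_1 + x_2 - 1 = 0$, the Lagrangian is

$$L(x, \lambda) = x_1^2 + x_2^2 - \lambda_1 (x_1 + x_2 - 1)$$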
This tool makes it possible to state necessary and sufficient
conditions for a local minimum. The various algorithms in PROC NLP
create sequences of points, each of which is closer than the previous
one to satisfying these conditions.
Assuming that the functions $f$ and $c_i$ are twice continuously differentiable, the point $x^*$ is a local minimum of the nonlinear programming problem if there exists a vector $\lambda^* = (\lambda_1^*, \dots, \lambda_m^*)$ that meets the following conditions.

1. First-order Karush-Kuhn-Tucker conditions:

$$c_i(x^*) = 0, \quad i = 1,\dots,m_e$$
$$c_i(x^*) \ge 0, \quad \lambda_i^* \ge 0, \quad \lambda_i^* c_i(x^*) = 0, \quad i = m_e+1,\dots,m$$
$$\nabla_x L(x^*, \lambda^*) = 0$$

2. Second-order conditions: each nonzero vector $y \in \mathbb{R}^n$ that satisfies

$$y^T \nabla_x c_i(x^*) = 0 \quad \text{for } i = 1,\dots,m_e \text{ and for all } i \in \{m_e+1,\dots,m\} \text{ with } \lambda_i^* > 0$$

also satisfies

$$y^T \nabla_x^2 L(x^*, \lambda^*)\, y > 0$$
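As a worked illustration of these conditions, take $f(x) = x_1^2 + x_2^2$ with the single equality constraint $c_1(x) = x_1 + x_2 - 1 = 0$. Solving

$$\nabla_x L(x, \lambda) = \begin{pmatrix} 2x_1 - \lambda_1 \\ 2x_2 - \lambda_1 \end{pmatrix} = 0, \qquad c_1(x) = 0$$

gives $x^* = (\tfrac{1}{2}, \tfrac{1}{2})$ and $\lambda_1^* = 1$. Every nonzero $y$ with $y^T \nabla_x c_1(x^*) = y_1 + y_2 = 0$ satisfies $y^T \nabla_x^2 L(x^*, \lambda^*)\, y = 2(y_1^2 + y_2^2) > 0$, so both the first- and second-order conditions hold.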
Most of the algorithms to solve this problem attempt to find a combination of vectors $x$ and $\lambda$ for which the gradient of the Lagrangian function with respect to $x$ is zero.
The first- and second-order conditions of optimality are based on first and second derivatives of the objective function $f$ and the constraints $c_i$.
The gradient vector contains the first derivatives of the objective function $f$ with respect to the parameters $x_1,\dots,x_n$, as follows:

$$g(x) = \nabla f(x) = \left( \frac{\partial f}{\partial x_j} \right)_{j=1,\dots,n}$$
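As a hedged sketch (the names g1 and g2 are arbitrary), analytic first derivatives can be supplied through the GRADIENT statement, whose variables give $\partial f / \partial x_j$ in the order of the DECVAR statement:

   proc nlp tech=newrap;
      min f;
      decvar x1 x2 = -1.2 1;
      gradient g1 g2;                          /* g1 = df/dx1, g2 = df/dx2 */
      f  = 100*(x2 - x1**2)**2 + (1 - x1)**2;  /* Rosenbrock function      */
      g1 = -400*x1*(x2 - x1**2) - 2*(1 - x1);
      g2 =  200*(x2 - x1**2);
   run;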
The $n \times n$ symmetric Hessian matrix contains the second derivatives of the objective function $f$ with respect to the parameters $x_1,\dots,x_n$, as follows:

$$G(x) = \nabla^2 f(x) = \left( \frac{\partial^2 f}{\partial x_j \, \partial x_k} \right)_{j,k=1,\dots,n}$$
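A similar sketch for second derivatives (assuming the HESSIAN statement expects the $n(n+1)/2$ elements of the lower triangle of the symmetric Hessian, listed by rows):

   proc nlp tech=newrap;
      min f;
      decvar x1 x2 = 1 1;
      gradient g1 g2;
      hessian h11 h21 h22;        /* lower triangle of the Hessian, by rows */
      f   = x1**4 + x1*x2 + (1 + x2)**2;
      g1  = 4*x1**3 + x2;
      g2  = x1 + 2*(1 + x2);
      h11 = 12*x1**2;
      h21 = 1;
      h22 = 2;
   run;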
For least-squares problems, the $m \times n$ Jacobian matrix contains the first-order derivatives of the $m$ objective functions $f_i(x)$ with respect to the parameters $x_1,\dots,x_n$, as follows:

$$J(x) = \left( \frac{\partial f_i}{\partial x_j} \right) = \left( \nabla f_1, \dots, \nabla f_m \right)^T$$
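For example (a sketch; TECH=LEVMAR selects the Levenberg-Marquardt method, which exploits this Jacobian), the Rosenbrock problem can be posed in least-squares form with two functions $f_1$ and $f_2$:

   proc nlp tech=levmar;
      lsq f1 f2;                  /* minimize the sum of squares of f1, f2 */
      decvar x1 x2 = -1.2 1;
      f1 = 10*(x2 - x1**2);       /* residual functions */
      f2 = 1 - x1;
   run;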
The $mc \times n$ Jacobian matrix contains the first-order derivatives of the $mc$ nonlinear constraint functions $c_i(x)$, $i = 1,\dots,mc$, with respect to the parameters $x_1,\dots,x_n$, as follows:

$$CJ(x) = \left( \frac{\partial c_i}{\partial x_j} \right)$$
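As an illustration (a sketch; by default PROC NLP derives this Jacobian from the program statements, and the JACNLC statement can supply it explicitly), a nonlinear constraint is declared in the NLINCON statement and computed in the program statements:

   proc nlp tech=quanew;
      min f;
      decvar x1 x2 = 0 0;
      nlincon c1 >= 0;                  /* nonlinear constraint c1(x) >= 0 */
      f  = (x1 - 2)**2 + (x2 - 1)**2;
      c1 = 1 - x1**2 - x2**2;           /* feasible region: the unit disk  */
   run;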
PROC NLP provides three ways to compute derivatives:

1. It computes analytical first- and second-order derivatives of the objective function with respect to the $n$ variables $x_j$.
2. It computes first- and second-order finite-difference approximations to the derivatives.
3. The user supplies formulas for analytical or numerical first- and second-order derivatives of the objective function in the GRADIENT, JACOBIAN, CRPJAC, and HESSIAN statements (and for the nonlinear constraints in the JACNLC statement).
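A minimal sketch of the second approach (assuming the FD option, which requests finite-difference approximations when analytic derivatives are impractical; the objective is hypothetical):

   proc nlp fd;              /* approximate derivatives by finite differences */
      min f;
      decvar x1 x2 = 1 1;
      f = x1**2 + x2**2 + sin(x1*x2);
   run;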