Procedures in Online Documentation
The legacy NLP procedure offers a set of optimization techniques for minimizing or maximizing a continuous nonlinear function f of n decision variables with boundary, general linear, and nonlinear equality and inequality constraints. The NLP procedure supports a number of algorithms for solving this problem that take advantage of special structure in the objective function and the constraints. Two algorithms are especially designed for quadratic optimization problems, and two others are provided for the efficient solution of nonlinear least squares problems.
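To illustrate the problem class (minimizing a smooth function subject to boundary constraints), here is a minimal pure-Python sketch using projected gradient descent; the objective, bounds, and step size are invented for the example, and this is not PROC NLP syntax or its actual algorithm:

```python
# Minimize f(x, y) = (x - 3)^2 + (y + 1)^2 subject to 0 <= x <= 2, 0 <= y <= 5.
# Projected gradient descent: step along the negative gradient, then clip
# back into the feasible box. Illustrative only -- PROC NLP uses more
# sophisticated active-set and Newton-type methods.

def grad(x, y):
    return (2.0 * (x - 3.0), 2.0 * (y + 1.0))

def clip(v, lo, hi):
    return max(lo, min(hi, v))

x, y = 1.0, 1.0          # feasible starting point
step = 0.1
for _ in range(200):
    gx, gy = grad(x, y)
    x = clip(x - step * gx, 0.0, 2.0)   # boundary constraints on x
    y = clip(y - step * gy, 0.0, 5.0)   # boundary constraints on y

# The unconstrained minimizer (3, -1) is infeasible; the constrained
# solution lies on the boundary of the box, at (2, 0).
print(x, y)
```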
The DATA= data set specifies special kinds of objective functions.
The INQUAD= data set specifies a quadratic programming problem.
The INEST= or INVAR= data set specifies initial values for the decision variables, the values of constants referred to in the program statements, and simple boundary and general linear constraints.
The MODEL= data set specifies a model saved from a previous execution of the NLP procedure.
The OUTEST= data set or OUTVAR= data set contains the values of the decision variables at optimality, derivatives at the solution, and covariance matrices. This data set can be used as an INEST= data set in subsequent calls of PROC NLP.
The OUT= data set contains variables generated in the program statements that define the objective function (and perhaps derivatives) plus selected variables of the DATA= data set if available.
The OUTMOD= data set saves the program statements. It can be used to input the model into a subsequent execution of PROC NLP through the MODEL= data set.
quadratic optimization
- linear complementarity problem
- quadratic active set technique

general nonlinear optimization
- trust-region method
- Newton-Raphson method with line search
- Newton-Raphson method with ridging
- quasi-Newton methods (DBFGS, DDFP, BFGS, and DFP)
- double-dogleg method (DBFGS and DDFP)
- conjugate gradient methods (PB, FR, PR, and CD)
- Nelder-Mead simplex method

nonlinear least squares
- Levenberg-Marquardt method
- hybrid quasi-Newton methods (DBFGS and DDFP)
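To give a flavor of the least squares techniques, the following is a minimal pure-Python sketch of the Levenberg-Marquardt idea (Gauss-Newton with adaptive damping) on an invented exponential-fit problem; PROC NLP's actual implementation is considerably more elaborate:

```python
import math

# Fit the model y = a * exp(b * t) to exact data generated with a=2, b=0.5.
t_data = [0.0, 0.5, 1.0, 1.5, 2.0]
y_data = [2.0 * math.exp(0.5 * t) for t in t_data]

def residuals(a, b):
    return [a * math.exp(b * t) - y for t, y in zip(t_data, y_data)]

def jacobian(a, b):
    # One row (dr/da, dr/db) per data point.
    return [(math.exp(b * t), a * t * math.exp(b * t)) for t in t_data]

def cost(a, b):
    return sum(r * r for r in residuals(a, b))

a, b, lam = 1.0, 0.0, 1e-3          # starting point and damping parameter
for _ in range(200):
    r = residuals(a, b)
    J = jacobian(a, b)
    # Damped normal equations (J^T J + lam*I) delta = -J^T r, a 2x2 system
    # solved here by Cramer's rule.
    jtj00 = sum(j0 * j0 for j0, _ in J) + lam
    jtj11 = sum(j1 * j1 for _, j1 in J) + lam
    jtj01 = sum(j0 * j1 for j0, j1 in J)
    g0 = sum(j0 * ri for (j0, _), ri in zip(J, r))
    g1 = sum(j1 * ri for (_, j1), ri in zip(J, r))
    det = jtj00 * jtj11 - jtj01 * jtj01
    da = (-g0 * jtj11 + g1 * jtj01) / det
    db = (-g1 * jtj00 + g0 * jtj01) / det
    if cost(a + da, b + db) < cost(a, b):
        a, b, lam = a + da, b + db, lam / 10.0   # accept step, reduce damping
    else:
        lam *= 10.0                              # reject step, increase damping

print(a, b)   # converges to roughly (2, 0.5)
```

Large lam makes the step behave like a short steepest-descent step; small lam recovers the fast Gauss-Newton step, which is why the method works well on zero-residual problems like this one.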
All optimization techniques in PROC NLP work iteratively. The quasi-Newton methods and the Nelder-Mead simplex method support nonlinear constraints on the decision variables.
If the variables are not subject to nonlinear constraints, the optimization and least squares techniques are feasible-point methods: they move iteratively from a feasible point x(k) to a better feasible point x(k+1) by a step along the search direction s(k), for k=1,2,3,.... If a feasible starting point x(0) is not provided, each of the optimization methods begins by calling an algorithm that attempts to compute a starting point that is feasible with respect to the boundary and linear constraints.
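The iteration x(k+1) = x(k) + alpha * s(k) can be sketched for the unconstrained case as steepest descent with a backtracking (Armijo) line search; the test function and constants here are invented for illustration and do not reflect PROC NLP's internal line-search rules:

```python
# Iterate x(k+1) = x(k) + alpha * s(k), with s(k) = -grad f (steepest descent)
# and alpha found by backtracking until the Armijo sufficient-decrease
# condition holds. Test function invented for illustration.

def f(p):
    x, y = p
    return x * x + 5.0 * y * y

def grad(p):
    x, y = p
    return (2.0 * x, 10.0 * y)

p = (4.0, -2.0)                       # starting point x(0)
for _ in range(200):
    g = grad(p)
    s = (-g[0], -g[1])                # search direction s(k)
    alpha, c = 1.0, 1e-4
    # Backtrack: halve alpha until f decreases enough along s.
    while f((p[0] + alpha * s[0], p[1] + alpha * s[1])) > \
          f(p) + c * alpha * (g[0] * s[0] + g[1] * s[1]):
        alpha *= 0.5
    p = (p[0] + alpha * s[0], p[1] + alpha * s[1])

print(p)   # approaches the minimizer (0, 0)
```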
All the NLP optimization techniques require a continuous objective function f, and all except the Nelder-Mead simplex method require continuous first-order derivatives of the objective function f. Some of the techniques also require continuous second-order derivatives. There are three ways to compute the derivatives in PROC NLP:
analytically (using a special derivative compiler), which is the default method
by finite difference approximations
by user-supplied exact or approximate numerical functions
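As a sketch of the second option, a central finite-difference approximation to the gradient can be written as follows; the function and step size are illustrative and are not PROC NLP's internal differencing scheme:

```python
import math

# Central difference: df/dx_i ~ (f(x + h*e_i) - f(x - h*e_i)) / (2h),
# with O(h^2) truncation error for twice continuously differentiable f.

def fd_gradient(f, x, h=1e-6):
    g = []
    for i in range(len(x)):
        xp = list(x); xp[i] += h
        xm = list(x); xm[i] -= h
        g.append((f(xp) - f(xm)) / (2.0 * h))
    return g

def f(x):
    return math.sin(x[0]) + x[1] ** 2

# Analytic gradient at (0.3, 2.0) is (cos(0.3), 4.0).
approx = fd_gradient(f, [0.3, 2.0])
print(approx)
```

The step size h trades truncation error (smaller h is better) against floating-point cancellation (larger h is better), which is why production codes scale h to the magnitude of each variable.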