The NLP Procedure
Since nonlinear optimization is an iterative process that depends on many factors, it is difficult to estimate how much computer time is necessary to compute an optimal solution satisfying one of the termination criteria. The MAXTIME=, MAXITER=, and MAXFUNC= options can be used to restrict the amount of real time, the number of iterations, and the number of function calls in a single run of PROC NLP.
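As a loose illustration of how such budgets terminate an iterative optimizer (this is a generic Python sketch, not PROC NLP's implementation; the function names and default limits are invented for the example), consider a simple gradient-descent loop that stops on an iteration budget, a function-call budget, or a wall-clock limit, analogous to MAXITER=, MAXFUNC=, and MAXTIME=:

```python
import time

def minimize_with_budgets(f, grad, x0, max_iter=100, max_func=500,
                          max_time=5.0, step=0.1, gtol=1e-8):
    """Gradient descent with budget-based termination criteria,
    loosely analogous to PROC NLP's MAXITER=, MAXFUNC=, MAXTIME=."""
    start = time.monotonic()
    x = list(x0)
    n_func = 0
    for it in range(max_iter):                      # MAXITER= analogue
        g = grad(x)
        n_func += 1                                 # count each evaluation
        if n_func >= max_func:                      # MAXFUNC= analogue
            return x, "max function calls reached"
        if time.monotonic() - start > max_time:     # MAXTIME= analogue
            return x, "time limit reached"
        if sum(gi * gi for gi in g) ** 0.5 < gtol:  # convergence test
            return x, "converged"
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x, "max iterations reached"

# Minimize f(x) = (x1 - 2)^2 + (x2 + 1)^2, minimum at (2, -1)
f = lambda x: (x[0] - 2) ** 2 + (x[1] + 1) ** 2
grad = lambda x: [2 * (x[0] - 2), 2 * (x[1] + 1)]
x, status = minimize_with_budgets(f, grad, [0.0, 0.0])
```

Whichever budget is exhausted first determines the termination message; on this well-conditioned quadratic the convergence test fires well within the default limits.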
In each iteration, the NRRIDG and LEVMAR techniques use symmetric Householder transformations to decompose the Hessian (crossproduct Jacobian) matrix and compute the search direction.
The larger the problem, the more time is spent computing function values and derivatives. Therefore, many researchers compare optimization techniques by counting and comparing the respective numbers of function, gradient, and Hessian (crossproduct Jacobian) evaluations. You can save computer time and memory by specifying derivatives (using the GRADIENT, JACOBIAN, CRPJAC, or HESSIAN statement) since you will typically produce a more efficient representation than the internal derivative compiler.
Finite-difference approximations of the derivatives are expensive since they require additional function or gradient calls.
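To make the cost concrete, here is a hedged Python sketch (an illustration of the general technique, not PROC NLP's internal code): a forward-difference gradient needs one function call at the base point plus one per decision variable, so n + 1 calls per gradient (central differences need 2n), whereas an analytic gradient, such as one supplied with the GRADIENT statement, needs none:

```python
import numpy as np

calls = 0  # global counter of objective evaluations

def f(x):
    """Objective with an evaluation counter."""
    global calls
    calls += 1
    return (x[0] - 2) ** 2 + (x[1] + 1) ** 2 + x[0] * x[1]

def fd_gradient(func, x, h=1e-7):
    """Forward-difference gradient approximation: one call at x plus
    one perturbed call per component, i.e. n + 1 calls in total."""
    fx = func(x)
    g = np.empty_like(x)
    for i in range(x.size):
        xh = x.copy()
        xh[i] += h
        g[i] = (func(xh) - fx) / h
    return g

x = np.array([0.0, 0.0])
g = fd_gradient(f, x)
# With n = 2 decision variables, this single gradient approximation
# consumed 3 objective evaluations.
```

Over hundreds of iterations this per-gradient overhead dominates, which is why supplying analytic derivatives typically saves substantial time.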
The following table shows, for each optimization technique, which derivatives are needed (FOD: first-order derivatives; SOD: second-order derivatives), what kinds of constraints are supported (BC: boundary constraints; LIC: linear constraints), and the minimal memory required (in number of double floating point numbers). For various reasons, some additional double floating point numbers are needed beyond this technique-specific minimum.
Quadratic Programming | FOD | SOD | BC | LIC | Memory
LICOMP                |  -  |  -  |  x |  x  |
QUADAS                |  -  |  -  |  x |  x  |

General Optimization  | FOD | SOD | BC | LIC | Memory
TRUREG                |  x  |  x  |  x |  x  |
NEWRAP                |  x  |  x  |  x |  x  |
NRRIDG                |  x  |  x  |  x |  x  |
QUANEW                |  x  |  -  |  x |  x  |
DBLDOG                |  x  |  -  |  x |  x  |
CONGRA                |  x  |  -  |  x |  x  |
NMSIMP                |  -  |  -  |  x |  x  |

Least-Squares         | FOD | SOD | BC | LIC | Memory
LEVMAR                |  x  |  -  |  x |  x  |
HYQUAN                |  x  |  -  |  x |  x  |
The total amount of memory needed to run an optimization technique consists of the technique-specific memory listed in the preceding table, plus additional blocks of memory as shown in the following table.
Block               | double | int | long | 8byte
Basic Requirement   |        |     |      |
DATA= data set      |   -    |  -  |      |
JACOBIAN statement  |   -    |  -  |  -   |
CRPJAC statement    |   -    |  -  |  -   |
HESSIAN statement   |   -    |  -  |  -   |
COV= option         |   -    |  -  |  -   |
Scaling vector      |   -    |  -  |  -   |
BOUNDS statement    |   -    |  -  |      |
Bounds in INEST=    |   -    |  -  |  -   |
LINCON and TRUREG   |   -    |  -  |      |
LINCON and other    |   -    |  -  |      |
Copyright © 2008 by SAS Institute Inc., Cary, NC, USA. All rights reserved.