The NLP Procedure
The following table outlines the options in PROC NLP, classified by function. An alphabetical list of options is provided in the Dictionary of Options.
Table 4.1: Functional Summary
| Description | Statement | Option |
| --- | --- | --- |
| Input Data Set Options: | ||
| input data set | PROC NLP | DATA= |
| initial values and constraints | PROC NLP | INEST= |
| quadratic objective function | PROC NLP | INQUAD= |
| program statements | PROC NLP | MODEL= |
| skip missing value observations | PROC NLP | NOMISS |
| Output Data Set Options: | ||
| variables and derivatives | PROC NLP | OUT= |
| result parameter values | PROC NLP | OUTEST= |
| program statements | PROC NLP | OUTMODEL= |
| combine various OUT... statements | PROC NLP | OUTALL |
| crossproduct Jacobian in the OUTEST= data set | PROC NLP | OUTCRPJAC |
| derivatives in the OUT= data set | PROC NLP | OUTDER= |
| grid in the OUTEST= data set | PROC NLP | OUTGRID |
| Hessian in the OUTEST= data set | PROC NLP | OUTHESSIAN |
| iterative output in the OUTEST= data set | PROC NLP | OUTITER |
| Jacobian in the OUTEST= data set | PROC NLP | OUTJAC |
| Jacobian of nonlinear constraints in the OUTEST= data set | PROC NLP | OUTNLCJAC |
| time in the OUTEST= data set | PROC NLP | OUTTIME |
| Optimization Options: | ||
| minimization method | PROC NLP | TECH= |
| update technique | PROC NLP | UPDATE= |
| version of optimization technique | PROC NLP | VERSION= |
| line-search method | PROC NLP | LINESEARCH= |
| line-search precision | PROC NLP | LSPRECISION= |
| type of Hessian scaling | PROC NLP | HESCAL= |
| start for approximated Hessian | PROC NLP | INHESSIAN= |
| iteration number for update restart | PROC NLP | RESTART= |
| Initial Value Options: | ||
| produce best grid points | PROC NLP | BEST= |
| infeasible points in grid search | PROC NLP | INFEASIBLE |
| pseudorandom initial values | PROC NLP | RANDOM= |
| constant initial values | PROC NLP | INITIAL= |
| Derivative Options: | ||
| finite-difference derivatives | PROC NLP | FD= |
| finite-difference second-order derivatives (Hessian) | PROC NLP | FDHESSIAN= |
| compute finite-difference interval | PROC NLP | FDINT= |
| use only diagonal of Hessian | PROC NLP | DIAHES |
| test gradient specification | PROC NLP | GRADCHECK= |
| Constraint Options: | ||
| range for active constraints | PROC NLP | LCEPSILON= |
| Lagrange multiplier tolerance for deactivating constraints | PROC NLP | LCDEACT= |
| tolerance for dependent constraints | PROC NLP | LCSINGULAR= |
| sum all observations for constraint functions | NLINCON | / SUMOBS |
| evaluate each observation for constraint functions | NLINCON | / EVERYOBS |
| Termination Criteria Options: | ||
| maximum number of function calls | PROC NLP | MAXFUNC= |
| maximum number of iterations | PROC NLP | MAXITER= |
| minimum number of iterations | PROC NLP | MINITER= |
| upper limit on real time | PROC NLP | MAXTIME= |
| absolute function convergence criterion | PROC NLP | ABSCONV= |
| absolute function difference convergence criterion | PROC NLP | ABSFCONV= |
| absolute gradient convergence criterion | PROC NLP | ABSGCONV= |
| absolute parameter convergence criterion | PROC NLP | ABSXCONV= |
| relative function convergence criterion | PROC NLP | FCONV= |
| relative function convergence criterion | PROC NLP | FCONV2= |
| relative gradient convergence criterion | PROC NLP | GCONV= |
| relative gradient convergence criterion | PROC NLP | GCONV2= |
| relative parameter convergence criterion | PROC NLP | XCONV= |
| value used in the FCONV= and GCONV= criteria | PROC NLP | FSIZE= |
| value used in the XCONV= criterion | PROC NLP | XSIZE= |
| Covariance Matrix Options: | ||
| type of covariance matrix | PROC NLP | COV= |
| sigma-square value for computing the COV matrix | PROC NLP | SIGSQ= |
| divisor used in computing the COV matrix | PROC NLP | VARDEF= |
| absolute singularity for inertia | PROC NLP | ASINGULAR= |
| relative M singularity for inertia | PROC NLP | MSINGULAR= |
| relative V singularity for inertia | PROC NLP | VSINGULAR= |
| threshold for Moore-Penrose inverse | PROC NLP | G4= |
| tolerance for singular COV matrix | PROC NLP | COVSING= |
| profile confidence limits | PROC NLP | CLPARM= |
| Printed Output Options: | ||
| display (almost) all printed output | PROC NLP | PALL |
| suppress all printed output | PROC NLP | NOPRINT |
| reduce some default output | PROC NLP | PSHORT |
| reduce most default output | PROC NLP | PSUMMARY |
| display initial values and gradients | PROC NLP | PINIT |
| display optimization history | PROC NLP | PHISTORY |
| display Jacobian matrix | PROC NLP | PJACOBI |
| display crossproduct Jacobian matrix | PROC NLP | PCRPJAC |
| display Hessian matrix | PROC NLP | PHESSIAN |
| display Jacobian of nonlinear constraints | PROC NLP | PNLCJAC |
| display values of grid points | PROC NLP | PGRID |
| display values of functions in LSQ, MIN, MAX | PROC NLP | PFUNCTION |
| display approximate standard errors | PROC NLP | PSTDERR |
| display covariance matrix | PROC NLP | PCOV |
| display eigenvalues for covariance matrix | PROC NLP | PEIGVAL |
| print code evaluation problems | PROC NLP | PERROR |
| print measures of real time | PROC NLP | PTIME |
| display model program, variables | PROC NLP | LIST |
| display compiled model program | PROC NLP | LISTCODE |
| Step Length Options: | ||
| damped steps in line search | PROC NLP | DAMPSTEP= |
| maximum trust region radius | PROC NLP | MAXSTEP= |
| initial trust region radius | PROC NLP | INSTEP= |
| Profile Point and Confidence Interval Options: | ||
| factor relating discrepancy function to chi-square quantile | PROFILE | FFACTOR= |
| scale for the chi-square value | PROFILE | FORCHI= |
| upper bound for confidence limit search | PROFILE | FEASRATIO= |
| write all confidence limit parameter estimates to OUTEST= data set | PROFILE | OUTTABLE |
| Miscellaneous Options: | ||
| number of accurate digits in objective function | PROC NLP | FDIGITS= |
| number of accurate digits in nonlinear constraints | PROC NLP | CDIGITS= |
| general singularity criterion | PROC NLP | SINGULAR= |
| do not compute inertia of matrices | PROC NLP | NOEIGNUM |
| check optimality in neighborhood | PROC NLP | OPTCHECK= |
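For orientation, the following sketch shows how several of the option groups in Table 4.1 might appear together in a single call to PROC NLP. The data set WORK.FITDATA, the variables x and y, the model form, and the starting values are hypothetical and used only for illustration; the statements and options themselves are those listed above.

```sas
/* Hypothetical least squares fit of y = a*exp(-b*x) to the data set   */
/* WORK.FITDATA (variables x and y are assumed for illustration only). */
proc nlp data=work.fitdata   /* input data set (DATA=)                 */
         tech=levmar         /* optimization technique (TECH=)         */
         maxiter=200         /* termination: iteration limit           */
         gconv=1e-8          /* termination: relative gradient         */
         outest=est          /* write parameter estimates (OUTEST=)    */
         phistory;           /* print the optimization history         */
   lsq r;                    /* least squares objective function       */
   parms a = 1, b = 0.1;     /* decision variables and initial values  */
   bounds b >= 0;            /* boundary constraint                    */
   r = y - a * exp(-b * x);  /* program statements define the residual */
run;
```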