Nonlinear Optimization Examples
The IML procedure offers a set of optimization subroutines for minimizing or maximizing a continuous nonlinear function f = f(x) of n parameters, where x = (x_1, ..., x_n)^T.
The parameters can be subject to boundary constraints and
linear or nonlinear equality and inequality constraints.
The following set of optimization subroutines is available:
NLPCG     Conjugate Gradient Method
NLPDD     Double Dogleg Method
NLPNMS    Nelder-Mead Simplex Method
NLPNRA    Newton-Raphson Method
NLPNRR    Newton-Raphson Ridge Method
NLPQN     (Dual) Quasi-Newton Method
NLPQUA    Quadratic Optimization Method
NLPTR     Trust-Region Method
The following subroutines are provided for solving nonlinear least squares problems:
NLPLM     Levenberg-Marquardt Least Squares Method
NLPHQN    Hybrid Quasi-Newton Least Squares Methods
A least squares problem is a special form of minimization problem where the objective function is defined as a sum of squares of other (nonlinear) functions.
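To make the sum-of-squares structure concrete, here is a minimal plain-Python sketch (not SAS/IML code): the classic Rosenbrock function posed as a least squares problem with two residual functions.

```python
# A least squares objective has the form f(x) = sum_i r_i(x)^2,
# where the r_i are other (nonlinear) functions.  The Rosenbrock
# function written in that form (illustrative sketch):
def residuals(x):
    # two nonlinear component functions whose squares are summed
    return [10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]]

def objective(x):
    return sum(r * r for r in residuals(x))

print(objective([-1.2, 1.0]))  # typical starting point: approximately 24.2
print(objective([1.0, 1.0]))   # minimizer: both residuals vanish, so 0.0
```

Solvers such as NLPLM exploit this structure by working with the residuals and their Jacobian rather than with the summed objective alone.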
The following subroutines are provided for the related problems of computing finite difference approximations for first- and second-order derivatives and of determining a feasible point subject to boundary and linear constraints:
NLPFDD    Approximate Derivatives by Finite Differences
NLPFEA    Feasible Point Subject to Constraints
Each optimization subroutine works iteratively.
If the parameters are subject only to linear constraints, all optimization and least squares techniques are feasible-point methods; that is, they move from a feasible point x^(k) to a better feasible point x^(k+1) by a step in the search direction s^(k), k = 1, 2, 3, ....
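As a rough illustration of such an iteration (a plain-Python sketch, not any of the IML subroutines), the generic update x^(k+1) = x^(k) + alpha * s^(k) can be written with steepest descent as the search direction and a backtracking rule for the step length; the sufficient-decrease constant c = 0.5 is simply a choice made for this sketch.

```python
# One iteration: choose a search direction s, then backtrack on the
# step length alpha until the new point sufficiently decreases f.
def step(f, grad, x, alpha=1.0, shrink=0.5, c=0.5, max_tries=30):
    g = grad(x)
    s = [-gi for gi in g]                      # search direction s^(k)
    fx = f(x)
    gs = sum(gi * si for gi, si in zip(g, s))  # directional derivative (<= 0)
    for _ in range(max_tries):
        x_new = [xi + alpha * si for xi, si in zip(x, s)]
        if f(x_new) <= fx + c * alpha * gs:    # sufficient-decrease test
            return x_new
        alpha *= shrink                        # otherwise shorten the step
    return x                                   # no acceptable step found

# Quadratic test problem f(x) = x1^2 + 4*x2^2, minimum at the origin
f = lambda x: x[0] ** 2 + 4.0 * x[1] ** 2
grad = lambda x: [2.0 * x[0], 8.0 * x[1]]
x = [2.0, 1.0]
for _ in range(50):
    x = step(f, grad, x)
print(f(x))  # converges to the minimum value 0.0
```

The IML subroutines differ in how they construct s^(k) (conjugate gradient, Newton, quasi-Newton, and so on), but each accepts only steps that keep the iterate feasible and improve the objective.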
If you do not provide a feasible starting point x^(0),
the optimization methods call the algorithm used in the NLPFEA
subroutine, which tries to compute a starting point that is
feasible with respect to the boundary and linear constraints.
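For the boundary-constraint part of that task, the idea can be sketched in plain Python (this is only a simplified illustration, not the NLPFEA algorithm, and it ignores general linear constraints, which NLPFEA also handles).

```python
# Move an infeasible starting point into simple bounds l <= x <= u by
# clipping each coordinate to the nearest feasible value (sketch only).
def project_to_bounds(x, lower, upper):
    return [min(max(xi, lo), up) for xi, lo, up in zip(x, lower, upper)]

x0 = [-3.0, 5.0, 0.5]
print(project_to_bounds(x0, lower=[0.0, 0.0, 0.0], upper=[1.0, 1.0, 1.0]))
# prints [0.0, 1.0, 0.5]: the violated coordinates land on their bounds
```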
The NLPNMS and NLPQN subroutines permit nonlinear constraints on parameters. For problems with nonlinear constraints, these subroutines do not use a feasible-point method; instead, the algorithms begin with whatever starting point you specify, whether feasible or infeasible.
Each optimization technique requires a continuous objective function f = f(x), and all optimization subroutines except the NLPNMS subroutine require continuous first-order derivatives of the objective function f. If you do not provide the derivatives of f, they are approximated by finite-difference formulas.
You can use the NLPFDD subroutine to check the
correctness of analytical derivative specifications.
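The underlying check can be illustrated in plain Python (a sketch of the idea, not the NLPFDD implementation): evaluate a forward-difference quotient in each coordinate and compare it against the analytic gradient.

```python
# Forward-difference gradient: g_i ~ (f(x + h*e_i) - f(x)) / h.
def fd_gradient(f, x, h=1e-6):
    fx = f(x)
    g = []
    for i in range(len(x)):
        xh = list(x)
        xh[i] += h          # perturb only the i-th coordinate
        g.append((f(xh) - fx) / h)
    return g

# Test function f(x) = x1^2 + x1*x2 with its hand-coded gradient
f = lambda x: x[0] ** 2 + x[0] * x[1]
analytic = lambda x: [2.0 * x[0] + x[1], x[0]]

x = [1.0, 2.0]
approx = fd_gradient(f, x)
print(all(abs(a - b) < 1e-4 for a, b in zip(analytic(x), approx)))  # True
```

A mistake in the analytic derivative code shows up immediately as a large discrepancy in one or more components.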
Most of the results obtained from the IML procedure optimization and least squares subroutines can also be obtained by using the OPTMODEL procedure or the NLP procedure in SAS/OR software.
The advantages of the IML procedure are as follows:
Copyright © 2009 by SAS Institute Inc., Cary, NC, USA. All rights reserved.