Language Reference

NLPCG: nonlinear optimization by conjugate gradient method
Value of opt[4]   Update Method
1                 Automatic restart method of Powell (1977) and Beale (1972). This is the default.
2                 Fletcher-Reeves update (Fletcher 1987)
3                 Polak-Ribiere update (Fletcher 1987)
4                 Conjugate-descent update of Fletcher (1987)
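The table above can be summarized by the scalar beta that each choice of opt[4] uses to build the next search direction, d_new = -g_new + beta * d_old. The following Python sketch (not SAS/IML code, and not the subroutine's internals) shows the textbook form of each update and one common form of the Powell restart test; the function names are illustrative:

```python
import numpy as np

# g_old / g_new: gradients before and after the step; d_old: previous
# search direction.  Each function returns the beta used in
#   d_new = -g_new + beta * d_old

def beta_fletcher_reeves(g_new, g_old, d_old):
    # opt[4] = 2
    return (g_new @ g_new) / (g_old @ g_old)

def beta_polak_ribiere(g_new, g_old, d_old):
    # opt[4] = 3
    return (g_new @ (g_new - g_old)) / (g_old @ g_old)

def beta_conjugate_descent(g_new, g_old, d_old):
    # opt[4] = 4
    return -(g_new @ g_new) / (d_old @ g_old)

def restart_needed_powell(g_new, g_old):
    # opt[4] = 1: one common form of the Powell restart test -- restart
    # with steepest descent when successive gradients are far from
    # orthogonal
    return abs(g_new @ g_old) >= 0.2 * (g_new @ g_new)
```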
The NLPCG subroutine is useful for
optimization problems with large n.
For the unconstrained or boundary-constrained case,
the NLPCG method needs only order n
bytes of
working memory, whereas the other optimization
methods require order n^2
bytes of working memory.
During n
successive iterations, uninterrupted by restarts
or changes in the working set, the conjugate gradient
algorithm computes a cycle of n
conjugate search directions.
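A minimal conjugate gradient loop makes the order-n memory claim concrete: only a handful of length-n vectors are ever stored. The Python sketch below (not SAS/IML, and not the subroutine's implementation) uses an illustrative Fletcher-Reeves update and an exact line search for a convex quadratic f(x) = 0.5 x'Ax - b'x, restarting with steepest descent after each cycle of n directions:

```python
import numpy as np

def cg_quadratic(A, b, x0, tol=1e-10):
    # Stores only x, g, d (and a temporary), each of length n:
    # order-n working memory.
    x = x0.astype(float).copy()
    g = A @ x - b                          # gradient of the quadratic
    d = -g
    n = len(b)
    for k in range(10 * n):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ (A @ d))   # exact minimizer along d
        x += alpha * d
        g_new = A @ x - b
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves update
        g = g_new
        # restart with steepest descent after a cycle of n directions
        d = -g if (k + 1) % n == 0 else -g + beta * d
    return x
```

On a convex quadratic with exact line searches, the cycle of n conjugate directions reaches the minimizer in at most n iterations.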
In each iteration, a line search is done
along the search direction to find an
approximate optimum of the objective function.
The default line-search method uses quadratic
interpolation and cubic extrapolation to obtain a step
size alpha
that satisfies the Goldstein conditions.
One of the Goldstein conditions can be violated if the
feasible region defines an upper limit for the step size.
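The Goldstein conditions themselves can be stated compactly. The sketch below (Python, not SAS/IML; the function name and the choice c = 0.25 are illustrative) checks whether a trial step size alpha is acceptable for a descent direction d, where g @ d < 0 and 0 < c < 0.5; the upper bound is the sufficient-decrease condition that rejects overly long steps, while the lower bound rejects overly short ones:

```python
import numpy as np

def satisfies_goldstein(f, x, d, g, alpha, c=0.25):
    # Goldstein conditions:
    #   f(x) + (1 - c)*alpha*(g @ d)  <=  f(x + alpha*d)
    #                                 <=  f(x) + c*alpha*(g @ d)
    slope = g @ d                     # directional derivative, negative
    f0 = f(x)
    f1 = f(x + alpha * d)
    return f0 + (1 - c) * alpha * slope <= f1 <= f0 + c * alpha * slope
```

For f(x) = x @ x starting at x = (1, 0) with the steepest-descent direction, steps near the exact minimizer along the line pass the test, while steps that are much too short or much too long fail one bound each.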
You can specify other line-search algorithms
with the fifth element of the opt argument.
For an example of the NLPCG subroutine, see the section "Constrained Betts Function".
Copyright © 2009 by SAS Institute Inc., Cary, NC, USA. All rights reserved.