Language Reference


NLPLM Call

calculates Levenberg-Marquardt least squares

CALL NLPLM( rc, xr, "fun", x0 <, opt, blc, tc, par, "ptit", "jac"> );

See the section "Nonlinear Optimization and Related Subroutines" for a listing of all NLP subroutines. See Chapter 11 for a description of the inputs to and outputs of all NLP subroutines.

The NLPLM subroutine uses the Levenberg-Marquardt method, which is an efficient modification of the trust-region method for nonlinear least squares problems and is implemented as in Moré (1978). This is the recommended algorithm for small to medium least squares problems. Large least squares problems can often be processed more efficiently with other subroutines, such as the NLPCG and NLPQN subroutines. In each iteration, the NLPLM subroutine solves a quadratically constrained quadratic minimization problem that restricts the step to the boundary or interior of an n-dimensional elliptical trust region.
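The iteration described above can be illustrated outside of SAS/IML. The following is a minimal Python sketch of a damped Gauss-Newton iteration, a simplified cousin of the Moré (1978) trust-region implementation that NLPLM uses: instead of an explicit elliptical trust region, a damping parameter lam plays the equivalent role of restricting the step. The Rosenbrock residuals are taken from the example that this page references; everything else (function names, the damping update factors) is illustrative, not part of NLPLM.

```python
import numpy as np

def residuals(x):
    # Rosenbrock posed as two residuals: f1 = 10*(x2 - x1^2), f2 = 1 - x1
    return np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])

def jacobian(x):
    # Analytic 2 x 2 Jacobian of the residuals with respect to (x1, x2)
    return np.array([[-20.0 * x[0], 10.0],
                     [-1.0, 0.0]])

def levmarq(x, lam=1e-3, tol=1e-10, maxit=500):
    # Each iteration solves the damped normal equations
    # (J'J + lam*I) step = -J'r, the Levenberg-Marquardt system.
    for _ in range(maxit):
        r, J = residuals(x), jacobian(x)
        g = J.T @ r                          # gradient of 0.5*||r||^2
        if np.linalg.norm(g) < tol:
            break
        step = np.linalg.solve(J.T @ J + lam * np.eye(x.size), -g)
        if np.sum(residuals(x + step) ** 2) < np.sum(r ** 2):
            x, lam = x + step, lam * 0.5     # accept: relax toward Gauss-Newton
        else:
            lam *= 10.0                      # reject: increase damping
    return x

xres = levmarq(np.array([-1.2, 1.0]))        # converges near (1, 1)
```

Small lam makes the step approach the Gauss-Newton step; large lam shrinks it toward a short steepest-descent step, which is the same trade-off the trust-region radius controls in NLPLM.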

The m functions f_1(x), ... ,f_m(x) are computed by the module specified with the "fun" module argument. The m x n Jacobian matrix, J, contains the first-order derivatives of the m functions with respect to the n parameters, as follows:
\[ J(x) = (\nabla f_1, \ldots, \nabla f_m) = \left( \frac{\partial f_i}{\partial x_j} \right) \]
You can specify J with the "jac" module argument; otherwise, the subroutine computes it with finite difference approximations. In each iteration, the subroutine computes the crossproduct of the Jacobian matrix, $J^T J$, which is used as an approximate Hessian.
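The finite-difference fallback and the crossproduct Hessian can be sketched as follows. This is an illustrative Python analogue, not the NLPLM internals: the step size h and the forward-difference scheme are assumptions for the sketch, and the residual function is the Rosenbrock example referenced on this page.

```python
import numpy as np

def fd_jacobian(fun, x, h=1e-7):
    # Forward-difference approximation of the m x n Jacobian:
    # column j holds (f(x + h*e_j) - f(x)) / h.
    r0 = np.asarray(fun(x), dtype=float)
    J = np.empty((r0.size, x.size))
    for j in range(x.size):
        xp = x.copy()
        xp[j] += h
        J[:, j] = (np.asarray(fun(xp), dtype=float) - r0) / h
    return J

def rosen_res(x):
    # Rosenbrock residuals: f1 = 10*(x2 - x1^2), f2 = 1 - x1
    return np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])

x = np.array([-1.2, 1.0])
J = fd_jacobian(rosen_res, x)   # approximates [[24, 10], [-1, 0]]
H = J.T @ J                      # crossproduct used as the approximate Hessian
```

The crossproduct $J^T J$ drops the second-derivative terms of the true Hessian of $\frac{1}{2}\sum_i f_i(x)^2$; near a solution with small residuals those terms are negligible, which is why the approximation works well for least squares.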

Note: In least squares subroutines, you must set the first element of the opt vector to m, the number of functions.

In addition to the standard iteration history, the NLPLM subroutine also prints the following information:

See the section "Unconstrained Rosenbrock Function" for an example that uses the NLPLM subroutine to solve the unconstrained Rosenbrock problem.
