The OPTMODEL Procedure

The Rosenbrock Problem

You can use parameters to produce a clear formulation of a problem. Consider the Rosenbrock problem,

\[ \displaystyle \mathop {\textrm{minimize}}\; f(x_1, x_2) = \alpha \, (x_2 - x_1^2)^2 + (1 - x_1)^2 \]

where $\alpha = 100$ is a parameter (constant), $x_1$ and $x_2$ are optimization variables (whose values are to be determined), and $f(x_1, x_2)$ is the objective function. For any $\alpha > 0$, the function attains its minimum value of 0 at $(x_1, x_2) = (1, 1)$, so the solver output can be checked against this known optimum.

Here is a PROC OPTMODEL program that solves the Rosenbrock problem:

proc optmodel;
   number alpha = 100; /* declare parameter */
   var x {1..2};       /* declare variables */
   /* objective function */
   min f = alpha*(x[2] - x[1]**2)**2 +
           (1 - x[1])**2;
   /* now run the solver */
   solve;

   print x;            /* print the optimal solution */
quit;

The PROC OPTMODEL output is shown in Figure 5.3.

Figure 5.3: Rosenbrock Function Results

The OPTMODEL Procedure

Problem Summary
Objective Sense             Minimization
Objective Function          f
Objective Type              Nonlinear

Number of Variables         2
Bounded Above               0
Bounded Below               0
Bounded Below and Above     0
Free                        2
Fixed                       0

Number of Constraints       0

Performance Information
Execution Mode              Single-Machine
Number of Threads           4

Solution Summary
Solver                      NLP
Algorithm                   Interior Point
Objective Function          f
Solution Status             Optimal
Objective Value             8.204873E-23

Optimality Error            9.704881E-11
Infeasibility               0

Iterations                  14
Presolve Time               0.00
Solution Time               0.00

[1]    x
  1    1
  2    1
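
Because $\alpha$ enters the model as a named parameter rather than a hard-coded constant, the same formulation can be re-solved for other parameter values. The following sketch (not part of the example above) declares the parameter with the INIT keyword so that it can be reassigned between SOLVE statements; the value 4 used for the second solve is arbitrary and chosen only for illustration:

proc optmodel;
   number alpha init 100; /* INIT allows the value to be reassigned later */
   var x {1..2};
   min f = alpha*(x[2] - x[1]**2)**2 +
           (1 - x[1])**2;
   solve;                 /* solve with alpha = 100 */
   print x;
   alpha = 4;             /* arbitrary new value for the parameter */
   solve;                 /* re-solve the modified problem */
   print x;
quit;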