The NLP Procedure
The following example is taken from the GINO user's guide (Liebman et al. 1986). A simple network of five roads (arcs) can be illustrated by the path diagram:
Figure 4.11: Simple Road Network
The five roads connect four intersections, illustrated by numbered nodes. Each minute, $F$ vehicles enter and leave the network. Arc $(i,j)$ refers to the road from intersection $i$ to intersection $j$, and the parameter $x_{ij}$ refers to the flow from $i$ to $j$. The law that traffic flowing into each intersection must also flow out is described by the linear equality constraint
\[
\sum_{i} x_{ij} \;=\; \sum_{k} x_{jk} \qquad \text{for each node } j .
\]
The three linear equality constraints are linearly dependent. One of them is deleted automatically by the PROC NLP subroutines. Even though the default technique is used for this small example, any optimization subroutine can be used.
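To see the dependence explicitly (a short derivation based on the LINCON statement below, not text from the original), write out the three balance equations and add them:
\[
\begin{aligned}
\text{node 3:}\quad & x_{13} - x_{32} - x_{34} = 0\\
\text{node 2:}\quad & x_{12} + x_{32} - x_{24} = 0\\
\text{node 4:}\quad & x_{24} + x_{34} - x_{12} - x_{13} = 0
\end{aligned}
\]
The three left-hand sides sum to zero identically, so any one equation is implied by the other two, which is why PROC NLP can drop one of them without changing the feasible set.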
proc nlp all initial=.5;
   max y;
   parms x12 x13 x32 x24 x34;
   bounds x12 <= 10,
          x13 <= 30,
          x32 <= 10,
          x24 <= 30,
          x34 <= 10;
   /* what flows into an intersection must flow out */
   lincon x13 = x32 + x34,
          x12 + x32 = x24,
          x24 + x34 = x12 + x13;
   y = x24 + x34 + 0*x12 + 0*x13 + 0*x32;
run;
The iteration history is given in Output 4.9.1, and the optimal solution is given in Output 4.9.2.
Output 4.9.1: Iteration History
PROC NLP: Nonlinear Maximization
Value of Objective Function = 30
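The reported maximum of 30 vehicles per minute can be confirmed by hand from the capacities and balance equations (the flow values below are deduced for illustration, not copied from Output 4.9.2):
\[
x_{24} = x_{12} + x_{32} \le 10 + 10 = 20, \qquad x_{34} \le 10,
\]
so the flow into node 4 satisfies $y = x_{24} + x_{34} \le 30$. The bound is attained by $x_{12}=10,\; x_{13}=20,\; x_{32}=10,\; x_{24}=20,\; x_{34}=10$, which satisfies all three balance equations and all capacity bounds.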
Finding a traffic pattern that minimizes the total delay to move $F = 5$ vehicles per minute from node 1 to node 4 introduces nonlinearities that, in turn, demand nonlinear optimization techniques. As traffic volume increases, speed decreases. Let $t_{ij}$ be the travel time on arc $(i,j)$ and assume that the following formulas describe the travel time as increasing functions of the amount of traffic:
\[
\begin{aligned}
t_{12} &= 5 + \frac{0.1\,x_{12}}{1 - x_{12}/10}, &
t_{13} &= \frac{x_{13}}{1 - x_{13}/30}, &
t_{32} &= 1 + \frac{x_{32}}{1 - x_{32}/10},\\[4pt]
t_{24} &= \frac{x_{24}}{1 - x_{24}/30}, &
t_{34} &= 5 + \frac{0.1\,x_{34}}{1 - x_{34}/10}.
\end{aligned}
\]
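Each travel time has a vertical asymptote at the capacity of its arc, which is what makes the problem nonlinear, and the quantity minimized is the total delay. Both observations simply restate what the program statements below compute:
\[
y \;=\; \sum_{(i,j)} t_{ij}\,x_{ij}
\;=\; t_{12}x_{12} + t_{13}x_{13} + t_{32}x_{32} + t_{24}x_{24} + t_{34}x_{34},
\qquad t_{12} \to \infty \ \text{as}\ x_{12} \to 10^{-}.
\]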
proc nlp all initial=.5;
   min y;
   parms x12 x13 x32 x24 x34;
   bounds x12 x13 x32 x24 x34 >= 0;
   lincon x13 = x32 + x34,   /* flow in = flow out  */
          x12 + x32 = x24,
          x24 + x34 = 5;     /* = f = desired flow  */
   t12 = 5 + .1 * x12 / (1 - x12 / 10);
   t13 = x13 / (1 - x13 / 30);
   t32 = 1 + x32 / (1 - x32 / 10);
   t24 = x24 / (1 - x24 / 30);
   t34 = 5 + .1 * x34 / (1 - x34 / 10);
   y = t12*x12 + t13*x13 + t32*x32 + t24*x24 + t34*x34;
run;
The iteration history is given in Output 4.9.3, and the optimal solution is given in Output 4.9.4.
Output 4.9.3: Iteration History
Newton-Raphson Ridge Optimization
Without Parameter Scaling

PROC NLP: Nonlinear Minimization
Value of Objective Function = 40.303030303
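As a quick consistency check (the flow values below are hand-derived and assumed to agree with the optimum reported in Output 4.9.4, not copied from the listing), a DATA step that evaluates the same delay formulas at the symmetric pattern x12 = x13 = x24 = x34 = 2.5, x32 = 0 reproduces the reported objective value:

data check;
   /* Candidate optimal flows: split the 5 vehicles per minute evenly
      over routes 1-2-4 and 1-3-4 and send nothing across arc (3,2) */
   x12 = 2.5; x13 = 2.5; x32 = 0; x24 = 2.5; x34 = 2.5;
   /* Same travel-time formulas as in the PROC NLP program statements */
   t12 = 5 + .1 * x12 / (1 - x12 / 10);
   t13 = x13 / (1 - x13 / 30);
   t32 = 1 + x32 / (1 - x32 / 10);
   t24 = x24 / (1 - x24 / 30);
   t34 = 5 + .1 * x34 / (1 - x34 / 10);
   y   = t12*x12 + t13*x13 + t32*x32 + t24*x24 + t34*x34;
   put y=;   /* prints approximately 40.303030303 */
run;

This pattern also satisfies the LINCON equations: x13 = x32 + x34, x12 + x32 = x24, and x24 + x34 = 5.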
The active constraints and corresponding Lagrange multiplier estimates (costs) are given in Output 4.9.5 and Output 4.9.6, respectively.
Output 4.9.5: Linear Constraints at Solution

Output 4.9.7 shows that the projected gradient is very small, satisfying the first-order optimality criterion.
Output 4.9.7: Projected Gradient at Solution

The projected Hessian matrix (shown in Output 4.9.8) is positive definite, satisfying the second-order optimality criterion.
Output 4.9.8: Projected Hessian at Solution
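For reference, these two checks correspond to the standard optimality conditions for linearly constrained minimization (a textbook restatement, not part of the original listing): with $Z$ a basis for the null space of the active constraint matrix, and $g$ and $H$ the gradient and Hessian of the objective at the solution $x^*$,
\[
Z^{\mathsf T} g(x^*) \approx 0
\qquad\text{and}\qquad
Z^{\mathsf T} H(x^*)\,Z \ \text{positive definite.}
\]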