Genetic Algorithms

Example 17.2: Real-Valued Objective Optimization with Constant Bounds

The next example illustrates some of the strengths and weaknesses of the arithmetic and heuristic crossover operators. The objective function to be minimized is

  
 start sin_obj(x) global(xopt); 
   r = abs(sin(sum(abs(x-xopt)))); 
   return(r); 
 finish;
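An equivalent definition in Python (an illustrative translation of the module, not part of the IML program):

```python
import math

def sin_obj(x, xopt):
    # |sin(sum_i |x_i - xopt_i|)|, with a minimum value of 0 at x = xopt
    return abs(math.sin(sum(abs(a - b) for a, b in zip(x, xopt))))

xopt = [0.0, 0.0, 0.0]
print(sin_obj(xopt, xopt))               # 0.0 at the optimum
print(sin_obj([0.1, 0.2, 0.3], xopt))    # positive away from it
```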
 
This function has a minimum value of 0 at x = xopt and is not differentiable everywhere. The following program sets xopt = {0 0 0}, specifies constant boundary constraints that place the optimum in the interior of the search space, and selects the heuristic crossover operator:
  
 proc iml; 
  
 /* objective function, has minimum of 0 at x = xopt */ 
 start sin_obj(x) global(xopt); 
   r = abs(sin(sum(abs(x-xopt)))); 
   return(r); 
 finish; 
  
 xopt = { 0 0 0 }; 
 optimum = xopt; 
 optval = sin_obj(optimum); 
  
 id = gasetup(1,   /* 1-> fixed-length floating point vector encoding */ 
              3,   /* 3-> length of solution vectors */ 
              1234 /* 1234 -> initial random seed */ 
             ); 
 call gasetobj(id,0,"sin_obj"); /* 0->minimize a user module, 
                                 * "sin_obj" is name of module */ 
 call gasetcro(id, 0.9, 4); /* crossover probability 0.9, 
                             * 4-> heuristic crossover operator   */ 
  
 call gasetmut(id,0.05,2,0.01); /* mutation probability 0.05, 
                                 * 2-> delta mutation operator 
                                 * 0.01 is delta value  */ 
  
 call gasetsel(id, 5, 1, 0.95); /* carry best 5 solutions over 
                                * to the next generation, dual 
                                * tournament with 0.95 best-player- 
                                * wins probability   */ 
  
 bounds = {-1 -1 -1, 1 1 1}; 
 call gainit(id,200,bounds);    /* initialize population with 
                                 * 200 members, "bounds" gives 
                                 * upper and lower bounds for 
                                 * components of randomly 
                                 * generated vectors */ 
 summary = j(20,2); 
 mattrib summary colname = {"bestValue" "avgValue"}; 
 call gagetval(value, id); 
 summary[1,1] = value[1]; 
 summary[1,2] = value[:]; 
  
  
 do i = 2 to 20; 
  
    call garegen(id); 
  
    call gagetval(value, id); /* get all objective values of 
                               * the population */ 
    summary[i,1] = value[1]; 
    summary[i,2] = value[:]; 
  
 end; 
  
 iteration = t(1:20); 
 print iteration summary; 
 call gaend(id);
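For readers who want to experiment outside IML, the run above can be sketched as a toy GA in Python. This is an illustrative re-implementation under stated assumptions about the operators (heuristic crossover as extrapolation past the better parent, delta mutation of one component, dual tournament selection, elitism), not the GASETUP/GAREGEN implementation:

```python
import math
import random

def sin_obj(x, xopt=(0.0, 0.0, 0.0)):
    return abs(math.sin(sum(abs(a - b) for a, b in zip(x, xopt))))

def clip(x, lo, hi):
    return [min(h, max(l, v)) for v, l, h in zip(x, lo, hi)]

def heuristic_cross(better, worse, lo, hi):
    # extrapolate a random step past the better parent
    r = random.random()
    return clip([b + r * (b - w) for b, w in zip(better, worse)], lo, hi)

def delta_mutate(x, lo, hi, delta=0.01):
    # perturb one randomly chosen component by +/- delta
    i = random.randrange(len(x))
    y = x[:]
    y[i] += random.choice((-delta, delta))
    return clip(y, lo, hi)

def tournament(pop, obj, pwin=0.95):
    # dual tournament: better of two random members wins with probability pwin
    a, b = random.sample(pop, 2)
    best, other = (a, b) if obj(a) <= obj(b) else (b, a)
    return best if random.random() < pwin else other

def run_ga(obj, lo, hi, pop_size=200, gens=20,
           pcross=0.9, pmut=0.05, elite=5, seed=1234):
    random.seed(seed)
    pop = [[random.uniform(l, h) for l, h in zip(lo, hi)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=obj)
        nxt = [m[:] for m in pop[:elite]]   # elitism: carry best members over
        while len(nxt) < pop_size:
            p, q = tournament(pop, obj), tournament(pop, obj)
            if obj(q) < obj(p):
                p, q = q, p                 # p is the better parent
            child = heuristic_cross(p, q, lo, hi) if random.random() < pcross else p[:]
            if random.random() < pmut:
                child = delta_mutate(child, lo, hi)
            nxt.append(child)
        pop = nxt
    return min(pop, key=obj)

best = run_ga(sin_obj, [-1.0, -1.0, -1.0], [1.0, 1.0, 1.0])
print(best, sin_obj(best))
```

Because the elite members are carried forward, the best objective value never increases from one generation to the next, matching the bestValue column in the output below.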
 
The output results are
  
                             SUMMARY 
                 ITERATION bestValue avgValue 
  
                         1  0.894517 0.8926763 
                         2  0.894517  0.752227 
                         3 0.1840732 0.6087493 
                         4   0.14112 0.4848342 
                         5   0.14112 0.3991614 
                         6   0.14112 0.3539561 
                         7 0.0481937 0.3680798 
                         8 0.0481937 0.3243406 
                         9 0.0481937 0.3027395 
                        10 0.0481937 0.2679123 
                        11 0.0481937 0.2550643 
                        12 0.0481937 0.2582514 
                        13 0.0481937 0.2652337 
                        14 0.0481937 0.2799655 
                        15 0.0383933  0.237546 
                        16 0.0383933 0.3008743 
                        17 0.0383933 0.2341022 
                        18 0.0383933 0.1966969 
                        19 0.0383933 0.2778152 
                        20 0.0383933 0.2690036
 
To show the convergence of the overall population, the program prints the average objective value of the whole population along with the best value. The optimum value for this formulation is 0, attained at the solution (0 0 0). The output shows that convergence of the GA is slow, especially as the solutions approach the optimum. This is the result of applying the heuristic crossover operator to an ill-behaved objective function. If you change the crossover to the arithmetic operator by changing the GASETCRO call to
  
 call gasetcro(id, 0.9, 3); /* 3-> arithmetic crossover operator */
 
you get the following output:
  
                             SUMMARY 
                 ITERATION bestValue avgValue 
  
                         1  0.894517 0.8926763 
                         2  0.894517 0.8014329 
                         3 0.1840732 0.6496871 
                         4 0.1705931 0.4703868 
                         5 0.0984926 0.2892114 
                         6  0.076859 0.1832358 
                         7 0.0287965 0.1123732 
                         8 0.0273074 0.0720792 
                         9  0.018713 0.0456323 
                        10 0.0129708 0.0309648 
                        11 0.0087931 0.0240822 
                        12 0.0087931 0.0172102 
                        13 0.0050753 0.0128258 
                        14 0.0019603 0.0092872 
                        15 0.0016225 0.0070575 
                        16 0.0016225 0.0051149 
                        17 0.0012465 0.0036445 
                        18 0.0011895  0.002712 
                        19 0.0007646 0.0023329 
                        20 0.0007646 0.0020842
 
For this case, the arithmetic operator shows improved convergence. Now suppose you change the problem characteristics again by altering the constraints so that the optimum lies on a boundary. The following statement moves the optimum to a boundary:
  
 bounds = {0 0 0, 1 1 1};
 
The output using the arithmetic operator is
  
                             SUMMARY 
                 ITERATION bestValue avgValue 
  
                         1 0.8813497 0.8749132 
                         2 0.8813497  0.860011 
                         3 0.3721446 0.8339357 
                         4 0.3721446   0.79106 
                         5 0.3721446  0.743336 
                         6 0.3721446 0.7061592 
                         7 0.3721446 0.6797346 
                         8 0.3721446 0.6302206 
                         9 0.3721446 0.5818008 
                        10 0.3721446 0.5327339 
                        11 0.3721446 0.5149562 
                        12 0.3721446   0.48525 
                        13 0.3721446 0.4708617 
                        14 0.3721446 0.4582203 
                        15 0.3721446  0.433538 
                        16 0.3721446 0.4256162 
                        17 0.3721446 0.4236062 
                        18 0.3721446 0.4149336 
                        19 0.3721446 0.4135214 
                        20 0.3721446 0.4078068
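One way to see why this run stalls: the arithmetic operator forms a random convex combination of the two parents, so each offspring component stays inside the interval spanned by its parents' components; crossover alone can never reach the boundary value 0, and only mutation can creep toward it. A small Python sketch of that property (illustrative, assuming this standard form of the operator):

```python
import random

def arithmetic_cross(p, q):
    # offspring is a random convex combination of the two parents
    r = random.random()
    return [r * a + (1.0 - r) * b for a, b in zip(p, q)]

random.seed(1)
p, q = [0.4, 0.7, 0.2], [0.1, 0.9, 0.5]
for _ in range(1000):
    c = arithmetic_cross(p, q)
    # every component lies in the interval spanned by its parents
    assert all(min(a, b) - 1e-12 <= v <= max(a, b) + 1e-12
               for a, b, v in zip(p, q, c))
print("all offspring stayed inside the parents' span")
```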
 
In this case, the algorithm fails to converge to the true optimum, reflecting the tendency of the arithmetic operator to converge to interior points. However, if you switch back to the heuristic crossover operator, the results are
  
                            SUMMARY 
                ITERATION bestValue avgValue 
  
                        1 0.8813497 0.8749132 
                        2 0.8813497 0.7360591 
                        3 0.3721446 0.5465098 
                        4         0 0.3427185 
                        5         0 0.2006271 
                        6         0 0.0826017 
                        7         0 0.0158228 
                        8         0 0.0002602 
                        9         0   0.00005 
                       10         0   0.00065 
                       11         0    0.0003 
                       12         0    0.0002 
                       13         0    0.0002 
                       14         0  0.000285 
                       15         0    0.0005 
                       16         0 0.0002952 
                       17         0    0.0002 
                       18         0 0.0001761 
                       19         0   0.00035 
                       20         0   0.00035
 
These results show rapid convergence to the optimum. This example illustrates how strongly the results of a GA depend on the choice of operators. For complicated problems with unknown solutions, you might need to try a number of different combinations of parameters in order to have confidence that you have converged to a true global optimum.
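The heuristic operator succeeds in the boundary case because it extrapolates beyond the better parent, so offspring can land on, and be clipped to, the boundary where the optimum now lies. A minimal Python sketch of that behavior (illustrative, assuming the common form c = better + r(better - worse) with clipping to the bounds):

```python
import random

def heuristic_cross(better, worse, lo, hi):
    # step a random distance past the better parent, then clip to the bounds
    r = random.random()
    c = [b + r * (b - w) for b, w in zip(better, worse)]
    return [min(h, max(l, v)) for v, l, h in zip(c, lo, hi)]

random.seed(2)
lo, hi = [0.0, 0.0, 0.0], [1.0, 1.0, 1.0]
better, worse = [0.05, 0.02, 0.01], [0.3, 0.4, 0.2]   # better parent nearer 0
hits = sum(heuristic_cross(better, worse, lo, hi) == lo
           for _ in range(1000))
print(hits)   # many offspring get clipped exactly onto the boundary optimum
```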
