Genetic Algorithms
For the symmetric traveling salesman problem, a simple local optimization can be incorporated into the user objective function module: check each pair of adjacent locations in the solution and swap their positions if doing so improves the objective function value. Here is the previous TSP example, modified to use an objective function module that implements this strategy. In this initial example, the optimized solution is not written back out to the solution population (except to obtain the final solution at the end).
proc iml;

/* cost coefficients for TSP problem */
coeffs = { 0 1 2 3 4 5 4 3 2 1,
           1 0 1 2 3 4 5 4 3 2,
           2 1 0 1 2 3 4 5 4 3,
           3 2 1 0 1 2 3 4 5 4,
           4 3 2 1 0 1 2 3 4 5,
           5 4 3 2 1 0 1 2 3 4,
           4 5 4 3 2 1 0 1 2 3,
           3 4 5 4 3 2 1 0 1 2,
           2 3 4 5 4 3 2 1 0 1,
           1 2 3 4 5 4 3 2 1 0 };

start TSPObjectiveFunction(r) global(coeffs, p);
   s = r;
   nc = ncol(s);
   /* local optimization, assumes symmetric cost *
    * coefficients                               */
   do i = 1 to nc;
      city1 = s[i];
      inext = 1 + mod(i,nc);
      city2 = s[inext];
      if i=1 then before = s[nc];
      else before = s[i-1];
      after = s[1 + mod(inext,nc)];
      if (coeffs[before,city1] + coeffs[city2,after]) >
         (coeffs[before,city2] + coeffs[city1,after]) then do;
         s[i] = city2;
         s[inext] = city1;
      end;
   end;
   /* compute objective function */
   cost = coeffs[s[nc],s[1]];
   do i = 1 to nc-1;
      cost = cost + coeffs[s[i],s[i+1]];
   end;
   if uniform(1234)<=p then
      r = s;
   return (cost);
finish;

/* problem setup */
id = gasetup(3,   /* 3 -> integer sequence encoding */
             10,  /* number of locations */
             123  /* initial random seed */
            );
/* set objective function */
call gasetobj(id,
              0,  /* 0 -> minimize a user-defined module */
              "TSPObjectiveFunction"
             );
call gasetcro(id, 1.0, 6);
call gasetmut(id, 0.05, 4);
call gasetsel(id, 1, 1, 0.95);
p = 0;  /* probability of writing locally optimized *
         * solution back out to population          */

/* initialization phase */
call gainit(id,
            100  /* initial population size */
           );

/* execute regeneration loop */
niter = 10;                   /* number of iterations */
bestValue = j(niter,1);       /* to store results */
call gagetval(value, id, 1);  /* gets first (and best) value */
bestValue[1] = value;
do i = 2 to niter;
   call garegen(id);
   call gagetval(value, id, 1);
   bestValue[i] = value;
end;

/* print solution history */
print (t(1:niter))[l = "iteration"] bestValue;

/* make sure local optimization is *
 * written back to all solutions   */
p = 1.;  /* set global probability to 1 */
call gareeval(id);

/* print final solution */
call gagetmem(bestMember, value, id, 1);
print "best member " bestMember [f = 3.0 l = ""],,
      "final best value " value [l = ""];

call gaend(id);
The results of running this program are
                 iteration  BESTVALUE

                         1         12
                         2         12
                         3         12
                         4         12
                         5         10
                         6         10
                         7         10
                         8         10
                         9         10
                        10         10

     best member    7   6   5   4   3   2   1  10   9   8

     final best value 10
The local optimization greatly improves convergence: the optimum is reached in just 5 iterations, compared with 13 when no local optimization is used. Writing some of the optimized solutions back to the solution population, by setting the global probability p to a value such as 0.05 or 0.15, improves convergence even further.
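The role of the write-back probability can be sketched as follows. This Python fragment is illustrative only; evaluate and the optimize callback are hypothetical names standing in for the GA's evaluation of one population member, and it mirrors the `if uniform(1234) <= p then r = s;` line in TSPObjectiveFunction:

```python
import random

rng = random.Random(123)  # fixed seed, echoing the IML example's seeding

def evaluate(member, p, optimize):
    """Evaluate one population member.

    With probability p the locally optimized tour overwrites the member
    itself, so the improvement is inherited by its offspring; otherwise
    only the returned fitness reflects the optimization, and the member
    is left unchanged in the population.
    """
    optimized, cost = optimize(member)   # one local-search pass
    if rng.random() <= p:
        member[:] = optimized            # write back into the population
    return cost
```

With p = 0 the population members are never modified; with p = 1 every evaluated member is replaced by its optimized form, which is what the example does via gareeval after the regeneration loop. Intermediate values such as 0.05 or 0.15 mix the two behaviors.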
Copyright © SAS Institute, Inc. All Rights Reserved.