
The OPTEX Procedure

Memory and Run-Time Considerations

The OPTEX procedure provides a computationally intensive approach to designing an experiment, so some finesse is called for to make the most efficient use of computer resources.

The OPTEX procedure must retain the entire set of candidate points in memory. This is necessary because all of the search algorithms access these points repeatedly. If this requires more memory than is available, consider using knowledge of the problem to reduce the set of candidate points. For example, for first- or second-order models, it is usually adequate to restrict the candidates to just the center and the edges of the experimental region or perhaps an even smaller set; see the introductory examples in the sections Handling Many Variables and Constructing a Mixture-Process Design.
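As a concrete sketch (the data set and variable names are hypothetical, and the design size and criterion are illustrative), the following statements build such a reduced candidate set for a three-factor second-order model: a 3 x 3 x 3 grid of only 27 center and edge points, rather than a fine grid of thousands of points:

   /* Hypothetical example: candidates restricted to the center
      and edges of the region, 27 points in all */
   data candidates;
      do x1 = -1 to 1;
         do x2 = -1 to 1;
            do x3 = -1 to 1;
               output;
            end;
         end;
      end;
   run;

   proc optex data=candidates;
      model x1 x2 x3 x1*x1 x2*x2 x3*x3 x1*x2 x1*x3 x2*x3;
      generate n=15 criterion=d;
      output out=design;
   run;

Because a second-order model needs no more than three levels per factor, this reduction sacrifices little in achievable efficiency while greatly reducing the memory required to hold the candidates.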

The distance-based criteria (CRITERION=U and CRITERION=S) also require repeated access to the distances between candidate points. The procedure attempts to store the matrix of these distances in memory; if it cannot, it recomputes them as needed, which slows the search dramatically.
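As an illustration (the data set and variable names are hypothetical, and the CODING=NONE and NOINT settings follow the usual pattern for distance-based searches, in which the MODEL statement merely lists the variables), a CRITERION=S search over a 121-point grid repeatedly uses the 121 x 121 matrix of candidate distances:

   /* Hypothetical example: a 12-point maximum-spread design
      chosen from a 121-point grid with CRITERION=S */
   data grid;
      do x1 = 0 to 10;
         do x2 = 0 to 10;
            output;
         end;
      end;
   run;

   proc optex data=grid coding=none;
      model x1 x2 / noint;
      generate n=12 criterion=s;
      output out=spread;
   run;

With a candidate set this small the distance matrix fits easily in memory; with tens of thousands of candidates it may not, and the recomputation penalty described above applies.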

The run time of each search algorithm depends primarily on $N_D$, the size of the target design, and on $N_C$, the number of candidate points. For a given model, the run times of the sequential, exchange, and DETMAX algorithms are all roughly proportional to both $N_D$ and $N_C$ (that is, to $N_D + N_C$). The run times for the two simultaneous switching algorithms (FEDOROV and M_FEDOROV) are roughly proportional to the product of $N_D$ and $N_C$ (that is, to $N_D \times N_C$). The constant of proportionality is larger when searching for A-optimal designs because the update formulas are more complicated (see "Search Methods," which follows).
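For a rough sense of scale, assuming the proportionalities above and ignoring constant factors, the relative cost of a simultaneous switching search is

$$\frac{T_{\mathrm{FEDOROV}}}{T_{\mathrm{EXCHANGE}}} \;\propto\; \frac{N_D \times N_C}{N_D + N_C}$$

so with, say, $N_D = 20$ and $N_C = 1000$, a FEDOROV try could cost on the order of $20000 / 1020 \approx 20$ times as much as an exchange try. This is purely illustrative arithmetic under the stated proportionalities, not a measured benchmark.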

For problems where either $N_D$ or $N_C$ is large, it is a good idea to make a few test runs with a faster algorithm and a small number of tries before attempting to use one of the slower, more reliable search algorithms, as sketched below. For most problems, the efficiency of a design found by a faster algorithm will be within one or two percent of that of the best possible design, and this is usually sufficient if searching with a slower algorithm appears to be infeasible.
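A two-stage strategy along these lines might look like the following hypothetical sketch, which scouts with the fast exchange method before committing to a FEDOROV search (the SEED=, ITER=, and KEEP= values are illustrative):

   /* Stage 1: quick test runs with a fast method and few tries */
   proc optex data=candidates seed=104;
      model x1 x2 x3 x1*x2 x1*x3 x2*x3;
      generate n=12 method=exchange iter=5 keep=1;
   run;

   /* Stage 2: if the timing is acceptable, repeat the search
      with the slower but more reliable FEDOROV method */
   proc optex data=candidates seed=104;
      model x1 x2 x3 x1*x2 x1*x3 x2*x3;
      generate n=12 method=fedorov iter=20 keep=5;
      output out=final;
   run;

Comparing the efficiencies reported for the two stages, together with the run-time notes in the SAS log, indicates whether the longer FEDOROV search is worth its cost.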
