Testing of linear hypotheses based on estimable functions is discussed in the section Test of Hypotheses in Chapter 3: Introduction to Statistical Modeling with SAS/STAT Software, and the construction of special sets of estimable functions corresponding to Type I–Type IV hypotheses is discussed in Chapter 15: The Four Types of Estimable Functions. In linear regression models, testing of general linear hypotheses follows along the same lines. Test statistics are usually formed from sums of squares associated with the hypothesis in question. Furthermore, when $\mathbf{X}'\mathbf{X}$ is of full rank, as is the case in many regression models, the testability of any linear hypothesis is guaranteed.

Recall from Chapter 3: Introduction to Statistical Modeling with SAS/STAT Software, that the general form of a linear hypothesis for the parameters is $H\colon \mathbf{L}\boldsymbol{\beta} = \mathbf{d}$, where $\mathbf{L}$ is $q \times p$, $\boldsymbol{\beta}$ is $p \times 1$, and $\mathbf{d}$ is $q \times 1$. To test this hypothesis, you form the corresponding linear function of the parameter estimates, $\mathbf{L}\widehat{\boldsymbol{\beta}} - \mathbf{d}$. This linear function in $\widehat{\boldsymbol{\beta}}$ has variance

$$\mathrm{Var}\left[\mathbf{L}\widehat{\boldsymbol{\beta}} - \mathbf{d}\right] = \mathbf{L}\,\mathrm{Var}\left[\widehat{\boldsymbol{\beta}}\right]\mathbf{L}' = \sigma^2\,\mathbf{L}\left(\mathbf{X}'\mathbf{X}\right)^{-}\mathbf{L}'$$
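As a minimal NumPy sketch (not SAS code), the variance of the linear function can be computed directly from the design matrix; the particular design, contrast row, and error variance below are assumptions chosen for illustration:

```python
import numpy as np

# Sketch: variance of L*beta_hat - d in a full-rank model (illustrative values).
rng = np.random.default_rng(42)
X = np.column_stack([np.ones(10), rng.normal(size=(10, 2))])  # 10 x 3 design, full rank
L = np.array([[0.0, 1.0, -1.0]])    # one-row L (q = 1): contrast beta_1 - beta_2
sigma2 = 0.25                       # error variance, assumed known for illustration

XtX_inv = np.linalg.inv(X.T @ X)    # (X'X)^- reduces to the ordinary inverse here
var_Lb = sigma2 * (L @ XtX_inv @ L.T)  # q x q variance matrix of L*beta_hat - d
```

Because $\mathbf{X}$ has full rank here, the generalized inverse $(\mathbf{X}'\mathbf{X})^{-}$ is simply the ordinary inverse.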
The *sum of squares due to the hypothesis* is a simple quadratic form:

$$\mathrm{SS}(H) = \left(\mathbf{L}\widehat{\boldsymbol{\beta}} - \mathbf{d}\right)' \left(\mathbf{L}\left(\mathbf{X}'\mathbf{X}\right)^{-}\mathbf{L}'\right)^{-1} \left(\mathbf{L}\widehat{\boldsymbol{\beta}} - \mathbf{d}\right)$$
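The quadratic form can be sketched in NumPy as follows; the simulated data, the hypothesis $\beta_1 = \beta_2$, and all numeric values are assumptions for illustration, not part of the SAS documentation:

```python
import numpy as np

# Sketch: the quadratic form SS(H) for H: L*beta = d (simulated data).
rng = np.random.default_rng(0)
n = 30
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])   # full-rank design
y = X @ np.array([1.0, 2.0, 2.0]) + rng.normal(scale=0.5, size=n)

XtX_inv = np.linalg.inv(X.T @ X)             # (X'X)^- = ordinary inverse here
beta_hat = XtX_inv @ X.T @ y                 # least squares estimate

L = np.array([[0.0, 1.0, -1.0]])             # H: beta_1 = beta_2  (q = 1)
d = np.array([0.0])

r = L @ beta_hat - d                         # the linear function L*beta_hat - d
SSH = float(r @ np.linalg.inv(L @ XtX_inv @ L.T) @ r)  # sum of squares due to H
```

Since the middle matrix is positive definite when $\mathbf{L}$ has full row rank, $\mathrm{SS}(H)$ is always nonnegative.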
If this hypothesis is testable, then $\mathrm{SS}(H)$ can be used in the numerator of an F statistic:

$$F = \frac{\mathrm{SS}(H)/q}{\widehat{\sigma}^2}$$
If $\mathbf{L}\widehat{\boldsymbol{\beta}} - \mathbf{d}$ is normally distributed, which follows as a consequence of normally distributed model errors, then this statistic follows an F distribution with $q$ numerator degrees of freedom and $n - \mathrm{rank}(\mathbf{X})$ denominator degrees of freedom. Note that it was assumed in this derivation that $\mathbf{L}$ is of full row rank $q$.
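The steps above can be assembled end to end in a short NumPy/SciPy sketch; the simulated data and the hypothesis $\beta_2 = 0$ are assumptions for illustration, and `scipy.stats.f` supplies the reference F distribution:

```python
import numpy as np
from scipy.stats import f    # reference F distribution for the p-value

# Sketch: the full F test for H: L*beta = d on simulated data.
rng = np.random.default_rng(1)
n = 40
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([1.0, 0.5, 0.0]) + rng.normal(size=n)

XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
resid = y - X @ beta_hat
rank_X = np.linalg.matrix_rank(X)
mse = resid @ resid / (n - rank_X)           # sigma_hat^2, the mean squared error

L = np.array([[0.0, 0.0, 1.0]])              # H: beta_2 = 0; full row rank, q = 1
d = np.zeros(1)
q = L.shape[0]

r = L @ beta_hat - d
SSH = float(r @ np.linalg.inv(L @ XtX_inv @ L.T) @ r)
F_stat = (SSH / q) / mse                     # F with (q, n - rank(X)) df
p_value = f.sf(F_stat, q, n - rank_X)        # upper-tail probability
```

For a single-row $\mathbf{L}$ testing one coefficient, as here, this F statistic reduces to the square of the usual t statistic for that coefficient.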