Testing Linear Hypotheses about the Regression Coefficients

Linear hypotheses for $\btheta $ are expressed in matrix form as

\[  H_0\colon \mb {L}\btheta = \mb {c}  \]

where $\mb {L}$ is a matrix of coefficients for the linear hypotheses and $\mb {c}$ is a vector of constants. The vector of regression coefficients $\btheta $ includes slope parameters as well as intercept parameters. The Wald chi-square statistic for testing $H_0$ is computed as
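As an illustration (the specific model and parameter names here are hypothetical), suppose the model has an intercept $\theta _0$ and two slopes $\theta _1$ and $\theta _2$, so that $\btheta = (\theta _0, \theta _1, \theta _2)'$. The single hypothesis $H_0\colon \theta _1 = \theta _2$ can be written in the form $\mb {L}\btheta = \mb {c}$ with

\[  \mb {L} = \begin{pmatrix} 0 &  1 &  -1 \end{pmatrix}, \quad \mb {c} = 0  \]

Each row of $\mb {L}$ encodes one linear combination of the coefficients, and several rows can be stacked to test multiple linear constraints jointly.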

\[  \chi ^2_{W} = (\mb {L}\hat{\btheta } - \mb {c})' [{\mb {L}\widehat{\bV }(\hat{\btheta })\mb {L}'}]^{-1} (\mb {L}\hat{\btheta } - \mb {c})  \]

where $\widehat{\bV }(\hat{\btheta })$ is the estimated covariance matrix of $\hat{\btheta }$, described in the section Variance Estimation. Under $H_0$, $\chi ^2_{W}$ has an asymptotic chi-square distribution with $r$ degrees of freedom, where $r$ is the rank of $\mb {L}$.
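The Wald statistic above can be sketched in a few lines of linear algebra. The following is a minimal illustration, not the procedure's actual implementation; the function name and argument layout are assumptions, and in practice $\hat{\btheta }$ and $\widehat{\bV }(\hat{\btheta })$ would come from a fitted model.

```python
import numpy as np

def wald_chi_square(L, theta_hat, V_hat, c):
    """Wald chi-square statistic for H0: L @ theta = c.

    L         : (q, p) coefficient matrix for the linear hypotheses
    theta_hat : (p,)   estimated regression coefficients
    V_hat     : (p, p) estimated covariance matrix of theta_hat
    c         : (q,)   vector of constants

    Returns (statistic, degrees of freedom), where the degrees of
    freedom r is the rank of L.
    """
    L = np.atleast_2d(np.asarray(L, dtype=float))
    diff = L @ theta_hat - c                # L*theta_hat - c
    middle = L @ V_hat @ L.T                # L * V_hat * L'
    # Quadratic form diff' * middle^{-1} * diff, via a linear solve
    stat = float(diff @ np.linalg.solve(middle, diff))
    df = int(np.linalg.matrix_rank(L))      # r = rank(L)
    return stat, df

# Hypothetical example: test H0: theta_1 = 0 in a two-parameter model
stat, df = wald_chi_square(
    L=[[0.0, 1.0]],
    theta_hat=np.array([1.0, 2.0]),
    V_hat=0.25 * np.eye(2),
    c=np.array([0.0]),
)
# stat = 2.0**2 / 0.25 = 16.0 with df = 1
```

A p-value would then be obtained from the upper tail of the chi-square distribution with `df` degrees of freedom.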