


Linear hypotheses for $\boldsymbol{\beta}$ are expressed in matrix form as

\[ H_{0}\colon \mathbf{L}\boldsymbol{\beta} = \mathbf{c} \]

where $\mathbf{L}$ is a matrix of coefficients for the linear hypotheses and $\mathbf{c}$ is a vector of constants. The Wald chi-square statistic for testing $H_0$ is computed as

\[ \chi^{2}_{W} = \left( \mathbf{L}\hat{\boldsymbol{\beta}} - \mathbf{c} \right)' \left[ \mathbf{L}\hat{\mathbf{V}}(\hat{\boldsymbol{\beta}})\mathbf{L}' \right]^{-1} \left( \mathbf{L}\hat{\boldsymbol{\beta}} - \mathbf{c} \right) \]

where $\hat{\mathbf{V}}(\hat{\boldsymbol{\beta}})$ is the estimated covariance matrix of $\hat{\boldsymbol{\beta}}$. Under $H_0$, $\chi^{2}_{W}$ has an asymptotic chi-square distribution with $r$ degrees of freedom, where $r$ is the rank of $\mathbf{L}$.
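As a sketch of this computation (a hypothetical NumPy helper, not SAS code; the estimates below are made up for illustration), the statistic can be formed directly from $\mathbf{L}$, $\hat{\boldsymbol{\beta}}$, $\hat{\mathbf{V}}(\hat{\boldsymbol{\beta}})$, and $\mathbf{c}$:

```python
import numpy as np

def wald_chi_square(beta_hat, V_hat, L, c):
    """Wald chi-square statistic for H0: L beta = c, with its degrees of freedom."""
    d = L @ beta_hat - c                         # L beta-hat - c
    stat = float(d @ np.linalg.solve(L @ V_hat @ L.T, d))
    df = int(np.linalg.matrix_rank(L))           # r = rank of L
    return stat, df

# Hypothetical estimates: joint test that both coefficients are zero
beta_hat = np.array([0.5, -0.2])
V_hat = np.array([[0.04, 0.00],
                  [0.00, 0.09]])
stat, df = wald_chi_square(beta_hat, V_hat, L=np.eye(2), c=np.zeros(2))
```

The statistic would then be referred to a chi-square distribution with `df` degrees of freedom to obtain a p-value.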
Let $\boldsymbol{\beta}_0 = (\beta_{i_1}, \ldots, \beta_{i_s})'$, where $\boldsymbol{\beta}_0$ is a subset of $s$ regression coefficients, and let $\hat{\boldsymbol{\beta}}_0$ be the corresponding vector of estimates. For any vector $\mathbf{e}$ of length $s$,

\[ \mathbf{e}'\hat{\boldsymbol{\beta}}_0 \sim N\left(\mathbf{e}'\boldsymbol{\beta}_0,\; \mathbf{e}'\hat{\mathbf{V}}(\hat{\boldsymbol{\beta}}_0)\mathbf{e}\right) \]

To find the $\mathbf{e}$ such that $\mathbf{e}'\hat{\boldsymbol{\beta}}_0$ has the minimum variance, it is necessary to minimize $\mathbf{e}'\hat{\mathbf{V}}(\hat{\boldsymbol{\beta}}_0)\mathbf{e}$ subject to $\mathbf{e}'\mathbf{1}_s = 1$. Let $\mathbf{1}_s$ be a vector of 1s of length $s$. The expression to be minimized is

\[ \mathbf{e}'\hat{\mathbf{V}}(\hat{\boldsymbol{\beta}}_0)\mathbf{e} - \lambda(\mathbf{e}'\mathbf{1}_s - 1) \]
where $\lambda$ is the Lagrange multiplier. Differentiating with respect to $\mathbf{e}$ and $\lambda$, respectively, and setting the derivatives to zero yields

\[ 2\hat{\mathbf{V}}(\hat{\boldsymbol{\beta}}_0)\mathbf{e} - \lambda\mathbf{1}_s = \mathbf{0} \qquad\qquad \mathbf{e}'\mathbf{1}_s - 1 = 0 \]

Solving these equations gives

\[ \mathbf{e} = \left[\mathbf{1}_s'\hat{\mathbf{V}}^{-1}(\hat{\boldsymbol{\beta}}_0)\mathbf{1}_s\right]^{-1} \hat{\mathbf{V}}^{-1}(\hat{\boldsymbol{\beta}}_0)\mathbf{1}_s \]
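Numerically, the minimum-variance weights are the rows of $\hat{\mathbf{V}}^{-1}$ summed against $\mathbf{1}_s$ and normalized to sum to 1. A small sketch with a made-up covariance matrix (all values hypothetical):

```python
import numpy as np

# Hypothetical estimated covariance matrix V-hat(beta0-hat) for s = 3
V = np.array([[0.09, 0.01, 0.00],
              [0.01, 0.04, 0.01],
              [0.00, 0.01, 0.16]])
ones = np.ones(3)

w = np.linalg.solve(V, ones)    # V^{-1} 1_s
e = w / (ones @ w)              # e = [1_s' V^{-1} 1_s]^{-1} V^{-1} 1_s

min_var = float(e @ V @ e)      # attained variance, equal to 1 / (1_s' V^{-1} 1_s)
```

Any other weight vector that sums to 1 (for example, equal weights $1/s$) gives a variance at least as large as `min_var`.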
This provides a one degree-of-freedom test of the null hypothesis $H_0\colon \mathbf{e}'\boldsymbol{\beta}_0 = 0$, with normal test statistic

\[ Z = \frac{\mathbf{e}'\hat{\boldsymbol{\beta}}_0}{\sqrt{\mathbf{e}'\hat{\mathbf{V}}(\hat{\boldsymbol{\beta}}_0)\mathbf{e}}} \]

This test is more sensitive than the multivariate test specified by the TEST statement

    Multivariate: test X1, ..., Xs;

where X1, …, Xs are the variables with regression coefficients $\beta_{i_1}, \ldots, \beta_{i_s}$, respectively.
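Putting the pieces together, the one degree-of-freedom statistic can be sketched as follows (the subset estimates and covariance below are made up for illustration):

```python
import numpy as np

# Hypothetical subset estimates beta0-hat and covariance V-hat(beta0-hat), s = 2
beta0_hat = np.array([0.30, 0.45])
V = np.array([[0.04, 0.01],
              [0.01, 0.09]])
ones = np.ones(len(beta0_hat))

w = np.linalg.solve(V, ones)
e = w / (ones @ w)                                        # minimum-variance weights
Z = float(e @ beta0_hat) / float(np.sqrt(e @ V @ e))      # normal test statistic
```

Under $H_0$, `Z` is referred to the standard normal distribution, so a two-sided p-value is $2\,(1 - \Phi(|Z|))$.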