Testing for Normality

The NORMAL option in the FIT statement performs multivariate and univariate tests of normality.

The three multivariate tests provided are Mardia’s skewness and kurtosis tests (Mardia 1970) and the Henze-Zirkler ${T_{n, {\beta }}}$ test (Henze and Zirkler 1990). The two univariate tests provided are the Shapiro-Wilk W test and the Kolmogorov-Smirnov test. (For details on the univariate tests, see the section “Goodness-of-Fit Tests” in the chapter “The UNIVARIATE Procedure” in the Base SAS Procedures Guide.) The null hypothesis for all these tests is that the residuals are normally distributed.

For a random sample ${X_{1}, {\ldots }, X_{n}}$, ${X_{i} {\in } \mr {R}^{d}}$, where d is the dimension of ${X_{i}}$ and n is the number of observations, a measure of multivariate skewness is

\[  b_{1,d} = \frac{1}{n^{2}} \sum _{i=1}^{n} \sum _{j=1}^{n}{[ ( X_{i} - {\mu })’ {\bS }^{-1} (X_{j} - {\mu })]^{3} }  \]

where ${\mu }$ is the sample mean and S is the sample covariance matrix of X. For weighted regression, both S and ${(X_{i} - {\mu })}$ are computed by using the weights supplied by the WEIGHT statement or the _WEIGHT_ variable.

Mardia showed that, under the null hypothesis, ${\frac{n}{6}b_{1,d}}$ is asymptotically distributed as ${{\chi }^{2}( d(d+1)(d+2)/6)}$. For small samples, PROC MODEL calculates Mardia’s skewness test statistic with a small-sample correction, as ${\frac{nk}{6}b_{1,d}}$, where the correction factor $k$ is given by

\[  k = \frac{(d+1)(n+1)(n+3)}{n \left( (n+1)(d+1) - 6 \right) }  \]
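As a concrete illustration of the formulas above (not PROC MODEL’s internal code), the following dependency-free Python sketch computes ${b_{1,d}}$ and the corrected chi-square statistic for the bivariate case. It assumes unweighted data, fixes ${d=2}$ so that the ${2 \times 2}$ covariance inverse can be written out explicitly, and uses the maximum-likelihood covariance (divisor $n$); all function names are hypothetical.

```python
# Mardia's multivariate skewness b_{1,d} and the small-sample-corrected
# chi-square statistic (n*k/6)*b_{1,d}.  Illustrative sketch: d = 2 only,
# unweighted data, ML covariance with divisor n (an assumption here).

def mardia_skewness(xs):
    """Return (corrected chi-square statistic, degrees of freedom) for 2-D data."""
    n, d = len(xs), 2
    mu = [sum(x[j] for x in xs) / n for j in range(d)]
    dev = [[x[j] - mu[j] for j in range(d)] for x in xs]
    # sample covariance S and its explicit 2x2 inverse
    S = [[sum(dev[i][a] * dev[i][b] for i in range(n)) / n
          for b in range(d)] for a in range(d)]
    det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    Sinv = [[S[1][1] / det, -S[0][1] / det],
            [-S[1][0] / det, S[0][0] / det]]
    # b_{1,d} = (1/n^2) * sum_ij [ (X_i - mu)' S^{-1} (X_j - mu) ]^3
    b1 = sum(sum(dev[i][a] * Sinv[a][b] * dev[j][b]
                 for a in range(d) for b in range(d)) ** 3
             for i in range(n) for j in range(n)) / n ** 2
    # small-sample correction factor k; chi-square df = d(d+1)(d+2)/6
    k = (d + 1) * (n + 1) * (n + 3) / (n * ((n + 1) * (d + 1) - 6))
    return n * k / 6 * b1, d * (d + 1) * (d + 2) // 6
```

For ${d=2}$ the degrees of freedom are ${2 \cdot 3 \cdot 4 / 6 = 4}$, and ${b_{1,d}}$ is always nonnegative, since it is a sum of squared third-order standardized moments.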

A measure of multivariate kurtosis is given by

\[  b_{2,d} = \frac{1}{n} \sum _{i=1}^{n}{[( X_{i} - {\mu })’ {\bS }^{-1} ( X_{i} - {\mu })]^{2} }  \]

Mardia showed that under the null hypothesis, ${b_{2,d}}$ is asymptotically normally distributed with mean ${d(d+2)}$ and variance ${8d(d+2)/n}$.
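Continuing the same illustrative setup (hypothetical sketch, ${d=2}$, unweighted data, ML covariance with divisor $n$; not PROC MODEL’s internal code), the kurtosis measure and its asymptotic standard normal statistic can be computed as:

```python
# Mardia's multivariate kurtosis b_{2,d} and the z statistic based on its
# asymptotic mean d(d+2) and variance 8d(d+2)/n.  Sketch for d = 2 only.

def mardia_kurtosis(xs):
    """Return (b_{2,d}, z), where z is asymptotically N(0,1) under normality."""
    n, d = len(xs), 2
    mu = [sum(x[j] for x in xs) / n for j in range(d)]
    dev = [[x[j] - mu[j] for j in range(d)] for x in xs]
    S = [[sum(dev[i][a] * dev[i][b] for i in range(n)) / n
          for b in range(d)] for a in range(d)]
    det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    Sinv = [[S[1][1] / det, -S[0][1] / det],
            [-S[1][0] / det, S[0][0] / det]]
    # squared Mahalanobis distances (X_i - mu)' S^{-1} (X_i - mu)
    m2 = [sum(dev[i][a] * Sinv[a][b] * dev[i][b]
              for a in range(d) for b in range(d)) for i in range(n)]
    b2 = sum(q ** 2 for q in m2) / n
    z = (b2 - d * (d + 2)) / (8 * d * (d + 2) / n) ** 0.5
    return b2, z
```

With the divisor-$n$ covariance, the squared Mahalanobis distances average to exactly $d$, so ${b_{2,d} \ge d^{2}}$ by the Cauchy-Schwarz inequality.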

The Henze-Zirkler test is based on a nonnegative functional ${D(.,.)}$ that measures the distance between two distribution functions and has the property that

\[  D(\mr {N}_{d}(0, I_{d}), Q) = 0  \]

if and only if

\[  Q = \mr {N}_{d}(0, I_{d})  \]

where ${\mr {N}_{d}({\mu }, {\Sigma }_{d}) }$ is a d-dimensional normal distribution.

The distance measure ${D(.,.)}$ can be written as

\[  D_{{\beta }}( P, Q ) = \int _{\mr {R}^{d}}^{}{| \hat{P}(t) - \hat{Q}(t) |^{2} {\varphi }_{{\beta }}(t) dt}  \]

where ${\hat{P}(t)}$ and ${\hat{Q}(t)}$ are the Fourier transforms of P and Q, and ${{\varphi }_{{\beta }}(t)}$ is a weight or kernel function. The density of the normal distribution ${\mr {N}_{d}(0,{\beta }^{2}I_{d})}$ is used as ${{\varphi }_{{\beta }}(t)}$:

\[  {\varphi }_{{\beta }}(t) = ( 2{\pi }{\beta }^{2})^{\frac{-d}{2}} \mr {exp} ( \frac{- |t|^{2}}{2{\beta }^{2}} ), t ~ {\in }~  \mr {R}^{d}  \]

where ${|t| = ( t’ t)^{0.5}}$.

The parameter ${{\beta }}$ depends on ${n}$ as

\[  {\beta }_{d}(n) = \frac{1}{\sqrt {2}}( \frac{2d+1}{4} )^{1/(d+4)} n^{1/(d+4)}  \]
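This parameter is a simple closed-form function of $n$ and $d$; a direct transcription (an illustrative helper, not PROC MODEL code) is:

```python
# Henze-Zirkler smoothing parameter beta_d(n), transcribed from the formula
# above: (1/sqrt(2)) * ((2d+1)/4)^(1/(d+4)) * n^(1/(d+4)).

def hz_beta(n, d):
    return (1 / 2 ** 0.5) * ((2 * d + 1) / 4) ** (1 / (d + 4)) * n ** (1 / (d + 4))
```

Because the two factors share the exponent ${1/(d+4)}$, this equals ${\frac{1}{\sqrt {2}} ( (2d+1)n/4 )^{1/(d+4)}}$, the form in which the parameter is often quoted; ${\beta }$ grows slowly with $n$.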

The test statistic computed is called ${T_{{\beta }}(d)}$ and is approximately lognormally distributed. The lognormal distribution is used to compute the null hypothesis probability.

\begin{eqnarray*}
T_{{\beta }}(d) & = & \frac{1}{n^{2}} \sum _{j=1}^{n}{\sum _{k=1}^{n}{{\exp }\left(- \frac{{\beta }^{2}}{2} |Y_{j} - Y_{k}|^{2}\right)}} \\
&  & - \;  2(1+{\beta }^{2})^{-d/2} \frac{1}{n} \sum _{j=1}^{n}{{\exp }\left(- \frac{{\beta }^{2}}{2(1+{\beta }^{2})} |Y_{j}|^{2}\right)} + (1+2{\beta }^{2})^{-d/2}
\end{eqnarray*}

where

\[  |Y_{j} - Y_{k}|^{2} = (X_{j} - X_{k})’ {\bS }^{-1} (X_{j} - X_{k})  \]
\[  |Y_{j}|^{2} = (X_{j} - \bar{X})’ {\bS }^{-1} (X_{j} - \bar{X})  \]
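The expansion above can be evaluated term by term. The following sketch does so for ${d=2}$, under the same illustrative assumptions as before (hypothetical function, unweighted data, divisor-$n$ covariance with an explicit ${2 \times 2}$ inverse; ${\beta }$ is passed in as a parameter):

```python
# Henze-Zirkler statistic T_beta(d) for d = 2, evaluated from its three-term
# expansion.  Illustrative sketch only, not PROC MODEL's implementation.
from math import exp

def hz_statistic(xs, beta):
    n, d = len(xs), 2
    mu = [sum(x[j] for x in xs) / n for j in range(d)]
    dev = [[x[j] - mu[j] for j in range(d)] for x in xs]
    S = [[sum(dev[i][a] * dev[i][b] for i in range(n)) / n
          for b in range(d)] for a in range(d)]
    det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    Sinv = [[S[1][1] / det, -S[0][1] / det],
            [-S[1][0] / det, S[0][0] / det]]

    def maha(u, v):
        # (u - v)' S^{-1} (u - v), with u and v already centered
        w = [u[0] - v[0], u[1] - v[1]]
        return sum(w[a] * Sinv[a][b] * w[b] for a in range(d) for b in range(d))

    b2 = beta ** 2
    # (1/n^2) * sum_jk exp(-beta^2/2 * |Y_j - Y_k|^2)
    term1 = sum(exp(-b2 / 2 * maha(dev[j], dev[k]))
                for j in range(n) for k in range(n)) / n ** 2
    # 2(1+beta^2)^{-d/2} * (1/n) * sum_j exp(-beta^2/(2(1+beta^2)) * |Y_j|^2)
    term2 = 2 * (1 + b2) ** (-d / 2) * sum(
        exp(-b2 / (2 * (1 + b2)) * maha(dev[j], [0.0, 0.0]))
        for j in range(n)) / n
    return term1 - term2 + (1 + 2 * b2) ** (-d / 2)
```

Because the three terms come from expanding an integral of the squared modulus ${| \hat{P}(t) - \hat{Q}(t) |^{2}}$ against a nonnegative weight, the statistic is always nonnegative.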

Monte Carlo simulations suggest that ${T_{{\beta }}(d)}$ has good power against distributions with heavy tails.

The Shapiro-Wilk W test is computed only when the number of observations (n) is less than ${2000}$, while computation of the Kolmogorov-Smirnov test statistic requires at least ${2000}$ observations.

The following statements are an example of the use of the NORMAL option; the output it produces is shown in Figure 19.40.

proc model data=test2;
   y1 = a1 * x2 * x2 - exp( d1*x1);
   y2 = a2 * x1 * x1 + b2 * exp( d2*x2);
   fit y1 y2 / normal;
run;

Figure 19.40: Normality Test Output

The MODEL Procedure

Normality Test

Equation   Test Statistic       Value      Prob
y1         Shapiro-Wilk W        0.37    <.0001
y2         Shapiro-Wilk W        0.84    <.0001
System     Mardia Skewness      286.4    <.0001
           Mardia Kurtosis      31.28    <.0001
           Henze-Zirkler T       7.09    <.0001