Multiple Regression
The Parameter Estimates table, shown in Figure 14.5, displays each parameter estimate along with its degrees of freedom, standard error, t statistic, and p-value. Using the parameter estimates, you can also write out the fitted model:
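In general form, with the numeric estimates taken from the Parameter Estimates table in Figure 14.5, the fitted model is

   predicted GPA = b0 + b1(HSM) + b2(HSS) + b3(HSE)

where b0 is the estimated intercept and b1, b2, and b3 are the estimated coefficients for HSM, HSS, and HSE, respectively.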
The t statistic is used to test the null hypothesis that the corresponding parameter is 0 in the model. In this example, only the coefficient for HSM appears to be statistically significant (p < 0.0001). The coefficients for HSS and HSE are not significant, partly because of the relatively high correlations among the three explanatory variables: once HSM is included in the model, adding HSS and HSE does not substantially improve the fit, so their corresponding parameters are not statistically significant.
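Each t statistic is the ratio of a parameter estimate to its standard error,

   t = (parameter estimate) / (standard error)

and the p-value is the probability, under the null hypothesis, of observing a t statistic at least as extreme as the one reported, based on a t distribution with the error degrees of freedom.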
Two other statistics, tolerance and variance inflation, also appear in the Parameter Estimates table. These measure the strength of interrelationships among the explanatory variables in the model. Tolerances close to 0 and large variance inflation factor values indicate strong linear association or collinearity among the explanatory variables (Rawlings 1988, p. 277). For the GPA data, these statistics signal no problems of collinearity, even for HSE and HSS, which are the two most highly correlated variables in the data set.
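For the jth explanatory variable, these two statistics are defined as

   tolerance(j) = 1 - R-square(j)     and     VIF(j) = 1 / tolerance(j)

where R-square(j) comes from regressing that variable on all of the other explanatory variables. A tolerance near 0 (equivalently, a very large VIF) means the variable is nearly a linear combination of the other explanatory variables.

As a minimal sketch, a Parameter Estimates table that includes the tolerance and variance inflation columns can also be requested with PROC REG; the data set name gpa_data and the response variable name GPA below are assumptions, while HSM, HSS, and HSE come from the example:

   proc reg data=gpa_data;
      /* TOL and VIF request the tolerance and variance inflation columns */
      model GPA = HSM HSS HSE / tol vif;
   run;
   quit;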