References

  • Allen, D. M. (1971), “Mean Square Error of Prediction as a Criterion for Selecting Variables,” Technometrics, 13, 469–475.

  • Allen, D. M. and Cady, F. B. (1982), Analyzing Experimental Data by Regression, Belmont, CA: Lifetime Learning Publications.

  • Belsley, D. A., Kuh, E., and Welsch, R. E. (1980), Regression Diagnostics: Identifying Influential Data and Sources of Collinearity, New York: John Wiley & Sons.

  • Bock, R. D. (1975), Multivariate Statistical Methods in Behavioral Research, New York: McGraw-Hill.

  • Box, G. E. P. (1966), “The Use and Abuse of Regression,” Technometrics, 8, 625–629.

  • Cleveland, W. S., Devlin, S. J., and Grosse, E. (1988), “Regression by Local Fitting,” Journal of Econometrics, 37, 87–114.

  • Cook, R. D. (1977), “Detection of Influential Observations in Linear Regression,” Technometrics, 19, 15–18.

  • Cook, R. D. (1979), “Influential Observations in Linear Regression,” Journal of the American Statistical Association, 74, 169–174.

  • Daniel, C. and Wood, F. (1980), Fitting Equations to Data, Rev. Edition, New York: John Wiley & Sons.

  • Darlington, R. B. (1968), “Multiple Regression in Psychological Research and Practice,” Psychological Bulletin, 69, 161–182.

  • Davis, A. W. (1968), “Differential Equation of Hotelling’s Generalized $T^2$,” Annals of Mathematical Statistics, 39, 815–832.

  • Davis, A. W. (1972), “On the Marginal Distributions of the Latent Roots of the Multivariate Beta Matrix,” Annals of Mathematical Statistics, 43, 1664–1670.

  • Davis, A. W. (1979), “On the Differential Equation for Meijer’s $G_{p,p}^{p,0}$ Function, and Further Tables of Wilks’s Likelihood Ratio Criterion,” Biometrika, 66, 519–531.

  • Davis, A. W. (1980), “Further Tabulation of Hotelling’s Generalized $T^2$,” Communications in Statistics—Simulation and Computation, 9, 321–336.

  • Draper, N. R. and Smith, H. (1981), Applied Regression Analysis, 2nd Edition, New York: John Wiley & Sons.

  • Durbin, J. and Watson, G. S. (1950), “Testing for Serial Correlation in Least Squares Regression,” Biometrika, 37, 409–428.

  • Freund, R. J. and Littell, R. C. (1986), SAS System for Regression, 1986 Edition, Cary, NC: SAS Institute Inc.

  • Freund, R. J., Littell, R. C., and Spector, P. C. (1991), SAS System for Linear Models, Cary, NC: SAS Institute Inc.

  • Goodnight, J. H. (1979), “A Tutorial on the Sweep Operator,” American Statistician, 33, 149–158.

  • Hawkins, D. M. (1980), “A Note on Fitting a Regression with No Intercept Term,” American Statistician, 34, 233.

  • Hosmer, D. W., Jr. and Lemeshow, S. (1989), Applied Logistic Regression, New York: John Wiley & Sons.

  • Huber, P. J. (1973), “Robust Regression: Asymptotics, Conjectures, and Monte Carlo,” Annals of Statistics, 1, 799–821.

  • Johnston, J. (1972), Econometric Methods, 2nd Edition, New York: McGraw-Hill.

  • Kennedy, W. J., Jr. and Gentle, J. E. (1980), Statistical Computing, New York: Marcel Dekker.

  • Kvalseth, T. O. (1985), “Cautionary Note about $R^2$,” American Statistician, 39, 279–285.

  • Lee, Y. S. (1972), “Some Results on the Distribution of Wilks’s Likelihood Ratio Criterion,” Biometrika, 59, 649–664.

  • Mallows, C. L. (1973), “Some Comments on $C_p$,” Technometrics, 15, 661–675.

  • Mardia, K. V., Kent, J. T., and Bibby, J. M. (1979), Multivariate Analysis, London: Academic Press.

  • Morrison, D. F. (1976), Multivariate Statistical Methods, 2nd Edition, New York: McGraw-Hill.

  • Mosteller, F. and Tukey, J. W. (1977), Data Analysis and Regression, Reading, MA: Addison-Wesley.

  • Muller, K. E. (1998), “A New F Approximation for the Pillai-Bartlett Trace under $H_0$,” Journal of Computational and Graphical Statistics, 7, 131–137.

  • Neter, J. and Wasserman, W. (1974), Applied Linear Statistical Models, Homewood, IL: Irwin.

  • Pillai, K. C. S. (1960), Statistical Tables for Tests of Multivariate Hypotheses, Manila: Statistical Center, University of the Philippines.

  • Pillai, K. C. S. and Flury, B. N. (1984), “Percentage Points of the Largest Characteristic Root of the Multivariate Beta Matrix,” Communications in Statistics—Theory and Methods, 13, 2199–2237.

  • Pindyck, R. S. and Rubinfeld, D. L. (1981), Econometric Models and Econometric Forecasts, 2nd Edition, New York: McGraw-Hill.

  • Rao, C. R. (1973), Linear Statistical Inference and Its Applications, 2nd Edition, New York: John Wiley & Sons.

  • Rawlings, J. O. (1988), Applied Regression Analysis: A Research Tool, Pacific Grove, CA: Wadsworth & Brooks/Cole Advanced Books & Software.

  • Rousseeuw, P. J. (1984), “Least Median of Squares Regression,” Journal of the American Statistical Association, 79, 871–880.

  • Rousseeuw, P. J. and Yohai, V. (1984), “Robust Regression by Means of S-Estimators,” in J. Franke, W. Härdle, and R. D. Martin, eds., Robust and Nonlinear Time Series Analysis, number 26 in Lecture Notes in Statistics, 256–274, Berlin: Springer-Verlag.

  • Timm, N. H. (1975), Multivariate Analysis with Applications in Education and Psychology, Monterey, CA: Brooks/Cole.

  • Weisberg, S. (1985), Applied Linear Regression, 2nd Edition, New York: John Wiley & Sons.

  • Yohai, V. J. (1987), “High Breakdown Point and High Efficiency Robust Estimates for Regression,” Annals of Statistics, 15, 642–656.

  • Younger, M. S. (1979), Handbook for Linear Regression, North Scituate, MA: Duxbury Press.