The ARIMA Procedure

References

  • Akaike, H. (1974), “A New Look at the Statistical Model Identification,” IEEE Transactions on Automatic Control, AC-19, 716–723.

  • Anderson, T. W. (1971), The Statistical Analysis of Time Series, New York: John Wiley & Sons.

  • Andrews, D. F. and Herzberg, A. M. (1985), Data: A Collection of Problems from Many Fields for the Student and Research Worker, New York: Springer-Verlag.

  • Ansley, C. F. (1979), “An Algorithm for the Exact Likelihood of a Mixed Autoregressive–Moving Average Process,” Biometrika, 66, 59–65.

  • Ansley, C. F. and Newbold, P. (1980), “Finite Sample Properties of Estimators for Autoregressive Moving-Average Models,” Journal of Econometrics, 13, 159–183.

  • Bhansali, R. J. (1980), “Autoregressive and Window Estimates of the Inverse Correlation Function,” Biometrika, 67, 551–566.

  • Box, G. E. P. and Jenkins, G. M. (1976), Time Series Analysis: Forecasting and Control, Revised Edition, San Francisco: Holden-Day.

  • Box, G. E. P., Jenkins, G. M., and Reinsel, G. C. (1994), Time Series Analysis: Forecasting and Control, 3rd Edition, Englewood Cliffs, NJ: Prentice-Hall.

  • Box, G. E. P. and Tiao, G. C. (1975), “Intervention Analysis with Applications to Economic and Environmental Problems,” Journal of the American Statistical Association, 70, 70–79.

  • Brocklebank, J. C. and Dickey, D. A. (2003), SAS for Forecasting Time Series, 2nd Edition, Cary, NC: SAS Institute Inc.

  • Brockwell, P. J. and Davis, R. A. (1991), Time Series: Theory and Methods, 2nd Edition, New York: Springer-Verlag.

  • Chatfield, C. (1980), “Inverse Autocorrelations,” Journal of the Royal Statistical Society, Series A, 142, 363–377.

  • Choi, B. (1992), ARMA Model Identification, New York: Springer-Verlag.

  • Cleveland, W. S. (1972), “The Inverse Autocorrelations of a Time Series and Their Applications,” Technometrics, 14, 277.

  • Cobb, G. W. (1978), “The Problem of the Nile: Conditional Solution to a Change Point Problem,” Biometrika, 65, 243–251.

  • Davidson, J. (1981), “Problems with the Estimation of Moving Average Models,” Journal of Econometrics, 16, 295.

  • Davies, N., Triggs, C. M., and Newbold, P. (1977), “Significance Levels of the Box-Pierce Portmanteau Statistic in Finite Samples,” Biometrika, 64, 517–522.

  • de Jong, P. and Penzer, J. (1998), “Diagnosing Shocks in Time Series,” Journal of the American Statistical Association, 93, 796–806.

  • Dickey, D. A. (1976), Estimation and Testing of Nonstationary Time Series, Ph.D. dissertation, Iowa State University.

  • Dickey, D. A. and Fuller, W. A. (1979), “Distribution of the Estimators for Autoregressive Time Series with a Unit Root,” Journal of the American Statistical Association, 74, 427–431.

  • Dickey, D. A., Hasza, D. P., and Fuller, W. A. (1984), “Testing for Unit Roots in Seasonal Time Series,” Journal of the American Statistical Association, 79, 355–367.

  • Dunsmuir, W. (1984), “Large Sample Properties of Estimation in Time Series Observed at Unequally Spaced Times,” in E. Parzen, ed., Time Series Analysis of Irregularly Observed Data, New York: Springer-Verlag.

  • Findley, D. F., Monsell, B. C., Bell, W. R., Otto, M. C., and Chen, B. C. (1998), “New Capabilities and Methods of the X-12-ARIMA Seasonal Adjustment Program,” Journal of Business and Economic Statistics, 16, 127–176.

  • Fuller, W. A. (1976), Introduction to Statistical Time Series, New York: John Wiley & Sons.

  • Hamilton, J. D. (1994), Time Series Analysis, Princeton, NJ: Princeton University Press.

  • Hannan, E. J. and Rissanen, J. (1982), “Recursive Estimation of Mixed Autoregressive Moving Average Order,” Biometrika, 69, 81–94.

  • Harvey, A. C. (1981), Time Series Models, New York: John Wiley & Sons.

  • Jones, R. H. (1980), “Maximum Likelihood Fitting of ARMA Models to Time Series with Missing Observations,” Technometrics, 22, 389–396.

  • Kohn, R. and Ansley, C. F. (1985), “Efficient Estimation and Prediction in Time Series Regression Models,” Biometrika, 72, 694–697.

  • Ljung, G. M. and Box, G. E. P. (1978), “On a Measure of Lack of Fit in Time Series Models,” Biometrika, 65, 297–303.

  • Montgomery, D. C. and Johnson, L. A. (1976), Forecasting and Time Series Analysis, New York: McGraw-Hill.

  • Morf, M., Sidhu, G. S., and Kailath, T. (1974), “Some New Algorithms for Recursive Estimation in Constant, Linear, Discrete-Time Systems,” IEEE Transactions on Automatic Control, AC-19, 315–323.

  • Nelson, C. R. (1973), Applied Time Series for Managerial Forecasting, San Francisco: Holden-Day.

  • Newbold, P. (1981), “Some Recent Developments in Time Series Analysis,” International Statistical Review, 49, 53–66.

  • Newton, H. J. and Pagano, M. (1983), “The Finite Memory Prediction of Covariance Stationary Time Series,” SIAM Journal on Scientific and Statistical Computing, 4, 330–339.

  • Pankratz, A. (1983), Forecasting with Univariate Box-Jenkins Models: Concepts and Cases, New York: John Wiley & Sons.

  • Pankratz, A. (1991), Forecasting with Dynamic Regression Models, New York: John Wiley & Sons.

  • Pearlman, J. G. (1980), “An Algorithm for the Exact Likelihood of a High-Order Autoregressive–Moving Average Process,” Biometrika, 67, 232–233.

  • Priestley, M. B. (1981), Spectral Analysis and Time Series, London: Academic Press.

  • Schwarz, G. (1978), “Estimating the Dimension of a Model,” Annals of Statistics, 6, 461–464.

  • Stoffer, D. S. and Toloi, C. M. C. (1992), “A Note on the Ljung-Box-Pierce Portmanteau Statistic with Missing Data,” Statistics and Probability Letters, 13, 391–396.

  • Tsay, R. S. and Tiao, G. C. (1984), “Consistent Estimates of Autoregressive Parameters and Extended Sample Autocorrelation Function for Stationary and Nonstationary ARMA Models,” Journal of the American Statistical Association, 79, 84–96.

  • Tsay, R. S. and Tiao, G. C. (1985), “Use of Canonical Analysis in Time Series Model Identification,” Biometrika, 72, 299–315.

  • Woodfield, T. J. (1987), “Time Series Intervention Analysis Using SAS Software,” in Proceedings of the Twelfth Annual SAS Users Group International Conference, Cary, NC: SAS Institute Inc.