The HPSPLIT Procedure

References

  • Agresti, A. and Coull, B. A. (1998), “Approximate Is Better Than 'Exact' for Interval Estimation of Binomial Proportions,” American Statistician, 52, 119–126.

  • Blyth, C. R. and Still, H. A. (1983), “Binomial Confidence Intervals,” Journal of the American Statistical Association, 78, 108–116.

  • Breiman, L., Friedman, J., Olshen, R. A., and Stone, C. J. (1984), Classification and Regression Trees, Belmont, CA: Wadsworth.

  • Friedman, J. H. (1977), “A Recursive Partitioning Decision Rule for Nonparametric Classification,” IEEE Transactions on Computers, 26, 404–408.

  • Hastie, T. J., Tibshirani, R. J., and Friedman, J. H. (2001), The Elements of Statistical Learning, New York: Springer-Verlag.

  • Kass, G. V. (1980), “An Exploratory Technique for Investigating Large Quantities of Categorical Data,” Applied Statistics, 29, 119–127.

  • Quinlan, J. R. (1993), C4.5: Programs for Machine Learning, San Francisco: Morgan Kaufmann.

  • Rokach, L. and Maimon, O. (2008), Data Mining with Decision Trees: Theory and Applications, volume 69 of Series in Machine Perception and Artificial Intelligence, London: World Scientific.

  • Soman, K. P., Diwakar, S., and Ajay, V. (2010), Insight into Data Mining: Theory and Practice, New Delhi: PHI Learning.

  • Utgoff, P. E. and Clouse, J. A. (1996), A Kolmogorov-Smirnov Metric for Decision Tree Induction, Technical Report 96-3, University of Massachusetts, Amherst.

  • Wilson, E. B. (1927), “Probable Inference, the Law of Succession, and Statistical Inference,” Journal of the American Statistical Association, 22, 209–212.