- Example 5.1: Analyzing Iris Data with PROC HPCANDISC
- Example 5.2: Performing Canonical Discriminant Analysis in Single-Machine and Distributed Modes

- Mixture Modeling for Binomial Overdispersion: "Student," Pearson, Beer, and Yeast
- Modeling Zero-Inflation: Is It Better to Fish Poorly or Not to Have Fished at All?
- Looking for Multiple Modes: Are Galaxies Clustered?

- Example 6.1: Modeling Mixing Probabilities: All Mice Are Equal, but Some Mice Are More Equal Than Others
- Example 6.2: The Usefulness of Custom Starting Values: When Do Cows Eat?
- Example 6.3: Enforcing Homogeneity Constraints: Count and Dispersion—It Is All Over!

- Example 7.1: Model Selection
- Example 7.2: Modeling Binomial Data
- Example 7.3: Tweedie Model
- Example 7.4: Model Selection by the LASSO Method

- Example 9.1: Model Selection
- Example 9.2: Modeling Binomial Data
- Example 9.3: Ordinal Logistic Regression
- Example 9.4: Partitioning Data

- Example 11.1: Choosing a PLS Model by Test Set Validation
- Example 11.2: Fitting a PLS Model in Single-Machine and Distributed Modes

- Example 12.1: Analyzing Mean Temperatures of US Cities
- Example 12.2: Computing Principal Components in Single-Machine and Distributed Modes
- Example 12.3: Extracting Principal Components with NIPALS

- Example 14.1: Model Selection with Validation
- Example 14.2: Backward Selection in Single-Machine and Distributed Modes
- Example 14.3: Forward-Swap Selection
- Example 14.4: Forward Selection with Screening

- Example 15.1: Building a Classification Tree for a Binary Outcome
- Example 15.2: Cost-Complexity Pruning with Cross Validation
- Example 15.3: Creating a Regression Tree
- Example 15.4: Creating a Binary Classification Tree with Validation Data
- Example 15.5: Assessing Variable Importance