Overview of Model Comparison, Validation, and Summary Reports

What Are Model Comparison, Validation, and Summary Reports?

The SAS Model Manager model comparison, validation, and summary reports are tools that you can use to evaluate and compare the candidate models in a version, or across versions, to help you select and approve the champion model that moves to production status. The model comparison reports are analytical tools that project managers, statisticians, and analysts can use to assess the structure, performance, and resilience of candidate models. The model validation reports use statistical measures to validate the stability, performance, and calibration of risk models and parameters. The Training Summary Data Set report creates frequency and distribution charts that summarize the train table variables.
The reports present information about a number of attributes that can affect model performance. Together, the reports provide information that can serve as the analytical basis for choosing and monitoring a champion model.
Here is a description of the comparison reports:
  • For a single model, a profile report displays the profile data that is associated with the input, output, and target variables. Profile data includes the variable name, type, length, label, SAS format, measurement level, and role.
  • A profile comparison report compares the profile data for two models and notes the differences.
  • The Dynamic Lift report provides visual summaries of the performance of one or more models for predicting a binary outcome variable. (A sketch of the underlying lift computation follows this list.)
  • The Interval Target Variable report creates two plots in which you can view the actual versus predicted values and the actual versus residual values for a model. The Interval Target Variable report can be created only for prediction models.
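For example, the core computation behind a lift chart can be sketched in plain SAS. This is a minimal illustration rather than the code that SAS Model Manager generates; the table work.scored and the variables p_1 (predicted event probability) and target (binary outcome) are assumed names:

   /* Bin the scored observations into deciles by descending predicted probability */
   proc rank data=work.scored out=work.ranked groups=10 descending;
      var p_1;
      ranks decile;   /* decile 0 holds the highest predicted probabilities */
   run;

   /* Lift per decile = decile event rate / overall event rate */
   proc sql noprint;
      select mean(target) into :overall_rate from work.ranked;
      create table work.lift as
      select decile,
             count(*) as n,
             mean(target) as event_rate,
             calculated event_rate / &overall_rate as lift
      from work.ranked
      group by decile
      order by decile;
   quit;

A lift value well above 1 in the top deciles is what a useful model shows in the report's lift chart.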
The following are the Basel II model validation reports:
  • The Loss Given Default (LGD) report calculates the amount that might be lost in an investment and calculates the economic or regulatory capital for Basel II compliance.
  • The Probability of Default (PD) Validation report estimates the probability of defaulting on a debt that is owed. Probability of default is used to calculate economic or regulatory capital for Basel II compliance.
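As a simple illustration of the kind of model that a PD report validates, a probability of default can be estimated with logistic regression. This is a generic sketch, not the Basel II report code itself; the table work.loans, the binary default flag, and the predictor names are assumptions:

   /* Fit a simple PD model: estimate the probability that default = 1 */
   proc logistic data=work.loans;
      model default(event='1') = ltv debt_to_income credit_score;
      output out=work.pd_scored predicted=pd;   /* pd = estimated probability of default */
   run;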
The model validation reports are based on three groups of statistical measures:
  • The model stability measures track the change in distribution between the modeling data and the scoring data. (One such measure, a population stability index, is sketched after this list.)
  • The model performance measures check the model’s ability to distinguish between accounts that have not defaulted and accounts that have defaulted, as well as report on the relationship between actual default probability and predicted default probability.
  • The model calibration measures check the accuracy of the selected models for the LGD and the PD reports by comparing the correct quantification of the risk components with the available standards.
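As an illustration of a stability measure, the following sketch computes a population stability index (PSI) over score bins. It assumes that work.dev_bins and work.score_bins already hold the fraction of observations per bin (variables bin and pct) for the modeling data and the scoring data; all names are illustrative:

   /* PSI = sum over bins of (scoring% - modeling%) * ln(scoring% / modeling%) */
   proc sql;
      create table work.psi as
      select sum((s.pct - d.pct) * log(s.pct / d.pct)) as psi
      from work.dev_bins as d, work.score_bins as s
      where d.bin = s.bin;
   quit;

A common rule of thumb reads a PSI below 0.1 as stable and a PSI above 0.25 as a major shift in the population.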
Here is a description of the train table summary report:
The Training Summary Data Set report creates frequency and distribution charts for a training data set.
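Similar summaries can be produced directly in SAS. Here is a minimal sketch, assuming a training table work.train with a class variable channel and an interval variable income (both names are illustrative):

   ods graphics on;

   /* Frequency chart for a class variable */
   proc freq data=work.train;
      tables channel / plots=freqplot;
   run;

   /* Distribution chart for an interval variable */
   proc univariate data=work.train;
      var income;
      histogram income / normal;   /* overlay a fitted normal curve */
   run;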
After you execute a performance definition, you can generate performance monitoring results and compare the champion and challenger models:
SAS Model Manager stores the output data sets from an executed performance definition in the project folder. You can format the performance monitoring results and then view the performance monitoring results report.
After the performance definition for the champion model has been executed, you can execute a performance definition for the challenger model by using the same performance data sets. You can then create a Champion and Challenger Performance report that compares the performance of the two models.
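Outside the generated report, one simple way to compare a champion and a challenger on the same performance data is to compute a Kolmogorov-Smirnov statistic for each model's scores. This sketch assumes scored tables work.champ_scored and work.chall_scored that each contain the actual target and a predicted probability p_1 (names are illustrative):

   /* KS statistic: maximum separation between the score distributions of
      events and non-events, reported by PROC NPAR1WAY with the EDF option */
   proc npar1way data=work.champ_scored edf;
      class target;
      var p_1;
   run;

   proc npar1way data=work.chall_scored edf;
      class target;
      var p_1;
   run;

The model with the larger KS value separates the two outcome groups more sharply on that performance data set.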
You create the reports using the New Report window that you start from a project’s Reports page.

The Model Comparison, Validation, and Summary Report Input Files

SAS Model Manager uses a test table as the input table for the Dynamic Lift report and the Interval Target Variable report.
Before you can create a Dynamic Lift report or an Interval Target Variable report, make sure that a test table has been added to the SAS Metadata Repository and registered by using the Data Tables category view or SAS Management Console. Registered test tables can be viewed in the Data Tables category view. Then, specify the test table in the Default test table project property.
You specify the input table for the validation reports in the New Report window. The input table can contain only input variables, or it can contain both input and output variables. If the input table contains both input and output variables, the report generation does not need to run a scoring test to obtain the output variables.
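When the input table contains only input variables, the output variables must be produced by scoring. In plain SAS, scoring a table with exported DATA step score code follows a pattern like the one below; the file path and the table names are assumptions for illustration:

   /* Point to the model's DATA step score code (illustrative path) */
   filename score '/models/champion/score.sas';

   data work.validation_scored;
      set work.validation_input;   /* input variables only */
      %include score;              /* adds the output variables, such as the predicted probability */
   run;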
When you create a train table summary report, the train table or a specified input table is used to create the training summary data sets. The train table must be available in the SAS Metadata Repository and must be specified in the Default train table project property.

The Model Comparison, Validation, and Summary Report Output Files

SAS Model Manager stores the model comparison, validation, and summary report output files on the Model Evaluation tab of the Reports page. The name of the report is the value of the Name box that you specified in the New Report window.
Each time you create a report, these files are generated:
  • the report in HTML, PDF, RTF, or Excel format
    Note: The Loss Given Default and Probability of Default Model Validation reports can be created only in PDF and RTF formats.
  • taskCode.log
  • taskCode.sas
Here is a description of the model comparison output files:
  • report-name.html: the report output in HTML format
  • report-name.pdf: the report output in PDF format
  • report-name.rtf: the report output in RTF format
  • report-name.xls: the report output in Excel format
  • taskCode.log: the log file that contains messages from running the SAS code that creates the report
  • taskCode.sas: the SAS code that is used to create the report
After you create a report, you can view the report from the Reports page.
Note: If you save a report to a local drive, images in the reports, such as graphs, do not appear. The report images are separate files and are stored in the SAS Content Server. Always view reports from the Reports page.