Fit Statistics

The linear regression model computes several assessment measures to help you evaluate how well the model fits the data. These measures are displayed at the top of the model pane; click the currently displayed measure to see all of the available measures.
Adjusted R-square
The Adjusted R-squared value modifies the R-squared value to penalize the addition of effect variables that do not improve the model. Unlike R-squared, the adjusted value can be negative; values closer to 1 are preferred.
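For reference, one common definition, assuming n observations and p estimated parameters (including the intercept), is

\[ R^2_{\text{adj}} = 1 - \frac{(1 - R^2)(n - 1)}{n - p} \]

The exact parameter count used in the computation can vary between implementations.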
AIC
Akaike’s Information Criterion. Smaller values indicate better models, and AIC values can become negative. AIC is based on the Kullback-Leibler information measure of discrepancy between the true distribution of the response variable and the distribution specified by the model.
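For reference, a common least-squares form of AIC, assuming n observations, p estimated parameters, and error sum of squares SSE, is

\[ \text{AIC} = n \ln\!\left(\frac{\text{SSE}}{n}\right) + 2p \]

Implementations differ in the additive constants they include, so only differences in AIC between models fit to the same data are meaningful.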
AICC
Corrected Akaike’s Information Criterion. This version of AIC adjusts the value to account for sample size, so AICC penalizes extra effects more heavily than AIC does. As the sample size increases, AICC converges to AIC.
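For reference, AICC is commonly defined by adding a sample-size correction term to AIC; assuming n observations and p estimated parameters:

\[ \text{AICC} = \text{AIC} + \frac{2p(p + 1)}{n - p - 1} \]

The correction term vanishes as n grows, which is why the two criteria converge.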
Average Squared Error
The average squared error (ASE) is the sum of squared errors (SSE) divided by the number of observations. Smaller values are preferred.
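In symbols, with n observations:

\[ \text{ASE} = \frac{\text{SSE}}{n} \]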
F Value for Model
The value of the F test in a one-way ANOVA after the variances are normalized by the degrees of freedom. Larger values indicate a better fit, but very large values can also indicate overfitting.
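For reference, the overall F statistic for a regression model is commonly formed as the ratio of the mean square for the model to the mean square for error; assuming n observations, p estimated parameters (including the intercept), and model sum of squares SSM:

\[ F = \frac{\text{SSM}/(p - 1)}{\text{SSE}/(n - p)} \]

The exact degrees of freedom used in the computation can vary with how effects are counted.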
Mean Square Error
The mean squared error (MSE) is the SSE divided by the degrees of freedom for error. The degrees of freedom for error is the number of cases minus the number of weights (estimated parameters) in the model. Under the usual assumptions, MSE is an unbiased estimate of the population noise variance. Smaller values are preferred.
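In symbols, with n observations and p weights:

\[ \text{MSE} = \frac{\text{SSE}}{n - p} \]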
Observations
The number of observations used in the model.
Pr > F
The p-value associated with the corresponding F statistic. Smaller values are preferred.
R-Square
The R-squared value indicates how well the model fits the data by measuring the proportion of variation in the response variable that the model explains. R-squared values range from 0 to 1, and values closer to 1 are preferred.
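For reference, the standard definition in terms of the error sum of squares (SSE) and the total sum of squares (SST) is

\[ R^2 = 1 - \frac{\text{SSE}}{\text{SST}} \]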
Root MSE
The square root of the MSE, expressed in the same units as the response variable. Smaller values are preferred.
SBC
Schwarz’s Bayesian Criterion (SBC), also known as the Bayesian Information Criterion (BIC), is an increasing function of the model’s residual sum of squares and the number of effects: unexplained variation in the response variable and additional effects both increase its value. As a result, a lower SBC implies fewer explanatory variables, a better fit, or both. SBC penalizes free parameters more strongly than AIC does.
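For reference, a common least-squares form, assuming n observations and p estimated parameters, is

\[ \text{SBC} = n \ln\!\left(\frac{\text{SSE}}{n}\right) + p \ln(n) \]

As with AIC, implementations differ in additive constants, so compare SBC values only across models fit to the same data.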