The logistic regression model computes several assessment measures to help you evaluate how well the model fits the data. These assessment measures are available at the top of the model pane. Click the currently displayed assessment measure to see all of the available assessment measures.
-2 Log Likelihood
The
likelihood function estimates the probability of an observed sample given all possible
parameter values. The
log likelihood is simply the logarithm of the likelihood function. The likelihood function value
is -2 times the log likelihood. Smaller values are preferred.
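For a binary response fit by maximum likelihood, the usual form of this statistic is sketched below, with $\hat{p}_i$ the predicted probability for observation $i$ and $y_i$ the observed response; the product text itself does not spell out the formula, so treat this as the standard definition rather than a quote:

$$
-2\,\ell(\hat{\beta}) \;=\; -2 \sum_{i=1}^{n} \Bigl[\, y_i \log \hat{p}_i + (1 - y_i)\log\bigl(1 - \hat{p}_i\bigr) \Bigr].
$$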
AIC
Akaike’s Information Criterion. Smaller values indicate better models, and AIC values can be negative. AIC is based on the Kullback-Leibler information measure of the discrepancy between the true distribution of the response variable and the distribution specified by the model.
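Under the standard textbook definition (assumed here, since the formula is not stated above), AIC adds a penalty of two per estimated parameter to the -2 log likelihood:

$$
\mathrm{AIC} \;=\; -2\,\ell(\hat{\beta}) + 2k,
$$

where $k$ is the number of estimated parameters, including the intercept.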
AICC
Corrected Akaike’s Information Criterion. This version of AIC adjusts the value to account for sample size, so AICC penalizes extra effects more heavily than AIC does. As the sample size increases, AICC and AIC converge.
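The correction is commonly written as an extra penalty term that vanishes as the sample size $n$ grows, which is why AICC and AIC converge for large samples (again the standard form, assumed rather than quoted from the product):

$$
\mathrm{AICC} \;=\; \mathrm{AIC} + \frac{2k(k+1)}{\,n - k - 1\,}.
$$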
BIC
The Bayesian Information Criterion (BIC), also known as Schwarz’s Bayesian Criterion (SBC), increases with the unexplained variation in the response variable and with the number of effects. As a result, a lower BIC implies fewer explanatory variables, a better fit, or both. BIC penalizes free parameters more strongly than AIC does.
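For a model fit by maximum likelihood, BIC is commonly defined with a penalty that grows with the sample size $n$, which is why it penalizes free parameters more strongly than AIC's fixed penalty of $2k$ (standard form, assumed here):

$$
\mathrm{BIC} \;=\; -2\,\ell(\hat{\beta}) + k\,\ln(n).
$$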
R-Square
The R-squared value is an indicator of how well the model fits the data. R-squared values can range from 0 to 1. Values closer to 1 are preferred.
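For logistic regression, the R-squared reported is typically the generalized (Cox-Snell) R-squared, which compares the fitted model's likelihood $L(\hat{\beta})$ with that of the intercept-only model $L(0)$; the text above does not name the variant used, so this specific form is an assumption:

$$
R^2 \;=\; 1 - \left( \frac{L(0)}{L(\hat{\beta})} \right)^{2/n}.
$$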
Max-rescaled R-Square
The observed R-squared value divided by the maximum attainable R-squared value. This value is useful when the model includes categorical independent variables. Values can range from 0 to 1. Values closer to 1 are preferred.
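If the R-squared above is the generalized (Cox-Snell) version, its maximum attainable value for a discrete response is $1 - L(0)^{2/n}$, so the max-rescaled (Nagelkerke) value is commonly defined as follows (again an assumption about the variant in use):

$$
\tilde{R}^2 \;=\; \frac{R^2}{\,1 - L(0)^{2/n}\,}.
$$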