Iterative Algorithms for Model Fitting

Two iterative maximum likelihood algorithms are available in PROC SURVEYLOGISTIC to obtain the pseudo-estimate $\hat{\boldsymbol{\theta}}$ of the model parameter $\boldsymbol{\theta}$. The default is the Fisher scoring method, which is equivalent to fitting by iteratively reweighted least squares. The alternative algorithm is the Newton-Raphson method. Both algorithms give the same parameter estimates; the estimation of the covariance matrix of $\hat{\boldsymbol{\theta}}$ is described in the section Variance Estimation. For a generalized logit model, only the Newton-Raphson technique is available. You can use the TECHNIQUE= option in the MODEL statement to select a fitting algorithm.
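For example, the following sketch requests the Newton-Raphson method through the TECHNIQUE= option; the data set, design statements, and variable names are hypothetical and only illustrate where the option goes:

     proc surveylogistic data=mylib.survey;   /* hypothetical data set and variables */
        strata  region;                       /* sample design information           */
        cluster school;
        weight  samplingweight;
        model   response = x1 x2 / technique=newton;  /* TECHNIQUE=FISHER is the default */
     run;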

Iteratively Reweighted Least Squares Algorithm (Fisher Scoring)

Let $Y$ be the response variable that takes values $1, \ldots, D+1$ ($D \ge 1$). Let $j$ index all observations, and let $Y_j$ be the response value of the $j$th observation. Consider the multinomial variable $\mathbf{Z}_j = (Z_{1j}, \ldots, Z_{Dj})'$ such that

     $Z_{ij} = \begin{cases} 1 & \text{if } Y_j = i \\ 0 & \text{otherwise} \end{cases}$

and $Z_{(D+1)j} = 1 - \sum_{i=1}^{D} Z_{ij}$. With $\pi_{ij}$ denoting the probability that the $j$th observation has response value $i$, the expected value of $\mathbf{Z}_j$ is $\boldsymbol{\pi}_j = (\pi_{1j}, \ldots, \pi_{Dj})'$, and $\pi_{(D+1)j} = 1 - \sum_{i=1}^{D} \pi_{ij}$. The covariance matrix of $\mathbf{Z}_j$ is $\mathbf{V}_j$, the covariance matrix of a multinomial random variable for one trial with parameter vector $\boldsymbol{\pi}_j$. Let $\boldsymbol{\theta}$ be the vector of regression parameters; for example, $\boldsymbol{\theta} = (\alpha_1, \ldots, \alpha_D, \boldsymbol{\beta}')'$ for a cumulative logit model. Let $\mathbf{D}_j$ be the matrix of partial derivatives of $\boldsymbol{\pi}_j$ with respect to $\boldsymbol{\theta}$. The estimating equation for the regression parameters is

     $\sum_j \mathbf{D}_j' \, \mathbf{W}_j \, (\mathbf{Z}_j - \boldsymbol{\pi}_j) = \mathbf{0}$

where $\mathbf{W}_j = w_j f_j \mathbf{V}_j^{-1}$, and $w_j$ and $f_j$ are the WEIGHT and FREQ values of the $j$th observation.
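As a small numeric illustration (the values are hypothetical, not from the source), the following SAS/IML sketch forms the indicator vector $\mathbf{Z}_j$ and the one-trial multinomial covariance matrix $\mathbf{V}_j$ for a single observation with $D+1 = 3$ response levels:

     proc iml;
        /* hypothetical observation j with D+1 = 3 response levels and Y_j = 2 */
        z_j  = {0, 1};                     /* Z_ij = 1 only for i = Y_j = 2         */
        pi_j = {0.5, 0.3};                 /* assumed probabilities for levels 1..D */
        v_j  = diag(pi_j) - pi_j * pi_j`;  /* covariance of one multinomial trial   */
        print z_j pi_j v_j;
     quit;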

With a starting value $\boldsymbol{\theta}^{(0)}$, the pseudo-estimate of $\boldsymbol{\theta}$ is obtained iteratively as

     $\boldsymbol{\theta}^{(m+1)} = \boldsymbol{\theta}^{(m)} + \left( \sum_j \mathbf{D}_j' \mathbf{W}_j \mathbf{D}_j \right)^{-1} \sum_j \mathbf{D}_j' \mathbf{W}_j (\mathbf{Z}_j - \boldsymbol{\pi}_j)$

where $\mathbf{D}_j$, $\mathbf{W}_j$, and $\boldsymbol{\pi}_j$ are evaluated at the $m$th iteration $\boldsymbol{\theta}^{(m)}$. The expression after the plus sign is the step size. If the log likelihood evaluated at $\boldsymbol{\theta}^{(m+1)}$ is less than that evaluated at $\boldsymbol{\theta}^{(m)}$, then $\boldsymbol{\theta}^{(m+1)}$ is recomputed by step-halving or ridging. The iterative scheme continues until convergence is obtained, that is, until $\boldsymbol{\theta}^{(m+1)}$ is sufficiently close to $\boldsymbol{\theta}^{(m)}$. Then the maximum likelihood estimate of $\boldsymbol{\theta}$ is $\hat{\boldsymbol{\theta}} = \boldsymbol{\theta}^{(m+1)}$.
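The SAS/IML sketch below carries out this update for the binary 0/1 response special case, in which the summands reduce to $w_j f_j \mathbf{x}_j (y_j - \pi_j)$ and $w_j f_j \pi_j (1-\pi_j) \mathbf{x}_j \mathbf{x}_j'$; the toy data are hypothetical and FREQ values are taken to be 1:

     proc iml;
        /* minimal Fisher scoring (IRLS) sketch for a binary 0/1 response */
        x     = {1 2, 1 4, 1 6, 1 8};            /* design matrix with intercept column */
        y     = {0, 0, 1, 1};                    /* binary response                     */
        w     = {1.5, 0.8, 1.2, 1.0};            /* WEIGHT values w_j                   */
        theta = j(ncol(x), 1, 0);                /* starting value theta^(0) = 0        */
        do m = 1 to 25;                          /* the procedure iterates to convergence;
                                                    a fixed budget keeps the sketch short */
           p     = 1 / (1 + exp(-x*theta));      /* pi_j at the current iterate          */
           score = x` * (w # (y - p));           /* sum_j D_j' W_j (Z_j - pi_j)          */
           info  = x` * diag(w # p # (1-p)) * x; /* sum_j D_j' W_j D_j                   */
           theta = theta + inv(info) * score;    /* Fisher scoring step                  */
        end;
        print theta;
     quit;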

By default, starting values are zero for the slope parameters, and starting values are the observed cumulative logits (that is, logits of the observed cumulative proportions of response) for the intercept parameters. Alternatively, the starting values can be specified with the INEST= option in the PROC SURVEYLOGISTIC statement.
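As a sketch of the second route (assuming a binary-response model whose parameters are named Intercept, x1, and x2; all names and values here are hypothetical), the INEST= data set holds one observation with a variable per parameter:

     data start;                                 /* hypothetical starting values */
        intercept = -1.5;  x1 = 0.2;  x2 = 0;
     run;

     proc surveylogistic data=mylib.survey inest=start;
        weight samplingweight;
        model  response = x1 x2;
     run;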

Newton-Raphson Algorithm

Let

     $\mathbf{g} = \sum_j w_j f_j \, \dfrac{\partial l_j}{\partial \boldsymbol{\theta}}$
     $\mathbf{H} = - \sum_j w_j f_j \, \dfrac{\partial^2 l_j}{\partial \boldsymbol{\theta} \, \partial \boldsymbol{\theta}'}$

be the gradient vector and the Hessian matrix, where $l_j$ is the log likelihood for the $j$th observation. With a starting value $\boldsymbol{\theta}^{(0)}$, the pseudo-estimate $\hat{\boldsymbol{\theta}}$ of $\boldsymbol{\theta}$ is obtained iteratively until convergence is obtained:

     $\boldsymbol{\theta}^{(m+1)} = \boldsymbol{\theta}^{(m)} + \mathbf{H}^{-1} \mathbf{g}$

where $\mathbf{H}$ and $\mathbf{g}$ are evaluated at the $m$th iteration $\boldsymbol{\theta}^{(m)}$. If the log likelihood evaluated at $\boldsymbol{\theta}^{(m+1)}$ is less than that evaluated at $\boldsymbol{\theta}^{(m)}$, then $\boldsymbol{\theta}^{(m+1)}$ is recomputed by step-halving or ridging. The iterative scheme continues until convergence is obtained, that is, until $\boldsymbol{\theta}^{(m+1)}$ is sufficiently close to $\boldsymbol{\theta}^{(m)}$. Then the maximum likelihood estimate of $\boldsymbol{\theta}$ is $\hat{\boldsymbol{\theta}} = \boldsymbol{\theta}^{(m+1)}$.
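A minimal SAS/IML sketch of this update for the binary 0/1 response case follows; the toy data are hypothetical, and for binary logistic regression $\mathbf{g}$ and $\mathbf{H}$ take the same form as in the Fisher scoring sketch above (observed and expected information coincide). The sketch also illustrates the step-halving safeguard:

     proc iml;
        x = {1 2, 1 4, 1 6, 1 8};                /* hypothetical design matrix          */
        y = {0, 0, 1, 1};                        /* binary response                     */
        w = {1.5, 0.8, 1.2, 1.0};                /* WEIGHT values; FREQ taken as 1      */

        start loglik(theta) global(x, y, w);     /* weighted log likelihood sum_j w_j l_j */
           p = 1 / (1 + exp(-x*theta));
           return( sum(w # (y # log(p) + (1-y) # log(1-p))) );
        finish;

        theta = j(ncol(x), 1, 0);                /* starting value theta^(0) = 0        */
        do m = 1 to 25;
           p    = 1 / (1 + exp(-x*theta));
           g    = x` * (w # (y - p));            /* gradient vector g                   */
           h    = x` * diag(w # p # (1-p)) * x;  /* H as defined above (note its sign)  */
           step = inv(h) * g;
           nhalf = 0;                            /* step-halving if the step overshoots */
           do while(loglik(theta + step) < loglik(theta) & nhalf < 10);
              step  = step / 2;
              nhalf = nhalf + 1;
           end;
           theta = theta + step;                 /* Newton-Raphson step                 */
        end;
        print theta;
     quit;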