Likelihood Function

Let $\mb {g}(\cdot )$ be a link function such that

\[  \bpi =\mb {g}(\mb {x}, \btheta )  \]

where $\btheta $ is the column vector of regression coefficients. The pseudo-log-likelihood is

\[  l(\btheta ) = \sum _{h=1}^ H\sum _{i=1}^{n_ h} \sum _{j=1}^{m_{hi}} w_{hij} \left( (\log (\bpi _{hij}))'\mb {y}_{hij}+ \log (\pi _{hij(D+1)})y_{hij(D+1)} \right)  \]

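As a concrete illustration, the pseudo-log-likelihood above can be evaluated directly once the probabilities $\bpi _{hij}$ are available. The sketch below is a minimal one, with the strata/cluster/member indices $(h, i, j)$ flattened into a single observation index; it assumes a generalized-logit link with reference category $D+1$ (an illustrative choice, since the text leaves $\mb {g}$ generic). Note that the two terms in the sum simply combine into a weighted sum of $y \log \pi $ over all $D+1$ categories.

```python
import numpy as np

def pseudo_loglik(theta, X, Y, w):
    """Weighted pseudo-log-likelihood l(theta).

    theta : (D, p) coefficients, one row per non-reference category
    X     : (n, p) covariates, observations flattened over (h, i, j)
    Y     : (n, D+1) one-hot response indicators y_hij
    w     : (n,)   sampling weights w_hij
    """
    eta = X @ theta.T                          # (n, D) linear predictors
    e = np.exp(eta)
    denom = 1.0 + e.sum(axis=1, keepdims=True)
    pi = np.hstack([e / denom, 1.0 / denom])   # (n, D+1); rows sum to 1
    # sum_hij w_hij [ (log pi_hij)' y_hij + log(pi_hij(D+1)) y_hij(D+1) ]
    return float(np.sum(w[:, None] * Y * np.log(pi)))

# small demo: three observations, D+1 = 3 response categories
X = np.array([[1.0, 0.5], [1.0, -1.0], [1.0, 2.0]])
Y = np.eye(3)[[0, 2, 1]]            # observed categories 1, 3, 2
w = np.array([2.0, 1.0, 1.5])
ll0 = pseudo_loglik(np.zeros((2, 2)), X, Y, w)  # theta = 0 gives pi = 1/3 each
```

At $\btheta = \mb {0}$ every category probability is $1/(D+1)$, so the value reduces to $\left(\sum w_{hij}\right) \log \tfrac {1}{D+1}$, a convenient sanity check.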
Denote the pseudo-estimator as $\hat{\btheta }$, which is a solution to the estimating equations:

\[  \sum _{h=1}^ H\sum _{i=1}^{n_ h} \sum _{j=1}^{m_{hi}} w_{hij}\mb {D}_{hij} \left(\mr {diag}(\bpi _{hij})-\bpi _{hij}\bpi _{hij}'\right)^{-1} (\mb {y}_{hij}-\bpi _{hij})=\mb {0}  \]

where $\mb {D}_{hij}$ is the matrix of partial derivatives of the link function $\mb {g}$ with respect to $\btheta $.
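The left-hand side of the estimating equations can be computed term by term. The sketch below again flattens the $(h, i, j)$ indices and assumes the generalized-logit link for $\mb {D}_{hij}$ (an illustrative choice); here $\bpi _{hij}$ and $\mb {y}_{hij}$ are the $D$-vectors for the non-reference categories, with category $D+1$ dropped so that the covariance factor $\mr {diag}(\bpi )-\bpi \bpi '$ is nonsingular.

```python
import numpy as np

def estimating_equations(theta, X, Y, w):
    """Score vector sum_i w_i D_i V_i^{-1} (y_i - pi_i),
    with V_i = diag(pi_i) - pi_i pi_i'.

    Y : (n, D) indicators for the non-reference categories only.
    """
    n, p = X.shape
    D = theta.shape[0]
    e = np.exp(X @ theta.T)
    pi = e / (1.0 + e.sum(axis=1, keepdims=True))  # (n, D), reference omitted
    U = np.zeros(D * p)
    for i in range(n):
        V = np.diag(pi[i]) - np.outer(pi[i], pi[i])
        # D_i: partials of g wrt vec(theta); for this link
        # d pi_d / d theta_{k,j} = V[d, k] * x_j
        D_i = np.kron(V, X[i][:, None])            # (D*p, D)
        U += w[i] * D_i @ np.linalg.solve(V, Y[i] - pi[i])
    return U
```

For this particular link, $\mb {D}_{hij}$ contains a factor of $V_{hij}$ that cancels the inverse, so the score collapses to $\sum w_{hij} (\mb {y}_{hij}-\bpi _{hij}) \otimes \mb {x}_{hij}$; comparing the general and collapsed forms is a useful correctness check.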

To obtain the pseudo-estimator $\hat{\btheta }$, the procedure iterates from a starting value $\btheta ^{(0)}$ for $\btheta $. See the section Iterative Algorithms for Model Fitting for more details.
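To make the iteration concrete, the sketch below solves the estimating equations by Fisher scoring, starting from $\btheta ^{(0)} = \mb {0}$; the actual algorithms used by the procedure are the ones described in the Iterative Algorithms for Model Fitting section. It assumes the generalized-logit link, for which the score collapses to $\sum w_{hij} (\mb {y}_{hij}-\bpi _{hij}) \otimes \mb {x}_{hij}$ and the expected information per observation is $V_{hij} \otimes \mb {x}_{hij}\mb {x}_{hij}'$.

```python
import numpy as np

def fit_theta(X, Y, w, max_iter=50, tol=1e-10):
    """Fisher-scoring sketch: repeat theta <- theta + J^{-1} U,
    where U is the score and J the expected information.

    Y : (n, D) indicators for the non-reference categories only.
    """
    n, p = X.shape
    D = Y.shape[1]
    theta = np.zeros((D, p))                   # starting value theta^(0)
    for _ in range(max_iter):
        e = np.exp(X @ theta.T)
        pi = e / (1.0 + e.sum(axis=1, keepdims=True))
        U = np.zeros(D * p)
        J = np.zeros((D * p, D * p))
        for i in range(n):
            V = np.diag(pi[i]) - np.outer(pi[i], pi[i])
            U += w[i] * np.kron(Y[i] - pi[i], X[i])
            J += w[i] * np.kron(V, np.outer(X[i], X[i]))
        step = np.linalg.solve(J, U)
        theta += step.reshape(D, p)
        if np.linalg.norm(step) < tol:
            break
    return theta

# demo on simulated data with D+1 = 3 categories
rng = np.random.default_rng(0)
n = 400
X = np.column_stack([np.ones(n), rng.normal(size=n)])
theta_true = np.array([[0.8, -0.5], [-0.4, 0.7]])
e = np.exp(X @ theta_true.T)
P = np.hstack([e, np.ones((n, 1))]) / (1.0 + e.sum(axis=1, keepdims=True))
cats = np.array([rng.choice(3, p=P[i]) for i in range(n)])
Y = np.eye(3)[cats][:, :2]
theta_hat = fit_theta(X, Y, np.ones(n))
```

At convergence the fitted $\hat{\btheta }$ should drive the estimating equations to (numerically) zero.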