The ENTROPY Procedure (Experimental)

Statistical Tests

Since the GME estimates have been shown to be asymptotically normally distributed, the classical Wald, Lagrange multiplier, and likelihood ratio statistics can be used for testing linear restrictions on the parameters.
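
A minimal sketch of how these tests might be requested is given below. It assumes that the procedure's TEST statement accepts the WALD, LM, LR, and ALL options (as in PROC MODEL); the data set and regressor names are hypothetical.

   proc entropy data=work.sample;
      model y = x1 x2 x3;
      /* request all three test statistics for the joint linear restriction */
      test "joint" x1 = 0, x2 = 0 ,/ all;
   run;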

Wald Tests

Let $H_{0}: L \beta = m$, where L is a set of linearly independent combinations of the elements of $\beta $. Then under the null hypothesis, the Wald test statistic,

\[ T_{W} = (L \hat{\beta } - m)' \left( L(\hat{Var}(\hat{\beta }))L' \right)^{-1}(L \hat{\beta } - m) \]

has a central $\chi ^{2}$ limiting distribution with degrees of freedom equal to the rank of L.
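
As an illustration of the formula itself (not of the procedure's internal computations), the following PROC IML sketch evaluates $T_{W}$ and its chi-square p-value from hypothetical placeholder values of $\hat{\beta }$, $\hat{Var}(\hat{\beta })$, L, and m; in practice these quantities come from the fitted model.

   proc iml;
      /* hypothetical placeholder values */
      b_hat = {1.2, 0.4, -0.3};            /* GME estimate beta_hat        */
      v_hat = {0.04 0.01 0.00,
               0.01 0.09 0.02,
               0.00 0.02 0.16};            /* estimated Var(beta_hat)      */
      L     = {0 1 0,
               0 0 1};                     /* linear combinations of beta  */
      m     = {0, 0};                      /* hypothesized values          */
      t_w   = (L*b_hat - m)` * inv(L*v_hat*L`) * (L*b_hat - m);
      df    = nrow(L);                     /* rank of L                    */
      p     = 1 - probchi(t_w, df);        /* chi-square p-value           */
      print t_w df p;
   quit;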

Pseudo-Likelihood Ratio Tests

Using the conditionally maximized entropy function, F, as a pseudo-likelihood, Mittelhammer and Cardell (2000) state that the statistic

\[ \frac{2 \hat{\psi }(\hat{\beta })}{\hat{\sigma _{\gamma }^{2}}(\hat{\beta })} \left( F(\hat{\beta }) - F(\tilde{\beta }) \right) \]

has the same limiting distribution as the Wald statistic when testing the same hypothesis. Note that $F(\hat{\beta })$ and $F(\tilde{\beta })$ are the maximum values of the entropy objective function over the full and restricted parameter spaces, respectively.
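
The statistic is simple arithmetic once the two optimal objective values are available. The sketch below uses hypothetical placeholder numbers for $F(\hat{\beta })$, $F(\tilde{\beta })$, $\hat{\psi }(\hat{\beta })$, and $\hat{\sigma _{\gamma }^{2}}(\hat{\beta })$; none of them come from an actual model.

   proc iml;
      /* hypothetical placeholder values */
      f_full  = 23.8;     /* F(beta_hat):   maximum over the full space       */
      f_restr = 21.3;     /* F(beta_tilde): maximum over the restricted space */
      psi_hat = 0.8;      /* psi_hat(beta_hat)                                */
      sig2    = 1.1;      /* sigma_gamma^2 hat, evaluated at beta_hat         */
      df      = 2;        /* number of independent restrictions               */
      plr     = (2 * psi_hat / sig2) * (f_full - f_restr);
      p       = 1 - probchi(plr, df);      /* chi-square p-value              */
      print plr df p;
   quit;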

Lagrange Multiplier Tests

Again using the GME function as a pseudo-likelihood, Mittelhammer and Cardell (2000) define the Lagrange multiplier statistic as:

\[ \frac{1}{\hat{\sigma _{\gamma }^{2}}(\tilde{\beta })} G(\tilde{\beta })'(X'X)^{-1} G(\tilde{\beta }) \]

where G is the gradient of F, evaluated at the optimum of the restricted parameter space, $\tilde{\beta }$. This test statistic has the same limiting distribution as the Wald and pseudo-likelihood ratio statistics.
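
A corresponding PROC IML sketch, again with hypothetical placeholder values for the gradient $G(\tilde{\beta })$, the regressor matrix X, and $\hat{\sigma _{\gamma }^{2}}(\tilde{\beta })$:

   proc iml;
      /* hypothetical placeholder values */
      x    = {1  2.1,
              1  0.5,
              1 -1.3,
              1  3.0,
              1  0.2};               /* regressor matrix X                */
      g    = {0.6, -1.4};            /* gradient G(beta_tilde)            */
      sig2 = 1.1;                    /* sigma_gamma^2 hat at beta_tilde   */
      lm   = (1/sig2) * g` * inv(x`*x) * g;
      df   = 1;                      /* rank of L for the restriction     */
      p    = 1 - probchi(lm, df);    /* chi-square p-value                */
      print lm df p;
   quit;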