Next: Assignment 13 Up: 22S:194 Statistical Inference II Previous: Assignment 12

Solutions

10.30 (b)
For the Huber M-estimator $ \psi(-\infty) = -k$ and $ \psi(\infty) = k$, so $ \eta = k/k = 1$ and the breakdown value is $ \eta/(1+\eta) = 1/(1+1) = 1/2$, i.e. 50%.

The formula for the breakdown given in this problem is only applicable to monotone $ \psi$ functions. For redescending $ \psi$ functions the estimating equation need not have a unique root. To resolve this one can specify that an estimator should be determined using a local root finding procedure starting at, say, the sample median. In this case the M-estimator inherits the 50% breakdown of the median. See Huber, pages 53-55, for a more complete discussion.
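The local root-finding approach described above can be sketched in a few lines of Python. This is an illustrative implementation, not code from the text: the sample data are made up, and $k = 1.345$ is a conventional tuning constant for the Huber $\psi$, not a value specified in the problem.

```python
import statistics

def huber_psi(u, k=1.345):
    """Huber's psi function: identity on [-k, k], clipped to +/-k outside."""
    return max(-k, min(k, u))

def huber_m_estimate(xs, k=1.345, tol=1e-10, max_iter=200):
    """Solve sum_i psi(x_i - theta) = 0 by fixed-point iteration,
    starting from the sample median as suggested above."""
    theta = statistics.median(xs)
    for _ in range(max_iter):
        step = sum(huber_psi(x - theta, k) for x in xs) / len(xs)
        theta += step
        if abs(step) < tol:
            break
    return theta

# one gross outlier barely moves the estimate, unlike the sample mean
data = [1.0, 1.2, 0.8, 1.1, 0.9, 50.0]
print(huber_m_estimate(data))  # close to the bulk of the data, near 1.27
```

Because $\psi$ has slope at most 1, each iteration is a contraction near the root, so the update converges; starting at the median keeps the procedure in the "good" root for redescending $\psi$ functions as well.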

Problem: Consider the setting of Problem 10.31. Derive an expression for $ -2 \log \Lambda$, where $ \Lambda$ is the likelihood ratio test statistic, and find the approximate distribution of this quantity under the null hypothesis.

Solution: The restricted likelihood corresponds to $ n_1+n_2$ Bernoulli trials with $ S_1+S_2$ successes and common success probability $ p$, so the MLE of $ p$ is $ \widehat{p} =
(S_1+S_2)/(n_1+n_2)$. The unrestricted likelihood consists of two independent sets of Bernoulli trials with success probabilities $ p_1$ and $ p_2$, and the corresponding MLEs are $ \widehat{p}_1 = S_1/n_1$ and $ \widehat{p}_2 = S_2/n_2$. The likelihood ratio statistic is therefore

$\displaystyle \Lambda = \frac{\widehat{p}^{S_1+S_2}(1-\widehat{p})^{F_1+F_2}} {\widehat{p}_1^{S_1}(1-\widehat{p}_1)^{F_1}\, \widehat{p}_2^{S_2}(1-\widehat{p}_2)^{F_2}} = \left(\frac{\widehat{p}}{\widehat{p}_1}\right)^{S_1} \left(\frac{\widehat{p}}{\widehat{p}_2}\right)^{S_2} \left(\frac{1-\widehat{p}}{1-\widehat{p}_1}\right)^{F_1} \left(\frac{1-\widehat{p}}{1-\widehat{p}_2}\right)^{F_2}$    

and

$\displaystyle -2\log\Lambda = 2 \left(S_1 \log\frac{\widehat{p}_1}{\widehat{p}} + S_2 \log\frac{\widehat{p}_2}{\widehat{p}} + F_1 \log\frac{1-\widehat{p}_1}{1-\widehat{p}} + F_2 \log\frac{1-\widehat{p}_2}{1-\widehat{p}}\right)$    

The restricted parameter space under the null hypothesis is one-dimensional and the unrestricted parameter space is two-dimensional, so the difference in dimensions is $ 2 - 1 = 1$. Thus under the null hypothesis the distribution of $ -2 \log \Lambda$ is approximately $ \chi_1^2$.
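The statistic above is easy to evaluate numerically. Here is a small sketch that computes $-2\log\Lambda$ from success/failure counts; the counts used in the example are hypothetical, chosen only to illustrate a comparison against the $\chi_1^2$ 5% critical value of about 3.84.

```python
from math import log

def neg2_log_lambda(s1, f1, s2, f2):
    """-2 log Lambda for H0: p1 = p2 with two independent binomial samples.
    s*/f* are success/failure counts; a zero count contributes 0 to the sum."""
    n1, n2 = s1 + f1, s2 + f2
    p1, p2 = s1 / n1, s2 / n2
    p = (s1 + s2) / (n1 + n2)  # pooled MLE under the null

    def term(count, num, den):
        return count * log(num / den) if count > 0 else 0.0

    return 2 * (term(s1, p1, p) + term(s2, p2, p)
                + term(f1, 1 - p1, 1 - p) + term(f2, 1 - p2, 1 - p))

# hypothetical data: 30/100 successes vs 45/100 successes
stat = neg2_log_lambda(30, 70, 45, 55)
print(stat)  # about 4.82, exceeding the chi-square(1) 5% cutoff 3.84
```

With these counts the test rejects $ p_1 = p_2$ at the 5% level, consistent with the fairly large gap between the observed proportions 0.30 and 0.45.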

10.38
The log likelihood for a random sample from a Gamma( $ \alpha,\beta$) distribution is

$\displaystyle \log L(\beta) = n \left(-\log \Gamma(\alpha) -\alpha \log\beta + (\alpha-1) \frac{1}{n}\sum\log X_i - \overline{X}/\beta\right)$    

So the score function is

$\displaystyle V_n(\beta) = \frac{\partial}{\partial\beta}\log L(\beta) = n \left(-\frac{\alpha}{\beta} + \frac{\overline{X}}{\beta^2} \right) = n \frac{\overline{X}-\alpha\beta}{\beta^2}$    

and the Fisher information is

$\displaystyle I_n(\beta) = - n E\left[\frac{\alpha}{\beta^2}-\frac{2\overline{X}}{\beta^3}\right] = n \frac{2\alpha\beta}{\beta^3}-n\frac{\alpha}{\beta^2} = n\frac{\alpha}{\beta^2}$    

So the score statistic is

$\displaystyle \frac{V_n(\beta)}{\sqrt{I_n(\beta)}} = \sqrt{n}\,\frac{\overline{X}-\alpha\beta}{\sqrt{\alpha}\,\beta} = \sqrt{n}\,\frac{\overline{X}-\alpha\beta}{\sqrt{\alpha\beta^2}}$    


Luke Tierney 2003-05-04