Next: Assignment 6 Up: 22S:194 Statistical Inference II Previous: Assignment 5

Solutions

8.5
a.
The likelihood can be written as

$\displaystyle L(\theta,\nu) = \frac{\theta^{n}\nu^{n\theta}}{\prod x_{i}^{\theta+1}} 1_{[\nu,\infty)}(x_{(1)})$    

For fixed $ \theta$, this increases in $ \nu$ for $ \nu \le x_{(1)}$ and is then zero. So $ \widehat{\nu}=x_{(1)}$, and

$\displaystyle L^{*}(\theta)$ $\displaystyle = \max_{\nu} L(\theta,\nu) = \theta^{n}\prod\left(\frac{x_{(1)}}{x_{i}}\right)^{\theta} \frac{1}{\prod x_{i}}$    
  $\displaystyle \propto \theta^{n}e^{-\theta T}$    

So $ \widehat{\theta} = n/T$, where $ T = \sum_{i=1}^{n}\log(x_{i}/x_{(1)})$.
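These closed-form MLEs are easy to sanity-check numerically. A minimal Python sketch (the parameter values $ \theta = 2$, $ \nu = 1.5$ and the sample size are illustrative; Pareto draws come from the inverse CDF):

```python
import math
import random

def pareto_mle(xs):
    """MLEs for the Pareto density theta * nu^theta / x^(theta+1), x >= nu."""
    nu_hat = min(xs)                           # nu-hat = x_(1)
    t = sum(math.log(x / nu_hat) for x in xs)  # T = sum log(x_i / x_(1))
    return nu_hat, len(xs) / t                 # theta-hat = n / T

random.seed(0)
theta, nu, n = 2.0, 1.5, 100_000
# Inverse-CDF sampling: if U ~ Uniform(0,1), then nu / U**(1/theta) is Pareto(nu, theta).
xs = [nu / random.random() ** (1 / theta) for _ in range(n)]
nu_hat, theta_hat = pareto_mle(xs)
print(nu_hat, theta_hat)  # both estimates should be close to (1.5, 2.0)
```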
b.
The likelihood ratio criterion is

$\displaystyle \Lambda(x) = \frac{L^{*}(1)}{L^{*}(\widehat{\theta})} = \frac{e^{-T}}{\left(\frac{n}{T}\right)^{n}e^{-n}} =$   const$\displaystyle \times T^{n}e^{-T}$    

This is a unimodal function of $ T$; it increases from zero to a maximum at $ T = n$ and then decreases back to zero. Therefore for any $ c > 0$

$\displaystyle R = \{x: \Lambda(x) < c\} = \{x: T < c_{1}$ or $ T > c_{2}\}$    
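The unimodality claim behind the two-sided region can be confirmed by maximizing $ \log(T^{n}e^{-T}) = n\log T - T$ on a grid; a small Python sketch (the value $ n = 10$ is illustrative):

```python
import math

def log_lrt(t, n):
    # log of T^n e^{-T}, i.e. the LRT criterion up to an additive constant
    return n * math.log(t) - t

n = 10
ts = [i / 10 for i in range(1, 400)]   # grid over (0, 40)
vals = [log_lrt(t, n) for t in ts]
peak_t = ts[vals.index(max(vals))]
print(peak_t)  # maximum at T = n, i.e. 10.0
```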

c.
The conditional density of $ X_2, \dots, X_n$, given $ X_1 = x_1$ and $ X_i \ge X_1$ for $ i = 2,\ldots,n$, is

$\displaystyle f({x}_{2},\ldots,{x}_{n}\vert x_{1}, x_{i} \ge x_{1})$ $\displaystyle = \frac{f(x_{1}) \cdots f(x_{n})} {f(x_{1})P(X_{2}>X_{1}\vert X_{1}=x_{1}) \cdots P(X_{n}>X_{1}\vert X_{1}=x_{1})}$    
  $\displaystyle = \frac{f(x_{2}) \cdots f(x_{n})}{P(X_{2}>x_{1}) \cdots P(X_{n}>x_{1})}$    

and

$\displaystyle P(X_{i} > y) = \int_{y}^{\infty}\frac{\theta\nu^{\theta}}{x^{\theta+1}}dx = \frac{\nu^{\theta}}{y^{\theta}}$    

So

$\displaystyle f({x}_{2},\ldots,{x}_{n}\vert x_{1}, x_{i} \ge x_{1}) = \theta^{n-1}\prod_{i=2}^{n}\frac{x_{1}^{\theta}}{x_{i}^{\theta+1}} 1_{\{x_{i} > x_{1}\}}$    

Let $ Y_{i} = X_{i}/x_{1}$, $ i = 2,\ldots,n$. Then

$\displaystyle f_{Y}({y}_{2},\ldots,{y}_{n}\vert x_{1}, x_{i} > x_{1})$ $\displaystyle = x_{1}^{n-1}f(y_{2}x_{1},\ldots,y_{n}x_{1}\vert x_{1},x_{i}>x_{1})$    
  $\displaystyle = \frac{\theta^{n-1}}{y_{2}^{\theta+1}\cdots y_{n}^{\theta+1}}$    

i.e. $ {Y}_{2},\ldots,{Y}_{n}$ are $ i.i.d.$ with density $ \theta/y^{\theta+1}$, and $ T = \log Y_{2}+ \cdots + \log Y_{n}$.

If $ Z = \log Y$, then

$\displaystyle f_{Z}(z) = f_{Y}(y)\frac{dy}{dz} = \frac{\theta}{e^{(\theta+1)z}}e^{z} = \theta e^{-\theta z}$    

and thus $ T\vert\{X_{1}=x_{1}, X_{i} > X_{1}\} \sim$   Gamma$ (n-1,1/\theta)$. By symmetry, this means that $ T\vert X_{(1)} \sim$   Gamma$ (n-1,1/\theta)$, which is independent of $ X_{(1)}$, so $ T$ has this distribution unconditionally as well.
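The Gamma result can also be checked by simulation: draw repeated Pareto samples, form $ T = \sum \log(x_{i}/x_{(1)})$, and compare the empirical mean and variance with $ (n-1)/\theta$ and $ (n-1)/\theta^{2}$. A Python sketch (all parameter values illustrative):

```python
import math
import random

random.seed(1)
theta, nu, n, reps = 2.0, 1.0, 10, 20_000
ts = []
for _ in range(reps):
    # Pareto(nu, theta) sample via the inverse CDF
    xs = [nu / random.random() ** (1 / theta) for _ in range(n)]
    x1 = min(xs)
    ts.append(sum(math.log(x / x1) for x in xs))  # T = sum log(x_i / x_(1))
mean_t = sum(ts) / reps
var_t = sum((t - mean_t) ** 2 for t in ts) / reps
# Gamma(n-1, 1/theta): mean (n-1)/theta = 4.5, variance (n-1)/theta^2 = 2.25
print(mean_t, var_t)
```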

For $ \theta = 1$,

$\displaystyle T$ $\displaystyle \sim$   Gamma$\displaystyle (n-1,1)$    
$\displaystyle 2T$ $\displaystyle \sim$   Gamma$\displaystyle (n-1,2) = \chi^{2}_{2(n-1)}$    
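This suggests a concrete size-$ \alpha$ test of $ H_{0}: \theta = 1$: reject when $ 2T$ falls outside central quantiles of Gamma$ (n-1,2)$. A Python sketch using equal-tailed cutoffs, estimated here by Monte Carlo rather than the exact likelihood-ratio constants $ c_{1}, c_{2}$ (parameter values illustrative):

```python
import math
import random

random.seed(2)
n, alpha, reps = 10, 0.05, 50_000

# Equal-tailed cutoffs for 2T from Monte Carlo quantiles of Gamma(n-1, scale 2)
draws = sorted(random.gammavariate(n - 1, 2) for _ in range(200_000))
c1 = draws[int(alpha / 2 * len(draws))]
c2 = draws[int((1 - alpha / 2) * len(draws))]

# Size check: under H0 (theta = 1, with nu = 1), the rejection rate should be near alpha.
rejections = 0
for _ in range(reps):
    xs = [1.0 / random.random() for _ in range(n)]  # Pareto(1, 1) draws
    x1 = min(xs)
    t2 = 2 * sum(math.log(x / x1) for x in xs)      # 2T
    rejections += not (c1 < t2 < c2)
rate = rejections / reps
print(rate)  # approximately 0.05
```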

8.6
a.
The likelihood ratio criterion is

$\displaystyle \Lambda$ $\displaystyle = \frac{\left(\frac{n+m}{\sum X_{i}+\sum Y_{i}}\right)^{n+m}e^{-(n+m)}} {\left(\frac{n}{\sum X_{i}}\right)^{n}e^{-n} \left(\frac{m}{\sum Y_{i}}\right)^{m}e^{-m}}$    
  $\displaystyle = \frac{(n+m)^{n+m}}{n^{n}m^{m}} \frac{(\sum X_{i})^{n}(\sum Y_{i})^{m}}{(\sum X_{i}+\sum Y_{i})^{n+m}}$    

The test rejects if this is small.
b.
The likelihood ratio criterion is of the form $ \Lambda =$   const$ \times T^{n}(1-T)^{m}$, where $ T = \sum X_{i}/(\sum X_{i}+\sum Y_{i})$. So the test rejects if $ T$ is too small or too large.
c.
Under $ H_{0}$, $ T \sim$   Beta$ (n,m)$.
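The Beta$ (n,m)$ null distribution of $ T = \sum X_{i}/(\sum X_{i}+\sum Y_{i})$ is easy to confirm by simulation, comparing the empirical mean and variance of $ T$ with $ n/(n+m)$ and $ nm/((n+m)^{2}(n+m+1))$. A Python sketch (sample sizes illustrative; both samples drawn with rate 1, as allowed under $ H_{0}$ since $ T$ is scale-free):

```python
import random

random.seed(3)
n, m, reps = 5, 8, 40_000
ts = []
for _ in range(reps):
    xs = [random.expovariate(1.0) for _ in range(n)]  # X_i ~ Exponential
    ys = [random.expovariate(1.0) for _ in range(m)]  # Y_i ~ Exponential, same rate under H0
    ts.append(sum(xs) / (sum(xs) + sum(ys)))          # T
mean_t = sum(ts) / reps
var_t = sum((t - mean_t) ** 2 for t in ts) / reps
# Beta(5, 8): mean = 5/13 ~ 0.385, variance = 40/(169 * 14) ~ 0.0169
print(mean_t, var_t)
```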


Luke Tierney 2003-05-04