Next: Assignment 4 Up: 22S:194 Statistical Inference II Previous: Assignment 3

Solutions

7.22
We have

\begin{align*}
\overline{X}\mid\theta &\sim N(\theta,\sigma^{2}/n)\\
\theta &\sim N(\mu,\tau^{2})
\end{align*}

a.
The joint density of $ \overline{X}, \theta$ is

$$f(\overline{x},\theta) = f(\overline{x}\mid\theta)\,f(\theta) \propto \exp\left\{-\frac{n}{2\sigma^{2}}(\overline{x}-\theta)^{2} - \frac{1}{2\tau^{2}}(\theta-\mu)^{2}\right\}$$

This is a joint normal distribution. The means and variances are

\begin{align*}
E[\theta] &= \mu & E[\overline{X}] &= E[E[\overline{X}\mid\theta]] = E[\theta] = \mu\\
\mathrm{Var}(\theta) &= \tau^{2} & \mathrm{Var}(\overline{X}) &= \mathrm{Var}(E[\overline{X}\mid\theta]) + E[\mathrm{Var}(\overline{X}\mid\theta)] = \tau^{2} + \frac{\sigma^{2}}{n}
\end{align*}

The covariance and correlation are

\begin{align*}
\mathrm{Cov}(\theta,\overline{X}) &= E[\overline{X}\theta]-\mu^{2} = E[\theta^{2}]-\mu^{2} = \mu^{2}+\tau^{2}-\mu^{2} = \tau^{2}\\
\rho &= \frac{\mathrm{Cov}(\theta,\overline{X})}{\sqrt{\mathrm{Var}(\theta)\,\mathrm{Var}(\overline{X})}} = \frac{\tau^{2}}{\sqrt{\tau^{2}(\tau^{2}+\sigma^{2}/n)}} = \frac{\tau}{\sqrt{\tau^{2}+\sigma^{2}/n}}
\end{align*}

b.
Since $(\overline{X},\theta)$ is jointly normal, the marginal distribution of $\overline{X}$ is $N(\mu, \tau^{2}+\sigma^{2}/n)$.
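As a quick numerical sanity check (a sketch with arbitrary parameter values, not part of the solution), the marginal moments of $\overline{X}$ can be verified by simulating from the hierarchy:

```python
import random
import statistics

random.seed(0)
mu, tau, sigma, n = 2.0, 1.5, 2.0, 10

# draw theta ~ N(mu, tau^2), then xbar | theta ~ N(theta, sigma^2/n)
draws = []
for _ in range(200_000):
    theta = random.gauss(mu, tau)
    draws.append(random.gauss(theta, sigma / n ** 0.5))

# sample mean should be near mu; sample variance near tau^2 + sigma^2/n
print(statistics.mean(draws), statistics.variance(draws))
```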
c.
The posterior distribution of $\theta$ is

\begin{align*}
f(\theta\mid\overline{x}) &\propto \exp\left\{-\frac{n}{2\sigma^{2}}(\overline{x}-\theta)^{2} - \frac{1}{2\tau^{2}}(\theta-\mu)^{2}\right\}\\
&\propto \exp\left\{\left(\frac{n}{\sigma^{2}}\overline{x}+\frac{\mu}{\tau^{2}}\right)\theta - \frac{1}{2}\left(\frac{n}{\sigma^{2}}+\frac{1}{\tau^{2}}\right)\theta^{2}\right\}
\end{align*}

This is a normal distribution with mean and variance

\begin{align*}
\mathrm{Var}(\theta\mid\overline{x}) &= \left(\frac{n}{\sigma^{2}}+\frac{1}{\tau^{2}}\right)^{-1} = \frac{\tau^{2}\sigma^{2}/n}{\tau^{2}+\sigma^{2}/n}\\
E[\theta\mid\overline{x}] &= \mathrm{Var}(\theta\mid\overline{x})\left(\frac{n}{\sigma^{2}}\overline{x}+\frac{\mu}{\tau^{2}}\right) = \frac{\tau^{2}}{\tau^{2}+\sigma^{2}/n}\overline{x} + \frac{\sigma^{2}/n}{\tau^{2}+\sigma^{2}/n}\mu
\end{align*}
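The closed-form posterior mean and variance can be checked numerically (a sketch, with arbitrary values chosen for $\mu$, $\tau$, $\sigma$, $n$, and $\overline{x}$) by evaluating the unnormalized posterior on a fine grid:

```python
import math

mu, tau, sigma, n, xbar = 0.0, 2.0, 1.0, 5, 1.3

# unnormalized posterior exp{-n/(2 sigma^2)(xbar-t)^2 - (t-mu)^2/(2 tau^2)} on a grid
step = 1e-4
thetas = [-10 + step * i for i in range(200_001)]
w = [math.exp(-n / (2 * sigma ** 2) * (xbar - t) ** 2
              - (t - mu) ** 2 / (2 * tau ** 2)) for t in thetas]
Z = sum(w)
post_mean = sum(t * wi for t, wi in zip(thetas, w)) / Z
post_var = sum((t - post_mean) ** 2 * wi for t, wi in zip(thetas, w)) / Z

# shrinkage form of the posterior mean, and the posterior variance formula
B = tau ** 2 / (tau ** 2 + sigma ** 2 / n)
formula_mean = B * xbar + (1 - B) * mu
formula_var = (tau ** 2 * sigma ** 2 / n) / (tau ** 2 + sigma ** 2 / n)
print(post_mean, formula_mean, post_var, formula_var)
```

The grid moments should agree with the formulas to several decimal places.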

7.23
We have $S^{2}\mid\sigma^{2} \sim \mathrm{Gamma}((n-1)/2,\; 2\sigma^{2}/(n-1))$ and

$$f(\sigma^{2}) = \frac{1}{\Gamma(\alpha)\beta^{\alpha}}\,\frac{1}{(\sigma^{2})^{\alpha+1}}\,e^{-1/(\beta\sigma^{2})}$$

The posterior distribution of $\sigma^{2}\mid S^{2}$ is therefore

\begin{align*}
f(\sigma^{2}\mid s^{2}) &\propto \frac{1}{(\sigma^{2})^{(n-1)/2}}e^{-s^{2}(n-1)/(2\sigma^{2})}\,\frac{1}{(\sigma^{2})^{\alpha+1}}e^{-1/(\beta\sigma^{2})}\\
&= \mathrm{IG}\left(\alpha+(n-1)/2,\; (1/\beta+(n-1)s^{2}/2)^{-1}\right)
\end{align*}

If $Y \sim \mathrm{IG}(a,b)$, then $V = 1/Y \sim \mathrm{Gamma}(a,b)$. So, substituting $z = v/b$,

\begin{align*}
E[Y] &= E[1/V] = \int_{0}^{\infty}\frac{1}{v}\,\frac{1}{\Gamma(a)b^{a}}\,v^{a-1}e^{-v/b}\,dv\\
&= \frac{1}{b\,\Gamma(a)}\int_{0}^{\infty}z^{a-2}e^{-z}\,dz\\
&= \frac{\Gamma(a-1)}{b\,\Gamma(a)} = \frac{1}{b(a-1)}
\end{align*}

So the posterior mean of $\sigma^{2}$ is

$$E[\sigma^{2}\mid S^{2}] = \frac{1/\beta + (n-1)S^{2}/2}{\alpha+(n-3)/2}$$
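The identity $E[Y] = 1/(b(a-1))$ for $Y \sim \mathrm{IG}(a,b)$ can be sanity-checked by Monte Carlo, using the fact that $1/Y \sim \mathrm{Gamma}(a,b)$ (a sketch with arbitrary $a,b$, requiring $a>1$):

```python
import random
import statistics

random.seed(1)
a, b = 5.0, 0.5

# Y = 1/V with V ~ Gamma(shape a, scale b), so Y ~ IG(a, b)
ys = [1 / random.gammavariate(a, b) for _ in range(200_000)]

# sample mean should be near 1/(b*(a-1)) = 0.5
print(statistics.mean(ys))
```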

7.33
From Example 7.3.5 the MSE of $ \widehat{p}_B$ is

\begin{align*}
E[(\widehat{p}_B-p)^2] &= \frac{np(1-p)}{(\alpha+\beta+n)^2} + \left(\frac{np+\alpha}{\alpha+\beta+n} - p\right)^2\\
&= \frac{np(1-p)}{(\sqrt{n/4}+\sqrt{n/4}+n)^2} + \left(\frac{np+\sqrt{n/4}}{\sqrt{n/4}+\sqrt{n/4}+n} - p\right)^2\\
&= \frac{np(1-p) + \left(np+\sqrt{n/4} - p(\sqrt{n}+n)\right)^2}{(\sqrt{n}+n)^2}\\
&= \frac{np(1-p) + \left(\sqrt{n/4}-p\sqrt{n}\right)^2}{(\sqrt{n}+n)^2}\\
&= \frac{n}{(\sqrt{n}+n)^2}\left(p(1-p) + (1/2-p)^2\right)\\
&= \frac{n}{(\sqrt{n}+n)^2}\left(p - p^2 + 1/4 - p + p^2\right)\\
&= \frac{n/4}{(\sqrt{n}+n)^2}
\end{align*}

which is constant in $ p$.
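A direct numerical check (a sketch, for one arbitrary $n$): evaluating the first line of the MSE at $\alpha=\beta=\sqrt{n/4}$ for several values of $p$ gives the same constant $(n/4)/(\sqrt{n}+n)^2$:

```python
n = 16
alpha = beta = (n / 4) ** 0.5  # prior parameters used by the Bayes estimator

def mse(p):
    # variance term plus squared bias of p_hat_B = (X + alpha)/(alpha + beta + n)
    denom = (alpha + beta + n) ** 2
    bias = (n * p + alpha) / (alpha + beta + n) - p
    return n * p * (1 - p) / denom + bias ** 2

target = (n / 4) / (n ** 0.5 + n) ** 2
vals = [mse(p) for p in (0.1, 0.25, 0.5, 0.9)]
print(all(abs(v - target) < 1e-12 for v in vals))  # True: constant in p
```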
7.38
a.
The population density is

$$\theta x^{\theta-1} = \theta x^{-1}e^{\theta\log x}, \qquad 0 < x < 1$$

So $T(X) = \frac{1}{n}\sum\log X_{i}$ is efficient for $\tau(\theta)=E_{\theta}[\log X_{1}]$. Substituting $y = -\log x$,

\begin{align*}
\tau(\theta) &= \int_{0}^{1}(\log x)\,\theta x^{\theta-1}\,dx\\
&= -\int_{0}^{\infty} y\,\theta e^{-\theta y}\,dy = -\frac{1}{\theta}
\end{align*}
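A Monte Carlo check of $E_\theta[\log X_1]$ (a sketch with one arbitrary $\theta$; note $\log X < 0$ on $(0,1)$, so the mean is $-1/\theta$). Since the CDF is $x^{\theta}$, inverse-CDF sampling gives $X = U^{1/\theta}$ for $U\sim\mathrm{Uniform}(0,1)$:

```python
import math
import random
import statistics

random.seed(2)
theta = 3.0

# inverse-CDF sampling: X = U^(1/theta) has density theta * x^(theta-1) on (0,1)
logs = [math.log(random.random() ** (1 / theta)) for _ in range(200_000)]

# sample mean should be near -1/theta
print(statistics.mean(logs))
```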

b.
The population density is

$$\frac{\log \theta}{\theta-1}\,\theta^{x} = \frac{\log \theta}{\theta-1}\,e^{x\log\theta}, \qquad 0 < x < 1$$

So

\begin{align*}
\sum\log f(x_{i}\mid\theta) &= n\left(\log\log\theta - \log(\theta-1)\right) + \sum x_{i}\log\theta\\
\sum\frac{\partial}{\partial\theta}\log f(x_{i}\mid\theta) &= n\left(\frac{1}{\theta\log\theta} - \frac{1}{\theta-1} + \frac{\overline{x}}{\theta}\right)\\
&= \frac{n}{\theta}\left(\frac{1}{\log\theta} - \frac{\theta}{\theta-1} + \overline{x}\right)
= \frac{n}{\theta}\left(\overline{x} - \left(\frac{\theta}{\theta-1} - \frac{1}{\log\theta}\right)\right)
\end{align*}

So $\overline{X}$ is efficient for $\tau(\theta) = \frac{\theta}{\theta-1}-\frac{1}{\log\theta}$.
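As a check that $\tau(\theta)=\theta/(\theta-1)-1/\log\theta$ really is $E_\theta[X_1]$ (a sketch with one arbitrary $\theta>1$), the mean can be computed by midpoint-rule integration of the density:

```python
import math

theta = 2.5
c = math.log(theta) / (theta - 1)  # normalizing constant: density is c * theta^x on (0,1)

# midpoint rule for E[X] = integral_0^1 x * c * theta^x dx
N = 100_000
h = 1.0 / N
EX = h * sum((i + 0.5) * h * c * theta ** ((i + 0.5) * h) for i in range(N))

tau = theta / (theta - 1) - 1 / math.log(theta)
print(EX, tau)  # should agree closely
```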

7.39
Done in class.


Luke Tierney 2003-05-04