Sunday, February 27, 2011

OCW-18.443: problem-set#4

My solutions to the 4th problem set in the MIT Statistics for Applications course.

You should read the handout given with this assignment, as this material is not covered in the textbook.

1. We can simply use the known method-of-moments formulae for the gamma parameters (derived in exp-8.4.3 of "Mathematical Statistics and Data Analysis"). Here is the R terminal interaction...
> data = c(0.312, 0.238, 0.446, 0.968, 0.576, 0.471, 0.596)
>
> # sample mean and sample variance (note: var() divides by n - 1)
> estimated_variance = var(data)
> estimated_mean = mean(data)
>
> # method-of-moments estimates of the gamma parameters
> estimated_lambda = estimated_mean / estimated_variance
> estimated_lambda
[1] 9.089924
>
> estimated_alpha = (estimated_mean * estimated_mean) / estimated_variance
> estimated_alpha
[1] 4.683908
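One caveat worth noting (my own note, not from the handout): R's var() divides by n - 1, while the plug-in variance $\hat{\sigma}^2$ used in the method-of-moments derivation typically divides by n. If the handout intends the 1/n version of $\hat{\lambda} = \bar{X}/\hat{\sigma}^2$ and $\hat{\alpha} = \bar{X}^2/\hat{\sigma}^2$, the numbers come out a factor of n/(n-1) larger. A minimal sketch of that variant:

# Sketch assuming sigma-hat^2 uses the 1/n denominator (R's var() uses n - 1)
data = c(0.312, 0.238, 0.446, 0.968, 0.576, 0.471, 0.596)
n = length(data)
mom_variance = mean((data - mean(data))^2)    # 1/n denominator
mom_lambda   = mean(data) / mom_variance      # lambda-hat = xbar / sigma-hat^2
mom_alpha    = mean(data)^2 / mom_variance    # alpha-hat  = xbar^2 / sigma-hat^2
c(mom_lambda, mom_alpha)                      # = n/(n-1) times the values above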


2. Let T(X) be an unbiased estimator of $p^2$, where X ~ Binomial(2, p).
Then E[T(X)] = $p^2$
=> T(0)P(X=0) + T(1)P(X=1) + T(2)P(X=2) = $p^2$
=> ${(1-p)}^2$T(0) + 2p(1-p)T(1) + $p^2$T(2) = $p^2$
=> $(1 - 2p + p^2)$T(0) + $(2p - 2p^2)$T(1) + $p^2$T(2) = $p^2$
=> $p^2$(T(0) - 2T(1) + T(2) - 1) + 2p(T(1) - T(0)) + T(0) = 0

For the above to hold for all values of p in [0,1], each coefficient must be zero:
T(0) - 2T(1) + T(2) - 1 = 0
T(1) - T(0) = 0
T(0) = 0

Solving the above gives
T(0) = 0
T(1) = 0
T(2) = 1

This is the unbiased estimator of $p^2$: T(X) = 1 if X = 2 and 0 otherwise. The most surprising part of this estimator is that it estimates $p^2$ to be 0 even when X = 1, i.e. even though there was one success (and hence p cannot be 0).
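As a quick sanity check (my addition, not part of the solution): since T(X) = 1 only when X = 2, E[T(X)] = P(X = 2) = $p^2$ exactly, and a short simulation in R confirms it. The values of p below are arbitrary.

# Simulation sketch: T(X) = 1 if X = 2, else 0, for X ~ Binomial(2, p)
set.seed(1)
for (p in c(0.2, 0.5, 0.8)) {
  x = rbinom(100000, size = 2, prob = p)   # many independent copies of X
  t = as.numeric(x == 2)                   # T(X)
  cat("p^2 =", p^2, "  mean of T(X) =", mean(t), "\n")
}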

3.


4.
