Monday, September 21, 2015

Try This Problem

Here's a little exercise for you to work on:

We know from the Gauss-Markov Theorem that, within the class of linear and unbiased estimators, the OLS estimator is the most efficient. Because it is unbiased, its Mean Squared Error (MSE) equals its variance, so it also has the smallest possible MSE within the linear and unbiased class of estimators. However, there are many linear estimators that, although biased, have a smaller MSE than the OLS estimator; the short simulation sketch after part (a) illustrates this point. You might then think of asking:
“Why don’t I try and find the linear estimator that has the smallest possible MSE?”
(a) Show that attempting to do this yields an “estimator” that can’t actually be used in practice.

(You can do this using the simple linear regression model without an intercept, although the result generalizes to the usual multiple linear regression model.)
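
To make that premise concrete before you tackle part (a), here is a minimal simulation sketch in Python (the numerical settings β = 2, σ = 3, n = 20, and the 0.9 shrinkage factor are made up purely for illustration and are not part of the exercise). It compares the MSE of OLS with that of a deliberately biased "shrunken OLS" estimator in the no-intercept model:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative (made-up) settings for the no-intercept model y_i = beta*x_i + eps_i
beta, sigma, n, n_reps = 2.0, 3.0, 20, 100_000
x = rng.uniform(0.1, 1.0, size=n)      # fixed regressors
shrink = 0.9                           # an arbitrary shrinkage factor

ols_est = np.empty(n_reps)
shrunk_est = np.empty(n_reps)
for r in range(n_reps):
    y = beta * x + rng.normal(0.0, sigma, size=n)
    b_ols = (x @ y) / (x @ x)          # OLS estimator (no intercept)
    ols_est[r] = b_ols
    shrunk_est[r] = shrink * b_ols     # biased, but still linear in y

print("MSE of OLS:          ", np.mean((ols_est - beta) ** 2))
print("MSE of shrunken OLS: ", np.mean((shrunk_est - beta) ** 2))
# With a small sample and noisy data such as these, the (biased) shrunken
# estimator typically shows the smaller MSE.
```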

(b) Now, for the simple regression model with no intercept, 

         y_i = β x_i + ε_i       ;     ε_i ~ i.i.d. [0, σ²] ,

find the linear estimator, β*, that minimizes the quantity:

h [Var(β*) / σ²] + (1 - h) [Bias(β*) / β]² ,     for 0 < h < 1.

Is β* a legitimate estimator, in the sense that it can actually be applied in practice?
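
If you want to experiment numerically before the answer appears, here is a minimal sketch (Python, with made-up values for h and Σx_i²) that evaluates this criterion along the one-parameter family of estimators β* = c·β̂_OLS. That family is used purely for illustration and is not claimed to exhaust the class of linear estimators you should consider in your derivation.

```python
import numpy as np

# Purely illustrative sketch: evaluate the part (b) criterion
#   h * Var(beta*) / sigma^2 + (1 - h) * (Bias(beta*) / beta)^2
# along the family beta* = c * b_OLS, where b_OLS = sum(x*y) / sum(x^2).
# For that family, Var(beta*) = c^2 * sigma^2 / Sxx and Bias(beta*) = (c - 1) * beta,
# so the criterion reduces to h * c^2 / Sxx + (1 - h) * (c - 1)^2.
# The values of h and Sxx below are made up for the illustration.

h = 0.5
Sxx = 10.0                                  # sum of x_i^2 (known from the data)
c_grid = np.linspace(0.0, 1.5, 301)
criterion = h * c_grid**2 / Sxx + (1 - h) * (c_grid - 1) ** 2

c_star = c_grid[np.argmin(criterion)]
print(f"criterion-minimizing c on the grid: {c_star:.3f}")
# Try different values of h and Sxx, and ask yourself what the minimizing c
# depends on. That is the point of part (b).
```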

The answer will follow in a subsequent post.


© 2015, David E. Giles

4 comments:

  1. Hi Dave: Is the answer no because you don't know the bias of B* until after you solve for h? So, it's a circular problem? Just a guess. Thanks.

  2. Not quite - the answer is going to depend on an unknown parameter.

  3. Back in grad skool, Randy Wigle invented the MVLS estimator, for cases like this. It stands for Minimum Variance Lucky Seven, and is calculated as B* = 7. Since Var(7) = 0, it works perfectly for h=1. Now, true, it *may* be biased, but we don't know whether or not it's biased without knowing the parameter B.

