Today I had to learn the Gauss-Markov Theorem. Technically, I was supposed to have learned it a few weeks ago in class, but on that day, I was really excited to plan out my tentative class schedule for the fall semester because the Economics department had just posted their classes. So I didn't pay attention.
I learned it today for my exam tomorrow. It has a snappy acronym and everything: B.L.U.E. (Best Linear Unbiased Estimator). Plus, I got to make up a hand game to remember the assumptions. There are 6 of them, so they match up with my fingers. Please forgive the lack of subscripts:
1. y=b0+b1x+e (I have thumbs, Baboons do not. Baboon has 2 Bs)
2. E(e)=0, the error has mean zero (I shot it with my index finger in the shape of a gun, so it's dead)
3. e and y are homoscedastic, meaning their variance is constant (homo sounds dirty. It goes with my middle finger)
4. cov(ei,ej)=0=cov(yi,yj) for i not equal to j (cov means covariance: "co" means "with"; my ring finger is empty, zero ring)
5. x is not random and must take at least 2 different values (is my pinky finger random? No, it is not)
6. e is normally distributed (when I fold my hands together with my fingers interlocking, it makes a shape not unlike a normal distribution)
I have two little hands folded snugly and tight... they are tiny and weak, but they know what is right! (Courtesy of the Children's Songbook)
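If you want to see the "unbiased" part of B.L.U.E. without a hand game, you can simulate the model from assumption 1 and fit it over and over. This is just an illustrative sketch: the true coefficients (2 and 3) and the sample setup are made up, not from any class problem.

```python
import numpy as np

# Made-up true coefficients for y = b0 + b1*x + e
b0_true, b1_true = 2.0, 3.0
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)  # x is fixed (not random) across repeated samples

# Draw many samples and fit OLS each time
estimates = []
for _ in range(5000):
    e = rng.normal(0, 1, size=x.size)     # errors: mean 0, constant variance, independent
    y = b0_true + b1_true * x + e
    b1_hat, b0_hat = np.polyfit(x, y, 1)  # OLS line fit (returns slope, intercept)
    estimates.append((b0_hat, b1_hat))

means = np.mean(estimates, axis=0)
print(means)  # averages of the estimates land close to the true (2.0, 3.0)
```

The average of the estimates across the 5000 simulated samples sits right on top of the true values, which is exactly what "unbiased" means.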
2 comments:
In terms of the matrix-algebra formulation, the Gauss-Markov theorem shows that the difference between the parameter covariance matrix of an arbitrary linear unbiased estimator and that of OLS is positive semi-definite.
Well, in terms of my hands (and the OLS regression model), the Gauss-Markov Theorem doesn't have nearly that many big words.
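For anyone who wants the commenter's big-words version made concrete, the claim can be checked numerically. This is a sketch with made-up data: any linear unbiased estimator has weights W = W_ols + C with C X = 0, and the covariance difference comes out positive semi-definite.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 30, 2
X = np.column_stack([np.ones(n), rng.uniform(0, 10, n)])  # design matrix: intercept + one regressor
sigma2 = 1.0

# OLS weights; Var(b_OLS) = sigma^2 * (X'X)^{-1}
XtX_inv = np.linalg.inv(X.T @ X)
W_ols = XtX_inv @ X.T

# A different linear unbiased estimator: W = W_ols + C, where C @ X = 0
M = np.eye(n) - X @ W_ols          # projects onto the space orthogonal to X's columns
C = rng.normal(size=(k, n)) @ M    # ensures C @ X = 0, so the estimator stays unbiased
W = W_ols + C

cov_ols = sigma2 * XtX_inv
cov_other = sigma2 * (W @ W.T)

eigvals = np.linalg.eigvalsh(cov_other - cov_ols)
print(eigvals)  # all nonnegative (up to rounding): the difference is positive semi-definite
```

Since W_ols @ C.T works out to zero, the difference reduces to sigma^2 * C @ C.T, which is positive semi-definite by construction. OLS really is "best."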