
Solution to Review Question

by Qiang Gao, updated on May 8, 2017


Chapter 1 Finite-Sample Properties of OLS

Section 3 Finite-Sample Properties of OLS

...

Review Question 1.3.4 (Gauss-Markov for unconditional variance)

(a) Prove: $$ \mathrm{Var} ( \widehat{\boldsymbol{\beta}} ) = \mathrm{E} [ \mathrm{Var} ( \widehat{\boldsymbol{\beta}} \mid \mathbf{X} ) ] + \mathrm{Var} [ \mathrm{E} ( \widehat{\boldsymbol{\beta}} \mid \mathbf{X} ) ] $$.

(b) Prove (1.3.3) in the textbook,

$$ \mathrm{Var} ( \widehat{ \boldsymbol{ \beta } } ) \ge \mathrm{Var} ( \mathbf{b} ), \tag{1.3.3} $$

where $$ \widehat{ \boldsymbol{ \beta } } $$ is any linear unbiased estimator and $$ \mathbf{b} $$ is the OLS estimator.

Solution

(a) By the definition of the conditional variance,

$$ \begin{align} \mathrm{Var} ( \widehat{\boldsymbol{\beta}} \mid \mathbf{X} ) & = \mathrm{E} ( \widehat{\boldsymbol{\beta}} \widehat{\boldsymbol{\beta}}' \mid \mathbf{X} ) - \mathrm{E} ( \widehat{\boldsymbol{\beta}} \mid \mathbf{X} ) \mathrm{E} ( \widehat{\boldsymbol{\beta}} \mid \mathbf{X} )'. \end{align} $$

Taking expectations of both sides and applying the Law of Total Expectations,

$$ \begin{align} \mathrm{E} [ \mathrm{Var} ( \widehat{\boldsymbol{\beta}} \mid \mathbf{X} ) ] & = \mathrm{E} [ \mathrm{E} ( \widehat{\boldsymbol{\beta}} \widehat{\boldsymbol{\beta}}' \mid \mathbf{X} ) ] - \mathrm{E} [ \mathrm{E} ( \widehat{\boldsymbol{\beta}} \mid \mathbf{X} ) \mathrm{E} ( \widehat{\boldsymbol{\beta}} \mid \mathbf{X} )' ] \\ & = \mathrm{E} ( \widehat{\boldsymbol{\beta}} \widehat{\boldsymbol{\beta}}' ) - \mathrm{E} [ \mathrm{E} ( \widehat{\boldsymbol{\beta}} \mid \mathbf{X} ) \mathrm{E} ( \widehat{\boldsymbol{\beta}} \mid \mathbf{X} )' ]. \tag{1} \end{align} $$

Also by the definition of variance, together with the Law of Total Expectations ($$ \mathrm{E} [ \mathrm{E} ( \widehat{\boldsymbol{\beta}} \mid \mathbf{X} ) ] = \mathrm{E} ( \widehat{\boldsymbol{\beta}} ) $$),

$$ \begin{align} \mathrm{Var} [ \mathrm{E} ( \widehat{\boldsymbol{\beta}} \mid \mathbf{X} ) ] & = \mathrm{E} [ \mathrm{E} ( \widehat{\boldsymbol{\beta}} \mid \mathbf{X} ) \mathrm{E} ( \widehat{\boldsymbol{\beta}} \mid \mathbf{X} )' ] - \mathrm{E} ( \widehat{ \boldsymbol{ \beta } } ) \mathrm{E} ( \widehat{ \boldsymbol{ \beta } } )'. \tag{2} \end{align} $$

Adding equations (1) and (2), and applying the definition of variance once more,

$$ \begin{align} \mathrm{E} [ \mathrm{Var} ( \widehat{\boldsymbol{\beta}} \mid \mathbf{X} ) ] + \mathrm{Var} [ \mathrm{E} ( \widehat{\boldsymbol{\beta}} \mid \mathbf{X} ) ] & = \mathrm{E} ( \widehat{\boldsymbol{\beta}} \widehat{\boldsymbol{\beta}}' ) - \mathrm{E} ( \widehat{ \boldsymbol{ \beta } } ) \mathrm{E} ( \widehat{ \boldsymbol{ \beta } } )' \\ & = \mathrm{Var} ( \widehat{ \boldsymbol{ \beta } } ). \end{align} $$
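As an informal numerical check that is not part of the original solution, the Python sketch below verifies this decomposition by Monte Carlo for the OLS estimator on a Gaussian design; the sample size `n`, the true coefficients `beta`, and the replication counts `R_X` and `R_eps` are arbitrary illustrative choices.

```python
import numpy as np

# Informal Monte Carlo check (not from the textbook): for the OLS estimator
# b = (X'X)^{-1} X'y in the model y = X beta + eps, verify that
#     Var(b) ~= E[Var(b | X)] + Var[E(b | X)]
# by simulating R_X designs X and, for each design, R_eps error draws.
rng = np.random.default_rng(0)
n, K, sigma2 = 50, 2, 1.0
beta = np.array([1.0, -0.5])
R_X, R_eps = 500, 500                     # illustrative replication counts

all_b = []                                # pooled draws of b (for the unconditional variance)
cond_vars, cond_means = [], []            # Var(b | X) and E(b | X), estimated per design
for _ in range(R_X):
    X = rng.normal(size=(n, K))
    P = np.linalg.inv(X.T @ X) @ X.T      # b = P y
    eps = np.sqrt(sigma2) * rng.normal(size=(R_eps, n))
    b_draws = (X @ beta + eps) @ P.T      # (R_eps, K) OLS estimates for this design
    all_b.append(b_draws)
    cond_vars.append(np.cov(b_draws, rowvar=False))
    cond_means.append(b_draws.mean(axis=0))

lhs = np.cov(np.vstack(all_b), rowvar=False)                                   # Var(b)
rhs = np.mean(cond_vars, axis=0) + np.cov(np.array(cond_means), rowvar=False)  # E[Var] + Var[E]
print(np.round(lhs, 4))
print(np.round(rhs, 4))                   # the two matrices should be close
```

The two printed matrices agree up to simulation noise, which shrinks as `R_X` and `R_eps` grow.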

(b)

$$ \begin{align} & \mathrm{Var} ( \widehat{ \boldsymbol{ \beta } } ) - \mathrm{Var} ( \mathbf{b} ) \\ = & \mathrm{E} [ \mathrm{Var} ( \widehat{\boldsymbol{\beta}} \mid \mathbf{X} ) ] + \mathrm{Var} [ \mathrm{E} ( \widehat{\boldsymbol{\beta}} \mid \mathbf{X} ) ] - \mathrm{E} [ \mathrm{Var} ( \mathbf{b} \mid \mathbf{X} ) ] - \mathrm{Var} [ \mathrm{E} ( \mathbf{b} \mid \mathbf{X} ) ] && \text{(by part (a))} \\ = & \mathrm{E} [ \mathrm{Var} ( \widehat{\boldsymbol{\beta}} \mid \mathbf{X} ) ] - \mathrm{E} [ \mathrm{Var} ( \mathbf{b} \mid \mathbf{X} ) ] && \text{($ \mathrm{E} ( \widehat{\boldsymbol{\beta}} \mid \mathbf{X} ) = \mathrm{E} ( \mathbf{b} \mid \mathbf{X} ) = \boldsymbol{ \beta } $ by unbiasedness)} \\ = & \mathrm{E} [ \mathrm{Var} ( \widehat{\boldsymbol{\beta}} \mid \mathbf{X} ) - \mathrm{Var} ( \mathbf{b} \mid \mathbf{X} ) ] && \text{(linearity of expectations)} \\ = & \text{a positive semidefinite matrix}. && \text{(Gauss-Markov Theorem)} \end{align} $$

A positive semidefinite difference is exactly what (1.3.3) asserts, so $$ \mathrm{Var} ( \widehat{ \boldsymbol{ \beta } } ) \ge \mathrm{Var} ( \mathbf{b} ) $$ in the matrix sense.
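As a complementary numerical sketch (again not part of the original solution), the code below estimates the two unconditional variances for a hypothetical alternative linear unbiased estimator, the wrongly weighted least-squares estimator $$ \widehat{\boldsymbol{\beta}}_{W} = ( \mathbf{X}' \mathbf{W} \mathbf{X} )^{-1} \mathbf{X}' \mathbf{W} \mathbf{y} $$ with weights that depend only on $$ \mathbf{X} $$, and checks that the estimated difference is positive semidefinite. Homoskedastic errors, the particular weighting rule, and all simulation settings are assumptions made purely for illustration. Since both estimators are conditionally unbiased, each unconditional variance reduces to $$ \mathrm{E} [ \mathrm{Var} ( \cdot \mid \mathbf{X} ) ] $$, which is estimated here by averaging the conditional variance formulas over draws of $$ \mathbf{X} $$.

```python
import numpy as np

# Illustrative Monte Carlo sketch (assumptions: homoskedastic errors, weights
# that are a fixed function of X). Compare the unconditional variance of OLS,
# b = (X'X)^{-1} X'y, with that of the wrongly weighted estimator
# beta_hat_W = (X'WX)^{-1} X'W y. Both are conditionally unbiased, so each
# unconditional variance equals E[Var(. | X)], estimated by averaging the
# conditional variance formulas over draws of X.
rng = np.random.default_rng(1)
n, K, sigma2, R = 40, 3, 1.0, 2000                   # illustrative sizes

diff_sum = np.zeros((K, K))
for _ in range(R):
    X = rng.normal(size=(n, K))
    W = np.diag(1.0 / (1.0 + X[:, 0] ** 2))          # arbitrary positive weights (function of X)
    var_b = sigma2 * np.linalg.inv(X.T @ X)          # Var(b | X)
    A = np.linalg.inv(X.T @ W @ X) @ X.T @ W         # beta_hat_W = A y, with A X = I
    var_w = sigma2 * A @ A.T                         # Var(beta_hat_W | X)
    diff_sum += var_w - var_b

diff = diff_sum / R                                  # estimate of Var(beta_hat_W) - Var(b)
print(np.round(diff, 4))
print(np.linalg.eigvalsh(diff).min() >= -1e-10)      # positive semidefinite up to rounding
```

Each per-design difference is positive semidefinite by the conditional Gauss-Markov Theorem, and an average of positive semidefinite matrices remains positive semidefinite, which is precisely the argument used in part (b).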


Copyright ©2017 by Qiang Gao