
Solution to Review Question

by Qiang Gao, updated on May 15, 2017


Chapter 1 Finite-Sample Properties of OLS

Section 5 Relation to Maximum Likelihood

...

Review Question 1.5.3 (Concentrated log likelihood with respect to $$ \tilde{ \sigma }^2 $$)

Writing $$ \tilde{ \sigma }^2 $$ as $$ \tilde{ \gamma } $$, the log likelihood function for the classical regression model is

$$ \log L ( \tilde{ \boldsymbol{ \beta } }, \tilde{ \gamma } ) = - \frac{n}{2} \log ( 2 \pi ) - \frac{n}{2} \log ( \tilde{ \gamma } ) - \frac{1}{ 2 \tilde{ \gamma } } ( \mathbf{y} - \mathbf{X} \tilde{ \boldsymbol{ \beta } } )' ( \mathbf{y} - \mathbf{X} \tilde{ \boldsymbol{ \beta } } ). \tag{1} $$

In the two-step maximization procedure described in the text, we first maximized this function with respect to $$ \tilde{ \boldsymbol{ \beta } } $$. Instead, first maximize with respect to $$ \tilde{ \gamma } $$ given $$ \tilde{ \boldsymbol{ \beta } } $$. Show that the concentrated log likelihood (concentrated with respect to $$ \tilde{ \gamma } \equiv \tilde{ \sigma }^2 $$) is

$$ - \frac{n}{2} [ 1 + \log ( 2 \pi ) ] - \frac{n}{2} \log \left( \frac{ ( \mathbf{y} - \mathbf{X} \tilde{ \boldsymbol{ \beta } } )' ( \mathbf{y} - \mathbf{X} \tilde{ \boldsymbol{ \beta } } ) }{ n } \right). \tag{2} $$

Solution

When maximizing (1) with respect to $$ \tilde{ \gamma } $$ given $$ \tilde{ \boldsymbol{ \beta } } $$, the first-order condition is

$$ \frac{ \partial \log L ( \tilde{ \boldsymbol{ \beta } }, \tilde{ \gamma } ) }{ \partial \tilde{ \gamma } } = - \frac{n}{2} \frac{1}{ \tilde{ \gamma } } + \frac{ ( \mathbf{y} - \mathbf{X} \tilde{ \boldsymbol{ \beta } } )' ( \mathbf{y} - \mathbf{X} \tilde{ \boldsymbol{ \beta } } ) }{2} \frac{1}{ \tilde{ \gamma }^2 } = 0, $$

which, after multiplying through by $$ 2 \tilde{ \gamma }^2 $$ and solving for $$ \tilde{ \gamma } $$, gives

$$ \tilde{ \gamma } = \frac{ ( \mathbf{y} - \mathbf{X} \tilde{ \boldsymbol{ \beta } } )' ( \mathbf{y} - \mathbf{X} \tilde{ \boldsymbol{ \beta } } ) }{n}. \tag{3} $$
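
As a quick numerical sanity check of (3) (a minimal sketch that is not part of the original solution; NumPy, the simulated data, and all variable names below are my own assumptions), the following script evaluates the log likelihood (1) over a grid of values of $$ \tilde{ \gamma } $$ for a fixed $$ \tilde{ \boldsymbol{ \beta } } $$ and confirms that the grid maximizer coincides with the average squared residual in (3):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data for the classical regression model (illustrative only;
# the sample size, coefficients, and noise scale are arbitrary choices).
n, K = 200, 3
X = rng.normal(size=(n, K))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(scale=1.5, size=n)

# An arbitrary fixed beta_tilde; it need not be the OLS estimate.
beta_tilde = np.array([0.8, -1.5, 0.0])
e = y - X @ beta_tilde                      # residuals at beta_tilde

def log_L(gamma_t):
    """Log likelihood (1) as a function of gamma_tilde, holding beta_tilde fixed."""
    return (-n / 2 * np.log(2 * np.pi)
            - n / 2 * np.log(gamma_t)
            - (e @ e) / (2 * gamma_t))

# Maximize (1) over gamma_tilde on a fine grid and compare with (3).
grid = np.linspace(0.1, 20.0, 200_000)
gamma_grid_max = grid[np.argmax(log_L(grid))]   # grid maximizer of (1)
gamma_formula = (e @ e) / n                     # equation (3)

print(gamma_grid_max, gamma_formula)            # agree up to the grid spacing
```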

Substituting partial solution (3) into objective function (1), we get the concentrated log likelihood function (2).
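
To spell out that last substitution (an intermediate computation added here for completeness; it uses only the quantities already defined above), evaluating the last term of (1) at (3) gives

$$ - \frac{1}{ 2 \tilde{ \gamma } } ( \mathbf{y} - \mathbf{X} \tilde{ \boldsymbol{ \beta } } )' ( \mathbf{y} - \mathbf{X} \tilde{ \boldsymbol{ \beta } } ) = - \frac{n}{ 2 ( \mathbf{y} - \mathbf{X} \tilde{ \boldsymbol{ \beta } } )' ( \mathbf{y} - \mathbf{X} \tilde{ \boldsymbol{ \beta } } ) } ( \mathbf{y} - \mathbf{X} \tilde{ \boldsymbol{ \beta } } )' ( \mathbf{y} - \mathbf{X} \tilde{ \boldsymbol{ \beta } } ) = - \frac{n}{2}, $$

so that (1) becomes

$$ - \frac{n}{2} \log ( 2 \pi ) - \frac{n}{2} \log \left( \frac{ ( \mathbf{y} - \mathbf{X} \tilde{ \boldsymbol{ \beta } } )' ( \mathbf{y} - \mathbf{X} \tilde{ \boldsymbol{ \beta } } ) }{ n } \right) - \frac{n}{2}, $$

and collecting the two constant terms, $$ - \frac{n}{2} \log ( 2 \pi ) - \frac{n}{2} = - \frac{n}{2} [ 1 + \log ( 2 \pi ) ] $$, yields exactly (2).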


Copyright ©2017 by Qiang Gao