# Maximum Likelihood Estimate

Suppose that a value ${\displaystyle {\hat {\theta }}(s)}$ is such that

${\displaystyle L({\hat {\theta }}(s)|s)\geq {L}(\theta |s)}$

for every ${\displaystyle \theta \in \Omega }$.

Then ${\displaystyle {\hat {\theta }}(s)}$ is the maximum likelihood estimate (MLE) of ${\displaystyle \theta }$.

The MLE can be computed for every dataset ${\displaystyle s\in {S}}$, so the function

${\displaystyle {\hat {\theta }}:S\rightarrow \Omega }$

is called the maximum likelihood estimator of ${\displaystyle \theta }$.
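As a concrete sketch of computing an MLE (assuming a hypothetical Exponential(${\displaystyle \theta }$) model with density ${\displaystyle f_{\theta }(x)=\theta e^{-\theta x}}$; the data values below are made up for illustration), the log-likelihood can be maximized directly over a grid and compared with the known closed form ${\displaystyle {\hat {\theta }}=n/\sum x_{i}}$:

```python
import math

# Hypothetical sample s from an Exponential(theta) model, f_theta(x) = theta * exp(-theta * x)
s = [0.8, 1.3, 0.4, 2.1, 0.9, 1.7]

def log_likelihood(theta, data):
    """log L(theta | s) = sum_i log f_theta(x_i)."""
    return sum(math.log(theta) - theta * x for x in data)

# Maximize over a grid of candidate theta values
grid = [i / 1000 for i in range(1, 5000)]
theta_hat = max(grid, key=lambda t: log_likelihood(t, s))

# For this model the MLE has the closed form n / sum(x_i)
closed_form = len(s) / sum(s)
print(theta_hat, closed_form)  # grid maximizer agrees with the closed-form MLE
```

The grid search is only a stand-in for a proper optimizer; it makes visible that the MLE is simply the maximizer of ${\displaystyle L(\theta |s)}$ over ${\displaystyle \Omega }$.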

### Invariance of the MLE

If ${\displaystyle {\hat {\theta }}(s)}$ is an MLE for the original parametrization and ${\displaystyle \psi }$ is a one-to-one function defined on ${\displaystyle \Omega }$,

then

${\displaystyle {\hat {\psi }}(s)=\psi ({\hat {\theta }}(s))}$

is an MLE in the new parametrization.

PROOF

Since ${\displaystyle \psi }$ is one-to-one, the likelihood in the new parametrization can be defined by

${\displaystyle L^{*}(\psi (\theta )|s)=L(\theta |s)=f_{\theta }(s)}$

Then, for every ${\displaystyle \theta \in \Omega }$,

${\displaystyle L^{*}({\hat {\psi }}(s)|s)=L^{*}(\psi ({\hat {\theta }}(s))|s)=f_{{\hat {\theta }}(s)}(s)=L({\hat {\theta }}(s)|s)\geq L(\theta |s)=L^{*}(\psi (\theta )|s)}$

so ${\displaystyle {\hat {\psi }}(s)=\psi ({\hat {\theta }}(s))}$ maximizes ${\displaystyle L^{*}}$ and is therefore an MLE in the new parametrization. ∎
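The invariance property can be checked numerically. In this sketch (assuming a hypothetical Bernoulli(${\displaystyle \theta }$) sample, with the one-to-one log-odds reparametrization ${\displaystyle \psi (\theta )=\log(\theta /(1-\theta ))}$), maximizing the reparametrized likelihood over a grid should land on ${\displaystyle \psi ({\hat {\theta }})}$:

```python
import math

# Hypothetical Bernoulli(theta) sample; the MLE of theta is the sample proportion
s = [1, 0, 1, 1, 0, 1, 1, 0]
theta_hat = sum(s) / len(s)

def psi(theta):
    """One-to-one reparametrization: log-odds."""
    return math.log(theta / (1 - theta))

def inv_psi(p):
    """Inverse map: logistic function."""
    return 1 / (1 + math.exp(-p))

def log_likelihood(theta, data):
    """Bernoulli log-likelihood."""
    return sum(x * math.log(theta) + (1 - x) * math.log(1 - theta) for x in data)

# Maximize L*(psi | s) = L(psi^{-1}(psi) | s) over a grid of psi values
grid = [i / 1000 for i in range(-4000, 4001)]
psi_hat = max(grid, key=lambda p: log_likelihood(inv_psi(p), s))

print(psi_hat, psi(theta_hat))  # grid maximizer agrees with psi(theta_hat)
```

The point of the check: no new maximization is needed in practice, since ${\displaystyle \psi ({\hat {\theta }}(s))}$ is already the maximizer in the new parametrization.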

### Asymptotic Properties of MLE

THEOREM 1

Under some regularity conditions on the statistical model ${\displaystyle \{f_{\theta }:\theta \in \Omega \}}$:

• the MLE of ${\displaystyle \theta }$,${\displaystyle {\hat {\theta }}}$, exists almost surely
• ${\displaystyle {\hat {\theta }}{\xrightarrow {a.s.}}\theta }$ as ${\displaystyle n\rightarrow \infty }$
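A small simulation can illustrate the consistency claim (a sketch, assuming a hypothetical Exponential(${\displaystyle \theta }$) model with true ${\displaystyle \theta =2}$, where the MLE has the closed form ${\displaystyle n/\sum x_{i}}$):

```python
import random

random.seed(0)
theta = 2.0  # true rate of the hypothetical Exponential(theta) model

# The MLE n / sum(x_i) should approach theta as the sample size n grows
for n in [10, 100, 10_000]:
    sample = [random.expovariate(theta) for _ in range(n)]
    theta_hat = n / sum(sample)
    print(n, theta_hat)
```

Almost-sure convergence cannot be shown by a finite simulation, of course; the run only makes the shrinking estimation error visible.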

THEOREM 2

Under some regularity conditions on the statistical model ${\displaystyle \{f_{\theta }:\theta \in \Omega \}}$:

${\displaystyle ({\hat {\theta }}-\theta )(nI(\theta ))^{1/2}{\xrightarrow {D}}N(0,1)}$

as

${\displaystyle n\rightarrow \infty }$

where

${\displaystyle I(\theta )=E_{\theta }\left[-{\frac {\partial ^{2}}{\partial \theta ^{2}}}\log f_{\theta }(x)\right]}$ ${\displaystyle \Longrightarrow }$ Fisher information

${\displaystyle I({\hat {\theta }})=E_{\hat {\theta }}\left[-{\frac {\partial ^{2}}{\partial \theta ^{2}}}\log f_{\theta }(x)\right]}$ ${\displaystyle \Longrightarrow }$ observed Fisher information (the Fisher information evaluated at the MLE)
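As an illustrative sketch of Theorem 2 (assuming a hypothetical Exponential(${\displaystyle \theta }$) model, for which the Fisher information is ${\displaystyle I(\theta )=1/\theta ^{2}}$), repeated samples of standardized MLEs should look approximately like ${\displaystyle N(0,1)}$ draws:

```python
import random
import statistics

random.seed(1)
theta, n = 2.0, 500  # hypothetical Exponential(theta) model; I(theta) = 1 / theta**2

# Draw many samples, standardize each MLE by (n * I(theta))**0.5,
# and inspect the empirical mean and standard deviation of the results
zs = []
for _ in range(2000):
    sample = [random.expovariate(theta) for _ in range(n)]
    theta_hat = n / sum(sample)
    zs.append((theta_hat - theta) * (n / theta**2) ** 0.5)

print(statistics.mean(zs), statistics.stdev(zs))  # roughly 0 and 1, as N(0, 1) predicts
```

A histogram of `zs` (not shown) would likewise be close to the standard normal density, which is what the ${\displaystyle {\xrightarrow {D}}}$ statement asserts in the limit.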