# Maximum Likelihood Estimate

Suppose that a value ${\hat {\theta }}(s)$ is such that

$L({\hat {\theta }}(s)|s)\geq L(\theta |s)$ for every $\theta \in \Omega$.

Then ${\hat {\theta }}(s)$ is the maximum likelihood estimate (MLE) of $\theta$.

The MLE can be computed for every dataset $s\in S$, so the function

${\hat {\theta }}:S\rightarrow \Omega$ is called the maximum likelihood estimator of $\theta$.
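As a concrete illustration (not from the text), consider i.i.d. data from the Exponential($\theta$) model with density $f_{\theta }(x)=\theta e^{-\theta x}$, whose MLE has the closed form ${\hat {\theta }}(s)=1/{\bar {x}}$. The sketch below, with all names chosen for illustration, computes the closed-form MLE and checks it against a crude grid search over the log-likelihood:

```python
import math
import random

# Illustrative model: i.i.d. Exponential(theta) data, density
# f_theta(x) = theta * exp(-theta * x).
# log L(theta | s) = n*log(theta) - theta * sum(s), maximized at 1/mean(s).

def log_likelihood(theta, s):
    return len(s) * math.log(theta) - theta * sum(s)

random.seed(0)
true_theta = 2.0
s = [random.expovariate(true_theta) for _ in range(1000)]

theta_hat = 1.0 / (sum(s) / len(s))           # closed-form MLE: 1 / sample mean
grid = [0.01 * k for k in range(1, 1000)]     # crude grid over Omega = (0, 10)
grid_hat = max(grid, key=lambda t: log_likelihood(t, s))

print(theta_hat)   # close to the true value 2.0
print(grid_hat)    # agrees with theta_hat up to the grid resolution
```

The grid search and the closed form agree to within the grid spacing, which is a useful sanity check whenever no closed-form maximizer is available.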

### Invariance of the MLE

If ${\hat {\theta }}(s)$ is an MLE for the original parametrization and $\psi$ is a one-to-one function defined on $\Omega$,

then

${\hat {\psi }}(s)=\psi ({\hat {\theta }}(s))$ is an MLE in the new parametrization.
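A quick numerical check of this property (an illustration, not part of the text): for Exponential($\theta$) data, reparametrize by the one-to-one map $\psi (\theta )=1/\theta$ (the distribution mean). Maximizing each likelihood by grid search should return matching estimates, ${\hat {\psi }}(s)=\psi ({\hat {\theta }}(s))$:

```python
import math
import random

# Invariance check for Exponential(theta) data with the one-to-one
# reparametrization psi(theta) = 1/theta (the mean).  All names here
# are illustrative.

random.seed(1)
s = [random.expovariate(2.0) for _ in range(500)]
n, total = len(s), sum(s)

def loglik_theta(theta):
    return n * math.log(theta) - theta * total

def loglik_psi(psi):
    # L*(psi | s) = L(psi^{-1}(psi) | s), with theta = 1/psi
    return loglik_theta(1.0 / psi)

grid = [0.001 * k for k in range(1, 5000)]
theta_hat = max(grid, key=loglik_theta)
psi_hat = max(grid, key=loglik_psi)

print(psi_hat, 1.0 / theta_hat)  # agree up to the grid resolution
```

Both maximizations land on the same point of $\Omega$, just expressed in different coordinates, which is exactly what the invariance property asserts.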

PROOF

Write the likelihood in each parametrization:

• $L(\theta |s)=f_{\theta }(s)$ in the original parametrization
• $L^{*}(\psi (\theta )|s)=f_{\theta }(s)$ in the new parametrization, which is well defined because $\psi$ is one-to-one

Then, for every $\theta \in \Omega$,

$L^{*}({\hat {\psi }}(s)|s)=L^{*}(\psi ({\hat {\theta }}(s))|s)=f_{{\hat {\theta }}(s)}(s)\geq f_{\theta }(s)=L^{*}(\psi (\theta )|s)$,

where the inequality holds because ${\hat {\theta }}(s)$ is an MLE. Hence ${\hat {\psi }}(s)$ maximizes $L^{*}$ and is an MLE in the new parametrization. $\blacksquare$

### Asymptotic Properties of MLE

THEOREM 1

Under some regularity conditions on the statistical model $\{f_{\theta }:\theta \in \Omega \}$:

• the MLE ${\hat {\theta }}$ of $\theta$ exists almost surely
• ${\hat {\theta }}\,{\xrightarrow {a.s.}}\,\theta$ as $n\rightarrow \infty$ (strong consistency)

THEOREM 2

Under some regularity conditions on the statistical model $\{f_{\theta }:\theta \in \Omega \}$,

$({\hat {\theta }}-\theta )(nI(\theta ))^{1/2}\,{\xrightarrow {D}}\,N(0,1)$ as $n\rightarrow \infty$, where

$I(\theta )=E_{\theta }\!\left[-{\dfrac {\partial ^{2}}{\partial \theta ^{2}}}\log f_{\theta }(x)\right]$ $\Longrightarrow$ Fisher information

$I({\hat {\theta }})=E_{\hat {\theta }}\!\left[-{\dfrac {\partial ^{2}}{\partial \theta ^{2}}}\log f_{\theta }(x)\right]$ $\Longrightarrow$ observed Fisher information
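Theorem 2 can be checked by simulation. As an illustration not taken from the text, consider the Bernoulli($\theta$) model, where the MLE is the sample mean and the Fisher information is $I(\theta )=1/(\theta (1-\theta ))$. Repeatedly standardizing $({\hat {\theta }}-\theta )(nI(\theta ))^{1/2}$ should produce values that look like draws from $N(0,1)$:

```python
import math
import random

# Simulation check of asymptotic normality for Bernoulli(theta):
# MLE = sample mean, Fisher information I(theta) = 1/(theta*(1-theta)).
# Sample sizes and replication counts are arbitrary illustrative choices.

random.seed(2)
theta, n, reps = 0.3, 500, 2000
info = 1.0 / (theta * (1.0 - theta))   # Fisher information per observation

z = []
for _ in range(reps):
    theta_hat = sum(random.random() < theta for _ in range(n)) / n
    z.append((theta_hat - theta) * math.sqrt(n * info))

mean = sum(z) / reps
sd = math.sqrt(sum((v - mean) ** 2 for v in z) / reps)
print(round(mean, 2), round(sd, 2))   # near 0 and 1, as Theorem 2 predicts
```

In practice $\theta$ is unknown, so the standardization uses the observed information $I({\hat {\theta }})$ instead; this is the basis of the usual Wald confidence intervals ${\hat {\theta }}\pm z_{\alpha /2}\,(nI({\hat {\theta }}))^{-1/2}$.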