Examining Maximum likelihood estimation

  • Thread starter: keith
Hi, guys. I have a series that I tried to regress on itself, i.e. S(i+1) = a + b*S(i) + error.
I used two methods: OLS regression and MLE regression.
The result from OLS is not quite satisfying, so I tried MLE regression.
How can I tell whether the MLE regression is good or not? I mean, for OLS regression you can check the t-stat and p-value, but for the MLE estimate, is there any quantity I need to look at to check whether the estimate is good?
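For concreteness, here is a minimal sketch (Python with NumPy; the series, coefficients, and sample size are made-up illustrative assumptions, not from the thread) of fitting the regression S(i+1) = a + b*S(i) + error by OLS and computing the maximized Gaussian log-likelihood at those estimates. Under normally distributed errors, the OLS coefficients coincide with the conditional MLE, which is why the two methods can be compared on the same footing:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(1)-style series: S[i+1] = a + b*S[i] + eps (illustrative data)
a_true, b_true, sigma = 0.5, 0.8, 1.0
n = 500
S = np.empty(n)
S[0] = a_true / (1 - b_true)  # start at the stationary mean
for i in range(n - 1):
    S[i + 1] = a_true + b_true * S[i] + sigma * rng.standard_normal()

# OLS regression of S[1:] on S[:-1] via least squares
X = np.column_stack([np.ones(n - 1), S[:-1]])
y = S[1:]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
a_hat, b_hat = beta

# Maximized conditional Gaussian log-likelihood at the OLS/MLE estimates
resid = y - X @ beta
sigma2_hat = resid @ resid / len(resid)  # MLE of the error variance
loglik = -0.5 * len(resid) * (np.log(2 * np.pi * sigma2_hat) + 1)

print(a_hat, b_hat, loglik)
```

Note that for continuous data the log-likelihood is typically a large negative number whose absolute size depends on the sample; it is useful for comparing models on the same data, not as a standalone score.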
 
Yes, you can check the log-likelihood value: the higher, the better. (Its absolute value isn't meaningful on its own, so compare it across models fitted to the same data rather than against zero.) If you are choosing between two models, you can use a likelihood ratio test as a benchmark for determining which one is better.
Please note that, in fact, MLE is a probabilistic method, whereas OLS regression by itself is not (it's just curve fitting). I remember I used to get slammed by my probability/statistics professors each time I mentioned OLS to them :).
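To make the likelihood-ratio suggestion concrete, here is a hedged sketch (Python with NumPy/SciPy; the simulated series and its coefficients are illustrative assumptions) comparing the full model S(i+1) = a + b*S(i) + error against the restricted model with b = 0. Twice the gain in log-likelihood is asymptotically chi-squared with one degree of freedom under the restriction:

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(1)

def gaussian_loglik(y, X):
    """Maximized conditional Gaussian log-likelihood of y ~ X @ beta."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / len(y)
    return -0.5 * len(y) * (np.log(2 * np.pi * s2) + 1)

# Illustrative AR(1)-style data
n = 400
S = np.empty(n)
S[0] = 0.0
for i in range(n - 1):
    S[i + 1] = 0.3 + 0.6 * S[i] + rng.standard_normal()

y, lag = S[1:], S[:-1]
ll_full = gaussian_loglik(y, np.column_stack([np.ones(n - 1), lag]))
ll_restricted = gaussian_loglik(y, np.ones((n - 1, 1)))  # drop the lag (b = 0)

lr_stat = 2 * (ll_full - ll_restricted)  # ~ chi2(1) under H0: b = 0
p_value = chi2.sf(lr_stat, df=1)
print(lr_stat, p_value)
```

A small p-value says the lag term genuinely improves the fit; this is the likelihood-ratio analogue of looking at the t-stat on b in OLS.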
 

Spot on. The log-likelihood value is a very good indicator, and so is the likelihood ratio. Alternatively, you could use Wald coefficient-restriction tests or confidence ellipses.

MLE is probably the best method to estimate parameters, and I strongly recommend it for any statistical analysis you do, regardless of which model you are trying to calibrate.
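A sketch of the Wald coefficient-restriction test mentioned above (Python with NumPy/SciPy; the data, restriction matrix R, and vector r are illustrative assumptions). The Wald statistic measures how far the unrestricted estimates sit from the restriction, scaled by their estimated covariance:

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(2)

# Illustrative linear-regression data
n = 300
x = rng.standard_normal(n)
y = 1.0 + 0.5 * x + rng.standard_normal(n)

X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
s2 = resid @ resid / (n - 2)
cov = s2 * np.linalg.inv(X.T @ X)  # estimated covariance of the coefficients

# Wald test of the linear restriction R @ beta = r, here "slope = 0"
R = np.array([[0.0, 1.0]])
r = np.array([0.0])
diff = R @ beta - r
wald = diff @ np.linalg.solve(R @ cov @ R.T, diff)  # ~ chi2(1) under H0
p_value = chi2.sf(wald, df=1)
print(wald, p_value)
```

With a single restriction the Wald statistic is just the squared t-statistic, so for this simple case it agrees with the familiar OLS diagnostics; its advantage is that R and r can encode several joint restrictions at once.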
 

Still, OLS can be preferable to MLE: under the usual Gaussian assumptions the two give the same coefficient estimates, but if the assumed error distribution is wrong, the MLE can come at the cost of a greater variance. You might want to use the method with the smaller variance.
 


To estimate parameters when the sample is quite volatile, you can always use MLE with the heteroskedasticity-consistent covariance of Bollerslev-Wooldridge.
It "adapts" the standard errors of your coefficient estimates to the variance structure of the sample.
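The Bollerslev-Wooldridge covariance is the robust (quasi-)MLE version of this idea, typically used with GARCH-type models; the same "sandwich" construction for a plain regression is White's heteroskedasticity-consistent covariance, sketched here (Python with NumPy; the heteroskedastic data-generating process is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative data whose error variance grows with |x| (heteroskedasticity)
n = 500
x = rng.standard_normal(n)
y = 1.0 + 0.5 * x + np.abs(x) * rng.standard_normal(n)

X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
u = y - X @ beta

XtX_inv = np.linalg.inv(X.T @ X)

# Classical covariance, valid only under homoskedasticity
cov_ols = (u @ u / (n - 2)) * XtX_inv

# White's heteroskedasticity-consistent "sandwich" covariance
meat = X.T @ (X * (u ** 2)[:, None])
cov_hc = XtX_inv @ meat @ XtX_inv

se_ols = np.sqrt(np.diag(cov_ols))
se_hc = np.sqrt(np.diag(cov_hc))
print(se_ols, se_hc)
```

The coefficient estimates themselves are unchanged; only the standard errors (and hence t-stats and confidence ellipses) are corrected, which is exactly the "adaptation" described above.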

OLS provides the same performance as MLE in the homoskedastic case, but IMHO with financial series you are better off using a more probabilistic approach, since most of the relations you will be investigating are far from linear.
(Obviously, if you use a nonlinear least-squares method things change, because in that case you would get outcomes pretty similar to the MLE ones.)

www.hypervolatility.com
 