Week 7
Learning Outcomes
Monday
- Maximum Likelihood Approach
Wednesday
- Method of Moments
Reading
| Day | Reading |
|---|---|
| Monday’s Lecture | MMS: 7.2 |
| Wednesday’s Lecture | MMS: 7.2 |
Homework
HW 4 can be found here. It is due October 14 at 11:59 PM.
Important Concepts
Data
Let \(X_1,\ldots,X_n\overset{iid}{\sim}F(\boldsymbol \theta)\) where \(F(\cdot)\) is a known distribution function and \(\boldsymbol\theta\) is a vector of parameters. Let \(\boldsymbol X = (X_1,\ldots, X_n)^\mathrm{T}\) be the sample collected.
Maximum Likelihood Estimator
Likelihood Function
Using the joint pdf or pmf of the sample \(\boldsymbol X\), the likelihood function is a function of \(\boldsymbol \theta\), given the observed data \(\boldsymbol X =\boldsymbol x\), defined as
\[ L(\boldsymbol \theta|\boldsymbol x)=f(\boldsymbol x|\boldsymbol \theta) \]
If the data are iid, then
\[ f(\boldsymbol x|\boldsymbol \theta) = \prod^n_{i=1}f(x_i|\boldsymbol\theta) \]
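As a minimal sketch of this product form (not from the text; the exponential density and the function names `exp_pdf` and `likelihood` are illustrative choices), the likelihood of an iid sample is the product of the individual densities:

```python
import math

def exp_pdf(x, lam):
    # Exponential(lam) density: f(x | lam) = lam * exp(-lam * x), x > 0
    return lam * math.exp(-lam * x)

def likelihood(x, lam):
    # iid sample: L(lam | x) = prod_i f(x_i | lam)
    L = 1.0
    for xi in x:
        L *= exp_pdf(xi, lam)
    return L

# Example: L(lam = 1 | x = (1, 2)) = e^{-1} * e^{-2} = e^{-3}
print(likelihood([1.0, 2.0], 1.0))
```

The same pattern applies to any pmf or pdf; only `exp_pdf` would change.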
Estimator
The maximum likelihood estimator \(\hat{\boldsymbol \theta}\) is the value of \(\boldsymbol \theta\) that maximizes \(L(\boldsymbol\theta|\boldsymbol x)\).
Log-Likelihood Approach
Because \(\ln(\cdot)\) is a strictly increasing function, the value of \(\boldsymbol \theta\) that maximizes \(\ln\{L(\boldsymbol \theta|\boldsymbol x)\}\) also maximizes \(L(\boldsymbol \theta|\boldsymbol x)\). The log-likelihood is usually easier to work with because the product over the sample becomes a sum.
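To illustrate the log-likelihood approach (a sketch, not from the text; the exponential model and the function names are assumptions for the example), for an iid Exponential(\(\lambda\)) sample the log-likelihood is \(n\ln\lambda - \lambda\sum x_i\), and the closed-form maximizer is \(\hat\lambda = 1/\bar x\):

```python
import math

def log_likelihood(lam, x):
    # Exponential log-likelihood: sum_i ln f(x_i | lam) = n ln(lam) - lam * sum(x)
    n = len(x)
    return n * math.log(lam) - lam * sum(x)

def mle_exponential(x):
    # Setting d/dlam [n ln(lam) - lam * sum(x)] = 0 gives lam_hat = n / sum(x) = 1 / x_bar
    return len(x) / sum(x)

x = [0.5, 1.2, 0.3, 2.1, 0.9]
lam_hat = mle_exponential(x)
print(lam_hat, log_likelihood(lam_hat, x))
```

A quick numerical check: the log-likelihood at `lam_hat` should be at least as large as at nearby candidate values.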
Method of Moments
Let the \(k\)th population moment be \(\mu_k = \mathrm{E}(X^k)\) and the corresponding \(k\)th sample moment be
\[ m_k = \frac{1}{n}\sum^n_{i=1}X_i^k. \]
For a model with \(t\) parameters, the method of moments estimates are the solutions of the system \(\mu_k = m_k\) for \(k=1,\ldots,t\).
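As a minimal sketch of matching moments (not from the text; the normal model and the names `sample_moment` and `mom_normal` are illustrative), for a Normal(\(\mu,\sigma^2\)) model with \(t = 2\) parameters we have \(\mu_1 = \mu\) and \(\mu_2 = \sigma^2 + \mu^2\), so solving \(\mu_1 = m_1\) and \(\mu_2 = m_2\) gives \(\hat\mu = m_1\) and \(\hat\sigma^2 = m_2 - m_1^2\):

```python
def sample_moment(x, k):
    # k-th sample moment: (1/n) * sum of x_i^k
    return sum(xi ** k for xi in x) / len(x)

def mom_normal(x):
    # Normal(mu, sigma^2): mu_1 = mu, mu_2 = sigma^2 + mu^2.
    # Solving mu_1 = m_1 and mu_2 = m_2 for (mu, sigma^2):
    m1 = sample_moment(x, 1)
    m2 = sample_moment(x, 2)
    return m1, m2 - m1 ** 2

x = [1.0, 3.0]
print(mom_normal(x))  # (sample mean, sample second moment minus mean squared)
```

With more parameters, the same idea extends by equating higher-order moments until there are as many equations as unknowns.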
Resources
You must log on to your CI Google account to access the video.
Press the “b” key to access hidden notes.
| Lecture | Slides | Videos |
|---|---|---|
| Monday | Slides | N/A |
| Wednesday | Slides | |
Extra Resources on Maximum Likelihood Estimators
In MMS Chapter 7:
Problems: 27.b; 33.a