The Likelihood Function - Introduction



Presentation Transcript


1. The Likelihood Function - Introduction
• Recall: a statistical model for some data is a set of distributions, one of which corresponds to the true unknown distribution that produced the data.
• The distribution fθ can be either a probability density function or a probability mass function.
• The joint probability density function or probability mass function of iid random variables X1, …, Xn is fθ(x1, …, xn) = fθ(x1)·fθ(x2)···fθ(xn).
week 4

2. The Likelihood Function
• Let x1, …, xn be sample observations taken on corresponding random variables X1, …, Xn whose distribution depends on a parameter θ. The likelihood function, defined on the parameter space Ω, is given by L(θ | x1, …, xn) = fθ(x1)·fθ(x2)···fθ(xn), for θ ∈ Ω.
• Note that for the likelihood function we fix the data, x1, …, xn, and vary the value of the parameter.
• The value L(θ | x1, …, xn) is called the likelihood of θ. It is the probability of observing the data values we observed given that θ is the true value of the parameter. It is not the probability of θ given that we observed x1, …, xn.
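As a concrete sketch of this definition (a minimal Python illustration; the density, function names, and sample values below are our own assumptions, using an Exponential density fθ(x) = θ·e^(−θx) for illustration):

```python
import math

# Assumed density for illustration: f_theta(x) = theta * exp(-theta * x)
# (an Exponential rate parameterization; not necessarily the slide's model).
def density(theta, x):
    return theta * math.exp(-theta * x)

def likelihood(theta, xs):
    # L(theta | x1, ..., xn) = product of f_theta(xi) over the fixed sample
    out = 1.0
    for x in xs:
        out *= density(theta, x)
    return out

xs = [0.5, 1.1, 0.3]  # hypothetical observations, held fixed
# Varying theta gives different likelihood values for the same data:
values = {theta: likelihood(theta, xs) for theta in (0.5, 1.0, 2.0)}
```

The point of the sketch is the asymmetry in the definition: `xs` stays fixed while `theta` ranges over the parameter space.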

3. Examples
• Suppose we toss a coin n = 10 times and observe 4 heads. With no knowledge whatsoever about the probability of getting a head on a single toss, the appropriate statistical model for the data is the Binomial(10, θ) model. The likelihood function is given by L(θ) = (10 choose 4) θ^4 (1 − θ)^6, for θ ∈ [0, 1].
• Suppose X1, …, Xn is a random sample from an Exponential(θ) distribution (using the rate parameterization fθ(x) = θ·e^(−θx)). The likelihood function is L(θ) = θ^n e^(−θ(x1 + … + xn)), for θ > 0.
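The coin-toss likelihood above can be evaluated directly; a minimal Python sketch (the function name is ours):

```python
from math import comb

# Likelihood for the Binomial(10, theta) model with 4 observed heads:
# L(theta) = C(10, 4) * theta^4 * (1 - theta)^6
def binom_likelihood(theta, n=10, k=4):
    return comb(n, k) * theta**k * (1 - theta)**(n - k)

# Different theta values attach different probabilities to the same data:
low, mid, high = (binom_likelihood(t) for t in (0.1, 0.4, 0.9))
```

Evaluating at a few candidate θ already suggests that values of θ near the observed proportion of heads make the data more probable.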

4. Maximum Likelihood Estimators
• In the likelihood function, different values of θ attach different probabilities to a particular observed sample.
• The likelihood function, L(θ | x1, …, xn), can be maximized over θ to give the parameter value that attaches the highest possible probability to a particular observed sample.
• We can maximize the likelihood function to find an estimator of θ.
• This estimator is a statistic – it is a function of the sample data. It is denoted by θ̂ = θ̂(x1, …, xn).
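A crude way to see the maximization at work is a grid search over θ for the coin-toss example (a sketch standing in for the calculus route; for Binomial(10, θ) with 4 heads the true MLE is k/n = 0.4):

```python
from math import comb

def binom_likelihood(theta, n=10, k=4):
    # L(theta) = C(n, k) * theta^k * (1 - theta)^(n - k)
    return comb(n, k) * theta**k * (1 - theta)**(n - k)

# Grid search over the interior of the parameter space (0, 1); the
# maximizer is the (approximate) MLE, a function of the data n, k.
grid = [i / 1000 for i in range(1, 1000)]
theta_hat = max(grid, key=binom_likelihood)
```

Note that `theta_hat` depends only on the observed data (n, k): it is a statistic.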

5. The Log-Likelihood Function
• l(θ) = ln(L(θ)) is the log-likelihood function.
• Both the likelihood function and the log-likelihood function attain their maximum at the same value of θ.
• It is often easier to maximize l(θ).
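A quick numerical check that L and l = ln L peak at the same θ, reusing the coin-toss likelihood (a sketch; grid search again stands in for calculus):

```python
from math import comb, log

def likelihood(theta, n=10, k=4):
    return comb(n, k) * theta**k * (1 - theta)**(n - k)

def log_likelihood(theta, n=10, k=4):
    # ln is strictly increasing, so it preserves the location of the maximum
    return log(likelihood(theta, n, k))

grid = [i / 1000 for i in range(1, 1000)]
argmax_L = max(grid, key=likelihood)
argmax_l = max(grid, key=log_likelihood)  # same maximizer as argmax_L
```

This is why maximizing l(θ) is legitimate: a strictly monotone transformation changes the height of the curve but not where its peak sits.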

6. Examples

7. Properties of MLE
• The MLE is invariant, i.e., the MLE of g(θ) is equal to the function g evaluated at the MLE: the MLE of g(θ) is g(θ̂).
• Proof:
• Examples:
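Invariance can be illustrated numerically (our example, not the slide's): reparameterize the coin-toss model by the odds ψ = g(θ) = θ/(1 − θ) and check that maximizing the induced likelihood over ψ lands at g(θ̂), under the same grid-search stand-in:

```python
from math import comb

def likelihood(theta, n=10, k=4):
    return comb(n, k) * theta**k * (1 - theta)**(n - k)

def g(theta):
    # the odds: a one-to-one function of theta on (0, 1)
    return theta / (1 - theta)

grid = [i / 1000 for i in range(1, 1000)]
theta_hat = max(grid, key=likelihood)  # MLE of theta, near 0.4

# Maximize the likelihood expressed in psi = g(theta); the inverse
# map is theta = psi / (1 + psi).
psi_grid = [g(t) for t in grid]
psi_hat = max(psi_grid, key=lambda p: likelihood(p / (1 + p)))
```

Because g is one-to-one, relabeling the parameter does not move the peak: `psi_hat` coincides with `g(theta_hat)`.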

8. Important Comment
• Some MLEs cannot be determined using calculus. This occurs whenever the support of the distribution is a function of the parameter θ.
• These are best solved by graphing the likelihood function.
• Example:
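A standard instance of this situation (our example, not necessarily the one on the slide) is the Uniform(0, θ) model: the likelihood is θ^(−n) for θ ≥ max(xi) and 0 otherwise, so it jumps at the sample maximum and calculus never applies; the MLE is θ̂ = max(xi). A sketch with made-up data:

```python
def uniform_likelihood(theta, xs):
    # Every observation must lie in the support [0, theta]; otherwise
    # the likelihood is zero. Above max(xs) it is theta^(-n), decreasing.
    if theta < max(xs):
        return 0.0
    return theta ** (-len(xs))

xs = [1.2, 0.7, 2.9, 1.5]                # hypothetical sample
grid = [i / 100 for i in range(1, 501)]  # candidate theta values
theta_hat = max(grid, key=lambda t: uniform_likelihood(t, xs))
# theta_hat sits at max(xs): the smallest theta with nonzero likelihood
```

Graphing (or here, scanning) the likelihood makes the answer obvious: it is zero up to max(xi), then strictly decreasing, so the peak is exactly at the sample maximum.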
