
Title Bout: MLE vs. MoM Opponents: R.A. Fisher and Karl Pearson



  1. Title Bout: MLE vs. MoM. Opponents: R.A. Fisher and Karl Pearson. By Lada Kyj and Galen Papkov, Rice University - STAT 600, September 27, 2004

  2. Outline • Short Biographies • Journal Rage • Criteria • Method of Moments • Maximum Likelihood • MLE vs. MoM • Issues

  3. Who’s who?

  4. R.A. Fisher (born February 17, 1890 in London, England) • Superstitious fact • Studied mathematics and astronomy at Cambridge (also interested in biology) • Rejected for military service in WWI because of poor eyesight • Introduced concepts such as randomization, likelihood, and ANOVA

  5. Karl Pearson (born March 27, 1857 in London, England) • Attended Cambridge • Had wide-ranging interests: mathematics, physics, metaphysics, law, etc. • Contributed to regression analysis and developed the correlation coefficient and the chi-square test • Characterized as using large samples to deduce correlations in the data, whereas Fisher used small samples to determine causes

  6. Journal Rage • Began in 1917, when Pearson claimed that Fisher had failed to distinguish likelihood from inverse probability in a paper Fisher wrote in 1915 • The feud continued for many years, fueled by perceived injustices on both sides • "It would be fair to say that both showed hatred towards the other."

  7. Criterion of Consistency • A statistic is consistent if, when calculated from the whole population, it equals the population parameter • PROBLEM! • Many different statistics for the same parameter can be consistent

  8. Criterion of Efficiency • A statistic is efficient if, when derived from a large sample, it tends to a normal distribution with minimum variance • Relates to estimation accuracy • PROBLEM! • The criterion is incomplete: different methods of calculation may agree for large samples yet differ for finite samples

  9. Criterion of Sufficiency • A statistic is sufficient when no other statistic calculated from the same sample provides any additional information as to the value of the parameter to be estimated • Relates to "information"

  10. Method of Moments • Developed by Pearson in 1894 • Method: equate sample moments to population moments, $\frac{1}{n}\sum_{i=1}^{n} X_i^k = E[X^k]$, and solve for the parameters • For $k = 1$: $E[X] = \bar{X}$ (the sample mean) • Satisfies consistency
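
As a quick illustration (not from the original slides), here is a minimal Python sketch of moment matching for a Gamma(shape, scale) sample; the distribution and parameter values are illustrative assumptions. For the gamma, $E[X] = as$ and $\mathrm{Var}[X] = as^2$, so the first two sample moments give closed-form estimates.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.gamma(shape=2.0, scale=3.0, size=10_000)  # simulated data (assumed)

m1 = x.mean()               # first sample moment
m2 = (x**2).mean()          # second sample moment
var = m2 - m1**2            # variance from the first two moments

scale_hat = var / m1        # Var/E[X] = a*s**2 / (a*s) = s
shape_hat = m1 / scale_hat  # E[X]/s = a
print(shape_hat, scale_hat) # close to the true values 2.0 and 3.0
```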

  11. Example • The Cauchy distribution portrays the limitations of MoM well: its moments do not exist • A few extreme observations dominate the value of the sample mean, which never converges • MoM cannot be used; turn to MLE
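
A short simulation makes the point concrete; the sample sizes and seed below are assumptions for illustration. Because the Cauchy mean does not exist, the running mean keeps jumping no matter how many observations accumulate.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_cauchy(1_000_000)
running_mean = np.cumsum(x) / np.arange(1, x.size + 1)

# The running mean never settles: each extreme draw yanks it to a new level.
for n in (10**2, 10**3, 10**4, 10**5, 10**6):
    print(n, running_mean[n - 1])
```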

  12. Maximum Likelihood Estimation • Developed by Fisher • Officially named maximum likelihood estimation in 1921 • "The likelihood of a parameter is proportional to the probability of the data." • Method: • Obtain the likelihood $L(\theta; x)$ • Take the first derivative of $\log L$ with respect to $\theta$, set it equal to zero, and solve for $\theta$
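
As a sketch of the recipe (not from the slides), the negative log-likelihood can be minimized numerically; here we assume iid exponential data, for which the closed-form answer $\hat{\lambda} = 1/\bar{x}$ provides a check.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)
x = rng.exponential(scale=2.0, size=5_000)  # true rate = 0.5 (assumed)

def neg_log_lik(lam):
    # -log L(lambda; x) for iid Exponential(rate=lambda) data
    return -(x.size * np.log(lam) - lam * x.sum())

res = minimize_scalar(neg_log_lik, bounds=(1e-6, 10.0), method="bounded")
print(res.x, 1 / x.mean())  # numerical MLE vs. closed form
```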

  13. Conditions for Determining a Maximum • First-order condition: $\frac{\partial \log L(\theta; x)}{\partial \theta} = 0$ • Second-order condition: the Hessian $\frac{\partial^2 \log L(\theta; x)}{\partial \theta \, \partial \theta^{\top}}$ is negative definite
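
For concreteness, a worked example not on the original slide, assuming iid Exponential($\lambda$) data, where both conditions can be checked by hand:

```latex
% Log-likelihood for x_1, ..., x_n iid Exponential with rate \lambda:
\ell(\lambda) = n \log \lambda - \lambda \sum_{i=1}^{n} x_i
% First-order condition:
\frac{\partial \ell}{\partial \lambda} = \frac{n}{\lambda} - \sum_{i=1}^{n} x_i = 0
\quad \Longrightarrow \quad \hat{\lambda} = \frac{1}{\bar{x}}
% Second-order condition (scalar parameter, so negative definite means negative):
\frac{\partial^2 \ell}{\partial \lambda^2} = -\frac{n}{\lambda^2} < 0 \quad \text{for all } \lambda > 0
```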

  14. Maximum Likelihood Estimation (cont.) • Criteria: • Consistency • Efficiency • Sufficiency – poor "proof"! • Asymptotically normal and invariant • Led to the development of the factorization theorem • Serves as an efficiency yardstick for other estimators, such as the method of moments

  15. MLE vs. MoM • MoM: • Easy to use for normal curves • The mean is the best statistic for locating the normal curve • MLE: • Evaluating and maximizing the likelihood function is often challenging • Writing down a complete statistical model of the joint distribution of the data can be difficult • More robust • Greater efficiency (compared in the simulation sketch below)
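
To illustrate the efficiency gap, here is an assumed simulation comparing the two estimators of the upper bound of a Uniform(0, theta) distribution; the choice of distribution, sample size, and seed are ours, not the slides'. MoM uses $2\bar{X}$ (since $E[X] = \theta/2$), while the MLE is the sample maximum.

```python
import numpy as np

rng = np.random.default_rng(3)
theta, n, reps = 5.0, 50, 10_000

x = rng.uniform(0.0, theta, size=(reps, n))
mom = 2 * x.mean(axis=1)  # method-of-moments estimator: 2 * sample mean
mle = x.max(axis=1)       # maximum likelihood estimator: sample maximum

print("MoM mean, variance:", mom.mean(), mom.var())
print("MLE mean, variance:", mle.mean(), mle.var())  # far smaller variance
```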

  16. Issues • Neither method is well suited to exploratory data analysis, since the underlying distribution must be known • MLE is believed to be sufficient, but an acceptable proof has not been derived

  17. References • http://www-groups.dcs.st-and.ac.uk/~history/Mathematicians/Fisher.html • http://www-groups.dcs.st-and.ac.uk/~history/Mathematicians/Pearson.html • Aldrich, John (1997). R. A. Fisher and the making of maximum likelihood 1912-1922. Statistical Science, Vol. 12, No. 3, pp. 162-176. • Fisher, R. A. (1922). On the mathematical foundations of theoretical statistics. Philosophical Transactions of the Royal Society of London, Series A, Vol. 222, pp. 309-368.
