
  1. SOME GENERAL PROBLEMS

  2. Problem • A certain lion has three possible states of activity each night; they are ‘very active’ (denoted by θ1), ‘moderately active’ (denoted by θ2), and ‘lethargic (lacking energy)’ (denoted by θ3). Also, each night this lion eats people; it eats i people with probability p(i|θ), θ ∈ Θ = {θ1, θ2, θ3}. Of course, the probability distribution of the number of people eaten depends on the lion’s activity state θ ∈ Θ. The numeric values are given in the following table.

  3. Problem • If we are told X = x0 people were eaten last night, how should we estimate the lion’s activity state (θ1, θ2, or θ3)?

  4. Solution • One reasonable method is to estimate θ as the value in Θ for which p(x0|θ) is largest; in other words, the θ ∈ Θ that gives the largest probability of observing what we did observe. This value is the maximum likelihood estimate (MLE) of θ based on X. (Taken from Dudewicz and Mishra, 1988, Modern Mathematical Statistics, Wiley.)
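
The table of p(i|θ) values on the original slide did not carry over, so the numbers below are hypothetical placeholders; the sketch only illustrates the rule "pick the θ ∈ Θ that maximizes p(x0|θ)".

```python
# Hypothetical p(i | theta) table -- the slide's actual numbers did not transcribe.
# Each row: an activity state; each column: probability of eating i = 0, 1, 2, 3, 4 people.
p_table = {
    "theta1 (very active)":       [0.00, 0.05, 0.05, 0.80, 0.10],
    "theta2 (moderately active)": [0.05, 0.05, 0.80, 0.10, 0.00],
    "theta3 (lethargic)":         [0.90, 0.08, 0.02, 0.00, 0.00],
}

def mle_state(x0):
    """Return the activity state theta maximizing p(x0 | theta)."""
    return max(p_table, key=lambda state: p_table[state][x0])

print(mle_state(2))  # with these made-up numbers: 'theta2 (moderately active)'
```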

  5. Problem • Consider the Laplace distribution centered at the origin with scale parameter β, which for all x has the p.d.f. f(x; β) = (1/(2β)) exp(−|x|/β). Find the MME and MLE of β.
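
A minimal numerical sketch of both estimators, assuming the p.d.f. above. The MLE follows from maximizing the log-likelihood −n log(2β) − Σ|xi|/β, giving β̂ = (1/n)Σ|xi|; since E(X) = 0 carries no information about β, the MME here is taken (as is conventional) from the second moment E(X²) = 2β².

```python
import numpy as np

rng = np.random.default_rng(0)
beta_true = 2.0
x = rng.laplace(loc=0.0, scale=beta_true, size=10_000)  # sample from Laplace(0, beta)

# MLE: maximize -n*log(2*beta) - sum|x_i|/beta  =>  beta_hat = mean(|x_i|)
beta_mle = np.mean(np.abs(x))

# MME: E(X) = 0, so match the second moment E(X^2) = 2*beta^2
# => beta_tilde = sqrt(mean(x_i^2) / 2)
beta_mme = np.sqrt(np.mean(x**2) / 2)

print(beta_mle, beta_mme)  # both should be close to beta_true = 2.0
```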

  6. Problem • Let X1,…,Xn be independent r.v.s each with the lognormal distribution ln N(μ, σ²), i.e., ln Xi ~ N(μ, σ²). Find the MMEs of μ and σ².
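
A sketch of the moment-matching computation, using the standard lognormal moments E(X) = exp(μ + σ²/2) and E(X²) = exp(2μ + 2σ²); equating these to the sample moments and solving gives the MMEs below.

```python
import numpy as np

rng = np.random.default_rng(1)
mu_true, sigma2_true = 1.0, 0.25
x = np.exp(rng.normal(mu_true, np.sqrt(sigma2_true), size=50_000))  # ln X ~ N(mu, sigma^2)

# Sample moments
m1 = np.mean(x)
m2 = np.mean(x**2)

# m2 / m1^2 = exp(sigma^2)  and  m1 = exp(mu + sigma^2/2)
sigma2_mme = np.log(m2 / m1**2)
mu_mme = np.log(m1) - sigma2_mme / 2

print(mu_mme, sigma2_mme)  # should be close to 1.0 and 0.25
```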

  7. STATISTICAL INFERENCE PART III: BETTER OR BEST ESTIMATORS, FISHER INFORMATION, CRAMER-RAO LOWER BOUND (CRLB)

  8. RECALL: EXPONENTIAL CLASS OF PDFS • If the pdf can be written in the following form, then the pdf is a member of the exponential class of pdfs. (Here, k is the number of parameters.)
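
The displayed form did not transcribe from the slide. As a reconstruction (in the Hogg & Craig style this deck follows; the slide's exact notation may differ), the regular k-parameter exponential class is usually written as

```latex
f(x;\boldsymbol{\theta})
  = \exp\!\left\{\sum_{j=1}^{k} p_j(\boldsymbol{\theta})\,K_j(x) + S(x) + q(\boldsymbol{\theta})\right\},
  \qquad x \in A,
```

where the support A does not depend on θ and the usual regularity conditions hold.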

  9. EXPONENTIAL CLASS and CSS • For a random sample from the regular exponential class, the statistic (Σ K1(Xi), …, Σ Kk(Xi)) is a css for θ.

  10. RAO-BLACKWELL THEOREM • Let X1, X2,…,Xn have joint pdf or pmf f(x1,x2,…,xn;θ) and let S=(S1,S2,…,Sk) be a vector of jss for θ. If T is an UE of τ(θ) and φ(S)=E(T|S), then • φ(S) is an UE of τ(θ). • φ(S) is a fn of S, so it is free of θ. • Var(φ(S)) ≤ Var(T) for all θ. • φ(S) is a better unbiased estimator of τ(θ).

  11. RAO-BLACKWELL THEOREM • Notes: • φ(S)=E(T|S) is at least as good as T. • For finding the best UE, it is enough to consider UEs that are functions of a ss, because all such estimators are at least as good as the rest of the UEs.

  12. Example • Hogg & Craig, Exercise 10.10 • X1,X2~Exp(θ) • Find joint p.d.f. of ss Y1=X1+X2 for θ and Y2=X2. • Show that Y2 is UE of θ with variance θ². • Find φ(y1)=E(Y2|Y1) and variance of φ(Y1).
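
A simulation sketch of this exercise, assuming Exp(θ) is parameterized by its mean θ (as in Hogg & Craig) and using the symmetry fact E(X2 | X1 + X2) = (X1 + X2)/2; it illustrates the variance reduction from θ² to θ²/2 promised by Rao-Blackwell.

```python
import numpy as np

rng = np.random.default_rng(2)
theta = 3.0
n_rep = 200_000

x1 = rng.exponential(scale=theta, size=n_rep)  # Exp(theta), parameterized by its mean
x2 = rng.exponential(scale=theta, size=n_rep)

y2 = x2                # unbiased for theta, Var = theta^2
phi = (x1 + x2) / 2    # phi(Y1) = E(Y2 | Y1) = Y1/2 by symmetry; still unbiased

print(y2.mean(), phi.mean())  # both approximately 3.0
print(y2.var(), phi.var())    # approximately 9.0 vs 4.5 (= theta^2 vs theta^2/2)
```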

  13. THE MINIMUM VARIANCE UNBIASED ESTIMATOR • Rao-Blackwell Theorem: If T is an unbiased estimator of θ, and S is a ss for θ, then φ(S)=E(T|S) is • an UE of θ, i.e., E[φ(S)]=E[E(T|S)]=θ, and • has a variance no larger than Var(T).

  14. LEHMANN-SCHEFFE THEOREM • Let Y be a css for θ. If there is a function of Y which is an UE of θ, then that function is the unique Uniformly Minimum Variance Unbiased Estimator (UMVUE) of θ. • Y is a css for θ. • T(Y) = fn(Y) and E[T(Y)] = θ. • ⇒ T(Y) is the UMVUE of θ. • So, it is the best unbiased estimator of θ.

  15. THE MINIMUM VARIANCE UNBIASED ESTIMATOR • Let Y be a css for θ. Since Y is complete, there can be only one function of Y which is an UE of θ. • Let U1(Y) and U2(Y) be two such functions of Y. Since both are UEs, E(U1(Y) − U2(Y)) = 0; by completeness, this implies W(Y) = U1(Y) − U2(Y) = 0 for all possible values of Y. Therefore, U1(Y) = U2(Y) for all Y.

  16. Example • Let X1,X2,…,Xn ~ Poi(μ). Find the UMVUE of μ. • Solution steps: • Show that S = ΣXi is a css for μ. • Find a statistic (such as S*) that is an UE of μ and a function of S. • Then S* is the UMVUE of μ by the Lehmann-Scheffe Thm.
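
A compact version of these steps (standard exponential-class algebra, not from the slide):

```latex
f(x;\mu) = \frac{e^{-\mu}\mu^{x}}{x!}
         = \exp\{x\ln\mu - \mu - \ln x!\}
\;\Rightarrow\; S=\sum_{i=1}^{n}X_i \text{ is a css for } \mu;
\qquad
\bar{X}=S/n,\ E(\bar{X})=\mu
\;\Rightarrow\; \bar{X} \text{ is the UMVUE of } \mu .
```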

  17. Note • The estimator found by Rao-Blackwell Thm may not be unique. But, the estimator found by Lehmann-Scheffe Thm is unique.

  18. RECALL: EXPONENTIAL CLASS OF PDFS • If the pdf can be written in the following form, then the pdf is a member of the exponential class of pdfs. (Here, k is the number of parameters.)

  19. EXPONENTIAL CLASS and CSS • For a random sample from the regular exponential class, (Σ K1(Xi), …, Σ Kk(Xi)) is a css for θ. If Y, a function of this css, is an UE of τ(θ), then Y is the UMVUE of τ(θ).

  20. EXAMPLES • Let X1,X2,…,Xn ~ Bin(1,p), i.e., Ber(p). This family is a member of the exponential family of distributions. S = ΣXi is a CSS for p. The sample mean ΣXi/n is an UE of p and a function of the CSS; hence it is the UMVUE of p.
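
For reference, a sketch of the exponential-class factorization behind this example (standard algebra, not taken from the slide):

```latex
f(x;p) = p^{x}(1-p)^{1-x}
       = \exp\!\left\{ x \ln\tfrac{p}{1-p} + \ln(1-p) \right\},\quad x\in\{0,1\}
\;\Rightarrow\; K(x)=x,\ \ \sum_{i=1}^{n}X_i \text{ is a css for } p;
\qquad E(\bar X)=p \;\Rightarrow\; \bar X \text{ is the UMVUE of } p .
```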

  21. EXAMPLES • X ~ N(μ, σ²) where both μ and σ² are unknown. Find a css for μ and σ².
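
One way to answer this, sketched via the same exponential-class route (standard algebra, not from the slide): write the N(μ, σ²) density in two-parameter exponential-class form,

```latex
f(x;\mu,\sigma^{2})
= \exp\!\left\{ \frac{\mu}{\sigma^{2}}\,x \;-\; \frac{1}{2\sigma^{2}}\,x^{2}
\;-\;\frac{\mu^{2}}{2\sigma^{2}} \;-\;\tfrac12\ln(2\pi\sigma^{2})\right\}
\;\Rightarrow\;
\left(\sum_{i=1}^{n}X_i,\ \sum_{i=1}^{n}X_i^{2}\right)\ \text{is a css for }(\mu,\sigma^{2}).
```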

  22. FISHER INFORMATION AND INFORMATION CRITERIA • X ~ f(x;θ), θ in the parameter space, x ∈ A (A does not depend on θ). Definitions and notations:

  23. FISHER INFORMATION AND INFORMATION CRITERIA The Fisher Information in a random variable X: The Fisher Information in the random sample: Let’s prove the equalities above.
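
The formulas referred to here did not transcribe. The standard definitions (and the equalities the slide asks us to prove) are:

```latex
I(\theta) = E\!\left[\left(\frac{\partial \ln f(X;\theta)}{\partial\theta}\right)^{2}\right]
          = -\,E\!\left[\frac{\partial^{2} \ln f(X;\theta)}{\partial\theta^{2}}\right],
\qquad
I_n(\theta) = E\!\left[\left(\frac{\partial \ln L(\theta;X_1,\dots,X_n)}{\partial\theta}\right)^{2}\right]
            = n\,I(\theta).
```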

  24. FISHER INFORMATION AND INFORMATION CRITERIA

  25. FISHER INFORMATION AND INFORMATION CRITERIA

  26. FISHER INFORMATION AND INFORMATION CRITERIA The Fisher Information in a random variable X: The Fisher Information in the random sample: Proof of the last equality is available in Casella & Berger (1990), pp. 310-311.

  27. CRAMER-RAO LOWER BOUND (CRLB) • Let X1,X2,…,Xn be sample random variables. • The range of X does not depend on θ. • Y=U(X1,X2,…,Xn): a statistic; it does not contain θ. • Let E(Y)=m(θ). • Let's prove this!

  28. CRAMER-RAO LOWER BOUND (CRLB) • −1 ≤ Corr(Y,Z) ≤ 1 • 0 ≤ Corr(Y,Z)² ≤ 1 • Take Z = ∂/∂θ ln L(θ; x1,x2,…,xn), the score. • Then E(Z)=0 and V(Z)=In(θ) (from the previous slides).

  29. CRAMER-RAO LOWER BOUND (CRLB) • Cov(Y,Z) = E(YZ) − E(Y)E(Z) = E(YZ), since E(Z) = 0.

  30. CRAMER-RAO LOWER BOUND (CRLB) • E(YZ) = mʹ(θ), Cov(Y,Z) = mʹ(θ), V(Z) = In(θ) ⇒ the Cramer-Rao Inequality (Information Inequality):
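
The displayed inequality did not transcribe; reconstructed from the quantities listed on this slide, the chain of steps and the resulting Cramer-Rao (Information) Inequality are:

```latex
\operatorname{Corr}(Y,Z)^{2}\le 1
\;\Rightarrow\;
\operatorname{Cov}(Y,Z)^{2} \le V(Y)\,V(Z)
\;\Rightarrow\;
V(Y) \;\ge\; \frac{[m'(\theta)]^{2}}{I_n(\theta)} .
```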

  31. CRAMER-RAO LOWER BOUND (CRLB) • CRLB is the lower bound for the variance of an unbiased estimator of m(θ). • When V(Y)=CRLB, Y is the MVUE of m(θ). • For a r.s., remember that In(θ) = n I(θ), so the bound becomes [mʹ(θ)]²/(n I(θ)).

  32. ASYMPTOTIC DISTRIBUTION OF MLEs • Let θ̂ be the MLE of θ, where X1,X2,…,Xn is a random sample.
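
The limiting-distribution statement did not transcribe; under the usual regularity conditions it is the standard asymptotic normality result for MLEs:

```latex
\sqrt{n}\,\bigl(\hat{\theta}-\theta\bigr) \;\xrightarrow{d}\; N\!\left(0,\ \frac{1}{I(\theta)}\right),
\qquad\text{equivalently}\qquad
\hat{\theta} \;\overset{\text{approx.}}{\sim}\; N\!\left(\theta,\ \frac{1}{I_n(\theta)}\right)\ \text{for large } n .
```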

  33. EFFICIENT ESTIMATOR • T is an efficient estimator (EE) of θ if • T is an UE of θ, and • Var(T) = CRLB. • T is an efficient estimator (EE) of its expectation m(θ) if its variance reaches the CRLB. • An EE of m(θ) may not exist. • The EE of m(θ), if it exists, is unique. • The EE of m(θ) is the unique MVUE of m(θ).

  34. ASYMPTOTIC EFFICIENT ESTIMATOR • Y is an asymptotic EE of m(θ) if
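
The defining condition did not transcribe from the slide; one common formulation (an assumption here, since course conventions vary) is that Y is asymptotically unbiased for m(θ) and its variance approaches the CRLB:

```latex
\lim_{n\to\infty} E(Y) = m(\theta),
\qquad
\lim_{n\to\infty} \frac{\operatorname{Var}(Y)}{\,[m'(\theta)]^{2}/I_n(\theta)\,} \;=\; 1 .
```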

  35. EXAMPLES • A r.s. of size n from X ~ Poi(θ). • Find the CRLB for any UE of θ. • Find the UMVUE of θ. • Find an EE for θ. • Find the CRLB for any UE of exp{−2θ}. • Assume n=1, and show that the unbiased estimator of exp{−2θ} based on X is its UMVUE. Is this a reasonable estimator?
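
A worked sketch of the standard answers; the statistic missing from the last part is presumably (−1)^X, the classic example of a formally optimal but unreasonable estimator.

```latex
I(\theta)=\tfrac{1}{\theta},\quad I_n(\theta)=\tfrac{n}{\theta}
\;\Rightarrow\;
\mathrm{CRLB}_{\theta}=\frac{\theta}{n}=\operatorname{Var}(\bar X)
\;\Rightarrow\; \bar X \text{ is the UMVUE and an EE of } \theta .
\qquad
g(\theta)=e^{-2\theta}:\quad
\mathrm{CRLB}_{g}=\frac{[g'(\theta)]^{2}}{I_n(\theta)}=\frac{4\theta e^{-4\theta}}{n};
\qquad
n=1:\ E\!\left[(-1)^{X}\right]=e^{-\theta}\sum_{x=0}^{\infty}\frac{(-\theta)^{x}}{x!}=e^{-2\theta}.
```

So T(X) = (−1)^X is unbiased for exp{−2θ} and a function of the css X, hence the UMVUE; yet it only takes the values ±1 while estimating a quantity in (0, 1), which is why it is not a reasonable estimator.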

  36. EXAMPLE • A r.s. of size n from X ~ Exp(θ). Find the UMVUE of θ, if it exists.

  37. Summary • We covered 3 methods for finding good estimators (possibly UMVUE): • Rao-Blackwell Theorem (Use a ss T, an UE U, and create a new statistic by E(U|T)) • Lehmann-Scheffe Theorem (Use a css T which is also UE) • Cramer-Rao Lower Bound (Find an UE with variance=CRLB)

  38. Problems • Let X1,…,Xn be a random sample from the gamma distribution, Xi ~ Gamma(2,θ). The p.d.f. of X1 is given by f(x; θ) = (1/θ²) x exp(−x/θ), x > 0. a) Find a complete and sufficient statistic for θ. b) Find a minimal sufficient statistic for θ. c) Find the CRLB for the variance of an unbiased estimator of θ. d) Find a UMVUE of θ.
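
A hedged solution sketch (standard computations for Gamma(2, θ), not from the slides):

```latex
f(x;\theta)=\frac{1}{\theta^{2}}\,x\,e^{-x/\theta}
           =\exp\!\Big\{-\tfrac{1}{\theta}\,x+\ln x-2\ln\theta\Big\},\ x>0
\;\Rightarrow\; S=\sum_{i=1}^{n}X_i \text{ is a css (and a minimal ss) for }\theta ;
\qquad
I(\theta)=-E\!\left[\frac{\partial^{2}}{\partial\theta^{2}}\ln f(X;\theta)\right]=\frac{2}{\theta^{2}}
\;\Rightarrow\; \mathrm{CRLB}=\frac{\theta^{2}}{2n};
\qquad
E\!\left[\frac{S}{2n}\right]=\theta,\ \ \operatorname{Var}\!\left(\frac{S}{2n}\right)=\frac{\theta^{2}}{2n}
\;\Rightarrow\; \frac{S}{2n}\text{ is the UMVUE of }\theta .
```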

  39. Problems • Suppose X1,…,Xn are independent with common density f(x; θ), θ > 0. a) Find a complete sufficient statistic. b) Find the CRLB for the variance of unbiased estimators of 1/θ. c) Find the UMVUE of 1/θ if there is one.
