Probability Limit and Convergence in Probability






Presentation Transcript


  1. Probability Limit and Convergence in Probability

  2. Convergence Concepts • This section treats the somewhat fanciful idea of allowing the sample size to approach infinity and investigates the behavior of certain sample quantities as this happens. • We are mainly concerned with three types of convergence, and we treat them in varying amounts of detail.

  3. Sequences of Random Variables (X1, X2, …, Xn) • Interested in the behavior of functions of random variables such as means, variances, and proportions • For large samples, exact distributions can be difficult/impossible to obtain • Limit theorems can be used to obtain properties of estimators as the sample size tends to infinity • Convergence in Probability – Limit of an estimator • Convergence in Distribution – Limit of a CDF • Central Limit Theorem – Large-sample distribution of the sample mean of a random sample

  4. The theoretical mean of a fair die is 1·(1/6) + 2·(1/6) + … + 6·(1/6) = 21/6 = 3.5. The average of 1000 simulated rolls, (5 + 2 + … + 3 + … + 6)/1000, comes out close to 21/6: the Law of Large Numbers.
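To make the dice illustration above concrete, here is a minimal Python sketch (not part of the original slides) that compares the sample mean of simulated rolls to the theoretical mean of 3.5 as the number of rolls grows:

```python
import random

random.seed(0)
theoretical_mean = sum(face / 6 for face in range(1, 7))  # 21/6 = 3.5

for n in (100, 1_000, 10_000, 100_000):
    rolls = [random.randint(1, 6) for _ in range(n)]
    sample_mean = sum(rolls) / n
    print(f"n={n:>7}  sample mean={sample_mean:.4f}  "
          f"|error|={abs(sample_mean - theoretical_mean):.4f}")
```

The absolute error shrinks as n grows, which is exactly the behavior formalized in the following slides.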

  5. Convergence in Probability

  6. Convergence in Probability • The sequence of random variables X1,…,Xn is said to converge in probability to the constant c if, for every ε > 0, lim n→∞ P(|Xn − c| ≥ ε) = 0 • Weak Law of Large Numbers (WLLN): Let X1,…,Xn be iid random variables with E(Xi) = μ and V(Xi) = σ² < ∞. Then the sample mean X̄n converges in probability to μ: for every ε > 0, lim n→∞ P(|X̄n − μ| ≥ ε) = 0

  7. Weak Law of Large Numbers (WLLN)

  8. Proof of WLLN
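The slide's derivation is not included in the transcript; a standard proof via Chebyshev's inequality, consistent with the WLLN assumptions stated on slide 6, can be sketched as follows:

```latex
% Standard Chebyshev argument (a reconstruction; the slide's own algebra is not in the transcript).
\[
E(\bar{X}_n) = \mu, \qquad V(\bar{X}_n) = \frac{\sigma^2}{n},
\]
\[
P\bigl(|\bar{X}_n - \mu| \ge \varepsilon\bigr)
  \le \frac{V(\bar{X}_n)}{\varepsilon^2}
  = \frac{\sigma^2}{n\varepsilon^2}
  \xrightarrow[n\to\infty]{} 0
  \quad\text{for every } \varepsilon > 0 ,
\]
% hence $\bar{X}_n \xrightarrow{p} \mu$.
```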

  9. Other Cases/Rules • Binomial sample proportions • Useful generalizations (standard statements sketched below)
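The formulas for this slide are not in the transcript; the standard results it most likely refers to, stated here as a hedged reconstruction, are:

```latex
% Binomial sample proportion (WLLN applied to Bernoulli indicators):
\[
\hat{p}_n = \frac{X_n}{n}, \quad X_n \sim \mathrm{Binomial}(n, p)
\quad\Longrightarrow\quad
\hat{p}_n \xrightarrow{p} p .
\]
% Useful generalizations: if X_n ->p a and Y_n ->p b, then
\[
X_n \pm Y_n \xrightarrow{p} a \pm b, \qquad
X_n Y_n \xrightarrow{p} ab, \qquad
X_n / Y_n \xrightarrow{p} a/b \ (b \neq 0), \qquad
g(X_n) \xrightarrow{p} g(a) \ \text{for continuous } g .
\]
```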

  10. Convergence in Distribution

  11. Convergence in Distribution • Let Yn be a random variable with CDF Fn(y). • Let Y be a random variable with CDF F(y). • If lim n→∞ Fn(y) = F(y) at every point y where F(y) is continuous, then we say that Yn converges in distribution to Y. • F(y) is called the limiting distribution function of Yn. • If Mn(t) = E(e^(tYn)) converges to M(t) = E(e^(tY)) for all t in a neighborhood of 0, then Yn converges in distribution to Y.

  12. Limiting Distribution

  13. Example – Binomial → Poisson • Xn ~ Binomial(n, p). Let λ = np, so p = λ/n. • Mn(t) = (pe^t + (1 − p))^n = (1 + p(e^t − 1))^n = (1 + λ(e^t − 1)/n)^n • Aside: lim n→∞ (1 + a/n)^n = e^a • ⇒ lim n→∞ Mn(t) = lim n→∞ (1 + λ(e^t − 1)/n)^n = exp(λ(e^t − 1)) • exp(λ(e^t − 1)) ≡ MGF of Poisson(λ) • ⇒ Xn converges in distribution to Poisson(λ = np)
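A quick numerical check of this limit (an added sketch, not from the slides): for a fixed λ and large n with p = λ/n, the Binomial and Poisson probability mass functions nearly coincide. The values λ = 2 and n = 1000 below are arbitrary choices.

```python
from math import comb, exp, factorial

lam, n = 2.0, 1000          # lambda fixed, n large, p = lambda/n small
p = lam / n

for k in range(6):
    binom_pmf = comb(n, k) * p**k * (1 - p)**(n - k)
    pois_pmf = exp(-lam) * lam**k / factorial(k)
    print(f"k={k}  Binomial={binom_pmf:.6f}  Poisson={pois_pmf:.6f}")
```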

  14. Example – Scaled Poisson → N(0,1)
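The derivation for this example is not in the transcript; the standard result the title refers to, sketched via the MGF, is that if X_λ ~ Poisson(λ), then (X_λ − λ)/√λ converges in distribution to N(0,1) as λ → ∞:

```latex
% Z_lambda = (X_lambda - lambda)/sqrt(lambda), with M_X(t) = exp(lambda(e^t - 1)):
\[
M_{Z_\lambda}(t)
  = e^{-t\sqrt{\lambda}} \exp\!\bigl(\lambda(e^{t/\sqrt{\lambda}} - 1)\bigr)
  = \exp\!\Bigl(\lambda\bigl(e^{t/\sqrt{\lambda}} - 1 - t/\sqrt{\lambda}\bigr)\Bigr)
  \xrightarrow[\lambda\to\infty]{} e^{t^2/2},
\]
% the MGF of N(0,1), so $Z_\lambda \xrightarrow{d} N(0,1)$.
```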

  15. Central Limit Theorem

  16. Central Limit Theorem • Let X1, X2, …, Xn be a sequence of independently and identically distributed random variables with finite mean μ and finite variance σ². Then Zn = √n(X̄n − μ)/σ converges in distribution to N(0,1). • Thus the limiting distribution of the standardized sample mean is the standard normal distribution, regardless of the distribution of the individual measurements.
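As a complement to the theorem statement (a sketch added here, not from the slides), the following Python snippet simulates standardized means of a clearly non-normal (exponential) population and compares their empirical CDF with the N(0,1) CDF; the exponential rate of 1 and the simulation sizes are arbitrary choices.

```python
import math
import random

random.seed(1)
n, reps = 50, 20_000            # sample size and number of replications
mu, sigma = 1.0, 1.0            # mean and sd of Exponential(rate=1)

# Standardized sample means of a skewed, non-normal population
z_values = []
for _ in range(reps):
    xbar = sum(random.expovariate(1.0) for _ in range(n)) / n
    z_values.append(math.sqrt(n) * (xbar - mu) / sigma)

# Compare the empirical CDF of Z_n at a few points with the N(0,1) CDF
for z in (-1.0, 0.0, 1.0):
    empirical = sum(v <= z for v in z_values) / reps
    normal_cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    print(f"z={z:+.1f}  empirical={empirical:.3f}  N(0,1)={normal_cdf:.3f}")
```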

  17. Proof of Central Limit Theorem (I) • Additional assumptions for this proof: • The moment-generating function of X, M_X(t), exists in a neighborhood of 0 (for all |t| < h, h > 0). • The third derivative of the MGF is bounded in a neighborhood of 0 (|M⁽³⁾(t)| ≤ B < ∞ for all |t| < h, h > 0). • Elements of the proof: • Work with Yi = (Xi − μ)/σ • Use Taylor's Theorem (Lagrange form of the remainder) • Calculus result: lim n→∞ [1 + (a_n/n)]^n = e^a if lim n→∞ a_n = a

  18. Proof of CLT (II)

  19. Proof of CLT (III)

  20. Proof of CLT (IV)

  21. Proof of CLT (V)
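The algebra for slides 18–21 is not included in the transcript; a standard reconstruction of the MGF argument, following the elements listed on slide 17, runs as follows:

```latex
% A reconstruction of the omitted algebra, not the slides themselves.
% With Y_i = (X_i - mu)/sigma, E(Y_i) = 0, V(Y_i) = 1, and
% Z_n = sqrt(n)(Xbar_n - mu)/sigma = (1/sqrt(n)) * sum_{i=1}^n Y_i:
\[
M_{Z_n}(t) = \Bigl[ M_Y\!\bigl(\tfrac{t}{\sqrt{n}}\bigr) \Bigr]^{n},
\qquad
M_Y\!\bigl(\tfrac{t}{\sqrt{n}}\bigr)
  = 1 + \frac{t^2}{2n} + \frac{M_Y^{(3)}(\xi_n)}{6}\,\frac{t^3}{n^{3/2}}
\quad \text{(Taylor, Lagrange remainder, } |\xi_n| < |t|/\sqrt{n}\text{)} .
\]
\[
M_{Z_n}(t) = \Bigl[ 1 + \frac{a_n}{n} \Bigr]^{n},
\qquad
a_n = \frac{t^2}{2} + \frac{M_Y^{(3)}(\xi_n)\, t^3}{6\sqrt{n}} \longrightarrow \frac{t^2}{2},
\qquad\text{so}\quad
M_{Z_n}(t) \longrightarrow e^{t^2/2},
\]
% the MGF of N(0,1); hence Z_n converges in distribution to N(0,1).
```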

  22. Asymptotic Distribution • Obtaining an asymptotic distribution from a limiting distribution: • Obtain the limiting distribution via a stabilizing transformation • Assume the limiting distribution applies reasonably well in finite samples • Invert the stabilizing transformation to obtain the asymptotic distribution • This yields the asymptotic normality of the estimator's distribution.
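A worked instance of this recipe, added here for concreteness and using the sample mean from the CLT above:

```latex
% Stabilizing transformation of the sample mean and its limiting distribution:
\[
Z_n = \frac{\sqrt{n}\,(\bar{X}_n - \mu)}{\sigma} \xrightarrow{d} N(0,1).
\]
% Treat the limit as a good finite-sample approximation and invert the transformation:
\[
\bar{X}_n \overset{a}{\sim} N\!\left(\mu,\ \frac{\sigma^2}{n}\right).
\]
```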

  23. Asymptotic Efficiency

  24. Asymptotic Efficiency • Comparison of asymptotic variances • How to compare consistent estimators? If both converge to constants, both variances go to zero. • Example: random sampling from the normal distribution: • The sample mean is asymptotically Normal(μ, σ²/n) • The sample median is asymptotically Normal(μ, (π/2)σ²/n) • The mean is asymptotically more efficient
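A small Monte Carlo check of this comparison (a sketch added here, not from the slides); the sample size and replication count are arbitrary choices:

```python
import math
import random
import statistics

random.seed(2)
n, reps = 101, 5_000            # odd n so the sample median is a single order statistic
means, medians = [], []

for _ in range(reps):
    sample = [random.gauss(0.0, 1.0) for _ in range(n)]
    means.append(statistics.fmean(sample))
    medians.append(statistics.median(sample))

var_mean = statistics.pvariance(means)
var_median = statistics.pvariance(medians)
print(f"var(mean)   ~ {var_mean:.5f}   (theory: sigma^2/n        = {1/n:.5f})")
print(f"var(median) ~ {var_median:.5f}   (theory: (pi/2) sigma^2/n = {math.pi/2/n:.5f})")
print(f"ratio       ~ {var_median/var_mean:.3f}  (theory: pi/2 = {math.pi/2:.3f})")
```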

  25. Properties of MLEs: Asymptotic Normality

  26. Properties of MLEs: Asymptotic Efficiency
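The statements for slides 25–26 are not in the transcript; under the usual regularity conditions, the standard results they refer to are (a hedged reconstruction):

```latex
% Asymptotic normality of the MLE, with per-observation Fisher information I_1(theta):
\[
\sqrt{n}\,\bigl(\hat{\theta}_n - \theta_0\bigr)
  \xrightarrow{d} N\!\bigl(0,\; I_1(\theta_0)^{-1}\bigr),
\qquad
I_1(\theta) = -E\!\left[\frac{\partial^2 \ln f(X;\theta)}{\partial\theta^2}\right].
\]
% Asymptotic efficiency: the asymptotic variance I_1(theta_0)^{-1}/n attains the
% Cramer-Rao lower bound, so no other consistent, asymptotically normal estimator
% has a smaller asymptotic variance.
```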

  27. Convergence in Mean Square

  28. Convergence in mean square
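The definition on this slide is not in the transcript; the standard one, together with its link to convergence in probability, is:

```latex
% Definition: X_n converges to X in mean square (in L^2) if
\[
\lim_{n\to\infty} E\bigl[(X_n - X)^2\bigr] = 0 .
\]
% By Markov's inequality applied to (X_n - X)^2,
\[
P\bigl(|X_n - X| \ge \varepsilon\bigr)
  \le \frac{E\bigl[(X_n - X)^2\bigr]}{\varepsilon^2}
  \xrightarrow[n\to\infty]{} 0 ,
\]
% so convergence in mean square implies convergence in probability.
```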

  29. Convergence in Probability vs. Almost Sure Convergence

  30. Almost Sure
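The definition for this slide is likewise not in the transcript; the standard statement is:

```latex
% Definition: X_n converges to X almost surely (with probability 1) if
\[
P\Bigl(\lim_{n\to\infty} X_n = X\Bigr) = 1 .
\]
% Almost sure convergence implies convergence in probability;
% the converse does not hold in general (see the example on slide 31).
```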

  31. Not Almost Sure
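The example on this slide is not in the transcript; a classic sequence that converges in probability but not almost surely (possibly the one intended) is the sliding-indicator construction:

```latex
% Sample space [0,1] with uniform probability. List the indicators of the intervals
% [j/k, (j+1)/k], for k = 1, 2, ... and j = 0, ..., k-1, as X_1, X_2, ...
% If X_n is the indicator of an interval of length 1/k_n, then k_n -> infinity and
\[
P\bigl(|X_n - 0| \ge \varepsilon\bigr) = \frac{1}{k_n} \longrightarrow 0
\qquad (0 < \varepsilon \le 1),
\quad\text{so } X_n \xrightarrow{p} 0 ,
\]
% yet every omega in [0,1] lies in infinitely many of the intervals, so
% X_n(omega) = 1 infinitely often, X_n(omega) does not converge, and
% X_n does not converge to 0 almost surely.
```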
