Law of Large Numbers

  • Toss a coin n times. Let Xi = 1 if the i-th toss is heads and Xi = 0 otherwise.

  • The Xi's are Bernoulli random variables with p = ½ and E(Xi) = ½.

  • The proportion of heads is X̄n = (X1 + X2 + … + Xn)/n.

  • Intuitively, X̄n approaches ½ as n → ∞.
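
  • A minimal simulation sketch (added here for illustration, not part of the original slides): it tosses a fair coin and prints the running proportion of heads, which drifts toward ½ as n grows.

    import numpy as np

    rng = np.random.default_rng(0)                 # fixed seed so the run is reproducible
    tosses = rng.integers(0, 2, size=100_000)      # Xi = 1 for heads, 0 for tails, p = 1/2
    running_mean = np.cumsum(tosses) / np.arange(1, tosses.size + 1)

    for n in (10, 100, 1_000, 10_000, 100_000):
        print(f"n = {n:>6}: proportion of heads = {running_mean[n - 1]:.4f}")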


Markov’s Inequality

  • If X is a non-negative random variable with E(X) < ∞ and a > 0, then

    P(X ≥ a) ≤ E(X)/a.
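
  • A small numerical sanity check (my addition, not from the slides): for X ~ Exponential(1) we have E(X) = 1 and P(X ≥ a) = e^(−a) exactly, which can be compared with the Markov bound E(X)/a.

    import math

    EX = 1.0                      # E(X) for an Exponential(1) random variable
    for a in (1.0, 2.0, 5.0, 10.0):
        exact = math.exp(-a)      # P(X >= a) = e^(-a) for Exponential(1)
        bound = EX / a            # Markov's Inequality: P(X >= a) <= E(X)/a
        print(f"a = {a:4.1f}   P(X >= a) = {exact:.5f}   Markov bound = {bound:.5f}")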


Chebyshev’s Inequality

  • For a random variable X with E(X) < ∞ and V(X) < ∞, and any a > 0,

    P(|X − E(X)| ≥ a) ≤ V(X)/a².

  • Proof: (X − E(X))² is a non-negative random variable, so by Markov's Inequality

    P(|X − E(X)| ≥ a) = P((X − E(X))² ≥ a²) ≤ E[(X − E(X))²]/a² = V(X)/a².
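
  • For comparison, an added sketch (not from the slides): for Z ~ N(0,1), E(Z) = 0 and V(Z) = 1, so Chebyshev gives P(|Z| ≥ a) ≤ 1/a², while the exact tail probability is erfc(a/√2).

    import math

    for a in (1.0, 2.0, 3.0):
        exact = math.erfc(a / math.sqrt(2))   # P(|Z| >= a) for a standard normal Z
        bound = 1.0 / a**2                    # Chebyshev: P(|Z - 0| >= a) <= V(Z)/a^2 = 1/a^2
        print(f"a = {a:.0f}   P(|Z| >= a) = {exact:.5f}   Chebyshev bound = {bound:.5f}")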


Back to the Law of Large Numbers

  • Interested in a sequence of random variables X1, X2, X3, … that are independent and identically distributed (i.i.d).

    Let X̄n = (X1 + X2 + … + Xn)/n.

    Suppose E(Xi) = μ and V(Xi) = σ². Then E(X̄n) = μ

    and V(X̄n) = V(X1 + … + Xn)/n² = σ²/n.

  • Intuitively, V(X̄n) → 0 as n → ∞, so X̄n should be close to μ for large n.
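
  • The claim V(X̄n) = σ²/n can be checked by simulation; here is a minimal sketch (my addition) using Exponential(1) draws, for which μ = σ² = 1.

    import numpy as np

    rng = np.random.default_rng(1)
    sigma2 = 1.0                                # V(Xi) for Exponential(1)
    for n in (10, 100, 1_000):
        # 10,000 replicates of the sample mean of n Exponential(1) draws
        means = rng.exponential(1.0, size=(10_000, n)).mean(axis=1)
        print(f"n = {n:>5}   simulated V(X-bar_n) = {means.var():.6f}   sigma^2/n = {sigma2 / n:.6f}")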

The Weak Law of Large Numbers

  • Formally, the Weak Law of Large Numbers (WLLN) states the following:

  • Suppose X1, X2, X3, … are i.i.d with E(Xi) = μ < ∞ and V(Xi) = σ² < ∞. Then for any positive number a,

    P(|X̄n − μ| ≥ a) → 0  as n → ∞.

    This is called convergence in probability.

    Proof: By Chebyshev's Inequality,

    P(|X̄n − μ| ≥ a) ≤ V(X̄n)/a² = σ²/(na²) → 0  as n → ∞.


Example

  • Flip a coin 10,000 times. Let Xi = 1 if the i-th flip is heads and Xi = 0 otherwise, and let X̄ be the proportion of heads.

  • E(Xi) = ½ and V(Xi) = ¼.

  • Take a = 0.01. Then by Chebyshev's Inequality,

    P(|X̄ − ½| ≥ 0.01) ≤ V(X̄)/(0.01)² = (¼/10,000)/0.0001 = 0.25.

  • Chebyshev's Inequality gives a very weak upper bound (compare it with the Normal approximation in the sketch below).

  • Chebyshev's Inequality works regardless of the distribution of the Xi's.
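
  • A short added sketch comparing the Chebyshev bound 0.25 with the Normal (CLT) approximation of P(|X̄ − ½| ≥ 0.01):

    import math

    n, p, a = 10_000, 0.5, 0.01
    var_xbar = p * (1 - p) / n                 # V(X-bar) = (1/4)/10,000 = 0.000025
    cheb = var_xbar / a**2                     # Chebyshev bound = 0.25
    # Normal approximation: X-bar is roughly N(1/2, 0.000025), and a = 0.01 is 2 standard deviations
    normal_approx = math.erfc(a / math.sqrt(2 * var_xbar))   # P(|X-bar - 1/2| >= a) ~ 2(1 - Phi(2))
    print(f"Chebyshev bound: {cheb:.3f}   Normal approximation: {normal_approx:.3f}")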


Strong Law of Large Numbers

  • Suppose X1, X2, X3, … are i.i.d with E(Xi) = μ < ∞. Then X̄n converges to μ as n → ∞ with probability 1. That is,

    P( X̄n → μ as n → ∞ ) = 1.

  • This is called convergence almost surely.


Continuity Theorem for MGFs

  • Let X be a random variable such that for some t0 > 0 we have mX(t) < ∞ for |t| ≤ t0. Further, if X1, X2, … is a sequence of random variables with mXn(t) < ∞ and mXn(t) → mX(t) for all |t| ≤ t0, then {Xn} converges in distribution to X.

  • This theorem can also be stated as follows:

    Let Fn be a sequence of cdfs with corresponding mgfs mn. Let F be a cdf with mgf m. If mn(t) → m(t) for all t in an open interval containing zero, then Fn(x) → F(x) at all continuity points of F.

  • Example: the Poisson distribution can be approximated by a Normal distribution for large λ.


Example to illustrate the Continuity Theorem

  • Let λ1, λ2, … be an increasing sequence with λn → ∞ as n → ∞, and let {Xn} be a sequence of Poisson random variables with the corresponding parameters. We know that E(Xn) = λn = V(Xn).

  • Let Zn = (Xn − λn)/√λn. Then we have that E(Zn) = 0 and V(Zn) = 1.

  • We can show that the mgf of Zn converges to the mgf of a Standard Normal random variable (a numerical check follows below).

  • We say that Zn converges in distribution to Z ~ N(0,1).
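
  • From the Poisson mgf mX(t) = exp(λ(e^t − 1)), the standardized variable Zn has mgf mZn(t) = exp(−t√λn + λn(e^(t/√λn) − 1)). The added sketch below compares this with the N(0,1) mgf e^(t²/2) for increasing λ.

    import math

    def mgf_Zn(t, lam):
        # mgf of Zn = (X - lam)/sqrt(lam) when X ~ Poisson(lam)
        return math.exp(-t * math.sqrt(lam) + lam * (math.exp(t / math.sqrt(lam)) - 1.0))

    t = 0.5
    target = math.exp(t**2 / 2)                # mgf of N(0,1) at t
    for lam in (1, 10, 100, 10_000):
        print(f"lambda = {lam:>6}   mgf of Zn at t = 0.5: {mgf_Zn(t, lam):.5f}   N(0,1) mgf: {target:.5f}")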


Example

  • Suppose X is a Poisson(900) random variable. Find P(X > 950). (A worked sketch follows below.)
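
  • A worked sketch (my addition): by the argument above, X is approximately N(900, 900), so P(X > 950) ≈ 1 − Φ(50/30) ≈ 0.048. The code also computes the exact Poisson tail for comparison (scipy is used only for that exact value).

    import math
    from scipy.stats import poisson

    mu = 900.0
    z = (950 - mu) / math.sqrt(mu)                     # (950 - 900)/30 = 5/3
    normal_approx = 0.5 * math.erfc(z / math.sqrt(2))  # 1 - Phi(5/3)
    exact = poisson.sf(950, mu)                        # exact P(X > 950) = P(X >= 951)
    print(f"Normal approximation: {normal_approx:.4f}   Exact Poisson tail: {exact:.4f}")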


Central Limit Theorem

  • The Central Limit Theorem is concerned with the limiting behaviour of sums of random variables.

  • If X1, X2, … is a sequence of i.i.d random variables with mean μ and variance σ², and Sn = X1 + X2 + … + Xn,

    then by the WLLN we have that Sn/n → μ in probability.

  • The CLT is concerned not just with the fact of convergence but with how Sn/n fluctuates around μ.

  • Note that E(Sn) = nμ and V(Sn) = nσ². The standardized version of Sn is

    Zn = (Sn − nμ)/(σ√n),

    and we have that E(Zn) = 0, V(Zn) = 1.


The Central Limit Theorem

  • Let X1, X2, … be a sequence of i.i.d random variables with E(Xi) = μ < ∞ and Var(Xi) = σ² < ∞. Suppose the common distribution function FX(x) and the common moment generating function mX(t) are defined in a neighborhood of 0. Let

    Sn = X1 + X2 + … + Xn.

    Then, for −∞ < x < ∞,

    P( (Sn − nμ)/(σ√n) ≤ x ) → Φ(x)  as n → ∞,

    where Φ(x) is the cdf of the standard normal distribution.

  • This is equivalent to saying that Zn = (Sn − nμ)/(σ√n) converges in distribution to Z ~ N(0,1).

  • Also, since X̄n = Sn/n, we have Zn = √n(X̄n − μ)/σ,

    i.e. √n(X̄n − μ)/σ converges in distribution to Z ~ N(0,1).
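
  • An added simulation sketch of the theorem: standardized means of Exponential(1) samples (μ = σ = 1) should be approximately N(0,1), so the empirical P(Zn ≤ x) should be close to Φ(x).

    import math
    import numpy as np

    rng = np.random.default_rng(2)
    n, reps, x = 50, 100_000, 1.0
    samples = rng.exponential(1.0, size=(reps, n))            # Xi ~ Exponential(1): mu = sigma = 1
    zn = math.sqrt(n) * (samples.mean(axis=1) - 1.0) / 1.0    # Zn = sqrt(n)(X-bar_n - mu)/sigma
    empirical = (zn <= x).mean()                              # empirical P(Zn <= x)
    phi = 0.5 * (1 + math.erf(x / math.sqrt(2)))              # Phi(x)
    print(f"empirical P(Zn <= {x}) = {empirical:.4f}   Phi({x}) = {phi:.4f}")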


Example

  • Suppose X1, X2, … are i.i.d random variables and each has the Poisson(3) distribution, so E(Xi) = V(Xi) = 3.

  • The CLT says that √n(X̄n − 3)/√3 converges in distribution to Z ~ N(0,1) as n → ∞ (a check follows below).
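
  • Since a sum of n i.i.d Poisson(3) variables is exactly Poisson(3n), the statement can be checked without simulation: P(√n(X̄n − 3)/√3 ≤ x) = P(Sn ≤ 3n + x√(3n)), which should approach Φ(x). An added sketch (scipy is assumed for the Poisson cdf):

    import math
    from scipy.stats import poisson

    x = 1.0
    phi = 0.5 * (1 + math.erf(x / math.sqrt(2)))          # Phi(1) ~ 0.8413
    for n in (10, 100, 1_000):
        mean = 3 * n                                      # Sn = X1 + ... + Xn ~ Poisson(3n)
        exact = poisson.cdf(math.floor(mean + x * math.sqrt(mean)), mean)
        print(f"n = {n:>5}   P(Zn <= 1) = {exact:.4f}   Phi(1) = {phi:.4f}")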


Examples

  • A very common application of the CLT is the Normal approximation to the Binomial distribution.

  • Suppose X1, X2, … are i.i.d random variables and each has the Bernoulli(p) distribution, so E(Xi) = p and V(Xi) = p(1 − p).

  • The CLT says that √n(X̄n − p)/√(p(1 − p)) converges in distribution to Z ~ N(0,1) as n → ∞.

  • Let Yn = X1 + … + Xn; then Yn has a Binomial(n, p) distribution.

    So for large n, Yn is approximately N(np, np(1 − p)).

  • Suppose we flip a biased coin 1,000 times and the probability of heads on any one toss is 0.6. Find the probability of getting at least 550 heads.

  • Suppose we toss a coin 100 times and observe 60 heads. Is the coin fair? (A worked sketch for both questions follows below.)
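
  • A worked sketch for both questions (my addition), using the approximation Yn ≈ N(np, np(1 − p)) with a continuity correction; the exact Binomial tails via scipy are shown for comparison.

    import math
    from scipy.stats import binom

    def normal_tail(k, n, p):
        # P(Y >= k) for Y ~ Binomial(n, p): Normal approximation with continuity correction
        mean, sd = n * p, math.sqrt(n * p * (1 - p))
        z = (k - 0.5 - mean) / sd
        return 0.5 * math.erfc(z / math.sqrt(2))          # 1 - Phi(z)

    # (1) Biased coin, p = 0.6, n = 1,000: P(at least 550 heads)
    print("P(Y >= 550 | n=1000, p=0.6):",
          f"approx {normal_tail(550, 1000, 0.6):.4f}, exact {binom.sf(549, 1000, 0.6):.4f}")

    # (2) Fair-coin check: if p = 0.5 and n = 100, how likely are 60 or more heads?
    print("P(Y >= 60  | n=100,  p=0.5):",
          f"approx {normal_tail(60, 100, 0.5):.4f}, exact {binom.sf(59, 100, 0.5):.4f}")
    # The second probability is small (roughly 2-3%), so 60 heads is evidence against a fair coin.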
