Order statistics and limiting distributions

ORDER STATISTICS AND LIMITING DISTRIBUTIONS


ORDER STATISTICS

  • Let X1, X2,…,Xn be a r.s. of size n from a distribution of continuous type having pdf f(x), a<x<b. Let X(1) be the smallest of Xi, X(2) be the second smallest of Xi,…, and X(n) be the largest of Xi.

  • X(i) is the i-th order statistic.


ORDER STATISTICS

  • It is often useful to consider the ordered random sample.

  • Example: Suppose a r.s. of five light bulbs is tested and the failure times are observed as (5, 11, 4, 100, 17). These will actually be observed in the order (4, 5, 11, 17, 100). Interest might be in the kth smallest ordered observation, e.g. to stop the experiment after the kth failure. We might also be interested in joint distributions of two or more order statistics, or in functions of them (e.g. range = max − min).
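As a small illustration (a sketch, not part of the slides), the ordering in the light-bulb example is just a sort, and the range is a simple function of the extreme order statistics:

```python
# The slide's light-bulb example: the order statistics of a sample
# are simply its sorted values.
times = [5, 11, 4, 100, 17]          # observed failure times
order_stats = sorted(times)          # X_(1), ..., X_(5)

minimum = order_stats[0]             # X_(1): first failure
maximum = order_stats[-1]            # X_(n): last failure
sample_range = maximum - minimum     # range = max - min
```

Stopping after the kth failure corresponds to looking only at `order_stats[:k]`.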


ORDER STATISTICS

  • If X1, X2,…,Xn is a r.s. of size n from a population with continuous pdf f(x), then the joint pdf of the order statistics X(1), X(2),…,X(n) is

    f(y1, y2,…, yn) = n! f(y1) f(y2) ⋯ f(yn) for a < y1 < y2 < … < yn < b (and 0 otherwise).
Order statistics are not independent.

The joint pdf of the ordered sample is not the same as the joint pdf of the unordered sample.

Future reference: for discrete distributions, we need to take ties into account (two X’s being equal). See Casella and Berger (1990), p. 231.


Example

  • Suppose that X1, X2, X3 is a r.s. from a population with pdf

    f(x)=2x for 0<x<1

    Find the joint pdf of order statistics and the marginal pdf of the smallest order statistic.
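For this example, the general minimum formula gives f_{X(1)}(x) = 3[1 − x²]² · 2x on (0,1), whose mean works out to 16/35. A seeded Monte Carlo sketch (my addition, not from the slides) can sanity-check that answer:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 3, 200_000

# Draw from f(x) = 2x on (0,1) by inverse cdf: F(x) = x^2, so X = sqrt(U).
x = np.sqrt(rng.random((reps, n)))

# Smallest order statistic of each sample of size 3.
x1 = x.min(axis=1)

# Theory: f_{X(1)}(x) = 3(1 - x^2)^2 * 2x on (0,1), whose mean is 16/35.
empirical_mean = x1.mean()
theoretical_mean = 16 / 35
```

With 200,000 replications the empirical mean should agree with 16/35 ≈ 0.457 to two decimal places.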


ORDER STATISTICS

  • The Maximum Order Statistic: X(n)

    F_{X(n)}(y) = P(X(n) ≤ y) = P(all Xi ≤ y) = [F(y)]^n, so f_{X(n)}(y) = n [F(y)]^(n−1) f(y).


ORDER STATISTICS

  • The Minimum Order Statistic: X(1)

    F_{X(1)}(y) = P(X(1) ≤ y) = 1 − P(all Xi > y) = 1 − [1 − F(y)]^n, so f_{X(1)}(y) = n [1 − F(y)]^(n−1) f(y).
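Both extreme-order-statistic cdfs follow from independence of the Xi. A minimal sketch (the helper names are mine):

```python
# Cdfs of the extreme order statistics, valid for any cdf F,
# by independence of the X_i.
def max_cdf(F, y, n):
    # P(X_(n) <= y) = P(all X_i <= y) = F(y)^n
    return F(y) ** n

def min_cdf(F, y, n):
    # P(X_(1) <= y) = 1 - P(all X_i > y) = 1 - (1 - F(y))^n
    return 1 - (1 - F(y)) ** n

# Check against Unif(0,1), where F(y) = y on (0,1):
F_unif = lambda y: y
p_max = max_cdf(F_unif, 0.5, 5)   # 0.5^5 = 0.03125
p_min = min_cdf(F_unif, 0.5, 5)   # 1 - 0.5^5 = 0.96875
```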


ORDER STATISTICS

  • k-th Order Statistic

    Picture the axis with k−1 observations below yk, one observation at yk, and n−k above it. The number of possible orderings is n!/{(k−1)! 1! (n−k)!}; each observation falls below yk with probability P(X < yk) = F(yk), above it with probability P(X > yk) = 1 − F(yk), and contributes density fX(yk) at yk. Hence

    f_{X(k)}(yk) = [n!/{(k−1)!(n−k)!}] [F(yk)]^(k−1) [1 − F(yk)]^(n−k) fX(yk), for a < yk < b.


Example

  • Same example but now using the previous formulas (without taking the integrals): Suppose that X1, X2, X3 is a r.s. from a population with pdf

    f(x)=2x for 0<x<1

    Find the marginal pdf of the smallest order statistic.


Example

  • X~Uniform(0,1). A r.s. of size n is taken. Find the p.d.f. of the kth order statistic.

  • Solution: Let Yk be the kth order statistic. With F(y) = y and f(y) = 1 on (0,1), the general formula gives

    f_{Yk}(y) = [n!/{(k−1)!(n−k)!}] y^(k−1) (1 − y)^(n−k), 0 < y < 1,

    i.e. Yk ~ Beta(k, n − k + 1).
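The kth order statistic of a Unif(0,1) sample has the Beta(k, n − k + 1) distribution, whose mean is k/(n + 1). A seeded simulation sketch (my addition) checks that mean:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, reps = 10, 3, 200_000

# kth order statistic of n iid Unif(0,1) draws, repeated many times.
yk = np.sort(rng.random((reps, n)), axis=1)[:, k - 1]

# Y_k ~ Beta(k, n-k+1), so E[Y_k] = k/(n+1).
empirical_mean = yk.mean()
theoretical_mean = k / (n + 1)
```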


ORDER STATISTICS

  • Joint p.d.f. of the k-th and j-th Order Statistics (for k < j)

    Picture the axis with k−1 items below yk, 1 item at yk, j−k−1 items between yk and yj, 1 item at yj, and n−j items above yj. The number of possible orderings is n!/{(k−1)! 1! (j−k−1)! 1! (n−j)!}; using P(X < yk) = F(yk), P(yk < X < yj) = F(yj) − F(yk), P(X > yj) = 1 − F(yj), and the densities fX(yk) and fX(yj),

    f_{X(k),X(j)}(yk, yj) = [n!/{(k−1)!(j−k−1)!(n−j)!}] [F(yk)]^(k−1) [F(yj) − F(yk)]^(j−k−1) [1 − F(yj)]^(n−j) fX(yk) fX(yj), for a < yk < yj < b.



Motivation

  • The p.d.f. of a r.v. often depends on the sample size (i.e., n)

  • If X1, X2,…, Xn is a sequence of rvs and Yn=u(X1, X2,…, Xn) is a function of them, sometimes it is possible to find the exact distribution of Yn (what we have been doing lately)

  • However, sometimes it is only possible to obtain approximate results when n is large → limiting distributions.


CONVERGENCE IN DISTRIBUTION

  • Let X1, X2,…, Xn be a sequence of rvs and Yn=u(X1, X2,…, Xn) a function of them, with cdfs Fn(y) for each n=1, 2,…, such that

    lim (n→∞) Fn(y) = F(y)

at every point y where F(y) is continuous. Then, the sequence Yn is said to converge in distribution to Y (the rv with cdf F(y)).


CONVERGENCE IN DISTRIBUTION

  • Theorem: If lim (n→∞) Fn(y) = F(y) for every point y at which F(y) is continuous, then Yn is said to have a limiting distribution with cdf F(y). The term “asymptotic distribution” is sometimes used instead of “limiting distribution”.

  • The definition of convergence in distribution requires only that the limiting function agree with the cdf at its points of continuity.


Example

Let X1,…, Xn be a random sample from Unif(0,1). Find the limiting distribution of the max order statistic, if it exists.
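For the Unif(0,1) max, the cdf is F_n(y) = y^n on (0,1), which tends to 0 for every y < 1 and to 1 for y ≥ 1: the max converges in distribution to the constant 1. A quick numeric sketch:

```python
# Cdf of the max of n Unif(0,1) rvs: F_n(y) = y^n for 0 < y < 1.
y = 0.99
vals = [y ** n for n in (10, 100, 1000)]

# y^n -> 0 for every fixed y < 1 (while F_n(y) = 1 for y >= 1), so the
# limiting cdf is a step at 1: the max converges in distribution to the
# degenerate rv Y = 1.
```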


Example

Let {Xn} be a sequence of rvs with pmf

Find the limiting distribution of Xn , if it exists.


Example

Let Yn be the nth order statistic of a random sample X1, …, Xn from Unif(0,θ). Find the limiting distribution of Zn = n(θ − Yn), if it exists.
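Here P(Zn ≤ z) = 1 − (1 − z/(nθ))^n → 1 − e^(−z/θ), so Zn converges in distribution to an Exponential with mean θ. A seeded simulation sketch (my addition) checks the limiting mean:

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 2.0, 200, 20_000

# Y_n = max of n Unif(0, theta) draws; Z_n = n(theta - Y_n).
yn = theta * rng.random((reps, n)).max(axis=1)
zn = n * (theta - yn)

# Claimed limit: Exponential with mean theta (cdf 1 - exp(-z/theta)).
empirical_mean = zn.mean()
```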


Example

Let X1,…, Xn be a random sample from Exp(θ). Find the limiting distribution of the min order statistic, if it exists.
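Taking Exp(θ) to have mean θ (an assumption about the parameterization), P(X(1) > y) = e^(−ny/θ), so the min is Exp with mean θ/n and collapses to 0. A numeric sketch:

```python
import numpy as np

# Survival function of the min of n iid mean-theta exponentials:
# P(X_(1) > y) = exp(-n y / theta), i.e. X_(1) ~ Exp with mean theta/n.
theta, y = 2.0, 0.1
tail = [np.exp(-n * y / theta) for n in (1, 10, 1000)]

# The tail vanishes as n grows: X_(1) converges in distribution to 0.
```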


Example

  • Suppose that X1, X2, …, Xn are iid from Exp(1). Find the limiting distribution of the max order statistic, if it exists.
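The uncentered Exp(1) max diverges; a standard resolution (my addition, not stated on the slide) is to center it: P(X(n) − ln n ≤ z) = (1 − e^(−z)/n)^n → exp(−e^(−z)), the standard Gumbel cdf. A numeric sketch of that limit:

```python
import numpy as np

# P(X_(n) - ln n <= z) = (1 - exp(-z)/n)^n for n > exp(-z).
z = 0.5
finite_n = [(1 - np.exp(-z) / n) ** n for n in (10, 100, 10_000)]
gumbel = np.exp(-np.exp(-z))   # standard Gumbel cdf at z
```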


LIMITING MOMENT GENERATING FUNCTIONS

  • Let the rv Yn have an mgf Mn(t) that exists for all n. If

    lim (n→∞) Mn(t) = M(t) for all t in a neighborhood of 0,

then Yn has a limiting distribution, namely the one whose mgf is M(t).


Example

Let =np.

The mgf of Poisson()

The limiting distribution of Binomial rv is the Poisson distribution.
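The mgf convergence above can be checked numerically (a sketch; the value of t is arbitrary):

```python
import numpy as np

lam, t = 3.0, 0.7   # lambda = np held fixed; any t near 0 works

def binom_mgf(n):
    # Binomial(n, p) mgf with p = lam/n: (1 - p + p e^t)^n
    p = lam / n
    return (1 - p + p * np.exp(t)) ** n

poisson_mgf = np.exp(lam * (np.exp(t) - 1))   # Poisson(lam) mgf

vals = [binom_mgf(n) for n in (10, 100, 10_000)]
```

As n grows, the Binomial mgf values approach the Poisson mgf.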


CONVERGENCE IN PROBABILITY (STOCHASTIC CONVERGENCE)

  • A rv Yn converges in probability to a rv Y if

    lim (n→∞) P(|Yn − Y| ≥ ε) = 0 for every ε > 0.


CHEBYSHEV’S INEQUALITY

  • Let X be an rv with E(X)=μ and V(X)=σ². Then, for any k > 0,

    P(|X − μ| ≥ kσ) ≤ 1/k²   (equivalently, P(|X − μ| ≥ ε) ≤ σ²/ε²).

  • Chebyshev’s Inequality can be used to prove stochastic convergence in many cases.
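Chebyshev’s bound P(|X − μ| ≥ kσ) ≤ 1/k² holds for any distribution with finite variance. A seeded sketch (the choice of Exp(1) is mine) compares the bound with an empirical tail probability:

```python
import numpy as np

rng = np.random.default_rng(3)

# Exp(1) draws: mu = 1, sigma = 1 (a convenient test distribution).
x = rng.exponential(scale=1.0, size=200_000)
mu, sigma, k = 1.0, 1.0, 3.0

# Chebyshev: P(|X - mu| >= k sigma) <= 1/k^2, whatever the distribution.
empirical_tail = np.mean(np.abs(x - mu) >= k * sigma)
chebyshev_bound = 1 / k**2
```

For Exp(1) the true tail (about e^(−4) ≈ 0.018) is far below the bound 1/9, which shows how conservative Chebyshev can be.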


CONVERGENCE IN PROBABILITY (STOCHASTIC CONVERGENCE)

  • Chebyshev’s Inequality proves convergence in probability (Yn → μ) if the following three conditions are satisfied:

    1. E(Yn) = μn, where μn → μ as n → ∞;
    2. V(Yn) = σn²;
    3. σn² → 0 as n → ∞.


Example

Let X be an rv with E(X)=μ and V(X)=σ²<∞. For a r.s. of size n, X̄ is the sample mean. Does X̄ converge in probability to μ?


WEAK LAW OF LARGE NUMBERS (WLLN)

  • Let X1, X2,…,Xn be iid rvs with E(Xi)=μ and V(Xi)=σ²<∞. Define X̄ = (1/n) Σ Xi. Then, for every ε>0,

    lim (n→∞) P(|X̄ − μ| ≥ ε) = 0,

that is, X̄ converges in probability to μ.

The WLLN states that the sample mean is a good estimate of the population mean: for large n, the sample mean is close to the population mean with probability close to 1. But more to come on the properties of estimators.
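A seeded sketch (my addition) of the WLLN for Unif(0,1): the probability that X̄ lands within ε of μ = 1/2 climbs toward 1 as n grows.

```python
import numpy as np

rng = np.random.default_rng(4)
mu, eps = 0.5, 0.05   # Unif(0,1): mu = 1/2

def hit_rate(n, reps=2_000):
    # Fraction of sample means (each from n draws) within eps of mu.
    means = rng.random((reps, n)).mean(axis=1)
    return np.mean(np.abs(means - mu) < eps)

rates = [hit_rate(n) for n in (10, 100, 1000)]
# WLLN: P(|Xbar - mu| < eps) -> 1, so the rates climb toward 1.
```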


STRONG LAW OF LARGE NUMBERS

  • Let X1, X2,…,Xn be iid rvs with E(Xi)=μ and V(Xi)=σ²<∞. Define X̄ = (1/n) Σ Xi. Then

    P( lim (n→∞) X̄ = μ ) = 1,

that is, X̄ converges almost surely to μ.


Relation between convergences

Almost sure convergence is the strongest: convergence almost surely ⇒ convergence in probability ⇒ convergence in distribution (the reverse implications are generally not true).


THE CENTRAL LIMIT THEOREM

  • Let X1, X2,…,Xn be a sequence of iid rvs with E(Xi)=μ and V(Xi)=σ²<∞. Define X̄ = (1/n) Σ Xi. Then

    √n (X̄ − μ)/σ → N(0,1) in distribution,

or

    (Σ Xi − nμ)/(σ√n) → N(0,1) in distribution.

Proof can be found in many books, e.g. Bain and Engelhardt (1992), page 239.
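A seeded sketch (my addition) of the CLT: standardized means of a skewed distribution look approximately N(0,1).

```python
import numpy as np

rng = np.random.default_rng(5)
n, reps = 50, 100_000

# Exp(1): mu = 1, sigma = 1 (skewed, so the CLT is doing real work here).
x = rng.exponential(size=(reps, n))

# Standardize each sample mean: sqrt(n) (Xbar - mu) / sigma.
z = np.sqrt(n) * (x.mean(axis=1) - 1.0) / 1.0

# If the CLT holds, z is approximately N(0,1).
z_mean, z_std = z.mean(), z.std()
```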


Example

  • Let X1, X2,…,Xn be iid rvs from Unif(0,1) and

  • Find the approximate distribution of Yn.

  • Find an approximate value of the 90th percentile of Yn.

  • Warning: n = 20 is probably not a large enough sample size for the approximation to be accurate.


SLUTSKY’S THEOREM

  • If XnX in distribution and Yna, a constant, in probability, then

a) YnXnaX in distribution.

b) Xn+YnX+a in distribution.


SOME THEOREMS ON LIMITING DISTRIBUTIONS

  • If Xn c>0 in probability, then for any function g(x) continuous at c, g(Xn)g(c) in prob. e.g.

  • If Xnc in probability and Ynd in probability, then

  • aXn+bYn ac+bd in probability.

  • XnYn cd in probability

  • 1/Xn 1/c in probability for all c0.


Example

X~Gamma(α, 1). Show that

