
Point Estimation of Parameters and Sampling Distributions

Presentation Transcript


  1. Point Estimation of Parameters and Sampling Distributions Outline: • Sampling Distributions and the central limit theorem • Point estimation • Methods of point estimation • Moments • Maximum Likelihood

  2. Sampling Distributions and the central limit theorem • Random sample: X1, X2, ..., Xn are independent random variables, each having the same probability distribution. • Sampling distribution: the probability distribution of a statistic. • Ex. The probability distribution of the sample mean X̄ is called the sampling distribution of the mean.

  3. Sampling Distribution of the Sample Mean • Consider the sampling distribution of the sample mean X̄. • If each Xi is an independent normal random variable with mean µ and variance σ², then X̄ is normal with mean µ and variance σ²/n.

  4. Central limit theorem • If n ≥ 30 and we are sampling from a population with an unknown distribution, the sampling distribution of X̄ is approximately normal with mean µ and variance σ²/n.

  5. Central limit theorem

  6. Central limit theorem • Ex. Suppose that the random variable X has a continuous uniform distribution on [4, 6]. • Find the distribution of the sample mean of a random sample of size n = 40. • Method: 1. Calculate the mean and variance of X. 2. Apply the central limit theorem to X̄ (see the simulation sketch below).
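A minimal Python simulation sketch of this example, assuming the uniform interval [4, 6] indicated by the original figure; it compares the CLT approximation for X̄ with simulated sample means:

```python
import numpy as np

rng = np.random.default_rng(0)

a, b, n = 4.0, 6.0, 40            # uniform interval (assumed from the figure) and sample size
mu = (a + b) / 2                  # population mean: 5
var = (b - a) ** 2 / 12           # population variance: 1/3

# Draw many random samples of size n and record each sample mean.
sample_means = rng.uniform(a, b, size=(100_000, n)).mean(axis=1)

print("CLT approximation: mean =", mu, " variance =", var / n)        # 5, 1/120
print("Simulation:        mean =", round(float(sample_means.mean()), 4),
      " variance =", round(float(sample_means.var()), 5))
```

The simulated mean and variance should come out close to 5 and 1/120 ≈ 0.0083, matching the normal approximation X̄ ≈ N(5, 1/120).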

  7. Sampling Distribution of a Difference • Two independent populations, with means µ1, µ2 and variances σ1², σ2². • Suppose that both populations are normally distributed. • Then the sampling distribution of X̄1 − X̄2 is normal with mean µ1 − µ2 and variance σ1²/n1 + σ2²/n2.

  8. Sampling Distribution of a Difference • Definition: if X̄1 and X̄2 are the sample means of two independent random samples of sizes n1 and n2 taken from the two populations, then the sampling distribution of Z = (X̄1 − X̄2 − (µ1 − µ2)) / √(σ1²/n1 + σ2²/n2) is approximately standard normal.

  9. Sampling Distribution of a Difference • Ex. The effective life of a jet-turbine aircraft engine is a random variable with mean 5000 hr and standard deviation 40 hr. The distribution of effective life is fairly close to a normal distribution. The engine manufacturer introduces an improvement into the manufacturing process that increases the mean life to 5050 hr and decreases the standard deviation to 30 hr. • 16 components are sampled from the old process. • 25 components are sampled from the improved process. • What is the probability that the difference in the two sample means is at least 25 hr? (A worked computation follows.)
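A short Python sketch of the computation; the parameters are taken directly from the problem statement, and X̄new − X̄old is normal with mean 50 hr and variance 30²/25 + 40²/16 = 136:

```python
from math import sqrt
from statistics import NormalDist

mu_old, sd_old, n_old = 5000, 40, 16     # old process
mu_new, sd_new, n_new = 5050, 30, 25     # improved process

# Sampling distribution of the difference in sample means (new - old)
mean_diff = mu_new - mu_old                               # 50 hr
sd_diff = sqrt(sd_new**2 / n_new + sd_old**2 / n_old)     # sqrt(36 + 100)

# P(difference >= 25 hr)
p = 1 - NormalDist(mean_diff, sd_diff).cdf(25)
print(round(p, 4))   # approximately 0.984
```

The probability is about 0.984, so observing a difference of at least 25 hr between the two sample means is very likely.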

  10. Point estimation • Parameter estimation: calculation of a reasonable number that describes a characteristic of the population. • Ex. X is normally distributed with unknown mean µ. • The sample mean X̄ is a point estimator of the population mean µ. • After the sample has been selected, the computed value x̄ is the point estimate of µ.

  11. Point estimation • Unbiased estimators: the point estimator Θ̂ is an unbiased estimator of the parameter θ if E(Θ̂) = θ; otherwise, E(Θ̂) − θ is called the bias of Θ̂.

  12. Point estimation Ex. Suppose that X is a random variable with mean µ and variance σ². Let X1, X2, ..., Xn be a random sample of size n from the population. Show that the sample mean X̄ and the sample variance S² are unbiased estimators of µ and σ², respectively. • Both proofs are sketched below.
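A sketch of both proofs in LaTeX (standard arguments, written out here because the original proof slides did not survive the transcript):

```latex
E(\bar{X}) = E\!\left(\frac{1}{n}\sum_{i=1}^{n} X_i\right)
           = \frac{1}{n}\sum_{i=1}^{n} E(X_i)
           = \frac{1}{n}(n\mu) = \mu,
\qquad \text{so } \bar{X} \text{ is unbiased for } \mu.

E(S^2) = E\!\left(\frac{1}{n-1}\sum_{i=1}^{n}(X_i-\bar{X})^2\right)
       = \frac{1}{n-1}\,E\!\left(\sum_{i=1}^{n} X_i^2 - n\bar{X}^2\right)
       = \frac{1}{n-1}\left[n(\mu^2+\sigma^2) - n\!\left(\mu^2+\tfrac{\sigma^2}{n}\right)\right]
       = \sigma^2 .
```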

  13. Point estimation

  14. Point estimation • Sometimes there are several unbiased estimators of the same population parameter. • Ex. Suppose we take a random sample of size n from a normal population and obtain the data x1 = 12.8, x2 = 9.4, x3 = 8.7, x4 = 11.6, x5 = 13.1, x6 = 9.8, x7 = 14.1, x8 = 8.5, x9 = 12.1, x10 = 10.3. Statistics such as the sample mean and the sample median are all unbiased estimators of µ (both are computed in the sketch below).
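A small Python sketch computing two such estimators for the data above; choosing the sample mean and sample median as the illustrated unbiased estimators is an assumption (for a symmetric population such as the normal, both are unbiased for µ):

```python
import statistics

x = [12.8, 9.4, 8.7, 11.6, 13.1, 9.8, 14.1, 8.5, 12.1, 10.3]

# Two different unbiased estimators of the population mean mu
# (for a normal population the sample median is also unbiased for mu).
print("sample mean  :", statistics.mean(x))     # 11.04
print("sample median:", statistics.median(x))   # 10.95
```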

  15. Point estimation • Minimum Variance Unbiased Estimator (MVUE): among all unbiased estimators of a parameter, the one with the smallest variance.

  16. Point estimation • MVUE for µ: if X1, X2, ..., Xn is a random sample of size n from a normal distribution with mean µ and variance σ², the sample mean X̄ is the MVUE for µ.

  17. Methods of Point Estimation • Method of Moments • Method of Maximum Likelihood • Bayesian Estimation of Parameters

  18. Method of Moments • The general idea of the method of moments is to equate population moments to the corresponding sample moments. • The first population moment is E(X) = µ ........(1) • The first sample moment is (1/n)ΣXi = X̄ ........(2) • Equating (1) and (2), the sample mean X̄ is the moment estimator of the population mean µ.

  19. Method of Moments • Moment Estimators Ex. Suppose that X1, X2, ..., Xn is a random sample from an exponential distribution with parameter λ. Find the moment estimator of λ. There is one parameter to estimate, so we must equate the first population moment to the first sample moment: the first population moment is E(X) = 1/λ, and the first sample moment is X̄ (the derivation is completed below).
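Written out as a worked equation (consistent with equations (1) and (2) above):

```latex
\frac{1}{\lambda} = \frac{1}{n}\sum_{i=1}^{n} X_i = \bar{X}
\quad\Longrightarrow\quad
\hat{\lambda} = \frac{1}{\bar{X}} = \frac{n}{\sum_{i=1}^{n} X_i}.
```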

  20. Method of Moments Ex. Suppose that X1, X2, ..., Xn is a random sample from a normal distribution with parameters µ and σ². Find the moment estimators of µ and σ². For µ: k = 1; the first population moment is E(X) = µ ........(1); the first sample moment is X̄ ........(2); equating (1) and (2) gives the estimator X̄ for µ. For σ²: k = 2; the second population moment is E(X²) = µ² + σ² ......(3); the second sample moment is (1/n)ΣXi² ......(4); equating (3) and (4) gives the estimator of σ² shown below.
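Solving equations (1)–(4) gives the moment estimators (a standard result, written out here because the original equation images are missing):

```latex
\hat{\mu} = \bar{X},
\qquad
\hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n} X_i^2 - \bar{X}^2
              = \frac{1}{n}\sum_{i=1}^{n}\left(X_i - \bar{X}\right)^2 .
```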

  21. Method of Maximum Likelihood • Concept: the estimator is the value of the parameter that maximizes the likelihood function L(θ) = f(x1; θ)·f(x2; θ)···f(xn; θ), the joint distribution of the observed sample viewed as a function of θ.

  22. Method of Maximum Likelihood Ex. Let X be a Bernoulli random variable. The probability mass function is f(x; p) = pˣ(1 − p)¹⁻ˣ, x = 0, 1. Find the maximum likelihood estimator of p (derivation sketched below).
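A sketch of the derivation (the standard argument; the original worked solution did not survive):

```latex
L(p) = \prod_{i=1}^{n} p^{x_i}(1-p)^{1-x_i},
\qquad
\ln L(p) = \Big(\sum_{i=1}^{n} x_i\Big)\ln p + \Big(n-\sum_{i=1}^{n} x_i\Big)\ln(1-p),

\frac{d\,\ln L(p)}{dp} = \frac{\sum x_i}{p} - \frac{n-\sum x_i}{1-p} = 0
\quad\Longrightarrow\quad
\hat{p} = \frac{1}{n}\sum_{i=1}^{n} X_i = \bar{X}.
```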

  23. Method of Maximum Likelihood Ex. Let X be normally distributed with unknown µ and known σ². Find the maximum likelihood estimator of µ (derivation sketched below).
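A sketch of the derivation under the stated assumption of known σ²:

```latex
L(\mu) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2}}
         \exp\!\Big(-\frac{(x_i-\mu)^2}{2\sigma^2}\Big),
\qquad
\ln L(\mu) = -\frac{n}{2}\ln(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2,

\frac{d\,\ln L(\mu)}{d\mu} = \frac{1}{\sigma^2}\sum_{i=1}^{n}(x_i-\mu) = 0
\quad\Longrightarrow\quad
\hat{\mu} = \bar{X}.
```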

  24. Method of Maximum Likelihood Ex. Let X be exponentially distributed with parameter λ. Find the maximum likelihood estimator of λ (derivation sketched below).
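A sketch of the derivation:

```latex
L(\lambda) = \prod_{i=1}^{n} \lambda e^{-\lambda x_i}
           = \lambda^{n} e^{-\lambda \sum x_i},
\qquad
\ln L(\lambda) = n\ln\lambda - \lambda\sum_{i=1}^{n} x_i,

\frac{d\,\ln L(\lambda)}{d\lambda} = \frac{n}{\lambda} - \sum_{i=1}^{n} x_i = 0
\quad\Longrightarrow\quad
\hat{\lambda} = \frac{n}{\sum_{i=1}^{n} x_i} = \frac{1}{\bar{X}}.
```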

  25. Method of Maximum Likelihood Ex. Let X be normally distributed with unknown µ and unknown σ². Find the maximum likelihood estimators of µ and σ² (a numerical illustration follows).
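The closed-form answers are µ̂ = X̄ and σ̂² = (1/n)Σ(Xi − X̄)². A minimal Python sketch illustrating them on simulated data (the sample size and population parameters below are purely illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=10.0, scale=2.0, size=200)    # simulated sample (illustrative only)
n = len(x)

# Maximum likelihood estimates for a normal model
mu_hat = x.mean()                                # MLE of mu = sample mean
sigma2_hat = ((x - mu_hat) ** 2).sum() / n       # MLE of sigma^2 divides by n ...
s2 = x.var(ddof=1)                               # ... unlike the unbiased S^2 (divides by n-1)

print("mu_hat     :", round(float(mu_hat), 3))
print("sigma2_hat :", round(float(sigma2_hat), 3))
print("S^2        :", round(float(s2), 3))
```

Note that the MLE of σ² divides by n rather than n − 1, so it is slightly biased even though it is the likelihood-maximizing estimate.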

  26. Method of Maximum Likelihood • The method of maximum likelihood is often the estimation method that mathematical statisticians prefer, because it is usually easy to use and produces estimators with good statistical properties.
