
  1. CHAPTER 23 BOOTSTRAPPING: LEARNING FROM THE SAMPLE Damodar Gujarati Econometrics by Example, second edition

  2. NONPARAMETRIC STATISTICAL INFERENCE • Assumptions we make about models, such as the Classical Linear Regression Model assumptions, may not be tenable in practice. • An alternative method of statistical inference that relies on less restrictive assumptions is nonparametric statistical analysis, also known as distribution-free statistical analysis.

  3. BOOTSTRAPPING • One method of nonparametric statistical inference is bootstrap sampling (sampling from within a sample). • Treat a randomly drawn sample as if it were the population itself and draw repeated samples from this “population,” each equal in size to the original sample, with the proviso that every member drawn is put back before the next member is drawn (i.e., sampling with replacement), as in the sketch below.
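A minimal sketch of this resampling scheme, assuming NumPy is available; the data vector and the number of replications are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)
sample = np.array([4.2, 5.1, 3.8, 6.0, 5.5, 4.9, 5.2, 4.4, 6.1, 5.0])  # stand-in data

n_reps = 2000
boot_means = np.empty(n_reps)
for b in range(n_reps):
    # Draw n observations from the sample WITH replacement (same size as the sample)
    resample = rng.choice(sample, size=sample.size, replace=True)
    boot_means[b] = resample.mean()

# The spread of the bootstrap means estimates the standard error of the sample mean
print(boot_means.mean(), boot_means.std(ddof=1))
```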

  4. TYPES OF BOOTSTRAPPING • In parametric bootstrapping, the probability distribution of the underlying population is assumed known. • In nonparametric bootstrapping, we do not know the distribution from which the (original) sample was obtained. • We let the sample lead us to the underlying population, which we approximate by intensive bootstrap sampling. • The sketch below contrasts the two.
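A sketch of the contrast for the sample mean, under the illustrative assumption that the parametric branch fits a normal distribution; the data themselves are simulated only for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
sample = rng.normal(loc=5.0, scale=1.5, size=30)  # stand-in observed data
n_reps = 2000

# Parametric bootstrap: assume a distributional form (normal here, for illustration),
# estimate its parameters from the sample, then simulate new samples from that fit.
mu_hat, sigma_hat = sample.mean(), sample.std(ddof=1)
param_means = rng.normal(mu_hat, sigma_hat, size=(n_reps, sample.size)).mean(axis=1)

# Nonparametric bootstrap: make no distributional assumption; resample the data itself.
nonparam_means = np.array([
    rng.choice(sample, size=sample.size, replace=True).mean()
    for _ in range(n_reps)
])

# Compare the two estimated standard errors of the mean
print(param_means.std(ddof=1), nonparam_means.std(ddof=1))
```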

  5. NONPARAMETRIC BOOTSTRAPPING: FOUR APPROACHES TO STATISTICAL INFERENCE • 1. Confidence Intervals Based on the Normal Approximation • Relies heavily on a strong parametric assumption (approximate normality of the statistic) • May be justified when the sample size is large • 2. Confidence Intervals Based on Bootstrap Percentiles • Avoids the parametric assumptions of the traditional and normal-approximation methods • Is simple to implement: we only compute the statistic of interest for each bootstrap sample and read off its percentiles • But may be biased, especially if the sample size is small • A sketch of both interval types follows.
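A sketch of these first two intervals for the sample mean, assuming a bootstrap distribution built as in the earlier sketch; the 95% level and the 1.96 critical value are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
sample = rng.normal(5.0, 1.5, size=30)  # stand-in observed data
boot_means = np.array([
    rng.choice(sample, size=sample.size, replace=True).mean()
    for _ in range(2000)
])

# 1. Normal approximation: observed statistic +/- 1.96 * bootstrap standard error
se_boot = boot_means.std(ddof=1)
normal_ci = (sample.mean() - 1.96 * se_boot, sample.mean() + 1.96 * se_boot)

# 2. Percentile method: 2.5th and 97.5th percentiles of the bootstrap distribution
percentile_ci = tuple(np.percentile(boot_means, [2.5, 97.5]))

print(normal_ci, percentile_ci)
```

With a large number of replications and a roughly symmetric bootstrap distribution, the two intervals will usually be close; they diverge when the bootstrap distribution is skewed.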

  6. NONPARAMETRIC BOOTSTRAPPING: FOUR APPROACHES TO STATISTICAL INFERENCE • 3. Bias-Corrected (BC) and Accelerated Bias-Corrected (ABC) Confidence Intervals • 4. The Percentile Bootstrap-t Method • The equivalent of the classical t test, but using t* = (X̄* − X̄) / (s*/√n), where X̄* and s* are the mean and standard deviation of a bootstrap sample. • Since we treat the original sample as the “population,” we can treat the mean value of the original sample, X̄, as μx. • A sketch of the bootstrap-t interval follows.
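A sketch of the percentile bootstrap-t interval for the mean, under the assumption stated on the slide that the original sample mean stands in for μx; the data and replication count are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
sample = rng.normal(5.0, 1.5, size=30)  # stand-in observed data
n = sample.size
xbar, s = sample.mean(), sample.std(ddof=1)

t_stars = np.empty(2000)
for b in range(t_stars.size):
    res = rng.choice(sample, size=n, replace=True)
    # t* = (bootstrap mean - original mean) / bootstrap standard error of the mean
    t_stars[b] = (res.mean() - xbar) / (res.std(ddof=1) / np.sqrt(n))

# Use the percentiles of t* in place of the Student-t critical values
t_lo, t_hi = np.percentile(t_stars, [2.5, 97.5])
ci = (xbar - t_hi * s / np.sqrt(n), xbar - t_lo * s / np.sqrt(n))
print(ci)
```

Note that the lower limit uses the upper percentile of t* and vice versa, because the interval is obtained by inverting the t* statistic around the observed mean.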

  7. WHICH APPROACH SHOULD I USE? • 1. If the bootstrap distribution (of a statistic) is approximately normal, all methods should give similar confidence intervals as the number of replications becomes large. • 2. If the statistic is an unbiased estimator of its population value, the percentile and bias-corrected (BC) methods should give similar results. • 3. If the observed value of a statistic is equal to the median of the bootstrap distribution, the BC and percentile confidence intervals will be the same. • 4. For biased statistics, the BC method should yield confidence intervals with coverage probability closer to the nominal value of 95% (or other specified confidence coefficient). • Two quick checks of these conditions are sketched below.
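A sketch of two quick diagnostics behind these rules of thumb, assuming SciPy is available: near-zero skewness of the bootstrap distribution suggests approximate normality, and a near-zero gap between its median and the observed statistic suggests the percentile and BC intervals will be close.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
sample = rng.normal(5.0, 1.5, size=30)  # stand-in observed data
boot_means = np.array([
    rng.choice(sample, size=sample.size, replace=True).mean()
    for _ in range(2000)
])

skewness = stats.skew(boot_means)                    # near 0: bootstrap distribution roughly normal
median_gap = np.median(boot_means) - sample.mean()   # near 0: percentile and BC intervals agree
print(skewness, median_gap)
```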

  8. LIMITATIONS • Bootstrapping may not be appropriate in the following cases: • The original sample is not a random sample. • The sample size is too small. • The underlying distributions have infinite moments. • The goal is to estimate extreme values. • The data come from survey sampling (where observations are not drawn independently).
