
Math 3680 Lecture #12 The Central Limit Theorem for Sample Sums and Sample Averages


Presentation Transcript


  1. Math 3680 Lecture #12 The Central Limit Theorem for Sample Sums and Sample Averages

  2. In the previous lecture, we introduced the central limit theorem when drawing from a box containing exclusively “0”s and “1”s. We now generalize this technique to other kinds of populations: heights of college students, incomes, or anything else which is not dichotomous.

  3. Linear Combinations of Random Variables

  4. Theorem. For any random variables X and Y and any constants a, b, and c, we have E(aX + bY + c) = a E(X) + b E(Y) + c. Note: X and Y do not have to be independent.
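
A quick numerical illustration of why the note matters (a minimal sketch, not from the slides, using NumPy with a die-roll X and a deliberately dependent Y = X²):

```python
import numpy as np

# Sketch (not from the slides): check E(aX + bY + c) = a E(X) + b E(Y) + c
# numerically, with deliberately DEPENDENT X and Y (here Y = X**2), since the
# theorem does not require independence.
rng = np.random.default_rng(0)
a, b, c = 3.0, -2.0, 5.0

X = rng.integers(1, 7, size=1_000_000)      # rolls of a fair die
Y = X ** 2                                  # strongly dependent on X

print(np.mean(a * X + b * Y + c))           # left-hand side, by simulation
print(a * np.mean(X) + b * np.mean(Y) + c)  # right-hand side
```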

  5. Definition: Random variables X and Y are independent if, for all constants a and b, P(X ≤ a and Y ≤ b) = P(X ≤ a) P(Y ≤ b). For discrete random variables, this is the same as saying that for all x and all y, P(X = x and Y = y) = P(X = x) P(Y = y). Theorem. If X and Y are independent, then E(XY) = E(X) E(Y).

  6. Proof. We show the discrete case; the continuous case is analogous (using integrals instead of summations).
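
The computation itself is not in the transcript; the standard discrete-case derivation runs as follows:

```latex
% Standard discrete-case derivation (reconstructed, not transcribed from the slide)
E(XY) = \sum_{x}\sum_{y} xy \, P(X = x,\, Y = y)
      = \sum_{x}\sum_{y} xy \, P(X = x)\, P(Y = y)      % by independence
      = \Big( \sum_{x} x\, P(X = x) \Big) \Big( \sum_{y} y\, P(Y = y) \Big)
      = E(X)\, E(Y).
```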

  7. Theorem. For any random variable X and any constant a, we have Var(aX) = a² Var(X) and SD(aX) = |a| SD(X). (Remember this from earlier?)

  8. Theorem. For any independent random variables X and Y, we have Var(X + Y) = Var(X) + Var(Y). Proof.
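
The proof is not in the transcript; the standard argument, using the expectation facts above, is:

```latex
% Standard argument (reconstructed), using E(XY) = E(X)E(Y) from slide 5
\mathrm{Var}(X + Y)
  = E\big[(X + Y)^2\big] - \big(E(X) + E(Y)\big)^2
  = E(X^2) + 2E(XY) + E(Y^2) - E(X)^2 - 2E(X)E(Y) - E(Y)^2
  = \big(E(X^2) - E(X)^2\big) + \big(E(Y^2) - E(Y)^2\big)   % cross terms cancel by independence
  = \mathrm{Var}(X) + \mathrm{Var}(Y).
```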

  9. Note: The assumption of independence is critical in the last theorem. For example, if X = Y, then Var(X + X) = Var(2X) = 4 Var(X) ≠ Var(X) + Var(X).

  10. Example. Let X and Y be independent r.v.’s with X ~ Binomial(8, 0.4) and Y ~ Binomial(8, 0.4). Find E(X²) and E(XY).
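
For reference: here E(X) = 8(0.4) = 3.2 and Var(X) = 8(0.4)(0.6) = 1.92, so E(X²) = Var(X) + E(X)² = 1.92 + 10.24 = 12.16, and by independence E(XY) = E(X) E(Y) = 10.24. A simulation sketch (not from the slides) checking these values:

```python
import numpy as np

# Sketch of a numerical check for slide 10 (Binomial(8, 0.4)).
# Analytically: E(X^2) = Var(X) + E(X)^2 = 1.92 + 3.2^2 = 12.16,
# and by independence E(XY) = E(X) E(Y) = 3.2 * 3.2 = 10.24.
rng = np.random.default_rng(0)
X = rng.binomial(8, 0.4, size=1_000_000)
Y = rng.binomial(8, 0.4, size=1_000_000)
print(np.mean(X ** 2), np.mean(X * Y))   # ≈ 12.16 and ≈ 10.24
```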

  11. Example. Let S be the sum of 5 thrown dice. Find E(S) and SD(S).
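
Worked values: one die has mean 3.5 and variance 35/12, so E(S) = 5(3.5) = 17.5 and SD(S) = √(5 · 35/12) ≈ 3.82. A simulation sketch (not from the slides):

```python
import numpy as np

# Sketch for slide 11: S = sum of 5 fair dice.
# One die has mean 3.5 and variance 35/12, so
# E(S) = 5 * 3.5 = 17.5 and SD(S) = sqrt(5 * 35/12) ≈ 3.82.
rng = np.random.default_rng(0)
S = rng.integers(1, 7, size=(1_000_000, 5)).sum(axis=1)
print(S.mean(), S.std())   # ≈ 17.5 and ≈ 3.82
```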

  12. The Central Limit Theorem (or the Law of Averages for the Sample Sum)

  13. The normal approximation may be used for random variables besides binomial ones. Theorem. Suppose random variables X1, X2, …, Xn are drawn with replacement from a large population with mean μ and standard deviation σ. Let SUM = X1 + X2 + … + Xn. Then E(SUM) = nμ and SD(SUM) = √n · σ. (Why?) Furthermore, if n is “large,” then we may accurately approximate probabilities of SUM by converting to standard units and using the normal curve.
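
A simulation sketch (not from the slides; the box contents here are made up for illustration) confirming E(SUM) = nμ and SD(SUM) = √n · σ for draws with replacement:

```python
import numpy as np

# Sketch (not from the slides): simulate SUM = X1 + ... + Xn for draws
# with replacement from an arbitrary box, and compare the empirical mean
# and SD of SUM against n*mu and sqrt(n)*sigma.
rng = np.random.default_rng(0)
box = np.array([2, 3, 3, 8, 14])        # arbitrary illustrative population
mu, sigma = box.mean(), box.std()
n = 100

sums = rng.choice(box, size=(100_000, n), replace=True).sum(axis=1)
print(sums.mean(), n * mu)              # ≈ n * mu
print(sums.std(), np.sqrt(n) * sigma)   # ≈ sqrt(n) * sigma
```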

  14. The normal approximation may be used for random variables besides binomial ones. Theorem. Suppose random variables X1, X2, …, Xn are drawn without replacement from a population of size N with mean μ and standard deviation σ. Let SUM = X1 + X2 + … + Xn. Then E(SUM) = nμ and SD(SUM) = √n · σ · √((N − n)/(N − 1)). (Why?) Furthermore, if n is “large,” then we may accurately approximate probabilities of SUM by converting to standard units and using the normal curve.
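
A simulation sketch (not from the slides; the population below is hypothetical) illustrating the finite-population correction factor √((N − n)/(N − 1)) when drawing without replacement:

```python
import numpy as np

# Sketch (not from the slides): for draws WITHOUT replacement from a
# population of size N, the SD of the sum picks up the correction
# factor sqrt((N - n) / (N - 1)).
rng = np.random.default_rng(0)
population = rng.normal(50, 10, size=1000)   # hypothetical population, N = 1000
N, n = len(population), 200
mu, sigma = population.mean(), population.std()

sums = np.array([rng.choice(population, size=n, replace=False).sum()
                 for _ in range(10_000)])
print(sums.std())                                        # empirical SD of the sum
print(np.sqrt(n) * sigma * np.sqrt((N - n) / (N - 1)))   # theoretical SD
```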

  15. Question: How large is large “enough” for the normal curve to be applicable? The answer is: it depends. If the box itself follows the normal distribution exactly, then so will SUM, no matter what the value of n is. However, this trivial case rarely happens in practice. For a more typical example, let’s look at the box with P(X = 1) = 1/3, P(X = 2) = 1/3, P(X = 3) = 1/3.

  16. Another example: suppose P(X = 1) = 1/7, P(X = 2) = 1/7, P(X = 5) = 3/7, P(X = 9) = 1/7, P(X = 20) = 1/7.
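
The histograms that presumably accompanied slides 15 and 16 are not in the transcript. A sketch for regenerating comparable pictures (box contents taken from the two slides; the plotting choices are my own): the symmetric box looks roughly normal even for modest n, while the skewed box needs a larger n before the normal curve fits well.

```python
import numpy as np
import matplotlib.pyplot as plt

# Sketch: histograms of sample sums from the two boxes on slides 15 and 16.
rng = np.random.default_rng(0)
boxes = {"symmetric box {1,2,3}": [1, 2, 3],
         "skewed box {1,2,5,5,5,9,20}": [1, 2, 5, 5, 5, 9, 20]}
n = 25   # number of draws per sum; try several values of n

fig, axes = plt.subplots(1, 2, figsize=(10, 4))
for ax, (name, box) in zip(axes, boxes.items()):
    sums = rng.choice(box, size=(100_000, n)).sum(axis=1)
    ax.hist(sums, bins=40, density=True)
    ax.set_title(f"{name}, n = {n}")
plt.show()
```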

  17. Example. Two hundred tickets are drawn at random with replacement from the following box of tickets: 1  2  3  4  5. • What is the smallest possible sum? The biggest? • What is the expected sum? • Find the probability that the sum of the tickets is more than 630.
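
Worked values: the smallest possible sum is 200 and the largest is 1000; the box has mean 3 and SD √2, so E(SUM) = 600 and SD(SUM) = √200 · √2 = 20, giving P(SUM > 630) ≈ P(Z > 1.5) ≈ 0.067. A sketch of the computation (using SciPy’s normal CDF; continuity correction ignored):

```python
import numpy as np
from scipy.stats import norm

# Sketch for slide 17: box = {1, 2, 3, 4, 5}, 200 draws with replacement.
box = np.array([1, 2, 3, 4, 5])
n = 200
mu, sigma = box.mean(), box.std()           # 3 and sqrt(2)
ev, sd = n * mu, np.sqrt(n) * sigma         # 600 and 20
print(ev, sd)
print(1 - norm.cdf(630, loc=ev, scale=sd))  # P(SUM > 630) ≈ 0.067
```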

  18. Example: Thirty-six jets wait to take off from an airport. The average taxi and take-off time for each jet is 8.5 minutes, with an SD of 2.5 minutes. What is the probability that the total taxi and take-off time for the 36 jets is less than 320 minutes?
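
Worked values (treating the 36 times as independent draws): E(total) = 36 · 8.5 = 306 minutes and SD(total) = √36 · 2.5 = 15 minutes, so P(total < 320) ≈ P(Z < 0.93) ≈ 0.82. A sketch:

```python
import numpy as np
from scipy.stats import norm

# Sketch for slide 18: n = 36 jets, mean 8.5 min and SD 2.5 min per jet.
n, mu, sigma = 36, 8.5, 2.5
ev, sd = n * mu, np.sqrt(n) * sigma   # 306 minutes, 15 minutes
z = (320 - ev) / sd                   # ≈ 0.93
print(norm.cdf(z))                    # P(total < 320) ≈ 0.82
```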

  19. Example. A gambler makes 1000 column bets at roulette. The chance of winning on any one play is 12/38. The gambler can either win $2 or lose $1 on each play. Find the probability that, in total, the gambler wins at least $0.

  20. Example. A gambler makes 10,000 column bets at roulette. The chance of winning on any one play is 12/38. The gambler can either win $2 or lose $1 on each play. Find the probability that, in total, the gambler wins at least $0.
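
Worked values for slides 19 and 20: each play has mean 2(12/38) − 1(26/38) ≈ −$0.053 and SD 3·√((12/38)(26/38)) ≈ $1.39. A sketch covering both bet counts (continuity correction ignored):

```python
import numpy as np
from scipy.stats import norm

# Sketch for slides 19-20: each column bet wins $2 with probability 12/38
# and loses $1 with probability 26/38.
p = 12 / 38
mu = 2 * p - 1 * (1 - p)                   # ≈ -$0.0526 per play
sigma = (2 - (-1)) * np.sqrt(p * (1 - p))  # ≈ $1.39 per play

for n in (1_000, 10_000):
    ev, sd = n * mu, np.sqrt(n) * sigma
    print(n, 1 - norm.cdf(0, loc=ev, scale=sd))   # P(total winnings >= $0)
# ≈ 0.12 for 1,000 plays, but only ≈ 0.00008 for 10,000 plays
```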
