
Statistics Workshop: Concepts of Probability (J-Term 2009, Bert Kritzer)



Presentation Transcript


  1. Statistics Workshop: Concepts of Probability (J-Term 2009, Bert Kritzer)

  2. Why Probability? • Statistical inference seeks to separate what is observed into “systematic” and “random” components: Observation = Systematic + Random • Inference about characteristics of a “population” using information from a random sample • Estimation of population “parameters” • Nonrandom samples: the Literary Digest predicting Landon in a landslide • Inference about processes: random vs. systematic • Inference using population data

  3. Defining and Expressing Probability • Definition: the proportion in the long run • a priori • empirical • subjective • Representing probability • as a proportion between 0 and 1 (p or π) • as odds (Ω)
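For reference (not shown in the transcript), a probability p and the corresponding odds Ω are related by:

$$\Omega = \frac{p}{1-p}, \qquad p = \frac{\Omega}{1+\Omega}$$

For example, p = .75 corresponds to odds of Ω = 3 (three to one).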

  4. Properties of Probabilities
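The slide's formulas did not survive extraction; the basic properties such a slide standardly lists are:

$$0 \le P(E) \le 1, \qquad P(S) = 1 \ \text{for the certain event } S, \qquad P(\bar{E}) = 1 - P(E)$$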

  5. Multiplication Rule: Independent Events. Head on first toss and tail on second toss:
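The equation is missing from the transcript; for independent events the joint probability is the product of the marginal probabilities, so for the coin example:

$$P(H_1 \cap T_2) = P(H_1)\,P(T_2) = \tfrac{1}{2} \cdot \tfrac{1}{2} = \tfrac{1}{4}$$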

  6. Addition Rule: Exclusive Events. Exactly one tail on two tosses:
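The equation is missing from the transcript; reading the example as exactly one tail, the two ways it can occur (HT and TH) are mutually exclusive, so their probabilities add:

$$P(HT \cup TH) = P(HT) + P(TH) = \tfrac{1}{4} + \tfrac{1}{4} = \tfrac{1}{2}$$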

  7. Three Dice Rolls. E = rolling a 1 or a 2, so P(E) = ⅓ and P(Ē) = ⅔ on each roll. [Probability tree over three rolls, reconstructed from the slide: branching E/Ē at each level yields the eight outcomes EEE, EEĒ, EĒE, EĒĒ, ĒEE, ĒEĒ, ĒĒE, ĒĒĒ.]

  8. General Addition Rule. G = heart, H = face card. [The slide lays out the full 52-card deck by suit (♥ ♦ ♠ ♣, A through K); the hearts and the face cards overlap in J♥, Q♥, K♥.]
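The worked equation did not survive extraction; with 13 hearts, 12 face cards, and an overlap of 3 cards (J♥, Q♥, K♥), the general addition rule gives:

$$P(G \cup H) = P(G) + P(H) - P(G \cap H) = \tfrac{13}{52} + \tfrac{12}{52} - \tfrac{3}{52} = \tfrac{22}{52} \approx .42$$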

  9. Conditional Probability. The probability that an event A will occur given (|) that event B has already occurred; that is, the probability of A conditional on B having already occurred:
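The defining formula is missing from the transcript; the standard definition is:

$$P(A \mid B) = \frac{P(A \cap B)}{P(B)}, \qquad P(B) > 0$$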

  10. General Multiplication Rule. From the definition of conditional probability we can see:
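The equation is missing from the transcript; multiplying both sides of the conditional-probability definition by P(B) gives the general multiplication rule:

$$P(A \cap B) = P(B)\,P(A \mid B) = P(A)\,P(B \mid A)$$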

  11. Statistical Independence. An event F is called statistically independent of an event E if, and only if, P(F|E) = P(F). Coin flips: P(Head|Head) = P(Head). Card deck cut: P(Ace|Ace) = P(Ace). Dealt cards: P(Ace|Ace) ≠ P(Ace).
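A worked version of the dealt-cards example (the slide's numbers did not survive, but the arithmetic is standard): once one ace has been dealt, 3 aces remain among 51 cards, so

$$P(\text{Ace}_2 \mid \text{Ace}_1) = \tfrac{3}{51} \ne \tfrac{4}{52} = P(\text{Ace})$$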

  12. Multiplication Rule and Statistical Independence. For statistically independent events the rule simplifies as shown below. If F is statistically independent of E, then E must be statistically independent of F.
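The formula did not survive extraction; for statistically independent events the general multiplication rule reduces to

$$P(E \cap F) = P(E)\,P(F)$$

The symmetry claim follows because, when P(E) and P(F) are positive, P(F|E) = P(F) holds exactly when P(E|F) = P(E).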

  13. Summary. The rules for a single event and for two events occurring together, each in its general case and its special case: if E and F are statistically independent, and if E and F are mutually exclusive.
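The summary table's formulas did not survive extraction; assembled from the preceding slides, it presumably read:

$$\text{Addition (general):}\ P(E \cup F) = P(E) + P(F) - P(E \cap F); \quad \text{mutually exclusive:}\ P(E \cup F) = P(E) + P(F)$$

$$\text{Multiplication (general):}\ P(E \cap F) = P(E)\,P(F \mid E); \quad \text{independent:}\ P(E \cap F) = P(E)\,P(F)$$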

  14. The Idea of a Random Variable • The result of a random process • a coin flip, heads or tails • the number of heads on ten flips of a coin • a die roll: 1, 2, 3, 4, 5, or 6 • the sum of two or more dice rolls • the number of red M&M’s in a bag • In general • an observed value selected randomly from some known or unknown distribution • some algebraic “function” of one or more random variables • sum • mean
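A minimal simulation sketch (not part of the slides) of a random variable as the numeric result of a random process, here the sum of two honest dice, using only Python's standard library:

```python
import random

# A random variable assigns a number to each outcome of a random process.
# Here: the sum of two honest dice, observed over many independent trials.
trials = 100_000
sums = [random.randint(1, 6) + random.randint(1, 6) for _ in range(trials)]

# Empirical distribution of the sum; compare with the exact probabilities,
# e.g. P(sum = 7) = 6/36 ≈ .167.
for value in range(2, 13):
    print(value, sums.count(value) / trials)
```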

  15. One Honest Die

  16. Sum of Two Honest Dice

  17. An Intentional Distribution

  18. Age by 10

  19. Age by 5

  20. Age by 3

  21. Age by 2

  22. Age by 1

  23. Smoothed Age Curve

  24. Discrete Distribution

  25. Discrete vs. Continuous

  26. Expected Value, or the Mean of a Random Variable. If you were to roll an honest die many times, and you were paid $1 for a 1, $2 for a 2, etc., what would you expect the payout to average out to per roll?
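The worked answer is missing from the transcript; for an honest die the expected payout per roll is

$$E[X] = \sum_x x\,P(x) = (1 + 2 + 3 + 4 + 5 + 6) \cdot \tfrac{1}{6} = 3.5$$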

  27. Computing Expected Values Honest Die Dishonest Die
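The honest-die and dishonest-die tables did not survive extraction; the sketch below computes E[X] = Σ x·P(x) for an honest die and for a loaded die whose probabilities are invented for illustration, not taken from the slide:

```python
# Expected value: E[X] = sum over outcomes of x * P(x).
honest = {x: 1 / 6 for x in range(1, 7)}

# Hypothetical loaded die, weighted toward 6; these probabilities are
# assumptions for illustration and are not the slide's values.
loaded = {1: 0.10, 2: 0.10, 3: 0.15, 4: 0.15, 5: 0.20, 6: 0.30}

def expected_value(dist):
    return sum(x * p for x, p in dist.items())

print(expected_value(honest))  # ≈ 3.5
print(expected_value(loaded))  # 4.15: the loading raises the mean payout
```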

  28. Variance of a Random Variable
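The slide's formula did not survive extraction; the standard definition, with the honest-die value worked out, is

$$\mathrm{Var}(X) = E[(X - \mu)^2] = E[X^2] - \mu^2 = \tfrac{91}{6} - 3.5^2 = \tfrac{35}{12} \approx 2.92$$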

  29. Joint Distribution of X and Y

  30. Three Dimensional Histogram

  31. Covariance of Two Random Variables
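The defining formula is missing from the transcript; the standard definition is

$$\mathrm{Cov}(X, Y) = E[(X - \mu_X)(Y - \mu_Y)] = E[XY] - \mu_X \mu_Y$$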

  32. Joint Distribution of X and Y

  33. “Correlation” If two variables are statistically independent their covariance is 0 and their correlation is 0!
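The correlation formula did not survive extraction; the standard definition scales the covariance by the two standard deviations:

$$\rho = \frac{\mathrm{Cov}(X, Y)}{\sigma_X\,\sigma_Y}$$

Note the implication runs one way: independence forces ρ = 0, but ρ = 0 does not by itself imply independence.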

  34. Estimating the Mean: Population of Car Values

  35. One Sample of 100

  36. 1,000 Samples of 100

  37. Three Distributions

  38. Sample Mean as Weighted Sum of Random Variables
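The formula is missing from the transcript; the sample mean is a weighted sum in which every observation receives the same weight, 1/n:

$$\bar{X} = \sum_{i=1}^{n} \tfrac{1}{n} X_i$$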

  39. Expected Value and Variance of the Sample Mean If observations are statistically independent:
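The slide's formulas did not survive extraction; for statistically independent observations with mean μ and variance σ², the standard results are

$$E[\bar{X}] = \mu, \qquad \mathrm{Var}(\bar{X}) = \frac{\sigma^2}{n}, \qquad \mathrm{SE}(\bar{X}) = \frac{\sigma}{\sqrt{n}}$$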

  40. 1,000 Samples of 100

  41. Sampling Distribution of the Sample Mean of a Normal Distribution. If a set of random variables is selected from a normal distribution, any value formed by summing those random variables is also normally distributed.

  42. Sampling Distribution of Sample Mean of a Uniform Distribution

  43. Central Limit Theorem • The sampling distribution of a sample mean of X approaches normality as the sample size gets large, regardless of the distribution of X. • The mean of this sampling distribution is μX and the standard deviation (standard error) is σX/√n (if the random variables are statistically independent). • The sampling distribution of any “linear combination” of N random variables approaches normality as N gets large.
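A small simulation sketch (not part of the slides) of the theorem: sample means of a decidedly non-normal variable, here an exponential, come out centered at μ with spread σ/√n:

```python
import random
import statistics

# Draw many samples of size n from Exp(1), a skewed, non-normal
# distribution, and record each sample's mean.
random.seed(1)
n, reps = 100, 1_000
means = [statistics.mean(random.expovariate(1.0) for _ in range(n))
         for _ in range(reps)]

# For Exp(1), mu = 1 and sigma = 1, so the CLT predicts the sample means
# cluster around 1 with standard error sigma / sqrt(n) = 0.1.
print(statistics.mean(means))   # close to 1
print(statistics.stdev(means))  # close to 0.1
```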

  44. Sampling Distribution of Sample Mean of an Arbitrary Distribution

  45. Feeling Thermometer: SCOTUS

  46. Trust in the Police by Race: Percent Trusting the Police at Least Most of the Time

  47. SCOTUS FT (y) by Liberals FT (x). Population: ρ = .126, y = 56.17 + .118x. Sample (n = 86): r = .276, y = 44.42 + .309x.
