
Statistics Review for CM 160/260




Presentation Transcript


  1. Statistics Review for CM 160/260

  2. Topics • Counting • Permutations • Combinations • Set Theory • Probability & Conditional Probability • Independence • Bayes Theorem • Random Variables • Discrete vs. Continuous • Probability Density Function (pdf) • Simple discrete, Uniform, Binomial,… • Expectation & Variance

  3. Reader pg 142–52, 437 Counting • Experiments? • Coin toss, die roll, DNA sequencing • Basic Principle of Counting: • For a sequence of r experiments • The first experiment can have n1 possible outcomes, the second n2, the third n3, … • Total number of possible outcomes is n1 · n2 · n3 · … · nr
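A quick numeric sketch of the basic principle of counting, assuming a made-up sequence of experiments (three coin tosses followed by one die roll); Python is used here only for illustration:

```python
import math

# Basic principle of counting: total outcomes = n1 * n2 * ... * nr.
# Assumed experiments: three coin tosses (2 outcomes each), then one die roll (6 outcomes).
outcomes_per_experiment = [2, 2, 2, 6]
total = math.prod(outcomes_per_experiment)
print(total)  # 48
```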

  4. Factorials m! = m(m−1)(m−2)(m−3)·…·3·2·1 5! = 5·4·3·2·1 = 120 0! = 1 1! = 1 100! = 100·99·98·…·3·2·1 ≈ 9 × 10^157
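A small check of these values using Python's standard library, just to make the magnitudes concrete:

```python
import math

print(math.factorial(5))              # 120
print(math.factorial(0))              # 1
print(math.factorial(1))              # 1
print(len(str(math.factorial(100))))  # 158 digits, i.e. 100! is on the order of 10^157
```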

  5. Permutations • Ordered arrangements of things • How many 3-letter permutations of the letters a, b & c are there? • abc, acb, bac, bca, cba, cab → 6 total • Can use the Basic Principle of Counting: • 3·2·1 = 6 • General Formula: n! / (n−k)! • n = total number of things • k = size of the groups you're taking • k ≤ n • 3! / (3−3)! = 6
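A minimal sketch that enumerates the permutations of a, b, c and compares the count against the n!/(n−k)! formula:

```python
import math
from itertools import permutations

letters = ['a', 'b', 'c']
perms = list(permutations(letters, 3))             # all ordered arrangements of size 3
print(perms)                                       # 6 tuples: abc, acb, bac, ...
print(len(perms))                                  # 6
print(math.factorial(3) // math.factorial(3 - 3))  # 6, i.e. n!/(n-k)! with n = k = 3
```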

  6. Permutations • What if some of the things are identical? • How many permutations of the letters a, a, b, c, c & c are there? • General formula: n! / (n1!·n2!·…·nr!), where n1, n2, … nr are the number of objects that are alike • 6! / (2!·1!·3!) = 60
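A brute-force check of the same count: generating all orderings of a, a, b, c, c, c and collapsing duplicates should leave 60 distinct arrangements, matching the formula:

```python
import math
from itertools import permutations

letters = 'aabccc'
distinct = set(permutations(letters))   # identical letters make many orderings coincide
print(len(distinct))                    # 60
print(math.factorial(6) // (math.factorial(2) * math.factorial(1) * math.factorial(3)))  # 60
```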

  7. Combinations • Groups of things (order doesn't matter) • How many 3-letter combinations of the letters a, b & c are there? 1: abc • How many 2-letter combinations of the letters a, b & c are there? 3: ab, ac, bc • ab = ba; ac = ca; bc = cb *Order doesn't matter • General Formula: n! / (k!(n−k)!), read "n choose k" • n = total number of things • k = size of the groups you're taking • k ≤ n
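The same counts checked in Python, using the built-in "n choose k" function:

```python
import math
from itertools import combinations

letters = ['a', 'b', 'c']
print(list(combinations(letters, 3)))  # [('a', 'b', 'c')]           -> 1 combination
print(list(combinations(letters, 2)))  # [('a','b'), ('a','c'), ('b','c')] -> 3 combinations
print(math.comb(3, 3))                 # 1, i.e. "3 choose 3"
print(math.comb(3, 2))                 # 3, i.e. "3 choose 2"
```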

  8. Reader pg 129–34 Set Theory • The Sample Space of an experiment is the set of all possible values/outcomes of the experiment S = {a, b, c, d, e, f, g, h, i, j} S = {Heads, Tails} S = {1, 2, 3, 4, 5, 6} E = {a, b, c, d} F = {b, c, d, e, f, g} E ⊆ S, F ⊆ S E^c = {e, f, g, h, i, j} F^c = {a, h, i, j} E ∪ F = {a, b, c, d, e, f, g} E ∩ F = EF = {b, c, d}
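These set operations map directly onto Python's built-in set type, shown here with the same sample space and events:

```python
S = set('abcdefghij')
E = set('abcd')
F = set('bcdefg')

print(S - E)   # complement of E: {'e','f','g','h','i','j'}
print(S - F)   # complement of F: {'a','h','i','j'}
print(E | F)   # union E ∪ F: {'a','b','c','d','e','f','g'}
print(E & F)   # intersection E ∩ F (written EF): {'b','c','d'}
```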

  9. Venn Diagrams (figure: three events E, F, G drawn inside the sample space S)

  10. Reader pg 135 Simple Probability • Frequent assumption: all outcomes equally likely to occur • The probability of an event E is simply: (number of possible outcomes in E) / (number of total possible outcomes) S = {a, b, c, d, e, f, g, h, i, j} E = {a, b, c, d} F = {b, c, d, e, f, g} P(E) = 4/10 P(F) = 6/10 P(S) = 1 0 ≤ P(E) ≤ 1 P(E^c) = 1 − P(E) P(E ∪ F) = P(E) + P(F) − P(EF)
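A small sketch that computes these probabilities from the sets themselves under the equally-likely assumption:

```python
from fractions import Fraction

S = set('abcdefghij')
E = set('abcd')
F = set('bcdefg')

def P(A):
    """Probability under the equally-likely assumption: |A| / |S|."""
    return Fraction(len(A), len(S))

print(P(E))                      # 2/5  (= 4/10)
print(P(F))                      # 3/5  (= 6/10)
print(P(S))                      # 1
print(1 - P(E))                  # P(E^c) = 3/5
print(P(E | F))                  # 7/10
print(P(E) + P(F) - P(E & F))    # 7/10, matching the union formula
```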

  11. Reader pg 165 Independence • Two events E & F are independent if the occurrence of one does not affect the probability of the other. • So if E & F are independent, then: P(EF) = P(E)·P(F) • If E, F & G are independent, then: P(EFG) = P(E)·P(F)·P(G)
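A sketch of the product test for independence, using an assumed die-roll example (E = "even", F = "at most 4") chosen so the two events happen to be independent:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}   # one fair die roll
E = {2, 4, 6}            # even
F = {1, 2, 3, 4}         # at most 4

def P(A):
    return Fraction(len(A), len(S))

print(P(E & F))       # 1/3
print(P(E) * P(F))    # 1/2 * 2/3 = 1/3  -> equal, so E and F are independent
```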

  12. Conditional Probability (Venn diagram: events E and F with overlap EF inside S) • Given E, the probability of F is: P(F|E) = P(EF) / P(E) • Similarly: P(E|F) = P(EF) / P(F)
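The same definitions checked numerically with the earlier letter sets:

```python
from fractions import Fraction

S = set('abcdefghij')
E = set('abcd')
F = set('bcdefg')

def P(A):
    return Fraction(len(A), len(S))

print(P(E & F) / P(E))   # P(F|E) = (3/10) / (4/10) = 3/4
print(P(E & F) / P(F))   # P(E|F) = (3/10) / (6/10) = 1/2
```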

  13. Bayes Theorem (same Venn diagram: E, F and overlap EF inside S) • Solve for P(EF) to get: P(EF) = P(F|E) P(E) = P(E|F) P(F) • Combine them to get: P(F|E) = P(E|F) P(F) / P(E)
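A quick numeric confirmation that the Bayes rearrangement gives the same conditional probability as the direct definition:

```python
from fractions import Fraction

S = set('abcdefghij')
E = set('abcd')
F = set('bcdefg')

def P(A):
    return Fraction(len(A), len(S))

P_F_given_E = P(E & F) / P(E)              # direct definition: 3/4
P_E_given_F = P(E & F) / P(F)              # 1/2
print(P_E_given_F * P(F) / P(E))           # Bayes: P(E|F) P(F) / P(E) = 3/4
print(P_F_given_E)                         # 3/4, matches
```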

  14. Bayes Theorem (diagram: observed and hidden events inside S) • In terms we use: P(hidden | observed) = P(observed | hidden) P(hidden) / P(observed)

  15. Bayes Theorem (diagram: observed and hidden events inside S) • Also, the denominator can be expanded with the law of total probability: P(E) = Σ_i P(E | F_i) P(F_i) over the possible F_i • In our terms: P(observed) = Σ_hidden P(observed | hidden) P(hidden)
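A numeric sketch of Bayes theorem in observed/hidden terms, using an assumed fair-vs-loaded-coin setup (the prior and the loaded coin's bias below are made-up numbers, not from the slides):

```python
# Hidden: which coin was picked (fair or loaded). Observed: the toss came up heads.
# Assumed numbers: each coin is picked with probability 0.5; the loaded coin shows heads 90% of the time.
prior = {'fair': 0.5, 'loaded': 0.5}
p_heads_given = {'fair': 0.5, 'loaded': 0.9}

# Law of total probability: P(observed) = sum over hidden of P(observed|hidden) P(hidden)
p_heads = sum(p_heads_given[h] * prior[h] for h in prior)   # 0.7

# Bayes: P(hidden|observed) = P(observed|hidden) P(hidden) / P(observed)
p_loaded_given_heads = p_heads_given['loaded'] * prior['loaded'] / p_heads
print(p_loaded_given_heads)   # ~0.643
```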

  16. Reader pg 9 Bayes Theorem

  17. Random Variables • Definition: a variable whose value is determined by the outcome of a random experiment • Each possible value has its own probability • X = Result of coin toss • Heads 50%, Tails 50% • Y = Result of die roll • 1, 2, 3, 4, 5, 6 each 1/6

  18. Discrete vs. Continuous • Discrete random variables can only take a finite (or countable) set of different values. • Die roll, coin flip • Continuous random variables can take on an uncountably infinite number of (real) values • Time of day of an event, height of a person

  19. Reader pg 438 Probability Density Function • Many problems don't have simple probabilities. For those, the probabilities are expressed as a function of the outcome: P(X = a) = f(a) • aka "pdf" • Plug a into some function, e.g. f(a) = 2a^2 − a^3

  20. Some Useful pdf's • Simple cases (like fair/loaded coins/dice, etc.): for a fair coin, P(X = a) = 1/2 for a = Heads and P(X = a) = 1/2 for a = Tails • Uniform random variable ("equally likely"): P(X = a) = 1/n for each of the n possible outcomes

  21. pdf of a Binomial • You will need this: P(k successes in n attempts) = (n choose k) p^k q^(n−k) • Where p = P(success) & q = P(failure) • P(success) + P(failure) = 1 • "n choose k" is the total number of possible ways to get k successes in n attempts
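A minimal sketch of the binomial pdf as a Python function (the helper name binomial_pmf is my own, not from the slides); it also checks that the probabilities over all k sum to 1:

```python
import math

def binomial_pmf(k, n, p):
    """P(exactly k successes in n independent attempts), with success probability p."""
    q = 1 - p                              # P(failure)
    return math.comb(n, k) * p**k * q**(n - k)

print(binomial_pmf(3, 5, 0.5))                            # 0.3125
print(sum(binomial_pmf(k, 5, 0.5) for k in range(6)))     # 1.0 (the pmf sums to 1)
```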

  22. Using the p.d.f. • What is the probability of getting 3 Heads in 5 coin tosses? (Same as 2T in 5 tosses) n = 5 tosses k = 3 Heads p = P(H) = .5 q = P(T) = .5 • P(3H in 5 tosses) = (5 choose 3) p^3 q^2 = 10 p^3 q^2 = 10·P(H)^3·P(T)^2 = 10(.5)^3(.5)^2 = 0.3125

  23. Notice how these are Binomials… • What is the probability of winning the lottery in 2 of your next 3 tries? n = 3 tries k = 2 wins Assume P(win) = 10^−7, P(lose) = 1 − 10^−7 • P(win 2 of 3 lotto) = (3 choose 2) P(win)^2 P(lose) = 3(10^−7)^2(1 − 10^−7) ≈ 3 × 10^−14 • That's about a 3 in 100 trillion chance. Good Luck!

  24. It may be important… • Assume that Kobe stays on a hot streak, shooting a constant 66% (~2/3). What is the probability that he will make 25 of 30 field goals? n = 30 k = 25 P(score) = 2/3 P(miss) = 1/3 • P(25 for 30) = (30 choose 25) P(score)^25 P(miss)^5 = 142506 (2/3)^25 (1/3)^5 ≈ 0.0232
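A quick check of the three worked binomial examples (slides 22–24) using math.comb directly:

```python
import math

# 3 heads in 5 fair-coin tosses (slide 22)
print(math.comb(5, 3) * 0.5**3 * 0.5**2)            # 0.3125
# 2 lottery wins in 3 tries (slide 23)
print(math.comb(3, 2) * (1e-7)**2 * (1 - 1e-7))     # ~3e-14
# 25 field goals out of 30 at a 2/3 success rate (slide 24)
print(math.comb(30, 25) * (2/3)**25 * (1/3)**5)     # ~0.0232
```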

  25. Reader pg 439 Expectation of a Discrete Random Variable • Weighted average of a random variable: E[X] = Σ_x x · p(x) • …Of a function: E[g(X)] = Σ_x g(x) · p(x)
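A small sketch of both formulas for an assumed fair die (each face has probability 1/6), including the expectation of the function g(x) = x^2:

```python
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}   # fair die: p(x) = 1/6 for x = 1..6

E_X = sum(x * p for x, p in pmf.items())         # weighted average of the values
print(E_X)                                       # 7/2 = 3.5

E_X2 = sum(x**2 * p for x, p in pmf.items())     # expectation of g(X) = X^2
print(E_X2)                                      # 91/6
```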

  26. Reader pg 439 Variance • Variation, or spread, of the values of a random variable: Var(X) = E[(X − μ)^2] = E[X^2] − μ^2 • Where μ = E[X]
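The variance of the same assumed fair die, computed both from the definition and from the E[X^2] − μ^2 shortcut:

```python
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}        # fair die

mu = sum(x * p for x, p in pmf.items())               # E[X] = 7/2
var = sum((x - mu)**2 * p for x, p in pmf.items())    # E[(X - mu)^2]
print(var)                                            # 35/12 ≈ 2.92

# Equivalent shortcut: Var(X) = E[X^2] - mu^2
print(sum(x**2 * p for x, p in pmf.items()) - mu**2)  # 35/12
```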
