
Review




  1. Review

  2. Discrete Distributions • Binomial distribution, • Negative binomial distribution, • Hypergeometric distribution, • Poisson distribution.

  3. Expected Value • If X is a discrete rv and p(x) is the value of its probability distribution at x, the expected value of X is defined as • E(X) = Σx x · p(x).

  4. Example • Toss a coin 4 times. X = number of heads. What’s E(X)? • The pmf of X is • x: 0 1 2 3 4 • p(x): 1/16 4/16 6/16 4/16 1/16 • So, E(X) = 0·(1/16) + 1·(4/16) + 2·(6/16) + 3·(4/16) + 4·(1/16) = 32/16 = 2.
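A minimal Python sketch of the calculation above; the dictionary pmf and the variable names are illustrative, not part of the slides.

```python
# Sketch: expected value of a discrete rv from its pmf (coin-toss example above).
pmf = {0: 1/16, 1: 4/16, 2: 6/16, 3: 4/16, 4: 1/16}

expected_value = sum(x * p for x, p in pmf.items())
print(expected_value)  # 2.0 heads on average in 4 tosses
```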

  5. Example • Let X be a Bernoulli rv with pmf p(1) = p and p(0) = 1 − p. • Then E(X) = 0·p(0) + 1·p(1) = p. So the expected value of X is just the probability that X takes on the value 1.

  6. Example • X = number of children born up to and including the first boy. With p = P(a child is a boy), the pmf of X is • p(x) = (1 − p)^(x−1) · p for x = 1, 2, 3, … • Then E(X) = Σx x·(1 − p)^(x−1)·p = 1/p.

  7. Expected Value of a Function of a RV • If a rv X has a pmf p(x), then the expected value of any function h(X) is computed by • E[h(X)] = Σx h(x) · p(x). • Special case: h(x) = a·x + b. • E(aX + b) = a·E(X) + b. • Why? Σx (a·x + b)·p(x) = a·Σx x·p(x) + b·Σx p(x) = a·E(X) + b.
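The sketch below checks E(aX + b) = a·E(X) + b numerically; it reuses the coin-toss pmf from slide 4, and the constants a and b are arbitrary illustrative values.

```python
# Sketch: E[h(X)] = sum of h(x)*p(x), illustrated with h(x) = a*x + b.
pmf = {0: 1/16, 1: 4/16, 2: 6/16, 3: 4/16, 4: 1/16}
a, b = 3, 7

def expect(h, pmf):
    """Expected value of h(X) for a discrete rv with the given pmf."""
    return sum(h(x) * p for x, p in pmf.items())

ex = expect(lambda x: x, pmf)           # E(X) = 2.0
e_h = expect(lambda x: a * x + b, pmf)  # E(aX + b)
print(e_h, a * ex + b)                  # both 13.0, so E(aX+b) = a*E(X) + b here
```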

  8. Variance • The expected value measures the center of a probability distribution. • Variance measures the variability of a pmf.

  9. Variance • Let X have pmf p(x) and expected value µ. Then the variance of X, denoted by V(X) or σ², is • V(X) = σ² = Σx (x − µ)² · p(x) = E[(X − µ)²]. • The standard deviation (SD) of X is • σ = √V(X).

  10. Example • If X has pmf: • x 1 2 6 8 • p(x) .4 .1 .3 .2 • Then • µ = 1×.4 + 2×.1 + 6×.3 + 8×.2 = 4, • σ² = (1 − 4)²×.4 + (2 − 4)²×.1 + (6 − 4)²×.3 + (8 − 4)²×.2 = 8.4, • and σ = 2.90.
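A short Python sketch of the slide-10 computation; the values come from the slide, and the variable names are illustrative.

```python
# Sketch: mean, variance and SD for the pmf on slide 10.
from math import sqrt

pmf = {1: 0.4, 2: 0.1, 6: 0.3, 8: 0.2}

mu = sum(x * p for x, p in pmf.items())               # 4.0
var = sum((x - mu) ** 2 * p for x, p in pmf.items())  # 8.4
sd = sqrt(var)                                        # ~2.90
print(mu, var, round(sd, 2))
```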

  11. A Shortcut Formula • V(X) = E(X²) − [E(X)]². • Proof: V(X) = Σx (x − µ)²·p(x) = Σx (x² − 2µx + µ²)·p(x) = E(X²) − 2µ·E(X) + µ² = E(X²) − µ².

  12. Rules of Variance • V(aX + b) = a²·V(X) and σ_(aX+b) = |a|·σ_X. • In particular, • V(aX) = a²·V(X), σ_aX = |a|·σ_X, and • V(X + b) = V(X).

  13. Moments • The kth moment about the origin of a rv X, denoted by µk′, is the expected value of X^k; symbolically, • µk′ = E(X^k) = Σx x^k · p(x). • The kth moment about the mean of a rv X, denoted by µk, is the expected value of (X − µ)^k; symbolically, • µk = E[(X − µ)^k] = Σx (x − µ)^k · p(x).

  14. Special Cases • The expectation, or the mean, is the 1st moment about the origin: • µ = µ1′ = E(X) = Σx x · p(x). • The variance is the 2nd moment about the mean: • σ² = µ2 = E[(X − µ)²] = Σx (x − µ)² · p(x).

  15. The Binomial Distribution

  16. Binomial Distribution • For X ~ Bin(n, p), the cdf will be denoted by • B(x; n, p) = P(X ≤ x) = Σ_{y≤x} b(y; n, p), where b(y; n, p) = C(n, y)·p^y·(1 − p)^(n−y) is the pmf.

  17. Mean & Variance • If X ~ Bin(n, p), then • E(X) = np, • V(X) = npq (where q = 1 − p).

  18. Example (Cont.) • n = 5, p = 11/32. Then • E(X) = n·p = 5 · 11/32 = 1.72. • V(X) = n·p·q = 5 · 11/32 · 21/32 = 1.13. • σ = (1.13)^(1/2) = 1.06.
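A Python sketch of this binomial example using math.comb; the helper names binom_pmf and binom_cdf are illustrative (a library such as scipy.stats.binom would give the same numbers).

```python
# Sketch: binomial pmf/cdf and moments for n = 5, p = 11/32 (slide 18).
from math import comb, sqrt

n, p = 5, 11 / 32

def binom_pmf(x):
    """b(x; n, p) = C(n, x) p^x (1-p)^(n-x)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

def binom_cdf(x):
    """B(x; n, p) = P(X <= x)."""
    return sum(binom_pmf(y) for y in range(x + 1))

mean = n * p           # 1.72
var = n * p * (1 - p)  # 1.13
print(round(mean, 2), round(var, 2), round(sqrt(var), 2), round(binom_cdf(2), 3))
```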

  19. Hypergeometric and Negative Binomial Distributions

  20. Introduction • The hypergeometric and negative binomial distributions are both closely related to the binomial distribution.

  21. Introduction • The negative binomial distribution arises from fixing the number of S’s and letting the number of trials be random. • The hypergeometric distribution is the exact probability model for sampling without replacement from a finite dichotomous (S, F) population.

  22. Negative Binomial Dist’n • The experiment consists of a sequence of independent trials. • Each trial results in either S or F. • The probability of success, p, is constant from trial to trial. • Trials are performed until a total of r successes has been observed, where r is a prespecified positive integer.

  23. Negative Binomial RV • X, the number of F’s that precede the rth success, is called a negative binomial rv. • Possible values of X are 0, 1, 2, …

  24. pmf • Denote by nb(x; r, p) the pmf of X. Then • nb(x; r, p) = C(x + r − 1, r − 1)·p^r·(1 − p)^x for x = 0, 1, 2, … • Why? • Total # of trials = x + r; the last trial must be a success. Among the first (x + r − 1) trials, there are (r − 1) successes and x failures.
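A sketch of this pmf in Python; the values r = 3 and p = 0.5 are illustrative, not from the slides.

```python
# Sketch: negative binomial pmf nb(x; r, p) = C(x+r-1, r-1) p^r (1-p)^x,
# where x counts the failures preceding the r-th success (slide 23's definition).
from math import comb

def nb_pmf(x, r, p):
    return comb(x + r - 1, r - 1) * p**r * (1 - p)**x

r, p = 3, 0.5
print([round(nb_pmf(x, r, p), 4) for x in range(5)])
# the probabilities sum to 1 over x = 0, 1, 2, ...
print(sum(nb_pmf(x, r, p) for x in range(200)))  # ~1.0
```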

  25. Review of Chapter 3 • Hypergeometric distribution, • Poisson distribution.

  26. Example • Requests arrive at a rate of 5 per hour, so X = number of requests in an hour is Poisson with λ = 5. What’s the probability that < 3 requests are received during a particular hour? • P(X < 3) = P(0) + P(1) + P(2) • = e^(−5) + 5·e^(−5) + 5²·e^(−5)/2 • = 0.125.

  27. Example • What’s the probability that exactly 10 requests are received during a particular 2-hour period? • Rate = 2 × 5 = 10. • P(X = 10) = e^(−10)·10^10/10! = 0.125.

  28. Example • How many calls do they expect to get during a 45-min period? • E(X) = (3/4) · 5 = 3.75.
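A sketch collecting the three Poisson calculations from slides 26-28, assuming the rate of 5 requests per hour used above; poisson_pmf is an illustrative helper name.

```python
# Sketch: Poisson probabilities for slides 26-28.
from math import exp, factorial

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

# Slide 26: P(X < 3) in one hour (lambda = 5)
p_lt3 = sum(poisson_pmf(k, 5) for k in range(3))
# Slide 27: P(X = 10) in two hours (lambda = 2 * 5 = 10)
p_eq10 = poisson_pmf(10, 10)
# Slide 28: expected count in 45 minutes (lambda = 0.75 * 5)
mean_45 = 0.75 * 5
print(round(p_lt3, 3), round(p_eq10, 3), mean_45)  # 0.125 0.125 3.75
```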

  29. Continuous RVs & Probability Distributions

  30. Continuous RV • An rv X is continuous if its set of possible values is an entire interval of numbers. • Examples: • X = the pH of a random soil sample, • X = the weight of a randomly selected person.

  31. pdf • Let X be a continuous rv. Then a probability density function (pdf) of X is a function f(x) such that for any two numbers a and b with a ≤ b, • P(a ≤ X ≤ b) = ∫_a^b f(x) dx. • For f(x) to be a pdf, f(x) must satisfy: • f(x) ≥ 0 for all x, and • ∫_−∞^∞ f(x) dx = 1.

  32. Example • Waiting time at a bus station. A bus arrives every 10 minutes, so the waiting time ranges from 0 to 10. One possible pdf for the waiting time X is • f(x) = 1/10 for 0 ≤ x ≤ 10, and f(x) = 0 otherwise. • The probability of waiting between 3 and 5 minutes is • P(3 ≤ X ≤ 5) = ∫_3^5 (1/10) dx = 2/10 = 0.2.

  33. Uniform Distribution • A continuous rv X is said to have a uniform distribution on the interval [A, B] if the pdf of X is • f(x; A, B) = 1/(B − A) for A ≤ x ≤ B, and f(x; A, B) = 0 otherwise. • Graphs of uniform distributions.

  34. Probability at a Point • When X is a discrete rv, each possible value is assigned positive probability. This is no longer true for a continuous rv. • If X is a continuous rv, then for any number c, P(X = c) = 0. Consequently, P(a ≤ X ≤ b) = P(a < X ≤ b) = P(a ≤ X < b) = P(a < X < b).

  35. Example • Let X = the “time headway” for two randomly chosen consecutive cars on a freeway during a period of heavy flow. Suppose the pdf of X is given by: • f(x) = 0.15·e^(−0.15(x − 0.5)) for x ≥ 0.5. • f(x) = 0 for x < 0.5, and f(x) decreases exponentially fast as x increases from 0.5.

  36. Example • First, it is clear that f(x) ≥ 0. Now we verify that • ∫_0.5^∞ 0.15·e^(−0.15(x − 0.5)) dx = [−e^(−0.15(x − 0.5))]_0.5^∞ = 1. • The probability that headway time is at most 5 seconds is • P(X ≤ 5) = ∫_0.5^5 0.15·e^(−0.15(x − 0.5)) dx = 1 − e^(−0.15(4.5)) = 1 − e^(−0.675) = 0.491.
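A quick numerical check of slide 36 in Python, using the closed-form cdf of this pdf; headway_cdf is an illustrative helper name.

```python
# Sketch: check of slide 36 for f(x) = 0.15 * exp(-0.15 * (x - 0.5)), x >= 0.5.
from math import exp

def headway_cdf(x):
    """F(x) = P(X <= x) for the headway pdf; 0 for x < 0.5."""
    return 0.0 if x < 0.5 else 1 - exp(-0.15 * (x - 0.5))

print(round(headway_cdf(5), 3))  # 0.491, i.e. P(X <= 5)
print(headway_cdf(10**6))        # ~1.0, total probability integrates to 1
```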

  37. CDFs & Expected Values

  38. cdf • The cumulative distribution function (cdf) F(x) for a continuous rv X is defined for every number x by • F(x) = P(X ≤ x) = ∫_−∞^x f(y) dy. • For each x, F(x) is the area under the density curve to the left of x. It is the probability of observing a value of X smaller than or equal to x.

  39. Example • Let X have a uniform distribution on the interval [A, B]. Then • F(x) = ∫_−∞^x f(y) dy. • So, for x < A, F(x) = 0 and for x ≥ B, F(x) = 1. For A ≤ x ≤ B, • F(x) = ∫_A^x 1/(B − A) dy = (x − A)/(B − A).

  40. Example • The entire cdf is: • F(x) = 0 for x < A; F(x) = (x − A)/(B − A) for A ≤ x ≤ B; F(x) = 1 for x > B. • The graph of the cdf rises linearly from 0 at A to 1 at B and is flat outside [A, B].

  41. Propositions • Compute probabilities using F(x): P(a ≤ X ≤ b) = F(b) − F(a). • Obtaining the pdf from the cdf: • If X is a continuous rv with cdf F(x) differentiable at every point x, then the pdf is f(x) = F′(x).

  42. Example • For the uniform distribution on [A, B], the cdf is • F(x) = (x − A)/(B − A) for A ≤ x ≤ B (0 below A, 1 above B). • So, for example, if A < a < b < B, then P(a < X < b) = F(b) − F(a) = (b − a)/(B − A). • The pdf is • f(x) = F′(x) = 1/(B − A) for A < x < B.
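A sketch of the uniform cdf and an interval probability; the values A = 0, B = 10, a = 3, b = 5 are illustrative (they match the bus-waiting example on slide 32).

```python
# Sketch: uniform-[A, B] cdf and interval probability from slides 39-42.
A, B = 0.0, 10.0

def uniform_cdf(x):
    if x < A:
        return 0.0
    if x > B:
        return 1.0
    return (x - A) / (B - A)

a, b = 3.0, 5.0
print(uniform_cdf(b) - uniform_cdf(a))  # (b - a)/(B - A) = 0.2
```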

  43. Expected Values • The expected value (or mean) of a continuous rv X with pdf f(x) is • µ = E(X) = ∫_−∞^∞ x·f(x) dx. • If X is a continuous rv with pdf f(x) and h(X) is any function of X, then • E[h(X)] = ∫_−∞^∞ h(x)·f(x) dx.

  44. Example • The pdf of the waiting time (in minutes) at a checkout is given by • f(x) = x/8 for 0 ≤ x < 4. • What’s the probability of waiting less than 3 min? • What’s the expectation of the waiting time?

  45. Example • What’s the probability of waiting less than 3 min? • P(X < 3) = ∫_0^3 (x/8) dx = 9/16 = 0.5625. • What’s the expectation of the waiting time? • E(X) = ∫_0^4 x·(x/8) dx = 4³/24 = 8/3 ≈ 2.667.

  46. Variance & S.D. • The variance of a continuous rv X with pdf f(x) and mean µ is • V(X) = σ² = ∫_−∞^∞ (x − µ)²·f(x) dx. • The standard deviation (S.D.) of X is • σ = √V(X). • Shortcut: V(X) = E(X²) − [E(X)]².

  47. Linear Transformation • If h(X) = aX + b and V(X) = σ², then • V(h(X)) = V(aX + b) = a²·σ² • and • σ_(aX+b) = |a|·σ.

  48. Example (Cont.) • The pdf of the waiting time at a checkout: • f(x) = x/8 for 0 ≤ x < 4. • Find the variance of the waiting time. • µ = E(X) = 2.667. • E(X²) = ∫_0^4 x²·(x/8) dx = 4⁴/32 = 8, so V(X) = E(X²) − µ² = 8 − (8/3)² = 8/9 ≈ 0.889 and σ ≈ 0.943.
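A numerical check of the checkout example (slides 44-48), using a simple midpoint-rule integrator so no extra libraries are needed; the helper names f and integrate are illustrative, and the printed values are approximate.

```python
# Sketch: check of the checkout waiting-time example with f(x) = x/8 on [0, 4).
def f(x):
    return x / 8 if 0 <= x < 4 else 0.0

def integrate(g, lo, hi, n=100_000):
    """Midpoint-rule approximation of the integral of g over [lo, hi]."""
    h = (hi - lo) / n
    return sum(g(lo + (i + 0.5) * h) for i in range(n)) * h

p_lt3 = integrate(f, 0, 3)                     # ~0.5625
mean = integrate(lambda x: x * f(x), 0, 4)     # ~2.667
ex2 = integrate(lambda x: x * x * f(x), 0, 4)  # ~8.0
var = ex2 - mean**2                            # ~0.889
print(round(p_lt3, 4), round(mean, 3), round(var, 3))
```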

  49. Normal Distribution

  50. Introduction • The normal distribution is the most important distribution in all of probability and statistics. • Many numerical populations have distributions that can be approximated very well by a normal curve.
