Conditional probability mass function


Presentation Transcript


1. Conditional probability mass function
• Discrete case
• Continuous case
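The two cases written out; this is a standard statement, with the notation p for probability mass functions and f for densities chosen here rather than taken from the slide:

```latex
% Discrete case: conditional pmf of Y given X = x (defined when p_X(x) > 0)
p_{Y|X=x}(y) = P(Y = y \mid X = x) = \frac{p_{X,Y}(x, y)}{p_X(x)}
% Continuous case: conditional density of Y given X = x (defined when f_X(x) > 0)
f_{Y|X=x}(y) = \frac{f_{X,Y}(x, y)}{f_X(x)}
```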

2. Conditional probability mass function - examples
• Throwing two dice
  • Let Z1 = the number on the first die
  • Let Z2 = the number on the second die
  • Set Y = Z1 and X = Z1 + Z2
• Radioactive decay
  • Let X = the number of atoms decaying within 1 unit of time
  • Let Y = the time of the first decay
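For the dice example the conditional distribution can be written out explicitly; the particular value x = 4 below is chosen here for illustration and is not taken from the slide:

```latex
% The joint event {Y = y, X = x} equals {Z_1 = y, Z_2 = x - y}, which has probability 1/36
% whenever both y and x - y lie in {1, ..., 6}.
p_{Y|X=x}(y) = \frac{P(Z_1 = y,\; Z_2 = x - y)}{P(Z_1 + Z_2 = x)}
% For instance, with x = 4 we have P(X = 4) = 3/36, so
p_{Y|X=4}(y) = \frac{1/36}{3/36} = \frac{1}{3}, \qquad y = 1, 2, 3
```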

3. Conditional expectation
• Discrete case
• Continuous case
• Notation
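The standard definitions for the two cases, in the notation introduced above; the final comment states the usual convention that the slide's "Notation" bullet presumably refers to:

```latex
% Discrete case
E[Y \mid X = x] = \sum_{y} y \; p_{Y|X=x}(y)
% Continuous case
E[Y \mid X = x] = \int_{-\infty}^{\infty} y \; f_{Y|X=x}(y)\, dy
% Notation: E[Y | X] denotes the random variable obtained by evaluating
% the function h(x) = E[Y | X = x] at X.
```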

4. Conditional expectation - rules
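A set of standard rules for conditional expectation, offered here as a plausible reconstruction of the slide's list:

```latex
% Linearity
E[aY + bZ \mid X] = a\,E[Y \mid X] + b\,E[Z \mid X]
% Known factors (functions of X) can be pulled out
E[g(X)\,Y \mid X] = g(X)\,E[Y \mid X]
% Independence: if X and Y are independent,
E[Y \mid X] = E[Y]
% Tower property
E\big[E[Y \mid X]\big] = E[Y]
```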

5. Calculation of expected values through conditioning
• Discrete case
• Continuous case
• General formula
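The three forms of the law of total expectation:

```latex
% Discrete case
E[Y] = \sum_{x} E[Y \mid X = x]\; p_X(x)
% Continuous case
E[Y] = \int_{-\infty}^{\infty} E[Y \mid X = x]\; f_X(x)\, dx
% General formula
E[Y] = E\big[E[Y \mid X]\big]
```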

6. Calculation of expected values through conditioning - example
• Primary and secondary events
• Let N denote the number of primary events
• Let X1, X2, … denote the number of secondary events for each primary event
• Set Y = X1 + X2 + … + XN
• Assume that X1, X2, … are i.i.d. and independent of N
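Conditioning on N gives the expectation of the random sum; a short derivation under the stated assumptions (i.i.d. Xi, independent of N):

```latex
E[Y \mid N = n] = E[X_1 + \cdots + X_n] = n\, E[X_1]
% so, by the general formula E[Y] = E[E[Y | N]],
E[Y] = E\big[N \, E[X_1]\big] = E[N]\; E[X_1]
```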

7. Calculation of variances through conditioning
• Average remaining variation in Y after X has been fixed (the first term in the decomposition below)
• Variation in the expected value of Y induced by variation in X (the second term)
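The decomposition in question is the law of total variance:

```latex
Var(Y) = \underbrace{E\big[Var(Y \mid X)\big]}_{\text{average remaining variation}}
       + \underbrace{Var\big(E[Y \mid X]\big)}_{\text{variation induced by } X}
% where Var(Y | X) = E[(Y - E[Y | X])^2 | X]
```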

8. Variance decomposition in linear regression
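A plausible reconstruction, assuming the usual linear model Y = α + βX + ε with E[ε] = 0 and ε independent of X (the model itself is not spelled out in the transcript):

```latex
Var(Y) = \beta^2\, Var(X) + Var(\varepsilon)
% The explained proportion of the variance equals the squared correlation coefficient:
\frac{\beta^2\, Var(X)}{Var(Y)} = \rho^2(X, Y)
```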

9. Proof of the variance decomposition
We shall prove that Var(Y) = E[Var(Y | X)] + Var(E[Y | X]); a sketch of the steps is given below.
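A sketch of the standard argument, using E[E[ · | X]] = E[ · ]:

```latex
E\big[Var(Y \mid X)\big] = E\big[E[Y^2 \mid X] - (E[Y \mid X])^2\big]
                         = E[Y^2] - E\big[(E[Y \mid X])^2\big]
Var\big(E[Y \mid X]\big) = E\big[(E[Y \mid X])^2\big] - \big(E\big[E[Y \mid X]\big]\big)^2
                         = E\big[(E[Y \mid X])^2\big] - (E[Y])^2
% Adding the two lines:
E\big[Var(Y \mid X)\big] + Var\big(E[Y \mid X]\big) = E[Y^2] - (E[Y])^2 = Var(Y)
```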

10. Regression and prediction
Regression function: the conditional expectation E[Y | X = x], viewed as a function of x
Theorem: The regression function is the best predictor of Y based on X.
Proof: see the sketch below; the key observation is that both E[Y | X] and any competing predictor g(X) are functions of X.
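A standard formulation, with "best" taken in the mean-square sense (an assumption made explicit here):

```latex
% Claim: for every function g with E[g(X)^2] < \infty,
E\big[(Y - E[Y \mid X])^2\big] \le E\big[(Y - g(X))^2\big]
% Sketch: write Y - g(X) = (Y - E[Y | X]) + (E[Y | X] - g(X)).
% The cross term vanishes, because E[Y | X] - g(X) is a function of X:
E\big[(Y - E[Y \mid X])\,(E[Y \mid X] - g(X))\big]
   = E\big[(E[Y \mid X] - g(X))\; E[\,Y - E[Y \mid X] \mid X\,]\big] = 0
% since E[Y - E[Y | X] | X] = 0; the squared error thus splits into two nonnegative parts.
```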

11. Best linear predictor
Theorem: The best linear predictor of Y based on X is the linear function given below.
Proof: …
Ordinary linear regression
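A standard statement of the result, assuming Var(X) > 0 and finite second moments; the coefficient symbols α and β are chosen here:

```latex
\hat{Y} = \alpha + \beta X, \qquad
\beta = \frac{Cov(X, Y)}{Var(X)}, \qquad
\alpha = E[Y] - \beta\, E[X]
% These are exactly the coefficients that ordinary linear regression estimates from data,
% which is presumably the point of the slide's closing remark.
```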

12. Expected quadratic prediction error of the best linear predictor
Theorem: see the formula below.
Proof: …
Ordinary linear regression
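The standard expression for the mean squared error of the best linear predictor, with ρ(X, Y) denoting the correlation coefficient:

```latex
E\big[(Y - \alpha - \beta X)^2\big] = Var(Y)\,\big(1 - \rho^2(X, Y)\big)
```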

13. Martingales
The sequence X1, X2, … is called a martingale if it satisfies the condition below.
Example 1: Partial sums of independent variables with mean zero
Example 2: A gambler's fortune if he doubles the stake as long as he loses and leaves as soon as he wins
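The defining condition, including the integrability requirement that is part of the standard definition:

```latex
E\,|X_n| < \infty
\quad\text{and}\quad
E[X_{n+1} \mid X_1, \ldots, X_n] = X_n
\qquad \text{for all } n \ge 1
```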

14. Exercises: Chapter II
2.6, 2.9, 2.12, 2.16, 2.22, 2.26, 2.28
Use conditional distributions/probabilities to explain why the envelope rejection method works.

15. Transforms

16. The probability generating function
Let X be an integer-valued nonnegative random variable. The probability generating function of X is defined below.
• Defined at least for | t | < 1
• Determines the probability function of X uniquely
• Adding independent variables corresponds to multiplying their generating functions
Example 1: X ~ Be(p)
Example 2: X ~ Bin(n; p)
Example 3: X ~ Po(λ)
Addition theorems for the binomial and Poisson distributions
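The definition and the three examples written out; the forms below are the standard ones (Be(p) denotes a Bernoulli variable with success probability p):

```latex
g_X(t) = E\big[t^X\big] = \sum_{k=0}^{\infty} t^k\, P(X = k)
% Example 1: X ~ Be(p)
g_X(t) = 1 - p + p\,t
% Example 2: X ~ Bin(n; p)
g_X(t) = (1 - p + p\,t)^n
% Example 3: X ~ Po(\lambda)
g_X(t) = e^{\lambda (t - 1)}
% Addition theorems: for independent X and Y the generating functions multiply, so
% Bin(n; p) + Bin(m; p) ~ Bin(n + m; p) and Po(\lambda_1) + Po(\lambda_2) ~ Po(\lambda_1 + \lambda_2).
```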

17. The moment generating function
Let X be a random variable. The moment generating function of X is defined below, provided that the expectation involved is finite for | t | < h, where h > 0.
• Determines the probability function of X uniquely
• Adding independent variables corresponds to multiplying their moment generating functions
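The defining formula; the symbol ψ is chosen here and reused in the blocks below:

```latex
\psi_X(t) = E\big[e^{tX}\big], \qquad |t| < h
```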

18. The moment generating function and the Laplace transform
Let X be a non-negative random variable. Then the relation below holds.
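Presumably the intended relation is the standard one between the moment generating function and the Laplace transform of the distribution of a non-negative variable:

```latex
L_X(s) = E\big[e^{-sX}\big] = \psi_X(-s), \qquad s \ge 0
% For X >= 0 this expectation is always finite, so the Laplace transform exists
% even when the moment generating function is infinite for positive arguments.
```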

19. The moment generating function - examples
The moment generating function of X is ψ_X(t) = E[e^{tX}].
Example 1: X ~ Be(p)
Example 2: X ~ Exp(a)
Example 3: X ~ Γ(2; a)
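The three moment generating functions, under the scale parametrization assumed here (Exp(a) with mean a, and Γ(2; a) the sum of two independent Exp(a) variables):

```latex
% Example 1: X ~ Be(p)
\psi_X(t) = 1 - p + p\,e^{t}
% Example 2: X ~ Exp(a)
\psi_X(t) = \frac{1}{1 - a t}, \qquad t < 1/a
% Example 3: X ~ \Gamma(2; a)
\psi_X(t) = \Big(\frac{1}{1 - a t}\Big)^{2}, \qquad t < 1/a
```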

20. The moment generating function - calculation of moments
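The standard relation between the moment generating function and the moments (valid when ψ_X is finite in a neighbourhood of 0):

```latex
\psi_X(t) = \sum_{k=0}^{\infty} \frac{E[X^k]}{k!}\, t^k
\qquad\Longrightarrow\qquad
E[X^k] = \psi_X^{(k)}(0)
```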

21. The moment generating function - uniqueness
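A standard formulation of the uniqueness property referred to in the title:

```latex
% If two moment generating functions are finite and agree on an interval around 0,
% the underlying distributions coincide:
\psi_X(t) = \psi_Y(t) \ \text{ for } |t| < h,\ h > 0
\quad\Longrightarrow\quad
X \stackrel{d}{=} Y
```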

22. Normal approximation of a binomial distribution
Let X1, X2, … be independent and Be(p), and consider their partial sums. Then the standardized sum is approximately N(0, 1) for large n, as written out below.
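The statement written out, with Sn = X1 + … + Xn (notation chosen here); this is the de Moivre-Laplace central limit theorem, provable by showing that the transforms of the standardized sums converge to that of N(0, 1):

```latex
S_n = X_1 + \cdots + X_n \sim Bin(n;\, p)
\qquad
\frac{S_n - np}{\sqrt{n p (1 - p)}} \ \xrightarrow{\;d\;}\ N(0, 1)
\quad \text{as } n \to \infty
```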

23. Distributions for which the moment generating function does not exist
Let X = e^Y, where Y ~ N(μ; σ²). Then X has finite moments of every order, yet its moment generating function is infinite for all t > 0, as written out below.
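The lognormal case written out (standard facts about the lognormal distribution):

```latex
E\big[X^k\big] = E\big[e^{kY}\big] = e^{k\mu + k^2\sigma^2/2} < \infty \quad \text{for every } k,
\qquad\text{whereas}\qquad
\psi_X(t) = E\big[e^{t e^{Y}}\big] = \infty \quad \text{for every } t > 0
```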

24. The characteristic function
Let X be a random variable. The characteristic function of X is defined below.
• Exists for all random variables
• Determines the probability function of X uniquely
• Adding independent variables corresponds to multiplying their characteristic functions
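The defining formula; the symbol φ is chosen here:

```latex
\varphi_X(t) = E\big[e^{itX}\big], \qquad t \in \mathbb{R}
% Since |e^{itX}| = 1, the expectation is finite for every random variable X,
% which is why the characteristic function always exists.
```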

25. Comparison of the characteristic function and the moment generating function
Example 1: Exp(λ)
Example 2: Po(λ)
Example 3: N(μ; σ²)
Is it always true that … ?
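The three characteristic functions, using the same mean-type parametrization for the exponential distribution as above (an assumption); the question on the slide is presumably whether φ_X(t) = ψ_X(it) always holds, which these examples illustrate:

```latex
% Example 1: X ~ Exp(\lambda) with mean \lambda
\varphi_X(t) = \frac{1}{1 - i \lambda t}, \qquad \psi_X(t) = \frac{1}{1 - \lambda t} \ \ (t < 1/\lambda)
% Example 2: X ~ Po(\lambda)
\varphi_X(t) = e^{\lambda (e^{it} - 1)}, \qquad \psi_X(t) = e^{\lambda (e^{t} - 1)}
% Example 3: X ~ N(\mu; \sigma^2)
\varphi_X(t) = e^{i\mu t - \sigma^2 t^2 / 2}, \qquad \psi_X(t) = e^{\mu t + \sigma^2 t^2 / 2}
% In each case \varphi_X(t) = \psi_X(it); this holds whenever \psi_X is finite near 0,
% but \varphi_X exists even when \psi_X does not (e.g. the lognormal example above).
```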

26. The characteristic function - uniqueness
For discrete distributions we have an explicit inversion formula, and the same holds for continuous distributions with an absolutely integrable characteristic function (see below).
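The standard inversion formulas, stated here for integer-valued variables in the discrete case (an assumption about the slide's exact setting):

```latex
% Integer-valued X:
P(X = k) = \frac{1}{2\pi} \int_{-\pi}^{\pi} e^{-itk}\, \varphi_X(t)\, dt
% Continuous X with \int_{-\infty}^{\infty} |\varphi_X(t)|\, dt < \infty:
f_X(x) = \frac{1}{2\pi} \int_{-\infty}^{\infty} e^{-itx}\, \varphi_X(t)\, dt
```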

27. The characteristic function - calculation of moments
If the k:th moment exists, we have the relation below.
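The standard formula:

```latex
\varphi_X^{(k)}(0) = i^{\,k}\, E\big[X^k\big]
\qquad\text{equivalently}\qquad
E\big[X^k\big] = \frac{\varphi_X^{(k)}(0)}{i^{\,k}}
```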

28. Using a normal distribution to approximate a Poisson distribution
Let X ~ Po(m) and standardize X as below. Then the standardized variable is approximately N(0, 1) for large m.
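The statement written out; it can be verified by showing that the characteristic function of the standardized variable converges to that of N(0, 1):

```latex
Y = \frac{X - m}{\sqrt{m}}
\qquad
\varphi_Y(t) = e^{-it\sqrt{m}}\; e^{\,m (e^{it/\sqrt{m}} - 1)} \ \longrightarrow\ e^{-t^2/2}
\quad \text{as } m \to \infty
% i.e. Y converges in distribution to N(0, 1).
```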

29. Using a Poisson distribution to approximate a Binomial distribution
Let X ~ Bin(n; p). Then the generating function of X can be compared with a Poisson generating function, as sketched below. If p = 1/n we get convergence to Po(1).
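A sketch of the generating-function argument, presumably the one on the slide:

```latex
g_X(t) = (1 - p + p\,t)^n = \big(1 + p\,(t - 1)\big)^n
% With p = 1/n:
g_X(t) = \Big(1 + \frac{t - 1}{n}\Big)^{n} \ \longrightarrow\ e^{\,t - 1} \quad \text{as } n \to \infty,
% the probability generating function of Po(1). More generally, p = \lambda / n gives Po(\lambda).
```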

30. Sums of a stochastic number of stochastic variables
The transforms of the random sum are given below:
• Probability generating function
• Moment generating function
• Characteristic function
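For S_N = X1 + … + XN with X1, X2, … i.i.d. and independent of the nonnegative integer-valued N (the setting of slide 6; the symbol S_N is chosen here), the standard composition formulas are:

```latex
g_{S_N}(t) = g_N\big(g_X(t)\big)
\qquad
\psi_{S_N}(t) = g_N\big(\psi_X(t)\big)
\qquad
\varphi_{S_N}(t) = g_N\big(\varphi_X(t)\big)
% In each case the outer transform is the probability generating function of N,
% evaluated at the corresponding transform of a single term X = X_1.
```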

31. Branching processes
• Suppose that each individual produces j new offspring with probability pj, j ≥ 0, independently of the number produced by any other individual.
• Let Xn denote the size of the nth generation.
• Then Xn is the sum given below, where Zi represents the number of offspring of the ith individual of the (n - 1)st generation.
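The sum referred to in the last bullet:

```latex
X_n = \sum_{i=1}^{X_{n-1}} Z_i
```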

32. Generating function of a branching process
Let Xn denote the number of individuals in the n:th generation of a population, and assume that Xn+1 is the random sum written out below, where Yk, k = 1, 2, … are i.i.d. and independent of Xn. Then the generating functions satisfy the recursion below.
Example: …
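The assumed representation and the resulting recursion (the composition formula for random sums applied with N = Xn):

```latex
X_{n+1} = \sum_{k=1}^{X_n} Y_k
\qquad\Longrightarrow\qquad
g_{X_{n+1}}(t) = g_{X_n}\big(g_Y(t)\big)
% Starting from X_0 = 1, iterating gives g_{X_n} as the n-fold composition of g_Y with itself.
```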

33. Branching processes - mean and variance of generation size
• Consider a branching process for which X0 = 1, and let μ and σ respectively denote the expectation and standard deviation of the offspring distribution.
• Then the mean and variance of Xn are as given below.
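The standard formulas, obtainable by conditioning on the previous generation and using the variance decomposition of slide 7:

```latex
E[X_n] = \mu^{\,n}
\qquad
Var(X_n) =
\begin{cases}
\sigma^2 \mu^{\,n-1}\, \dfrac{\mu^{\,n} - 1}{\mu - 1}, & \mu \ne 1 \\[1.5ex]
n\,\sigma^2, & \mu = 1
\end{cases}
```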

34. Branching processes - extinction probability
• Let π0 = P(population dies out) and assume that X0 = 1.
• Then π0 satisfies the equation below, where g is the probability generating function of the offspring distribution.
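The fixed-point equation for the extinction probability; the remark about the smallest root is the standard accompanying fact and is added here as an assumption about the slide:

```latex
\pi_0 = g(\pi_0)
% \pi_0 is the smallest root of g(s) = s in [0, 1]; in particular, extinction is certain
% (\pi_0 = 1) if and only if the mean number of offspring satisfies \mu \le 1
% (excluding the degenerate case where each individual has exactly one offspring).
```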

35. Exercises: Chapter III
3.1, 3.2, 3.3, 3.7, 3.15, 3.25, 3.26, 3.27, 3.32
