
Further Topics on Random Variables: Convolution, Conditional Expectation and Variance

Further Topics on Random Variables: Convolution, Conditional Expectation and Variance. Berlin Chen, Department of Computer Science & Information Engineering, National Taiwan Normal University. Reference: D. P. Bertsekas, J. N. Tsitsiklis, Introduction to Probability, Sections 4.2-4.3.





Presentation Transcript


  1. Further Topics on Random Variables: Convolution, Conditional Expectation and Variance. Berlin Chen, Department of Computer Science & Information Engineering, National Taiwan Normal University. Reference: D. P. Bertsekas, J. N. Tsitsiklis, Introduction to Probability, Sections 4.2-4.3

  2. Sums of Independent Random Variables (1/2) • Recall: If X and Y are independent random variables, the distribution (PMF or PDF) of W = X + Y can be obtained by computing the transform of W and then inverting it • We can also use the convolution method to obtain the distribution of W • If X and Y are independent discrete random variables with integer values, then p_W(w) = Σ_x p_X(x) p_Y(w − x), the convolution of the PMFs of X and Y
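The discrete convolution formula can be evaluated directly in code. The sketch below is not from the slides; it uses two fair dice as an assumed, illustrative pair of PMFs:

```python
# Convolution of two integer-valued PMFs: p_W(w) = sum over x of p_X(x) * p_Y(w - x).
# The dice PMFs below are illustrative assumptions, not taken from the slides.

def convolve_pmfs(p_x, p_y):
    """PMF of W = X + Y for independent X, Y, each given as a dict value -> prob."""
    p_w = {}
    for x, px in p_x.items():
        for y, py in p_y.items():
            p_w[x + y] = p_w.get(x + y, 0.0) + px * py
    return p_w

die = {k: 1 / 6 for k in range(1, 7)}  # a fair six-sided die
p_sum = convolve_pmfs(die, die)
print(round(p_sum[7], 4))  # P(sum = 7) = 6/36 ≈ 0.1667
```

Because the double loop visits every (x, y) pair, this also works for PMFs with different, even negative, integer supports.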

  3. Sums of Independent Random Variables (2/2) • If X and Y are independent continuous random variables, the PDF of W = X + Y can be obtained by f_W(w) = ∫ f_X(x) f_Y(w − x) dx, the convolution of the PDFs of X and Y (the independence assumption gives f_{X,Y}(x, w − x) = f_X(x) f_Y(w − x))

  4. Illustrative Examples (1/4) • Example 4.13. Let X and Y be independent and have the PMFs shown in the figure • Calculate the PMF of W = X + Y by convolution: p_W(w) = Σ_x p_X(x) p_Y(w − x)

  5. Illustrative Examples (2/4) • Example 4.14. The random variables X and Y are independent and uniformly distributed in the interval [0, 1]. The PDF of W = X + Y is the triangular density: f_W(w) = w for 0 ≤ w ≤ 1, f_W(w) = 2 − w for 1 < w ≤ 2, and f_W(w) = 0 otherwise
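The convolution integral for two Uniform[0, 1] variables can be checked numerically. This sketch approximates f_W(w) with a Riemann sum; the grid size n is an arbitrary choice, not from the slides:

```python
# Numeric check of f_W(w) = ∫ f_X(x) f_Y(w - x) dx for X, Y ~ Uniform[0, 1].
# The closed form is the triangular density: w on [0, 1] and 2 - w on [1, 2].

def f_uniform(x):
    """PDF of Uniform[0, 1]."""
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def f_w(w, n=20000):
    """Riemann-sum approximation of the convolution integral over [0, 2]."""
    dx = 2.0 / n
    return sum(f_uniform(i * dx) * f_uniform(w - i * dx) for i in range(n)) * dx

print(round(f_w(0.5), 3))  # ≈ 0.5 (rising edge of the triangle)
print(round(f_w(1.0), 3))  # ≈ 1.0 (peak of the triangle)
```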

  6. Illustrative Examples (3/4)

  7. Illustrative Examples (4/4) • Or, we can use the “Derived Distribution” method previously introduced in Section 3.6

  8. Graphical Calculation of Convolutions • Figure 4.4. Illustration of the convolution calculation. For the value of w under consideration, f_W(w) is equal to the integral of the function shown in the last plot. Step 1: flip the PDF f_Y(y) around the origin to obtain f_Y(−x). Step 2: shift it by w (w > 0: shifted right; w < 0: shifted left) to obtain f_Y(w − x). Step 3: calculate the product f_X(x) f_Y(w − x) of the two plots, then integrate over x

  9. Revisit: Conditional Expectation and Variance • Goal: To introduce two useful probability laws • Law of Iterated Expectations • Law of Total Variance

  10. More on Conditional Expectation • Recall that the conditional expectation E[X | Y = y] is defined by E[X | Y = y] = Σ_x x p_{X|Y}(x | y) (if X is discrete) and E[X | Y = y] = ∫ x f_{X|Y}(x | y) dx (if X is continuous) • E[X | Y] can in fact be viewed as a function of Y, because its value depends on the value y taken by Y • Is E[X | Y] a random variable? Yes • What is the expected value of E[X | Y]? • Note also that the expectation of a function g(X) satisfies E[g(X) | Y = y] = Σ_x g(x) p_{X|Y}(x | y) (if X is discrete) and E[g(X) | Y = y] = ∫ g(x) f_{X|Y}(x | y) dx (if X is continuous)

  11. An Illustrative Example (1/2) • Example 4.15. Let the random variables X and Y have a joint PDF which is equal to 2 for (x, y) belonging to the triangle indicated below and zero everywhere else. • What is the value of E[X | Y = y]? From the joint PDF, the conditional PDF f_{X|Y}(x | y) is uniform over the section of the triangle at height y, so E[X | Y = y] is a linear function of y; for the triangle with vertices (0, 0), (1, 0), and (0, 1), E[X | Y = y] = (1 − y)/2

  12. An Illustrative Example (2/2) • We saw that E[X | Y = y] = (1 − y)/2. Hence E[X | Y] is the random variable (1 − Y)/2 • The expectation of E[X | Y]: by the Total Expectation Theorem, E[E[X | Y]] = E[(1 − Y)/2] = (1 − E[Y])/2 = E[X]

  13. Law of Iterated Expectations E[E[X | Y]] = E[X], where E[E[X | Y]] = Σ_y E[X | Y = y] p_Y(y) (if Y is discrete) and E[E[X | Y]] = ∫ E[X | Y = y] f_Y(y) dy (if Y is continuous)
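The law of iterated expectations can be verified exactly on a small discrete case. The joint PMF below is an illustrative assumption, not from the slides:

```python
# Exact check of E[X] = E[E[X | Y]] for a small discrete joint PMF.
# The joint distribution is an illustrative assumption.

joint = {  # (x, y) -> probability
    (0, 0): 0.1, (1, 0): 0.3,
    (0, 1): 0.2, (1, 1): 0.4,
}

# Marginal PMF of Y.
p_y = {}
for (x, y), p in joint.items():
    p_y[y] = p_y.get(y, 0.0) + p

# Conditional expectation E[X | Y = y] for each value y.
e_x_given_y = {
    y: sum(x * p for (x, yy), p in joint.items() if yy == y) / p_y[y]
    for y in p_y
}

e_x_direct = sum(x * p for (x, _), p in joint.items())
e_x_iterated = sum(e_x_given_y[y] * p_y[y] for y in p_y)
print(abs(e_x_direct - e_x_iterated) < 1e-12)  # True
```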

  14. An Illustrative Example (1/2) • Example 4.16. We start with a stick of length ℓ. We break it at a point which is chosen randomly and uniformly over its length, and keep the piece that contains the left end of the stick. We then repeat the same process on the stick that we are left with. • What is the expected length of the stick that we are left with after breaking twice? Let Y be the length of the stick after we break for the first time, and let X be the length after the second time. Y is uniformly distributed over [0, ℓ], and given Y = y, X is uniformly distributed over [0, y]

  15. An Illustrative Example (2/2) • By the Law of Iterated Expectations, we have E[X] = E[E[X | Y]] = E[Y/2] = E[Y]/2 = ℓ/4
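A short Monte Carlo simulation makes the ℓ/4 result plausible. This sketch assumes ℓ = 1 and a fixed seed, both arbitrary choices:

```python
# Monte Carlo check of the stick-breaking result E[X] = l/4 (here l = 1).
import random

random.seed(0)
n = 200000
total = 0.0
for _ in range(n):
    y = random.uniform(0.0, 1.0)     # length Y after the first break
    total += random.uniform(0.0, y)  # length X after the second break
print(round(total / n, 2))  # ≈ 0.25 = l/4
```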

  16. Averaging by Section (1/3) • Averaging by section can be viewed as a special case of the law of iterated expectations • Example 4.17. Averaging Quiz Scores by Section. • A class has n students and the quiz score of student i is x_i. The average quiz score is m = (1/n) Σ_{i=1}^{n} x_i • If students are divided into disjoint subsets (sections) A_1, …, A_k, the average score in section s is m_s = (1/n_s) Σ_{i ∈ A_s} x_i, where n_s is the number of students in section s

  17. Averaging by Section (2/3) • Example 4.17. (cont.) • The average score m over the whole class can be computed by taking a weighted average of the average score m_s of each section, where the weight given to section s is proportional to the number of students in that section: m = Σ_{s=1}^{k} (n_s / n) m_s
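The weighted-average identity is easy to check in code. The section rosters below are illustrative assumptions, not data from the example:

```python
# Class average as a weighted average of per-section averages.
# Section rosters and scores are illustrative assumptions.

sections = {
    "A": [80, 90, 70],
    "B": [60, 100],
}

all_scores = [s for roster in sections.values() for s in roster]
n = len(all_scores)
overall = sum(all_scores) / n  # m = (1/n) * sum of x_i

# m = sum over sections of (n_s / n) * m_s.
weighted = sum((len(r) / n) * (sum(r) / len(r)) for r in sections.values())
print(overall, weighted)  # equal (80.0 for these rosters)
```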

  18. Averaging by Section (3/3) • Example 4.17. (cont.) • Its relationship with the law of iterated expectations: two random variables are defined • X: quiz score of a student (outcome); each student (outcome) is equally likely (uniformly distributed) • Y: section of a student • Then m = E[X] = Σ_s E[X | Y = s] P(Y = s) = E[E[X | Y]]

  19. More on Conditional Variance • Recall that the conditional variance of X, given Y = y, is defined by var(X | Y = y) = E[(X − E[X | Y = y])² | Y = y] • var(X | Y) can in fact be viewed as a function of Y, because its value depends on the value y taken by Y • Is var(X | Y) a random variable? Yes • What is the expected value of var(X | Y)?

  20. Law of Total Variance • The expectation of the conditional variance is related to the unconditional variance by var(X) = E[var(X | Y)] + var(E[X | Y]) (the proof uses the Law of Iterated Expectations)
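The decomposition can be verified exactly on a small discrete joint PMF; the distribution below is an illustrative assumption:

```python
# Exact check of var(X) = E[var(X | Y)] + var(E[X | Y]).
# The joint PMF is an illustrative assumption.

joint = {(0, 0): 0.1, (1, 0): 0.3, (2, 1): 0.2, (3, 1): 0.4}

p_y = {}
for (x, y), p in joint.items():
    p_y[y] = p_y.get(y, 0.0) + p

def cond_moments(y):
    """Return (E[X | Y = y], var(X | Y = y))."""
    probs = {x: p / p_y[y] for (x, yy), p in joint.items() if yy == y}
    mean = sum(x * p for x, p in probs.items())
    return mean, sum((x - mean) ** 2 * p for x, p in probs.items())

e_x = sum(x * p for (x, _), p in joint.items())
var_x = sum((x - e_x) ** 2 * p for (x, _), p in joint.items())

# E[var(X | Y)] and var(E[X | Y]); the latter is centered at e_x,
# which is valid because E[E[X | Y]] = E[X].
e_cond_var = sum(cond_moments(y)[1] * py for y, py in p_y.items())
var_cond_mean = sum((cond_moments(y)[0] - e_x) ** 2 * py for y, py in p_y.items())
print(abs(var_x - (e_cond_var + var_cond_mean)) < 1e-12)  # True
```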

  21. Illustrative Examples (1/4) • Example 4.16. (continued) Consider again the problem where we break twice a stick of length ℓ, at randomly chosen points, with Y being the length of the stick after the first break and X being the length after the second break • Calculate var(X) using the law of total variance • Y is uniformly distributed over [0, ℓ], and given Y = y, X is uniformly distributed over [0, y]; hence E[X | Y] = Y/2 and var(X | Y) = Y²/12 (each a function of Y)

  22. Illustrative Examples (2/4) • cf. slide 14: given Y, X is uniformly distributed over [0, Y], so E[X | Y] = Y/2 (a function of Y) and var(X | Y) = Y²/12 • var(X) = E[var(X | Y)] + var(E[X | Y]) = E[Y²]/12 + var(Y)/4 = (ℓ²/3)/12 + (ℓ²/12)/4 = ℓ²/36 + ℓ²/48 = 7ℓ²/144
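The value 7ℓ²/144 can be sanity-checked by simulation. This sketch assumes ℓ = 1 and a fixed seed, both arbitrary choices:

```python
# Monte Carlo check of var(X) = 7 * l**2 / 144 for the twice-broken stick (l = 1).
import random

random.seed(1)
n = 200000
xs = []
for _ in range(n):
    y = random.uniform(0.0, 1.0)       # length after the first break
    xs.append(random.uniform(0.0, y))  # length after the second break

mean = sum(xs) / n
var = sum((x - mean) ** 2 for x in xs) / n
print(var)  # ≈ 7/144 ≈ 0.0486
```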

  23. Illustrative Examples (3/4) • Example 4.20. Computing Variances by Conditioning. • Consider a continuous random variable X with the piecewise-constant PDF shown in the figure, equal to 1/2 on the interval [0, 1] and 1/4 on the interval (1, 3]. We define an auxiliary (discrete) random variable Y as follows: Y = 1 if X falls in the first interval, and Y = 2 if X falls in the second

  24. Illustrative Examples (4/4) • Given Y = 1, X is uniform over [0, 1], so E[X | Y = 1] = 1/2 and var(X | Y = 1) = 1/12; given Y = 2, X is uniform over (1, 3], so E[X | Y = 2] = 2 and var(X | Y = 2) = 4/12 = 1/3 • Justification: P(Y = 1) = P(Y = 2) = 1/2 and E[X] = 5/4, so the law of total variance gives var(X) = E[var(X | Y)] + var(E[X | Y]) = (1/2)(1/12 + 1/3) + (1/2)((1/2 − 5/4)² + (2 − 5/4)²) = 5/24 + 9/16 = 37/48

  25. Averaging by Section • For a two-section (or two-cluster) problem, var(X) = E[var(X | Y)] + var(E[X | Y]) • E[var(X | Y)] is the average variability within individual sections (also called the “within-cluster” variation) • var(E[X | Y]) is the variability of the means of individual sections (also called the “between-cluster” variation)
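The within/between decomposition can be computed directly for two sections. The rosters below are illustrative assumptions, and population variances (divide by n) are used:

```python
# Within-cluster + between-cluster decomposition of the score variance.
# Section rosters are illustrative assumptions; population variances are used.

sections = {"A": [80, 90, 70], "B": [60, 100]}
all_scores = [s for r in sections.values() for s in r]
n = len(all_scores)
m = sum(all_scores) / n
total_var = sum((s - m) ** 2 for s in all_scores) / n

# E[var(X | Y)]: average of section variances, weighted by section size.
within = sum(
    (len(r) / n) * sum((s - sum(r) / len(r)) ** 2 for s in r) / len(r)
    for r in sections.values()
)
# var(E[X | Y]): variability of section means around the overall mean.
between = sum((len(r) / n) * (sum(r) / len(r) - m) ** 2 for r in sections.values())
print(total_var, within + between)  # equal (200.0 for these rosters)
```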

  26. Properties of Conditional Expectation and Variance

  27. Recitation • SECTION 4.2 Convolutions • Problems 11, 12 • SECTION 4.3 More on Conditional Expectation and Variance • Problems 15, 16, 17
